2026-03-10T12:28:47.786 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-10T12:28:47.794 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-10T12:28:47.821 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1029
branch: squid
description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/no kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{v18.2.0} 1-volume/{0-create 1-ranks/2 2-allow_standby_replay/no 3-inline/no 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
email: null
first_in_suite: false
flavor: default
job_id: '1029'
last_in_suite: false
machine_type: vps
meta:
- desc: 'setup ceph/v18.2.0 '
name: kyr-2026-03-10_01:00:38-orch-squid-none-default-vps
no_nested_subset: false
os_type: centos
os_version: 9.stream
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    cluster-conf:
      mgr:
        client mount timeout: 30
        debug client: 20
        debug mgr: 20
        debug ms: 1
        mon warn on pool no app: false
    conf:
      client:
        client mount timeout: 600
        debug client: 20
        debug ms: 1
        rados mon op timeout: 900
        rados osd op timeout: 900
      global:
        mon pg warn min per osd: 0
      mds:
        debug mds: 20
        debug mds balancer: 20
        debug ms: 1
        mds debug frag: true
        mds debug scatterstat: true
        mds op complaint time: 180
        mds verify scatter: true
        osd op complaint time: 180
        rados mon op timeout: 900
        rados osd op timeout: 900
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
        mon down mkfs grace: 300
        mon op complaint time: 120
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore allocator: bitmap
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug ms: 1
        debug osd: 20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd mclock iops capacity threshold hdd: 49000
        osd objectstore: bluestore
        osd op complaint time: 180
    flavor: default
    fs: xfs
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    - FS_DEGRADED
    - filesystem is degraded
    - FS_INLINE_DATA_DEPRECATED
    - FS_WITH_FAILED_MDS
    - MDS_ALL_DOWN
    - filesystem is offline
    - is offline because no MDS
    - MDS_DAMAGE
    - MDS_DEGRADED
    - MDS_FAILED
    - MDS_INSUFFICIENT_STANDBY
    - MDS_UP_LESS_THAN_MAX
    - online, but wants
    - filesystem is online with fewer MDS than max_mds
    - POOL_APP_NOT_ENABLED
    - do not have an application enabled
    - overall HEALTH_
    - Replacing daemon
    - deprecated feature inline_data
    - MGR_MODULE_ERROR
    - OSD_DOWN
    - osds down
    - overall HEALTH_
    - \(OSD_DOWN\)
    - \(OSD_
    - but it is still running
    - is not responding
    - MON_DOWN
    - PG_AVAILABILITY
    - PG_DEGRADED
    - Reduced data availability
    - Degraded data redundancy
    - pg .* is stuck inactive
    - pg .* is .*degraded
    - pg .* is stuck peering
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    bluestore: true
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd objectstore: bluestore
    fs: xfs
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
    extra_system_packages:
      deb:
      - python3-xmltodict
      - python3-jmespath
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-xmltodict
      - python3-jmespath
  kclient:
    syntax: v1
  selinux:
    allowlist:
    - scontext=system_u:system_r:logrotate_t:s0
    - scontext=system_u:system_r:getty_t:s0
  thrashosds:
    bdev_inject_crash: 2
    bdev_inject_crash_probability: 0.5
  workunit:
    branch: tt-squid
    sha1: 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - host.a
  - client.0
  - osd.0
  - osd.1
  - osd.2
- - host.b
  - client.1
  - osd.3
  - osd.4
  - osd.5
seed: 8043
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
subset: 1/64
suite: orch
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b
targets:
  vm00.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAxp8xVeqQ0GzqolG+aboEG67HTR9ypCHaHNzbdl3Ou0tZloNsRjQCKzsoRBSgO4HpmJRQsfSwXWHhBDoEbZ+Hg=
  vm07.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGwnEUPq0ZTxwZCptlP9eJB31LCweqbR0tR+HJsWnBODj63yigx81a8M1Lzlv6uTy5OM056w/+z4shNPn8fjGr0=
tasks:
- install:
    exclude_packages:
    - ceph-volume
    tag: v18.2.0
- print: '**** done install task...'
- cephadm:
    compiled_cephadm_branch: reef
    conf:
      osd:
        osd_class_default_list: '*'
        osd_class_load_list: '*'
    image: quay.io/ceph/ceph:v18.2.0
    roleless: true
- print: '**** done end installing v18.2.0 cephadm ...'
- cephadm.shell:
    host.a:
    - ceph config set mgr mgr/cephadm/use_repo_digest true --force
- print: '**** done cephadm.shell ceph config set mgr...'
- cephadm.shell:
    host.a:
    - ceph orch status
    - ceph orch ps
    - ceph orch ls
    - ceph orch host ls
    - ceph orch device ls
- cephadm.shell:
    host.a:
    - ceph fs volume create cephfs --placement=4
    - ceph fs dump
- cephadm.shell:
    host.a:
    - ceph fs set cephfs max_mds 2
- cephadm.shell:
    host.a:
    - ceph fs set cephfs allow_standby_replay false
- cephadm.shell:
    host.a:
    - ceph fs set cephfs inline_data false
- cephadm.shell:
    host.a:
    - ceph fs dump
    - ceph --format=json fs dump | jq -e ".filesystems | length == 1"
    - while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done
- fs.pre_upgrade_save: null
- ceph-fuse: null
- print: '**** done client'
- parallel:
  - upgrade-tasks
  - workload-tasks
- cephadm.shell:
    host.a:
    - ceph fs dump
- fs.post_upgrade_checks: null
teuthology:
  fragments_dropped:
  - /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/suites/orch/cephadm/mds_upgrade_sequence/tasks/3-upgrade-mgr-staggered.yaml
  meta: {}
  postmerge:
  - |
    local kernel = py_attrgetter(yaml).get('kernel')
    if kernel ~= nil then
      local branch = py_attrgetter(kernel).get('branch')
      if branch and not kernel.branch:find "-all$" then
        log.debug("removing default kernel specification: %s", kernel)
        py_attrgetter(kernel).pop('branch', nil)
        py_attrgetter(kernel).pop('deb', nil)
        py_attrgetter(kernel).pop('flavor', nil)
        py_attrgetter(kernel).pop('kdb', nil)
        py_attrgetter(kernel).pop('koji', nil)
        py_attrgetter(kernel).pop('koji_task', nil)
        py_attrgetter(kernel).pop('rpm', nil)
        py_attrgetter(kernel).pop('sha1', nil)
        py_attrgetter(kernel).pop('tag', nil)
      end
    end
  variables:
    fail_fs: false
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-10_01:00:38
tube: vps
upgrade-tasks:
  sequential:
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mgr mgr/orchestrator/fail_fs false || true
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force
      - ceph config set global log_to_journald false --force
      - ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - while ceph orch upgrade status | jq '.in_progress' | grep true && ! ceph orch upgrade status | jq '.message' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done
      - ceph orch ps
      - ceph orch upgrade status
      - ceph health detail
      - ceph versions
      - echo "wait for servicemap items w/ changing names to refresh"
      - sleep 60
      - ceph orch ps
      - ceph versions
      - ceph versions | jq -e '.overall | length == 1'
      - ceph versions | jq -e '.overall | keys' | grep $sha1
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
workload-tasks:
  sequential:
  - workunit:
      clients:
        all:
        - suites/fsstress.sh
2026-03-10T12:28:47.822 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa; will attempt to use it
2026-03-10T12:28:47.822 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks
2026-03-10T12:28:47.822 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-10T12:28:47.822 INFO:teuthology.task.internal:Checking packages...
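The upgrade-tasks config above polls `ceph orch upgrade status` every 30 s, looping while `jq '.in_progress'` prints `true` and no error message appears. A minimal sketch of that poll-until-done pattern, with a hypothetical `fake_status` function standing in for the real `ceph orch upgrade status | jq '.in_progress'` pipeline (no cluster or jq needed here):

```shell
# Poll a status command until it stops reporting "in progress".
# fake_status is a stand-in: it pretends the upgrade finishes after 3 polls.
attempts=0
fake_status() {
    if [ "$attempts" -lt 3 ]; then echo true; else echo false; fi
}

while fake_status | grep -q true; do
    attempts=$((attempts + 1))
    # The real job also dumps `ceph orch ps`, `ceph versions`, `ceph health
    # detail` on each pass and sleeps 30 s; both are omitted in this sketch.
done
echo "polled $attempts times"
```

The final `jq -e '.overall | length == 1'` check works the same way in reverse: `-e` makes jq's exit status reflect the boolean result, so a plain shell `&&`/`while` can branch on JSON content.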
2026-03-10T12:28:47.822 INFO:teuthology.task.internal:Checking packages for os_type 'centos', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-10T12:28:47.822 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-10T12:28:47.822 INFO:teuthology.packaging:ref: None
2026-03-10T12:28:47.822 INFO:teuthology.packaging:tag: None
2026-03-10T12:28:47.822 INFO:teuthology.packaging:branch: squid
2026-03-10T12:28:47.822 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T12:28:47.823 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=squid
2026-03-10T12:28:48.609 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678.ge911bdeb
2026-03-10T12:28:48.609 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-10T12:28:48.610 INFO:teuthology.task.internal:no buildpackages task found
2026-03-10T12:28:48.610 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-10T12:28:48.610 INFO:teuthology.task.internal:Saving configuration
2026-03-10T12:28:48.618 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-10T12:28:48.619 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-10T12:28:48.626 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm00.local', 'description': '/archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1029', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-10 12:27:33.996480', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:00', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAxp8xVeqQ0GzqolG+aboEG67HTR9ypCHaHNzbdl3Ou0tZloNsRjQCKzsoRBSgO4HpmJRQsfSwXWHhBDoEbZ+Hg='}
2026-03-10T12:28:48.630 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm07.local', 'description': '/archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1029', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-10 12:27:33.996088', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:07', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGwnEUPq0ZTxwZCptlP9eJB31LCweqbR0tR+HJsWnBODj63yigx81a8M1Lzlv6uTy5OM056w/+z4shNPn8fjGr0='}
2026-03-10T12:28:48.630 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-10T12:28:48.631 INFO:teuthology.task.internal:roles: ubuntu@vm00.local - ['host.a', 'client.0', 'osd.0', 'osd.1', 'osd.2']
2026-03-10T12:28:48.631 INFO:teuthology.task.internal:roles: ubuntu@vm07.local - ['host.b', 'client.1', 'osd.3', 'osd.4', 'osd.5']
2026-03-10T12:28:48.631 INFO:teuthology.run_tasks:Running task console_log...
2026-03-10T12:28:48.636 DEBUG:teuthology.task.console_log:vm00 does not support IPMI; excluding
2026-03-10T12:28:48.641 DEBUG:teuthology.task.console_log:vm07 does not support IPMI; excluding
2026-03-10T12:28:48.641 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7f0a22476170>, signals=[15])
2026-03-10T12:28:48.641 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-10T12:28:48.642 INFO:teuthology.task.internal:Opening connections...
2026-03-10T12:28:48.642 DEBUG:teuthology.task.internal:connecting to ubuntu@vm00.local
2026-03-10T12:28:48.643 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm00.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T12:28:48.702 DEBUG:teuthology.task.internal:connecting to ubuntu@vm07.local
2026-03-10T12:28:48.702 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm07.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T12:28:48.762 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-10T12:28:48.763 DEBUG:teuthology.orchestra.run.vm00:> uname -m
2026-03-10T12:28:48.815 INFO:teuthology.orchestra.run.vm00.stdout:x86_64
2026-03-10T12:28:48.815 DEBUG:teuthology.orchestra.run.vm00:> cat /etc/os-release
2026-03-10T12:28:48.870 INFO:teuthology.orchestra.run.vm00.stdout:NAME="CentOS Stream"
2026-03-10T12:28:48.870 INFO:teuthology.orchestra.run.vm00.stdout:VERSION="9"
2026-03-10T12:28:48.870 INFO:teuthology.orchestra.run.vm00.stdout:ID="centos"
2026-03-10T12:28:48.870 INFO:teuthology.orchestra.run.vm00.stdout:ID_LIKE="rhel fedora"
2026-03-10T12:28:48.870 INFO:teuthology.orchestra.run.vm00.stdout:VERSION_ID="9"
2026-03-10T12:28:48.870 INFO:teuthology.orchestra.run.vm00.stdout:PLATFORM_ID="platform:el9"
2026-03-10T12:28:48.870 INFO:teuthology.orchestra.run.vm00.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-10T12:28:48.870 INFO:teuthology.orchestra.run.vm00.stdout:ANSI_COLOR="0;31"
2026-03-10T12:28:48.870 INFO:teuthology.orchestra.run.vm00.stdout:LOGO="fedora-logo-icon"
2026-03-10T12:28:48.870 INFO:teuthology.orchestra.run.vm00.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-10T12:28:48.870 INFO:teuthology.orchestra.run.vm00.stdout:HOME_URL="https://centos.org/"
2026-03-10T12:28:48.870 INFO:teuthology.orchestra.run.vm00.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-10T12:28:48.870 INFO:teuthology.orchestra.run.vm00.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-10T12:28:48.870 INFO:teuthology.orchestra.run.vm00.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-10T12:28:48.871 INFO:teuthology.lock.ops:Updating vm00.local on lock server
2026-03-10T12:28:48.875 DEBUG:teuthology.orchestra.run.vm07:> uname -m
2026-03-10T12:28:48.888 INFO:teuthology.orchestra.run.vm07.stdout:x86_64
2026-03-10T12:28:48.889 DEBUG:teuthology.orchestra.run.vm07:> cat /etc/os-release
2026-03-10T12:28:48.943 INFO:teuthology.orchestra.run.vm07.stdout:NAME="CentOS Stream"
2026-03-10T12:28:48.943 INFO:teuthology.orchestra.run.vm07.stdout:VERSION="9"
2026-03-10T12:28:48.943 INFO:teuthology.orchestra.run.vm07.stdout:ID="centos"
2026-03-10T12:28:48.943 INFO:teuthology.orchestra.run.vm07.stdout:ID_LIKE="rhel fedora"
2026-03-10T12:28:48.943 INFO:teuthology.orchestra.run.vm07.stdout:VERSION_ID="9"
2026-03-10T12:28:48.943 INFO:teuthology.orchestra.run.vm07.stdout:PLATFORM_ID="platform:el9"
2026-03-10T12:28:48.943 INFO:teuthology.orchestra.run.vm07.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-10T12:28:48.943 INFO:teuthology.orchestra.run.vm07.stdout:ANSI_COLOR="0;31"
2026-03-10T12:28:48.943 INFO:teuthology.orchestra.run.vm07.stdout:LOGO="fedora-logo-icon"
2026-03-10T12:28:48.943 INFO:teuthology.orchestra.run.vm07.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-10T12:28:48.943 INFO:teuthology.orchestra.run.vm07.stdout:HOME_URL="https://centos.org/"
2026-03-10T12:28:48.943 INFO:teuthology.orchestra.run.vm07.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-10T12:28:48.943 INFO:teuthology.orchestra.run.vm07.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-10T12:28:48.943 INFO:teuthology.orchestra.run.vm07.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-10T12:28:48.943 INFO:teuthology.lock.ops:Updating vm07.local on lock server
2026-03-10T12:28:48.973 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-10T12:28:48.975 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-10T12:28:48.979 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-10T12:28:48.980 DEBUG:teuthology.orchestra.run.vm00:> test '!' -e /home/ubuntu/cephtest
2026-03-10T12:28:48.981 DEBUG:teuthology.orchestra.run.vm07:> test '!' -e /home/ubuntu/cephtest
2026-03-10T12:28:48.997 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-10T12:28:48.999 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-10T12:28:48.999 DEBUG:teuthology.orchestra.run.vm00:> test -z $(ls -A /var/lib/ceph)
2026-03-10T12:28:49.039 DEBUG:teuthology.orchestra.run.vm07:> test -z $(ls -A /var/lib/ceph)
2026-03-10T12:28:49.054 INFO:teuthology.orchestra.run.vm00.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-10T12:28:49.054 INFO:teuthology.orchestra.run.vm07.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-10T12:28:49.054 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-10T12:28:49.061 DEBUG:teuthology.orchestra.run.vm00:> test -e /ceph-qa-ready
2026-03-10T12:28:49.107 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T12:28:49.289 DEBUG:teuthology.orchestra.run.vm07:> test -e /ceph-qa-ready
2026-03-10T12:28:49.304 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T12:28:49.496 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-10T12:28:49.497 INFO:teuthology.task.internal:Creating test directory...
2026-03-10T12:28:49.497 DEBUG:teuthology.orchestra.run.vm00:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-10T12:28:49.499 DEBUG:teuthology.orchestra.run.vm07:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-10T12:28:49.514 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-10T12:28:49.516 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-10T12:28:49.517 INFO:teuthology.task.internal:Creating archive directory...
2026-03-10T12:28:49.517 DEBUG:teuthology.orchestra.run.vm00:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-10T12:28:49.553 DEBUG:teuthology.orchestra.run.vm07:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-10T12:28:49.572 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-10T12:28:49.574 INFO:teuthology.task.internal:Enabling coredump saving...
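The `/var/lib/ceph` check above (`test -z $(ls -A ...)`) treats a directory as clean when `ls -A` (all entries except `.` and `..`) prints nothing; an absent directory also passes, since `ls` prints only to stderr and `test -z` sees no operand. A self-contained demonstration of the same idiom against a temporary directory:

```shell
# Detect whether a directory is empty the way the check above does.
d=$(mktemp -d)
before=$(ls -A "$d")    # empty string: nothing in the fresh dir
touch "$d/marker"
after=$(ls -A "$d")     # now contains "marker"
[ -z "$before" ] && echo "was empty"
[ -n "$after" ] && echo "now has: $after"
rm -r "$d"
```

Note the substitution in the logged command is unquoted, so a directory holding several entries would expand to multiple `test` arguments; quoting (`test -z "$(ls -A dir)"`) is the robust form.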
2026-03-10T12:28:49.574 DEBUG:teuthology.orchestra.run.vm00:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-10T12:28:49.620 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T12:28:49.620 DEBUG:teuthology.orchestra.run.vm07:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-10T12:28:49.634 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T12:28:49.635 DEBUG:teuthology.orchestra.run.vm00:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-10T12:28:49.662 DEBUG:teuthology.orchestra.run.vm07:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-10T12:28:49.686 INFO:teuthology.orchestra.run.vm00.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T12:28:49.696 INFO:teuthology.orchestra.run.vm00.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T12:28:49.700 INFO:teuthology.orchestra.run.vm07.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T12:28:49.709 INFO:teuthology.orchestra.run.vm07.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T12:28:49.710 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-10T12:28:49.712 INFO:teuthology.task.internal:Configuring sudo...
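The coredump task above sets `kernel.core_pattern` to `<archive>/coredump/%t.%p.core`; the kernel expands `%t` to the dump time as a UNIX timestamp and `%p` to the PID of the dumping process, so every core file lands in the archive with a unique, sortable name. Building the same name by hand (a sketch, using `date` and the shell's own PID as stand-ins for the kernel's substitutions):

```shell
# Mimic the kernel's %t.%p.core expansion for illustration.
ts=$(date +%s)    # what the kernel substitutes for %t
pid=$$            # what the kernel substitutes for %p
corename="${ts}.${pid}.core"
echo "$corename"
```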
2026-03-10T12:28:49.712 DEBUG:teuthology.orchestra.run.vm00:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-10T12:28:49.740 DEBUG:teuthology.orchestra.run.vm07:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-10T12:28:49.776 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-10T12:28:49.778 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-10T12:28:49.778 DEBUG:teuthology.orchestra.run.vm00:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-10T12:28:49.806 DEBUG:teuthology.orchestra.run.vm07:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-10T12:28:49.832 DEBUG:teuthology.orchestra.run.vm00:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T12:28:49.882 DEBUG:teuthology.orchestra.run.vm00:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T12:28:49.938 DEBUG:teuthology.orchestra.run.vm00:> set -ex
2026-03-10T12:28:49.938 DEBUG:teuthology.orchestra.run.vm00:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-10T12:28:49.999 DEBUG:teuthology.orchestra.run.vm07:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T12:28:50.022 DEBUG:teuthology.orchestra.run.vm07:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T12:28:50.079 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-10T12:28:50.080 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-10T12:28:50.139 DEBUG:teuthology.orchestra.run.vm00:> sudo service rsyslog restart
2026-03-10T12:28:50.140 DEBUG:teuthology.orchestra.run.vm07:> sudo service rsyslog restart
2026-03-10T12:28:50.168 INFO:teuthology.orchestra.run.vm00.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T12:28:50.206 INFO:teuthology.orchestra.run.vm07.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T12:28:50.587 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-10T12:28:50.589 INFO:teuthology.task.internal:Starting timer...
2026-03-10T12:28:50.589 INFO:teuthology.run_tasks:Running task pcp...
2026-03-10T12:28:50.591 INFO:teuthology.run_tasks:Running task selinux...
2026-03-10T12:28:50.593 DEBUG:teuthology.task:Applying overrides for task selinux: {'allowlist': ['scontext=system_u:system_r:logrotate_t:s0', 'scontext=system_u:system_r:getty_t:s0']}
2026-03-10T12:28:50.594 INFO:teuthology.task.selinux:Excluding vm00: VMs are not yet supported
2026-03-10T12:28:50.594 INFO:teuthology.task.selinux:Excluding vm07: VMs are not yet supported
2026-03-10T12:28:50.594 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-10T12:28:50.594 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-10T12:28:50.594 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-10T12:28:50.594 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-10T12:28:50.595 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-10T12:28:50.596 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/ceph/ceph-cm-ansible.git
2026-03-10T12:28:50.597 INFO:teuthology.repo_utils:Fetching github.com_ceph_ceph-cm-ansible_main from origin
2026-03-10T12:28:51.100 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-10T12:28:51.106 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-10T12:28:51.107 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventoryh2_dsj6q --limit vm00.local,vm07.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-10T12:30:41.234 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm00.local'), Remote(name='ubuntu@vm07.local')]
2026-03-10T12:30:41.234 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm00.local'
2026-03-10T12:30:41.235 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm00.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T12:30:41.299 DEBUG:teuthology.orchestra.run.vm00:> true
2026-03-10T12:30:41.376 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm00.local'
2026-03-10T12:30:41.376 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm07.local'
2026-03-10T12:30:41.376 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm07.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T12:30:41.441 DEBUG:teuthology.orchestra.run.vm07:> true
2026-03-10T12:30:41.521 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm07.local'
2026-03-10T12:30:41.521 INFO:teuthology.run_tasks:Running task clock...
2026-03-10T12:30:41.523 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-10T12:30:41.523 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-10T12:30:41.523 DEBUG:teuthology.orchestra.run.vm00:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T12:30:41.525 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-10T12:30:41.525 DEBUG:teuthology.orchestra.run.vm07:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T12:30:41.567 INFO:teuthology.orchestra.run.vm00.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-10T12:30:41.582 INFO:teuthology.orchestra.run.vm00.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
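The clock task above chains alternatives with `||` (`systemctl stop ntp.service || ... ntpd.service || ... chronyd.service`, then `ntpd -gq || chronyc makestep`): each command runs only if the previous one failed, so whichever time daemon the host actually has is the one acted on, and the "Failed to stop"/"command not found" lines that follow are the expected misses on a chrony-only CentOS 9 node. The same fallback pattern with stand-in functions (hypothetical names; `have_*` flags simulate which daemon is installed):

```shell
# Fallback chain: try ntpd first, fall back to chronyd if ntpd is absent.
have_ntpd=false
have_chronyd=true

stop_ntpd()    { $have_ntpd    && echo "stopped ntpd"; }
stop_chronyd() { $have_chronyd && echo "stopped chronyd"; }

stop_ntpd || stop_chronyd    # prints "stopped chronyd"
```

The trailing `|| true` in the logged command serves the same purpose at the end of the chain: it keeps the task from failing outright when neither status tool exists.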
2026-03-10T12:30:41.601 INFO:teuthology.orchestra.run.vm07.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-10T12:30:41.615 INFO:teuthology.orchestra.run.vm00.stderr:sudo: ntpd: command not found
2026-03-10T12:30:41.618 INFO:teuthology.orchestra.run.vm07.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-10T12:30:41.628 INFO:teuthology.orchestra.run.vm00.stdout:506 Cannot talk to daemon
2026-03-10T12:30:41.643 INFO:teuthology.orchestra.run.vm00.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-10T12:30:41.646 INFO:teuthology.orchestra.run.vm07.stderr:sudo: ntpd: command not found
2026-03-10T12:30:41.661 INFO:teuthology.orchestra.run.vm00.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-10T12:30:41.663 INFO:teuthology.orchestra.run.vm07.stdout:506 Cannot talk to daemon
2026-03-10T12:30:41.678 INFO:teuthology.orchestra.run.vm07.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-10T12:30:41.697 INFO:teuthology.orchestra.run.vm07.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-10T12:30:41.712 INFO:teuthology.orchestra.run.vm00.stderr:bash: line 1: ntpq: command not found
2026-03-10T12:30:41.716 INFO:teuthology.orchestra.run.vm00.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T12:30:41.716 INFO:teuthology.orchestra.run.vm00.stdout:===============================================================================
2026-03-10T12:30:41.716 INFO:teuthology.orchestra.run.vm00.stdout:^? cp.hypermediaa.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T12:30:41.716 INFO:teuthology.orchestra.run.vm00.stdout:^? 217.160.19.219 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T12:30:41.716 INFO:teuthology.orchestra.run.vm00.stdout:^? frank.askja.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T12:30:41.716 INFO:teuthology.orchestra.run.vm00.stdout:^? ip217-154-182-60.pbiaas.> 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T12:30:41.747 INFO:teuthology.orchestra.run.vm07.stderr:bash: line 1: ntpq: command not found
2026-03-10T12:30:41.753 INFO:teuthology.orchestra.run.vm07.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T12:30:41.753 INFO:teuthology.orchestra.run.vm07.stdout:===============================================================================
2026-03-10T12:30:41.753 INFO:teuthology.orchestra.run.vm07.stdout:^? 217.160.19.219 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T12:30:41.753 INFO:teuthology.orchestra.run.vm07.stdout:^? frank.askja.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T12:30:41.753 INFO:teuthology.orchestra.run.vm07.stdout:^? ip217-154-182-60.pbiaas.> 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T12:30:41.753 INFO:teuthology.orchestra.run.vm07.stdout:^? cp.hypermediaa.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T12:30:41.753 INFO:teuthology.run_tasks:Running task install...
2026-03-10T12:30:41.756 DEBUG:teuthology.task.install:project ceph
2026-03-10T12:30:41.756 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-10T12:30:41.756 DEBUG:teuthology.task.install:config {'exclude_packages': ['ceph-volume'], 'tag': 'v18.2.0', 'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-10T12:30:41.756 INFO:teuthology.task.install:Using flavor: default
2026-03-10T12:30:41.759 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-10T12:30:41.759 INFO:teuthology.task.install:extra packages: []
2026-03-10T12:30:41.759 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': 'v18.2.0', 'wait_for_package': False}
2026-03-10T12:30:41.759 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T12:30:41.759 INFO:teuthology.packaging:ref: None
2026-03-10T12:30:41.759 INFO:teuthology.packaging:tag: v18.2.0
2026-03-10T12:30:41.759 INFO:teuthology.packaging:branch: None
2026-03-10T12:30:41.759 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T12:30:42.356 DEBUG:teuthology.repo_utils:git ls-remote https://github.com/ceph/ceph v18.2.0^{} -> 5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-10T12:30:42.357 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-10T12:30:42.357 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': 'v18.2.0', 'wait_for_package': False}
2026-03-10T12:30:42.357 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T12:30:42.357 INFO:teuthology.packaging:ref: None
2026-03-10T12:30:42.358 INFO:teuthology.packaging:tag: v18.2.0
2026-03-10T12:30:42.358 INFO:teuthology.packaging:branch: None
2026-03-10T12:30:42.358 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T12:30:42.358 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-10T12:30:42.998 INFO:teuthology.task.install.rpm:Pulling from https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/
2026-03-10T12:30:42.999 INFO:teuthology.task.install.rpm:Package version is 18.2.0-0
2026-03-10T12:30:43.059 INFO:teuthology.task.install.rpm:Pulling from https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/
2026-03-10T12:30:43.059 INFO:teuthology.task.install.rpm:Package version is 18.2.0-0
2026-03-10T12:30:43.375 INFO:teuthology.packaging:Writing yum repo:
[ceph]
name=ceph packages for $basearch
baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md
[ceph-noarch] name=ceph noarch packages baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/noarch enabled=1 gpgcheck=0 type=rpm-md [ceph-source] name=ceph source packages baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/SRPMS enabled=1 gpgcheck=0 type=rpm-md 2026-03-10T12:30:43.375 DEBUG:teuthology.orchestra.run.vm00:> set -ex 2026-03-10T12:30:43.375 DEBUG:teuthology.orchestra.run.vm00:> sudo dd of=/etc/yum.repos.d/ceph.repo 2026-03-10T12:30:43.380 INFO:teuthology.packaging:Writing yum repo: [ceph] name=ceph packages for $basearch baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/$basearch enabled=1 gpgcheck=0 type=rpm-md [ceph-noarch] name=ceph noarch packages baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/noarch enabled=1 gpgcheck=0 type=rpm-md [ceph-source] name=ceph source packages baseurl=https://chacra.ceph.com/r/ceph/reef/5dd24139a1eada541a3bc16b6941c5dde975e26d/centos/9/flavors/default/SRPMS enabled=1 gpgcheck=0 type=rpm-md 2026-03-10T12:30:43.380 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-10T12:30:43.380 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/etc/yum.repos.d/ceph.repo 2026-03-10T12:30:43.404 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64 2026-03-10T12:30:43.404 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag 2026-03-10T12:30:43.404 
INFO:teuthology.packaging:ref: None 2026-03-10T12:30:43.404 INFO:teuthology.packaging:tag: v18.2.0 2026-03-10T12:30:43.404 INFO:teuthology.packaging:branch: None 2026-03-10T12:30:43.404 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T12:30:43.404 DEBUG:teuthology.orchestra.run.vm00:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/v18.2.0/;g' /etc/yum.repos.d/ceph.repo ; fi 2026-03-10T12:30:43.410 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64 2026-03-10T12:30:43.416 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag 2026-03-10T12:30:43.416 INFO:teuthology.packaging:ref: None 2026-03-10T12:30:43.416 INFO:teuthology.packaging:tag: v18.2.0 2026-03-10T12:30:43.416 INFO:teuthology.packaging:branch: None 2026-03-10T12:30:43.416 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T12:30:43.416 DEBUG:teuthology.orchestra.run.vm07:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/v18.2.0/;g' /etc/yum.repos.d/ceph.repo ; fi 2026-03-10T12:30:43.485 DEBUG:teuthology.orchestra.run.vm00:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig 2026-03-10T12:30:43.486 DEBUG:teuthology.orchestra.run.vm07:> sudo 
touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig 2026-03-10T12:30:43.566 DEBUG:teuthology.orchestra.run.vm00:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf 2026-03-10T12:30:43.575 DEBUG:teuthology.orchestra.run.vm07:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf 2026-03-10T12:30:43.608 INFO:teuthology.orchestra.run.vm07.stdout:check_obsoletes = 1 2026-03-10T12:30:43.610 DEBUG:teuthology.orchestra.run.vm07:> sudo yum clean all 2026-03-10T12:30:43.639 INFO:teuthology.orchestra.run.vm00.stdout:check_obsoletes = 1 2026-03-10T12:30:43.640 DEBUG:teuthology.orchestra.run.vm00:> sudo yum clean all 2026-03-10T12:30:43.846 INFO:teuthology.orchestra.run.vm07.stdout:41 files removed 2026-03-10T12:30:43.850 INFO:teuthology.orchestra.run.vm00.stdout:41 files removed 2026-03-10T12:30:43.874 DEBUG:teuthology.orchestra.run.vm00:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath 2026-03-10T12:30:43.893 DEBUG:teuthology.orchestra.run.vm07:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm 
ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath 2026-03-10T12:30:44.942 INFO:teuthology.orchestra.run.vm07.stdout:ceph packages for x86_64 91 kB/s | 76 kB 00:00 2026-03-10T12:30:44.942 INFO:teuthology.orchestra.run.vm00.stdout:ceph packages for x86_64 93 kB/s | 76 kB 00:00 2026-03-10T12:30:45.587 INFO:teuthology.orchestra.run.vm00.stdout:ceph noarch packages 15 kB/s | 9.3 kB 00:00 2026-03-10T12:30:45.588 INFO:teuthology.orchestra.run.vm07.stdout:ceph noarch packages 15 kB/s | 9.3 kB 00:00 2026-03-10T12:30:46.221 INFO:teuthology.orchestra.run.vm07.stdout:ceph source packages 3.5 kB/s | 2.2 kB 00:00 2026-03-10T12:30:46.239 INFO:teuthology.orchestra.run.vm00.stdout:ceph source packages 3.4 kB/s | 2.2 kB 00:00 2026-03-10T12:30:47.270 INFO:teuthology.orchestra.run.vm00.stdout:CentOS Stream 9 - BaseOS 8.8 MB/s | 8.9 MB 00:01 2026-03-10T12:30:47.417 INFO:teuthology.orchestra.run.vm07.stdout:CentOS Stream 9 - BaseOS 7.6 MB/s | 8.9 MB 00:01 2026-03-10T12:30:49.662 INFO:teuthology.orchestra.run.vm00.stdout:CentOS Stream 9 - AppStream 18 MB/s | 27 MB 00:01 2026-03-10T12:30:49.751 INFO:teuthology.orchestra.run.vm07.stdout:CentOS Stream 9 - AppStream 18 MB/s | 27 MB 00:01 2026-03-10T12:30:53.396 INFO:teuthology.orchestra.run.vm00.stdout:CentOS Stream 9 - CRB 21 MB/s | 8.0 MB 00:00 2026-03-10T12:30:54.110 INFO:teuthology.orchestra.run.vm07.stdout:CentOS Stream 9 - CRB 7.0 MB/s | 8.0 MB 00:01 2026-03-10T12:30:54.816 INFO:teuthology.orchestra.run.vm00.stdout:CentOS Stream 9 - Extras packages 56 kB/s | 20 kB 00:00 2026-03-10T12:30:55.624 INFO:teuthology.orchestra.run.vm00.stdout:Extra Packages for Enterprise Linux 29 MB/s | 20 MB 00:00 2026-03-10T12:30:55.637 INFO:teuthology.orchestra.run.vm07.stdout:CentOS Stream 9 - Extras packages 43 kB/s | 20 kB 00:00 2026-03-10T12:30:57.198 
INFO:teuthology.orchestra.run.vm07.stdout:Extra Packages for Enterprise Linux 14 MB/s | 20 MB 00:01 2026-03-10T12:31:01.415 INFO:teuthology.orchestra.run.vm00.stdout:lab-extras 56 kB/s | 50 kB 00:00 2026-03-10T12:31:02.651 INFO:teuthology.orchestra.run.vm07.stdout:lab-extras 65 kB/s | 50 kB 00:00 2026-03-10T12:31:03.193 INFO:teuthology.orchestra.run.vm00.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-10T12:31:03.193 INFO:teuthology.orchestra.run.vm00.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-10T12:31:03.198 INFO:teuthology.orchestra.run.vm00.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed. 2026-03-10T12:31:03.199 INFO:teuthology.orchestra.run.vm00.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed. 2026-03-10T12:31:03.234 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved. 2026-03-10T12:31:03.238 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================ 2026-03-10T12:31:03.238 INFO:teuthology.orchestra.run.vm00.stdout: Package Arch Version Repository Size 2026-03-10T12:31:03.238 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================ 2026-03-10T12:31:03.238 INFO:teuthology.orchestra.run.vm00.stdout:Installing: 2026-03-10T12:31:03.238 INFO:teuthology.orchestra.run.vm00.stdout: ceph x86_64 2:18.2.0-0.el9 ceph 6.4 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: ceph-base x86_64 2:18.2.0-0.el9 ceph 5.2 M 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 ceph 835 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 ceph 142 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 ceph 1.4 M 2026-03-10T12:31:03.239 
INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 ceph-noarch 127 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-dashboard noarch 2:18.2.0-0.el9 ceph-noarch 1.7 M 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 ceph-noarch 7.4 M 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 ceph-noarch 47 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 ceph 7.6 M 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: ceph-test x86_64 2:18.2.0-0.el9 ceph 40 M 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: cephadm noarch 2:18.2.0-0.el9 ceph-noarch 209 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 ceph 30 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 ceph 653 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: librados-devel x86_64 2:18.2.0-0.el9 ceph 126 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 ceph 155 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: python3-rados x86_64 2:18.2.0-0.el9 ceph 321 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: python3-rbd x86_64 2:18.2.0-0.el9 ceph 297 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: python3-rgw x86_64 2:18.2.0-0.el9 ceph 99 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: rbd-fuse x86_64 2:18.2.0-0.el9 ceph 86 k 2026-03-10T12:31:03.239 
INFO:teuthology.orchestra.run.vm00.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 ceph 3.0 M 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 ceph 169 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout:Upgrading: 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: librados2 x86_64 2:18.2.0-0.el9 ceph 3.3 M 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: librbd1 x86_64 2:18.2.0-0.el9 ceph 3.0 M 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout:Installing dependencies: 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: ceph-common x86_64 2:18.2.0-0.el9 ceph 18 M 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 ceph-noarch 23 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mds x86_64 2:18.2.0-0.el9 ceph 2.1 M 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 ceph-noarch 240 k 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mon x86_64 2:18.2.0-0.el9 ceph 4.4 M 2026-03-10T12:31:03.239 INFO:teuthology.orchestra.run.vm00.stdout: ceph-osd x86_64 2:18.2.0-0.el9 ceph 18 M 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 ceph-noarch 15 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 ceph 24 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 
k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 ceph 161 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 ceph 474 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: librgw2 x86_64 2:18.2.0-0.el9 ceph 4.4 M 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k 
2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k 2026-03-10T12:31:03.240 INFO:teuthology.orchestra.run.vm00.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 ceph 45 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 ceph 119 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k 2026-03-10T12:31:03.241 
INFO:teuthology.orchestra.run.vm00.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-jwt noarch 2.4.0-1.el9 epel 41 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k 
2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k 2026-03-10T12:31:03.241 INFO:teuthology.orchestra.run.vm00.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k 2026-03-10T12:31:03.242 
INFO:teuthology.orchestra.run.vm00.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout:Installing weak dependencies: 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 epel 9.0 k 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout:Transaction Summary 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================ 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout:Install 117 Packages 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout:Upgrade 2 Packages 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout: 
2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout:Total download size: 182 M 2026-03-10T12:31:03.242 INFO:teuthology.orchestra.run.vm00.stdout:Downloading Packages: 2026-03-10T12:31:04.288 INFO:teuthology.orchestra.run.vm07.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-10T12:31:04.292 INFO:teuthology.orchestra.run.vm07.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed. 2026-03-10T12:31:04.298 INFO:teuthology.orchestra.run.vm07.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed. 2026-03-10T12:31:04.299 INFO:teuthology.orchestra.run.vm07.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed. 2026-03-10T12:31:04.302 INFO:teuthology.orchestra.run.vm00.stdout:(1/119): ceph-18.2.0-0.el9.x86_64.rpm 21 kB/s | 6.4 kB 00:00 2026-03-10T12:31:04.342 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved. 2026-03-10T12:31:04.347 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-10T12:31:04.347 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size 2026-03-10T12:31:04.347 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-10T12:31:04.347 INFO:teuthology.orchestra.run.vm07.stdout:Installing: 2026-03-10T12:31:04.347 INFO:teuthology.orchestra.run.vm07.stdout: ceph x86_64 2:18.2.0-0.el9 ceph 6.4 k 2026-03-10T12:31:04.347 INFO:teuthology.orchestra.run.vm07.stdout: ceph-base x86_64 2:18.2.0-0.el9 ceph 5.2 M 2026-03-10T12:31:04.347 INFO:teuthology.orchestra.run.vm07.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 ceph 835 k 2026-03-10T12:31:04.347 INFO:teuthology.orchestra.run.vm07.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 ceph 142 k 2026-03-10T12:31:04.347 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 ceph 1.4 M 2026-03-10T12:31:04.347 
INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 ceph-noarch 127 k 2026-03-10T12:31:04.347 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-dashboard noarch 2:18.2.0-0.el9 ceph-noarch 1.7 M 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 ceph-noarch 7.4 M 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 ceph-noarch 47 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 ceph 7.6 M 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: ceph-test x86_64 2:18.2.0-0.el9 ceph 40 M 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: cephadm noarch 2:18.2.0-0.el9 ceph-noarch 209 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 ceph 30 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 ceph 653 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: librados-devel x86_64 2:18.2.0-0.el9 ceph 126 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 ceph 155 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: python3-rados x86_64 2:18.2.0-0.el9 ceph 321 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: python3-rbd x86_64 2:18.2.0-0.el9 ceph 297 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: python3-rgw x86_64 2:18.2.0-0.el9 ceph 99 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: rbd-fuse x86_64 2:18.2.0-0.el9 ceph 86 k 2026-03-10T12:31:04.348 
INFO:teuthology.orchestra.run.vm07.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 ceph 3.0 M 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 ceph 169 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout:Upgrading: 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: librados2 x86_64 2:18.2.0-0.el9 ceph 3.3 M 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: librbd1 x86_64 2:18.2.0-0.el9 ceph 3.0 M 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout:Installing dependencies: 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: ceph-common x86_64 2:18.2.0-0.el9 ceph 18 M 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 ceph-noarch 23 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mds x86_64 2:18.2.0-0.el9 ceph 2.1 M 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 ceph-noarch 240 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mon x86_64 2:18.2.0-0.el9 ceph 4.4 M 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: ceph-osd x86_64 2:18.2.0-0.el9 ceph 18 M 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 ceph-noarch 15 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 ceph 24 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M 2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 
k
2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k
2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k
2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k
2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M
2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k
2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 ceph 161 k
2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k
2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k
2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k
2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k
2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k
2026-03-10T12:31:04.348 INFO:teuthology.orchestra.run.vm07.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 ceph 474 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: librgw2 x86_64 2:18.2.0-0.el9 ceph 4.4 M
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 ceph 45 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 ceph 119 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-jwt noarch 2.4.0-1.el9 epel 41 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k
2026-03-10T12:31:04.349 INFO:teuthology.orchestra.run.vm07.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout:Installing weak dependencies:
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 epel 9.0 k
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout:Install 117 Packages
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout:Upgrade 2 Packages
2026-03-10T12:31:04.350 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:31:04.353 INFO:teuthology.orchestra.run.vm07.stdout:Total download size: 182 M
2026-03-10T12:31:04.354 INFO:teuthology.orchestra.run.vm07.stdout:Downloading Packages:
2026-03-10T12:31:05.008 INFO:teuthology.orchestra.run.vm00.stdout:(2/119): ceph-fuse-18.2.0-0.el9.x86_64.rpm 1.2 MB/s | 835 kB 00:00
2026-03-10T12:31:05.109 INFO:teuthology.orchestra.run.vm00.stdout:(3/119): ceph-immutable-object-cache-18.2.0-0.e 1.4 MB/s | 142 kB 00:00
2026-03-10T12:31:05.301 INFO:teuthology.orchestra.run.vm00.stdout:(4/119): ceph-base-18.2.0-0.el9.x86_64.rpm 4.0 MB/s | 5.2 MB 00:01
2026-03-10T12:31:05.385 INFO:teuthology.orchestra.run.vm07.stdout:(1/119): ceph-18.2.0-0.el9.x86_64.rpm 20 kB/s | 6.4 kB 00:00
2026-03-10T12:31:05.608 INFO:teuthology.orchestra.run.vm00.stdout:(5/119): ceph-mgr-18.2.0-0.el9.x86_64.rpm 4.7 MB/s | 1.4 MB 00:00
2026-03-10T12:31:05.621 INFO:teuthology.orchestra.run.vm00.stdout:(6/119): ceph-mds-18.2.0-0.el9.x86_64.rpm 4.1 MB/s | 2.1 MB 00:00
2026-03-10T12:31:06.004 INFO:teuthology.orchestra.run.vm07.stdout:(2/119): ceph-fuse-18.2.0-0.el9.x86_64.rpm 1.3 MB/s | 835 kB 00:00
2026-03-10T12:31:06.109 INFO:teuthology.orchestra.run.vm07.stdout:(3/119): ceph-immutable-object-cache-18.2.0-0.e 1.3 MB/s | 142 kB 00:00
2026-03-10T12:31:06.412 INFO:teuthology.orchestra.run.vm00.stdout:(7/119): ceph-mon-18.2.0-0.el9.x86_64.rpm 5.5 MB/s | 4.4 MB 00:00
2026-03-10T12:31:06.466 INFO:teuthology.orchestra.run.vm07.stdout:(4/119): ceph-mds-18.2.0-0.el9.x86_64.rpm 5.9 MB/s | 2.1 MB 00:00
2026-03-10T12:31:06.672 INFO:teuthology.orchestra.run.vm07.stdout:(5/119): ceph-base-18.2.0-0.el9.x86_64.rpm 3.2 MB/s | 5.2 MB 00:01
2026-03-10T12:31:06.791 INFO:teuthology.orchestra.run.vm07.stdout:(6/119): ceph-mgr-18.2.0-0.el9.x86_64.rpm 4.4 MB/s | 1.4 MB 00:00
2026-03-10T12:31:07.631 INFO:teuthology.orchestra.run.vm00.stdout:(8/119): ceph-radosgw-18.2.0-0.el9.x86_64.rpm 6.3 MB/s | 7.6 MB 00:01
2026-03-10T12:31:07.680 INFO:teuthology.orchestra.run.vm07.stdout:(7/119): ceph-mon-18.2.0-0.el9.x86_64.rpm 4.4 MB/s | 4.4 MB 00:01
2026-03-10T12:31:07.733 INFO:teuthology.orchestra.run.vm00.stdout:(9/119): ceph-selinux-18.2.0-0.el9.x86_64.rpm 237 kB/s | 24 kB 00:00
2026-03-10T12:31:08.796 INFO:teuthology.orchestra.run.vm07.stdout:(8/119): ceph-common-18.2.0-0.el9.x86_64.rpm 4.9 MB/s | 18 MB 00:03
2026-03-10T12:31:08.898 INFO:teuthology.orchestra.run.vm07.stdout:(9/119): ceph-selinux-18.2.0-0.el9.x86_64.rpm 236 kB/s | 24 kB 00:00
2026-03-10T12:31:09.293 INFO:teuthology.orchestra.run.vm07.stdout:(10/119): ceph-osd-18.2.0-0.el9.x86_64.rpm 7.0 MB/s | 18 MB 00:02
2026-03-10T12:31:09.321 INFO:teuthology.orchestra.run.vm07.stdout:(11/119): ceph-radosgw-18.2.0-0.el9.x86_64.rpm 4.7 MB/s | 7.6 MB 00:01
2026-03-10T12:31:09.396 INFO:teuthology.orchestra.run.vm07.stdout:(12/119): libcephfs-devel-18.2.0-0.el9.x86_64.r 297 kB/s | 30 kB 00:00
2026-03-10T12:31:09.501 INFO:teuthology.orchestra.run.vm07.stdout:(13/119): libcephsqlite-18.2.0-0.el9.x86_64.rpm 1.5 MB/s | 161 kB 00:00
2026-03-10T12:31:09.528 INFO:teuthology.orchestra.run.vm07.stdout:(14/119): libcephfs2-18.2.0-0.el9.x86_64.rpm 3.1 MB/s | 653 kB 00:00
2026-03-10T12:31:09.606 INFO:teuthology.orchestra.run.vm07.stdout:(15/119): librados-devel-18.2.0-0.el9.x86_64.rp 1.2 MB/s | 126 kB 00:00
2026-03-10T12:31:09.714 INFO:teuthology.orchestra.run.vm00.stdout:(10/119): ceph-common-18.2.0-0.el9.x86_64.rpm 3.2 MB/s | 18 MB 00:05
2026-03-10T12:31:09.728 INFO:teuthology.orchestra.run.vm07.stdout:(16/119): libradosstriper1-18.2.0-0.el9.x86_64. 2.3 MB/s | 474 kB 00:00
2026-03-10T12:31:09.828 INFO:teuthology.orchestra.run.vm07.stdout:(17/119): python3-ceph-argparse-18.2.0-0.el9.x8 450 kB/s | 45 kB 00:00
2026-03-10T12:31:09.928 INFO:teuthology.orchestra.run.vm07.stdout:(18/119): python3-ceph-common-18.2.0-0.el9.x86_ 1.2 MB/s | 119 kB 00:00
2026-03-10T12:31:09.971 INFO:teuthology.orchestra.run.vm00.stdout:(11/119): ceph-osd-18.2.0-0.el9.x86_64.rpm 4.0 MB/s | 18 MB 00:04
2026-03-10T12:31:10.029 INFO:teuthology.orchestra.run.vm07.stdout:(19/119): python3-cephfs-18.2.0-0.el9.x86_64.rp 1.5 MB/s | 155 kB 00:00
2026-03-10T12:31:10.132 INFO:teuthology.orchestra.run.vm07.stdout:(20/119): python3-rados-18.2.0-0.el9.x86_64.rpm 3.1 MB/s | 321 kB 00:00
2026-03-10T12:31:10.173 INFO:teuthology.orchestra.run.vm00.stdout:(12/119): libcephfs2-18.2.0-0.el9.x86_64.rpm 3.2 MB/s | 653 kB 00:00
2026-03-10T12:31:10.236 INFO:teuthology.orchestra.run.vm07.stdout:(21/119): python3-rbd-18.2.0-0.el9.x86_64.rpm 2.8 MB/s | 297 kB 00:00
2026-03-10T12:31:10.275 INFO:teuthology.orchestra.run.vm00.stdout:(13/119): libcephsqlite-18.2.0-0.el9.x86_64.rpm 1.5 MB/s | 161 kB 00:00
2026-03-10T12:31:10.337 INFO:teuthology.orchestra.run.vm07.stdout:(22/119): python3-rgw-18.2.0-0.el9.x86_64.rpm 984 kB/s | 99 kB 00:00
2026-03-10T12:31:10.376 INFO:teuthology.orchestra.run.vm00.stdout:(14/119): librados-devel-18.2.0-0.el9.x86_64.rp 1.2 MB/s | 126 kB 00:00
2026-03-10T12:31:10.423 INFO:teuthology.orchestra.run.vm00.stdout:(15/119): libcephfs-devel-18.2.0-0.el9.x86_64.r 43 kB/s | 30 kB 00:00
2026-03-10T12:31:10.437 INFO:teuthology.orchestra.run.vm07.stdout:(23/119): rbd-fuse-18.2.0-0.el9.x86_64.rpm 855 kB/s | 86 kB 00:00
2026-03-10T12:31:10.538 INFO:teuthology.orchestra.run.vm07.stdout:(24/119): librgw2-18.2.0-0.el9.x86_64.rpm 4.7 MB/s | 4.4 MB 00:00
2026-03-10T12:31:10.646 INFO:teuthology.orchestra.run.vm07.stdout:(25/119): rbd-nbd-18.2.0-0.el9.x86_64.rpm 1.5 MB/s | 169 kB 00:00
2026-03-10T12:31:10.673 INFO:teuthology.orchestra.run.vm00.stdout:(16/119): libradosstriper1-18.2.0-0.el9.x86_64. 1.6 MB/s | 474 kB 00:00
2026-03-10T12:31:10.749 INFO:teuthology.orchestra.run.vm07.stdout:(26/119): ceph-grafana-dashboards-18.2.0-0.el9. 225 kB/s | 23 kB 00:00
2026-03-10T12:31:10.774 INFO:teuthology.orchestra.run.vm00.stdout:(17/119): python3-ceph-argparse-18.2.0-0.el9.x8 446 kB/s | 45 kB 00:00
2026-03-10T12:31:10.854 INFO:teuthology.orchestra.run.vm07.stdout:(27/119): ceph-mgr-cephadm-18.2.0-0.el9.noarch. 1.2 MB/s | 127 kB 00:00
2026-03-10T12:31:10.875 INFO:teuthology.orchestra.run.vm00.stdout:(18/119): python3-ceph-common-18.2.0-0.el9.x86_ 1.2 MB/s | 119 kB 00:00
2026-03-10T12:31:10.976 INFO:teuthology.orchestra.run.vm00.stdout:(19/119): python3-cephfs-18.2.0-0.el9.x86_64.rp 1.5 MB/s | 155 kB 00:00
2026-03-10T12:31:11.176 INFO:teuthology.orchestra.run.vm00.stdout:(20/119): python3-rados-18.2.0-0.el9.x86_64.rpm 1.6 MB/s | 321 kB 00:00
2026-03-10T12:31:11.243 INFO:teuthology.orchestra.run.vm07.stdout:(28/119): rbd-mirror-18.2.0-0.el9.x86_64.rpm 3.7 MB/s | 3.0 MB 00:00
2026-03-10T12:31:11.275 INFO:teuthology.orchestra.run.vm07.stdout:(29/119): ceph-mgr-dashboard-18.2.0-0.el9.noarc 4.0 MB/s | 1.7 MB 00:00
2026-03-10T12:31:11.279 INFO:teuthology.orchestra.run.vm00.stdout:(21/119): python3-rbd-18.2.0-0.el9.x86_64.rpm 2.8 MB/s | 297 kB 00:00
2026-03-10T12:31:11.381 INFO:teuthology.orchestra.run.vm00.stdout:(22/119): python3-rgw-18.2.0-0.el9.x86_64.rpm 983 kB/s | 99 kB 00:00
2026-03-10T12:31:11.381 INFO:teuthology.orchestra.run.vm07.stdout:(30/119): ceph-mgr-modules-core-18.2.0-0.el9.no 2.2 MB/s | 240 kB 00:00
2026-03-10T12:31:11.481 INFO:teuthology.orchestra.run.vm00.stdout:(23/119): rbd-fuse-18.2.0-0.el9.x86_64.rpm 857 kB/s | 86 kB 00:00
2026-03-10T12:31:11.484 INFO:teuthology.orchestra.run.vm07.stdout:(31/119): ceph-mgr-rook-18.2.0-0.el9.noarch.rpm 460 kB/s | 47 kB 00:00
2026-03-10T12:31:11.587 INFO:teuthology.orchestra.run.vm07.stdout:(32/119): ceph-prometheus-alerts-18.2.0-0.el9.n 142 kB/s | 15 kB 00:00
2026-03-10T12:31:11.693 INFO:teuthology.orchestra.run.vm07.stdout:(33/119): cephadm-18.2.0-0.el9.noarch.rpm 1.9 MB/s | 209 kB 00:00
2026-03-10T12:31:11.782 INFO:teuthology.orchestra.run.vm07.stdout:(34/119): ledmon-libs-1.1.0-3.el9.x86_64.rpm 455 kB/s | 40 kB 00:00
2026-03-10T12:31:11.851 INFO:teuthology.orchestra.run.vm07.stdout:(35/119): libconfig-1.7.2-9.el9.x86_64.rpm 1.0 MB/s | 72 kB 00:00
2026-03-10T12:31:11.956 INFO:teuthology.orchestra.run.vm07.stdout:(36/119): libgfortran-11.5.0-14.el9.x86_64.rpm 7.4 MB/s | 794 kB 00:00
2026-03-10T12:31:12.013 INFO:teuthology.orchestra.run.vm07.stdout:(37/119): libquadmath-11.5.0-14.el9.x86_64.rpm 3.2 MB/s | 184 kB 00:00
2026-03-10T12:31:12.043 INFO:teuthology.orchestra.run.vm07.stdout:(38/119): mailcap-2.1.49-5.el9.noarch.rpm 1.1 MB/s | 33 kB 00:00
2026-03-10T12:31:12.078 INFO:teuthology.orchestra.run.vm07.stdout:(39/119): python3-cffi-1.14.5-5.el9.x86_64.rpm 7.2 MB/s | 253 kB 00:00
2026-03-10T12:31:12.153 INFO:teuthology.orchestra.run.vm07.stdout:(40/119): python3-cryptography-36.0.1-5.el9.x86 17 MB/s | 1.2 MB 00:00
2026-03-10T12:31:12.185 INFO:teuthology.orchestra.run.vm07.stdout:(41/119): python3-ply-3.11-14.el9.noarch.rpm 3.3 MB/s | 106 kB 00:00
2026-03-10T12:31:12.217 INFO:teuthology.orchestra.run.vm07.stdout:(42/119): python3-pycparser-2.20-6.el9.noarch.r 4.1 MB/s | 135 kB 00:00
2026-03-10T12:31:12.250 INFO:teuthology.orchestra.run.vm07.stdout:(43/119): python3-requests-2.25.1-10.el9.noarch 3.9 MB/s | 126 kB 00:00
2026-03-10T12:31:12.283 INFO:teuthology.orchestra.run.vm07.stdout:(44/119): python3-urllib3-1.26.5-7.el9.noarch.r 6.4 MB/s | 218 kB 00:00
2026-03-10T12:31:12.404 INFO:teuthology.orchestra.run.vm07.stdout:(45/119): boost-program-options-1.75.0-13.el9.x 864 kB/s | 104 kB 00:00
2026-03-10T12:31:12.404 INFO:teuthology.orchestra.run.vm00.stdout:(24/119): librgw2-18.2.0-0.el9.x86_64.rpm 2.2 MB/s | 4.4 MB 00:01
2026-03-10T12:31:12.442 INFO:teuthology.orchestra.run.vm07.stdout:(46/119): flexiblas-3.0.4-9.el9.x86_64.rpm 790 kB/s | 30 kB 00:00
2026-03-10T12:31:12.479 INFO:teuthology.orchestra.run.vm00.stdout:(25/119): rbd-mirror-18.2.0-0.el9.x86_64.rpm 3.0 MB/s | 3.0 MB 00:00
2026-03-10T12:31:12.507 INFO:teuthology.orchestra.run.vm00.stdout:(26/119): rbd-nbd-18.2.0-0.el9.x86_64.rpm 1.6 MB/s | 169 kB 00:00
2026-03-10T12:31:12.581 INFO:teuthology.orchestra.run.vm00.stdout:(27/119): ceph-grafana-dashboards-18.2.0-0.el9. 226 kB/s | 23 kB 00:00
2026-03-10T12:31:12.609 INFO:teuthology.orchestra.run.vm00.stdout:(28/119): ceph-mgr-cephadm-18.2.0-0.el9.noarch. 1.2 MB/s | 127 kB 00:00
2026-03-10T12:31:12.679 INFO:teuthology.orchestra.run.vm07.stdout:(47/119): flexiblas-netlib-3.0.4-9.el9.x86_64.r 13 MB/s | 3.0 MB 00:00
2026-03-10T12:31:12.717 INFO:teuthology.orchestra.run.vm07.stdout:(48/119): flexiblas-openblas-openmp-3.0.4-9.el9 394 kB/s | 15 kB 00:00
2026-03-10T12:31:12.802 INFO:teuthology.orchestra.run.vm07.stdout:(49/119): libpmemobj-1.12.1-1.el9.x86_64.rpm 1.9 MB/s | 160 kB 00:00
2026-03-10T12:31:12.848 INFO:teuthology.orchestra.run.vm07.stdout:(50/119): librabbitmq-0.11.0-7.el9.x86_64.rpm 1.0 MB/s | 45 kB 00:00
2026-03-10T12:31:12.908 INFO:teuthology.orchestra.run.vm07.stdout:(51/119): librdkafka-1.6.1-102.el9.x86_64.rpm 11 MB/s | 662 kB 00:00
2026-03-10T12:31:12.952 INFO:teuthology.orchestra.run.vm07.stdout:(52/119): libstoragemgmt-1.10.1-1.el9.x86_64.rp 5.5 MB/s | 246 kB 00:00
2026-03-10T12:31:12.977 INFO:teuthology.orchestra.run.vm07.stdout:(53/119): libxslt-1.1.34-12.el9.x86_64.rpm 9.3 MB/s | 233 kB 00:00
2026-03-10T12:31:13.045 INFO:teuthology.orchestra.run.vm07.stdout:(54/119): ceph-mgr-diskprediction-local-18.2.0- 4.1 MB/s | 7.4 MB 00:01
2026-03-10T12:31:13.180 INFO:teuthology.orchestra.run.vm00.stdout:(29/119): ceph-mgr-dashboard-18.2.0-0.el9.noarc 2.8 MB/s | 1.7 MB 00:00
2026-03-10T12:31:13.206 INFO:teuthology.orchestra.run.vm07.stdout:(55/119): lttng-ust-2.12.0-6.el9.x86_64.rpm 1.2 MB/s | 292 kB 00:00
2026-03-10T12:31:13.284 INFO:teuthology.orchestra.run.vm00.stdout:(30/119): ceph-mgr-modules-core-18.2.0-0.el9.no 2.3 MB/s | 240 kB 00:00
2026-03-10T12:31:13.315 INFO:teuthology.orchestra.run.vm07.stdout:(56/119): openblas-0.3.29-1.el9.x86_64.rpm 156 kB/s | 42 kB 00:00
2026-03-10T12:31:13.384 INFO:teuthology.orchestra.run.vm00.stdout:(31/119): ceph-mgr-rook-18.2.0-0.el9.noarch.rpm 475 kB/s | 47 kB 00:00
2026-03-10T12:31:13.483 INFO:teuthology.orchestra.run.vm00.stdout:(32/119): ceph-prometheus-alerts-18.2.0-0.el9.n 147 kB/s | 15 kB 00:00
2026-03-10T12:31:13.556 INFO:teuthology.orchestra.run.vm07.stdout:(57/119): openblas-openmp-0.3.29-1.el9.x86_64.r 15 MB/s | 5.3 MB 00:00
2026-03-10T12:31:13.585 INFO:teuthology.orchestra.run.vm00.stdout:(33/119): cephadm-18.2.0-0.el9.noarch.rpm 2.0 MB/s | 209 kB 00:00
2026-03-10T12:31:13.609 INFO:teuthology.orchestra.run.vm00.stdout:(34/119): ledmon-libs-1.1.0-3.el9.x86_64.rpm 1.7 MB/s | 40 kB 00:00
2026-03-10T12:31:13.642 INFO:teuthology.orchestra.run.vm07.stdout:(58/119): python3-devel-3.9.25-3.el9.x86_64.rpm 2.8 MB/s | 244 kB 00:00
2026-03-10T12:31:13.660 INFO:teuthology.orchestra.run.vm00.stdout:(35/119): libconfig-1.7.2-9.el9.x86_64.rpm 1.4 MB/s | 72 kB 00:00
2026-03-10T12:31:13.677 INFO:teuthology.orchestra.run.vm07.stdout:(59/119): python3-babel-2.9.1-2.el9.noarch.rpm 16 MB/s | 6.0 MB 00:00
2026-03-10T12:31:13.703 INFO:teuthology.orchestra.run.vm07.stdout:(60/119): python3-jinja2-2.11.3-8.el9.noarch.rp 3.9 MB/s | 249 kB 00:00
2026-03-10T12:31:13.709 INFO:teuthology.orchestra.run.vm07.stdout:(61/119): python3-jmespath-1.0.1-1.el9.noarch.r 1.4 MB/s | 48 kB 00:00
2026-03-10T12:31:13.719 INFO:teuthology.orchestra.run.vm00.stdout:(36/119): libgfortran-11.5.0-14.el9.x86_64.rpm 13 MB/s | 794 kB 00:00
2026-03-10T12:31:13.756 INFO:teuthology.orchestra.run.vm00.stdout:(37/119): libquadmath-11.5.0-14.el9.x86_64.rpm 4.9 MB/s | 184 kB 00:00
2026-03-10T12:31:13.760 INFO:teuthology.orchestra.run.vm00.stdout:(38/119): mailcap-2.1.49-5.el9.noarch.rpm 8.6 MB/s | 33 kB 00:00
2026-03-10T12:31:13.760 INFO:teuthology.orchestra.run.vm07.stdout:(62/119): python3-libstoragemgmt-1.10.1-1.el9.x 3.1 MB/s | 177 kB 00:00
2026-03-10T12:31:13.773 INFO:teuthology.orchestra.run.vm07.stdout:(63/119): python3-mako-1.1.4-6.el9.noarch.rpm 2.7 MB/s | 172 kB 00:00
2026-03-10T12:31:13.791 INFO:teuthology.orchestra.run.vm00.stdout:(39/119): python3-cffi-1.14.5-5.el9.x86_64.rpm 8.0 MB/s | 253 kB 00:00
2026-03-10T12:31:13.810 INFO:teuthology.orchestra.run.vm07.stdout:(64/119): python3-markupsafe-1.1.1-12.el9.x86_6 706 kB/s | 35 kB 00:00
2026-03-10T12:31:13.838 INFO:teuthology.orchestra.run.vm00.stdout:(40/119): python3-cryptography-36.0.1-5.el9.x86 27 MB/s | 1.2 MB 00:00
2026-03-10T12:31:13.851 INFO:teuthology.orchestra.run.vm00.stdout:(41/119): python3-ply-3.11-14.el9.noarch.rpm 7.9 MB/s | 106 kB 00:00
2026-03-10T12:31:13.869 INFO:teuthology.orchestra.run.vm00.stdout:(42/119): python3-pycparser-2.20-6.el9.noarch.r 7.6 MB/s | 135 kB 00:00
2026-03-10T12:31:13.876 INFO:teuthology.orchestra.run.vm00.stdout:(43/119): python3-requests-2.25.1-10.el9.noarch 19 MB/s | 126 kB 00:00
2026-03-10T12:31:13.883 INFO:teuthology.orchestra.run.vm00.stdout:(44/119): python3-urllib3-1.26.5-7.el9.noarch.r 30 MB/s | 218 kB 00:00
2026-03-10T12:31:13.897 INFO:teuthology.orchestra.run.vm07.stdout:(65/119): python3-numpy-f2py-1.23.5-2.el9.x86_6 5.0 MB/s | 442 kB 00:00
2026-03-10T12:31:13.976 INFO:teuthology.orchestra.run.vm07.stdout:(66/119): python3-pyasn1-0.4.8-7.el9.noarch.rpm 1.9 MB/s | 157 kB 00:00
2026-03-10T12:31:14.024 INFO:teuthology.orchestra.run.vm07.stdout:(67/119): python3-pyasn1-modules-0.4.8-7.el9.no 5.7 MB/s | 277 kB 00:00
2026-03-10T12:31:14.070 INFO:teuthology.orchestra.run.vm07.stdout:(68/119): python3-requests-oauthlib-1.3.0-12.el 1.1 MB/s | 54 kB 00:00
2026-03-10T12:31:14.118 INFO:teuthology.orchestra.run.vm07.stdout:(69/119): python3-numpy-1.23.5-2.el9.x86_64.rpm 18 MB/s | 6.1 MB 00:00
2026-03-10T12:31:14.153 INFO:teuthology.orchestra.run.vm07.stdout:(70/119): python3-toml-0.10.2-6.el9.noarch.rpm 1.2 MB/s | 42 kB 00:00
2026-03-10T12:31:14.215 INFO:teuthology.orchestra.run.vm07.stdout:(71/119): socat-1.7.4.1-8.el9.x86_64.rpm 4.8 MB/s | 303 kB 00:00
2026-03-10T12:31:14.257 INFO:teuthology.orchestra.run.vm07.stdout:(72/119): xmlstarlet-1.6.1-20.el9.x86_64.rpm 1.5 MB/s | 64 kB 00:00
2026-03-10T12:31:14.272 INFO:teuthology.orchestra.run.vm07.stdout:(73/119): fmt-8.1.1-5.el9.x86_64.rpm 7.2 MB/s | 111 kB 00:00
2026-03-10T12:31:14.289 INFO:teuthology.orchestra.run.vm07.stdout:(74/119): gperftools-libs-2.9.1-3.el9.x86_64.rp 18 MB/s | 308 kB 00:00
2026-03-10T12:31:14.325 INFO:teuthology.orchestra.run.vm00.stdout:(45/119): boost-program-options-1.75.0-13.el9.x 236 kB/s | 104 kB 00:00
2026-03-10T12:31:14.439 INFO:teuthology.orchestra.run.vm07.stdout:(75/119): libarrow-9.0.0-15.el9.x86_64.rpm 30 MB/s | 4.4 MB 00:00
2026-03-10T12:31:14.467 INFO:teuthology.orchestra.run.vm07.stdout:(76/119): libarrow-doc-9.0.0-15.el9.noarch.rpm 908 kB/s | 25 kB 00:00
2026-03-10T12:31:14.467 INFO:teuthology.orchestra.run.vm00.stdout:(46/119): ceph-test-18.2.0-0.el9.x86_64.rpm 5.9 MB/s | 40 MB 00:06
2026-03-10T12:31:14.468 INFO:teuthology.orchestra.run.vm00.stdout:(47/119): flexiblas-3.0.4-9.el9.x86_64.rpm 206 kB/s | 30 kB 00:00
2026-03-10T12:31:14.469 INFO:teuthology.orchestra.run.vm07.stdout:(77/119): liboath-2.6.12-1.el9.x86_64.rpm 19 MB/s | 49 kB 00:00
2026-03-10T12:31:14.472 INFO:teuthology.orchestra.run.vm07.stdout:(78/119): libunwind-1.6.2-1.el9.x86_64.rpm 23 MB/s | 67 kB 00:00
2026-03-10T12:31:14.487 INFO:teuthology.orchestra.run.vm07.stdout:(79/119): parquet-libs-9.0.0-15.el9.x86_64.rpm 58 MB/s | 838 kB 00:00
2026-03-10T12:31:14.501 INFO:teuthology.orchestra.run.vm07.stdout:(80/119): python3-asyncssh-2.13.2-5.el9.noarch. 37 MB/s | 548 kB 00:00
2026-03-10T12:31:14.504 INFO:teuthology.orchestra.run.vm07.stdout:(81/119): python3-autocommand-2.2.2-8.el9.noarc 10 MB/s | 29 kB 00:00
2026-03-10T12:31:14.507 INFO:teuthology.orchestra.run.vm07.stdout:(82/119): python3-backports-tarfile-1.2.0-1.el9 22 MB/s | 60 kB 00:00
2026-03-10T12:31:14.510 INFO:teuthology.orchestra.run.vm07.stdout:(83/119): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 15 MB/s | 43 kB 00:00
2026-03-10T12:31:14.513 INFO:teuthology.orchestra.run.vm07.stdout:(84/119): python3-cachetools-4.2.4-1.el9.noarch 12 MB/s | 32 kB 00:00
2026-03-10T12:31:14.516 INFO:teuthology.orchestra.run.vm07.stdout:(85/119): python3-certifi-2023.05.07-4.el9.noar 4.0 MB/s | 14 kB 00:00
2026-03-10T12:31:14.524 INFO:teuthology.orchestra.run.vm07.stdout:(86/119): python3-cheroot-10.0.1-4.el9.noarch.r 22 MB/s | 173 kB 00:00
2026-03-10T12:31:14.534 INFO:teuthology.orchestra.run.vm07.stdout:(87/119): python3-cherrypy-18.6.1-2.el9.noarch. 37 MB/s | 358 kB 00:00
2026-03-10T12:31:14.539 INFO:teuthology.orchestra.run.vm07.stdout:(88/119): python3-google-auth-2.45.0-1.el9.noar 47 MB/s | 254 kB 00:00
2026-03-10T12:31:14.543 INFO:teuthology.orchestra.run.vm07.stdout:(89/119): python3-jaraco-8.2.1-3.el9.noarch.rpm 3.4 MB/s | 11 kB 00:00
2026-03-10T12:31:14.546 INFO:teuthology.orchestra.run.vm07.stdout:(90/119): python3-jaraco-classes-3.2.1-5.el9.no 6.1 MB/s | 18 kB 00:00
2026-03-10T12:31:14.548 INFO:teuthology.orchestra.run.vm07.stdout:(91/119): python3-jaraco-collections-3.0.0-8.el 9.2 MB/s | 23 kB 00:00
2026-03-10T12:31:14.551 INFO:teuthology.orchestra.run.vm07.stdout:(92/119): python3-jaraco-context-6.0.1-3.el9.no 8.3 MB/s | 20 kB 00:00
2026-03-10T12:31:14.553 INFO:teuthology.orchestra.run.vm00.stdout:(48/119): flexiblas-openblas-openmp-3.0.4-9.el9 176 kB/s | 15 kB 00:00
2026-03-10T12:31:14.603 INFO:teuthology.orchestra.run.vm07.stdout:(93/119): python3-scipy-1.9.3-2.el9.x86_64.rpm 36 MB/s | 19 MB 00:00
2026-03-10T12:31:14.604 INFO:teuthology.orchestra.run.vm07.stdout:(94/119): python3-jaraco-functools-3.5.0-2.el9. 368 kB/s | 19 kB 00:00
2026-03-10T12:31:14.607 INFO:teuthology.orchestra.run.vm07.stdout:(95/119): python3-jwt+crypto-2.4.0-1.el9.noarch 3.9 MB/s | 9.0 kB 00:00
2026-03-10T12:31:14.608 INFO:teuthology.orchestra.run.vm07.stdout:(96/119): python3-jaraco-text-4.0.0-2.el9.noarc 6.0 MB/s | 26 kB 00:00
2026-03-10T12:31:14.609 INFO:teuthology.orchestra.run.vm07.stdout:(97/119): python3-jwt-2.4.0-1.el9.noarch.rpm 16 MB/s | 41 kB 00:00
2026-03-10T12:31:14.613 INFO:teuthology.orchestra.run.vm07.stdout:(98/119): python3-logutils-0.3.5-21.el9.noarch. 13 MB/s | 46 kB 00:00
2026-03-10T12:31:14.618 INFO:teuthology.orchestra.run.vm07.stdout:(99/119): python3-more-itertools-8.12.0-2.el9.n 17 MB/s | 79 kB 00:00
2026-03-10T12:31:14.622 INFO:teuthology.orchestra.run.vm07.stdout:(100/119): python3-natsort-7.1.1-5.el9.noarch.r 15 MB/s | 58 kB 00:00
2026-03-10T12:31:14.629 INFO:teuthology.orchestra.run.vm07.stdout:(101/119): python3-kubernetes-26.1.0-3.el9.noar 48 MB/s | 1.0 MB 00:00
2026-03-10T12:31:14.632 INFO:teuthology.orchestra.run.vm07.stdout:(102/119): python3-pecan-1.4.2-3.el9.noarch.rpm 27 MB/s | 272 kB 00:00
2026-03-10T12:31:14.633 INFO:teuthology.orchestra.run.vm07.stdout:(103/119): python3-portend-3.1.0-2.el9.noarch.r 4.9 MB/s | 16 kB 00:00
2026-03-10T12:31:14.636 INFO:teuthology.orchestra.run.vm07.stdout:(104/119): python3-pyOpenSSL-21.0.0-1.el9.noarc 24 MB/s | 90 kB 00:00
2026-03-10T12:31:14.637 INFO:teuthology.orchestra.run.vm07.stdout:(105/119): python3-repoze-lru-0.7-16.el9.noarch 8.2 MB/s | 31 kB 00:00
2026-03-10T12:31:14.641 INFO:teuthology.orchestra.run.vm07.stdout:(106/119): python3-routes-2.5.1-5.el9.noarch.rp 44 MB/s | 188 kB 00:00
2026-03-10T12:31:14.641 INFO:teuthology.orchestra.run.vm07.stdout:(107/119): python3-rsa-4.9-2.el9.noarch.rpm 13 MB/s | 59 kB 00:00
2026-03-10T12:31:14.643 INFO:teuthology.orchestra.run.vm07.stdout:(108/119): python3-tempora-5.0.0-2.el9.noarch.r 14 MB/s | 36 kB 00:00
2026-03-10T12:31:14.645 INFO:teuthology.orchestra.run.vm07.stdout:(109/119): python3-typing-extensions-4.15.0-1.e 25 MB/s | 86 kB 00:00
2026-03-10T12:31:14.650 INFO:teuthology.orchestra.run.vm07.stdout:(110/119): python3-webob-1.8.8-2.el9.noarch.rpm 39 MB/s | 230 kB 00:00
2026-03-10T12:31:14.651 INFO:teuthology.orchestra.run.vm07.stdout:(111/119): python3-websocket-client-1.2.3-2.el9 16 MB/s | 90 kB 00:00
2026-03-10T12:31:14.654 INFO:teuthology.orchestra.run.vm07.stdout:(112/119): python3-xmltodict-0.12.0-15.el9.noar 8.5 MB/s | 22 kB 00:00
2026-03-10T12:31:14.656 INFO:teuthology.orchestra.run.vm07.stdout:(113/119): python3-zc-lockfile-2.0-10.el9.noarc 7.5 MB/s | 20 kB 00:00
2026-03-10T12:31:14.661 INFO:teuthology.orchestra.run.vm07.stdout:(114/119): re2-20211101-20.el9.x86_64.rpm 38 MB/s | 191 kB 00:00
2026-03-10T12:31:14.679 INFO:teuthology.orchestra.run.vm07.stdout:(115/119): python3-werkzeug-2.0.3-3.el9.1.noarc 14 MB/s | 427 kB 00:00
2026-03-10T12:31:14.689 INFO:teuthology.orchestra.run.vm07.stdout:(116/119): thrift-0.15.0-4.el9.x86_64.rpm 58 MB/s | 1.6 MB 00:00
2026-03-10T12:31:14.697 INFO:teuthology.orchestra.run.vm00.stdout:(49/119): libpmemobj-1.12.1-1.el9.x86_64.rpm 1.1 MB/s | 160 kB 00:00
2026-03-10T12:31:14.772 INFO:teuthology.orchestra.run.vm00.stdout:(50/119): librabbitmq-0.11.0-7.el9.x86_64.rpm 609 kB/s | 45 kB 00:00
2026-03-10T12:31:14.916 INFO:teuthology.orchestra.run.vm00.stdout:(51/119): librdkafka-1.6.1-102.el9.x86_64.rpm 4.5 MB/s | 662 kB 00:00
2026-03-10T12:31:14.999 INFO:teuthology.orchestra.run.vm00.stdout:(52/119): libstoragemgmt-1.10.1-1.el9.x86_64.rp 2.9 MB/s | 246 kB 00:00
2026-03-10T12:31:15.068 INFO:teuthology.orchestra.run.vm00.stdout:(53/119): flexiblas-netlib-3.0.4-9.el9.x86_64.r 5.0 MB/s | 3.0 MB 00:00
2026-03-10T12:31:15.146 INFO:teuthology.orchestra.run.vm00.stdout:(54/119): libxslt-1.1.34-12.el9.x86_64.rpm 1.5 MB/s | 233 kB 00:00
2026-03-10T12:31:15.183 INFO:teuthology.orchestra.run.vm00.stdout:(55/119): lttng-ust-2.12.0-6.el9.x86_64.rpm 2.5 MB/s | 292 kB 00:00
2026-03-10T12:31:15.221 INFO:teuthology.orchestra.run.vm00.stdout:(56/119): openblas-0.3.29-1.el9.x86_64.rpm 566 kB/s | 42 kB 00:00
2026-03-10T12:31:15.578 INFO:teuthology.orchestra.run.vm00.stdout:(57/119): openblas-openmp-0.3.29-1.el9.x86_64.r 13 MB/s | 5.3 MB 00:00
2026-03-10T12:31:15.658 INFO:teuthology.orchestra.run.vm00.stdout:(58/119): python3-devel-3.9.25-3.el9.x86_64.rpm 3.0 MB/s | 244 kB 00:00
2026-03-10T12:31:15.680 INFO:teuthology.orchestra.run.vm00.stdout:(59/119): python3-babel-2.9.1-2.el9.noarch.rpm 13 MB/s | 6.0 MB 00:00
2026-03-10T12:31:15.717 INFO:teuthology.orchestra.run.vm00.stdout:(60/119): ceph-mgr-diskprediction-local-18.2.0- 2.4 MB/s | 7.4 MB 00:03
2026-03-10T12:31:15.782 INFO:teuthology.orchestra.run.vm00.stdout:(61/119): python3-jinja2-2.11.3-8.el9.noarch.rp 2.0 MB/s | 249 kB 00:00
2026-03-10T12:31:15.792 INFO:teuthology.orchestra.run.vm00.stdout:(62/119): python3-jmespath-1.0.1-1.el9.noarch.r 425 kB/s | 48 kB 00:00
2026-03-10T12:31:15.793 INFO:teuthology.orchestra.run.vm07.stdout:(117/119): librbd1-18.2.0-0.el9.x86_64.rpm 2.7 MB/s | 3.0 MB 00:01
2026-03-10T12:31:15.857 INFO:teuthology.orchestra.run.vm00.stdout:(63/119): python3-mako-1.1.4-6.el9.noarch.rpm 2.2 MB/s | 172 kB 00:00
2026-03-10T12:31:15.868 INFO:teuthology.orchestra.run.vm00.stdout:(64/119): python3-markupsafe-1.1.1-12.el9.x86_6 464 kB/s | 35 kB 00:00
2026-03-10T12:31:15.917 INFO:teuthology.orchestra.run.vm07.stdout:(118/119): librados2-18.2.0-0.el9.x86_64.rpm 2.6 MB/s | 3.3 MB 00:01
2026-03-10T12:31:15.965 INFO:teuthology.orchestra.run.vm00.stdout:(65/119): python3-numpy-f2py-1.23.5-2.el9.x86_6 4.5 MB/s | 442 kB 00:00
2026-03-10T12:31:16.037 INFO:teuthology.orchestra.run.vm00.stdout:(66/119): python3-libstoragemgmt-1.10.1-1.el9.x 554 kB/s | 177 kB 00:00
2026-03-10T12:31:16.063 INFO:teuthology.orchestra.run.vm00.stdout:(67/119): python3-pyasn1-0.4.8-7.el9.noarch.rpm 1.6 MB/s | 157 kB 00:00
2026-03-10T12:31:16.138 INFO:teuthology.orchestra.run.vm00.stdout:(68/119): python3-requests-oauthlib-1.3.0-12.el 708 kB/s | 54 kB 00:00
2026-03-10T12:31:16.164 INFO:teuthology.orchestra.run.vm00.stdout:(69/119): python3-pyasn1-modules-0.4.8-7.el9.no 2.1 MB/s | 277 kB 00:00
2026-03-10T12:31:16.261 INFO:teuthology.orchestra.run.vm00.stdout:(70/119): python3-numpy-1.23.5-2.el9.x86_64.rpm 15 MB/s | 6.1 MB 00:00
2026-03-10T12:31:16.262 INFO:teuthology.orchestra.run.vm00.stdout:(71/119): python3-toml-0.10.2-6.el9.noarch.rpm 427 kB/s | 42 kB 00:00
2026-03-10T12:31:16.330 INFO:teuthology.orchestra.run.vm00.stdout:(72/119): xmlstarlet-1.6.1-20.el9.x86_64.rpm 938 kB/s | 64 kB 00:00
2026-03-10T12:31:16.337 INFO:teuthology.orchestra.run.vm00.stdout:(73/119): fmt-8.1.1-5.el9.x86_64.rpm 16 MB/s | 111 kB 00:00
2026-03-10T12:31:16.352 INFO:teuthology.orchestra.run.vm00.stdout:(74/119): gperftools-libs-2.9.1-3.el9.x86_64.rp 21 MB/s | 308 kB 00:00
2026-03-10T12:31:16.443 INFO:teuthology.orchestra.run.vm00.stdout:(75/119): libarrow-9.0.0-15.el9.x86_64.rpm 49 MB/s | 4.4 MB 00:00
2026-03-10T12:31:16.446 INFO:teuthology.orchestra.run.vm00.stdout:(76/119): libarrow-doc-9.0.0-15.el9.noarch.rpm 11 MB/s | 25 kB 00:00
2026-03-10T12:31:16.448 INFO:teuthology.orchestra.run.vm00.stdout:(77/119): liboath-2.6.12-1.el9.x86_64.rpm 20 MB/s | 49 kB 00:00
2026-03-10T12:31:16.452 INFO:teuthology.orchestra.run.vm00.stdout:(78/119): libunwind-1.6.2-1.el9.x86_64.rpm 20 MB/s | 67 kB 00:00
2026-03-10T12:31:16.464 INFO:teuthology.orchestra.run.vm00.stdout:(79/119): parquet-libs-9.0.0-15.el9.x86_64.rpm 67 MB/s | 838 kB 00:00
2026-03-10T12:31:16.475 INFO:teuthology.orchestra.run.vm00.stdout:(80/119): python3-asyncssh-2.13.2-5.el9.noarch. 50 MB/s | 548 kB 00:00
2026-03-10T12:31:16.478 INFO:teuthology.orchestra.run.vm00.stdout:(81/119): python3-autocommand-2.2.2-8.el9.noarc 13 MB/s | 29 kB 00:00
2026-03-10T12:31:16.481 INFO:teuthology.orchestra.run.vm00.stdout:(82/119): python3-backports-tarfile-1.2.0-1.el9 23 MB/s | 60 kB 00:00
2026-03-10T12:31:16.483 INFO:teuthology.orchestra.run.vm00.stdout:(83/119): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 19 MB/s | 43 kB 00:00
2026-03-10T12:31:16.485 INFO:teuthology.orchestra.run.vm00.stdout:(84/119): python3-cachetools-4.2.4-1.el9.noarch 15 MB/s | 32 kB 00:00
2026-03-10T12:31:16.488 INFO:teuthology.orchestra.run.vm00.stdout:(85/119): python3-certifi-2023.05.07-4.el9.noar 6.7 MB/s | 14 kB 00:00
2026-03-10T12:31:16.492 INFO:teuthology.orchestra.run.vm00.stdout:(86/119): python3-cheroot-10.0.1-4.el9.noarch.r 44 MB/s | 173 kB 00:00
2026-03-10T12:31:16.498 INFO:teuthology.orchestra.run.vm00.stdout:(87/119): python3-cherrypy-18.6.1-2.el9.noarch. 57 MB/s | 358 kB 00:00
2026-03-10T12:31:16.503 INFO:teuthology.orchestra.run.vm00.stdout:(88/119): python3-google-auth-2.45.0-1.el9.noar 50 MB/s | 254 kB 00:00
2026-03-10T12:31:16.506 INFO:teuthology.orchestra.run.vm00.stdout:(89/119): python3-jaraco-8.2.1-3.el9.noarch.rpm 5.2 MB/s | 11 kB 00:00
2026-03-10T12:31:16.508 INFO:teuthology.orchestra.run.vm00.stdout:(90/119): python3-jaraco-classes-3.2.1-5.el9.no 8.3 MB/s | 18 kB 00:00
2026-03-10T12:31:16.512 INFO:teuthology.orchestra.run.vm00.stdout:(91/119): python3-jaraco-collections-3.0.0-8.el 5.3 MB/s | 23 kB 00:00
2026-03-10T12:31:16.515 INFO:teuthology.orchestra.run.vm00.stdout:(92/119): python3-jaraco-context-6.0.1-3.el9.no 8.7 MB/s | 20 kB 00:00
2026-03-10T12:31:16.517 INFO:teuthology.orchestra.run.vm00.stdout:(93/119): python3-jaraco-functools-3.5.0-2.el9.
9.0 MB/s | 19 kB 00:00 2026-03-10T12:31:16.519 INFO:teuthology.orchestra.run.vm00.stdout:(94/119): python3-jaraco-text-4.0.0-2.el9.noarc 12 MB/s | 26 kB 00:00 2026-03-10T12:31:16.522 INFO:teuthology.orchestra.run.vm00.stdout:(95/119): python3-jwt+crypto-2.4.0-1.el9.noarch 4.7 MB/s | 9.0 kB 00:00 2026-03-10T12:31:16.524 INFO:teuthology.orchestra.run.vm00.stdout:(96/119): python3-jwt-2.4.0-1.el9.noarch.rpm 16 MB/s | 41 kB 00:00 2026-03-10T12:31:16.539 INFO:teuthology.orchestra.run.vm00.stdout:(97/119): python3-kubernetes-26.1.0-3.el9.noarc 71 MB/s | 1.0 MB 00:00 2026-03-10T12:31:16.542 INFO:teuthology.orchestra.run.vm00.stdout:(98/119): python3-logutils-0.3.5-21.el9.noarch. 19 MB/s | 46 kB 00:00 2026-03-10T12:31:16.545 INFO:teuthology.orchestra.run.vm00.stdout:(99/119): python3-more-itertools-8.12.0-2.el9.n 27 MB/s | 79 kB 00:00 2026-03-10T12:31:16.548 INFO:teuthology.orchestra.run.vm00.stdout:(100/119): python3-natsort-7.1.1-5.el9.noarch.r 21 MB/s | 58 kB 00:00 2026-03-10T12:31:16.554 INFO:teuthology.orchestra.run.vm00.stdout:(101/119): python3-pecan-1.4.2-3.el9.noarch.rpm 42 MB/s | 272 kB 00:00 2026-03-10T12:31:16.557 INFO:teuthology.orchestra.run.vm00.stdout:(102/119): python3-portend-3.1.0-2.el9.noarch.r 7.0 MB/s | 16 kB 00:00 2026-03-10T12:31:16.560 INFO:teuthology.orchestra.run.vm00.stdout:(103/119): python3-pyOpenSSL-21.0.0-1.el9.noarc 30 MB/s | 90 kB 00:00 2026-03-10T12:31:16.564 INFO:teuthology.orchestra.run.vm00.stdout:(104/119): python3-repoze-lru-0.7-16.el9.noarch 7.7 MB/s | 31 kB 00:00 2026-03-10T12:31:16.569 INFO:teuthology.orchestra.run.vm00.stdout:(105/119): python3-routes-2.5.1-5.el9.noarch.rp 38 MB/s | 188 kB 00:00 2026-03-10T12:31:16.572 INFO:teuthology.orchestra.run.vm00.stdout:(106/119): python3-rsa-4.9-2.el9.noarch.rpm 22 MB/s | 59 kB 00:00 2026-03-10T12:31:16.575 INFO:teuthology.orchestra.run.vm00.stdout:(107/119): python3-tempora-5.0.0-2.el9.noarch.r 13 MB/s | 36 kB 00:00 2026-03-10T12:31:16.578 
INFO:teuthology.orchestra.run.vm00.stdout:(108/119): python3-typing-extensions-4.15.0-1.e 30 MB/s | 86 kB 00:00 2026-03-10T12:31:16.583 INFO:teuthology.orchestra.run.vm00.stdout:(109/119): python3-webob-1.8.8-2.el9.noarch.rpm 52 MB/s | 230 kB 00:00 2026-03-10T12:31:16.586 INFO:teuthology.orchestra.run.vm00.stdout:(110/119): python3-websocket-client-1.2.3-2.el9 29 MB/s | 90 kB 00:00 2026-03-10T12:31:16.595 INFO:teuthology.orchestra.run.vm00.stdout:(111/119): python3-werkzeug-2.0.3-3.el9.1.noarc 46 MB/s | 427 kB 00:00 2026-03-10T12:31:16.597 INFO:teuthology.orchestra.run.vm00.stdout:(112/119): python3-xmltodict-0.12.0-15.el9.noar 10 MB/s | 22 kB 00:00 2026-03-10T12:31:16.600 INFO:teuthology.orchestra.run.vm00.stdout:(113/119): python3-zc-lockfile-2.0-10.el9.noarc 9.4 MB/s | 20 kB 00:00 2026-03-10T12:31:16.604 INFO:teuthology.orchestra.run.vm00.stdout:(114/119): re2-20211101-20.el9.x86_64.rpm 41 MB/s | 191 kB 00:00 2026-03-10T12:31:16.627 INFO:teuthology.orchestra.run.vm00.stdout:(115/119): thrift-0.15.0-4.el9.x86_64.rpm 72 MB/s | 1.6 MB 00:00 2026-03-10T12:31:16.671 INFO:teuthology.orchestra.run.vm00.stdout:(116/119): socat-1.7.4.1-8.el9.x86_64.rpm 740 kB/s | 303 kB 00:00 2026-03-10T12:31:17.065 INFO:teuthology.orchestra.run.vm00.stdout:(117/119): python3-scipy-1.9.3-2.el9.x86_64.rpm 21 MB/s | 19 MB 00:00 2026-03-10T12:31:17.771 INFO:teuthology.orchestra.run.vm00.stdout:(118/119): librbd1-18.2.0-0.el9.x86_64.rpm 2.7 MB/s | 3.0 MB 00:01 2026-03-10T12:31:18.061 INFO:teuthology.orchestra.run.vm00.stdout:(119/119): librados2-18.2.0-0.el9.x86_64.rpm 2.3 MB/s | 3.3 MB 00:01 2026-03-10T12:31:18.064 INFO:teuthology.orchestra.run.vm00.stdout:-------------------------------------------------------------------------------- 2026-03-10T12:31:18.064 INFO:teuthology.orchestra.run.vm00.stdout:Total 12 MB/s | 182 MB 00:14 2026-03-10T12:31:18.492 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction check 2026-03-10T12:31:18.537 
INFO:teuthology.orchestra.run.vm00.stdout:Transaction check succeeded. 2026-03-10T12:31:18.537 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction test 2026-03-10T12:31:19.279 INFO:teuthology.orchestra.run.vm00.stdout:Transaction test succeeded. 2026-03-10T12:31:19.279 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction 2026-03-10T12:31:20.251 INFO:teuthology.orchestra.run.vm00.stdout: Preparing : 1/1 2026-03-10T12:31:20.269 INFO:teuthology.orchestra.run.vm00.stdout: Installing : thrift-0.15.0-4.el9.x86_64 1/121 2026-03-10T12:31:20.282 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 2/121 2026-03-10T12:31:20.459 INFO:teuthology.orchestra.run.vm00.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/121 2026-03-10T12:31:20.461 INFO:teuthology.orchestra.run.vm00.stdout: Upgrading : librados2-2:18.2.0-0.el9.x86_64 4/121 2026-03-10T12:31:20.507 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 4/121 2026-03-10T12:31:20.509 INFO:teuthology.orchestra.run.vm00.stdout: Installing : libcephfs2-2:18.2.0-0.el9.x86_64 5/121 2026-03-10T12:31:20.539 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 5/121 2026-03-10T12:31:20.550 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-rados-2:18.2.0-0.el9.x86_64 6/121 2026-03-10T12:31:20.553 INFO:teuthology.orchestra.run.vm00.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/121 2026-03-10T12:31:20.556 INFO:teuthology.orchestra.run.vm00.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/121 2026-03-10T12:31:20.567 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/121 2026-03-10T12:31:20.590 INFO:teuthology.orchestra.run.vm00.stdout: Installing : libcephsqlite-2:18.2.0-0.el9.x86_64 10/121 2026-03-10T12:31:20.627 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 10/121 
2026-03-10T12:31:20.629 INFO:teuthology.orchestra.run.vm00.stdout: Installing : libradosstriper1-2:18.2.0-0.el9.x86_64 11/121 2026-03-10T12:31:20.682 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: libradosstriper1-2:18.2.0-0.el9.x86_64 11/121 2026-03-10T12:31:20.688 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/121 2026-03-10T12:31:20.715 INFO:teuthology.orchestra.run.vm00.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/121 2026-03-10T12:31:20.725 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/121 2026-03-10T12:31:20.729 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/121 2026-03-10T12:31:20.760 INFO:teuthology.orchestra.run.vm00.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/121 2026-03-10T12:31:20.779 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/121 2026-03-10T12:31:20.784 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/121 2026-03-10T12:31:20.792 INFO:teuthology.orchestra.run.vm00.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 19/121 2026-03-10T12:31:20.795 INFO:teuthology.orchestra.run.vm00.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/121 2026-03-10T12:31:20.802 INFO:teuthology.orchestra.run.vm00.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/121 2026-03-10T12:31:20.812 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 22/121 2026-03-10T12:31:20.828 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-cephfs-2:18.2.0-0.el9.x86_64 23/121 2026-03-10T12:31:20.862 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/121 2026-03-10T12:31:20.940 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/121 2026-03-10T12:31:20.959 
INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/121 2026-03-10T12:31:20.967 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/121 2026-03-10T12:31:20.978 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/121 2026-03-10T12:31:20.983 INFO:teuthology.orchestra.run.vm00.stdout: Installing : librados-devel-2:18.2.0-0.el9.x86_64 29/121 2026-03-10T12:31:21.029 INFO:teuthology.orchestra.run.vm00.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/121 2026-03-10T12:31:21.037 INFO:teuthology.orchestra.run.vm00.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/121 2026-03-10T12:31:21.058 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/121 2026-03-10T12:31:21.087 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/121 2026-03-10T12:31:21.096 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 34/121 2026-03-10T12:31:21.103 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 35/121 2026-03-10T12:31:21.119 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 36/121 2026-03-10T12:31:21.132 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/121 2026-03-10T12:31:21.145 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/121 2026-03-10T12:31:21.215 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/121 2026-03-10T12:31:21.225 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/121 2026-03-10T12:31:21.236 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/121 2026-03-10T12:31:21.289 
INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/121 2026-03-10T12:31:21.700 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/121 2026-03-10T12:31:21.718 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/121 2026-03-10T12:31:21.723 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/121 2026-03-10T12:31:21.732 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/121 2026-03-10T12:31:21.737 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/121 2026-03-10T12:31:21.745 INFO:teuthology.orchestra.run.vm00.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 48/121 2026-03-10T12:31:21.748 INFO:teuthology.orchestra.run.vm00.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 49/121 2026-03-10T12:31:21.751 INFO:teuthology.orchestra.run.vm00.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 50/121 2026-03-10T12:31:21.762 INFO:teuthology.orchestra.run.vm00.stdout: Installing : fmt-8.1.1-5.el9.x86_64 51/121 2026-03-10T12:31:21.771 INFO:teuthology.orchestra.run.vm00.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 52/121 2026-03-10T12:31:21.776 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 53/121 2026-03-10T12:31:21.785 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 54/121 2026-03-10T12:31:21.791 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 55/121 2026-03-10T12:31:21.800 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 56/121 2026-03-10T12:31:21.806 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 57/121 2026-03-10T12:31:21.849 
INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 58/121 2026-03-10T12:31:22.135 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 59/121 2026-03-10T12:31:22.167 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 60/121 2026-03-10T12:31:22.174 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 61/121 2026-03-10T12:31:22.236 INFO:teuthology.orchestra.run.vm00.stdout: Installing : openblas-0.3.29-1.el9.x86_64 62/121 2026-03-10T12:31:22.239 INFO:teuthology.orchestra.run.vm00.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 63/121 2026-03-10T12:31:22.265 INFO:teuthology.orchestra.run.vm00.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/121 2026-03-10T12:31:22.688 INFO:teuthology.orchestra.run.vm00.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 65/121 2026-03-10T12:31:22.788 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 66/121 2026-03-10T12:31:22.945 INFO:teuthology.orchestra.run.vm07.stdout:(119/119): ceph-test-18.2.0-0.el9.x86_64.rpm 2.8 MB/s | 40 MB 00:14 2026-03-10T12:31:22.948 INFO:teuthology.orchestra.run.vm07.stdout:-------------------------------------------------------------------------------- 2026-03-10T12:31:22.948 INFO:teuthology.orchestra.run.vm07.stdout:Total 9.8 MB/s | 182 MB 00:18 2026-03-10T12:31:23.434 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check 2026-03-10T12:31:23.477 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded. 
2026-03-10T12:31:23.478 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test 2026-03-10T12:31:23.677 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121 2026-03-10T12:31:23.709 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 68/121 2026-03-10T12:31:23.716 INFO:teuthology.orchestra.run.vm00.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 69/121 2026-03-10T12:31:23.721 INFO:teuthology.orchestra.run.vm00.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 70/121 2026-03-10T12:31:23.890 INFO:teuthology.orchestra.run.vm00.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 71/121 2026-03-10T12:31:23.893 INFO:teuthology.orchestra.run.vm00.stdout: Upgrading : librbd1-2:18.2.0-0.el9.x86_64 72/121 2026-03-10T12:31:23.926 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64 72/121 2026-03-10T12:31:23.930 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-rbd-2:18.2.0-0.el9.x86_64 73/121 2026-03-10T12:31:23.939 INFO:teuthology.orchestra.run.vm00.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 74/121 2026-03-10T12:31:24.171 INFO:teuthology.orchestra.run.vm00.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 75/121 2026-03-10T12:31:24.173 INFO:teuthology.orchestra.run.vm00.stdout: Installing : librgw2-2:18.2.0-0.el9.x86_64 76/121 2026-03-10T12:31:24.194 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64 76/121 2026-03-10T12:31:24.203 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-rgw-2:18.2.0-0.el9.x86_64 77/121 2026-03-10T12:31:24.223 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-ply-3.11-14.el9.noarch 78/121 2026-03-10T12:31:24.226 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded. 
2026-03-10T12:31:24.227 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction 2026-03-10T12:31:24.245 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 79/121 2026-03-10T12:31:24.349 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/121 2026-03-10T12:31:24.366 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/121 2026-03-10T12:31:24.399 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/121 2026-03-10T12:31:24.443 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 83/121 2026-03-10T12:31:24.650 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/121 2026-03-10T12:31:24.912 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/121 2026-03-10T12:31:25.043 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1 2026-03-10T12:31:25.074 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-jwt-2.4.0-1.el9.noarch 86/121 2026-03-10T12:31:25.167 INFO:teuthology.orchestra.run.vm07.stdout: Installing : thrift-0.15.0-4.el9.x86_64 1/121 2026-03-10T12:31:25.250 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 2/121 2026-03-10T12:31:25.295 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-jwt+crypto-2.4.0-1.el9.noarch 87/121 2026-03-10T12:31:25.319 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 88/121 2026-03-10T12:31:25.325 INFO:teuthology.orchestra.run.vm00.stdout: Installing : mailcap-2.1.49-5.el9.noarch 89/121 2026-03-10T12:31:25.327 INFO:teuthology.orchestra.run.vm00.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 90/121 2026-03-10T12:31:25.347 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 
91/121 2026-03-10T12:31:25.347 INFO:teuthology.orchestra.run.vm00.stdout:Creating group 'libstoragemgmt' with GID 994. 2026-03-10T12:31:25.347 INFO:teuthology.orchestra.run.vm00.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994. 2026-03-10T12:31:25.347 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:31:25.360 INFO:teuthology.orchestra.run.vm00.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T12:31:25.390 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T12:31:25.390 INFO:teuthology.orchestra.run.vm00.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service. 2026-03-10T12:31:25.390 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:31:25.411 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 92/121 2026-03-10T12:31:25.468 INFO:teuthology.orchestra.run.vm07.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/121 2026-03-10T12:31:25.468 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: cephadm-2:18.2.0-0.el9.noarch 93/121 2026-03-10T12:31:25.471 INFO:teuthology.orchestra.run.vm07.stdout: Upgrading : librados2-2:18.2.0-0.el9.x86_64 4/121 2026-03-10T12:31:25.473 INFO:teuthology.orchestra.run.vm00.stdout: Installing : cephadm-2:18.2.0-0.el9.noarch 93/121 2026-03-10T12:31:25.478 INFO:teuthology.orchestra.run.vm00.stdout: Installing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 94/121 2026-03-10T12:31:25.511 INFO:teuthology.orchestra.run.vm00.stdout: Installing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 95/121 2026-03-10T12:31:25.515 INFO:teuthology.orchestra.run.vm00.stdout: Installing : python3-ceph-common-2:18.2.0-0.el9.x86_64 96/121 2026-03-10T12:31:25.519 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 4/121 
2026-03-10T12:31:25.521 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libcephfs2-2:18.2.0-0.el9.x86_64 5/121 2026-03-10T12:31:25.551 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 5/121 2026-03-10T12:31:25.560 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-rados-2:18.2.0-0.el9.x86_64 6/121 2026-03-10T12:31:25.564 INFO:teuthology.orchestra.run.vm07.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/121 2026-03-10T12:31:25.567 INFO:teuthology.orchestra.run.vm07.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/121 2026-03-10T12:31:25.578 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/121 2026-03-10T12:31:25.652 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libcephsqlite-2:18.2.0-0.el9.x86_64 10/121 2026-03-10T12:31:25.691 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 10/121 2026-03-10T12:31:25.693 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libradosstriper1-2:18.2.0-0.el9.x86_64 11/121 2026-03-10T12:31:25.749 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libradosstriper1-2:18.2.0-0.el9.x86_64 11/121 2026-03-10T12:31:25.755 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/121 2026-03-10T12:31:25.782 INFO:teuthology.orchestra.run.vm07.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/121 2026-03-10T12:31:25.791 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/121 2026-03-10T12:31:25.795 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/121 2026-03-10T12:31:25.824 INFO:teuthology.orchestra.run.vm07.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/121 2026-03-10T12:31:25.841 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/121 2026-03-10T12:31:25.846 
INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/121 2026-03-10T12:31:25.853 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 19/121 2026-03-10T12:31:25.856 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/121 2026-03-10T12:31:25.862 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/121 2026-03-10T12:31:25.872 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 22/121 2026-03-10T12:31:25.887 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-cephfs-2:18.2.0-0.el9.x86_64 23/121 2026-03-10T12:31:25.927 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/121 2026-03-10T12:31:25.993 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/121 2026-03-10T12:31:26.011 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/121 2026-03-10T12:31:26.019 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/121 2026-03-10T12:31:26.028 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/121 2026-03-10T12:31:26.033 INFO:teuthology.orchestra.run.vm07.stdout: Installing : librados-devel-2:18.2.0-0.el9.x86_64 29/121 2026-03-10T12:31:26.066 INFO:teuthology.orchestra.run.vm07.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/121 2026-03-10T12:31:26.072 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/121 2026-03-10T12:31:26.091 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/121 2026-03-10T12:31:26.116 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/121 2026-03-10T12:31:26.123 
INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 34/121 2026-03-10T12:31:26.129 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 35/121 2026-03-10T12:31:26.144 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 36/121 2026-03-10T12:31:26.156 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/121 2026-03-10T12:31:26.168 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/121 2026-03-10T12:31:26.244 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/121 2026-03-10T12:31:26.252 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/121 2026-03-10T12:31:26.263 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/121 2026-03-10T12:31:26.316 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/121 2026-03-10T12:31:26.528 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 97/121 2026-03-10T12:31:26.582 INFO:teuthology.orchestra.run.vm00.stdout: Installing : ceph-common-2:18.2.0-0.el9.x86_64 97/121 2026-03-10T12:31:26.728 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/121 2026-03-10T12:31:26.744 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/121 2026-03-10T12:31:26.750 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/121 2026-03-10T12:31:26.757 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/121 2026-03-10T12:31:26.763 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/121 
2026-03-10T12:31:26.771 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : libunwind-1.6.2-1.el9.x86_64  48/121
2026-03-10T12:31:26.774 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : gperftools-libs-2.9.1-3.el9.x86_64  49/121
2026-03-10T12:31:26.777 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : libarrow-doc-9.0.0-15.el9.noarch  50/121
2026-03-10T12:31:26.788 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : fmt-8.1.1-5.el9.x86_64  51/121
2026-03-10T12:31:26.796 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : socat-1.7.4.1-8.el9.x86_64  52/121
2026-03-10T12:31:26.801 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-toml-0.10.2-6.el9.noarch  53/121
2026-03-10T12:31:26.810 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-jaraco-functools-3.5.0-2.el9.noarch  54/121
2026-03-10T12:31:26.815 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-jaraco-text-4.0.0-2.el9.noarch  55/121
2026-03-10T12:31:26.824 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-jaraco-collections-3.0.0-8.el9.noarch  56/121
2026-03-10T12:31:26.830 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-tempora-5.0.0-2.el9.noarch  57/121
2026-03-10T12:31:26.874 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-portend-3.1.0-2.el9.noarch  58/121
2026-03-10T12:31:26.911 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64  97/121
2026-03-10T12:31:26.918 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-base-2:18.2.0-0.el9.x86_64  98/121
2026-03-10T12:31:26.960 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64  98/121
2026-03-10T12:31:26.960 INFO:teuthology.orchestra.run.vm00.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target.
2026-03-10T12:31:26.960 INFO:teuthology.orchestra.run.vm00.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service.
2026-03-10T12:31:26.960 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:31:26.966 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-selinux-2:18.2.0-0.el9.x86_64  99/121
2026-03-10T12:31:27.157 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-devel-3.9.25-3.el9.x86_64  59/121
2026-03-10T12:31:27.194 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-babel-2.9.1-2.el9.noarch  60/121
2026-03-10T12:31:27.201 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-jinja2-2.11.3-8.el9.noarch  61/121
2026-03-10T12:31:27.267 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : openblas-0.3.29-1.el9.x86_64  62/121
2026-03-10T12:31:27.270 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : openblas-openmp-0.3.29-1.el9.x86_64  63/121
2026-03-10T12:31:27.294 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64  64/121
2026-03-10T12:31:27.675 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : flexiblas-netlib-3.0.4-9.el9.x86_64  65/121
2026-03-10T12:31:27.765 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-numpy-1:1.23.5-2.el9.x86_64  66/121
2026-03-10T12:31:28.561 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64  67/121
2026-03-10T12:31:28.591 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-scipy-1.9.3-2.el9.x86_64  68/121
2026-03-10T12:31:28.599 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : libxslt-1.1.34-12.el9.x86_64  69/121
2026-03-10T12:31:28.606 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : xmlstarlet-1.6.1-20.el9.x86_64  70/121
2026-03-10T12:31:28.768 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : libpmemobj-1.12.1-1.el9.x86_64  71/121
2026-03-10T12:31:28.771 INFO:teuthology.orchestra.run.vm07.stdout:  Upgrading : librbd1-2:18.2.0-0.el9.x86_64  72/121
2026-03-10T12:31:28.801 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64  72/121
2026-03-10T12:31:28.805 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-rbd-2:18.2.0-0.el9.x86_64  73/121
2026-03-10T12:31:28.812 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : boost-program-options-1.75.0-13.el9.x86_64  74/121
2026-03-10T12:31:29.029 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : parquet-libs-9.0.0-15.el9.x86_64  75/121
2026-03-10T12:31:29.032 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : librgw2-2:18.2.0-0.el9.x86_64  76/121
2026-03-10T12:31:29.050 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64  76/121
2026-03-10T12:31:29.060 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-rgw-2:18.2.0-0.el9.x86_64  77/121
2026-03-10T12:31:29.079 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-ply-3.11-14.el9.noarch  78/121
2026-03-10T12:31:29.105 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-pycparser-2.20-6.el9.noarch  79/121
2026-03-10T12:31:29.211 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-cffi-1.14.5-5.el9.x86_64  80/121
2026-03-10T12:31:29.225 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-cryptography-36.0.1-5.el9.x86_64  81/121
2026-03-10T12:31:29.257 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch  82/121
2026-03-10T12:31:29.297 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-cheroot-10.0.1-4.el9.noarch  83/121
2026-03-10T12:31:29.362 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-cherrypy-18.6.1-2.el9.noarch  84/121
2026-03-10T12:31:29.375 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-asyncssh-2.13.2-5.el9.noarch  85/121
2026-03-10T12:31:29.378 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-jwt-2.4.0-1.el9.noarch  86/121
2026-03-10T12:31:29.386 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-jwt+crypto-2.4.0-1.el9.noarch  87/121
2026-03-10T12:31:29.391 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-bcrypt-3.2.2-1.el9.x86_64  88/121
2026-03-10T12:31:29.395 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : mailcap-2.1.49-5.el9.noarch  89/121
2026-03-10T12:31:29.399 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : libconfig-1.7.2-9.el9.x86_64  90/121
2026-03-10T12:31:29.417 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64  91/121
2026-03-10T12:31:29.417 INFO:teuthology.orchestra.run.vm07.stdout:Creating group 'libstoragemgmt' with GID 994.
2026-03-10T12:31:29.417 INFO:teuthology.orchestra.run.vm07.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994.
2026-03-10T12:31:29.417 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:31:29.429 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : libstoragemgmt-1.10.1-1.el9.x86_64  91/121
2026-03-10T12:31:29.455 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64  91/121
2026-03-10T12:31:29.455 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service.
2026-03-10T12:31:29.455 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:31:29.472 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64  92/121
2026-03-10T12:31:29.529 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: cephadm-2:18.2.0-0.el9.noarch  93/121
2026-03-10T12:31:29.533 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : cephadm-2:18.2.0-0.el9.noarch  93/121
2026-03-10T12:31:29.538 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch  94/121
2026-03-10T12:31:29.570 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch  95/121
2026-03-10T12:31:29.579 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : python3-ceph-common-2:18.2.0-0.el9.x86_64  96/121
2026-03-10T12:31:30.579 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64  97/121
2026-03-10T12:31:30.585 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-common-2:18.2.0-0.el9.x86_64  97/121
2026-03-10T12:31:30.911 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64  97/121
2026-03-10T12:31:30.919 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-base-2:18.2.0-0.el9.x86_64  98/121
2026-03-10T12:31:30.957 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64  98/121
2026-03-10T12:31:30.958 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target.
2026-03-10T12:31:30.958 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service.
2026-03-10T12:31:30.958 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:31:31.022 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-selinux-2:18.2.0-0.el9.x86_64  99/121
2026-03-10T12:31:33.744 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64  99/121
2026-03-10T12:31:33.744 INFO:teuthology.orchestra.run.vm00.stdout:skipping the directory /sys
2026-03-10T12:31:33.744 INFO:teuthology.orchestra.run.vm00.stdout:skipping the directory /proc
2026-03-10T12:31:33.744 INFO:teuthology.orchestra.run.vm00.stdout:skipping the directory /mnt
2026-03-10T12:31:33.744 INFO:teuthology.orchestra.run.vm00.stdout:skipping the directory /var/tmp
2026-03-10T12:31:33.744 INFO:teuthology.orchestra.run.vm00.stdout:skipping the directory /home
2026-03-10T12:31:33.744 INFO:teuthology.orchestra.run.vm00.stdout:skipping the directory /root
2026-03-10T12:31:33.744 INFO:teuthology.orchestra.run.vm00.stdout:skipping the directory /tmp
2026-03-10T12:31:33.744 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:31:33.774 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch  100/121
2026-03-10T12:31:33.906 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch  100/121
2026-03-10T12:31:33.912 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch  101/121
2026-03-10T12:31:34.493 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch  101/121
2026-03-10T12:31:34.495 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa  102/121
2026-03-10T12:31:34.559 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa  102/121
2026-03-10T12:31:34.643 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch  103/121
2026-03-10T12:31:34.646 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-mgr-2:18.2.0-0.el9.x86_64  104/121
2026-03-10T12:31:34.670 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64  104/121
2026-03-10T12:31:34.670 INFO:teuthology.orchestra.run.vm00.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:31:34.670 INFO:teuthology.orchestra.run.vm00.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-10T12:31:34.670 INFO:teuthology.orchestra.run.vm00.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-10T12:31:34.670 INFO:teuthology.orchestra.run.vm00.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-10T12:31:34.670 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:31:34.685 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-mgr-rook-2:18.2.0-0.el9.noarch  105/121
2026-03-10T12:31:34.801 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch  105/121
2026-03-10T12:31:34.804 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-mds-2:18.2.0-0.el9.x86_64  106/121
2026-03-10T12:31:34.827 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64  106/121
2026-03-10T12:31:34.827 INFO:teuthology.orchestra.run.vm00.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:31:34.827 INFO:teuthology.orchestra.run.vm00.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-10T12:31:34.827 INFO:teuthology.orchestra.run.vm00.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-10T12:31:34.827 INFO:teuthology.orchestra.run.vm00.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-10T12:31:34.827 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:31:35.081 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-mon-2:18.2.0-0.el9.x86_64  107/121
2026-03-10T12:31:35.102 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64  107/121
2026-03-10T12:31:35.103 INFO:teuthology.orchestra.run.vm00.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:31:35.103 INFO:teuthology.orchestra.run.vm00.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-10T12:31:35.103 INFO:teuthology.orchestra.run.vm00.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-10T12:31:35.103 INFO:teuthology.orchestra.run.vm00.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-10T12:31:35.103 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:31:36.007 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-osd-2:18.2.0-0.el9.x86_64  108/121
2026-03-10T12:31:36.036 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64  108/121
2026-03-10T12:31:36.036 INFO:teuthology.orchestra.run.vm00.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:31:36.036 INFO:teuthology.orchestra.run.vm00.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-10T12:31:36.036 INFO:teuthology.orchestra.run.vm00.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-10T12:31:36.036 INFO:teuthology.orchestra.run.vm00.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-10T12:31:36.036 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:31:36.441 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-2:18.2.0-0.el9.x86_64  109/121
2026-03-10T12:31:36.522 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-radosgw-2:18.2.0-0.el9.x86_64  110/121
2026-03-10T12:31:36.545 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64  110/121
2026-03-10T12:31:36.545 INFO:teuthology.orchestra.run.vm00.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:31:36.546 INFO:teuthology.orchestra.run.vm00.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service".
2026-03-10T12:31:36.546 INFO:teuthology.orchestra.run.vm00.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-10T12:31:36.546 INFO:teuthology.orchestra.run.vm00.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-10T12:31:36.546 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:31:36.599 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6  111/121
2026-03-10T12:31:36.628 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6  111/121
2026-03-10T12:31:36.628 INFO:teuthology.orchestra.run.vm00.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:31:36.628 INFO:teuthology.orchestra.run.vm00.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-10T12:31:36.628 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:31:36.841 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : rbd-mirror-2:18.2.0-0.el9.x86_64  112/121
2026-03-10T12:31:36.864 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64  112/121
2026-03-10T12:31:36.864 INFO:teuthology.orchestra.run.vm00.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:31:36.864 INFO:teuthology.orchestra.run.vm00.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-10T12:31:36.864 INFO:teuthology.orchestra.run.vm00.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target.
2026-03-10T12:31:36.864 INFO:teuthology.orchestra.run.vm00.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target.
2026-03-10T12:31:36.864 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:31:37.701 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64  99/121
2026-03-10T12:31:37.701 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /sys
2026-03-10T12:31:37.701 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /proc
2026-03-10T12:31:37.701 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /mnt
2026-03-10T12:31:37.701 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /var/tmp
2026-03-10T12:31:37.701 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /home
2026-03-10T12:31:37.701 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /root
2026-03-10T12:31:37.701 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /tmp
2026-03-10T12:31:37.701 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:31:37.735 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch  100/121
2026-03-10T12:31:37.864 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch  100/121
2026-03-10T12:31:37.869 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch  101/121
2026-03-10T12:31:38.416 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch  101/121
2026-03-10T12:31:38.419 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa  102/121
2026-03-10T12:31:38.484 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa  102/121
2026-03-10T12:31:38.565 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch  103/121
2026-03-10T12:31:38.568 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-mgr-2:18.2.0-0.el9.x86_64  104/121
2026-03-10T12:31:38.595 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64  104/121
2026-03-10T12:31:38.595 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:31:38.595 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-10T12:31:38.596 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-10T12:31:38.596 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-10T12:31:38.596 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:31:38.608 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-mgr-rook-2:18.2.0-0.el9.noarch  105/121
2026-03-10T12:31:38.724 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch  105/121
2026-03-10T12:31:38.727 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-mds-2:18.2.0-0.el9.x86_64  106/121
2026-03-10T12:31:38.749 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64  106/121
2026-03-10T12:31:38.750 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:31:38.750 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-10T12:31:38.750 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-10T12:31:38.750 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-10T12:31:38.750 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:31:38.978 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-mon-2:18.2.0-0.el9.x86_64  107/121
2026-03-10T12:31:39.000 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64  107/121
2026-03-10T12:31:39.001 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:31:39.001 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-10T12:31:39.001 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-10T12:31:39.001 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-10T12:31:39.001 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:31:39.049 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-test-2:18.2.0-0.el9.x86_64  113/121
2026-03-10T12:31:39.063 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : rbd-fuse-2:18.2.0-0.el9.x86_64  114/121
2026-03-10T12:31:39.068 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : rbd-nbd-2:18.2.0-0.el9.x86_64  115/121
2026-03-10T12:31:39.113 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : libcephfs-devel-2:18.2.0-0.el9.x86_64  116/121
2026-03-10T12:31:39.121 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : ceph-fuse-2:18.2.0-0.el9.x86_64  117/121
2026-03-10T12:31:39.130 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : python3-xmltodict-0.12.0-15.el9.noarch  118/121
2026-03-10T12:31:39.135 INFO:teuthology.orchestra.run.vm00.stdout:  Installing : python3-jmespath-1.0.1-1.el9.noarch  119/121
2026-03-10T12:31:39.135 INFO:teuthology.orchestra.run.vm00.stdout:  Cleanup : librbd1-2:16.2.4-5.el9.x86_64  120/121
2026-03-10T12:31:39.151 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64  120/121
2026-03-10T12:31:39.151 INFO:teuthology.orchestra.run.vm00.stdout:  Cleanup : librados2-2:16.2.4-5.el9.x86_64  121/121
2026-03-10T12:31:39.947 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-osd-2:18.2.0-0.el9.x86_64  108/121
2026-03-10T12:31:39.977 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64  108/121
2026-03-10T12:31:39.977 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:31:39.977 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-10T12:31:39.977 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-10T12:31:39.977 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-10T12:31:39.977 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:31:40.367 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-2:18.2.0-0.el9.x86_64  109/121
2026-03-10T12:31:40.371 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-radosgw-2:18.2.0-0.el9.x86_64  110/121
2026-03-10T12:31:40.394 INFO:teuthology.orchestra.run.vm07.stdout:  Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64  110/121
2026-03-10T12:31:40.394 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:31:40.394 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service".
2026-03-10T12:31:40.394 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-10T12:31:40.394 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-10T12:31:40.394 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm00.stdout:  Running scriptlet: librados2-2:16.2.4-5.el9.x86_64  121/121
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-2:18.2.0-0.el9.x86_64  1/121
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-base-2:18.2.0-0.el9.x86_64  2/121
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-common-2:18.2.0-0.el9.x86_64  3/121
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64  4/121
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6  5/121
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-mds-2:18.2.0-0.el9.x86_64  6/121
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm07.stdout:  Installing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6  111/121
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64  7/121
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-mon-2:18.2.0-0.el9.x86_64  8/121
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-osd-2:18.2.0-0.el9.x86_64  9/121
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64  10/121
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64  11/121
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-test-2:18.2.0-0.el9.x86_64  12/121
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64  13/121
2026-03-10T12:31:40.405 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : libcephfs2-2:18.2.0-0.el9.x86_64  14/121
2026-03-10T12:31:40.406 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64  15/121
2026-03-10T12:31:40.406 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : librados-devel-2:18.2.0-0.el9.x86_64  16/121
2026-03-10T12:31:40.406 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64  17/121
2026-03-10T12:31:40.406 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : librgw2-2:18.2.0-0.el9.x86_64  18/121
2026-03-10T12:31:40.406 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64  19/121
2026-03-10T12:31:40.406 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64  20/121
2026-03-10T12:31:40.406 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64  21/121
2026-03-10T12:31:40.406 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-rados-2:18.2.0-0.el9.x86_64  22/121
2026-03-10T12:31:40.406 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-rbd-2:18.2.0-0.el9.x86_64  23/121
2026-03-10T12:31:40.406 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-rgw-2:18.2.0-0.el9.x86_64  24/121
2026-03-10T12:31:40.406 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : rbd-fuse-2:18.2.0-0.el9.x86_64  25/121
2026-03-10T12:31:40.407 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64  26/121
2026-03-10T12:31:40.407 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : rbd-nbd-2:18.2.0-0.el9.x86_64  27/121
2026-03-10T12:31:40.407 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch  28/121
2026-03-10T12:31:40.407 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch  29/121
2026-03-10T12:31:40.407 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch  30/121
2026-03-10T12:31:40.407 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa  31/121
2026-03-10T12:31:40.407 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch  32/121
2026-03-10T12:31:40.411 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-mgr-rook-2:18.2.0-0.el9.noarch  33/121
2026-03-10T12:31:40.411 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch  34/121
2026-03-10T12:31:40.411 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : cephadm-2:18.2.0-0.el9.noarch  35/121
2026-03-10T12:31:40.411 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : ledmon-libs-1.1.0-3.el9.x86_64  36/121
2026-03-10T12:31:40.411 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : libconfig-1.7.2-9.el9.x86_64  37/121
2026-03-10T12:31:40.411 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : libgfortran-11.5.0-14.el9.x86_64  38/121
2026-03-10T12:31:40.411 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : libquadmath-11.5.0-14.el9.x86_64  39/121
2026-03-10T12:31:40.411 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : mailcap-2.1.49-5.el9.noarch  40/121
2026-03-10T12:31:40.411 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-cffi-1.14.5-5.el9.x86_64  41/121
2026-03-10T12:31:40.411 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-cryptography-36.0.1-5.el9.x86_64  42/121
2026-03-10T12:31:40.411 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-ply-3.11-14.el9.noarch  43/121
2026-03-10T12:31:40.411 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-pycparser-2.20-6.el9.noarch  44/121
2026-03-10T12:31:40.411 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-requests-2.25.1-10.el9.noarch  45/121
2026-03-10T12:31:40.411 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-urllib3-1.26.5-7.el9.noarch  46/121
2026-03-10T12:31:40.411 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : boost-program-options-1.75.0-13.el9.x86_64  47/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : flexiblas-3.0.4-9.el9.x86_64  48/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64  49/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64  50/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : libpmemobj-1.12.1-1.el9.x86_64  51/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : librabbitmq-0.11.0-7.el9.x86_64  52/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : librdkafka-1.6.1-102.el9.x86_64  53/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : libstoragemgmt-1.10.1-1.el9.x86_64  54/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : libxslt-1.1.34-12.el9.x86_64  55/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : lttng-ust-2.12.0-6.el9.x86_64  56/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : openblas-0.3.29-1.el9.x86_64  57/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : openblas-openmp-0.3.29-1.el9.x86_64  58/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-babel-2.9.1-2.el9.noarch  59/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-devel-3.9.25-3.el9.x86_64  60/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-jinja2-2.11.3-8.el9.noarch  61/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-jmespath-1.0.1-1.el9.noarch  62/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64  63/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-mako-1.1.4-6.el9.noarch  64/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-markupsafe-1.1.1-12.el9.x86_64  65/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-numpy-1:1.23.5-2.el9.x86_64  66/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64  67/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-pyasn1-0.4.8-7.el9.noarch  68/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch  69/121
2026-03-10T12:31:40.412 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch  70/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-scipy-1.9.3-2.el9.x86_64  71/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-toml-0.10.2-6.el9.noarch  72/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : socat-1.7.4.1-8.el9.x86_64  73/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : xmlstarlet-1.6.1-20.el9.x86_64  74/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : fmt-8.1.1-5.el9.x86_64  75/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : gperftools-libs-2.9.1-3.el9.x86_64  76/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : libarrow-9.0.0-15.el9.x86_64  77/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : libarrow-doc-9.0.0-15.el9.noarch  78/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : liboath-2.6.12-1.el9.x86_64  79/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : libunwind-1.6.2-1.el9.x86_64  80/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : parquet-libs-9.0.0-15.el9.x86_64  81/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-asyncssh-2.13.2-5.el9.noarch  82/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-autocommand-2.2.2-8.el9.noarch  83/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch  84/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-bcrypt-3.2.2-1.el9.x86_64  85/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-cachetools-4.2.4-1.el9.noarch  86/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-certifi-2023.05.07-4.el9.noarch  87/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-cheroot-10.0.1-4.el9.noarch  88/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-cherrypy-18.6.1-2.el9.noarch  89/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-google-auth-1:2.45.0-1.el9.noarch  90/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-jaraco-8.2.1-3.el9.noarch  91/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch  92/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch  93/121
2026-03-10T12:31:40.413 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-jaraco-context-6.0.1-3.el9.noarch  94/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch  95/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-jaraco-text-4.0.0-2.el9.noarch  96/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch  97/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-jwt-2.4.0-1.el9.noarch  98/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch  99/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-logutils-0.3.5-21.el9.noarch  100/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-more-itertools-8.12.0-2.el9.noarch  101/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-natsort-7.1.1-5.el9.noarch  102/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-pecan-1.4.2-3.el9.noarch  103/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-portend-3.1.0-2.el9.noarch  104/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch  105/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-repoze-lru-0.7-16.el9.noarch  106/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-routes-2.5.1-5.el9.noarch  107/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-rsa-4.9-2.el9.noarch  108/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-tempora-5.0.0-2.el9.noarch  109/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-typing-extensions-4.15.0-1.el9.noarch  110/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying : python3-webob-1.8.8-2.el9.noarch  111/121
2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout:  Verifying :
python3-websocket-client-1.2.3-2.el9.noarch 112/121 2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 113/121 2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 114/121 2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 115/121 2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : re2-1:20211101-20.el9.x86_64 116/121 2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 117/121 2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : librados2-2:18.2.0-0.el9.x86_64 118/121 2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 119/121 2026-03-10T12:31:40.414 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : librbd1-2:18.2.0-0.el9.x86_64 120/121 2026-03-10T12:31:40.425 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 111/121 2026-03-10T12:31:40.425 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T12:31:40.425 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 
2026-03-10T12:31:40.425 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout:Upgraded: 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: librados2-2:18.2.0-0.el9.x86_64 librbd1-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout:Installed: 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-base-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mds-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: 
ceph-mgr-rook-2:18.2.0-0.el9.noarch 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mon-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-osd-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ceph-test-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: cephadm-2:18.2.0-0.el9.noarch 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: ledmon-libs-1.1.0-3.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: libcephfs2-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: libconfig-1.7.2-9.el9.x86_64 2026-03-10T12:31:40.555 
INFO:teuthology.orchestra.run.vm00.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-10T12:31:40.555 INFO:teuthology.orchestra.run.vm00.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: librados-devel-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: librgw2-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: libxslt-1.1.34-12.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: mailcap-2.1.49-5.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-10T12:31:40.556 
INFO:teuthology.orchestra.run.vm00.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 
2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-jmespath-1.0.1-1.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-ply-3.11-14.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-10T12:31:40.556 
INFO:teuthology.orchestra.run.vm00.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-rados-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-rbd-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-rgw-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.556 INFO:teuthology.orchestra.run.vm00.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: python3-xmltodict-0.12.0-15.el9.noarch 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-10T12:31:40.557 
INFO:teuthology.orchestra.run.vm00.stdout: rbd-fuse-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: rbd-nbd-2:18.2.0-0.el9.x86_64 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: re2-1:20211101-20.el9.x86_64 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: socat-1.7.4.1-8.el9.x86_64 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: xmlstarlet-1.6.1-20.el9.x86_64 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:31:40.557 INFO:teuthology.orchestra.run.vm00.stdout:Complete! 2026-03-10T12:31:40.576 INFO:teuthology.orchestra.run.vm07.stdout: Installing : rbd-mirror-2:18.2.0-0.el9.x86_64 112/121 2026-03-10T12:31:40.600 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 112/121 2026-03-10T12:31:40.600 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T12:31:40.600 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-10T12:31:40.600 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-10T12:31:40.600 INFO:teuthology.orchestra.run.vm07.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 
2026-03-10T12:31:40.601 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:31:40.662 DEBUG:teuthology.parallel:result is None 2026-03-10T12:31:42.682 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-test-2:18.2.0-0.el9.x86_64 113/121 2026-03-10T12:31:42.693 INFO:teuthology.orchestra.run.vm07.stdout: Installing : rbd-fuse-2:18.2.0-0.el9.x86_64 114/121 2026-03-10T12:31:42.699 INFO:teuthology.orchestra.run.vm07.stdout: Installing : rbd-nbd-2:18.2.0-0.el9.x86_64 115/121 2026-03-10T12:31:42.739 INFO:teuthology.orchestra.run.vm07.stdout: Installing : libcephfs-devel-2:18.2.0-0.el9.x86_64 116/121 2026-03-10T12:31:42.745 INFO:teuthology.orchestra.run.vm07.stdout: Installing : ceph-fuse-2:18.2.0-0.el9.x86_64 117/121 2026-03-10T12:31:42.754 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 118/121 2026-03-10T12:31:42.760 INFO:teuthology.orchestra.run.vm07.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 119/121 2026-03-10T12:31:42.760 INFO:teuthology.orchestra.run.vm07.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 120/121 2026-03-10T12:31:42.775 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 120/121 2026-03-10T12:31:42.775 INFO:teuthology.orchestra.run.vm07.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-2:18.2.0-0.el9.x86_64 1/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-base-2:18.2.0-0.el9.x86_64 2/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-common-2:18.2.0-0.el9.x86_64 3/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64 4/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: 
Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_6 5/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mds-2:18.2.0-0.el9.x86_64 6/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64 7/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mon-2:18.2.0-0.el9.x86_64 8/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-osd-2:18.2.0-0.el9.x86_64 9/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64 10/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64 11/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-test-2:18.2.0-0.el9.x86_64 12/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64 13/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephfs2-2:18.2.0-0.el9.x86_64 14/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64 15/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librados-devel-2:18.2.0-0.el9.x86_64 16/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64 17/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librgw2-2:18.2.0-0.el9.x86_64 18/121 2026-03-10T12:31:44.029 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 19/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64 20/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64 
21/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rados-2:18.2.0-0.el9.x86_64 22/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rbd-2:18.2.0-0.el9.x86_64 23/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rgw-2:18.2.0-0.el9.x86_64 24/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-fuse-2:18.2.0-0.el9.x86_64 25/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64 26/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-nbd-2:18.2.0-0.el9.x86_64 27/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 28/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 29/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 30/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noa 31/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 32/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-rook-2:18.2.0-0.el9.noarch 33/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 34/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : cephadm-2:18.2.0-0.el9.noarch 35/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/121 2026-03-10T12:31:44.030 
INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/121 2026-03-10T12:31:44.030 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : 
libstoragemgmt-1.10.1-1.el9.x86_64 54/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/121 2026-03-10T12:31:44.031 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/121 2026-03-10T12:31:44.032 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/121 2026-03-10T12:31:44.032 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/121 2026-03-10T12:31:44.032 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/121 2026-03-10T12:31:44.032 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/121 2026-03-10T12:31:44.032 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/121 2026-03-10T12:31:44.032 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/121 2026-03-10T12:31:44.032 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/121 2026-03-10T12:31:44.032 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121 2026-03-10T12:31:44.032 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/121 2026-03-10T12:31:44.032 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/121 2026-03-10T12:31:44.032 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/121 
2026-03-10T12:31:44.032 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/121
2026-03-10T12:31:44.032 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/121
2026-03-10T12:31:44.033 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 97/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 98/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 99/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 100/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 101/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 102/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 103/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 104/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 105/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 106/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 107/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 108/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 109/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 110/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 111/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 112/121
2026-03-10T12:31:44.034 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 113/121
2026-03-10T12:31:44.035 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 114/121
2026-03-10T12:31:44.035 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 115/121
2026-03-10T12:31:44.035 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : re2-1:20211101-20.el9.x86_64 116/121
2026-03-10T12:31:44.035 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 117/121
2026-03-10T12:31:44.035 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librados2-2:18.2.0-0.el9.x86_64 118/121
2026-03-10T12:31:44.035 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 119/121
2026-03-10T12:31:44.035 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librbd1-2:18.2.0-0.el9.x86_64 120/121
2026-03-10T12:31:44.153 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 121/121
2026-03-10T12:31:44.153 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:31:44.153 INFO:teuthology.orchestra.run.vm07.stdout:Upgraded:
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: librados2-2:18.2.0-0.el9.x86_64 librbd1-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout:Installed:
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-base-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-common-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mds-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-modules-core-2:18.2.0-0.el9.noarch
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-rook-2:18.2.0-0.el9.noarch
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mon-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-osd-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ceph-test-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: cephadm-2:18.2.0-0.el9.noarch
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: fmt-8.1.1-5.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs2-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: librados-devel-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: librgw2-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: libxslt-1.1.34-12.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: mailcap-2.1.49-5.el9.noarch
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.154 INFO:teuthology.orchestra.run.vm07.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-jmespath-1.0.1-1.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-jwt-2.4.0-1.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-logutils-0.3.5-21.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-mako-1.1.4-6.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-pecan-1.4.2-3.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-ply-3.11-14.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-portend-3.1.0-2.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-0.4.8-7.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-rados-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-rbd-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-rgw-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-webob-1.8.8-2.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-xmltodict-0.12.0-15.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: rbd-fuse-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: rbd-nbd-2:18.2.0-0.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: re2-1:20211101-20.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: socat-1.7.4.1-8.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-10T12:31:44.155 INFO:teuthology.orchestra.run.vm07.stdout: xmlstarlet-1.6.1-20.el9.x86_64
2026-03-10T12:31:44.156 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:31:44.156 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:31:44.282 DEBUG:teuthology.parallel:result is None
2026-03-10T12:31:44.282 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T12:31:44.282 INFO:teuthology.packaging:ref: None
2026-03-10T12:31:44.282 INFO:teuthology.packaging:tag: v18.2.0
2026-03-10T12:31:44.282 INFO:teuthology.packaging:branch: None
2026-03-10T12:31:44.282 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T12:31:44.282 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-10T12:31:44.896 DEBUG:teuthology.orchestra.run.vm00:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
2026-03-10T12:31:44.924 INFO:teuthology.orchestra.run.vm00.stdout:18.2.0-0.el9
2026-03-10T12:31:44.925 INFO:teuthology.packaging:The installed version of ceph is 18.2.0-0.el9
2026-03-10T12:31:44.925 INFO:teuthology.task.install:The correct ceph version 18.2.0-0 is installed.
2026-03-10T12:31:44.926 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T12:31:44.926 INFO:teuthology.packaging:ref: None
2026-03-10T12:31:44.926 INFO:teuthology.packaging:tag: v18.2.0
2026-03-10T12:31:44.926 INFO:teuthology.packaging:branch: None
2026-03-10T12:31:44.926 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T12:31:44.926 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=5dd24139a1eada541a3bc16b6941c5dde975e26d
2026-03-10T12:31:45.537 DEBUG:teuthology.orchestra.run.vm07:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
2026-03-10T12:31:45.561 INFO:teuthology.orchestra.run.vm07.stdout:18.2.0-0.el9
2026-03-10T12:31:45.562 INFO:teuthology.packaging:The installed version of ceph is 18.2.0-0.el9
2026-03-10T12:31:45.562 INFO:teuthology.task.install:The correct ceph version 18.2.0-0 is installed.
2026-03-10T12:31:45.563 INFO:teuthology.task.install.util:Shipping valgrind.supp...
2026-03-10T12:31:45.563 DEBUG:teuthology.orchestra.run.vm00:> set -ex
2026-03-10T12:31:45.563 DEBUG:teuthology.orchestra.run.vm00:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-10T12:31:45.595 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-10T12:31:45.595 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-10T12:31:45.632 INFO:teuthology.task.install.util:Shipping 'daemon-helper'...
2026-03-10T12:31:45.669 DEBUG:teuthology.orchestra.run.vm00:> set -ex
2026-03-10T12:31:45.669 DEBUG:teuthology.orchestra.run.vm00:> sudo dd of=/usr/bin/daemon-helper
2026-03-10T12:31:45.701 DEBUG:teuthology.orchestra.run.vm00:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-10T12:31:45.771 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-10T12:31:45.771 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/usr/bin/daemon-helper
2026-03-10T12:31:45.803 DEBUG:teuthology.orchestra.run.vm07:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-10T12:31:45.869 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'...
2026-03-10T12:31:45.869 DEBUG:teuthology.orchestra.run.vm00:> set -ex
2026-03-10T12:31:45.869 DEBUG:teuthology.orchestra.run.vm00:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-10T12:31:45.897 DEBUG:teuthology.orchestra.run.vm00:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-10T12:31:45.971 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-10T12:31:45.971 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-10T12:31:46.000 DEBUG:teuthology.orchestra.run.vm07:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-10T12:31:46.071 INFO:teuthology.task.install.util:Shipping 'stdin-killer'...
2026-03-10T12:31:46.072 DEBUG:teuthology.orchestra.run.vm00:> set -ex
2026-03-10T12:31:46.072 DEBUG:teuthology.orchestra.run.vm00:> sudo dd of=/usr/bin/stdin-killer
2026-03-10T12:31:46.103 DEBUG:teuthology.orchestra.run.vm00:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-10T12:31:46.175 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-10T12:31:46.175 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/usr/bin/stdin-killer
2026-03-10T12:31:46.208 DEBUG:teuthology.orchestra.run.vm07:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-10T12:31:46.278 INFO:teuthology.run_tasks:Running task print...
2026-03-10T12:31:46.280 INFO:teuthology.task.print:**** done install task...
2026-03-10T12:31:46.280 INFO:teuthology.run_tasks:Running task cephadm...
2026-03-10T12:31:46.326 INFO:tasks.cephadm:Config: {'compiled_cephadm_branch': 'reef', 'conf': {'osd': {'osd_class_default_list': '*', 'osd_class_load_list': '*', 'bdev async discard': True, 'bdev enable discard': True, 'bluestore allocator': 'bitmap', 'bluestore block size': 96636764160, 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd op complaint time': 180}, 'client': {'client mount timeout': 600, 'debug client': 20, 'debug ms': 1, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'global': {'mon pg warn min per osd': 0}, 'mds': {'debug mds': 20, 'debug mds balancer': 20, 'debug ms': 1, 'mds debug frag': True, 'mds debug scatterstat': True, 'mds op complaint time': 180, 'mds verify scatter': True, 'osd op complaint time': 180, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20, 'mon down mkfs grace': 300, 'mon op complaint time': 120}}, 'image': 'quay.io/ceph/ceph:v18.2.0', 'roleless': True, 'cluster-conf': {'mgr': {'client mount timeout': 30, 'debug client': 20, 'debug mgr': 20, 'debug ms': 1, 'mon warn on pool no app': False}}, 'flavor': 'default', 'fs': 'xfs', 'log-ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', 'FS_DEGRADED', 'filesystem is degraded', 'FS_INLINE_DATA_DEPRECATED', 'FS_WITH_FAILED_MDS', 'MDS_ALL_DOWN', 'filesystem is offline', 'is offline because no MDS', 'MDS_DAMAGE', 'MDS_DEGRADED', 'MDS_FAILED', 'MDS_INSUFFICIENT_STANDBY', 'MDS_UP_LESS_THAN_MAX', 'online, but wants', 'filesystem is online with fewer MDS than max_mds', 'POOL_APP_NOT_ENABLED', 'do not have an application enabled', 'overall HEALTH_', 'Replacing daemon', 'deprecated feature inline_data', 'MGR_MODULE_ERROR', 'OSD_DOWN', 'osds down', 'overall HEALTH_', '\\(OSD_DOWN\\)', '\\(OSD_', 'but it is still running', 'is not responding', 'MON_DOWN', 'PG_AVAILABILITY', 'PG_DEGRADED', 'Reduced data availability', 'Degraded data redundancy', 'pg .* is stuck inactive', 'pg .* is .*degraded', 'pg .* is stuck peering'], 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}
2026-03-10T12:31:46.326 INFO:tasks.cephadm:Cluster image is quay.io/ceph/ceph:v18.2.0
2026-03-10T12:31:46.326 INFO:tasks.cephadm:Cluster fsid is 1a52002a-1c7d-11f1-af82-51cdd81caea8
2026-03-10T12:31:46.326 INFO:tasks.cephadm:Choosing monitor IPs and ports...
2026-03-10T12:31:46.326 INFO:tasks.cephadm:No mon roles; fabricating mons
2026-03-10T12:31:46.326 INFO:tasks.cephadm:Monitor IPs: {'mon.vm00': '192.168.123.100', 'mon.vm07': '192.168.123.107'}
2026-03-10T12:31:46.326 INFO:tasks.cephadm:Normalizing hostnames...
2026-03-10T12:31:46.326 DEBUG:teuthology.orchestra.run.vm00:> sudo hostname $(hostname -s)
2026-03-10T12:31:46.358 DEBUG:teuthology.orchestra.run.vm07:> sudo hostname $(hostname -s)
2026-03-10T12:31:46.394 INFO:tasks.cephadm:Downloading "compiled" cephadm from cachra for reef
2026-03-10T12:31:46.394 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T12:31:46.993 INFO:tasks.cephadm:builder_project result: [{'url': 'https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'chacra_url': 'https://3.chacra.ceph.com/repos/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'ref': 'squid', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'distro': 'centos', 'distro_version': '9', 'distro_codename': None, 'modified': '2026-02-25 18:55:15.146628', 'status': 'ready', 'flavor': 'default', 'project': 'ceph', 'archs': ['source', 'x86_64'], 'extra': {'version': '19.2.3-678-ge911bdeb', 'package_manager_version': '19.2.3-678.ge911bdeb', 'build_url': 'https://jenkins.ceph.com/job/ceph-dev-pipeline/3275/', 'root_build_cause': '', 'node_name': '10.20.192.26+soko16', 'job_name': 'ceph-dev-pipeline'}}]
2026-03-10T12:31:47.748 INFO:tasks.util.chacra:got chacra host 3.chacra.ceph.com, ref reef, sha1 ab47f43c099b2cbae6e21342fe673ce251da54d6 from https://shaman.ceph.com/api/search/?project=ceph&distros=centos%2F9%2Fx86_64&flavor=default&ref=reef
2026-03-10T12:31:47.749 INFO:tasks.cephadm:Discovered cachra url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm
2026-03-10T12:31:47.749 INFO:tasks.cephadm:Downloading cephadm from url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm
2026-03-10T12:31:47.749 DEBUG:teuthology.orchestra.run.vm00:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm
2026-03-10T12:31:49.051 INFO:teuthology.orchestra.run.vm00.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 10 12:31 /home/ubuntu/cephtest/cephadm
2026-03-10T12:31:49.051 DEBUG:teuthology.orchestra.run.vm07:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm
2026-03-10T12:31:50.352 INFO:teuthology.orchestra.run.vm07.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 10 12:31 /home/ubuntu/cephtest/cephadm
2026-03-10T12:31:50.352 DEBUG:teuthology.orchestra.run.vm00:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm
2026-03-10T12:31:50.367 DEBUG:teuthology.orchestra.run.vm07:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm
2026-03-10T12:31:50.395 INFO:tasks.cephadm:Pulling image quay.io/ceph/ceph:v18.2.0 on all hosts...
2026-03-10T12:31:50.395 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 pull
2026-03-10T12:31:50.410 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 pull
2026-03-10T12:31:50.535 INFO:teuthology.orchestra.run.vm00.stderr:Pulling container image quay.io/ceph/ceph:v18.2.0...
2026-03-10T12:31:50.592 INFO:teuthology.orchestra.run.vm07.stderr:Pulling container image quay.io/ceph/ceph:v18.2.0...
2026-03-10T12:32:14.085 INFO:teuthology.orchestra.run.vm00.stdout:{
2026-03-10T12:32:14.085 INFO:teuthology.orchestra.run.vm00.stdout: "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)",
2026-03-10T12:32:14.085 INFO:teuthology.orchestra.run.vm00.stdout: "image_id": "dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946",
2026-03-10T12:32:14.085 INFO:teuthology.orchestra.run.vm00.stdout: "repo_digests": [
2026-03-10T12:32:14.085 INFO:teuthology.orchestra.run.vm00.stdout: "quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007",
2026-03-10T12:32:14.085 INFO:teuthology.orchestra.run.vm00.stdout: "quay.io/ceph/ceph@sha256:d58cf65589d0abf9d5261cd46fb62be3bfb29098febc78c0fcdf116a15274d27"
2026-03-10T12:32:14.085 INFO:teuthology.orchestra.run.vm00.stdout: ]
2026-03-10T12:32:14.085 INFO:teuthology.orchestra.run.vm00.stdout:}
2026-03-10T12:32:14.174 INFO:teuthology.orchestra.run.vm07.stdout:{
2026-03-10T12:32:14.174 INFO:teuthology.orchestra.run.vm07.stdout: "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)",
2026-03-10T12:32:14.174 INFO:teuthology.orchestra.run.vm07.stdout: "image_id": "dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946",
2026-03-10T12:32:14.174 INFO:teuthology.orchestra.run.vm07.stdout: "repo_digests": [
2026-03-10T12:32:14.174 INFO:teuthology.orchestra.run.vm07.stdout: "quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007",
2026-03-10T12:32:14.174 INFO:teuthology.orchestra.run.vm07.stdout: "quay.io/ceph/ceph@sha256:d58cf65589d0abf9d5261cd46fb62be3bfb29098febc78c0fcdf116a15274d27"
2026-03-10T12:32:14.174 INFO:teuthology.orchestra.run.vm07.stdout: ]
2026-03-10T12:32:14.174 INFO:teuthology.orchestra.run.vm07.stdout:}
2026-03-10T12:32:14.189 DEBUG:teuthology.orchestra.run.vm00:> sudo mkdir -p /etc/ceph
2026-03-10T12:32:14.220 DEBUG:teuthology.orchestra.run.vm07:> sudo mkdir -p /etc/ceph
2026-03-10T12:32:14.246 DEBUG:teuthology.orchestra.run.vm00:> sudo chmod 777 /etc/ceph
2026-03-10T12:32:14.288 DEBUG:teuthology.orchestra.run.vm07:> sudo chmod 777 /etc/ceph
2026-03-10T12:32:14.313 INFO:tasks.cephadm:Writing seed config...
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] osd_class_default_list = *
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] osd_class_load_list = *
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] bdev async discard = True
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] bdev enable discard = True
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] bluestore allocator = bitmap
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] bluestore block size = 96636764160
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] bluestore fsck on mount = True
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] debug bluefs = 1/20
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] debug bluestore = 1/20
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] debug ms = 1
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] debug osd = 20
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] debug rocksdb = 4/10
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] mon osd backfillfull_ratio = 0.85
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] mon osd full ratio = 0.9
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] mon osd nearfull ratio = 0.8
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] osd failsafe full ratio = 0.95
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] osd mclock iops capacity threshold hdd = 49000
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] osd objectstore = bluestore
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [osd] osd op complaint time = 180
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [client] client mount timeout = 600
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [client] debug client = 20
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [client] debug ms = 1
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [client] rados mon op timeout = 900
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [client] rados osd op timeout = 900
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [global] mon pg warn min per osd = 0
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mds] debug mds = 20
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mds] debug mds balancer = 20
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mds] debug ms = 1
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mds] mds debug frag = True
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mds] mds debug scatterstat = True
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mds] mds op complaint time = 180
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mds] mds verify scatter = True
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mds] osd op complaint time = 180
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mds] rados mon op timeout = 900
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mds] rados osd op timeout = 900
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mgr] debug mgr = 20
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mgr] debug ms = 1
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mon] debug mon = 20
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mon] debug ms = 1
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mon] debug paxos = 20
2026-03-10T12:32:14.314 INFO:tasks.cephadm: override: [mon] mon down mkfs grace = 300
2026-03-10T12:32:14.315 INFO:tasks.cephadm: override: [mon] mon op complaint time = 120
2026-03-10T12:32:14.315 DEBUG:teuthology.orchestra.run.vm00:> set -ex
2026-03-10T12:32:14.315 DEBUG:teuthology.orchestra.run.vm00:> dd of=/home/ubuntu/cephtest/seed.ceph.conf
2026-03-10T12:32:14.346 DEBUG:tasks.cephadm:Final config:
[global]
# make logging friendly to teuthology
log_to_file = true
log_to_stderr = false
log to journald = false
mon cluster log to file = true
mon cluster log file level = debug
mon clock drift allowed = 1.000

# replicate across OSDs, not hosts
osd crush chooseleaf type = 0
#osd pool default size = 2
osd pool default erasure code profile = plugin=jerasure technique=reed_sol_van k=2 m=1 crush-failure-domain=osd

# enable some debugging
auth debug = true
ms die on old message = true
ms die on bug = true
debug asserts on shutdown = true

# adjust warnings
mon max pg per osd = 10000        # >= luminous
mon pg warn max object skew = 0
mon osd allow primary affinity = true
mon osd allow pg remap = true
mon warn on legacy crush tunables = false
mon warn on crush straw calc version zero = false
mon warn on no sortbitwise = false
mon warn on osd down out interval zero = false
mon warn on too few osds = false
mon_warn_on_pool_pg_num_not_power_of_two = false

# disable pg_autoscaler by default for new pools
osd_pool_default_pg_autoscale_mode = off

# tests delete pools
mon allow pool delete = true

fsid = 1a52002a-1c7d-11f1-af82-51cdd81caea8
mon pg warn min per osd = 0

[osd]
osd scrub load threshold = 5.0
osd scrub max interval = 600
osd mclock profile = high_recovery_ops
osd recover clone overlap = true
osd recovery max chunk = 1048576
osd deep scrub update digest min age = 30
osd map max advance = 10
osd memory target autotune = true

# debugging
osd debug shutdown = true
osd debug op order = true
osd debug verify stray on activate = true
osd debug pg log writeout = true
osd debug verify cached snaps = true
osd debug verify missing on start = true
osd debug misdirected ops = true
osd op queue = debug_random
osd op queue cut off = debug_random
osd shutdown pgref assert = true
bdev debug aio = true
osd sloppy crc = true
osd_class_default_list = *
osd_class_load_list = *
bdev async discard = True
bdev enable discard = True
bluestore allocator = bitmap
bluestore block size = 96636764160
bluestore fsck on mount = True
debug bluefs = 1/20
debug bluestore = 1/20
debug ms = 1
debug osd = 20
debug rocksdb = 4/10
mon osd backfillfull_ratio = 0.85
mon osd full ratio = 0.9
mon osd nearfull ratio = 0.8
osd failsafe full ratio = 0.95
osd mclock iops capacity threshold hdd = 49000
osd objectstore = bluestore
osd op complaint time = 180

[mgr]
mon reweight min pgs per osd = 4
mon reweight min bytes per osd = 10
mgr/telemetry/nag = false
debug mgr = 20
debug ms = 1

[mon]
mon data avail warn = 5
mon mgr mkfs grace = 240
mon reweight min pgs per osd = 4
mon osd reporter subtree level = osd
mon osd prime pg temp = true
mon reweight min bytes per osd = 10

# rotate auth tickets quickly to exercise renewal paths
auth mon ticket ttl = 660      # 11m
auth service ticket ttl = 240  # 4m

# don't complain about global id reclaim
mon_warn_on_insecure_global_id_reclaim = false
mon_warn_on_insecure_global_id_reclaim_allowed = false
debug mon = 20
debug ms = 1
debug paxos = 20
mon down mkfs grace = 300
mon op complaint time = 120

[client.rgw]
rgw cache enabled = true
rgw enable ops log = true
rgw enable usage log = true

[client]
client mount timeout = 600
debug client = 20
debug ms = 1
rados mon op timeout = 900
rados osd op timeout = 900

[mds]
debug mds = 20
debug mds balancer = 20
debug ms = 1
mds debug frag = True
mds debug scatterstat = True
mds op complaint time = 180
mds verify scatter = True
osd op complaint time = 180
rados mon op timeout = 900
rados osd op timeout = 900
2026-03-10T12:32:14.346 DEBUG:teuthology.orchestra.run.vm00:mon.vm00> sudo journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mon.vm00.service
2026-03-10T12:32:14.388 INFO:tasks.cephadm:Bootstrapping...
2026-03-10T12:32:14.388 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 -v bootstrap --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 --config /home/ubuntu/cephtest/seed.ceph.conf --output-config /etc/ceph/ceph.conf --output-keyring /etc/ceph/ceph.client.admin.keyring --output-pub-ssh-key /home/ubuntu/cephtest/ceph.pub --mon-ip 192.168.123.100 --skip-admin-label && sudo chmod +r /etc/ceph/ceph.client.admin.keyring
2026-03-10T12:32:14.511 INFO:teuthology.orchestra.run.vm00.stdout:--------------------------------------------------------------------------------
2026-03-10T12:32:14.511 INFO:teuthology.orchestra.run.vm00.stdout:cephadm ['--image', 'quay.io/ceph/ceph:v18.2.0', '-v', 'bootstrap', '--fsid', '1a52002a-1c7d-11f1-af82-51cdd81caea8', '--config', '/home/ubuntu/cephtest/seed.ceph.conf', '--output-config', '/etc/ceph/ceph.conf', '--output-keyring', '/etc/ceph/ceph.client.admin.keyring', '--output-pub-ssh-key', '/home/ubuntu/cephtest/ceph.pub', '--mon-ip', '192.168.123.100', '--skip-admin-label']
2026-03-10T12:32:14.538 INFO:teuthology.orchestra.run.vm00.stdout:/bin/podman: stdout 5.8.0
2026-03-10T12:32:14.538 INFO:teuthology.orchestra.run.vm00.stderr:Specifying an fsid for your cluster offers no advantages and may increase the likelihood of fsid conflicts.
2026-03-10T12:32:14.538 INFO:teuthology.orchestra.run.vm00.stdout:Verifying podman|docker is present...
2026-03-10T12:32:14.558 INFO:teuthology.orchestra.run.vm00.stdout:/bin/podman: stdout 5.8.0
2026-03-10T12:32:14.558 INFO:teuthology.orchestra.run.vm00.stdout:Verifying lvm2 is present...
2026-03-10T12:32:14.559 INFO:teuthology.orchestra.run.vm00.stdout:Verifying time synchronization is in place...
2026-03-10T12:32:14.567 INFO:teuthology.orchestra.run.vm00.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service
2026-03-10T12:32:14.567 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory
2026-03-10T12:32:14.573 INFO:teuthology.orchestra.run.vm00.stdout:Non-zero exit code 3 from systemctl is-active chrony.service
2026-03-10T12:32:14.574 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stdout inactive
2026-03-10T12:32:14.581 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stdout enabled
2026-03-10T12:32:14.586 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stdout active
2026-03-10T12:32:14.586 INFO:teuthology.orchestra.run.vm00.stdout:Unit chronyd.service is enabled and running
2026-03-10T12:32:14.586 INFO:teuthology.orchestra.run.vm00.stdout:Repeating the final host check...
2026-03-10T12:32:14.606 INFO:teuthology.orchestra.run.vm00.stdout:/bin/podman: stdout 5.8.0
2026-03-10T12:32:14.606 INFO:teuthology.orchestra.run.vm00.stdout:podman (/bin/podman) version 5.8.0 is present
2026-03-10T12:32:14.606 INFO:teuthology.orchestra.run.vm00.stdout:systemctl is present
2026-03-10T12:32:14.606 INFO:teuthology.orchestra.run.vm00.stdout:lvcreate is present
2026-03-10T12:32:14.614 INFO:teuthology.orchestra.run.vm00.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service
2026-03-10T12:32:14.614 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory
2026-03-10T12:32:14.622 INFO:teuthology.orchestra.run.vm00.stdout:Non-zero exit code 3 from systemctl is-active chrony.service
2026-03-10T12:32:14.622 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stdout inactive
2026-03-10T12:32:14.630 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stdout enabled
2026-03-10T12:32:14.636 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stdout active
2026-03-10T12:32:14.636 INFO:teuthology.orchestra.run.vm00.stdout:Unit chronyd.service is enabled and running
2026-03-10T12:32:14.636 INFO:teuthology.orchestra.run.vm00.stdout:Host looks OK
2026-03-10T12:32:14.636 INFO:teuthology.orchestra.run.vm00.stdout:Cluster fsid: 1a52002a-1c7d-11f1-af82-51cdd81caea8
2026-03-10T12:32:14.636 INFO:teuthology.orchestra.run.vm00.stdout:Acquiring lock 140176184566736 on /run/cephadm/1a52002a-1c7d-11f1-af82-51cdd81caea8.lock
2026-03-10T12:32:14.636 INFO:teuthology.orchestra.run.vm00.stdout:Lock 140176184566736 acquired on /run/cephadm/1a52002a-1c7d-11f1-af82-51cdd81caea8.lock
2026-03-10T12:32:14.637 INFO:teuthology.orchestra.run.vm00.stdout:Verifying IP 192.168.123.100 port 3300 ...
2026-03-10T12:32:14.637 INFO:teuthology.orchestra.run.vm00.stdout:Verifying IP 192.168.123.100 port 6789 ...
2026-03-10T12:32:14.637 INFO:teuthology.orchestra.run.vm00.stdout:Base mon IP(s) is [192.168.123.100:3300, 192.168.123.100:6789], mon addrv is [v2:192.168.123.100:3300,v1:192.168.123.100:6789]
2026-03-10T12:32:14.641 INFO:teuthology.orchestra.run.vm00.stdout:/sbin/ip: stdout default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.100 metric 100
2026-03-10T12:32:14.641 INFO:teuthology.orchestra.run.vm00.stdout:/sbin/ip: stdout 192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.100 metric 100
2026-03-10T12:32:14.643 INFO:teuthology.orchestra.run.vm00.stdout:/sbin/ip: stdout ::1 dev lo proto kernel metric 256 pref medium
2026-03-10T12:32:14.643 INFO:teuthology.orchestra.run.vm00.stdout:/sbin/ip: stdout fe80::/64 dev eth0 proto kernel metric 1024 pref medium
2026-03-10T12:32:14.645 INFO:teuthology.orchestra.run.vm00.stdout:/sbin/ip: stdout 1: lo: mtu 65536 state UNKNOWN qlen 1000
2026-03-10T12:32:14.645 INFO:teuthology.orchestra.run.vm00.stdout:/sbin/ip: stdout inet6 ::1/128 scope host
2026-03-10T12:32:14.645 INFO:teuthology.orchestra.run.vm00.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever
2026-03-10T12:32:14.645 INFO:teuthology.orchestra.run.vm00.stdout:/sbin/ip: stdout 2: eth0: mtu 1500 state UP qlen 1000
2026-03-10T12:32:14.646 INFO:teuthology.orchestra.run.vm00.stdout:/sbin/ip: stdout inet6 fe80::5055:ff:fe00:0/64 scope link noprefixroute
2026-03-10T12:32:14.646 INFO:teuthology.orchestra.run.vm00.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever
2026-03-10T12:32:14.646 INFO:teuthology.orchestra.run.vm00.stdout:Mon IP `192.168.123.100` is in CIDR network `192.168.123.0/24`
2026-03-10T12:32:14.646 INFO:teuthology.orchestra.run.vm00.stdout:Mon IP `192.168.123.100` is in CIDR network `192.168.123.0/24`
2026-03-10T12:32:14.646 INFO:teuthology.orchestra.run.vm00.stdout:Inferred mon public CIDR from local network configuration ['192.168.123.0/24', '192.168.123.0/24']
2026-03-10T12:32:14.647 INFO:teuthology.orchestra.run.vm00.stdout:Internal network (--cluster-network) has not been provided, OSD replication will default to the public_network
2026-03-10T12:32:14.647 INFO:teuthology.orchestra.run.vm00.stdout:Pulling container image quay.io/ceph/ceph:v18.2.0...
2026-03-10T12:32:15.863 INFO:teuthology.orchestra.run.vm00.stdout:/bin/podman: stdout dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946
2026-03-10T12:32:15.863 INFO:teuthology.orchestra.run.vm00.stdout:/bin/podman: stderr Trying to pull quay.io/ceph/ceph:v18.2.0...
2026-03-10T12:32:15.863 INFO:teuthology.orchestra.run.vm00.stdout:/bin/podman: stderr Getting image source signatures
2026-03-10T12:32:15.863 INFO:teuthology.orchestra.run.vm00.stdout:/bin/podman: stderr Copying blob sha256:3bd20aeff60302f668275dc2005d10679ae56492967a3a5a54fd3dde85333aec
2026-03-10T12:32:15.863 INFO:teuthology.orchestra.run.vm00.stdout:/bin/podman: stderr Copying blob sha256:46af8f5390d4e94fc57efb422ccb97bb53dfe5b948546bfc191b46557eb2dbd9
2026-03-10T12:32:15.863 INFO:teuthology.orchestra.run.vm00.stdout:/bin/podman: stderr Copying config sha256:dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946
2026-03-10T12:32:15.864 INFO:teuthology.orchestra.run.vm00.stdout:/bin/podman: stderr Writing manifest to image destination
2026-03-10T12:32:16.050 INFO:teuthology.orchestra.run.vm00.stdout:ceph: stdout ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)
2026-03-10T12:32:16.050 INFO:teuthology.orchestra.run.vm00.stdout:Ceph version: ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)
2026-03-10T12:32:16.050 INFO:teuthology.orchestra.run.vm00.stdout:Extracting ceph user uid/gid from container image...
2026-03-10T12:32:16.135 INFO:teuthology.orchestra.run.vm00.stdout:stat: stdout 167 167
2026-03-10T12:32:16.135 INFO:teuthology.orchestra.run.vm00.stdout:Creating initial keys...
2026-03-10T12:32:16.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph-authtool: stdout AQBQD7BpepTEDRAACL/8YbPaHlxKV/UJhoZuKw==
2026-03-10T12:32:16.368 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph-authtool: stdout AQBQD7Bp98LUFBAAnDm/ISIp0z/f0bfQr73qwA==
2026-03-10T12:32:16.498 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph-authtool: stdout AQBQD7Bp8h0CGxAATsPQdjm+ROAQVC4ML/3DVA==
2026-03-10T12:32:16.498 INFO:teuthology.orchestra.run.vm00.stdout:Creating initial monmap...
2026-03-10T12:32:16.597 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: monmap file /tmp/monmap
2026-03-10T12:32:16.597 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/monmaptool: stdout setting min_mon_release = pacific
2026-03-10T12:32:16.597 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: set fsid to 1a52002a-1c7d-11f1-af82-51cdd81caea8
2026-03-10T12:32:16.597 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
2026-03-10T12:32:16.597 INFO:teuthology.orchestra.run.vm00.stdout:monmaptool for vm00 [v2:192.168.123.100:3300,v1:192.168.123.100:6789] on /usr/bin/monmaptool: monmap file /tmp/monmap
2026-03-10T12:32:16.597 INFO:teuthology.orchestra.run.vm00.stdout:setting min_mon_release = pacific
2026-03-10T12:32:16.598 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/monmaptool: set fsid to 1a52002a-1c7d-11f1-af82-51cdd81caea8
2026-03-10T12:32:16.598 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
2026-03-10T12:32:16.598 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:32:16.598 INFO:teuthology.orchestra.run.vm00.stdout:Creating mon...
2026-03-10T12:32:16.748 INFO:teuthology.orchestra.run.vm00.stdout:create mon.vm00 on
2026-03-10T12:32:17.015 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stderr Removed "/etc/systemd/system/multi-user.target.wants/ceph.target".
2026-03-10T12:32:17.154 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /etc/systemd/system/ceph.target.
2026-03-10T12:32:17.284 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8.target → /etc/systemd/system/ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8.target.
2026-03-10T12:32:17.284 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph.target.wants/ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8.target → /etc/systemd/system/ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8.target.
2026-03-10T12:32:17.433 INFO:teuthology.orchestra.run.vm00.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mon.vm00
2026-03-10T12:32:17.433 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stderr Failed to reset failed state of unit ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mon.vm00.service: Unit ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mon.vm00.service not loaded.
2026-03-10T12:32:17.578 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8.target.wants/ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mon.vm00.service → /etc/systemd/system/ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@.service.
2026-03-10T12:32:17.753 INFO:teuthology.orchestra.run.vm00.stdout:firewalld does not appear to be present
2026-03-10T12:32:17.753 INFO:teuthology.orchestra.run.vm00.stdout:Not possible to enable service . firewalld.service is not available
2026-03-10T12:32:17.753 INFO:teuthology.orchestra.run.vm00.stdout:Waiting for mon to start...
2026-03-10T12:32:17.753 INFO:teuthology.orchestra.run.vm00.stdout:Waiting for mon...
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout cluster:
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout id: 1a52002a-1c7d-11f1-af82-51cdd81caea8
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout health: HEALTH_OK
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout services:
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mon: 1 daemons, quorum vm00 (age 0.181211s)
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mgr: no daemons active
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout osd: 0 osds: 0 up, 0 in
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout data:
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout pools: 0 pools, 0 pgs
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout objects: 0 objects, 0 B
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout usage: 0 B used, 0 B / 0 B avail
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout pgs:
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.908+0000 7ff6458d3700 1 Processor -- start
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.909+0000 7ff6458d3700 1 -- start start
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.909+0000 7ff6458d3700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff64007be00 0x7ff64007a300 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.909+0000 7ff6458d3700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff64007a840 con 0x7ff64007be00
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.910+0000 7ff63effd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff64007be00 0x7ff64007a300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:32:17.990 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.910+0000 7ff63effd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff64007be00 0x7ff64007a300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50282/0 (socket says 192.168.123.100:50282)
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.910+0000 7ff63effd700 1 -- 192.168.123.100:0/3440744392 learned_addr learned my addr 192.168.123.100:0/3440744392 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.910+0000 7ff63effd700 1 -- 192.168.123.100:0/3440744392 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff64007a980 con 0x7ff64007be00
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.911+0000 7ff63effd700 1 --2- 192.168.123.100:0/3440744392 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff64007be00 0x7ff64007a300 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7ff628009cf0 tx=0x7ff62800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=72fe1c7fc60b7b33 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.911+0000 7ff63dffb700 1 -- 192.168.123.100:0/3440744392 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff628004030 con 0x7ff64007be00
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.911+0000 7ff63dffb700 1 -- 192.168.123.100:0/3440744392 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7ff628004190 con 0x7ff64007be00
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.911+0000 7ff63dffb700 1 -- 192.168.123.100:0/3440744392 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff628004320 con 0x7ff64007be00
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.911+0000 7ff6458d3700 1 -- 192.168.123.100:0/3440744392 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff64007be00 msgr2=0x7ff64007a300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.911+0000 7ff6458d3700 1 --2- 192.168.123.100:0/3440744392 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff64007be00 0x7ff64007a300 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7ff628009cf0 tx=0x7ff62800b0e0 comp rx=0 tx=0).stop
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.912+0000 7ff6458d3700 1 -- 192.168.123.100:0/3440744392 shutdown_connections
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.912+0000 7ff6458d3700 1 --2- 192.168.123.100:0/3440744392 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff64007be00 0x7ff64007a300 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.912+0000 7ff6458d3700 1 -- 192.168.123.100:0/3440744392 >> 192.168.123.100:0/3440744392 conn(0x7ff640101420 msgr2=0x7ff640103830 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.912+0000 7ff6458d3700 1 -- 192.168.123.100:0/3440744392 shutdown_connections
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.912+0000 7ff6458d3700 1 -- 192.168.123.100:0/3440744392 wait complete.
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.912+0000 7ff6458d3700 1 Processor -- start
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.912+0000 7ff6458d3700 1 -- start start
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.913+0000 7ff6458d3700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff64019b3e0 0x7ff64019b7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.913+0000 7ff6458d3700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff64019bd30 con 0x7ff64019b3e0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.913+0000 7ff63effd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff64019b3e0 0x7ff64019b7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.913+0000 7ff63effd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff64019b3e0 0x7ff64019b7f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50290/0 (socket says 192.168.123.100:50290)
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.913+0000 7ff63effd700 1 -- 192.168.123.100:0/3524230563 learned_addr learned my addr 192.168.123.100:0/3524230563 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.913+0000 7ff63effd700 1 -- 192.168.123.100:0/3524230563 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff628009740 con 0x7ff64019b3e0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.913+0000 7ff63effd700 1 --2- 192.168.123.100:0/3524230563 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff64019b3e0 0x7ff64019b7f0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7ff628009130 tx=0x7ff628004750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.914+0000 7ff6448d1700 1 -- 192.168.123.100:0/3524230563 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff628004030 con 0x7ff64019b3e0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.914+0000 7ff6448d1700 1 -- 192.168.123.100:0/3524230563 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7ff6280036a0 con 0x7ff64019b3e0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.914+0000 7ff6448d1700 1 -- 192.168.123.100:0/3524230563 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff628003810 con 0x7ff64019b3e0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.914+0000 7ff6458d3700 1 -- 192.168.123.100:0/3524230563 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff64019bf30 con 0x7ff64019b3e0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.914+0000 7ff6458d3700 1 -- 192.168.123.100:0/3524230563 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff64019eb90 con 0x7ff64019b3e0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.915+0000 7ff6448d1700 1 -- 192.168.123.100:0/3524230563 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7ff628022020 con 0x7ff64019b3e0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.915+0000 7ff6448d1700 1 -- 192.168.123.100:0/3524230563 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7ff62801ba60 con 0x7ff64019b3e0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.915+0000 7ff6458d3700 1 -- 192.168.123.100:0/3524230563 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff640195300 con 0x7ff64019b3e0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.917+0000 7ff6448d1700 1 -- 192.168.123.100:0/3524230563 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7ff628044b00 con 0x7ff64019b3e0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.957+0000 7ff6458d3700 1 -- 192.168.123.100:0/3524230563 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "status"} v 0) v1 -- 0x7ff640062380 con 0x7ff64019b3e0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.958+0000 7ff6448d1700 1 -- 192.168.123.100:0/3524230563 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "status"}]=0 v0) v1 ==== 54+0+320 (secure 0 0 0) 0x7ff628033030 con 0x7ff64019b3e0
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.959+0000 7ff6458d3700 1 -- 192.168.123.100:0/3524230563 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff64019b3e0 msgr2=0x7ff64019b7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.959+0000 7ff6458d3700 1 --2- 192.168.123.100:0/3524230563 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff64019b3e0 0x7ff64019b7f0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7ff628009130 tx=0x7ff628004750 comp rx=0 tx=0).stop
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.959+0000 7ff6458d3700 1 -- 192.168.123.100:0/3524230563 shutdown_connections
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.959+0000 7ff6458d3700 1 --2- 192.168.123.100:0/3524230563 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff64019b3e0 0x7ff64019b7f0 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.959+0000 7ff6458d3700 1 -- 192.168.123.100:0/3524230563 >> 192.168.123.100:0/3524230563 conn(0x7ff640101420 msgr2=0x7ff6401037d0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.959+0000 7ff6458d3700 1 -- 192.168.123.100:0/3524230563 shutdown_connections
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:17.959+0000 7ff6458d3700 1 -- 192.168.123.100:0/3524230563 wait complete.
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:mon is available
2026-03-10T12:32:17.991 INFO:teuthology.orchestra.run.vm00.stdout:Assimilating anything we can from ceph.conf...
2026-03-10T12:32:18.254 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout
2026-03-10T12:32:18.254 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout [global]
2026-03-10T12:32:18.255 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout fsid = 1a52002a-1c7d-11f1-af82-51cdd81caea8
2026-03-10T12:32:18.255 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mon_host = [v2:192.168.123.100:3300,v1:192.168.123.100:6789]
2026-03-10T12:32:18.255 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true
2026-03-10T12:32:18.255 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true
2026-03-10T12:32:18.255 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false
2026-03-10T12:32:18.255 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0
2026-03-10T12:32:18.255 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout
2026-03-10T12:32:18.255 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout [mgr]
2026-03-10T12:32:18.255 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout [osd]
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.141+0000 7f8a905b8700 1 Processor -- start
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.142+0000 7f8a905b8700 1 -- start start
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.142+0000 7f8a905b8700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8a8807acf0 0x7f8a880791f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.142+0000 7f8a905b8700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8a88079730 con 0x7f8a8807acf0
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.142+0000 7f8a8e354700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8a8807acf0 0x7f8a880791f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.142+0000 7f8a8e354700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8a8807acf0 0x7f8a880791f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50300/0 (socket says 192.168.123.100:50300)
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.142+0000 7f8a8e354700 1 -- 192.168.123.100:0/2644162091 learned_addr learned my addr 192.168.123.100:0/2644162091 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.143+0000 7f8a8e354700 1 -- 192.168.123.100:0/2644162091 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8a88079870 con 0x7f8a8807acf0
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.143+0000 7f8a8e354700 1 --2- 192.168.123.100:0/2644162091 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8a8807acf0 0x7f8a880791f0 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7f8a7c009a90 tx=0x7f8a7c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=80241765826a02d3 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.143+0000 7f8a8d352700 1 -- 192.168.123.100:0/2644162091 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8a7c004030 con 0x7f8a8807acf0
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.143+0000 7f8a8d352700 1 -- 192.168.123.100:0/2644162091 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f8a7c004190 con 0x7f8a8807acf0
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.143+0000 7f8a8d352700 1 -- 192.168.123.100:0/2644162091 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8a7c004320 con 0x7f8a8807acf0
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.144+0000 7f8a905b8700 1 -- 192.168.123.100:0/2644162091 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8a8807acf0 msgr2=0x7f8a880791f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr
2026-03-10T12:32:18.144+0000 7f8a905b8700 1 --2- 192.168.123.100:0/2644162091 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8a8807acf0 0x7f8a880791f0 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7f8a7c009a90 tx=0x7f8a7c009da0 comp rx=0 tx=0).stop 2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.144+0000 7f8a905b8700 1 -- 192.168.123.100:0/2644162091 shutdown_connections 2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.144+0000 7f8a905b8700 1 --2- 192.168.123.100:0/2644162091 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8a8807acf0 0x7f8a880791f0 unknown :-1 s=CLOSED pgs=3 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.144+0000 7f8a905b8700 1 -- 192.168.123.100:0/2644162091 >> 192.168.123.100:0/2644162091 conn(0x7f8a881013a0 msgr2=0x7f8a881037b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.144+0000 7f8a905b8700 1 -- 192.168.123.100:0/2644162091 shutdown_connections 2026-03-10T12:32:18.256 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.144+0000 7f8a905b8700 1 -- 192.168.123.100:0/2644162091 wait complete. 
2026-03-10T12:32:18.257 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.145+0000 7f8a905b8700 1 Processor -- start 2026-03-10T12:32:18.257 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.145+0000 7f8a905b8700 1 -- start start 2026-03-10T12:32:18.257 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.145+0000 7f8a905b8700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8a8807acf0 0x7f8a8819b300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:18.257 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.145+0000 7f8a905b8700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8a8819b840 con 0x7f8a8807acf0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.145+0000 7f8a8e354700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8a8807acf0 0x7f8a8819b300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.145+0000 7f8a8e354700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8a8807acf0 0x7f8a8819b300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50314/0 (socket says 192.168.123.100:50314) 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.145+0000 7f8a8e354700 1 -- 192.168.123.100:0/2946063963 learned_addr learned my addr 192.168.123.100:0/2946063963 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:18.258 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.146+0000 7f8a8e354700 1 -- 192.168.123.100:0/2946063963 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8a7c009740 con 0x7f8a8807acf0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.146+0000 7f8a8e354700 1 --2- 192.168.123.100:0/2946063963 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8a8807acf0 0x7f8a8819b300 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7f8a7c004000 tx=0x7f8a7c004750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.146+0000 7f8a7b7fe700 1 -- 192.168.123.100:0/2946063963 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8a7c00be40 con 0x7f8a8807acf0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.146+0000 7f8a7b7fe700 1 -- 192.168.123.100:0/2946063963 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f8a7c029aa0 con 0x7f8a8807acf0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.146+0000 7f8a7b7fe700 1 -- 192.168.123.100:0/2946063963 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8a7c029d90 con 0x7f8a8807acf0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.146+0000 7f8a905b8700 1 -- 192.168.123.100:0/2946063963 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8a8819ba40 con 0x7f8a8807acf0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.146+0000 7f8a905b8700 1 -- 
192.168.123.100:0/2946063963 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8a8819bee0 con 0x7f8a8807acf0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.147+0000 7f8a7b7fe700 1 -- 192.168.123.100:0/2946063963 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f8a7c0039a0 con 0x7f8a8807acf0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.147+0000 7f8a7b7fe700 1 -- 192.168.123.100:0/2946063963 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f8a7c0338c0 con 0x7f8a8807acf0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.148+0000 7f8a905b8700 1 -- 192.168.123.100:0/2946063963 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8a88195280 con 0x7f8a8807acf0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.149+0000 7f8a7b7fe700 1 -- 192.168.123.100:0/2946063963 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f8a7c033460 con 0x7f8a8807acf0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.191+0000 7f8a905b8700 1 -- 192.168.123.100:0/2946063963 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7f8a8802d090 con 0x7f8a8807acf0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.195+0000 7f8a7b7fe700 1 -- 192.168.123.100:0/2946063963 <== mon.0 v2:192.168.123.100:3300/0 7 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f8a7c003d40 con 
0x7f8a8807acf0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.196+0000 7f8a7b7fe700 1 -- 192.168.123.100:0/2946063963 <== mon.0 v2:192.168.123.100:3300/0 8 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v2) v1 ==== 70+0+435 (secure 0 0 0) 0x7f8a7c042d80 con 0x7f8a8807acf0 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.198+0000 7f8a905b8700 1 -- 192.168.123.100:0/2946063963 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8a8807acf0 msgr2=0x7f8a8819b300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.198+0000 7f8a905b8700 1 --2- 192.168.123.100:0/2946063963 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8a8807acf0 0x7f8a8819b300 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7f8a7c004000 tx=0x7f8a7c004750 comp rx=0 tx=0).stop 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.198+0000 7f8a905b8700 1 -- 192.168.123.100:0/2946063963 shutdown_connections 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.198+0000 7f8a905b8700 1 --2- 192.168.123.100:0/2946063963 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8a8807acf0 0x7f8a8819b300 unknown :-1 s=CLOSED pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.198+0000 7f8a905b8700 1 -- 192.168.123.100:0/2946063963 >> 192.168.123.100:0/2946063963 conn(0x7f8a881013a0 msgr2=0x7f8a88102c30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.198+0000 7f8a905b8700 1 -- 192.168.123.100:0/2946063963 
shutdown_connections 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.198+0000 7f8a905b8700 1 -- 192.168.123.100:0/2946063963 wait complete. 2026-03-10T12:32:18.258 INFO:teuthology.orchestra.run.vm00.stdout:Generating new minimal ceph.conf... 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.386+0000 7f57a4c42700 1 Processor -- start 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.386+0000 7f57a4c42700 1 -- start start 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.386+0000 7f57a4c42700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57a0107f40 0x7f57a0108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.386+0000 7f57a4c42700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f57a0108890 con 0x7f57a0107f40 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.387+0000 7f579e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57a0107f40 0x7f57a0108350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.387+0000 7f579e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57a0107f40 0x7f57a0108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50320/0 (socket says 192.168.123.100:50320) 2026-03-10T12:32:18.493 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.387+0000 7f579e59c700 1 -- 192.168.123.100:0/2770329453 learned_addr learned my addr 192.168.123.100:0/2770329453 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.387+0000 7f579e59c700 1 -- 192.168.123.100:0/2770329453 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f57a01089d0 con 0x7f57a0107f40 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.387+0000 7f579e59c700 1 --2- 192.168.123.100:0/2770329453 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57a0107f40 0x7f57a0108350 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7f5788009cf0 tx=0x7f578800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=4c79f18e81dd13bc server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.388+0000 7f579d59a700 1 -- 192.168.123.100:0/2770329453 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5788004030 con 0x7f57a0107f40 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.388+0000 7f579d59a700 1 -- 192.168.123.100:0/2770329453 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f578800b810 con 0x7f57a0107f40 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.388+0000 7f57a4c42700 1 -- 192.168.123.100:0/2770329453 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57a0107f40 msgr2=0x7f57a0108350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.388+0000 
7f57a4c42700 1 --2- 192.168.123.100:0/2770329453 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57a0107f40 0x7f57a0108350 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7f5788009cf0 tx=0x7f578800b0e0 comp rx=0 tx=0).stop 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.388+0000 7f57a4c42700 1 -- 192.168.123.100:0/2770329453 shutdown_connections 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.388+0000 7f57a4c42700 1 --2- 192.168.123.100:0/2770329453 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57a0107f40 0x7f57a0108350 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.388+0000 7f57a4c42700 1 -- 192.168.123.100:0/2770329453 >> 192.168.123.100:0/2770329453 conn(0x7f57a0103770 msgr2=0x7f57a0105b50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.389+0000 7f57a4c42700 1 -- 192.168.123.100:0/2770329453 shutdown_connections 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.389+0000 7f57a4c42700 1 -- 192.168.123.100:0/2770329453 wait complete. 
2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.389+0000 7f57a4c42700 1 Processor -- start 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.389+0000 7f57a4c42700 1 -- start start 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.389+0000 7f57a4c42700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57a019bc60 0x7f57a019c070 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.389+0000 7f57a4c42700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f57a0108890 con 0x7f57a019bc60 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.390+0000 7f579e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57a019bc60 0x7f57a019c070 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.390+0000 7f579e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57a019bc60 0x7f57a019c070 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50332/0 (socket says 192.168.123.100:50332) 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.390+0000 7f579e59c700 1 -- 192.168.123.100:0/3454864433 learned_addr learned my addr 192.168.123.100:0/3454864433 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:18.493 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.390+0000 7f579e59c700 1 -- 192.168.123.100:0/3454864433 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5788009740 con 0x7f57a019bc60 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.390+0000 7f579e59c700 1 --2- 192.168.123.100:0/3454864433 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57a019bc60 0x7f57a019c070 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f5788009130 tx=0x7f5788011750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.390+0000 7f57977fe700 1 -- 192.168.123.100:0/3454864433 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f57880036a0 con 0x7f57a019bc60 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.390+0000 7f57977fe700 1 -- 192.168.123.100:0/3454864433 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f5788011e10 con 0x7f57a019bc60 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.390+0000 7f57977fe700 1 -- 192.168.123.100:0/3454864433 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f578801ac80 con 0x7f57a019bc60 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.390+0000 7f57a4c42700 1 -- 192.168.123.100:0/3454864433 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f57a019c5b0 con 0x7f57a019bc60 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.391+0000 7f57a4c42700 1 -- 
192.168.123.100:0/3454864433 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f57a019f230 con 0x7f57a019bc60 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.391+0000 7f57977fe700 1 -- 192.168.123.100:0/3454864433 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f5788024750 con 0x7f57a019bc60 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.391+0000 7f57977fe700 1 -- 192.168.123.100:0/3454864433 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f5788019d80 con 0x7f57a019bc60 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.392+0000 7f57a4c42700 1 -- 192.168.123.100:0/3454864433 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f57a004f9e0 con 0x7f57a019bc60 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.393+0000 7f57977fe700 1 -- 192.168.123.100:0/3454864433 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f5788029050 con 0x7f57a019bc60 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.430+0000 7f57a4c42700 1 -- 192.168.123.100:0/3454864433 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7f57a0062380 con 0x7f57a019bc60 2026-03-10T12:32:18.493 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.430+0000 7f57977fe700 1 -- 192.168.123.100:0/3454864433 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v2) v1 
==== 76+0+181 (secure 0 0 0) 0x7f578801f070 con 0x7f57a019bc60 2026-03-10T12:32:18.494 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.431+0000 7f57a4c42700 1 -- 192.168.123.100:0/3454864433 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57a019bc60 msgr2=0x7f57a019c070 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:18.494 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.431+0000 7f57a4c42700 1 --2- 192.168.123.100:0/3454864433 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57a019bc60 0x7f57a019c070 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f5788009130 tx=0x7f5788011750 comp rx=0 tx=0).stop 2026-03-10T12:32:18.494 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.432+0000 7f57a4c42700 1 -- 192.168.123.100:0/3454864433 shutdown_connections 2026-03-10T12:32:18.494 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.432+0000 7f57a4c42700 1 --2- 192.168.123.100:0/3454864433 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57a019bc60 0x7f57a019c070 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:18.494 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.432+0000 7f57a4c42700 1 -- 192.168.123.100:0/3454864433 >> 192.168.123.100:0/3454864433 conn(0x7f57a0103770 msgr2=0x7f57a01052f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:18.494 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.432+0000 7f57a4c42700 1 -- 192.168.123.100:0/3454864433 shutdown_connections 2026-03-10T12:32:18.494 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.432+0000 7f57a4c42700 1 -- 192.168.123.100:0/3454864433 wait complete. 
2026-03-10T12:32:18.494 INFO:teuthology.orchestra.run.vm00.stdout:Restarting the monitor... 2026-03-10T12:32:18.656 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 bash[50602]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00 2026-03-10T12:32:18.814 INFO:teuthology.orchestra.run.vm00.stdout:Setting public_network to 192.168.123.0/24 in global config section 2026-03-10T12:32:18.932 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mon.vm00.service: Deactivated successfully. 2026-03-10T12:32:18.932 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 systemd[1]: Stopped Ceph mon.vm00 for 1a52002a-1c7d-11f1-af82-51cdd81caea8. 2026-03-10T12:32:18.932 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 systemd[1]: Starting Ceph mon.vm00 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 2026-03-10T12:32:18.932 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 podman[50672]: 2026-03-10 12:32:18.770882283 +0000 UTC m=+0.016781873 container create c8d836b38502143acd65e1297ec718326020ef5a02520a61c79d2f72d906ddd6 (image=quay.io/ceph/ceph:v18.2.0, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00, CEPH_POINT_RELEASE=-18.2.0, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_CLEAN=True, RELEASE=HEAD, org.label-schema.vendor=CentOS, maintainer=Guillaume Abrioux , GIT_BRANCH=HEAD, org.label-schema.build-date=20231212, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, ceph=True, io.buildah.version=1.29.1, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.schema-version=1.0) 2026-03-10T12:32:18.932 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 podman[50672]: 2026-03-10 12:32:18.802731443 +0000 UTC m=+0.048631033 container init c8d836b38502143acd65e1297ec718326020ef5a02520a61c79d2f72d906ddd6 (image=quay.io/ceph/ceph:v18.2.0, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00, 
CEPH_POINT_RELEASE=-18.2.0, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.build-date=20231212, GIT_CLEAN=True, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.vendor=CentOS, maintainer=Guillaume Abrioux , GIT_BRANCH=HEAD, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, RELEASE=HEAD, io.buildah.version=1.29.1, org.label-schema.schema-version=1.0) 2026-03-10T12:32:18.932 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 podman[50672]: 2026-03-10 12:32:18.805282118 +0000 UTC m=+0.051181708 container start c8d836b38502143acd65e1297ec718326020ef5a02520a61c79d2f72d906ddd6 (image=quay.io/ceph/ceph:v18.2.0, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.label-schema.schema-version=1.0, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.build-date=20231212, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, GIT_CLEAN=True, RELEASE=HEAD, io.buildah.version=1.29.1, org.label-schema.vendor=CentOS) 2026-03-10T12:32:18.932 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 bash[50672]: c8d836b38502143acd65e1297ec718326020ef5a02520a61c79d2f72d906ddd6 2026-03-10T12:32:18.932 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 podman[50672]: 2026-03-10 12:32:18.763154843 +0000 UTC m=+0.009054444 image pull dc2bc1663786ad5862ef6efcff18243ab7dfeda0bcb911ddf757a0d20feca946 quay.io/ceph/ceph:v18.2.0 2026-03-10T12:32:18.932 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 systemd[1]: Started Ceph mon.vm00 for 1a52002a-1c7d-11f1-af82-51cdd81caea8. 
2026-03-10T12:32:18.932 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: set uid:gid to 167:167 (ceph:ceph)
2026-03-10T12:32:18.932 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable), process ceph-mon, pid 2
2026-03-10T12:32:18.932 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: pidfile_write: ignore empty --pid-file
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: load: jerasure load: lrc
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: RocksDB version: 7.9.2
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Git sha 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Compile date 2023-08-03 19:21:13
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: DB SUMMARY
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: DB Session ID: TBGS5CKZIETOQ5V17P6G
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: CURRENT file: CURRENT
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: IDENTITY file: IDENTITY
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: MANIFEST file: MANIFEST-000010 size: 179 Bytes
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm00/store.db dir, Total Num: 1, files: 000008.sst
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm00/store.db: 000009.log size: 89048 ;
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.error_if_exists: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.create_if_missing: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.paranoid_checks: 1
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.flush_verify_memtable_count: 1
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.env: 0x559d1afda720
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.fs: PosixFileSystem
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.info_log: 0x559d1cfe9340
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_file_opening_threads: 16
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.statistics: (nil)
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.use_fsync: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_log_file_size: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_manifest_file_size: 1073741824
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.log_file_time_to_roll: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.keep_log_file_num: 1000
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.recycle_log_file_num: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.allow_fallocate: 1
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.allow_mmap_reads: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.allow_mmap_writes: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.use_direct_reads: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.create_missing_column_families: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.db_log_dir:
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.wal_dir:
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.table_cache_numshardbits: 6
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.WAL_ttl_seconds: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.WAL_size_limit_MB: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.manifest_preallocation_size: 4194304
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.is_fd_close_on_exec: 1
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.advise_random_on_open: 1
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.db_write_buffer_size: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.write_buffer_manager: 0x559d1c2785a0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.access_hint_on_compaction_start: 1
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.random_access_max_buffer_size: 1048576
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.use_adaptive_mutex: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.rate_limiter: (nil)
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
2026-03-10T12:32:18.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.wal_recovery_mode: 2
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.enable_thread_tracking: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.enable_pipelined_write: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.unordered_write: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.allow_concurrent_memtable_write: 1
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.write_thread_max_yield_usec: 100
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.write_thread_slow_yield_usec: 3
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.row_cache: None
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.wal_filter: None
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.avoid_flush_during_recovery: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.allow_ingest_behind: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.two_write_queues: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.manual_wal_flush: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.wal_compression: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.atomic_flush: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.persist_stats_to_disk: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.write_dbid_to_manifest: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.log_readahead_size: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.file_checksum_gen_factory: Unknown
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.best_efforts_recovery: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_bgerror_resume_count: 2147483647
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.allow_data_in_errors: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.db_host_id: __hostname__
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.enforce_single_del_contracts: true
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_background_jobs: 2
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_background_compactions: -1
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_subcompactions: 1
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.avoid_flush_during_shutdown: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.writable_file_max_buffer_size: 1048576
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.delayed_write_rate : 16777216
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_total_wal_size: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.stats_dump_period_sec: 600
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.stats_persist_period_sec: 600
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.stats_history_buffer_size: 1048576
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_open_files: -1
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.bytes_per_sync: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.wal_bytes_per_sync: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.strict_bytes_per_sync: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compaction_readahead_size: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_background_flushes: -1
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Compression algorithms supported:
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: kZSTDNotFinalCompression supported: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: kZSTD supported: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: kXpressCompression supported: 0
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: kLZ4HCCompression supported: 1
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: kZlibCompression supported: 1
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: kSnappyCompression supported: 1
2026-03-10T12:32:18.934 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: kLZ4Compression supported: 1
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: kBZip2Compression supported: 0
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Fast CRC32 supported: Supported on x86
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: DMutex implementation: pthread_mutex_t
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm00/store.db/MANIFEST-000010
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.comparator: leveldb.BytewiseComparator
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.merge_operator:
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compaction_filter: None
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compaction_filter_factory: None
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.sst_partitioner_factory: None
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.memtable_factory: SkipListFactory
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.table_factory: BlockBasedTable
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559d1cfe9460)
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: cache_index_and_filter_blocks: 1
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: cache_index_and_filter_blocks_with_high_priority: 0
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: pin_l0_filter_and_index_blocks_in_cache: 0
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: pin_top_level_index_and_filter: 1
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: index_type: 0
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: data_block_index_type: 0
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: index_shortening: 1
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: data_block_hash_table_util_ratio: 0.750000
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: checksum: 4
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: no_block_cache: 0
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_cache: 0x559d1c2fb350
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_cache_name: BinnedLRUCache
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_cache_options:
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: capacity : 536870912
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: num_shard_bits : 4
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: strict_capacity_limit : 0
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: high_pri_pool_ratio: 0.000
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_cache_compressed: (nil)
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: persistent_cache: (nil)
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_size: 4096
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_size_deviation: 10
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_restart_interval: 16
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: index_block_restart_interval: 1
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: metadata_block_size: 4096
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: partition_filters: 0
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: use_delta_encoding: 1
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: filter_policy: bloomfilter
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: whole_key_filtering: 1
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: verify_compression: 0
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: read_amp_bytes_per_bit: 0
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: format_version: 5
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: enable_index_compression: 1
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_align: 0
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: max_auto_readahead_size: 262144
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: prepopulate_block_cache: 0
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: initial_auto_readahead_size: 8192
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout: num_file_reads_for_auto_readahead: 2
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.write_buffer_size: 33554432
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_write_buffer_number: 2
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compression: NoCompression
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.bottommost_compression: Disabled
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.prefix_extractor: nullptr
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
2026-03-10T12:32:18.935 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.num_levels: 7
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.min_write_buffer_number_to_merge: 1
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.bottommost_compression_opts.level: 32767
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.bottommost_compression_opts.strategy: 0
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.bottommost_compression_opts.enabled: false
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compression_opts.window_bits: -14
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compression_opts.level: 32767
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compression_opts.strategy: 0
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compression_opts.max_dict_bytes: 0
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compression_opts.parallel_threads: 1
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compression_opts.enabled: false
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.level0_file_num_compaction_trigger: 4
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.level0_slowdown_writes_trigger: 20
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.level0_stop_writes_trigger: 36
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.target_file_size_base: 67108864
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.target_file_size_multiplier: 1
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_bytes_for_level_base: 268435456
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_sequential_skip_in_iterations: 8
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_compaction_bytes: 1677721600
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.arena_block_size: 1048576
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.disable_auto_compactions: 0
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compaction_style: kCompactionStyleLevel
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compaction_options_universal.size_ratio: 1
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.table_properties_collectors:
2026-03-10T12:32:18.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.inplace_update_support: 0
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.inplace_update_num_locks: 10000
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.memtable_whole_key_filtering: 0
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.memtable_huge_page_size: 0
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.bloom_locality: 0
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.max_successive_merges: 0
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.optimize_filters_for_hits: 0
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.paranoid_file_checks: 0
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.force_consistency_checks: 1
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.report_bg_io_stats: 0
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.ttl: 2592000
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.periodic_compaction_seconds: 0
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.preclude_last_level_data_seconds: 0
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.preserve_internal_time_seconds: 0
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.enable_blob_files: false
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.min_blob_size: 0
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.blob_file_size: 268435456
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.blob_compression_type: NoCompression
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.enable_blob_garbage_collection: false
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.blob_compaction_readahead_size: 0
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.blob_file_starting_level: 0
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm00/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 12, last_sequence is 5, log_number is 5,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 5
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2849e21e-e961-4f79-abd2-83e75de95a7e
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773145938828865, "job": 1, "event": "recovery_started", "wal_files": [9]}
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #9 mode 2
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773145938829873, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 13, "file_size": 84711, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8, "largest_seqno": 287, "table_properties": {"data_size": 82789, "index_size": 209, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 13288, "raw_average_key_size": 51, "raw_value_size": 75614, "raw_average_value_size": 293, "num_data_blocks": 9, "num_entries": 258, "num_filter_entries": 258, "num_deletions": 3, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773145938, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2849e21e-e961-4f79-abd2-83e75de95a7e", "db_session_id": "TBGS5CKZIETOQ5V17P6G", "orig_file_number": 13, "seqno_to_time_mapping": "N/A"}}
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773145938829908, "job": 1, "event": "recovery_finished"}
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: rocksdb: [db/version_set.cc:5047] Creating manifest 15
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: mon.vm00 is new leader, mons vm00 in quorum (ranks 0)
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: monmap e1: 1 mons at {vm00=[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0]} removed_ranks: {}
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: fsmap
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: osdmap e1: 0 total, 0 up, 0 in
2026-03-10T12:32:18.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:18 vm00 ceph-mon[50686]: mgrmap e1: no daemons active
2026-03-10T12:32:19.054 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.950+0000 7f8032894700 1 Processor -- start
2026-03-10T12:32:19.054 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.950+0000 7f8032894700 1 -- start start
2026-03-10T12:32:19.054 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.951+0000 7f8032894700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f802c107f20 0x7f802c108330 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:19.054 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.951+0000 7f8032894700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f802c108870 con 0x7f802c107f20
2026-03-10T12:32:19.054 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.951+0000 7f802bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f802c107f20 0x7f802c108330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload
supported=3 required=0 2026-03-10T12:32:19.054 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.951+0000 7f802bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f802c107f20 0x7f802c108330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50334/0 (socket says 192.168.123.100:50334) 2026-03-10T12:32:19.054 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.951+0000 7f802bfff700 1 -- 192.168.123.100:0/2158859864 learned_addr learned my addr 192.168.123.100:0/2158859864 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:19.054 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.952+0000 7f802bfff700 1 -- 192.168.123.100:0/2158859864 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f802c1089b0 con 0x7f802c107f20 2026-03-10T12:32:19.054 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.952+0000 7f802bfff700 1 --2- 192.168.123.100:0/2158859864 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f802c107f20 0x7f802c108330 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f801c009cf0 tx=0x7f801c00b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a22b4459047e0043 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:19.054 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.952+0000 7f802b7fe700 1 -- 192.168.123.100:0/2158859864 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f801c004030 con 0x7f802c107f20 2026-03-10T12:32:19.054 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.952+0000 7f802b7fe700 1 -- 192.168.123.100:0/2158859864 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 
(secure 0 0 0) 0x7f801c00b810 con 0x7f802c107f20 2026-03-10T12:32:19.054 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.953+0000 7f802b7fe700 1 -- 192.168.123.100:0/2158859864 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f801c0039c0 con 0x7f802c107f20 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.953+0000 7f8032894700 1 -- 192.168.123.100:0/2158859864 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f802c107f20 msgr2=0x7f802c108330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.953+0000 7f8032894700 1 --2- 192.168.123.100:0/2158859864 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f802c107f20 0x7f802c108330 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f801c009cf0 tx=0x7f801c00b0e0 comp rx=0 tx=0).stop 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.953+0000 7f8032894700 1 -- 192.168.123.100:0/2158859864 shutdown_connections 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.953+0000 7f8032894700 1 --2- 192.168.123.100:0/2158859864 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f802c107f20 0x7f802c108330 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.953+0000 7f8032894700 1 -- 192.168.123.100:0/2158859864 >> 192.168.123.100:0/2158859864 conn(0x7f802c07b4b0 msgr2=0x7f802c07b8d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.953+0000 7f8032894700 1 -- 192.168.123.100:0/2158859864 
shutdown_connections 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.953+0000 7f8032894700 1 -- 192.168.123.100:0/2158859864 wait complete. 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.954+0000 7f8032894700 1 Processor -- start 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.954+0000 7f8032894700 1 -- start start 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.954+0000 7f8032894700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f802c107f20 0x7f802c19baf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.954+0000 7f8032894700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f802c19c030 con 0x7f802c107f20 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.955+0000 7f802bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f802c107f20 0x7f802c19baf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.955+0000 7f802bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f802c107f20 0x7f802c19baf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50350/0 (socket says 192.168.123.100:50350) 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.955+0000 7f802bfff700 1 -- 
192.168.123.100:0/3817293451 learned_addr learned my addr 192.168.123.100:0/3817293451 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.955+0000 7f802bfff700 1 -- 192.168.123.100:0/3817293451 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f801c009740 con 0x7f802c107f20 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.955+0000 7f802bfff700 1 --2- 192.168.123.100:0/3817293451 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f802c107f20 0x7f802c19baf0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f801c0117c0 tx=0x7f801c0118a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.955+0000 7f8029ffb700 1 -- 192.168.123.100:0/3817293451 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f801c011b40 con 0x7f802c107f20 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.956+0000 7f8029ffb700 1 -- 192.168.123.100:0/3817293451 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f801c011ca0 con 0x7f802c107f20 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.956+0000 7f8029ffb700 1 -- 192.168.123.100:0/3817293451 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f801c0194b0 con 0x7f802c107f20 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.956+0000 7f8032894700 1 -- 192.168.123.100:0/3817293451 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f802c19c230 
con 0x7f802c107f20 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.956+0000 7f8032894700 1 -- 192.168.123.100:0/3817293451 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f802c19c610 con 0x7f802c107f20 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.957+0000 7f8029ffb700 1 -- 192.168.123.100:0/3817293451 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f801c019910 con 0x7f802c107f20 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.957+0000 7f8029ffb700 1 -- 192.168.123.100:0/3817293451 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f801c01e070 con 0x7f802c107f20 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.957+0000 7f8032894700 1 -- 192.168.123.100:0/3817293451 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f802c04f9e0 con 0x7f802c107f20 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.958+0000 7f8029ffb700 1 -- 192.168.123.100:0/3817293451 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f801c011e10 con 0x7f802c107f20 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.996+0000 7f8032894700 1 -- 192.168.123.100:0/3817293451 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command([{prefix=config set, name=public_network}] v 0) v1 -- 0x7f802c062380 con 0x7f802c107f20 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.998+0000 
7f8029ffb700 1 -- 192.168.123.100:0/3817293451 <== mon.0 v2:192.168.123.100:3300/0 7 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f801c01aac0 con 0x7f802c107f20 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:18.998+0000 7f8029ffb700 1 -- 192.168.123.100:0/3817293451 <== mon.0 v2:192.168.123.100:3300/0 8 ==== mon_command_ack([{prefix=config set, name=public_network}]=0 v3) v1 ==== 130+0+0 (secure 0 0 0) 0x7f801c02c410 con 0x7f802c107f20 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.000+0000 7f8032894700 1 -- 192.168.123.100:0/3817293451 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f802c107f20 msgr2=0x7f802c19baf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.000+0000 7f8032894700 1 --2- 192.168.123.100:0/3817293451 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f802c107f20 0x7f802c19baf0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f801c0117c0 tx=0x7f801c0118a0 comp rx=0 tx=0).stop 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.000+0000 7f8032894700 1 -- 192.168.123.100:0/3817293451 shutdown_connections 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.000+0000 7f8032894700 1 --2- 192.168.123.100:0/3817293451 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f802c107f20 0x7f802c19baf0 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.000+0000 7f8032894700 1 -- 192.168.123.100:0/3817293451 >> 192.168.123.100:0/3817293451 conn(0x7f802c07b4b0 msgr2=0x7f802c1056a0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.000+0000 7f8032894700 1 -- 192.168.123.100:0/3817293451 shutdown_connections 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.000+0000 7f8032894700 1 -- 192.168.123.100:0/3817293451 wait complete. 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:Wrote config to /etc/ceph/ceph.conf 2026-03-10T12:32:19.055 INFO:teuthology.orchestra.run.vm00.stdout:Wrote keyring to /etc/ceph/ceph.client.admin.keyring 2026-03-10T12:32:19.056 INFO:teuthology.orchestra.run.vm00.stdout:Creating mgr... 2026-03-10T12:32:19.056 INFO:teuthology.orchestra.run.vm00.stdout:Verifying port 0.0.0.0:9283 ... 2026-03-10T12:32:19.057 INFO:teuthology.orchestra.run.vm00.stdout:Verifying port 0.0.0.0:8765 ... 2026-03-10T12:32:19.057 INFO:teuthology.orchestra.run.vm00.stdout:Verifying port 0.0.0.0:8443 ... 2026-03-10T12:32:19.210 INFO:teuthology.orchestra.run.vm00.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mgr.vm00.nescmq 2026-03-10T12:32:19.211 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stderr Failed to reset failed state of unit ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mgr.vm00.nescmq.service: Unit ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mgr.vm00.nescmq.service not loaded. 2026-03-10T12:32:19.345 INFO:teuthology.orchestra.run.vm00.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8.target.wants/ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mgr.vm00.nescmq.service → /etc/systemd/system/ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@.service. 2026-03-10T12:32:19.525 INFO:teuthology.orchestra.run.vm00.stdout:firewalld does not appear to be present 2026-03-10T12:32:19.525 INFO:teuthology.orchestra.run.vm00.stdout:Not possible to enable service . 
firewalld.service is not available 2026-03-10T12:32:19.525 INFO:teuthology.orchestra.run.vm00.stdout:firewalld does not appear to be present 2026-03-10T12:32:19.525 INFO:teuthology.orchestra.run.vm00.stdout:Not possible to open ports <[9283, 8765, 8443]>. firewalld.service is not available 2026-03-10T12:32:19.525 INFO:teuthology.orchestra.run.vm00.stdout:Waiting for mgr to start... 2026-03-10T12:32:19.525 INFO:teuthology.orchestra.run.vm00.stdout:Waiting for mgr... 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout { 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "fsid": "1a52002a-1c7d-11f1-af82-51cdd81caea8", 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "health": { 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout 0 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout ], 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "vm00" 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout ], 2026-03-10T12:32:19.783 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "quorum_age": 0, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-10T12:32:19.783 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "restful" 2026-03-10T12:32:19.783 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout ], 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "services": {} 
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }, 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T12:32:17.778783+0000", 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }, 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout } 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.695+0000 7fc52935e700 1 Processor -- start 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.695+0000 7fc52935e700 1 -- start start 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.695+0000 7fc52935e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc524071200 0x7fc524071610 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.695+0000 7fc52935e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc5240728b0 con 0x7fc524071200 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.697+0000 7fc522ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc524071200 0x7fc524071610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.697+0000 7fc522ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc524071200 0x7fc524071610 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50388/0 (socket says 192.168.123.100:50388) 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.697+0000 7fc522ffd700 1 -- 192.168.123.100:0/1478108074 learned_addr learned my addr 192.168.123.100:0/1478108074 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.698+0000 7fc522ffd700 1 -- 192.168.123.100:0/1478108074 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc5240729f0 con 0x7fc524071200 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.698+0000 7fc522ffd700 1 --2- 192.168.123.100:0/1478108074 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc524071200 0x7fc524071610 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fc514009a90 tx=0x7fc514009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=2041d8b804ef7193 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.699+0000 7fc521ffb700 1 -- 192.168.123.100:0/1478108074 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc514004030 con 0x7fc524071200 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.699+0000 7fc521ffb700 1 -- 192.168.123.100:0/1478108074 <== mon.0 v2:192.168.123.100:3300/0 2 ==== 
config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fc51400b7e0 con 0x7fc524071200 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.699+0000 7fc521ffb700 1 -- 192.168.123.100:0/1478108074 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc5140039f0 con 0x7fc524071200 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.699+0000 7fc52935e700 1 -- 192.168.123.100:0/1478108074 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc524071200 msgr2=0x7fc524071610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.699+0000 7fc52935e700 1 --2- 192.168.123.100:0/1478108074 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc524071200 0x7fc524071610 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fc514009a90 tx=0x7fc514009da0 comp rx=0 tx=0).stop 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.700+0000 7fc52935e700 1 -- 192.168.123.100:0/1478108074 shutdown_connections 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.700+0000 7fc52935e700 1 --2- 192.168.123.100:0/1478108074 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc524071200 0x7fc524071610 secure :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fc514009a90 tx=0x7fc514009da0 comp rx=0 tx=0).stop 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.700+0000 7fc52935e700 1 -- 192.168.123.100:0/1478108074 >> 192.168.123.100:0/1478108074 conn(0x7fc52406cc30 msgr2=0x7fc52406f060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.701+0000 
7fc52935e700 1 -- 192.168.123.100:0/1478108074 shutdown_connections
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.701+0000 7fc52935e700 1 -- 192.168.123.100:0/1478108074 wait complete.
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.701+0000 7fc52935e700 1 Processor -- start
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.701+0000 7fc52935e700 1 -- start start
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.701+0000 7fc52935e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5241a89f0 0x7fc5241a8e00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.701+0000 7fc52935e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc5241a9340 con 0x7fc5241a89f0
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.702+0000 7fc522ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5241a89f0 0x7fc5241a8e00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.702+0000 7fc522ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5241a89f0 0x7fc5241a8e00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50396/0 (socket says 192.168.123.100:50396)
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.702+0000 7fc522ffd700 1 -- 192.168.123.100:0/4161386942 learned_addr learned my addr 192.168.123.100:0/4161386942 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.702+0000 7fc522ffd700 1 -- 192.168.123.100:0/4161386942 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc514009740 con 0x7fc5241a89f0
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.703+0000 7fc522ffd700 1 --2- 192.168.123.100:0/4161386942 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5241a89f0 0x7fc5241a8e00 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fc5240725e0 tx=0x7fc514003b40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.704+0000 7fc50bfff700 1 -- 192.168.123.100:0/4161386942 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc514003fa0 con 0x7fc5241a89f0
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.704+0000 7fc52935e700 1 -- 192.168.123.100:0/4161386942 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc5241a9540 con 0x7fc5241a89f0
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.704+0000 7fc52935e700 1 -- 192.168.123.100:0/4161386942 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc52407b250 con 0x7fc5241a89f0
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.704+0000 7fc50bfff700 1 -- 192.168.123.100:0/4161386942 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fc5140045a0 con 0x7fc5241a89f0
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.704+0000 7fc50bfff700 1 -- 192.168.123.100:0/4161386942 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc51401b440 con 0x7fc5241a89f0
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.705+0000 7fc52935e700 1 -- 192.168.123.100:0/4161386942 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc510005320 con 0x7fc5241a89f0
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.705+0000 7fc50bfff700 1 -- 192.168.123.100:0/4161386942 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fc514004100 con 0x7fc5241a89f0
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.706+0000 7fc50bfff700 1 -- 192.168.123.100:0/4161386942 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fc51401b910 con 0x7fc5241a89f0
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.707+0000 7fc50bfff700 1 -- 192.168.123.100:0/4161386942 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7fc51401f070 con 0x7fc5241a89f0
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.747+0000 7fc52935e700 1 -- 192.168.123.100:0/4161386942 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7fc5100059f0 con 0x7fc5241a89f0
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.747+0000 7fc50bfff700 1 -- 192.168.123.100:0/4161386942 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7fc514024810 con 0x7fc5241a89f0
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.749+0000 7fc509ffb700 1 -- 192.168.123.100:0/4161386942 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5241a89f0 msgr2=0x7fc5241a8e00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.749+0000 7fc509ffb700 1 --2- 192.168.123.100:0/4161386942 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5241a89f0 0x7fc5241a8e00 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fc5240725e0 tx=0x7fc514003b40 comp rx=0 tx=0).stop
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.750+0000 7fc509ffb700 1 -- 192.168.123.100:0/4161386942 shutdown_connections
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.750+0000 7fc509ffb700 1 --2- 192.168.123.100:0/4161386942 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5241a89f0 0x7fc5241a8e00 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.750+0000 7fc509ffb700 1 -- 192.168.123.100:0/4161386942 >> 192.168.123.100:0/4161386942 conn(0x7fc52406cc30 msgr2=0x7fc524113cb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.750+0000 7fc509ffb700 1 -- 192.168.123.100:0/4161386942 shutdown_connections
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:19.750+0000 7fc509ffb700 1 -- 192.168.123.100:0/4161386942 wait complete.
2026-03-10T12:32:19.784 INFO:teuthology.orchestra.run.vm00.stdout:mgr not available, waiting (1/15)...
2026-03-10T12:32:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:20 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/3817293451' entity='client.admin'
2026-03-10T12:32:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:20 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/4161386942' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
2026-03-10T12:32:22.021 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout
2026-03-10T12:32:22.021 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout {
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "fsid": "1a52002a-1c7d-11f1-af82-51cdd81caea8",
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "health": {
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK",
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "checks": {},
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "mutes": []
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout },
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "election_epoch": 5,
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "quorum": [
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout 0
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout ],
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "quorum_names": [
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "vm00"
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout ],
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "quorum_age": 3,
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "monmap": {
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef",
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_mons": 1
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout },
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "osdmap": {
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_osds": 0,
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_up_osds": 0,
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "osd_up_since": 0,
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_in_osds": 0,
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "osd_in_since": 0,
2026-03-10T12:32:22.022 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0
2026-03-10T12:32:22.024 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout },
2026-03-10T12:32:22.024 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "pgmap": {
2026-03-10T12:32:22.024 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "pgs_by_state": [],
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_pgs": 0,
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_pools": 0,
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_objects": 0,
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "data_bytes": 0,
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "bytes_used": 0,
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "bytes_avail": 0,
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "bytes_total": 0
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout },
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "fsmap": {
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "by_rank": [],
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "up:standby": 0
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout },
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "mgrmap": {
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "available": false,
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_standbys": 0,
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "modules": [
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "iostat",
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "nfs",
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "restful"
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout ],
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "services": {}
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout },
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "servicemap": {
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T12:32:17.778783+0000",
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "services": {}
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout },
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "progress_events": {}
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.934+0000 7f5fd6fc3700 1 Processor -- start
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.934+0000 7f5fd6fc3700 1 -- start start
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.934+0000 7f5fd6fc3700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5fd0072a40 0x7f5fd0071060 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.934+0000 7f5fd6fc3700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5fd00715a0 con 0x7f5fd0072a40
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.935+0000 7f5fd5fc1700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5fd0072a40 0x7f5fd0071060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.935+0000 7f5fd5fc1700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5fd0072a40 0x7f5fd0071060 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50398/0 (socket says 192.168.123.100:50398)
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.935+0000 7f5fd5fc1700 1 -- 192.168.123.100:0/3560003005 learned_addr learned my addr 192.168.123.100:0/3560003005 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.936+0000 7f5fd5fc1700 1 -- 192.168.123.100:0/3560003005 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5fd00716e0 con 0x7f5fd0072a40
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.936+0000 7f5fd5fc1700 1 --2- 192.168.123.100:0/3560003005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5fd0072a40 0x7f5fd0071060 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f5fcc009a90 tx=0x7f5fcc009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=86b65715bf478270 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.936+0000 7f5fd4fbf700 1 -- 192.168.123.100:0/3560003005 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5fcc004030 con 0x7f5fd0072a40
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.936+0000 7f5fd4fbf700 1 -- 192.168.123.100:0/3560003005 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f5fcc00b7e0 con 0x7f5fd0072a40
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.936+0000 7f5fd6fc3700 1 -- 192.168.123.100:0/3560003005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5fd0072a40 msgr2=0x7f5fd0071060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.937+0000 7f5fd4fbf700 1 -- 192.168.123.100:0/3560003005 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5fcc004030 con 0x7f5fd0072a40
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.937+0000 7f5fd6fc3700 1 --2- 192.168.123.100:0/3560003005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5fd0072a40 0x7f5fd0071060 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f5fcc009a90 tx=0x7f5fcc009da0 comp rx=0 tx=0).stop
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.937+0000 7f5fd6fc3700 1 -- 192.168.123.100:0/3560003005 shutdown_connections
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.937+0000 7f5fd6fc3700 1 --2- 192.168.123.100:0/3560003005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5fd0072a40 0x7f5fd0071060 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.937+0000 7f5fd6fc3700 1 -- 192.168.123.100:0/3560003005 >> 192.168.123.100:0/3560003005 conn(0x7f5fd006c9d0 msgr2=0x7f5fd006ee00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.937+0000 7f5fd6fc3700 1 -- 192.168.123.100:0/3560003005 shutdown_connections
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.937+0000 7f5fd6fc3700 1 -- 192.168.123.100:0/3560003005 wait complete.
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.938+0000 7f5fd6fc3700 1 Processor -- start
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.938+0000 7f5fd6fc3700 1 -- start start
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.938+0000 7f5fd6fc3700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5fd0072a40 0x7f5fd01a87a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.938+0000 7f5fd6fc3700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5fd01a8ce0 con 0x7f5fd0072a40
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.938+0000 7f5fd5fc1700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5fd0072a40 0x7f5fd01a87a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.939+0000 7f5fd5fc1700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5fd0072a40 0x7f5fd01a87a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50402/0 (socket says 192.168.123.100:50402)
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.939+0000 7f5fd5fc1700 1 -- 192.168.123.100:0/685206696 learned_addr learned my addr 192.168.123.100:0/685206696 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.939+0000 7f5fd5fc1700 1 -- 192.168.123.100:0/685206696 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5fcc009740 con 0x7f5fd0072a40
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.939+0000 7f5fd5fc1700 1 --2- 192.168.123.100:0/685206696 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5fd0072a40 0x7f5fd01a87a0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f5fcc0036e0 tx=0x7f5fcc00bf30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.939+0000 7f5fc6ffd700 1 -- 192.168.123.100:0/685206696 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5fcc003fe0 con 0x7f5fd0072a40
2026-03-10T12:32:22.025 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.940+0000 7f5fc6ffd700 1 -- 192.168.123.100:0/685206696 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f5fcc01a460 con 0x7f5fd0072a40
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.940+0000 7f5fd6fc3700 1 -- 192.168.123.100:0/685206696 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5fd01a8ee0 con 0x7f5fd0072a40
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.940+0000 7f5fd6fc3700 1 -- 192.168.123.100:0/685206696 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5fd01a9380 con 0x7f5fd0072a40
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.941+0000 7f5fc6ffd700 1 -- 192.168.123.100:0/685206696 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5fcc004140 con 0x7f5fd0072a40
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.941+0000 7f5fc6ffd700 1 -- 192.168.123.100:0/685206696 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f5fcc004330 con 0x7f5fd0072a40
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.941+0000 7f5fc6ffd700 1 -- 192.168.123.100:0/685206696 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f5fcc028030 con 0x7f5fd0072a40
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.941+0000 7f5fd6fc3700 1 -- 192.168.123.100:0/685206696 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5fd0062380 con 0x7f5fd0072a40
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.944+0000 7f5fc6ffd700 1 -- 192.168.123.100:0/685206696 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7f5fcc01e030 con 0x7f5fd0072a40
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.984+0000 7f5fd6fc3700 1 -- 192.168.123.100:0/685206696 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f5fd01abe30 con 0x7f5fd0072a40
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.985+0000 7f5fc6ffd700 1 -- 192.168.123.100:0/685206696 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7f5fcc01e030 con 0x7f5fd0072a40
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.986+0000 7f5fc4ff9700 1 -- 192.168.123.100:0/685206696 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5fd0072a40 msgr2=0x7f5fd01a87a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.986+0000 7f5fc4ff9700 1 --2- 192.168.123.100:0/685206696 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5fd0072a40 0x7f5fd01a87a0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f5fcc0036e0 tx=0x7f5fcc00bf30 comp rx=0 tx=0).stop
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.988+0000 7f5fc4ff9700 1 -- 192.168.123.100:0/685206696 shutdown_connections
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.988+0000 7f5fc4ff9700 1 --2- 192.168.123.100:0/685206696 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5fd0072a40 0x7f5fd01a87a0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.988+0000 7f5fc4ff9700 1 -- 192.168.123.100:0/685206696 >> 192.168.123.100:0/685206696 conn(0x7f5fd006c9d0 msgr2=0x7f5fd006e080 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.988+0000 7f5fc4ff9700 1 -- 192.168.123.100:0/685206696 shutdown_connections
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:21.988+0000 7f5fc4ff9700 1 -- 192.168.123.100:0/685206696 wait complete.
2026-03-10T12:32:22.026 INFO:teuthology.orchestra.run.vm00.stdout:mgr not available, waiting (2/15)...
2026-03-10T12:32:22.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:22 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/685206696' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout {
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "fsid": "1a52002a-1c7d-11f1-af82-51cdd81caea8",
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "health": {
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK",
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "checks": {},
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "mutes": []
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout },
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "election_epoch": 5,
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "quorum": [
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout 0
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout ],
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "quorum_names": [
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "vm00"
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout ],
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "quorum_age": 5,
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "monmap": {
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef",
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_mons": 1
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout },
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "osdmap": {
2026-03-10T12:32:24.271 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_osds": 0,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_up_osds": 0,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "osd_up_since": 0,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_in_osds": 0,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "osd_in_since": 0,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout },
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "pgmap": {
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "pgs_by_state": [],
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_pgs": 0,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_pools": 0,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_objects": 0,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "data_bytes": 0,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "bytes_used": 0,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "bytes_avail": 0,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "bytes_total": 0
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout },
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "fsmap": {
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "by_rank": [],
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "up:standby": 0
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout },
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "mgrmap": {
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "available": false,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_standbys": 0,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "modules": [
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "iostat",
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "nfs",
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "restful"
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout ],
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "services": {}
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout },
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "servicemap": {
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T12:32:17.778783+0000",
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "services": {}
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout },
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "progress_events": {}
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.159+0000 7fb9c9b2b700 1 Processor -- start
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.159+0000 7fb9c9b2b700 1 -- start start
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.159+0000 7fb9c9b2b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9bc0a4350 0x7fb9bc0a4760 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.159+0000 7fb9c9b2b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb9bc0a4d30 con 0x7fb9bc0a4350
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.159+0000 7fb9c8b29700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9bc0a4350 0x7fb9bc0a4760 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.159+0000 7fb9c8b29700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9bc0a4350 0x7fb9bc0a4760 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50408/0 (socket says 192.168.123.100:50408)
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.159+0000 7fb9c8b29700 1 -- 192.168.123.100:0/4291067920 learned_addr learned my addr 192.168.123.100:0/4291067920 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.160+0000 7fb9c8b29700 1 -- 192.168.123.100:0/4291067920 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb9bc0a5550 con 0x7fb9bc0a4350
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.160+0000 7fb9c8b29700 1 --2- 192.168.123.100:0/4291067920 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9bc0a4350 0x7fb9bc0a4760 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fb9b8009cf0 tx=0x7fb9b800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=c66d0d79296e2c00 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.160+0000 7fb9c37fe700 1 -- 192.168.123.100:0/4291067920 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb9b8004030 con 0x7fb9bc0a4350
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.160+0000 7fb9c37fe700 1 -- 192.168.123.100:0/4291067920 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fb9b800b810 con 0x7fb9bc0a4350
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.161+0000 7fb9c9b2b700 1 -- 192.168.123.100:0/4291067920 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9bc0a4350 msgr2=0x7fb9bc0a4760 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.161+0000 7fb9c9b2b700 1 --2- 192.168.123.100:0/4291067920 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9bc0a4350 0x7fb9bc0a4760 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fb9b8009cf0 tx=0x7fb9b800b0e0 comp rx=0 tx=0).stop
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.161+0000 7fb9c9b2b700 1 -- 192.168.123.100:0/4291067920 shutdown_connections
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.161+0000 7fb9c9b2b700 1 --2- 192.168.123.100:0/4291067920 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9bc0a4350 0x7fb9bc0a4760 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.161+0000 7fb9c9b2b700 1 -- 192.168.123.100:0/4291067920 >> 192.168.123.100:0/4291067920 conn(0x7fb9bc09f490 msgr2=0x7fb9bc0a18e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.161+0000 7fb9c9b2b700 1 -- 192.168.123.100:0/4291067920 shutdown_connections
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.161+0000 7fb9c9b2b700 1 -- 192.168.123.100:0/4291067920 wait complete.
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.161+0000 7fb9c9b2b700 1 Processor -- start
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.162+0000 7fb9c9b2b700 1 -- start start
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.162+0000 7fb9c9b2b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9bc137db0 0x7fb9bc1381c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.162+0000 7fb9c9b2b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb9bc0a4d30 con 0x7fb9bc137db0
2026-03-10T12:32:24.272 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.162+0000 7fb9c8b29700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9bc137db0 0x7fb9bc1381c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.162+0000 7fb9c8b29700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9bc137db0 0x7fb9bc1381c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50412/0 (socket says 192.168.123.100:50412)
2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.162+0000 7fb9c8b29700 1 -- 192.168.123.100:0/4065565177 learned_addr
learned my addr 192.168.123.100:0/4065565177 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.162+0000 7fb9c8b29700 1 -- 192.168.123.100:0/4065565177 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb9b8009740 con 0x7fb9bc137db0 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.163+0000 7fb9c8b29700 1 --2- 192.168.123.100:0/4065565177 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9bc137db0 0x7fb9bc1381c0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fb9b800bde0 tx=0x7fb9b800bec0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.163+0000 7fb9c1ffb700 1 -- 192.168.123.100:0/4065565177 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb9b8003f10 con 0x7fb9bc137db0 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.163+0000 7fb9c9b2b700 1 -- 192.168.123.100:0/4065565177 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb9bc138700 con 0x7fb9bc137db0 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.163+0000 7fb9c9b2b700 1 -- 192.168.123.100:0/4065565177 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb9bc13b390 con 0x7fb9bc137db0 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.164+0000 7fb9c1ffb700 1 -- 192.168.123.100:0/4065565177 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fb9b8004510 con 0x7fb9bc137db0 
2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.164+0000 7fb9c1ffb700 1 -- 192.168.123.100:0/4065565177 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb9b801ace0 con 0x7fb9bc137db0 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.164+0000 7fb9c1ffb700 1 -- 192.168.123.100:0/4065565177 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fb9b802c730 con 0x7fb9bc137db0 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.164+0000 7fb9c1ffb700 1 -- 192.168.123.100:0/4065565177 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fb9b8011ab0 con 0x7fb9bc137db0 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.166+0000 7fb9c9b2b700 1 -- 192.168.123.100:0/4065565177 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb9a8005320 con 0x7fb9bc137db0 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.167+0000 7fb9c1ffb700 1 -- 192.168.123.100:0/4065565177 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72422 (secure 0 0 0) 0x7fb9b8006ca0 con 0x7fb9bc137db0 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.205+0000 7fb9c9b2b700 1 -- 192.168.123.100:0/4065565177 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7fb9a8005190 con 0x7fb9bc137db0 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.206+0000 7fb9c1ffb700 1 -- 
192.168.123.100:0/4065565177 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7fb9b801a5f0 con 0x7fb9bc137db0 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.211+0000 7fb9af7fe700 1 -- 192.168.123.100:0/4065565177 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9bc137db0 msgr2=0x7fb9bc1381c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.211+0000 7fb9af7fe700 1 --2- 192.168.123.100:0/4065565177 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9bc137db0 0x7fb9bc1381c0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fb9b800bde0 tx=0x7fb9b800bec0 comp rx=0 tx=0).stop 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.211+0000 7fb9af7fe700 1 -- 192.168.123.100:0/4065565177 shutdown_connections 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.211+0000 7fb9af7fe700 1 --2- 192.168.123.100:0/4065565177 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9bc137db0 0x7fb9bc1381c0 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.211+0000 7fb9af7fe700 1 -- 192.168.123.100:0/4065565177 >> 192.168.123.100:0/4065565177 conn(0x7fb9bc09f490 msgr2=0x7fb9bc0a9b80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.211+0000 7fb9af7fe700 1 -- 192.168.123.100:0/4065565177 shutdown_connections 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:24.211+0000 
7fb9af7fe700 1 -- 192.168.123.100:0/4065565177 wait complete. 2026-03-10T12:32:24.273 INFO:teuthology.orchestra.run.vm00.stdout:mgr not available, waiting (3/15)... 2026-03-10T12:32:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:24 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/4065565177' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-10T12:32:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:24 vm00 ceph-mon[50686]: Activating manager daemon vm00.nescmq 2026-03-10T12:32:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:24 vm00 ceph-mon[50686]: mgrmap e2: vm00.nescmq(active, starting, since 0.00453313s) 2026-03-10T12:32:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:24 vm00 ceph-mon[50686]: from='mgr.14100 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T12:32:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:24 vm00 ceph-mon[50686]: from='mgr.14100 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T12:32:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:24 vm00 ceph-mon[50686]: from='mgr.14100 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T12:32:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:24 vm00 ceph-mon[50686]: from='mgr.14100 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:32:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:24 vm00 ceph-mon[50686]: from='mgr.14100 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr metadata", "who": "vm00.nescmq", "id": "vm00.nescmq"}]: dispatch 2026-03-10T12:32:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:24 vm00 ceph-mon[50686]: Manager daemon vm00.nescmq is now available 2026-03-10T12:32:25.234 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:24 vm00 ceph-mon[50686]: from='mgr.14100 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/mirror_snapshot_schedule"}]: dispatch 2026-03-10T12:32:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:24 vm00 ceph-mon[50686]: from='mgr.14100 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/trash_purge_schedule"}]: dispatch 2026-03-10T12:32:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:24 vm00 ceph-mon[50686]: from='mgr.14100 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:24 vm00 ceph-mon[50686]: from='mgr.14100 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:24 vm00 ceph-mon[50686]: from='mgr.14100 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:26.569 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:26 vm00 ceph-mon[50686]: mgrmap e3: vm00.nescmq(active, since 1.00974s) 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout { 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "fsid": "1a52002a-1c7d-11f1-af82-51cdd81caea8", 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "health": { 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }, 
2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout 0 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout ], 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "vm00" 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout ], 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "quorum_age": 7, 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T12:32:26.626 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-10T12:32:26.627 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout 
"available": true, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "restful" 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout ], 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T12:32:17.778783+0000", 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout }, 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout } 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.399+0000 7faba522f700 1 Processor -- start 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.400+0000 7faba522f700 1 -- start start 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.400+0000 7faba522f700 1 --2- >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faba0104620 0x7faba0106a40 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.400+0000 7faba522f700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faba00745b0 con 0x7faba0104620 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.400+0000 7fab9ed9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faba0104620 0x7faba0106a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.400+0000 7fab9ed9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faba0104620 0x7faba0106a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50482/0 (socket says 192.168.123.100:50482) 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.400+0000 7fab9ed9d700 1 -- 192.168.123.100:0/114923493 learned_addr learned my addr 192.168.123.100:0/114923493 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.401+0000 7fab9ed9d700 1 -- 192.168.123.100:0/114923493 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faba00746f0 con 0x7faba0104620 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.401+0000 7fab9ed9d700 1 --2- 192.168.123.100:0/114923493 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faba0104620 0x7faba0106a40 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fab88009cf0 tx=0x7fab8800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=4aa091f4bd1ba152 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.401+0000 7fab9dd9b700 1 -- 192.168.123.100:0/114923493 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fab88004030 con 0x7faba0104620 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.401+0000 7fab9dd9b700 1 -- 192.168.123.100:0/114923493 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fab8800b810 con 0x7faba0104620 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.401+0000 7faba522f700 1 -- 192.168.123.100:0/114923493 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faba0104620 msgr2=0x7faba0106a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.401+0000 7faba522f700 1 --2- 192.168.123.100:0/114923493 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faba0104620 0x7faba0106a40 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fab88009cf0 tx=0x7fab8800b0e0 comp rx=0 tx=0).stop 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.402+0000 7faba522f700 1 -- 192.168.123.100:0/114923493 shutdown_connections 2026-03-10T12:32:26.627 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.402+0000 7faba522f700 1 --2- 192.168.123.100:0/114923493 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faba0104620 0x7faba0106a40 unknown :-1 s=CLOSED 
pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.402+0000 7faba522f700 1 -- 192.168.123.100:0/114923493 >> 192.168.123.100:0/114923493 conn(0x7faba0100270 msgr2=0x7faba01026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.402+0000 7faba522f700 1 -- 192.168.123.100:0/114923493 shutdown_connections 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.402+0000 7faba522f700 1 -- 192.168.123.100:0/114923493 wait complete. 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.402+0000 7faba522f700 1 Processor -- start 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.402+0000 7faba522f700 1 -- start start 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.403+0000 7faba522f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faba01a01c0 0x7faba01a05d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.403+0000 7faba522f700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faba00745b0 con 0x7faba01a01c0 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.403+0000 7fab9ed9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faba01a01c0 0x7faba01a05d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:26.628 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.403+0000 7fab9ed9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faba01a01c0 0x7faba01a05d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50496/0 (socket says 192.168.123.100:50496) 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.403+0000 7fab9ed9d700 1 -- 192.168.123.100:0/3613706768 learned_addr learned my addr 192.168.123.100:0/3613706768 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.403+0000 7fab9ed9d700 1 -- 192.168.123.100:0/3613706768 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fab88009740 con 0x7faba01a01c0 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.404+0000 7fab9ed9d700 1 --2- 192.168.123.100:0/3613706768 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faba01a01c0 0x7faba01a05d0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fab88009cc0 tx=0x7fab88003cb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.404+0000 7fab97fff700 1 -- 192.168.123.100:0/3613706768 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fab88003ed0 con 0x7faba01a01c0 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.404+0000 7fab97fff700 1 -- 192.168.123.100:0/3613706768 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fab880044d0 con 0x7faba01a01c0 
2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.404+0000 7fab97fff700 1 -- 192.168.123.100:0/3613706768 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fab8801ac60 con 0x7faba01a01c0 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.404+0000 7faba522f700 1 -- 192.168.123.100:0/3613706768 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faba01a0b10 con 0x7faba01a01c0 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.404+0000 7faba522f700 1 -- 192.168.123.100:0/3613706768 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faba01a37a0 con 0x7faba01a01c0 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.405+0000 7fab97fff700 1 -- 192.168.123.100:0/3613706768 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 4) v1 ==== 45068+0+0 (secure 0 0 0) 0x7fab88004030 con 0x7faba01a01c0 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.405+0000 7faba522f700 1 -- 192.168.123.100:0/3613706768 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faba004fa50 con 0x7faba01a01c0 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.405+0000 7fab97fff700 1 --2- 192.168.123.100:0/3613706768 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fab8c038410 0x7fab8c03a8c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.405+0000 7fab97fff700 1 -- 192.168.123.100:0/3613706768 <== mon.0 
v2:192.168.123.100:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fab8804b230 con 0x7faba01a01c0 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.406+0000 7fab9e59c700 1 --2- 192.168.123.100:0/3613706768 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fab8c038410 0x7fab8c03a8c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.408+0000 7fab9e59c700 1 --2- 192.168.123.100:0/3613706768 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fab8c038410 0x7fab8c03a8c0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fab90006fd0 tx=0x7fab90006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.409+0000 7fab97fff700 1 -- 192.168.123.100:0/3613706768 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fab8801adc0 con 0x7faba01a01c0 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.568+0000 7faba522f700 1 -- 192.168.123.100:0/3613706768 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7faba0199900 con 0x7faba01a01c0 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.571+0000 7fab97fff700 1 -- 192.168.123.100:0/3613706768 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1240 (secure 0 0 0) 0x7fab8801f020 con 0x7faba01a01c0 2026-03-10T12:32:26.628 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.574+0000 7faba522f700 1 -- 192.168.123.100:0/3613706768 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fab8c038410 msgr2=0x7fab8c03a8c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.574+0000 7faba522f700 1 --2- 192.168.123.100:0/3613706768 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fab8c038410 0x7fab8c03a8c0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fab90006fd0 tx=0x7fab90006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.574+0000 7faba522f700 1 -- 192.168.123.100:0/3613706768 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faba01a01c0 msgr2=0x7faba01a05d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.574+0000 7faba522f700 1 --2- 192.168.123.100:0/3613706768 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faba01a01c0 0x7faba01a05d0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fab88009cc0 tx=0x7fab88003cb0 comp rx=0 tx=0).stop 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.575+0000 7faba522f700 1 -- 192.168.123.100:0/3613706768 shutdown_connections 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.575+0000 7faba522f700 1 --2- 192.168.123.100:0/3613706768 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fab8c038410 0x7fab8c03a8c0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.575+0000 7faba522f700 1 
--2- 192.168.123.100:0/3613706768 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faba01a01c0 0x7faba01a05d0 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.575+0000 7faba522f700 1 -- 192.168.123.100:0/3613706768 >> 192.168.123.100:0/3613706768 conn(0x7faba0100270 msgr2=0x7faba0106840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.575+0000 7faba522f700 1 -- 192.168.123.100:0/3613706768 shutdown_connections 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.575+0000 7faba522f700 1 -- 192.168.123.100:0/3613706768 wait complete. 2026-03-10T12:32:26.628 INFO:teuthology.orchestra.run.vm00.stdout:mgr is available 2026-03-10T12:32:26.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout 2026-03-10T12:32:26.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout [global] 2026-03-10T12:32:26.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout fsid = 1a52002a-1c7d-11f1-af82-51cdd81caea8 2026-03-10T12:32:26.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true 2026-03-10T12:32:26.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true 2026-03-10T12:32:26.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false 2026-03-10T12:32:26.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0 2026-03-10T12:32:26.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout 2026-03-10T12:32:26.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout [mgr] 2026-03-10T12:32:26.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: 
stdout mgr/telemetry/nag = false 2026-03-10T12:32:26.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout [osd] 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.759+0000 7f77ece49700 1 Processor -- start 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.760+0000 7f77ece49700 1 -- start start 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.760+0000 7f77ece49700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77e8106830 0x7f77e8108c10 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.760+0000 7f77ece49700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f77e80745b0 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.760+0000 7f77e659c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77e8106830 0x7f77e8108c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.760+0000 7f77e659c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77e8106830 0x7f77e8108c10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50508/0 (socket says 192.168.123.100:50508) 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.760+0000 7f77e659c700 1 -- 192.168.123.100:0/1486363238 learned_addr learned my addr 192.168.123.100:0/1486363238 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.761+0000 7f77e659c700 1 -- 192.168.123.100:0/1486363238 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f77e80746f0 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.761+0000 7f77e659c700 1 --2- 192.168.123.100:0/1486363238 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77e8106830 0x7f77e8108c10 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f77d8009a90 tx=0x7f77d8009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=932ca4d06340b847 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.761+0000 7f77e559a700 1 -- 192.168.123.100:0/1486363238 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f77d8004030 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.761+0000 7f77e559a700 1 -- 192.168.123.100:0/1486363238 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 
(secure 0 0 0) 0x7f77d800b7e0 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.762+0000 7f77ece49700 1 -- 192.168.123.100:0/1486363238 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77e8106830 msgr2=0x7f77e8108c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.762+0000 7f77ece49700 1 --2- 192.168.123.100:0/1486363238 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77e8106830 0x7f77e8108c10 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f77d8009a90 tx=0x7f77d8009da0 comp rx=0 tx=0).stop 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.762+0000 7f77ece49700 1 -- 192.168.123.100:0/1486363238 shutdown_connections 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.762+0000 7f77ece49700 1 --2- 192.168.123.100:0/1486363238 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77e8106830 0x7f77e8108c10 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.762+0000 7f77ece49700 1 -- 192.168.123.100:0/1486363238 >> 192.168.123.100:0/1486363238 conn(0x7f77e8100270 msgr2=0x7f77e81026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.763+0000 7f77ece49700 1 -- 192.168.123.100:0/1486363238 shutdown_connections 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.763+0000 7f77ece49700 1 -- 192.168.123.100:0/1486363238 wait complete. 
2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.763+0000 7f77ece49700 1 Processor -- start 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.763+0000 7f77ece49700 1 -- start start 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.764+0000 7f77ece49700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77e8106830 0x7f77e8199e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.764+0000 7f77ece49700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f77e80745b0 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.764+0000 7f77e659c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77e8106830 0x7f77e8199e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.764+0000 7f77e659c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77e8106830 0x7f77e8199e40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50514/0 (socket says 192.168.123.100:50514) 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.764+0000 7f77e659c700 1 -- 192.168.123.100:0/3419304080 learned_addr learned my addr 192.168.123.100:0/3419304080 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:26.916 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.764+0000 7f77e659c700 1 -- 192.168.123.100:0/3419304080 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f77d8009740 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.764+0000 7f77e659c700 1 --2- 192.168.123.100:0/3419304080 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77e8106830 0x7f77e8199e40 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f77d800be00 tx=0x7f77d800bee0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.764+0000 7f77d77fe700 1 -- 192.168.123.100:0/3419304080 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f77d8003f60 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.765+0000 7f77d77fe700 1 -- 192.168.123.100:0/3419304080 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f77d8004560 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.765+0000 7f77d77fe700 1 -- 192.168.123.100:0/3419304080 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f77d8024d30 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.765+0000 7f77ece49700 1 -- 192.168.123.100:0/3419304080 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f77e819a380 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.765+0000 7f77ece49700 1 
-- 192.168.123.100:0/3419304080 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f77e819a820 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.766+0000 7f77d77fe700 1 -- 192.168.123.100:0/3419304080 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 4) v1 ==== 45068+0+0 (secure 0 0 0) 0x7f77d802b030 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.767+0000 7f77d77fe700 1 --2- 192.168.123.100:0/3419304080 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f77d0038400 0x7f77d003a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.767+0000 7f77d77fe700 1 -- 192.168.123.100:0/3419304080 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f77d804bcc0 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.767+0000 7f77ece49700 1 -- 192.168.123.100:0/3419304080 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f77e8193c80 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.767+0000 7f77e5d9b700 1 --2- 192.168.123.100:0/3419304080 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f77d0038400 0x7f77d003a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.770+0000 7f77d77fe700 1 -- 192.168.123.100:0/3419304080 <== mon.0 
v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f77d801f030 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.770+0000 7f77e5d9b700 1 --2- 192.168.123.100:0/3419304080 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f77d0038400 0x7f77d003a8b0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f77dc006fd0 tx=0x7f77dc006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.876+0000 7f77ece49700 1 -- 192.168.123.100:0/3419304080 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7f77e802cce0 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.880+0000 7f77d77fe700 1 -- 192.168.123.100:0/3419304080 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v3) v1 ==== 70+0+373 (secure 0 0 0) 0x7f77d8049020 con 0x7f77e8106830 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.882+0000 7f77ece49700 1 -- 192.168.123.100:0/3419304080 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f77d0038400 msgr2=0x7f77d003a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.882+0000 7f77ece49700 1 --2- 192.168.123.100:0/3419304080 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f77d0038400 0x7f77d003a8b0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f77dc006fd0 tx=0x7f77dc006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:26.916 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.882+0000 7f77ece49700 1 -- 192.168.123.100:0/3419304080 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77e8106830 msgr2=0x7f77e8199e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.882+0000 7f77ece49700 1 --2- 192.168.123.100:0/3419304080 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77e8106830 0x7f77e8199e40 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f77d800be00 tx=0x7f77d800bee0 comp rx=0 tx=0).stop 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.883+0000 7f77ece49700 1 -- 192.168.123.100:0/3419304080 shutdown_connections 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.883+0000 7f77ece49700 1 --2- 192.168.123.100:0/3419304080 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f77d0038400 0x7f77d003a8b0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.883+0000 7f77ece49700 1 --2- 192.168.123.100:0/3419304080 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77e8106830 0x7f77e8199e40 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.883+0000 7f77ece49700 1 -- 192.168.123.100:0/3419304080 >> 192.168.123.100:0/3419304080 conn(0x7f77e8100270 msgr2=0x7f77e81019c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:26.916 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.883+0000 7f77ece49700 1 -- 192.168.123.100:0/3419304080 shutdown_connections 
2026-03-10T12:32:26.917 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:26.883+0000 7f77ece49700 1 -- 192.168.123.100:0/3419304080 wait complete. 2026-03-10T12:32:26.917 INFO:teuthology.orchestra.run.vm00.stdout:Enabling cephadm module... 2026-03-10T12:32:27.341 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.036+0000 7fac95728700 1 Processor -- start 2026-03-10T12:32:27.341 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.036+0000 7fac95728700 1 -- start start 2026-03-10T12:32:27.341 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.037+0000 7fac95728700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fac90106830 0x7fac90108c10 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:27.341 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.037+0000 7fac95728700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac900745b0 con 0x7fac90106830 2026-03-10T12:32:27.341 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.037+0000 7fac8effd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fac90106830 0x7fac90108c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:27.341 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.037+0000 7fac8effd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fac90106830 0x7fac90108c10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50522/0 (socket says 192.168.123.100:50522) 2026-03-10T12:32:27.341 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.037+0000 7fac8effd700 1 -- 192.168.123.100:0/1264571586 learned_addr learned my addr 192.168.123.100:0/1264571586 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:27.341 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.038+0000 7fac8effd700 1 -- 192.168.123.100:0/1264571586 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fac900746f0 con 0x7fac90106830 2026-03-10T12:32:27.341 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.038+0000 7fac8effd700 1 --2- 192.168.123.100:0/1264571586 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fac90106830 0x7fac90108c10 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fac78009cf0 tx=0x7fac7800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=85f8851be5f5e169 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:27.341 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.038+0000 7fac8dffb700 1 -- 192.168.123.100:0/1264571586 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fac78004030 con 0x7fac90106830 2026-03-10T12:32:27.341 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.038+0000 7fac8dffb700 1 -- 192.168.123.100:0/1264571586 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fac7800b810 con 0x7fac90106830 2026-03-10T12:32:27.341 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.038+0000 7fac8dffb700 1 -- 192.168.123.100:0/1264571586 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fac78003a90 con 0x7fac90106830 2026-03-10T12:32:27.341 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.039+0000 7fac95728700 1 -- 
192.168.123.100:0/1264571586 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fac90106830 msgr2=0x7fac90108c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.039+0000 7fac95728700 1 --2- 192.168.123.100:0/1264571586 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fac90106830 0x7fac90108c10 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fac78009cf0 tx=0x7fac7800b0e0 comp rx=0 tx=0).stop 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.039+0000 7fac95728700 1 -- 192.168.123.100:0/1264571586 shutdown_connections 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.039+0000 7fac95728700 1 --2- 192.168.123.100:0/1264571586 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fac90106830 0x7fac90108c10 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.039+0000 7fac95728700 1 -- 192.168.123.100:0/1264571586 >> 192.168.123.100:0/1264571586 conn(0x7fac90100270 msgr2=0x7fac901026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.039+0000 7fac95728700 1 -- 192.168.123.100:0/1264571586 shutdown_connections 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.039+0000 7fac95728700 1 -- 192.168.123.100:0/1264571586 wait complete. 
2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.039+0000 7fac95728700 1 Processor -- start 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.040+0000 7fac95728700 1 -- start start 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.040+0000 7fac95728700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fac90106830 0x7fac901a0010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.040+0000 7fac95728700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac901a0550 con 0x7fac90106830 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.040+0000 7fac8effd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fac90106830 0x7fac901a0010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.040+0000 7fac8effd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fac90106830 0x7fac901a0010 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50524/0 (socket says 192.168.123.100:50524) 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.040+0000 7fac8effd700 1 -- 192.168.123.100:0/2982805212 learned_addr learned my addr 192.168.123.100:0/2982805212 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:27.342 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.040+0000 7fac8effd700 1 -- 192.168.123.100:0/2982805212 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fac78009740 con 0x7fac90106830 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.041+0000 7fac8effd700 1 --2- 192.168.123.100:0/2982805212 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fac90106830 0x7fac901a0010 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fac78009710 tx=0x7fac78003e00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.041+0000 7fac87fff700 1 -- 192.168.123.100:0/2982805212 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fac78004120 con 0x7fac90106830 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.041+0000 7fac87fff700 1 -- 192.168.123.100:0/2982805212 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fac78004280 con 0x7fac90106830 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.041+0000 7fac87fff700 1 -- 192.168.123.100:0/2982805212 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fac78011510 con 0x7fac90106830 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.041+0000 7fac95728700 1 -- 192.168.123.100:0/2982805212 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fac901a0750 con 0x7fac90106830 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.041+0000 7fac95728700 1 
-- 192.168.123.100:0/2982805212 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fac901a0bf0 con 0x7fac90106830 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.042+0000 7fac87fff700 1 -- 192.168.123.100:0/2982805212 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 4) v1 ==== 45068+0+0 (secure 0 0 0) 0x7fac780043f0 con 0x7fac90106830 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.042+0000 7fac95728700 1 -- 192.168.123.100:0/2982805212 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fac90191670 con 0x7fac90106830 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.042+0000 7fac87fff700 1 --2- 192.168.123.100:0/2982805212 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fac7c0383c0 0x7fac7c03a870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.042+0000 7fac87fff700 1 -- 192.168.123.100:0/2982805212 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fac7804b980 con 0x7fac90106830 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.042+0000 7fac8e7fc700 1 --2- 192.168.123.100:0/2982805212 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fac7c0383c0 0x7fac7c03a870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.045+0000 7fac8e7fc700 1 --2- 192.168.123.100:0/2982805212 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fac7c0383c0 0x7fac7c03a870 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fac80006fd0 tx=0x7fac80006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.046+0000 7fac87fff700 1 -- 192.168.123.100:0/2982805212 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fac78020070 con 0x7fac90106830 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.177+0000 7fac95728700 1 -- 192.168.123.100:0/2982805212 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "cephadm"} v 0) v1 -- 0x7fac90062380 con 0x7fac90106830 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.272+0000 7fac87fff700 1 -- 192.168.123.100:0/2982805212 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mgrmap(e 5) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fac78011670 con 0x7fac90106830 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.272+0000 7fac87fff700 1 -- 192.168.123.100:0/2982805212 <== mon.0 v2:192.168.123.100:3300/0 8 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "cephadm"}]=0 v5) v1 ==== 86+0+0 (secure 0 0 0) 0x7fac7804e040 con 0x7fac90106830 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.275+0000 7fac95728700 1 -- 192.168.123.100:0/2982805212 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fac7c0383c0 msgr2=0x7fac7c03a870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 
2026-03-10T12:32:27.275+0000 7fac95728700 1 --2- 192.168.123.100:0/2982805212 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fac7c0383c0 0x7fac7c03a870 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fac80006fd0 tx=0x7fac80006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.275+0000 7fac95728700 1 -- 192.168.123.100:0/2982805212 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fac90106830 msgr2=0x7fac901a0010 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.275+0000 7fac95728700 1 --2- 192.168.123.100:0/2982805212 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fac90106830 0x7fac901a0010 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fac78009710 tx=0x7fac78003e00 comp rx=0 tx=0).stop 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.275+0000 7fac95728700 1 -- 192.168.123.100:0/2982805212 shutdown_connections 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.275+0000 7fac95728700 1 --2- 192.168.123.100:0/2982805212 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fac7c0383c0 0x7fac7c03a870 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.275+0000 7fac95728700 1 --2- 192.168.123.100:0/2982805212 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fac90106830 0x7fac901a0010 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.275+0000 7fac95728700 1 -- 192.168.123.100:0/2982805212 >> 
192.168.123.100:0/2982805212 conn(0x7fac90100270 msgr2=0x7fac90100f20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.275+0000 7fac95728700 1 -- 192.168.123.100:0/2982805212 shutdown_connections 2026-03-10T12:32:27.342 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.275+0000 7fac95728700 1 -- 192.168.123.100:0/2982805212 wait complete. 2026-03-10T12:32:27.437 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:27 vm00 ceph-mon[50686]: mgrmap e4: vm00.nescmq(active, since 2s) 2026-03-10T12:32:27.437 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:27 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/3613706768' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-10T12:32:27.437 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:27 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/3419304080' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch 2026-03-10T12:32:27.437 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:27 vm00 ceph-mon[50686]: from='client.? 
192.168.123.100:0/2982805212' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "cephadm"}]: dispatch 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout { 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 5, 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "active_name": "vm00.nescmq", 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout } 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.484+0000 7f2ed69ce700 1 Processor -- start 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.484+0000 7f2ed69ce700 1 -- start start 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.484+0000 7f2ed69ce700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2ed0072b50 0x7f2ed0071050 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.484+0000 7f2ed69ce700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2ed0071590 con 0x7f2ed0072b50 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.485+0000 7f2ecffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2ed0072b50 0x7f2ed0071050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:27.706 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.485+0000 7f2ecffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2ed0072b50 0x7f2ed0071050 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50548/0 (socket says 192.168.123.100:50548) 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.485+0000 7f2ecffff700 1 -- 192.168.123.100:0/973921852 learned_addr learned my addr 192.168.123.100:0/973921852 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.485+0000 7f2ecffff700 1 -- 192.168.123.100:0/973921852 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2ed00716d0 con 0x7f2ed0072b50 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.485+0000 7f2ecffff700 1 --2- 192.168.123.100:0/973921852 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2ed0072b50 0x7f2ed0071050 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f2ec000b0d0 tx=0x7f2ec000b490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=99eeeb3ac171997f server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.486+0000 7f2eceffd700 1 -- 192.168.123.100:0/973921852 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2ec000e070 con 0x7f2ed0072b50 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.486+0000 7f2eceffd700 1 -- 192.168.123.100:0/973921852 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f2ec0003a20 con 0x7f2ed0072b50 
2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.486+0000 7f2eceffd700 1 -- 192.168.123.100:0/973921852 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2ec0004670 con 0x7f2ed0072b50 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.490+0000 7f2ed69ce700 1 -- 192.168.123.100:0/973921852 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2ed0072b50 msgr2=0x7f2ed0071050 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.490+0000 7f2ed69ce700 1 --2- 192.168.123.100:0/973921852 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2ed0072b50 0x7f2ed0071050 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f2ec000b0d0 tx=0x7f2ec000b490 comp rx=0 tx=0).stop 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.491+0000 7f2ed69ce700 1 -- 192.168.123.100:0/973921852 shutdown_connections 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.491+0000 7f2ed69ce700 1 --2- 192.168.123.100:0/973921852 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2ed0072b50 0x7f2ed0071050 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.491+0000 7f2ed69ce700 1 -- 192.168.123.100:0/973921852 >> 192.168.123.100:0/973921852 conn(0x7f2ed006c970 msgr2=0x7f2ed006eda0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.491+0000 7f2ed69ce700 1 -- 192.168.123.100:0/973921852 shutdown_connections 2026-03-10T12:32:27.706 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.491+0000 7f2ed69ce700 1 -- 192.168.123.100:0/973921852 wait complete. 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.491+0000 7f2ed69ce700 1 Processor -- start 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.491+0000 7f2ed69ce700 1 -- start start 2026-03-10T12:32:27.706 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.492+0000 7f2ed69ce700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2ed0072b50 0x7f2ed01a8720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.492+0000 7f2ed69ce700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2ed01a8c60 con 0x7f2ed0072b50 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.492+0000 7f2ecffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2ed0072b50 0x7f2ed01a8720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.492+0000 7f2ecffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2ed0072b50 0x7f2ed01a8720 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:50560/0 (socket says 192.168.123.100:50560) 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.492+0000 7f2ecffff700 1 -- 192.168.123.100:0/2565746801 learned_addr 
learned my addr 192.168.123.100:0/2565746801 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.492+0000 7f2ecffff700 1 -- 192.168.123.100:0/2565746801 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2ec0009d20 con 0x7f2ed0072b50 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.493+0000 7f2ecffff700 1 --2- 192.168.123.100:0/2565746801 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2ed0072b50 0x7f2ed01a8720 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f2ec0000f80 tx=0x7f2ec000bd60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.493+0000 7f2ecd7fa700 1 -- 192.168.123.100:0/2565746801 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2ec000e070 con 0x7f2ed0072b50 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.493+0000 7f2ed69ce700 1 -- 192.168.123.100:0/2565746801 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2ed01a8e60 con 0x7f2ed0072b50 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.493+0000 7f2ed69ce700 1 -- 192.168.123.100:0/2565746801 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2ed01a9360 con 0x7f2ed0072b50 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.493+0000 7f2ecd7fa700 1 -- 192.168.123.100:0/2565746801 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f2ec00092e0 con 0x7f2ed0072b50 
2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.493+0000 7f2ecd7fa700 1 -- 192.168.123.100:0/2565746801 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2ec00129a0 con 0x7f2ed0072b50 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.494+0000 7f2ecd7fa700 1 -- 192.168.123.100:0/2565746801 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 5) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f2ec0019040 con 0x7f2ed0072b50 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.494+0000 7f2ecd7fa700 1 --2- 192.168.123.100:0/2565746801 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2ebc0384c0 0x7f2ebc03a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.494+0000 7f2ecf7fe700 1 -- 192.168.123.100:0/2565746801 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2ebc0384c0 msgr2=0x7f2ebc03a970 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.100:6800/2 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.494+0000 7f2ecf7fe700 1 --2- 192.168.123.100:0/2565746801 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2ebc0384c0 0x7f2ebc03a970 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.494+0000 7f2ecd7fa700 1 -- 192.168.123.100:0/2565746801 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f2ec004b940 con 0x7f2ed0072b50 2026-03-10T12:32:27.707 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.495+0000 7f2ed69ce700 1 -- 192.168.123.100:0/2565746801 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2eb0005320 con 0x7f2ed0072b50 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.498+0000 7f2ecd7fa700 1 -- 192.168.123.100:0/2565746801 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2ec001f070 con 0x7f2ed0072b50 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.636+0000 7f2ed69ce700 1 -- 192.168.123.100:0/2565746801 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7f2eb0006200 con 0x7f2ed0072b50 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.637+0000 7f2ecd7fa700 1 -- 192.168.123.100:0/2565746801 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v5) v1 ==== 56+0+98 (secure 0 0 0) 0x7f2ec0017030 con 0x7f2ed0072b50 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.640+0000 7f2ec6ffd700 1 -- 192.168.123.100:0/2565746801 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2ebc0384c0 msgr2=0x7f2ebc03a970 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.640+0000 7f2ec6ffd700 1 --2- 192.168.123.100:0/2565746801 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2ebc0384c0 0x7f2ebc03a970 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: 
stderr 2026-03-10T12:32:27.640+0000 7f2ec6ffd700 1 -- 192.168.123.100:0/2565746801 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2ed0072b50 msgr2=0x7f2ed01a8720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.640+0000 7f2ec6ffd700 1 --2- 192.168.123.100:0/2565746801 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2ed0072b50 0x7f2ed01a8720 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f2ec0000f80 tx=0x7f2ec000bd60 comp rx=0 tx=0).stop 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.640+0000 7f2ec6ffd700 1 -- 192.168.123.100:0/2565746801 shutdown_connections 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.640+0000 7f2ec6ffd700 1 --2- 192.168.123.100:0/2565746801 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2ebc0384c0 0x7f2ebc03a970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.640+0000 7f2ec6ffd700 1 --2- 192.168.123.100:0/2565746801 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2ed0072b50 0x7f2ed01a8720 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.641+0000 7f2ec6ffd700 1 -- 192.168.123.100:0/2565746801 >> 192.168.123.100:0/2565746801 conn(0x7f2ed006c970 msgr2=0x7f2ed006d700 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.641+0000 7f2ec6ffd700 1 -- 192.168.123.100:0/2565746801 shutdown_connections 2026-03-10T12:32:27.707 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.641+0000 7f2ec6ffd700 1 -- 192.168.123.100:0/2565746801 wait complete. 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:Waiting for the mgr to restart... 2026-03-10T12:32:27.707 INFO:teuthology.orchestra.run.vm00.stdout:Waiting for mgr epoch 5... 2026-03-10T12:32:28.532 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:28 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/2982805212' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "cephadm"}]': finished 2026-03-10T12:32:28.532 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:28 vm00 ceph-mon[50686]: mgrmap e5: vm00.nescmq(active, since 3s) 2026-03-10T12:32:28.533 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:28 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/2565746801' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch 2026-03-10T12:32:32.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:32 vm00 ceph-mon[50686]: Active manager daemon vm00.nescmq restarted 2026-03-10T12:32:32.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:32 vm00 ceph-mon[50686]: Activating manager daemon vm00.nescmq 2026-03-10T12:32:32.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:32 vm00 ceph-mon[50686]: osdmap e2: 0 total, 0 up, 0 in 2026-03-10T12:32:32.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:32 vm00 ceph-mon[50686]: mgrmap e6: vm00.nescmq(active, starting, since 0.0562175s) 2026-03-10T12:32:32.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:32 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:32:32.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:32 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr metadata", "who": "vm00.nescmq", "id": "vm00.nescmq"}]: 
dispatch 2026-03-10T12:32:32.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:32 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T12:32:32.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:32 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T12:32:32.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:32 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T12:32:32.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:32 vm00 ceph-mon[50686]: Manager daemon vm00.nescmq is now available 2026-03-10T12:32:32.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:32 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:32.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:32 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout { 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 7, 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout } 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.854+0000 7f8c35c8a700 1 Processor -- start 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.854+0000 7f8c35c8a700 1 -- start start 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.854+0000 7f8c35c8a700 1 --2- >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c30072a40 0x7f8c30071060 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.854+0000 7f8c35c8a700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8c300715a0 con 0x7f8c30072a40 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.854+0000 7f8c34c88700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c30072a40 0x7f8c30071060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.854+0000 7f8c34c88700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c30072a40 0x7f8c30071060 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60114/0 (socket says 192.168.123.100:60114) 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.854+0000 7f8c34c88700 1 -- 192.168.123.100:0/4215381851 learned_addr learned my addr 192.168.123.100:0/4215381851 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.854+0000 7f8c34c88700 1 -- 192.168.123.100:0/4215381851 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8c300716e0 con 0x7f8c30072a40 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.855+0000 7f8c34c88700 1 --2- 192.168.123.100:0/4215381851 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c30072a40 0x7f8c30071060 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f8c20009a90 tx=0x7f8c20009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=19f05ddd7f84dd7 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.859+0000 7f8c2f7fe700 1 -- 192.168.123.100:0/4215381851 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8c20004030 con 0x7f8c30072a40 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.859+0000 7f8c2f7fe700 1 -- 192.168.123.100:0/4215381851 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f8c2000b7e0 con 0x7f8c30072a40 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.859+0000 7f8c2f7fe700 1 -- 192.168.123.100:0/4215381851 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8c200039f0 con 0x7f8c30072a40 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.859+0000 7f8c35c8a700 1 -- 192.168.123.100:0/4215381851 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c30072a40 msgr2=0x7f8c30071060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.859+0000 7f8c35c8a700 1 --2- 192.168.123.100:0/4215381851 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c30072a40 0x7f8c30071060 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f8c20009a90 tx=0x7f8c20009da0 comp rx=0 tx=0).stop 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.860+0000 7f8c35c8a700 1 -- 192.168.123.100:0/4215381851 
shutdown_connections 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.860+0000 7f8c35c8a700 1 --2- 192.168.123.100:0/4215381851 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c30072a40 0x7f8c30071060 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.860+0000 7f8c35c8a700 1 -- 192.168.123.100:0/4215381851 >> 192.168.123.100:0/4215381851 conn(0x7f8c3006c9d0 msgr2=0x7f8c3006ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.860+0000 7f8c35c8a700 1 -- 192.168.123.100:0/4215381851 shutdown_connections 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.861+0000 7f8c35c8a700 1 -- 192.168.123.100:0/4215381851 wait complete. 
2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.861+0000 7f8c35c8a700 1 Processor -- start 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.861+0000 7f8c35c8a700 1 -- start start 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.861+0000 7f8c35c8a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c301a8820 0x7f8c301a8c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.861+0000 7f8c35c8a700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8c301a9170 con 0x7f8c301a8820 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.862+0000 7f8c34c88700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c301a8820 0x7f8c301a8c30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.862+0000 7f8c34c88700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c301a8820 0x7f8c301a8c30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60128/0 (socket says 192.168.123.100:60128) 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.862+0000 7f8c34c88700 1 -- 192.168.123.100:0/1147243162 learned_addr learned my addr 192.168.123.100:0/1147243162 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:33.185 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.863+0000 7f8c34c88700 1 -- 192.168.123.100:0/1147243162 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8c20009740 con 0x7f8c301a8820 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.863+0000 7f8c34c88700 1 --2- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c301a8820 0x7f8c301a8c30 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f8c200037e0 tx=0x7f8c20003b40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.863+0000 7f8c2dffb700 1 -- 192.168.123.100:0/1147243162 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8c20003fd0 con 0x7f8c301a8820 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.863+0000 7f8c35c8a700 1 -- 192.168.123.100:0/1147243162 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8c301a9370 con 0x7f8c301a8820 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.864+0000 7f8c35c8a700 1 -- 192.168.123.100:0/1147243162 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8c3007b250 con 0x7f8c301a8820 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.864+0000 7f8c2dffb700 1 -- 192.168.123.100:0/1147243162 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f8c20024460 con 0x7f8c301a8820 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.864+0000 7f8c2dffb700 1 
-- 192.168.123.100:0/1147243162 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8c2001b440 con 0x7f8c301a8820 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.865+0000 7f8c2dffb700 1 -- 192.168.123.100:0/1147243162 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 5) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f8c2001b5a0 con 0x7f8c301a8820 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.865+0000 7f8c2dffb700 1 --2- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 0x7f8c1803a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.865+0000 7f8c2dffb700 1 -- 192.168.123.100:0/1147243162 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f8c2004d190 con 0x7f8c301a8820 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.865+0000 7f8c2ffff700 1 -- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 msgr2=0x7f8c1803a970 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.100:6800/2 2026-03-10T12:32:33.185 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.865+0000 7f8c2ffff700 1 --2- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 0x7f8c1803a970 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:27.866+0000 7f8c35c8a700 1 -- 192.168.123.100:0/1147243162 --> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f8c1c000d40 con 0x7f8c180384c0 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:28.065+0000 7f8c2ffff700 1 -- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 msgr2=0x7f8c1803a970 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.100:6800/2 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:28.065+0000 7f8c2ffff700 1 --2- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 0x7f8c1803a970 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:28.466+0000 7f8c2ffff700 1 -- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 msgr2=0x7f8c1803a970 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.100:6800/2 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:28.466+0000 7f8c2ffff700 1 --2- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 0x7f8c1803a970 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:29.267+0000 7f8c2ffff700 1 -- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 msgr2=0x7f8c1803a970 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.100:6800/2 2026-03-10T12:32:33.186 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:29.267+0000 7f8c2ffff700 1 --2- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 0x7f8c1803a970 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:30.868+0000 7f8c2ffff700 1 -- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 msgr2=0x7f8c1803a970 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.100:6800/2 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:30.868+0000 7f8c2ffff700 1 --2- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 0x7f8c1803a970 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 3.200000 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:32.155+0000 7f8c2dffb700 1 -- 192.168.123.100:0/1147243162 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mgrmap(e 6) v1 ==== 44846+0+0 (secure 0 0 0) 0x7f8c2002e430 con 0x7f8c301a8820 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:32.155+0000 7f8c2dffb700 1 -- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 msgr2=0x7f8c1803a970 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:32.155+0000 7f8c2dffb700 1 --2- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 0x7f8c1803a970 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.147+0000 7f8c2dffb700 1 -- 192.168.123.100:0/1147243162 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7f8c2000eca0 con 0x7f8c301a8820 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.147+0000 7f8c2dffb700 1 --2- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 0x7f8c1803a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.147+0000 7f8c2dffb700 1 -- 192.168.123.100:0/1147243162 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f8c1c000d40 con 0x7f8c180384c0 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.150+0000 7f8c2ffff700 1 --2- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 0x7f8c1803a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.150+0000 7f8c2ffff700 1 --2- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 0x7f8c1803a970 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f8c24003a60 tx=0x7f8c240092b0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.151+0000 7f8c2dffb700 1 -- 192.168.123.100:0/1147243162 <== mgr.14120 v2:192.168.123.100:6800/2 1 ==== 
command_reply(tid 0: 0 ) v1 ==== 8+0+6910 (secure 0 0 0) 0x7f8c1c000d40 con 0x7f8c180384c0 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.155+0000 7f8c35c8a700 1 -- 192.168.123.100:0/1147243162 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7f8c1c002800 con 0x7f8c180384c0 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.156+0000 7f8c2dffb700 1 -- 192.168.123.100:0/1147243162 <== mgr.14120 v2:192.168.123.100:6800/2 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+51 (secure 0 0 0) 0x7f8c1c002800 con 0x7f8c180384c0 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.156+0000 7f8c35c8a700 1 -- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 msgr2=0x7f8c1803a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.156+0000 7f8c35c8a700 1 --2- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 0x7f8c1803a970 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f8c24003a60 tx=0x7f8c240092b0 comp rx=0 tx=0).stop 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.156+0000 7f8c35c8a700 1 -- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c301a8820 msgr2=0x7f8c301a8c30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.156+0000 7f8c35c8a700 1 --2- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c301a8820 0x7f8c301a8c30 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 
crypto rx=0x7f8c200037e0 tx=0x7f8c20003b40 comp rx=0 tx=0).stop 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.156+0000 7f8c35c8a700 1 -- 192.168.123.100:0/1147243162 shutdown_connections 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.156+0000 7f8c35c8a700 1 --2- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c180384c0 0x7f8c1803a970 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.156+0000 7f8c35c8a700 1 --2- 192.168.123.100:0/1147243162 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c301a8820 0x7f8c301a8c30 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.157+0000 7f8c35c8a700 1 -- 192.168.123.100:0/1147243162 >> 192.168.123.100:0/1147243162 conn(0x7f8c3006c9d0 msgr2=0x7f8c3006dfd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.157+0000 7f8c35c8a700 1 -- 192.168.123.100:0/1147243162 shutdown_connections 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.157+0000 7f8c35c8a700 1 -- 192.168.123.100:0/1147243162 wait complete. 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:mgr epoch 5 is available 2026-03-10T12:32:33.186 INFO:teuthology.orchestra.run.vm00.stdout:Setting orchestrator backend to cephadm... 2026-03-10T12:32:33.418 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:33 vm00 ceph-mon[50686]: Found migration_current of "None". Setting to last migration. 
2026-03-10T12:32:33.418 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:33 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:32:33.418 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:33 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:32:33.418 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:33 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:32:33.418 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:33 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/mirror_snapshot_schedule"}]: dispatch 2026-03-10T12:32:33.418 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:33 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/trash_purge_schedule"}]: dispatch 2026-03-10T12:32:33.418 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:33 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:33.418 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:33 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:33.418 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:33 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:32:33.419 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:33 vm00 ceph-mon[50686]: mgrmap e7: vm00.nescmq(active, since 1.04997s) 2026-03-10T12:32:33.629 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.417+0000 7fc594d23700 1 Processor -- start 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.417+0000 7fc594d23700 1 -- start start 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.418+0000 7fc594d23700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc590105e50 0x7fc590106260 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.418+0000 7fc594d23700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc5901067a0 con 0x7fc590105e50 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.418+0000 7fc58e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc590105e50 0x7fc590106260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.418+0000 7fc58e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc590105e50 0x7fc590106260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60212/0 (socket says 192.168.123.100:60212) 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.418+0000 7fc58e59c700 1 -- 192.168.123.100:0/3819768830 learned_addr learned my addr 192.168.123.100:0/3819768830 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 
2026-03-10T12:32:33.418+0000 7fc58e59c700 1 -- 192.168.123.100:0/3819768830 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc5901068e0 con 0x7fc590105e50 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.419+0000 7fc58e59c700 1 --2- 192.168.123.100:0/3819768830 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc590105e50 0x7fc590106260 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fc578009a90 tx=0x7fc578009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=9faa51074eb4b1fd server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.419+0000 7fc58d59a700 1 -- 192.168.123.100:0/3819768830 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc57800fbf0 con 0x7fc590105e50 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.419+0000 7fc58d59a700 1 -- 192.168.123.100:0/3819768830 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fc5780044d0 con 0x7fc590105e50 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.419+0000 7fc594d23700 1 -- 192.168.123.100:0/3819768830 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc590105e50 msgr2=0x7fc590106260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.419+0000 7fc594d23700 1 --2- 192.168.123.100:0/3819768830 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc590105e50 0x7fc590106260 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fc578009a90 tx=0x7fc578009da0 comp rx=0 tx=0).stop 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: 
stderr 2026-03-10T12:32:33.419+0000 7fc594d23700 1 -- 192.168.123.100:0/3819768830 shutdown_connections 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.419+0000 7fc594d23700 1 --2- 192.168.123.100:0/3819768830 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc590105e50 0x7fc590106260 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.419+0000 7fc594d23700 1 -- 192.168.123.100:0/3819768830 >> 192.168.123.100:0/3819768830 conn(0x7fc590101420 msgr2=0x7fc590103830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.420+0000 7fc594d23700 1 -- 192.168.123.100:0/3819768830 shutdown_connections 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.420+0000 7fc594d23700 1 -- 192.168.123.100:0/3819768830 wait complete. 
2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.420+0000 7fc594d23700 1 Processor -- start 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.420+0000 7fc594d23700 1 -- start start 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.420+0000 7fc594d23700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc590105e50 0x7fc590197a60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.420+0000 7fc594d23700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc578003e60 con 0x7fc590105e50 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.421+0000 7fc58e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc590105e50 0x7fc590197a60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.421+0000 7fc58e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc590105e50 0x7fc590197a60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60224/0 (socket says 192.168.123.100:60224) 2026-03-10T12:32:33.629 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.421+0000 7fc58e59c700 1 -- 192.168.123.100:0/983230811 learned_addr learned my addr 192.168.123.100:0/983230811 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:33.629 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.421+0000 7fc58e59c700 1 -- 192.168.123.100:0/983230811 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc578009740 con 0x7fc590105e50 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.421+0000 7fc58e59c700 1 --2- 192.168.123.100:0/983230811 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc590105e50 0x7fc590197a60 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fc5780044f0 tx=0x7fc5780045d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.421+0000 7fc5877fe700 1 -- 192.168.123.100:0/983230811 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc578003cc0 con 0x7fc590105e50 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.421+0000 7fc5877fe700 1 -- 192.168.123.100:0/983230811 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fc578017e10 con 0x7fc590105e50 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.421+0000 7fc594d23700 1 -- 192.168.123.100:0/983230811 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc590197fa0 con 0x7fc590105e50 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.421+0000 7fc5877fe700 1 -- 192.168.123.100:0/983230811 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc5780213e0 con 0x7fc590105e50 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.421+0000 7fc594d23700 1 -- 
192.168.123.100:0/983230811 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc590198440 con 0x7fc590105e50 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.422+0000 7fc5877fe700 1 -- 192.168.123.100:0/983230811 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7fc578021540 con 0x7fc590105e50 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.423+0000 7fc5877fe700 1 --2- 192.168.123.100:0/983230811 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc57c0383e0 0x7fc57c03a890 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.423+0000 7fc5877fe700 1 -- 192.168.123.100:0/983230811 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fc578050f00 con 0x7fc590105e50 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.423+0000 7fc58dd9b700 1 --2- 192.168.123.100:0/983230811 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc57c0383e0 0x7fc57c03a890 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.423+0000 7fc594d23700 1 -- 192.168.123.100:0/983230811 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc5901068e0 con 0x7fc590105e50 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.427+0000 7fc5877fe700 1 -- 192.168.123.100:0/983230811 <== mon.0 v2:192.168.123.100:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc5901068e0 con 0x7fc590105e50 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.427+0000 7fc58dd9b700 1 --2- 192.168.123.100:0/983230811 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc57c0383e0 0x7fc57c03a890 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fc580006fd0 tx=0x7fc580006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.543+0000 7fc594d23700 1 -- 192.168.123.100:0/983230811 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}) v1 -- 0x7fc5901068e0 con 0x7fc57c0383e0 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.554+0000 7fc5877fe700 1 -- 192.168.123.100:0/983230811 <== mgr.14120 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7fc5901068e0 con 0x7fc57c0383e0 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.557+0000 7fc594d23700 1 -- 192.168.123.100:0/983230811 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc57c0383e0 msgr2=0x7fc57c03a890 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.557+0000 7fc594d23700 1 --2- 192.168.123.100:0/983230811 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc57c0383e0 0x7fc57c03a890 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fc580006fd0 tx=0x7fc580006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: 
stderr 2026-03-10T12:32:33.557+0000 7fc594d23700 1 -- 192.168.123.100:0/983230811 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc590105e50 msgr2=0x7fc590197a60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.557+0000 7fc594d23700 1 --2- 192.168.123.100:0/983230811 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc590105e50 0x7fc590197a60 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fc5780044f0 tx=0x7fc5780045d0 comp rx=0 tx=0).stop 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.558+0000 7fc594d23700 1 -- 192.168.123.100:0/983230811 shutdown_connections 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.558+0000 7fc594d23700 1 --2- 192.168.123.100:0/983230811 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc57c0383e0 0x7fc57c03a890 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.558+0000 7fc594d23700 1 --2- 192.168.123.100:0/983230811 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc590105e50 0x7fc590197a60 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.558+0000 7fc594d23700 1 -- 192.168.123.100:0/983230811 >> 192.168.123.100:0/983230811 conn(0x7fc590101420 msgr2=0x7fc590102dc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:33.630 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.558+0000 7fc594d23700 1 -- 192.168.123.100:0/983230811 shutdown_connections 2026-03-10T12:32:33.630 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.558+0000 7fc594d23700 1 -- 192.168.123.100:0/983230811 wait complete. 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout value unchanged 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.763+0000 7f2451f4b700 1 Processor -- start 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.764+0000 7f2451f4b700 1 -- start start 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.764+0000 7f2451f4b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f244c104620 0x7f244c106a40 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.764+0000 7f2451f4b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f244c0745b0 con 0x7f244c104620 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.764+0000 7f244b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f244c104620 0x7f244c106a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.764+0000 7f244b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f244c104620 0x7f244c106a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60230/0 (socket says 192.168.123.100:60230) 2026-03-10T12:32:34.018 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.764+0000 7f244b7fe700 1 -- 192.168.123.100:0/411572668 learned_addr learned my addr 192.168.123.100:0/411572668 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.764+0000 7f244b7fe700 1 -- 192.168.123.100:0/411572668 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f244c0746f0 con 0x7f244c104620 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.765+0000 7f244b7fe700 1 --2- 192.168.123.100:0/411572668 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f244c104620 0x7f244c106a40 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f2434009cf0 tx=0x7f243400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=3110e446d3b30727 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.765+0000 7f244a7fc700 1 -- 192.168.123.100:0/411572668 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2434004030 con 0x7f244c104620 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.765+0000 7f244a7fc700 1 -- 192.168.123.100:0/411572668 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f243400b810 con 0x7f244c104620 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.766+0000 7f2451f4b700 1 -- 192.168.123.100:0/411572668 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f244c104620 msgr2=0x7f244c106a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.766+0000 
7f2451f4b700 1 --2- 192.168.123.100:0/411572668 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f244c104620 0x7f244c106a40 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f2434009cf0 tx=0x7f243400b0e0 comp rx=0 tx=0).stop 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.766+0000 7f2451f4b700 1 -- 192.168.123.100:0/411572668 shutdown_connections 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.766+0000 7f2451f4b700 1 --2- 192.168.123.100:0/411572668 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f244c104620 0x7f244c106a40 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.766+0000 7f2451f4b700 1 -- 192.168.123.100:0/411572668 >> 192.168.123.100:0/411572668 conn(0x7f244c100270 msgr2=0x7f244c1026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.766+0000 7f2451f4b700 1 -- 192.168.123.100:0/411572668 shutdown_connections 2026-03-10T12:32:34.018 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.766+0000 7f2451f4b700 1 -- 192.168.123.100:0/411572668 wait complete. 
2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.767+0000 7f2451f4b700 1 Processor -- start 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.767+0000 7f2451f4b700 1 -- start start 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.767+0000 7f2451f4b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f244c104620 0x7f244c1a0090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.767+0000 7f244b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f244c104620 0x7f244c1a0090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.767+0000 7f244b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f244c104620 0x7f244c1a0090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60234/0 (socket says 192.168.123.100:60234) 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.767+0000 7f244b7fe700 1 -- 192.168.123.100:0/4079357364 learned_addr learned my addr 192.168.123.100:0/4079357364 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.767+0000 7f2451f4b700 1 -- 192.168.123.100:0/4079357364 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f244c0745b0 con 0x7f244c104620 2026-03-10T12:32:34.019 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.768+0000 7f244b7fe700 1 -- 192.168.123.100:0/4079357364 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2434009740 con 0x7f244c104620 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.768+0000 7f244b7fe700 1 --2- 192.168.123.100:0/4079357364 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f244c104620 0x7f244c1a0090 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f2434003f50 tx=0x7f2434004030 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.768+0000 7f2448ff9700 1 -- 192.168.123.100:0/4079357364 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2434004340 con 0x7f244c104620 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.768+0000 7f2448ff9700 1 -- 192.168.123.100:0/4079357364 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f24340044a0 con 0x7f244c104620 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.769+0000 7f2448ff9700 1 -- 192.168.123.100:0/4079357364 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f243401a5b0 con 0x7f244c104620 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.769+0000 7f2451f4b700 1 -- 192.168.123.100:0/4079357364 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f244c1a05d0 con 0x7f244c104620 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.769+0000 7f2451f4b700 1 
-- 192.168.123.100:0/4079357364 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f244c1a0a70 con 0x7f244c104620 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.770+0000 7f2448ff9700 1 -- 192.168.123.100:0/4079357364 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7f2434023480 con 0x7f244c104620 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.770+0000 7f2451f4b700 1 -- 192.168.123.100:0/4079357364 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f244c199900 con 0x7f244c104620 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.770+0000 7f2448ff9700 1 --2- 192.168.123.100:0/4079357364 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2438037f80 0x7f243803a430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.770+0000 7f2448ff9700 1 -- 192.168.123.100:0/4079357364 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f243401e070 con 0x7f244c104620 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.772+0000 7f244affd700 1 --2- 192.168.123.100:0/4079357364 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2438037f80 0x7f243803a430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.772+0000 7f244affd700 1 --2- 192.168.123.100:0/4079357364 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2438037f80 0x7f243803a430 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f243c006fd0 tx=0x7f243c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.774+0000 7f2448ff9700 1 -- 192.168.123.100:0/4079357364 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2434021560 con 0x7f244c104620 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.885+0000 7f2451f4b700 1 -- 192.168.123.100:0/4079357364 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}) v1 -- 0x7f244c02cfa0 con 0x7f2438037f80 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.885+0000 7f2448ff9700 1 -- 192.168.123.100:0/4079357364 <== mgr.14120 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+16 (secure 0 0 0) 0x7f244c02cfa0 con 0x7f2438037f80 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.888+0000 7f2451f4b700 1 -- 192.168.123.100:0/4079357364 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2438037f80 msgr2=0x7f243803a430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.888+0000 7f2451f4b700 1 --2- 192.168.123.100:0/4079357364 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2438037f80 0x7f243803a430 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f243c006fd0 tx=0x7f243c006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:34.019 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.888+0000 7f2451f4b700 1 -- 192.168.123.100:0/4079357364 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f244c104620 msgr2=0x7f244c1a0090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.888+0000 7f2451f4b700 1 --2- 192.168.123.100:0/4079357364 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f244c104620 0x7f244c1a0090 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f2434003f50 tx=0x7f2434004030 comp rx=0 tx=0).stop 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.888+0000 7f2451f4b700 1 -- 192.168.123.100:0/4079357364 shutdown_connections 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.888+0000 7f2451f4b700 1 --2- 192.168.123.100:0/4079357364 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2438037f80 0x7f243803a430 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.888+0000 7f2451f4b700 1 --2- 192.168.123.100:0/4079357364 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f244c104620 0x7f244c1a0090 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.888+0000 7f2451f4b700 1 -- 192.168.123.100:0/4079357364 >> 192.168.123.100:0/4079357364 conn(0x7f244c100270 msgr2=0x7f244c1026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.889+0000 7f2451f4b700 1 -- 192.168.123.100:0/4079357364 shutdown_connections 
2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:33.889+0000 7f2451f4b700 1 -- 192.168.123.100:0/4079357364 wait complete. 2026-03-10T12:32:34.019 INFO:teuthology.orchestra.run.vm00.stdout:Generating ssh key... 2026-03-10T12:32:34.404 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.159+0000 7fb0008af700 1 Processor -- start 2026-03-10T12:32:34.404 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.159+0000 7fb0008af700 1 -- start start 2026-03-10T12:32:34.404 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.159+0000 7fb0008af700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faffc105b50 0x7faffc105f60 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:34.404 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.159+0000 7fb0008af700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faffc1064a0 con 0x7faffc105b50 2026-03-10T12:32:34.405 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.160+0000 7faffad9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faffc105b50 0x7faffc105f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:34.405 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.160+0000 7faffad9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faffc105b50 0x7faffc105f60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60240/0 (socket says 192.168.123.100:60240) 2026-03-10T12:32:34.405 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.160+0000 7faffad9d700 1 -- 192.168.123.100:0/2902287643 learned_addr learned my addr 192.168.123.100:0/2902287643 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.160+0000 7faffad9d700 1 -- 192.168.123.100:0/2902287643 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faffc1065e0 con 0x7faffc105b50 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.160+0000 7faffad9d700 1 --2- 192.168.123.100:0/2902287643 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faffc105b50 0x7faffc105f60 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fafe4009cf0 tx=0x7fafe400f4f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=ffe1e2f50e731413 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.161+0000 7faff9d9b700 1 -- 192.168.123.100:0/2902287643 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fafe400fc20 con 0x7faffc105b50 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.161+0000 7faff9d9b700 1 -- 192.168.123.100:0/2902287643 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fafe40044d0 con 0x7faffc105b50 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.161+0000 7fb0008af700 1 -- 192.168.123.100:0/2902287643 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faffc105b50 msgr2=0x7faffc105f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.161+0000 
7fb0008af700 1 --2- 192.168.123.100:0/2902287643 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faffc105b50 0x7faffc105f60 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fafe4009cf0 tx=0x7fafe400f4f0 comp rx=0 tx=0).stop 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.162+0000 7fb0008af700 1 -- 192.168.123.100:0/2902287643 shutdown_connections 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.162+0000 7fb0008af700 1 --2- 192.168.123.100:0/2902287643 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faffc105b50 0x7faffc105f60 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.162+0000 7fb0008af700 1 -- 192.168.123.100:0/2902287643 >> 192.168.123.100:0/2902287643 conn(0x7faffc1011a0 msgr2=0x7faffc1035d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.162+0000 7fb0008af700 1 -- 192.168.123.100:0/2902287643 shutdown_connections 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.162+0000 7fb0008af700 1 -- 192.168.123.100:0/2902287643 wait complete. 
2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.162+0000 7fb0008af700 1 Processor -- start 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.163+0000 7fb0008af700 1 -- start start 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.163+0000 7fb0008af700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faffc105b50 0x7faffc1975e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.163+0000 7faffad9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faffc105b50 0x7faffc1975e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.163+0000 7faffad9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faffc105b50 0x7faffc1975e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60250/0 (socket says 192.168.123.100:60250) 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.163+0000 7faffad9d700 1 -- 192.168.123.100:0/553189079 learned_addr learned my addr 192.168.123.100:0/553189079 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.163+0000 7fb0008af700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fafe4003e60 con 0x7faffc105b50 2026-03-10T12:32:34.407 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.163+0000 7faffad9d700 1 -- 192.168.123.100:0/553189079 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fafe4009740 con 0x7faffc105b50 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.164+0000 7faffad9d700 1 --2- 192.168.123.100:0/553189079 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faffc105b50 0x7faffc1975e0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7fafe4012040 tx=0x7fafe40179f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.164+0000 7faff3fff700 1 -- 192.168.123.100:0/553189079 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fafe4003a60 con 0x7faffc105b50 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.164+0000 7faff3fff700 1 -- 192.168.123.100:0/553189079 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fafe4004640 con 0x7faffc105b50 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.164+0000 7faff3fff700 1 -- 192.168.123.100:0/553189079 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fafe401f650 con 0x7faffc105b50 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.164+0000 7fb0008af700 1 -- 192.168.123.100:0/553189079 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faffc197b20 con 0x7faffc105b50 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.164+0000 7fb0008af700 1 -- 
192.168.123.100:0/553189079 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faffc197f40 con 0x7faffc105b50 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.165+0000 7faff3fff700 1 -- 192.168.123.100:0/553189079 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7fafe401e040 con 0x7faffc105b50 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.165+0000 7faff3fff700 1 --2- 192.168.123.100:0/553189079 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fafe8038340 0x7fafe803a7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.165+0000 7faff3fff700 1 -- 192.168.123.100:0/553189079 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fafe404cc30 con 0x7faffc105b50 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.165+0000 7faffa59c700 1 --2- 192.168.123.100:0/553189079 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fafe8038340 0x7fafe803a7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.166+0000 7faffa59c700 1 --2- 192.168.123.100:0/553189079 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fafe8038340 0x7fafe803a7f0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fafec006fd0 tx=0x7fafec006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 
2026-03-10T12:32:34.166+0000 7fb0008af700 1 -- 192.168.123.100:0/553189079 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faffc1914e0 con 0x7faffc105b50 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.169+0000 7faff3fff700 1 -- 192.168.123.100:0/553189079 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fafe401c070 con 0x7faffc105b50 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.291+0000 7fb0008af700 1 -- 192.168.123.100:0/553189079 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}) v1 -- 0x7faffc198260 con 0x7fafe8038340 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.370+0000 7faff3fff700 1 -- 192.168.123.100:0/553189079 <== mgr.14120 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7faffc198260 con 0x7fafe8038340 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.373+0000 7fb0008af700 1 -- 192.168.123.100:0/553189079 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fafe8038340 msgr2=0x7fafe803a7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.373+0000 7fb0008af700 1 --2- 192.168.123.100:0/553189079 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fafe8038340 0x7fafe803a7f0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fafec006fd0 tx=0x7fafec006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 
2026-03-10T12:32:34.373+0000 7fb0008af700 1 -- 192.168.123.100:0/553189079 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faffc105b50 msgr2=0x7faffc1975e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.373+0000 7fb0008af700 1 --2- 192.168.123.100:0/553189079 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faffc105b50 0x7faffc1975e0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7fafe4012040 tx=0x7fafe40179f0 comp rx=0 tx=0).stop 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.373+0000 7fb0008af700 1 -- 192.168.123.100:0/553189079 shutdown_connections 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.373+0000 7fb0008af700 1 --2- 192.168.123.100:0/553189079 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fafe8038340 0x7fafe803a7f0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.373+0000 7fb0008af700 1 --2- 192.168.123.100:0/553189079 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faffc105b50 0x7faffc1975e0 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.373+0000 7fb0008af700 1 -- 192.168.123.100:0/553189079 >> 192.168.123.100:0/553189079 conn(0x7faffc1011a0 msgr2=0x7faffc102a10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.373+0000 7fb0008af700 1 -- 192.168.123.100:0/553189079 shutdown_connections 2026-03-10T12:32:34.407 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: 
stderr 2026-03-10T12:32:34.373+0000 7fb0008af700 1 -- 192.168.123.100:0/553189079 wait complete. 2026-03-10T12:32:34.528 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:34 vm00 ceph-mon[50686]: [10/Mar/2026:12:32:32] ENGINE Bus STARTING 2026-03-10T12:32:34.528 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:34 vm00 ceph-mon[50686]: [10/Mar/2026:12:32:32] ENGINE Serving on http://192.168.123.100:8765 2026-03-10T12:32:34.528 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:34 vm00 ceph-mon[50686]: [10/Mar/2026:12:32:32] ENGINE Serving on https://192.168.123.100:7150 2026-03-10T12:32:34.528 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:34 vm00 ceph-mon[50686]: [10/Mar/2026:12:32:32] ENGINE Bus STARTED 2026-03-10T12:32:34.528 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:34 vm00 ceph-mon[50686]: from='client.14124 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch 2026-03-10T12:32:34.528 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:34 vm00 ceph-mon[50686]: from='client.14124 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch 2026-03-10T12:32:34.528 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:34 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:34.528 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:34 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCzZRQ37HtENKJEtU72puwFDv4yHsrS240AznofJBuhBEq2ma9FuKuvTUoszP+iPl2yAbg7pC+V8pZZgRBCE0y9VZ/f8xjXUFjHMeOlsftuDrI22fwtp7NeqcTjTvn+QbSCr2cs7fdQqakVHkQ60C5u60GW4vDCNYNhQv3Bb+h6XOGQGC3vt5Pu5OoVE6yalSWhMNKiAn/OWdrQdF92zM9EKRzTFt1MJwuspyqHZeBoDjmIqchQ7ZWchq562VuK9kAQo7s2ReydXWzWoI8sdoH41vHweJSx0MSHoRrmENAh9XCaMwzlJjl9SrjE3zq6HD4QxR3xSV0T+BobUuiMfQ+aQxLVK8YshnhGpa9hIeeiNizce6p/Q664pxGmt+m1qMyrtXp46uXe6X8SXM5NHIwWiV8T6FFxgUCQiVycYfRjU4iXQrj5bE+2jSy0aIrRNbZ2hQXMVHfKnoASUouJ9Jm7wGK6V4wbv9biuxHwFS/A3Gaua/oV5Y15aVM5u3cf5PE= ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.548+0000 7fdf2c1c5700 1 Processor -- start 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.549+0000 7fdf2c1c5700 1 -- start start 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.549+0000 7fdf2c1c5700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdf2407ad50 0x7fdf24079250 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.549+0000 7fdf2c1c5700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdf24079790 con 0x7fdf2407ad50 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.549+0000 7fdf29f61700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdf2407ad50 0x7fdf24079250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.549+0000 7fdf29f61700 1 --2- >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdf2407ad50 0x7fdf24079250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60256/0 (socket says 192.168.123.100:60256) 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.549+0000 7fdf29f61700 1 -- 192.168.123.100:0/4182693524 learned_addr learned my addr 192.168.123.100:0/4182693524 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.549+0000 7fdf29f61700 1 -- 192.168.123.100:0/4182693524 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdf240798d0 con 0x7fdf2407ad50 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.549+0000 7fdf29f61700 1 --2- 192.168.123.100:0/4182693524 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdf2407ad50 0x7fdf24079250 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7fdf14009cf0 tx=0x7fdf1400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=ee16d3ee010fd327 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.550+0000 7fdf28f5f700 1 -- 192.168.123.100:0/4182693524 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdf14004030 con 0x7fdf2407ad50 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.550+0000 7fdf28f5f700 1 -- 192.168.123.100:0/4182693524 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fdf1400b810 con 0x7fdf2407ad50 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.550+0000 
7fdf2c1c5700 1 -- 192.168.123.100:0/4182693524 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdf2407ad50 msgr2=0x7fdf24079250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.550+0000 7fdf2c1c5700 1 --2- 192.168.123.100:0/4182693524 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdf2407ad50 0x7fdf24079250 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7fdf14009cf0 tx=0x7fdf1400b0e0 comp rx=0 tx=0).stop 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.550+0000 7fdf2c1c5700 1 -- 192.168.123.100:0/4182693524 shutdown_connections 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.550+0000 7fdf2c1c5700 1 --2- 192.168.123.100:0/4182693524 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdf2407ad50 0x7fdf24079250 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.550+0000 7fdf2c1c5700 1 -- 192.168.123.100:0/4182693524 >> 192.168.123.100:0/4182693524 conn(0x7fdf241013a0 msgr2=0x7fdf241037b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.550+0000 7fdf2c1c5700 1 -- 192.168.123.100:0/4182693524 shutdown_connections 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.551+0000 7fdf2c1c5700 1 -- 192.168.123.100:0/4182693524 wait complete. 
2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.551+0000 7fdf2c1c5700 1 Processor -- start 2026-03-10T12:32:34.718 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.551+0000 7fdf2c1c5700 1 -- start start 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.551+0000 7fdf2c1c5700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdf2407ad50 0x7fdf241a0420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.551+0000 7fdf2c1c5700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdf24079790 con 0x7fdf2407ad50 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.551+0000 7fdf29f61700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdf2407ad50 0x7fdf241a0420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.552+0000 7fdf29f61700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdf2407ad50 0x7fdf241a0420 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60264/0 (socket says 192.168.123.100:60264) 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.552+0000 7fdf29f61700 1 -- 192.168.123.100:0/1193703898 learned_addr learned my addr 192.168.123.100:0/1193703898 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:34.719 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.552+0000 7fdf29f61700 1 -- 192.168.123.100:0/1193703898 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdf14009740 con 0x7fdf2407ad50 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.552+0000 7fdf29f61700 1 --2- 192.168.123.100:0/1193703898 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdf2407ad50 0x7fdf241a0420 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fdf14006e90 tx=0x7fdf14003fe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.552+0000 7fdf1affd700 1 -- 192.168.123.100:0/1193703898 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdf1400bed0 con 0x7fdf2407ad50 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.552+0000 7fdf1affd700 1 -- 192.168.123.100:0/1193703898 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fdf14003710 con 0x7fdf2407ad50 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.552+0000 7fdf1affd700 1 -- 192.168.123.100:0/1193703898 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdf1401adb0 con 0x7fdf2407ad50 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.552+0000 7fdf2c1c5700 1 -- 192.168.123.100:0/1193703898 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdf241a0960 con 0x7fdf2407ad50 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.552+0000 7fdf2c1c5700 1 
-- 192.168.123.100:0/1193703898 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdf241a0e00 con 0x7fdf2407ad50 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.553+0000 7fdf1affd700 1 -- 192.168.123.100:0/1193703898 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 7) v1 ==== 44973+0+0 (secure 0 0 0) 0x7fdf14004230 con 0x7fdf2407ad50 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.553+0000 7fdf2c1c5700 1 -- 192.168.123.100:0/1193703898 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdf2419a870 con 0x7fdf2407ad50 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.553+0000 7fdf1affd700 1 --2- 192.168.123.100:0/1193703898 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fdf10038380 0x7fdf1003a830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.553+0000 7fdf1affd700 1 -- 192.168.123.100:0/1193703898 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fdf1404b950 con 0x7fdf2407ad50 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.556+0000 7fdf1affd700 1 -- 192.168.123.100:0/1193703898 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fdf1401a430 con 0x7fdf2407ad50 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.556+0000 7fdf29760700 1 --2- 192.168.123.100:0/1193703898 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fdf10038380 
0x7fdf1003a830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.556+0000 7fdf29760700 1 --2- 192.168.123.100:0/1193703898 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fdf10038380 0x7fdf1003a830 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fdf20006fd0 tx=0x7fdf20006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.659+0000 7fdf2c1c5700 1 -- 192.168.123.100:0/1193703898 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}) v1 -- 0x7fdf2402d050 con 0x7fdf10038380 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.660+0000 7fdf1affd700 1 -- 192.168.123.100:0/1193703898 <== mgr.14120 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+595 (secure 0 0 0) 0x7fdf2402d050 con 0x7fdf10038380 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.662+0000 7fdf2c1c5700 1 -- 192.168.123.100:0/1193703898 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fdf10038380 msgr2=0x7fdf1003a830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.662+0000 7fdf2c1c5700 1 --2- 192.168.123.100:0/1193703898 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fdf10038380 0x7fdf1003a830 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fdf20006fd0 tx=0x7fdf20006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:34.719 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.662+0000 7fdf2c1c5700 1 -- 192.168.123.100:0/1193703898 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdf2407ad50 msgr2=0x7fdf241a0420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.663+0000 7fdf2c1c5700 1 --2- 192.168.123.100:0/1193703898 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdf2407ad50 0x7fdf241a0420 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fdf14006e90 tx=0x7fdf14003fe0 comp rx=0 tx=0).stop 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.663+0000 7fdf2c1c5700 1 -- 192.168.123.100:0/1193703898 shutdown_connections 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.663+0000 7fdf2c1c5700 1 --2- 192.168.123.100:0/1193703898 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fdf10038380 0x7fdf1003a830 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.663+0000 7fdf2c1c5700 1 --2- 192.168.123.100:0/1193703898 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdf2407ad50 0x7fdf241a0420 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.663+0000 7fdf2c1c5700 1 -- 192.168.123.100:0/1193703898 >> 192.168.123.100:0/1193703898 conn(0x7fdf241013a0 msgr2=0x7fdf24102000 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.663+0000 7fdf2c1c5700 1 -- 192.168.123.100:0/1193703898 shutdown_connections 
2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.663+0000 7fdf2c1c5700 1 -- 192.168.123.100:0/1193703898 wait complete. 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:Wrote public SSH key to /home/ubuntu/cephtest/ceph.pub 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:Adding key to root@localhost authorized_keys... 2026-03-10T12:32:34.719 INFO:teuthology.orchestra.run.vm00.stdout:Adding host vm00... 2026-03-10T12:32:35.374 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:35 vm00 ceph-mon[50686]: from='client.14132 -' entity='client.admin' cmd=[{"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:35.375 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:35 vm00 ceph-mon[50686]: from='client.14134 -' entity='client.admin' cmd=[{"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:35.375 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:35 vm00 ceph-mon[50686]: from='client.14136 -' entity='client.admin' cmd=[{"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:35.375 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:35 vm00 ceph-mon[50686]: Generating ssh key... 
2026-03-10T12:32:35.375 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:35 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:35.375 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:35 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:35.375 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:35 vm00 ceph-mon[50686]: mgrmap e8: vm00.nescmq(active, since 2s) 2026-03-10T12:32:36.509 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:36 vm00 ceph-mon[50686]: from='client.14138 -' entity='client.admin' cmd=[{"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:36.509 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:36 vm00 ceph-mon[50686]: from='client.14140 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm00", "addr": "192.168.123.100", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout Added host 'vm00' with addr '192.168.123.100' 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.859+0000 7f3e72864700 1 Processor -- start 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.859+0000 7f3e72864700 1 -- start start 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.860+0000 7f3e72864700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e6c07ad10 0x7f3e6c079210 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.860+0000 7f3e72864700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3e6c079750 con 0x7f3e6c07ad10 2026-03-10T12:32:36.971 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.860+0000 7f3e6bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e6c07ad10 0x7f3e6c079210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.860+0000 7f3e6bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e6c07ad10 0x7f3e6c079210 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60272/0 (socket says 192.168.123.100:60272) 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.860+0000 7f3e6bfff700 1 -- 192.168.123.100:0/2870939456 learned_addr learned my addr 192.168.123.100:0/2870939456 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.860+0000 7f3e6bfff700 1 -- 192.168.123.100:0/2870939456 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3e6c079890 con 0x7f3e6c07ad10 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.861+0000 7f3e6bfff700 1 --2- 192.168.123.100:0/2870939456 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e6c07ad10 0x7f3e6c079210 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f3e54009a90 tx=0x7f3e54009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=86c68eb1906c6999 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.861+0000 7f3e6affd700 1 -- 192.168.123.100:0/2870939456 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map 
magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3e54004030 con 0x7f3e6c07ad10 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.861+0000 7f3e6affd700 1 -- 192.168.123.100:0/2870939456 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f3e5400b7e0 con 0x7f3e6c07ad10 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.861+0000 7f3e6affd700 1 -- 192.168.123.100:0/2870939456 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3e54003a40 con 0x7f3e6c07ad10 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.861+0000 7f3e72864700 1 -- 192.168.123.100:0/2870939456 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e6c07ad10 msgr2=0x7f3e6c079210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.861+0000 7f3e72864700 1 --2- 192.168.123.100:0/2870939456 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e6c07ad10 0x7f3e6c079210 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f3e54009a90 tx=0x7f3e54009da0 comp rx=0 tx=0).stop 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.862+0000 7f3e72864700 1 -- 192.168.123.100:0/2870939456 shutdown_connections 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.862+0000 7f3e72864700 1 --2- 192.168.123.100:0/2870939456 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e6c07ad10 0x7f3e6c079210 secure :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f3e54009a90 tx=0x7f3e54009da0 comp rx=0 tx=0).stop 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 
2026-03-10T12:32:34.862+0000 7f3e72864700 1 -- 192.168.123.100:0/2870939456 >> 192.168.123.100:0/2870939456 conn(0x7f3e6c1013a0 msgr2=0x7f3e6c1037b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.862+0000 7f3e72864700 1 -- 192.168.123.100:0/2870939456 shutdown_connections 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.862+0000 7f3e72864700 1 -- 192.168.123.100:0/2870939456 wait complete. 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.863+0000 7f3e72864700 1 Processor -- start 2026-03-10T12:32:36.971 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.863+0000 7f3e72864700 1 -- start start 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.863+0000 7f3e72864700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e6c19be10 0x7f3e6c19c220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.863+0000 7f3e72864700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3e6c19c760 con 0x7f3e6c19be10 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.863+0000 7f3e6bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e6c19be10 0x7f3e6c19c220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.864+0000 7f3e6bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e6c19be10 
0x7f3e6c19c220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60276/0 (socket says 192.168.123.100:60276) 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.864+0000 7f3e6bfff700 1 -- 192.168.123.100:0/781847614 learned_addr learned my addr 192.168.123.100:0/781847614 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.864+0000 7f3e6bfff700 1 -- 192.168.123.100:0/781847614 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3e54009740 con 0x7f3e6c19be10 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.864+0000 7f3e6bfff700 1 --2- 192.168.123.100:0/781847614 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e6c19be10 0x7f3e6c19c220 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f3e54003710 tx=0x7f3e54003b00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.864+0000 7f3e697fa700 1 -- 192.168.123.100:0/781847614 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3e54003fc0 con 0x7f3e6c19be10 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.864+0000 7f3e697fa700 1 -- 192.168.123.100:0/781847614 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f3e54024460 con 0x7f3e6c19be10 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.864+0000 7f3e72864700 1 -- 192.168.123.100:0/781847614 --> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3e6c19c960 con 0x7f3e6c19be10 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.864+0000 7f3e697fa700 1 -- 192.168.123.100:0/781847614 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3e5401b440 con 0x7f3e6c19be10 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.865+0000 7f3e72864700 1 -- 192.168.123.100:0/781847614 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3e6c19f5c0 con 0x7f3e6c19be10 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.866+0000 7f3e697fa700 1 -- 192.168.123.100:0/781847614 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f3e5401b5a0 con 0x7f3e6c19be10 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.866+0000 7f3e697fa700 1 --2- 192.168.123.100:0/781847614 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3e58040cd0 0x7f3e58043180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.866+0000 7f3e697fa700 1 -- 192.168.123.100:0/781847614 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f3e5404d170 con 0x7f3e6c19be10 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.866+0000 7f3e6b7fe700 1 --2- 192.168.123.100:0/781847614 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3e58040cd0 0x7f3e58043180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.866+0000 7f3e72864700 1 -- 192.168.123.100:0/781847614 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3e6c062380 con 0x7f3e6c19be10 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.868+0000 7f3e6b7fe700 1 --2- 192.168.123.100:0/781847614 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3e58040cd0 0x7f3e58043180 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f3e5c006fd0 tx=0x7f3e5c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.869+0000 7f3e697fa700 1 -- 192.168.123.100:0/781847614 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3e5401f030 con 0x7f3e6c19be10 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:34.977+0000 7f3e72864700 1 -- 192.168.123.100:0/781847614 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm00", "addr": "192.168.123.100", "target": ["mon-mgr", ""]}) v1 -- 0x7f3e6c1027a0 con 0x7f3e58040cd0 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:36.830+0000 7f3e697fa700 1 -- 192.168.123.100:0/781847614 <== mgr.14120 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7f3e6c1027a0 con 0x7f3e58040cd0 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:36.834+0000 7f3e72864700 1 -- 192.168.123.100:0/781847614 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3e58040cd0 msgr2=0x7f3e58043180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:36.834+0000 7f3e72864700 1 --2- 192.168.123.100:0/781847614 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3e58040cd0 0x7f3e58043180 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f3e5c006fd0 tx=0x7f3e5c006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:36.834+0000 7f3e72864700 1 -- 192.168.123.100:0/781847614 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e6c19be10 msgr2=0x7f3e6c19c220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:36.834+0000 7f3e72864700 1 --2- 192.168.123.100:0/781847614 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e6c19be10 0x7f3e6c19c220 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f3e54003710 tx=0x7f3e54003b00 comp rx=0 tx=0).stop 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:36.834+0000 7f3e72864700 1 -- 192.168.123.100:0/781847614 shutdown_connections 2026-03-10T12:32:36.972 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:36.834+0000 7f3e72864700 1 --2- 192.168.123.100:0/781847614 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3e58040cd0 0x7f3e58043180 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:36.973 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:36.834+0000 7f3e72864700 1 --2- 192.168.123.100:0/781847614 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e6c19be10 0x7f3e6c19c220 unknown :-1 s=CLOSED 
pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:36.973 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:36.834+0000 7f3e72864700 1 -- 192.168.123.100:0/781847614 >> 192.168.123.100:0/781847614 conn(0x7f3e6c1013a0 msgr2=0x7f3e6c102090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:36.973 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:36.834+0000 7f3e72864700 1 -- 192.168.123.100:0/781847614 shutdown_connections 2026-03-10T12:32:36.973 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:36.834+0000 7f3e72864700 1 -- 192.168.123.100:0/781847614 wait complete. 2026-03-10T12:32:36.973 INFO:teuthology.orchestra.run.vm00.stdout:Deploying mon service with default placement... 2026-03-10T12:32:37.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:37 vm00 ceph-mon[50686]: Deploying cephadm binary to vm00 2026-03-10T12:32:37.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:37 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:37.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:37 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout Scheduled mon update... 
2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.148+0000 7fce8961d700 1 Processor -- start 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.149+0000 7fce8961d700 1 -- start start 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.149+0000 7fce8961d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fce84104a50 0x7fce84104e60 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.149+0000 7fce83fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fce84104a50 0x7fce84104e60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.150+0000 7fce83fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fce84104a50 0x7fce84104e60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60278/0 (socket says 192.168.123.100:60278) 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.150+0000 7fce83fff700 1 -- 192.168.123.100:0/3691475859 learned_addr learned my addr 192.168.123.100:0/3691475859 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.150+0000 7fce8961d700 1 -- 192.168.123.100:0/3691475859 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fce841053a0 con 0x7fce84104a50 2026-03-10T12:32:37.886 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.150+0000 7fce83fff700 1 -- 192.168.123.100:0/3691475859 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fce841054e0 con 0x7fce84104a50 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.151+0000 7fce83fff700 1 --2- 192.168.123.100:0/3691475859 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fce84104a50 0x7fce84104e60 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fce74009cf0 tx=0x7fce7400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=7f7b08764ee6e0c3 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.151+0000 7fce82ffd700 1 -- 192.168.123.100:0/3691475859 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fce74004030 con 0x7fce84104a50 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.151+0000 7fce82ffd700 1 -- 192.168.123.100:0/3691475859 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fce7400b810 con 0x7fce84104a50 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.152+0000 7fce8961d700 1 -- 192.168.123.100:0/3691475859 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fce84104a50 msgr2=0x7fce84104e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.152+0000 7fce8961d700 1 --2- 192.168.123.100:0/3691475859 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fce84104a50 0x7fce84104e60 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fce74009cf0 tx=0x7fce7400b0e0 comp rx=0 tx=0).stop 
2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.152+0000 7fce8961d700 1 -- 192.168.123.100:0/3691475859 shutdown_connections 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.152+0000 7fce8961d700 1 --2- 192.168.123.100:0/3691475859 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fce84104a50 0x7fce84104e60 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.152+0000 7fce8961d700 1 -- 192.168.123.100:0/3691475859 >> 192.168.123.100:0/3691475859 conn(0x7fce841000c0 msgr2=0x7fce841024d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.153+0000 7fce8961d700 1 -- 192.168.123.100:0/3691475859 shutdown_connections 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.153+0000 7fce8961d700 1 -- 192.168.123.100:0/3691475859 wait complete. 
2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.153+0000 7fce8961d700 1 Processor -- start 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.153+0000 7fce8961d700 1 -- start start 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.153+0000 7fce8961d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fce84104a50 0x7fce841975c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.153+0000 7fce8961d700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fce74014070 con 0x7fce84104a50 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.154+0000 7fce83fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fce84104a50 0x7fce841975c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.154+0000 7fce83fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fce84104a50 0x7fce841975c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60292/0 (socket says 192.168.123.100:60292) 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.154+0000 7fce83fff700 1 -- 192.168.123.100:0/2751463675 learned_addr learned my addr 192.168.123.100:0/2751463675 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:37.886 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.154+0000 7fce83fff700 1 -- 192.168.123.100:0/2751463675 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fce74009740 con 0x7fce84104a50 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.154+0000 7fce83fff700 1 --2- 192.168.123.100:0/2751463675 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fce84104a50 0x7fce841975c0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fce74004320 tx=0x7fce74004400 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.155+0000 7fce817fa700 1 -- 192.168.123.100:0/2751463675 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fce74010470 con 0x7fce84104a50 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.155+0000 7fce817fa700 1 -- 192.168.123.100:0/2751463675 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fce74010a70 con 0x7fce84104a50 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.155+0000 7fce817fa700 1 -- 192.168.123.100:0/2751463675 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fce74019980 con 0x7fce84104a50 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.155+0000 7fce8961d700 1 -- 192.168.123.100:0/2751463675 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fce84197b00 con 0x7fce84104a50 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.155+0000 7fce8961d700 1 
-- 192.168.123.100:0/2751463675 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fce84198000 con 0x7fce84104a50 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.156+0000 7fce8961d700 1 -- 192.168.123.100:0/2751463675 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fce84191380 con 0x7fce84104a50 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.157+0000 7fce817fa700 1 -- 192.168.123.100:0/2751463675 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fce74010be0 con 0x7fce84104a50 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.157+0000 7fce817fa700 1 --2- 192.168.123.100:0/2751463675 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fce6c038460 0x7fce6c03a910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.157+0000 7fce817fa700 1 -- 192.168.123.100:0/2751463675 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fce7404bf40 con 0x7fce84104a50 2026-03-10T12:32:37.886 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.157+0000 7fce837fe700 1 --2- 192.168.123.100:0/2751463675 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fce6c038460 0x7fce6c03a910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:37.887 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.158+0000 7fce837fe700 1 --2- 192.168.123.100:0/2751463675 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fce6c038460 0x7fce6c03a910 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7fce78006fd0 tx=0x7fce78006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:37.887 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.161+0000 7fce817fa700 1 -- 192.168.123.100:0/2751463675 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fce74019bb0 con 0x7fce84104a50 2026-03-10T12:32:37.887 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.305+0000 7fce8961d700 1 -- 192.168.123.100:0/2751463675 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}) v1 -- 0x7fce840008d0 con 0x7fce6c038460 2026-03-10T12:32:37.887 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.310+0000 7fce817fa700 1 -- 192.168.123.100:0/2751463675 <== mgr.14120 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7fce840008d0 con 0x7fce6c038460 2026-03-10T12:32:37.887 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.316+0000 7fce8961d700 1 -- 192.168.123.100:0/2751463675 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fce6c038460 msgr2=0x7fce6c03a910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:37.887 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.316+0000 7fce8961d700 1 --2- 192.168.123.100:0/2751463675 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fce6c038460 0x7fce6c03a910 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7fce78006fd0 tx=0x7fce78006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:37.887 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.316+0000 7fce8961d700 1 -- 192.168.123.100:0/2751463675 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fce84104a50 msgr2=0x7fce841975c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:37.887 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.316+0000 7fce8961d700 1 --2- 192.168.123.100:0/2751463675 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fce84104a50 0x7fce841975c0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fce74004320 tx=0x7fce74004400 comp rx=0 tx=0).stop 2026-03-10T12:32:37.887 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.317+0000 7fce8961d700 1 -- 192.168.123.100:0/2751463675 shutdown_connections 2026-03-10T12:32:37.887 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.317+0000 7fce8961d700 1 --2- 192.168.123.100:0/2751463675 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fce6c038460 0x7fce6c03a910 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:37.887 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.317+0000 7fce8961d700 1 --2- 192.168.123.100:0/2751463675 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fce84104a50 0x7fce841975c0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:37.887 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.317+0000 7fce8961d700 1 -- 192.168.123.100:0/2751463675 >> 192.168.123.100:0/2751463675 conn(0x7fce841000c0 msgr2=0x7fce8418e4c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:37.887 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.317+0000 7fce8961d700 1 -- 192.168.123.100:0/2751463675 shutdown_connections 
2026-03-10T12:32:37.887 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:37.317+0000 7fce8961d700 1 -- 192.168.123.100:0/2751463675 wait complete. 2026-03-10T12:32:37.887 INFO:teuthology.orchestra.run.vm00.stdout:Deploying mgr service with default placement... 2026-03-10T12:32:38.245 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout Scheduled mgr update... 2026-03-10T12:32:38.245 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.040+0000 7fc65d49f700 1 Processor -- start 2026-03-10T12:32:38.245 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.040+0000 7fc65d49f700 1 -- start start 2026-03-10T12:32:38.245 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.041+0000 7fc65d49f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc658071200 0x7fc658071610 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.041+0000 7fc65d49f700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6580728b0 con 0x7fc658071200 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.041+0000 7fc656ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc658071200 0x7fc658071610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.041+0000 7fc656ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc658071200 0x7fc658071610 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 
says I am v2:192.168.123.100:33134/0 (socket says 192.168.123.100:33134) 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.041+0000 7fc656ffd700 1 -- 192.168.123.100:0/2781721962 learned_addr learned my addr 192.168.123.100:0/2781721962 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.042+0000 7fc656ffd700 1 -- 192.168.123.100:0/2781721962 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc6580729f0 con 0x7fc658071200 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.042+0000 7fc656ffd700 1 --2- 192.168.123.100:0/2781721962 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc658071200 0x7fc658071610 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fc648009a90 tx=0x7fc648009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=147805a655962c47 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.042+0000 7fc655ffb700 1 -- 192.168.123.100:0/2781721962 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc648004030 con 0x7fc658071200 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.042+0000 7fc655ffb700 1 -- 192.168.123.100:0/2781721962 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc64800b7e0 con 0x7fc658071200 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.043+0000 7fc65d49f700 1 -- 192.168.123.100:0/2781721962 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc658071200 msgr2=0x7fc658071610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.043+0000 7fc65d49f700 1 --2- 192.168.123.100:0/2781721962 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc658071200 0x7fc658071610 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fc648009a90 tx=0x7fc648009da0 comp rx=0 tx=0).stop 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.043+0000 7fc65d49f700 1 -- 192.168.123.100:0/2781721962 shutdown_connections 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.043+0000 7fc65d49f700 1 --2- 192.168.123.100:0/2781721962 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc658071200 0x7fc658071610 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.043+0000 7fc65d49f700 1 -- 192.168.123.100:0/2781721962 >> 192.168.123.100:0/2781721962 conn(0x7fc65806cc30 msgr2=0x7fc65806f060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.043+0000 7fc65d49f700 1 -- 192.168.123.100:0/2781721962 shutdown_connections 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.043+0000 7fc65d49f700 1 -- 192.168.123.100:0/2781721962 wait complete. 
2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.043+0000 7fc65d49f700 1 Processor -- start 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.043+0000 7fc65d49f700 1 -- start start 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.044+0000 7fc65d49f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc65811b4a0 0x7fc65811b8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.044+0000 7fc65d49f700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc6580728b0 con 0x7fc65811b4a0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.044+0000 7fc656ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc65811b4a0 0x7fc65811b8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.044+0000 7fc656ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc65811b4a0 0x7fc65811b8b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33150/0 (socket says 192.168.123.100:33150) 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.044+0000 7fc656ffd700 1 -- 192.168.123.100:0/3610695834 learned_addr learned my addr 192.168.123.100:0/3610695834 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:38.248 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.044+0000 7fc656ffd700 1 -- 192.168.123.100:0/3610695834 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc648009740 con 0x7fc65811b4a0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.044+0000 7fc656ffd700 1 --2- 192.168.123.100:0/3610695834 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc65811b4a0 0x7fc65811b8b0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fc64800be70 tx=0x7fc64800bf50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.044+0000 7fc63ffff700 1 -- 192.168.123.100:0/3610695834 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc648003eb0 con 0x7fc65811b4a0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.045+0000 7fc65d49f700 1 -- 192.168.123.100:0/3610695834 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc65811bdf0 con 0x7fc65811b4a0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.045+0000 7fc65d49f700 1 -- 192.168.123.100:0/3610695834 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc6581a3910 con 0x7fc65811b4a0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.045+0000 7fc63ffff700 1 -- 192.168.123.100:0/3610695834 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc6480044f0 con 0x7fc65811b4a0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.045+0000 7fc63ffff700 1 
-- 192.168.123.100:0/3610695834 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc64801ad40 con 0x7fc65811b4a0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.046+0000 7fc63ffff700 1 -- 192.168.123.100:0/3610695834 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fc64802c430 con 0x7fc65811b4a0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.046+0000 7fc63ffff700 1 --2- 192.168.123.100:0/3610695834 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc640038490 0x7fc64003a940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.047+0000 7fc65d49f700 1 -- 192.168.123.100:0/3610695834 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc644005320 con 0x7fc65811b4a0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.047+0000 7fc6567fc700 1 --2- 192.168.123.100:0/3610695834 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc640038490 0x7fc64003a940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.047+0000 7fc6567fc700 1 --2- 192.168.123.100:0/3610695834 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc640038490 0x7fc64003a940 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7fc64c006fd0 tx=0x7fc64c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:38.248 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.047+0000 7fc63ffff700 1 -- 192.168.123.100:0/3610695834 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fc64804c790 con 0x7fc65811b4a0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.050+0000 7fc63ffff700 1 -- 192.168.123.100:0/3610695834 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc648018940 con 0x7fc65811b4a0 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.178+0000 7fc65d49f700 1 -- 192.168.123.100:0/3610695834 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}) v1 -- 0x7fc644000bf0 con 0x7fc640038490 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.185+0000 7fc63ffff700 1 -- 192.168.123.100:0/3610695834 <== mgr.14120 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7fc644000bf0 con 0x7fc640038490 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.188+0000 7fc65d49f700 1 -- 192.168.123.100:0/3610695834 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc640038490 msgr2=0x7fc64003a940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.188+0000 7fc65d49f700 1 --2- 192.168.123.100:0/3610695834 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc640038490 0x7fc64003a940 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7fc64c006fd0 tx=0x7fc64c006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:38.248 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.188+0000 7fc65d49f700 1 -- 192.168.123.100:0/3610695834 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc65811b4a0 msgr2=0x7fc65811b8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:38.248 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.188+0000 7fc65d49f700 1 --2- 192.168.123.100:0/3610695834 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc65811b4a0 0x7fc65811b8b0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fc64800be70 tx=0x7fc64800bf50 comp rx=0 tx=0).stop 2026-03-10T12:32:38.249 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.188+0000 7fc65d49f700 1 -- 192.168.123.100:0/3610695834 shutdown_connections 2026-03-10T12:32:38.249 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.188+0000 7fc65d49f700 1 --2- 192.168.123.100:0/3610695834 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc640038490 0x7fc64003a940 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:38.249 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.188+0000 7fc65d49f700 1 --2- 192.168.123.100:0/3610695834 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc65811b4a0 0x7fc65811b8b0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:38.249 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.188+0000 7fc65d49f700 1 -- 192.168.123.100:0/3610695834 >> 192.168.123.100:0/3610695834 conn(0x7fc65806cc30 msgr2=0x7fc6581125b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:38.249 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.191+0000 7fc65d49f700 1 -- 192.168.123.100:0/3610695834 shutdown_connections 
2026-03-10T12:32:38.249 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.191+0000 7fc65d49f700 1 -- 192.168.123.100:0/3610695834 wait complete. 2026-03-10T12:32:38.249 INFO:teuthology.orchestra.run.vm00.stdout:Deploying crash service with default placement... 2026-03-10T12:32:38.428 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:38 vm00 ceph-mon[50686]: Added host vm00 2026-03-10T12:32:38.429 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:38 vm00 ceph-mon[50686]: from='client.14142 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:38.429 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:38 vm00 ceph-mon[50686]: Saving service mon spec with placement count:5 2026-03-10T12:32:38.429 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:38 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:38.429 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:38 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:38.429 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:38 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:38.429 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:38 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout Scheduled crash update... 
2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.410+0000 7fcf2d930700 1 Processor -- start 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.410+0000 7fcf2d930700 1 -- start start 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.410+0000 7fcf2d930700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf28072ac0 0x7fcf28070fc0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.410+0000 7fcf2d930700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcf28071500 con 0x7fcf28072ac0 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.410+0000 7fcf2c92e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf28072ac0 0x7fcf28070fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.410+0000 7fcf2c92e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf28072ac0 0x7fcf28070fc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33162/0 (socket says 192.168.123.100:33162) 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.410+0000 7fcf2c92e700 1 -- 192.168.123.100:0/178614468 learned_addr learned my addr 192.168.123.100:0/178614468 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:38.590 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.411+0000 7fcf2c92e700 1 -- 192.168.123.100:0/178614468 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcf28071640 con 0x7fcf28072ac0 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.412+0000 7fcf2c92e700 1 --2- 192.168.123.100:0/178614468 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf28072ac0 0x7fcf28070fc0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7fcf180098d0 tx=0x7fcf18009be0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=848d7c7d379c46a4 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.413+0000 7fcf277fe700 1 -- 192.168.123.100:0/178614468 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fcf18004030 con 0x7fcf28072ac0 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.413+0000 7fcf277fe700 1 -- 192.168.123.100:0/178614468 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcf1800b7e0 con 0x7fcf28072ac0 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.413+0000 7fcf2d930700 1 -- 192.168.123.100:0/178614468 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf28072ac0 msgr2=0x7fcf28070fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.413+0000 7fcf2d930700 1 --2- 192.168.123.100:0/178614468 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf28072ac0 0x7fcf28070fc0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7fcf180098d0 tx=0x7fcf18009be0 comp rx=0 tx=0).stop 
2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.413+0000 7fcf2d930700 1 -- 192.168.123.100:0/178614468 shutdown_connections 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.413+0000 7fcf2d930700 1 --2- 192.168.123.100:0/178614468 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf28072ac0 0x7fcf28070fc0 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.413+0000 7fcf2d930700 1 -- 192.168.123.100:0/178614468 >> 192.168.123.100:0/178614468 conn(0x7fcf2806c9d0 msgr2=0x7fcf2806ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.414+0000 7fcf2d930700 1 -- 192.168.123.100:0/178614468 shutdown_connections 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.414+0000 7fcf2d930700 1 -- 192.168.123.100:0/178614468 wait complete. 
2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.414+0000 7fcf2d930700 1 Processor -- start 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.414+0000 7fcf2d930700 1 -- start start 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.414+0000 7fcf2d930700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf28086430 0x7fcf28089a10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.414+0000 7fcf2d930700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcf18018ac0 con 0x7fcf28086430 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.415+0000 7fcf2c92e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf28086430 0x7fcf28089a10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.415+0000 7fcf2c92e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf28086430 0x7fcf28089a10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33172/0 (socket says 192.168.123.100:33172) 2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.415+0000 7fcf2c92e700 1 -- 192.168.123.100:0/1855013228 learned_addr learned my addr 192.168.123.100:0/1855013228 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:38.590 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.415+0000 7fcf2c92e700 1 -- 192.168.123.100:0/1855013228 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcf18009580 con 0x7fcf28086430
2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.422+0000 7fcf2c92e700 1 --2- 192.168.123.100:0/1855013228 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf28086430 0x7fcf28089a10 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fcf1800ba10 tx=0x7fcf1800baf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.423+0000 7fcf25ffb700 1 -- 192.168.123.100:0/1855013228 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fcf18018920 con 0x7fcf28086430
2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.423+0000 7fcf2d930700 1 -- 192.168.123.100:0/1855013228 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcf28086840 con 0x7fcf28086430
2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.423+0000 7fcf2d930700 1 -- 192.168.123.100:0/1855013228 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcf28086d40 con 0x7fcf28086430
2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.423+0000 7fcf25ffb700 1 -- 192.168.123.100:0/1855013228 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcf180036a0 con 0x7fcf28086430
2026-03-10T12:32:38.590 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.424+0000 7fcf25ffb700 1 -- 192.168.123.100:0/1855013228 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fcf180177d0 con 0x7fcf28086430
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.425+0000 7fcf2d930700 1 -- 192.168.123.100:0/1855013228 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcf14005320 con 0x7fcf28086430
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.426+0000 7fcf25ffb700 1 -- 192.168.123.100:0/1855013228 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fcf18003cf0 con 0x7fcf28086430
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.426+0000 7fcf25ffb700 1 --2- 192.168.123.100:0/1855013228 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcf100384b0 0x7fcf1003a960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.426+0000 7fcf25ffb700 1 -- 192.168.123.100:0/1855013228 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fcf1802f080 con 0x7fcf28086430
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.428+0000 7fcf27fff700 1 --2- 192.168.123.100:0/1855013228 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcf100384b0 0x7fcf1003a960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.428+0000 7fcf27fff700 1 --2- 192.168.123.100:0/1855013228 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcf100384b0 0x7fcf1003a960 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7fcf2000ad30 tx=0x7fcf200093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.431+0000 7fcf25ffb700 1 -- 192.168.123.100:0/1855013228 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fcf18020d80 con 0x7fcf28086430
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.548+0000 7fcf2d930700 1 -- 192.168.123.100:0/1855013228 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}) v1 -- 0x7fcf14000bf0 con 0x7fcf100384b0
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.553+0000 7fcf25ffb700 1 -- 192.168.123.100:0/1855013228 <== mgr.14120 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+26 (secure 0 0 0) 0x7fcf14000bf0 con 0x7fcf100384b0
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.557+0000 7fcf2d930700 1 -- 192.168.123.100:0/1855013228 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcf100384b0 msgr2=0x7fcf1003a960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.557+0000 7fcf2d930700 1 --2- 192.168.123.100:0/1855013228 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcf100384b0 0x7fcf1003a960 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7fcf2000ad30 tx=0x7fcf200093f0 comp rx=0 tx=0).stop
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.557+0000 7fcf2d930700 1 -- 192.168.123.100:0/1855013228 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf28086430 msgr2=0x7fcf28089a10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.557+0000 7fcf2d930700 1 --2- 192.168.123.100:0/1855013228 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf28086430 0x7fcf28089a10 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fcf1800ba10 tx=0x7fcf1800baf0 comp rx=0 tx=0).stop
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.557+0000 7fcf2d930700 1 -- 192.168.123.100:0/1855013228 shutdown_connections
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.557+0000 7fcf2d930700 1 --2- 192.168.123.100:0/1855013228 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcf100384b0 0x7fcf1003a960 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.557+0000 7fcf2d930700 1 --2- 192.168.123.100:0/1855013228 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf28086430 0x7fcf28089a10 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.557+0000 7fcf2d930700 1 -- 192.168.123.100:0/1855013228 >> 192.168.123.100:0/1855013228 conn(0x7fcf2806c9d0 msgr2=0x7fcf2806d440 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.557+0000 7fcf2d930700 1 -- 192.168.123.100:0/1855013228 shutdown_connections
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.557+0000 7fcf2d930700 1 -- 192.168.123.100:0/1855013228 wait complete.
2026-03-10T12:32:38.591 INFO:teuthology.orchestra.run.vm00.stdout:Deploying ceph-exporter service with default placement...
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout Scheduled ceph-exporter update...
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.743+0000 7f908ebcb700 1 Processor -- start
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.743+0000 7f908ebcb700 1 -- start start
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.743+0000 7f908ebcb700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9088072ac0 0x7f9088070fc0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.743+0000 7f908ebcb700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9088071500 con 0x7f9088072ac0
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.744+0000 7f908dbc9700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9088072ac0 0x7f9088070fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.744+0000 7f908dbc9700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9088072ac0 0x7f9088070fc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33174/0 (socket says 192.168.123.100:33174)
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.744+0000 7f908dbc9700 1 -- 192.168.123.100:0/1823714880 learned_addr learned my addr 192.168.123.100:0/1823714880 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.744+0000 7f908dbc9700 1 -- 192.168.123.100:0/1823714880 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9088071640 con 0x7f9088072ac0
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.745+0000 7f908dbc9700 1 --2- 192.168.123.100:0/1823714880 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9088072ac0 0x7f9088070fc0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f908400d0d0 tx=0x7f908400d3e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=841d60f6b91e7dd0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.745+0000 7f908cbc7700 1 -- 192.168.123.100:0/1823714880 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9084010070 con 0x7f9088072ac0
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.745+0000 7f908cbc7700 1 -- 192.168.123.100:0/1823714880 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9084004030 con 0x7f9088072ac0
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.745+0000 7f908cbc7700 1 -- 192.168.123.100:0/1823714880 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9084003d10 con 0x7f9088072ac0
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.745+0000 7f908ebcb700 1 -- 192.168.123.100:0/1823714880 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9088072ac0 msgr2=0x7f9088070fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.745+0000 7f908ebcb700 1 --2- 192.168.123.100:0/1823714880 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9088072ac0 0x7f9088070fc0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f908400d0d0 tx=0x7f908400d3e0 comp rx=0 tx=0).stop
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.745+0000 7f908ebcb700 1 -- 192.168.123.100:0/1823714880 shutdown_connections
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.745+0000 7f908ebcb700 1 --2- 192.168.123.100:0/1823714880 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9088072ac0 0x7f9088070fc0 secure :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f908400d0d0 tx=0x7f908400d3e0 comp rx=0 tx=0).stop
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.745+0000 7f908ebcb700 1 -- 192.168.123.100:0/1823714880 >> 192.168.123.100:0/1823714880 conn(0x7f908806c9d0 msgr2=0x7f908806ee00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.745+0000 7f908ebcb700 1 -- 192.168.123.100:0/1823714880 shutdown_connections
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.745+0000 7f908ebcb700 1 -- 192.168.123.100:0/1823714880 wait complete.
2026-03-10T12:32:38.937 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.746+0000 7f908ebcb700 1 Processor -- start
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.746+0000 7f908ebcb700 1 -- start start
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.746+0000 7f908ebcb700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f90880800e0 0x7f90880804f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.746+0000 7f908ebcb700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f90880836c0 con 0x7f90880800e0
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.746+0000 7f908dbc9700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f90880800e0 0x7f90880804f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.746+0000 7f908dbc9700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f90880800e0 0x7f90880804f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33188/0 (socket says 192.168.123.100:33188)
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.746+0000 7f908dbc9700 1 -- 192.168.123.100:0/2062968201 learned_addr learned my addr 192.168.123.100:0/2062968201 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.747+0000 7f908dbc9700 1 -- 192.168.123.100:0/2062968201 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f90840088c0 con 0x7f90880800e0
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.747+0000 7f908dbc9700 1 --2- 192.168.123.100:0/2062968201 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f90880800e0 0x7f90880804f0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f908400d0a0 tx=0x7f908400de10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.749+0000 7f907effd700 1 -- 192.168.123.100:0/2062968201 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9084010050 con 0x7f90880800e0
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.749+0000 7f908ebcb700 1 -- 192.168.123.100:0/2062968201 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f90880838c0 con 0x7f90880800e0
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.749+0000 7f908ebcb700 1 -- 192.168.123.100:0/2062968201 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9088080c80 con 0x7f90880800e0
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.751+0000 7f907effd700 1 -- 192.168.123.100:0/2062968201 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f90840043b0 con 0x7f90880800e0
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.751+0000 7f907effd700 1 -- 192.168.123.100:0/2062968201 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f90840165e0 con 0x7f90880800e0
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.751+0000 7f907effd700 1 -- 192.168.123.100:0/2062968201 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f9084016800 con 0x7f90880800e0
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.751+0000 7f907effd700 1 --2- 192.168.123.100:0/2062968201 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9074038510 0x7f907403a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.752+0000 7f908d3c8700 1 --2- 192.168.123.100:0/2062968201 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9074038510 0x7f907403a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.752+0000 7f908ebcb700 1 -- 192.168.123.100:0/2062968201 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9088062380 con 0x7f90880800e0
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.756+0000 7f908d3c8700 1 --2- 192.168.123.100:0/2062968201 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9074038510 0x7f907403a9c0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f908000ad30 tx=0x7f90800093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.756+0000 7f907effd700 1 -- 192.168.123.100:0/2062968201 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f908404ccd0 con 0x7f90880800e0
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.756+0000 7f907effd700 1 -- 192.168.123.100:0/2062968201 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f908401b030 con 0x7f90880800e0
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.888+0000 7f908ebcb700 1 -- 192.168.123.100:0/2062968201 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7f908806dde0 con 0x7f9074038510
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.894+0000 7f907effd700 1 -- 192.168.123.100:0/2062968201 <== mgr.14120 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7f908806dde0 con 0x7f9074038510
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.898+0000 7f907cff9700 1 -- 192.168.123.100:0/2062968201 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9074038510 msgr2=0x7f907403a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.898+0000 7f907cff9700 1 --2- 192.168.123.100:0/2062968201 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9074038510 0x7f907403a9c0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f908000ad30 tx=0x7f90800093f0 comp rx=0 tx=0).stop
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.899+0000 7f907cff9700 1 -- 192.168.123.100:0/2062968201 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f90880800e0 msgr2=0x7f90880804f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.899+0000 7f907cff9700 1 --2- 192.168.123.100:0/2062968201 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f90880800e0 0x7f90880804f0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f908400d0a0 tx=0x7f908400de10 comp rx=0 tx=0).stop
2026-03-10T12:32:38.938 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.899+0000 7f907cff9700 1 -- 192.168.123.100:0/2062968201 shutdown_connections
2026-03-10T12:32:38.939 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.899+0000 7f907cff9700 1 --2- 192.168.123.100:0/2062968201 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9074038510 0x7f907403a9c0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:32:38.939 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.899+0000 7f907cff9700 1 --2- 192.168.123.100:0/2062968201 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f90880800e0 0x7f90880804f0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:32:38.939 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.899+0000 7f907cff9700 1 -- 192.168.123.100:0/2062968201 >> 192.168.123.100:0/2062968201 conn(0x7f908806c9d0 msgr2=0x7f908806d6d0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:32:38.939 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.900+0000 7f907cff9700 1 -- 192.168.123.100:0/2062968201 shutdown_connections
2026-03-10T12:32:38.939 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:38.900+0000 7f907cff9700 1 -- 192.168.123.100:0/2062968201 wait complete.
2026-03-10T12:32:38.939 INFO:teuthology.orchestra.run.vm00.stdout:Deploying prometheus service with default placement...
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout Scheduled prometheus update...
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.094+0000 7f1873437700 1 Processor -- start
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.094+0000 7f1873437700 1 -- start start
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.094+0000 7f1873437700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f186c072b50 0x7f186c071050 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.094+0000 7f1873437700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f186c071590 con 0x7f186c072b50
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.095+0000 7f18711d3700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f186c072b50 0x7f186c071050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.095+0000 7f18711d3700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f186c072b50 0x7f186c071050 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33192/0 (socket says 192.168.123.100:33192)
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.095+0000 7f18711d3700 1 -- 192.168.123.100:0/1256528696 learned_addr learned my addr 192.168.123.100:0/1256528696 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.095+0000 7f18711d3700 1 -- 192.168.123.100:0/1256528696 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f186c0716d0 con 0x7f186c072b50
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.095+0000 7f18711d3700 1 --2- 192.168.123.100:0/1256528696 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f186c072b50 0x7f186c071050 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f185c009a90 tx=0x7f185c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=ea8465160a033d0d server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.096+0000 7f1863fff700 1 -- 192.168.123.100:0/1256528696 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f185c004030 con 0x7f186c072b50
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.096+0000 7f1863fff700 1 -- 192.168.123.100:0/1256528696 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f185c00b7e0 con 0x7f186c072b50
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.096+0000 7f1863fff700 1 -- 192.168.123.100:0/1256528696 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f185c003b30 con 0x7f186c072b50
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.096+0000 7f1873437700 1 -- 192.168.123.100:0/1256528696 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f186c072b50 msgr2=0x7f186c071050 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.096+0000 7f1873437700 1 --2- 192.168.123.100:0/1256528696 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f186c072b50 0x7f186c071050 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f185c009a90 tx=0x7f185c009da0 comp rx=0 tx=0).stop
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.096+0000 7f1873437700 1 -- 192.168.123.100:0/1256528696 shutdown_connections
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.096+0000 7f1873437700 1 --2- 192.168.123.100:0/1256528696 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f186c072b50 0x7f186c071050 secure :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f185c009a90 tx=0x7f185c009da0 comp rx=0 tx=0).stop
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.096+0000 7f1873437700 1 -- 192.168.123.100:0/1256528696 >> 192.168.123.100:0/1256528696 conn(0x7f186c06c970 msgr2=0x7f186c06eda0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.097+0000 7f1873437700 1 -- 192.168.123.100:0/1256528696 shutdown_connections
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.097+0000 7f1873437700 1 -- 192.168.123.100:0/1256528696 wait complete.
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.097+0000 7f1873437700 1 Processor -- start
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.097+0000 7f1873437700 1 -- start start
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.097+0000 7f1873437700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f186c1a88e0 0x7f186c1a8cf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:39.266 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.097+0000 7f1873437700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f186c1a9230 con 0x7f186c1a88e0
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.097+0000 7f18711d3700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f186c1a88e0 0x7f186c1a8cf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.098+0000 7f18711d3700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f186c1a88e0 0x7f186c1a8cf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33196/0 (socket says 192.168.123.100:33196)
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.098+0000 7f18711d3700 1 -- 192.168.123.100:0/3553555073 learned_addr learned my addr 192.168.123.100:0/3553555073 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.098+0000 7f18711d3700 1 -- 192.168.123.100:0/3553555073 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f185c009740 con 0x7f186c1a88e0
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.098+0000 7f18711d3700 1 --2- 192.168.123.100:0/3553555073 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f186c1a88e0 0x7f186c1a8cf0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f185c009a90 tx=0x7f185c01a6b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.099+0000 7f18627fc700 1 -- 192.168.123.100:0/3553555073 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f185c01a8d0 con 0x7f186c1a88e0
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.099+0000 7f18627fc700 1 -- 192.168.123.100:0/3553555073 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f185c01f070 con 0x7f186c1a88e0
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.099+0000 7f18627fc700 1 -- 192.168.123.100:0/3553555073 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f185c01bcc0 con 0x7f186c1a88e0
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.099+0000 7f1873437700 1 -- 192.168.123.100:0/3553555073 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f186c1a9430 con 0x7f186c1a88e0
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.099+0000 7f1873437700 1 -- 192.168.123.100:0/3553555073 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f186c1ac090 con 0x7f186c1a88e0
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.100+0000 7f18627fc700 1 -- 192.168.123.100:0/3553555073 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f185c021030 con 0x7f186c1a88e0
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.100+0000 7f18627fc700 1 --2- 192.168.123.100:0/3553555073 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1858038160 0x7f185803a610 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.101+0000 7f18627fc700 1 -- 192.168.123.100:0/3553555073 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f185c04bbe0 con 0x7f186c1a88e0
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.101+0000 7f18709d2700 1 --2- 192.168.123.100:0/3553555073 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1858038160 0x7f185803a610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.102+0000 7f18709d2700 1 --2- 192.168.123.100:0/3553555073 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1858038160 0x7f185803a610 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f186400ad30 tx=0x7f18640093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.102+0000 7f1873437700 1 -- 192.168.123.100:0/3553555073 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f186c062380 con 0x7f186c1a88e0
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.106+0000 7f18627fc700 1 -- 192.168.123.100:0/3553555073 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f185c022560 con 0x7f186c1a88e0
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.217+0000 7f1873437700 1 -- 192.168.123.100:0/3553555073 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}) v1 -- 0x7f186c1ac340 con 0x7f1858038160
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.225+0000 7f18627fc700 1 -- 192.168.123.100:0/3553555073 <== mgr.14120 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+31 (secure 0 0 0) 0x7f186c1ac340 con 0x7f1858038160
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.228+0000 7f1857fff700 1 -- 192.168.123.100:0/3553555073 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1858038160 msgr2=0x7f185803a610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.228+0000 7f1857fff700 1 --2- 192.168.123.100:0/3553555073 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1858038160 0x7f185803a610 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f186400ad30 tx=0x7f18640093f0 comp rx=0 tx=0).stop
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.228+0000 7f1857fff700 1 -- 192.168.123.100:0/3553555073 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f186c1a88e0 msgr2=0x7f186c1a8cf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.228+0000 7f1857fff700 1 --2- 192.168.123.100:0/3553555073 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f186c1a88e0 0x7f186c1a8cf0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f185c009a90 tx=0x7f185c01a6b0 comp rx=0 tx=0).stop
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.228+0000 7f1857fff700 1 -- 192.168.123.100:0/3553555073 shutdown_connections
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.228+0000 7f1857fff700 1 --2- 192.168.123.100:0/3553555073 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1858038160 0x7f185803a610 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.228+0000 7f1857fff700 1 --2- 192.168.123.100:0/3553555073 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f186c1a88e0 0x7f186c1a8cf0 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.228+0000 7f1857fff700 1 -- 192.168.123.100:0/3553555073 >> 192.168.123.100:0/3553555073 conn(0x7f186c06c970 msgr2=0x7f186c06df70 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.229+0000 7f1857fff700 1 -- 192.168.123.100:0/3553555073 shutdown_connections
2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.229+0000 7f1857fff700 1 -- 192.168.123.100:0/3553555073 wait complete. 2026-03-10T12:32:39.267 INFO:teuthology.orchestra.run.vm00.stdout:Deploying grafana service with default placement... 2026-03-10T12:32:39.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:39 vm00 ceph-mon[50686]: from='client.14144 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:39.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:39 vm00 ceph-mon[50686]: Saving service mgr spec with placement count:2 2026-03-10T12:32:39.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:39 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:39.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:39 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:39.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:39 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:39.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:39 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:39.610 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout Scheduled grafana update... 
2026-03-10T12:32:39.610 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.430+0000 7f93fbfff700 1 Processor -- start 2026-03-10T12:32:39.610 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.431+0000 7f93fbfff700 1 -- start start 2026-03-10T12:32:39.610 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.431+0000 7f93fbfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93fc071410 0x7f93fc071820 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:39.610 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.431+0000 7f93fbfff700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f93fc071d60 con 0x7f93fc071410 2026-03-10T12:32:39.610 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.431+0000 7f93faffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93fc071410 0x7f93fc071820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:39.610 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.431+0000 7f93faffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93fc071410 0x7f93fc071820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33202/0 (socket says 192.168.123.100:33202) 2026-03-10T12:32:39.610 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.431+0000 7f93faffd700 1 -- 192.168.123.100:0/3816054249 learned_addr learned my addr 192.168.123.100:0/3816054249 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:39.610 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.431+0000 7f93faffd700 1 -- 192.168.123.100:0/3816054249 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f93fc071ea0 con 0x7f93fc071410 2026-03-10T12:32:39.610 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.432+0000 7f93faffd700 1 --2- 192.168.123.100:0/3816054249 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93fc071410 0x7f93fc071820 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f93ec00d180 tx=0x7f93ec00d490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=37e5a78a2d1f0a2c server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:39.610 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.432+0000 7f93f9ffb700 1 -- 192.168.123.100:0/3816054249 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f93ec010070 con 0x7f93fc071410 2026-03-10T12:32:39.610 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.432+0000 7f93f9ffb700 1 -- 192.168.123.100:0/3816054249 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f93ec004510 con 0x7f93fc071410 2026-03-10T12:32:39.610 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.432+0000 7f93fbfff700 1 -- 192.168.123.100:0/3816054249 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93fc071410 msgr2=0x7f93fc071820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:39.610 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.432+0000 7f93fbfff700 1 --2- 192.168.123.100:0/3816054249 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93fc071410 0x7f93fc071820 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f93ec00d180 tx=0x7f93ec00d490 comp rx=0 tx=0).stop 
2026-03-10T12:32:39.611 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.433+0000 7f93fbfff700 1 -- 192.168.123.100:0/3816054249 shutdown_connections 2026-03-10T12:32:39.611 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.433+0000 7f93fbfff700 1 --2- 192.168.123.100:0/3816054249 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93fc071410 0x7f93fc071820 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:39.611 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.433+0000 7f93fbfff700 1 -- 192.168.123.100:0/3816054249 >> 192.168.123.100:0/3816054249 conn(0x7f93fc06c9d0 msgr2=0x7f93fc06ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:39.611 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.433+0000 7f93fbfff700 1 -- 192.168.123.100:0/3816054249 shutdown_connections 2026-03-10T12:32:39.611 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.433+0000 7f93fbfff700 1 -- 192.168.123.100:0/3816054249 wait complete. 
2026-03-10T12:32:39.611 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.433+0000 7f93fbfff700 1 Processor -- start 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.433+0000 7f93fbfff700 1 -- start start 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.433+0000 7f93fbfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93fc071410 0x7f93fc1a19a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.433+0000 7f93fbfff700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f93ec003c20 con 0x7f93fc071410 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.434+0000 7f93faffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93fc071410 0x7f93fc1a19a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.434+0000 7f93faffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93fc071410 0x7f93fc1a19a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33208/0 (socket says 192.168.123.100:33208) 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.434+0000 7f93faffd700 1 -- 192.168.123.100:0/3904468687 learned_addr learned my addr 192.168.123.100:0/3904468687 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:39.612 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.434+0000 7f93faffd700 1 -- 192.168.123.100:0/3904468687 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f93ec0087c0 con 0x7f93fc071410 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.434+0000 7f93faffd700 1 --2- 192.168.123.100:0/3904468687 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93fc071410 0x7f93fc1a19a0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f93ec008c40 tx=0x7f93ec008d20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.434+0000 7f93e3fff700 1 -- 192.168.123.100:0/3904468687 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f93ec010050 con 0x7f93fc071410 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.434+0000 7f93fbfff700 1 -- 192.168.123.100:0/3904468687 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f93fc1a1ee0 con 0x7f93fc071410 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.434+0000 7f93fbfff700 1 -- 192.168.123.100:0/3904468687 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f93fc1a2380 con 0x7f93fc071410 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.436+0000 7f93e3fff700 1 -- 192.168.123.100:0/3904468687 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f93ec00b150 con 0x7f93fc071410 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.436+0000 7f93e3fff700 1 
-- 192.168.123.100:0/3904468687 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f93ec0164e0 con 0x7f93fc071410 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.436+0000 7f93e3fff700 1 -- 192.168.123.100:0/3904468687 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f93ec016700 con 0x7f93fc071410 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.436+0000 7f93e1ffb700 1 -- 192.168.123.100:0/3904468687 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f93fc04f030 con 0x7f93fc071410 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.439+0000 7f93e3fff700 1 --2- 192.168.123.100:0/3904468687 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f93e40384e0 0x7f93e403a990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.439+0000 7f93fa7fc700 1 --2- 192.168.123.100:0/3904468687 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f93e40384e0 0x7f93e403a990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.440+0000 7f93fa7fc700 1 --2- 192.168.123.100:0/3904468687 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f93e40384e0 0x7f93e403a990 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f93f400ad30 tx=0x7f93f40093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:39.612 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.440+0000 7f93e3fff700 1 -- 192.168.123.100:0/3904468687 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f93ec04cbe0 con 0x7f93fc071410 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.440+0000 7f93e3fff700 1 -- 192.168.123.100:0/3904468687 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f93ec0521d0 con 0x7f93fc071410 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.562+0000 7f93e1ffb700 1 -- 192.168.123.100:0/3904468687 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}) v1 -- 0x7f93fc062380 con 0x7f93e40384e0 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.573+0000 7f93e3fff700 1 -- 192.168.123.100:0/3904468687 <== mgr.14120 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+28 (secure 0 0 0) 0x7f93fc062380 con 0x7f93e40384e0 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.576+0000 7f93e1ffb700 1 -- 192.168.123.100:0/3904468687 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f93e40384e0 msgr2=0x7f93e403a990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.576+0000 7f93e1ffb700 1 --2- 192.168.123.100:0/3904468687 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f93e40384e0 0x7f93e403a990 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f93f400ad30 tx=0x7f93f40093f0 comp rx=0 tx=0).stop 2026-03-10T12:32:39.612 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.576+0000 7f93e1ffb700 1 -- 192.168.123.100:0/3904468687 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93fc071410 msgr2=0x7f93fc1a19a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.576+0000 7f93e1ffb700 1 --2- 192.168.123.100:0/3904468687 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93fc071410 0x7f93fc1a19a0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f93ec008c40 tx=0x7f93ec008d20 comp rx=0 tx=0).stop 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.576+0000 7f93e1ffb700 1 -- 192.168.123.100:0/3904468687 shutdown_connections 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.576+0000 7f93e1ffb700 1 --2- 192.168.123.100:0/3904468687 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f93e40384e0 0x7f93e403a990 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.576+0000 7f93e1ffb700 1 --2- 192.168.123.100:0/3904468687 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93fc071410 0x7f93fc1a19a0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.576+0000 7f93e1ffb700 1 -- 192.168.123.100:0/3904468687 >> 192.168.123.100:0/3904468687 conn(0x7f93fc06c9d0 msgr2=0x7f93fc06e0a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.577+0000 7f93e1ffb700 1 -- 192.168.123.100:0/3904468687 shutdown_connections 
2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.577+0000 7f93e1ffb700 1 -- 192.168.123.100:0/3904468687 wait complete. 2026-03-10T12:32:39.612 INFO:teuthology.orchestra.run.vm00.stdout:Deploying node-exporter service with default placement... 2026-03-10T12:32:39.984 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout Scheduled node-exporter update... 2026-03-10T12:32:39.984 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.761+0000 7f55a472b700 1 Processor -- start 2026-03-10T12:32:39.984 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.761+0000 7f55a472b700 1 -- start start 2026-03-10T12:32:39.984 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.762+0000 7f55a472b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f559c107f20 0x7f559c108330 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:39.984 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.762+0000 7f55a472b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f559c108870 con 0x7f559c107f20 2026-03-10T12:32:39.984 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.762+0000 7f55a24c7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f559c107f20 0x7f559c108330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.762+0000 7f55a24c7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f559c107f20 0x7f559c108330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33212/0 (socket says 192.168.123.100:33212) 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.762+0000 7f55a24c7700 1 -- 192.168.123.100:0/2358292180 learned_addr learned my addr 192.168.123.100:0/2358292180 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.762+0000 7f55a24c7700 1 -- 192.168.123.100:0/2358292180 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f559c1089b0 con 0x7f559c107f20 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.762+0000 7f55a24c7700 1 --2- 192.168.123.100:0/2358292180 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f559c107f20 0x7f559c108330 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f5594009cf0 tx=0x7f559400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=1fb05c8ae8564356 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.763+0000 7f55a14c5700 1 -- 192.168.123.100:0/2358292180 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5594004030 con 0x7f559c107f20 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.763+0000 7f55a14c5700 1 -- 192.168.123.100:0/2358292180 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f559400b810 con 0x7f559c107f20 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.764+0000 7f55a14c5700 1 -- 192.168.123.100:0/2358292180 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5594003b10 con 0x7f559c107f20 2026-03-10T12:32:39.985 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.764+0000 7f55a472b700 1 -- 192.168.123.100:0/2358292180 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f559c107f20 msgr2=0x7f559c108330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.764+0000 7f55a472b700 1 --2- 192.168.123.100:0/2358292180 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f559c107f20 0x7f559c108330 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f5594009cf0 tx=0x7f559400b0e0 comp rx=0 tx=0).stop 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.764+0000 7f55a472b700 1 -- 192.168.123.100:0/2358292180 shutdown_connections 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.764+0000 7f55a472b700 1 --2- 192.168.123.100:0/2358292180 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f559c107f20 0x7f559c108330 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.764+0000 7f55a472b700 1 -- 192.168.123.100:0/2358292180 >> 192.168.123.100:0/2358292180 conn(0x7f559c07b4b0 msgr2=0x7f559c07b8d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.766+0000 7f55a472b700 1 -- 192.168.123.100:0/2358292180 shutdown_connections 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.766+0000 7f55a472b700 1 -- 192.168.123.100:0/2358292180 wait complete. 
2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.766+0000 7f55a472b700 1 Processor -- start 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.767+0000 7f55a472b700 1 -- start start 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.767+0000 7f55a472b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f559c107f20 0x7f559c1a0080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.767+0000 7f55a472b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f559c1a05c0 con 0x7f559c107f20 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.767+0000 7f55a24c7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f559c107f20 0x7f559c1a0080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.767+0000 7f55a24c7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f559c107f20 0x7f559c1a0080 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33220/0 (socket says 192.168.123.100:33220) 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.767+0000 7f55a24c7700 1 -- 192.168.123.100:0/4220439658 learned_addr learned my addr 192.168.123.100:0/4220439658 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:39.985 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.767+0000 7f55a24c7700 1 -- 192.168.123.100:0/4220439658 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5594009740 con 0x7f559c107f20 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.767+0000 7f55a24c7700 1 --2- 192.168.123.100:0/4220439658 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f559c107f20 0x7f559c1a0080 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f5594000c00 tx=0x7f5594011700 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.768+0000 7f559b7fe700 1 -- 192.168.123.100:0/4220439658 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5594011b60 con 0x7f559c107f20 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.768+0000 7f55a472b700 1 -- 192.168.123.100:0/4220439658 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f559c1a07c0 con 0x7f559c107f20 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.768+0000 7f55a472b700 1 -- 192.168.123.100:0/4220439658 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f559c1a0c60 con 0x7f559c107f20 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.768+0000 7f559b7fe700 1 -- 192.168.123.100:0/4220439658 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5594011cc0 con 0x7f559c107f20 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.768+0000 7f559b7fe700 1 
-- 192.168.123.100:0/4220439658 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f559401a5d0 con 0x7f559c107f20 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.769+0000 7f55a472b700 1 -- 192.168.123.100:0/4220439658 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5588005320 con 0x7f559c107f20 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.769+0000 7f559b7fe700 1 -- 192.168.123.100:0/4220439658 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f559401b440 con 0x7f559c107f20 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.771+0000 7f559b7fe700 1 --2- 192.168.123.100:0/4220439658 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5584038420 0x7f558403a8d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.771+0000 7f559b7fe700 1 -- 192.168.123.100:0/4220439658 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f559404d890 con 0x7f559c107f20 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.773+0000 7f55a1cc6700 1 --2- 192.168.123.100:0/4220439658 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5584038420 0x7f558403a8d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.773+0000 7f559b7fe700 1 -- 192.168.123.100:0/4220439658 <== mon.0 v2:192.168.123.100:3300/0 
6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f55940189a0 con 0x7f559c107f20 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.773+0000 7f55a1cc6700 1 --2- 192.168.123.100:0/4220439658 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5584038420 0x7f558403a8d0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f558c00ad30 tx=0x7f558c0093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.901+0000 7f55a472b700 1 -- 192.168.123.100:0/4220439658 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7f5588000bf0 con 0x7f5584038420 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.938+0000 7f559b7fe700 1 -- 192.168.123.100:0/4220439658 <== mgr.14120 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7f5588000bf0 con 0x7f5584038420 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.941+0000 7f55a472b700 1 -- 192.168.123.100:0/4220439658 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5584038420 msgr2=0x7f558403a8d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.941+0000 7f55a472b700 1 --2- 192.168.123.100:0/4220439658 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5584038420 0x7f558403a8d0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f558c00ad30 tx=0x7f558c0093f0 comp rx=0 tx=0).stop 2026-03-10T12:32:39.985 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.941+0000 7f55a472b700 1 -- 192.168.123.100:0/4220439658 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f559c107f20 msgr2=0x7f559c1a0080 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.941+0000 7f55a472b700 1 --2- 192.168.123.100:0/4220439658 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f559c107f20 0x7f559c1a0080 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f5594000c00 tx=0x7f5594011700 comp rx=0 tx=0).stop 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.941+0000 7f55a472b700 1 -- 192.168.123.100:0/4220439658 shutdown_connections 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.941+0000 7f55a472b700 1 --2- 192.168.123.100:0/4220439658 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5584038420 0x7f558403a8d0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.941+0000 7f55a472b700 1 --2- 192.168.123.100:0/4220439658 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f559c107f20 0x7f559c1a0080 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.941+0000 7f55a472b700 1 -- 192.168.123.100:0/4220439658 >> 192.168.123.100:0/4220439658 conn(0x7f559c07b4b0 msgr2=0x7f559c105680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.942+0000 7f55a472b700 1 -- 192.168.123.100:0/4220439658 shutdown_connections 
2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:39.942+0000 7f55a472b700 1 -- 192.168.123.100:0/4220439658 wait complete. 2026-03-10T12:32:39.985 INFO:teuthology.orchestra.run.vm00.stdout:Deploying alertmanager service with default placement... 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout Scheduled alertmanager update... 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.121+0000 7f6af6713700 1 Processor -- start 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.122+0000 7f6af6713700 1 -- start start 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.122+0000 7f6af6713700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af0104620 0x7f6af0106a40 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.122+0000 7f6af6713700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6af00745b0 con 0x7f6af0104620 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.122+0000 7f6aeffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af0104620 0x7f6af0106a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.122+0000 7f6aeffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af0104620 0x7f6af0106a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33230/0 (socket says 192.168.123.100:33230) 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.122+0000 7f6aeffff700 1 -- 192.168.123.100:0/2312961375 learned_addr learned my addr 192.168.123.100:0/2312961375 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.123+0000 7f6aeffff700 1 -- 192.168.123.100:0/2312961375 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6af00746f0 con 0x7f6af0104620 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.123+0000 7f6aeffff700 1 --2- 192.168.123.100:0/2312961375 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af0104620 0x7f6af0106a40 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f6ad8009cf0 tx=0x7f6ad800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=4218087be74af097 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.123+0000 7f6aeeffd700 1 -- 192.168.123.100:0/2312961375 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6ad8004030 con 0x7f6af0104620 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.124+0000 7f6aeeffd700 1 -- 192.168.123.100:0/2312961375 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6ad800b810 con 0x7f6af0104620 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.124+0000 7f6af6713700 1 -- 192.168.123.100:0/2312961375 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af0104620 msgr2=0x7f6af0106a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.124+0000 7f6af6713700 1 --2- 192.168.123.100:0/2312961375 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af0104620 0x7f6af0106a40 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f6ad8009cf0 tx=0x7f6ad800b0e0 comp rx=0 tx=0).stop 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.124+0000 7f6af6713700 1 -- 192.168.123.100:0/2312961375 shutdown_connections 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.124+0000 7f6af6713700 1 --2- 192.168.123.100:0/2312961375 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af0104620 0x7f6af0106a40 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.124+0000 7f6af6713700 1 -- 192.168.123.100:0/2312961375 >> 192.168.123.100:0/2312961375 conn(0x7f6af0100270 msgr2=0x7f6af01026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.125+0000 7f6af6713700 1 -- 192.168.123.100:0/2312961375 shutdown_connections 2026-03-10T12:32:40.307 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.125+0000 7f6af6713700 1 -- 192.168.123.100:0/2312961375 wait complete. 
2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.125+0000 7f6af6713700 1 Processor -- start 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.125+0000 7f6af6713700 1 -- start start 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.125+0000 7f6af6713700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af0104620 0x7f6af01a00b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.126+0000 7f6aeffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af0104620 0x7f6af01a00b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.126+0000 7f6aeffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af0104620 0x7f6af01a00b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33246/0 (socket says 192.168.123.100:33246) 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.126+0000 7f6aeffff700 1 -- 192.168.123.100:0/1390414583 learned_addr learned my addr 192.168.123.100:0/1390414583 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.126+0000 7f6af6713700 1 -- 192.168.123.100:0/1390414583 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6af00745b0 con 0x7f6af0104620 2026-03-10T12:32:40.308 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.126+0000 7f6aeffff700 1 -- 192.168.123.100:0/1390414583 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6ad8009740 con 0x7f6af0104620 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.126+0000 7f6aeffff700 1 --2- 192.168.123.100:0/1390414583 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af0104620 0x7f6af01a00b0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f6ad8003fe0 tx=0x7f6ad80040c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.127+0000 7f6aed7fa700 1 -- 192.168.123.100:0/1390414583 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6ad80036a0 con 0x7f6af0104620 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.127+0000 7f6aed7fa700 1 -- 192.168.123.100:0/1390414583 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6ad8004310 con 0x7f6af0104620 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.127+0000 7f6aed7fa700 1 -- 192.168.123.100:0/1390414583 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6ad801a5f0 con 0x7f6af0104620 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.127+0000 7f6af6713700 1 -- 192.168.123.100:0/1390414583 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6af01a05f0 con 0x7f6af0104620 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.127+0000 7f6af6713700 1 
-- 192.168.123.100:0/1390414583 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6af01a0a90 con 0x7f6af0104620 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.128+0000 7f6aed7fa700 1 -- 192.168.123.100:0/1390414583 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f6ad801a750 con 0x7f6af0104620 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.128+0000 7f6aed7fa700 1 --2- 192.168.123.100:0/1390414583 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6adc038460 0x7f6adc03a910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.128+0000 7f6aed7fa700 1 -- 192.168.123.100:0/1390414583 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f6ad804d270 con 0x7f6af0104620 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.128+0000 7f6aef7fe700 1 --2- 192.168.123.100:0/1390414583 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6adc038460 0x7f6adc03a910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.128+0000 7f6aef7fe700 1 --2- 192.168.123.100:0/1390414583 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6adc038460 0x7f6adc03a910 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f6ae0006fd0 tx=0x7f6ae0006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: 
stderr 2026-03-10T12:32:40.129+0000 7f6af6713700 1 -- 192.168.123.100:0/1390414583 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6af01a0d40 con 0x7f6af0104620 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.132+0000 7f6aed7fa700 1 -- 192.168.123.100:0/1390414583 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6ad801aa00 con 0x7f6af0104620 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.244+0000 7f6af6713700 1 -- 192.168.123.100:0/1390414583 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}) v1 -- 0x7f6af010a3e0 con 0x7f6adc038460 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.248+0000 7f6aed7fa700 1 -- 192.168.123.100:0/1390414583 <== mgr.14120 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+33 (secure 0 0 0) 0x7f6af010a3e0 con 0x7f6adc038460 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.251+0000 7f6af6713700 1 -- 192.168.123.100:0/1390414583 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6adc038460 msgr2=0x7f6adc03a910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.251+0000 7f6af6713700 1 --2- 192.168.123.100:0/1390414583 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6adc038460 0x7f6adc03a910 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f6ae0006fd0 tx=0x7f6ae0006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:40.308 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.251+0000 7f6af6713700 1 -- 192.168.123.100:0/1390414583 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af0104620 msgr2=0x7f6af01a00b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.251+0000 7f6af6713700 1 --2- 192.168.123.100:0/1390414583 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af0104620 0x7f6af01a00b0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f6ad8003fe0 tx=0x7f6ad80040c0 comp rx=0 tx=0).stop 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.251+0000 7f6af6713700 1 -- 192.168.123.100:0/1390414583 shutdown_connections 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.251+0000 7f6af6713700 1 --2- 192.168.123.100:0/1390414583 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6adc038460 0x7f6adc03a910 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.251+0000 7f6af6713700 1 --2- 192.168.123.100:0/1390414583 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af0104620 0x7f6af01a00b0 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.251+0000 7f6af6713700 1 -- 192.168.123.100:0/1390414583 >> 192.168.123.100:0/1390414583 conn(0x7f6af0100270 msgr2=0x7f6af01026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.251+0000 7f6af6713700 1 -- 192.168.123.100:0/1390414583 shutdown_connections 
2026-03-10T12:32:40.308 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.251+0000 7f6af6713700 1 -- 192.168.123.100:0/1390414583 wait complete. 2026-03-10T12:32:40.355 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:40 vm00 ceph-mon[50686]: from='client.14146 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:40.355 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:40 vm00 ceph-mon[50686]: Saving service crash spec with placement * 2026-03-10T12:32:40.355 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:40 vm00 ceph-mon[50686]: from='client.14148 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:40.355 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:40 vm00 ceph-mon[50686]: Saving service ceph-exporter spec with placement * 2026-03-10T12:32:40.355 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:40 vm00 ceph-mon[50686]: from='client.14150 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:40.355 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:40 vm00 ceph-mon[50686]: Saving service prometheus spec with placement count:1 2026-03-10T12:32:40.355 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:40 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:40.355 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:40 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:40.355 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:40 vm00 ceph-mon[50686]: from='mgr.14120 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:40.355 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:40 vm00 ceph-mon[50686]: from='mgr.14120 
192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:40.607 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.435+0000 7fef84a72700 1 Processor -- start 2026-03-10T12:32:40.607 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.436+0000 7fef84a72700 1 -- start start 2026-03-10T12:32:40.607 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.436+0000 7fef84a72700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fef80104c20 0x7fef80105030 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:40.607 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.436+0000 7fef84a72700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fef80105570 con 0x7fef80104c20 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.437+0000 7fef7e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fef80104c20 0x7fef80105030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.437+0000 7fef7e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fef80104c20 0x7fef80105030 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33262/0 (socket says 192.168.123.100:33262) 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.437+0000 7fef7e59c700 1 -- 192.168.123.100:0/1395009308 learned_addr learned my addr 192.168.123.100:0/1395009308 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:40.608 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.437+0000 7fef7e59c700 1 -- 192.168.123.100:0/1395009308 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fef801056b0 con 0x7fef80104c20 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.438+0000 7fef7e59c700 1 --2- 192.168.123.100:0/1395009308 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fef80104c20 0x7fef80105030 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fef6800bf90 tx=0x7fef6800d5d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=279dd5e93819d02 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.438+0000 7fef7d59a700 1 -- 192.168.123.100:0/1395009308 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fef6800dcc0 con 0x7fef80104c20 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.438+0000 7fef7d59a700 1 -- 192.168.123.100:0/1395009308 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fef6800de20 con 0x7fef80104c20 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.438+0000 7fef7d59a700 1 -- 192.168.123.100:0/1395009308 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fef68010470 con 0x7fef80104c20 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.439+0000 7fef84a72700 1 -- 192.168.123.100:0/1395009308 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fef80104c20 msgr2=0x7fef80105030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 
2026-03-10T12:32:40.439+0000 7fef84a72700 1 --2- 192.168.123.100:0/1395009308 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fef80104c20 0x7fef80105030 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fef6800bf90 tx=0x7fef6800d5d0 comp rx=0 tx=0).stop 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.439+0000 7fef84a72700 1 -- 192.168.123.100:0/1395009308 shutdown_connections 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.439+0000 7fef84a72700 1 --2- 192.168.123.100:0/1395009308 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fef80104c20 0x7fef80105030 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.439+0000 7fef84a72700 1 -- 192.168.123.100:0/1395009308 >> 192.168.123.100:0/1395009308 conn(0x7fef80100270 msgr2=0x7fef801026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.439+0000 7fef84a72700 1 -- 192.168.123.100:0/1395009308 shutdown_connections 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.439+0000 7fef84a72700 1 -- 192.168.123.100:0/1395009308 wait complete. 
2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.440+0000 7fef84a72700 1 Processor -- start 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.440+0000 7fef84a72700 1 -- start start 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.440+0000 7fef84a72700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fef80104c20 0x7fef801977f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.440+0000 7fef84a72700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fef80197d30 con 0x7fef80104c20 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.440+0000 7fef7e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fef80104c20 0x7fef801977f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.441+0000 7fef7e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fef80104c20 0x7fef801977f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33268/0 (socket says 192.168.123.100:33268) 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.441+0000 7fef7e59c700 1 -- 192.168.123.100:0/1846298052 learned_addr learned my addr 192.168.123.100:0/1846298052 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:40.608 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.441+0000 7fef7e59c700 1 -- 192.168.123.100:0/1846298052 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fef6800b9e0 con 0x7fef80104c20 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.441+0000 7fef7e59c700 1 --2- 192.168.123.100:0/1846298052 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fef80104c20 0x7fef801977f0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fef6800ddf0 tx=0x7fef68004670 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.441+0000 7fef777fe700 1 -- 192.168.123.100:0/1846298052 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fef68011840 con 0x7fef80104c20 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.441+0000 7fef777fe700 1 -- 192.168.123.100:0/1846298052 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fef68011e80 con 0x7fef80104c20 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.441+0000 7fef84a72700 1 -- 192.168.123.100:0/1846298052 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fef80197f30 con 0x7fef80104c20 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.441+0000 7fef84a72700 1 -- 192.168.123.100:0/1846298052 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fef801983d0 con 0x7fef80104c20 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.442+0000 7fef777fe700 1 
-- 192.168.123.100:0/1846298052 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fef68011840 con 0x7fef80104c20 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.442+0000 7fef84a72700 1 -- 192.168.123.100:0/1846298052 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fef80191550 con 0x7fef80104c20 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.443+0000 7fef777fe700 1 -- 192.168.123.100:0/1846298052 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7fef680119a0 con 0x7fef80104c20 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.443+0000 7fef777fe700 1 --2- 192.168.123.100:0/1846298052 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fef6c03c8b0 0x7fef6c03ed60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.443+0000 7fef7dd9b700 1 --2- 192.168.123.100:0/1846298052 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fef6c03c8b0 0x7fef6c03ed60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:40.608 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.443+0000 7fef7dd9b700 1 --2- 192.168.123.100:0/1846298052 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fef6c03c8b0 0x7fef6c03ed60 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fef70006fd0 tx=0x7fef70006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:40.608 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.443+0000 7fef777fe700 1 -- 192.168.123.100:0/1846298052 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fef68011c50 con 0x7fef80104c20 2026-03-10T12:32:40.609 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.446+0000 7fef777fe700 1 -- 192.168.123.100:0/1846298052 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fef680186e0 con 0x7fef80104c20 2026-03-10T12:32:40.609 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.548+0000 7fef84a72700 1 -- 192.168.123.100:0/1846298052 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/container_init}] v 0) v1 -- 0x7fef80073960 con 0x7fef80104c20 2026-03-10T12:32:40.609 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.552+0000 7fef777fe700 1 -- 192.168.123.100:0/1846298052 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/container_init}]=0 v7) v1 ==== 142+0+0 (secure 0 0 0) 0x7fef680186e0 con 0x7fef80104c20 2026-03-10T12:32:40.609 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.558+0000 7fef84a72700 1 -- 192.168.123.100:0/1846298052 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fef6c03c8b0 msgr2=0x7fef6c03ed60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:40.609 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.558+0000 7fef84a72700 1 --2- 192.168.123.100:0/1846298052 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fef6c03c8b0 0x7fef6c03ed60 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fef70006fd0 tx=0x7fef70006e40 comp rx=0 tx=0).stop 
2026-03-10T12:32:40.609 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.558+0000 7fef84a72700 1 -- 192.168.123.100:0/1846298052 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fef80104c20 msgr2=0x7fef801977f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:40.609 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.558+0000 7fef84a72700 1 --2- 192.168.123.100:0/1846298052 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fef80104c20 0x7fef801977f0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fef6800ddf0 tx=0x7fef68004670 comp rx=0 tx=0).stop 2026-03-10T12:32:40.609 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.559+0000 7fef84a72700 1 -- 192.168.123.100:0/1846298052 shutdown_connections 2026-03-10T12:32:40.609 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.559+0000 7fef84a72700 1 --2- 192.168.123.100:0/1846298052 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fef6c03c8b0 0x7fef6c03ed60 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:40.609 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.559+0000 7fef84a72700 1 --2- 192.168.123.100:0/1846298052 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fef80104c20 0x7fef801977f0 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:40.609 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.559+0000 7fef84a72700 1 -- 192.168.123.100:0/1846298052 >> 192.168.123.100:0/1846298052 conn(0x7fef80100270 msgr2=0x7fef8018e690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:40.609 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.559+0000 7fef84a72700 1 -- 192.168.123.100:0/1846298052 
shutdown_connections 2026-03-10T12:32:40.609 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.559+0000 7fef84a72700 1 -- 192.168.123.100:0/1846298052 wait complete. 2026-03-10T12:32:40.914 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.739+0000 7f4d420e7700 1 Processor -- start 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.739+0000 7f4d420e7700 1 -- start start 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.739+0000 7f4d420e7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d3c104c20 0x7f4d3c105030 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.739+0000 7f4d420e7700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d3c105570 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.739+0000 7f4d3b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d3c104c20 0x7f4d3c105030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.739+0000 7f4d3b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d3c104c20 0x7f4d3c105030 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33280/0 (socket says 192.168.123.100:33280) 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.739+0000 7f4d3b7fe700 1 -- 
192.168.123.100:0/2309949784 learned_addr learned my addr 192.168.123.100:0/2309949784 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.740+0000 7f4d3b7fe700 1 -- 192.168.123.100:0/2309949784 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4d3c1056b0 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.740+0000 7f4d3b7fe700 1 --2- 192.168.123.100:0/2309949784 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d3c104c20 0x7f4d3c105030 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f4d3400b3e0 tx=0x7f4d3400b6f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=5d59260e9a4013b0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.740+0000 7f4d3a7fc700 1 -- 192.168.123.100:0/2309949784 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4d3400e070 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.740+0000 7f4d3a7fc700 1 -- 192.168.123.100:0/2309949784 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4d3400bd40 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.740+0000 7f4d3a7fc700 1 -- 192.168.123.100:0/2309949784 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4d34004780 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.741+0000 7f4d420e7700 1 -- 192.168.123.100:0/2309949784 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d3c104c20 
msgr2=0x7f4d3c105030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.741+0000 7f4d420e7700 1 --2- 192.168.123.100:0/2309949784 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d3c104c20 0x7f4d3c105030 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f4d3400b3e0 tx=0x7f4d3400b6f0 comp rx=0 tx=0).stop 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.741+0000 7f4d420e7700 1 -- 192.168.123.100:0/2309949784 shutdown_connections 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.741+0000 7f4d420e7700 1 --2- 192.168.123.100:0/2309949784 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d3c104c20 0x7f4d3c105030 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.741+0000 7f4d420e7700 1 -- 192.168.123.100:0/2309949784 >> 192.168.123.100:0/2309949784 conn(0x7f4d3c100270 msgr2=0x7f4d3c1026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.741+0000 7f4d420e7700 1 -- 192.168.123.100:0/2309949784 shutdown_connections 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.741+0000 7f4d420e7700 1 -- 192.168.123.100:0/2309949784 wait complete. 
2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.742+0000 7f4d420e7700 1 Processor -- start 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.742+0000 7f4d420e7700 1 -- start start 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.742+0000 7f4d420e7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d3c104c20 0x7f4d3c199a70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.742+0000 7f4d420e7700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d3c199fb0 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.743+0000 7f4d3b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d3c104c20 0x7f4d3c199a70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.743+0000 7f4d3b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d3c104c20 0x7f4d3c199a70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33286/0 (socket says 192.168.123.100:33286) 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.743+0000 7f4d3b7fe700 1 -- 192.168.123.100:0/1035106166 learned_addr learned my addr 192.168.123.100:0/1035106166 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:40.915 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.743+0000 7f4d3b7fe700 1 -- 192.168.123.100:0/1035106166 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4d34009d20 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.743+0000 7f4d3b7fe700 1 --2- 192.168.123.100:0/1035106166 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d3c104c20 0x7f4d3c199a70 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f4d34000f80 tx=0x7f4d340040c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.743+0000 7f4d38ff9700 1 -- 192.168.123.100:0/1035106166 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4d3400e040 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.743+0000 7f4d38ff9700 1 -- 192.168.123.100:0/1035106166 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4d3401ca30 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.743+0000 7f4d38ff9700 1 -- 192.168.123.100:0/1035106166 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4d34012bb0 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.743+0000 7f4d420e7700 1 -- 192.168.123.100:0/1035106166 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4d3c19a1b0 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.743+0000 7f4d420e7700 1 
-- 192.168.123.100:0/1035106166 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4d3c19a6b0 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.744+0000 7f4d38ff9700 1 -- 192.168.123.100:0/1035106166 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f4d34012d10 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.745+0000 7f4d38ff9700 1 --2- 192.168.123.100:0/1035106166 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4d28038470 0x7f4d2803a920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.745+0000 7f4d38ff9700 1 -- 192.168.123.100:0/1035106166 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f4d3404d700 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.745+0000 7f4d3affd700 1 --2- 192.168.123.100:0/1035106166 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4d28038470 0x7f4d2803a920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.745+0000 7f4d420e7700 1 -- 192.168.123.100:0/1035106166 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4d3c04f9e0 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.748+0000 7f4d38ff9700 1 -- 192.168.123.100:0/1035106166 <== mon.0 
v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4d34020070 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.748+0000 7f4d3affd700 1 --2- 192.168.123.100:0/1035106166 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4d28038470 0x7f4d2803a920 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f4d2c006fd0 tx=0x7f4d2c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.850+0000 7f4d420e7700 1 -- 192.168.123.100:0/1035106166 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command([{prefix=config set, name=mgr/dashboard/ssl_server_port}] v 0) v1 -- 0x7f4d3c062380 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.855+0000 7f4d38ff9700 1 -- 192.168.123.100:0/1035106166 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/dashboard/ssl_server_port}]=0 v8) v1 ==== 130+0+0 (secure 0 0 0) 0x7f4d3c062380 con 0x7f4d3c104c20 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.860+0000 7f4d420e7700 1 -- 192.168.123.100:0/1035106166 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4d28038470 msgr2=0x7f4d2803a920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.860+0000 7f4d420e7700 1 --2- 192.168.123.100:0/1035106166 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4d28038470 0x7f4d2803a920 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f4d2c006fd0 tx=0x7f4d2c006e40 comp rx=0 tx=0).stop 
2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.860+0000 7f4d420e7700 1 -- 192.168.123.100:0/1035106166 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d3c104c20 msgr2=0x7f4d3c199a70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.860+0000 7f4d420e7700 1 --2- 192.168.123.100:0/1035106166 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d3c104c20 0x7f4d3c199a70 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f4d34000f80 tx=0x7f4d340040c0 comp rx=0 tx=0).stop 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.860+0000 7f4d420e7700 1 -- 192.168.123.100:0/1035106166 shutdown_connections 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.860+0000 7f4d420e7700 1 --2- 192.168.123.100:0/1035106166 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4d28038470 0x7f4d2803a920 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.860+0000 7f4d420e7700 1 --2- 192.168.123.100:0/1035106166 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d3c104c20 0x7f4d3c199a70 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.861+0000 7f4d420e7700 1 -- 192.168.123.100:0/1035106166 >> 192.168.123.100:0/1035106166 conn(0x7f4d3c100270 msgr2=0x7f4d3c074090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.861+0000 7f4d420e7700 1 -- 192.168.123.100:0/1035106166 
shutdown_connections 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:40.861+0000 7f4d420e7700 1 -- 192.168.123.100:0/1035106166 wait complete. 2026-03-10T12:32:40.915 INFO:teuthology.orchestra.run.vm00.stdout:Enabling the dashboard module... 2026-03-10T12:32:41.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:41 vm00 ceph-mon[50686]: from='client.14152 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:41.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:41 vm00 ceph-mon[50686]: Saving service grafana spec with placement count:1 2026-03-10T12:32:41.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:41 vm00 ceph-mon[50686]: from='client.14154 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:41.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:41 vm00 ceph-mon[50686]: Saving service node-exporter spec with placement * 2026-03-10T12:32:41.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:41 vm00 ceph-mon[50686]: from='client.14156 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:41.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:41 vm00 ceph-mon[50686]: Saving service alertmanager spec with placement count:1 2026-03-10T12:32:41.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:41 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/1846298052' entity='client.admin' 2026-03-10T12:32:41.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:41 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/1035106166' entity='client.admin' 2026-03-10T12:32:41.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:41 vm00 ceph-mon[50686]: from='client.? 
192.168.123.100:0/470061899' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.045+0000 7f40377b7700 1 Processor -- start 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.045+0000 7f40377b7700 1 -- start start 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.046+0000 7f40377b7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4030104c20 0x7f4030105030 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.046+0000 7f40377b7700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4030105570 con 0x7f4030104c20 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.046+0000 7f4035553700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4030104c20 0x7f4030105030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.046+0000 7f4035553700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4030104c20 0x7f4030105030 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33300/0 (socket says 192.168.123.100:33300) 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.046+0000 7f4035553700 1 -- 192.168.123.100:0/1224529752 learned_addr learned my addr 
192.168.123.100:0/1224529752 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.046+0000 7f4035553700 1 -- 192.168.123.100:0/1224529752 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f40301056b0 con 0x7f4030104c20 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.046+0000 7f4035553700 1 --2- 192.168.123.100:0/1224529752 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4030104c20 0x7f4030105030 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f4024009a90 tx=0x7f4024009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=38f0549370c2767 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.047+0000 7f4023fff700 1 -- 192.168.123.100:0/1224529752 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4024004030 con 0x7f4030104c20 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.047+0000 7f4023fff700 1 -- 192.168.123.100:0/1224529752 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f402400b7e0 con 0x7f4030104c20 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.047+0000 7f4023fff700 1 -- 192.168.123.100:0/1224529752 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4024003b30 con 0x7f4030104c20 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.047+0000 7f40377b7700 1 -- 192.168.123.100:0/1224529752 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4030104c20 msgr2=0x7f4030105030 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.047+0000 7f40377b7700 1 --2- 192.168.123.100:0/1224529752 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4030104c20 0x7f4030105030 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f4024009a90 tx=0x7f4024009da0 comp rx=0 tx=0).stop 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.048+0000 7f40377b7700 1 -- 192.168.123.100:0/1224529752 shutdown_connections 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.048+0000 7f40377b7700 1 --2- 192.168.123.100:0/1224529752 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4030104c20 0x7f4030105030 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.048+0000 7f40377b7700 1 -- 192.168.123.100:0/1224529752 >> 192.168.123.100:0/1224529752 conn(0x7f4030100270 msgr2=0x7f40301026a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.048+0000 7f40377b7700 1 -- 192.168.123.100:0/1224529752 shutdown_connections 2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.048+0000 7f40377b7700 1 -- 192.168.123.100:0/1224529752 wait complete. 
2026-03-10T12:32:41.911 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.048+0000 7f40377b7700 1 Processor -- start 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.048+0000 7f40377b7700 1 -- start start 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.049+0000 7f40377b7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4030104c20 0x7f403019bbe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.049+0000 7f40377b7700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f403019c120 con 0x7f4030104c20 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.049+0000 7f4035553700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4030104c20 0x7f403019bbe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.049+0000 7f4035553700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4030104c20 0x7f403019bbe0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33302/0 (socket says 192.168.123.100:33302) 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.049+0000 7f4035553700 1 -- 192.168.123.100:0/470061899 learned_addr learned my addr 192.168.123.100:0/470061899 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:41.912 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.049+0000 7f4035553700 1 -- 192.168.123.100:0/470061899 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4024009740 con 0x7f4030104c20 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.049+0000 7f4035553700 1 --2- 192.168.123.100:0/470061899 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4030104c20 0x7f403019bbe0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f4024000c00 tx=0x7f4024004080 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.049+0000 7f40227fc700 1 -- 192.168.123.100:0/470061899 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f40240044e0 con 0x7f4030104c20 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.050+0000 7f40227fc700 1 -- 192.168.123.100:0/470061899 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4024010950 con 0x7f4030104c20 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.050+0000 7f40227fc700 1 -- 192.168.123.100:0/470061899 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4024021950 con 0x7f4030104c20 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.050+0000 7f40377b7700 1 -- 192.168.123.100:0/470061899 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f403019c320 con 0x7f4030104c20 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.050+0000 7f40377b7700 1 -- 
192.168.123.100:0/470061899 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f403019c7c0 con 0x7f4030104c20 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.051+0000 7f40227fc700 1 -- 192.168.123.100:0/470061899 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 8) v1 ==== 45079+0+0 (secure 0 0 0) 0x7f40240193f0 con 0x7f4030104c20 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.051+0000 7f40227fc700 1 --2- 192.168.123.100:0/470061899 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f401c038470 0x7f401c03a920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.051+0000 7f40227fc700 1 -- 192.168.123.100:0/470061899 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f402404cf10 con 0x7f4030104c20 2026-03-10T12:32:41.912 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.051+0000 7f4034d52700 1 --2- 192.168.123.100:0/470061899 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f401c038470 0x7f401c03a920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.051+0000 7f40377b7700 1 -- 192.168.123.100:0/470061899 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4030195960 con 0x7f4030104c20 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.054+0000 7f4034d52700 1 --2- 192.168.123.100:0/470061899 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f401c038470 0x7f401c03a920 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f402c006fd0 tx=0x7f402c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.055+0000 7f40227fc700 1 -- 192.168.123.100:0/470061899 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f40240288e0 con 0x7f4030104c20 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.194+0000 7f40377b7700 1 -- 192.168.123.100:0/470061899 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "dashboard"} v 0) v1 -- 0x7f403002d050 con 0x7f4030104c20 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.857+0000 7f40227fc700 1 -- 192.168.123.100:0/470061899 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mgrmap(e 9) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f4024019880 con 0x7f4030104c20 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.858+0000 7f40227fc700 1 -- 192.168.123.100:0/470061899 <== mon.0 v2:192.168.123.100:3300/0 8 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "dashboard"}]=0 v9) v1 ==== 88+0+0 (secure 0 0 0) 0x7f4024021400 con 0x7f4030104c20 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.862+0000 7f40377b7700 1 -- 192.168.123.100:0/470061899 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f401c038470 msgr2=0x7f401c03a920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 
2026-03-10T12:32:41.862+0000 7f40377b7700 1 --2- 192.168.123.100:0/470061899 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f401c038470 0x7f401c03a920 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f402c006fd0 tx=0x7f402c006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.862+0000 7f40377b7700 1 -- 192.168.123.100:0/470061899 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4030104c20 msgr2=0x7f403019bbe0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.862+0000 7f40377b7700 1 --2- 192.168.123.100:0/470061899 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4030104c20 0x7f403019bbe0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f4024000c00 tx=0x7f4024004080 comp rx=0 tx=0).stop 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.862+0000 7f40377b7700 1 -- 192.168.123.100:0/470061899 shutdown_connections 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.862+0000 7f40377b7700 1 --2- 192.168.123.100:0/470061899 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f401c038470 0x7f401c03a920 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.862+0000 7f40377b7700 1 --2- 192.168.123.100:0/470061899 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4030104c20 0x7f403019bbe0 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.862+0000 7f40377b7700 1 -- 192.168.123.100:0/470061899 >> 
192.168.123.100:0/470061899 conn(0x7f4030100270 msgr2=0x7f4030192aa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.862+0000 7f40377b7700 1 -- 192.168.123.100:0/470061899 shutdown_connections 2026-03-10T12:32:41.913 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:41.862+0000 7f40377b7700 1 -- 192.168.123.100:0/470061899 wait complete. 2026-03-10T12:32:42.295 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout { 2026-03-10T12:32:42.295 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "epoch": 9, 2026-03-10T12:32:42.295 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-10T12:32:42.295 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "active_name": "vm00.nescmq", 2026-03-10T12:32:42.295 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-10T12:32:42.295 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout } 2026-03-10T12:32:42.295 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.107+0000 7fb32fc84700 1 Processor -- start 2026-03-10T12:32:42.295 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.109+0000 7fb32fc84700 1 -- start start 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.109+0000 7fb32fc84700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb328107e10 0x7fb328108220 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.109+0000 7fb32fc84700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb328108760 con 0x7fb328107e10 2026-03-10T12:32:42.296 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.109+0000 7fb32da20700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb328107e10 0x7fb328108220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.109+0000 7fb32da20700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb328107e10 0x7fb328108220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33326/0 (socket says 192.168.123.100:33326) 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.109+0000 7fb32da20700 1 -- 192.168.123.100:0/2067331522 learned_addr learned my addr 192.168.123.100:0/2067331522 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.110+0000 7fb32da20700 1 -- 192.168.123.100:0/2067331522 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb3281088a0 con 0x7fb328107e10 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.110+0000 7fb32da20700 1 --2- 192.168.123.100:0/2067331522 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb328107e10 0x7fb328108220 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fb32401ab30 tx=0x7fb32401ae40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=addba88b28c098b4 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.111+0000 7fb32ca1e700 1 -- 192.168.123.100:0/2067331522 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map 
magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb324004030 con 0x7fb328107e10 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.111+0000 7fb32ca1e700 1 -- 192.168.123.100:0/2067331522 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb32401c8b0 con 0x7fb328107e10 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.111+0000 7fb32ca1e700 1 -- 192.168.123.100:0/2067331522 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb324003b50 con 0x7fb328107e10 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.112+0000 7fb32fc84700 1 -- 192.168.123.100:0/2067331522 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb328107e10 msgr2=0x7fb328108220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.112+0000 7fb32fc84700 1 --2- 192.168.123.100:0/2067331522 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb328107e10 0x7fb328108220 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fb32401ab30 tx=0x7fb32401ae40 comp rx=0 tx=0).stop 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.113+0000 7fb32fc84700 1 -- 192.168.123.100:0/2067331522 shutdown_connections 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.113+0000 7fb32fc84700 1 --2- 192.168.123.100:0/2067331522 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb328107e10 0x7fb328108220 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.113+0000 7fb32fc84700 1 -- 
192.168.123.100:0/2067331522 >> 192.168.123.100:0/2067331522 conn(0x7fb32807b220 msgr2=0x7fb32807b620 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.113+0000 7fb32fc84700 1 -- 192.168.123.100:0/2067331522 shutdown_connections 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.113+0000 7fb32fc84700 1 -- 192.168.123.100:0/2067331522 wait complete. 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.113+0000 7fb32fc84700 1 Processor -- start 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.114+0000 7fb32fc84700 1 -- start start 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.114+0000 7fb32fc84700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb328107e10 0x7fb328072740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.114+0000 7fb32fc84700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb328072c80 con 0x7fb328107e10 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.114+0000 7fb32da20700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb328107e10 0x7fb328072740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.114+0000 7fb32da20700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb328107e10 0x7fb328072740 unknown :-1 s=HELLO_CONNECTING 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33330/0 (socket says 192.168.123.100:33330) 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.114+0000 7fb32da20700 1 -- 192.168.123.100:0/1628505116 learned_addr learned my addr 192.168.123.100:0/1628505116 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.115+0000 7fb32da20700 1 -- 192.168.123.100:0/1628505116 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb32401a7e0 con 0x7fb328107e10 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.115+0000 7fb32da20700 1 --2- 192.168.123.100:0/1628505116 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb328107e10 0x7fb328072740 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7fb324006b20 tx=0x7fb324004060 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:42.296 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.116+0000 7fb31effd700 1 -- 192.168.123.100:0/1628505116 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb3240043c0 con 0x7fb328107e10 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.116+0000 7fb31effd700 1 -- 192.168.123.100:0/1628505116 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb324004520 con 0x7fb328107e10 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.116+0000 7fb31effd700 1 -- 192.168.123.100:0/1628505116 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 
0x7fb324022820 con 0x7fb328107e10 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.116+0000 7fb32fc84700 1 -- 192.168.123.100:0/1628505116 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb328070d90 con 0x7fb328107e10 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.116+0000 7fb32fc84700 1 -- 192.168.123.100:0/1628505116 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb328071230 con 0x7fb328107e10 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.117+0000 7fb31effd700 1 -- 192.168.123.100:0/1628505116 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 9) v1 ==== 45092+0+0 (secure 0 0 0) 0x7fb32403b030 con 0x7fb328107e10 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.118+0000 7fb31effd700 1 --2- 192.168.123.100:0/1628505116 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb314038500 0x7fb31403a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.118+0000 7fb31effd700 1 -- 192.168.123.100:0/1628505116 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fb32405d940 con 0x7fb328107e10 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.118+0000 7fb32d21f700 1 -- 192.168.123.100:0/1628505116 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb314038500 msgr2=0x7fb31403a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.100:6800/2 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 
2026-03-10T12:32:42.118+0000 7fb32d21f700 1 --2- 192.168.123.100:0/1628505116 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb314038500 0x7fb31403a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.118+0000 7fb32fc84700 1 -- 192.168.123.100:0/1628505116 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb328072dc0 con 0x7fb328107e10 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.124+0000 7fb31effd700 1 -- 192.168.123.100:0/1628505116 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb32402c440 con 0x7fb328107e10 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.258+0000 7fb32fc84700 1 -- 192.168.123.100:0/1628505116 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7fb3281088a0 con 0x7fb328107e10 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.258+0000 7fb31effd700 1 -- 192.168.123.100:0/1628505116 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v9) v1 ==== 56+0+98 (secure 0 0 0) 0x7fb3281088a0 con 0x7fb328107e10 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.261+0000 7fb31cff9700 1 -- 192.168.123.100:0/1628505116 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb314038500 msgr2=0x7fb31403a9b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.261+0000 7fb31cff9700 
1 --2- 192.168.123.100:0/1628505116 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb314038500 0x7fb31403a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.261+0000 7fb31cff9700 1 -- 192.168.123.100:0/1628505116 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb328107e10 msgr2=0x7fb328072740 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.261+0000 7fb31cff9700 1 --2- 192.168.123.100:0/1628505116 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb328107e10 0x7fb328072740 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7fb324006b20 tx=0x7fb324004060 comp rx=0 tx=0).stop 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.261+0000 7fb31cff9700 1 -- 192.168.123.100:0/1628505116 shutdown_connections 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.261+0000 7fb31cff9700 1 --2- 192.168.123.100:0/1628505116 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb314038500 0x7fb31403a9b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.261+0000 7fb31cff9700 1 --2- 192.168.123.100:0/1628505116 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb328107e10 0x7fb328072740 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.261+0000 7fb31cff9700 1 -- 192.168.123.100:0/1628505116 >> 192.168.123.100:0/1628505116 conn(0x7fb32807b220 msgr2=0x7fb328105620 
unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.261+0000 7fb31cff9700 1 -- 192.168.123.100:0/1628505116 shutdown_connections 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.261+0000 7fb31cff9700 1 -- 192.168.123.100:0/1628505116 wait complete. 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:Waiting for the mgr to restart... 2026-03-10T12:32:42.297 INFO:teuthology.orchestra.run.vm00.stdout:Waiting for mgr epoch 9... 2026-03-10T12:32:42.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:42 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/470061899' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished 2026-03-10T12:32:42.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:42 vm00 ceph-mon[50686]: mgrmap e9: vm00.nescmq(active, since 9s) 2026-03-10T12:32:42.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:42 vm00 ceph-mon[50686]: from='client.? 
192.168.123.100:0/1628505116' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch 2026-03-10T12:32:46.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:46 vm00 ceph-mon[50686]: Active manager daemon vm00.nescmq restarted 2026-03-10T12:32:46.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:46 vm00 ceph-mon[50686]: Activating manager daemon vm00.nescmq 2026-03-10T12:32:46.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:46 vm00 ceph-mon[50686]: osdmap e3: 0 total, 0 up, 0 in 2026-03-10T12:32:46.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:46 vm00 ceph-mon[50686]: mgrmap e10: vm00.nescmq(active, starting, since 0.00486503s) 2026-03-10T12:32:46.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:46 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:32:46.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:46 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr metadata", "who": "vm00.nescmq", "id": "vm00.nescmq"}]: dispatch 2026-03-10T12:32:46.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:46 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T12:32:46.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:46 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T12:32:46.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:46 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T12:32:46.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:46 vm00 ceph-mon[50686]: Manager daemon vm00.nescmq is now available 2026-03-10T12:32:46.985 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:46 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:32:46.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:46 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout { 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 11, 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout } 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.445+0000 7fde64a57700 1 Processor -- start 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.445+0000 7fde64a57700 1 -- start start 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.445+0000 7fde64a57700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fde60072ac0 0x7fde60070fc0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.445+0000 7fde64a57700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fde60071500 con 0x7fde60072ac0 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.446+0000 7fde5f7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fde60072ac0 0x7fde60070fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.446+0000 7fde5f7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fde60072ac0 0x7fde60070fc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33344/0 (socket says 192.168.123.100:33344) 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.446+0000 7fde5f7fe700 1 -- 192.168.123.100:0/648609029 learned_addr learned my addr 192.168.123.100:0/648609029 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.446+0000 7fde5f7fe700 1 -- 192.168.123.100:0/648609029 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fde60071640 con 0x7fde60072ac0 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.446+0000 7fde5f7fe700 1 --2- 192.168.123.100:0/648609029 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fde60072ac0 0x7fde60070fc0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7fde5000d0d0 tx=0x7fde5000d3e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=743b40d5eb4fff54 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.447+0000 7fde5e7fc700 1 -- 192.168.123.100:0/648609029 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fde50010070 con 0x7fde60072ac0 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.447+0000 7fde5e7fc700 1 -- 192.168.123.100:0/648609029 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 
keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fde50004030 con 0x7fde60072ac0 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.447+0000 7fde64a57700 1 -- 192.168.123.100:0/648609029 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fde60072ac0 msgr2=0x7fde60070fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.447+0000 7fde64a57700 1 --2- 192.168.123.100:0/648609029 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fde60072ac0 0x7fde60070fc0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7fde5000d0d0 tx=0x7fde5000d3e0 comp rx=0 tx=0).stop 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.447+0000 7fde64a57700 1 -- 192.168.123.100:0/648609029 shutdown_connections 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.447+0000 7fde64a57700 1 --2- 192.168.123.100:0/648609029 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fde60072ac0 0x7fde60070fc0 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.447+0000 7fde64a57700 1 -- 192.168.123.100:0/648609029 >> 192.168.123.100:0/648609029 conn(0x7fde6006c9d0 msgr2=0x7fde6006ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.447+0000 7fde64a57700 1 -- 192.168.123.100:0/648609029 shutdown_connections 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.447+0000 7fde64a57700 1 -- 192.168.123.100:0/648609029 wait complete. 
2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.448+0000 7fde64a57700 1 Processor -- start 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.448+0000 7fde64a57700 1 -- start start 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.448+0000 7fde64a57700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fde60072ac0 0x7fde60089950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.448+0000 7fde64a57700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fde50003b70 con 0x7fde60072ac0 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.448+0000 7fde5f7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fde60072ac0 0x7fde60089950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.448+0000 7fde5f7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fde60072ac0 0x7fde60089950 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33350/0 (socket says 192.168.123.100:33350) 2026-03-10T12:32:47.842 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.448+0000 7fde5f7fe700 1 -- 192.168.123.100:0/1627101429 learned_addr learned my addr 192.168.123.100:0/1627101429 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:47.842 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.448+0000 7fde5f7fe700 1 -- 192.168.123.100:0/1627101429 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fde500088c0 con 0x7fde60072ac0 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.448+0000 7fde5f7fe700 1 --2- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fde60072ac0 0x7fde60089950 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fde50019040 tx=0x7fde500047f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.449+0000 7fde5cff9700 1 -- 192.168.123.100:0/1627101429 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fde50010040 con 0x7fde60072ac0 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.449+0000 7fde64a57700 1 -- 192.168.123.100:0/1627101429 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fde600863b0 con 0x7fde60072ac0 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.449+0000 7fde64a57700 1 -- 192.168.123.100:0/1627101429 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fde60086850 con 0x7fde60072ac0 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.450+0000 7fde5cff9700 1 -- 192.168.123.100:0/1627101429 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fde50016bd0 con 0x7fde60072ac0 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.450+0000 7fde5cff9700 1 
-- 192.168.123.100:0/1627101429 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fde50021ba0 con 0x7fde60072ac0 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.450+0000 7fde5cff9700 1 -- 192.168.123.100:0/1627101429 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 9) v1 ==== 45092+0+0 (secure 0 0 0) 0x7fde5001d070 con 0x7fde60072ac0 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.451+0000 7fde5cff9700 1 --2- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 0x7fde4803a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.451+0000 7fde5effd700 1 -- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 msgr2=0x7fde4803a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.100:6800/2 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.451+0000 7fde5effd700 1 --2- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 0x7fde4803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.451+0000 7fde5cff9700 1 -- 192.168.123.100:0/1627101429 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fde5004fbb0 con 0x7fde60072ac0 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.451+0000 7fde64a57700 1 -- 192.168.123.100:0/1627101429 --> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7fde6004efc0 con 0x7fde48038500 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.652+0000 7fde5effd700 1 -- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 msgr2=0x7fde4803a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.100:6800/2 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:42.652+0000 7fde5effd700 1 --2- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 0x7fde4803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:43.052+0000 7fde5effd700 1 -- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 msgr2=0x7fde4803a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.100:6800/2 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:43.052+0000 7fde5effd700 1 --2- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 0x7fde4803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:43.853+0000 7fde5effd700 1 -- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 msgr2=0x7fde4803a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.100:6800/2 2026-03-10T12:32:47.843 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:43.853+0000 7fde5effd700 1 --2- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 0x7fde4803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:45.454+0000 7fde5effd700 1 -- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 msgr2=0x7fde4803a9b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.100:6800/2 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:45.454+0000 7fde5effd700 1 --2- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 0x7fde4803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 3.200000 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:46.756+0000 7fde5cff9700 1 -- 192.168.123.100:0/1627101429 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mgrmap(e 10) v1 ==== 44859+0+0 (secure 0 0 0) 0x7fde50014dc0 con 0x7fde60072ac0 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:46.756+0000 7fde5cff9700 1 -- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 msgr2=0x7fde4803a9b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:46.756+0000 7fde5cff9700 1 --2- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 0x7fde4803a9b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.760+0000 7fde5cff9700 1 -- 192.168.123.100:0/1627101429 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7fde50016d40 con 0x7fde60072ac0 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.760+0000 7fde5cff9700 1 --2- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 0x7fde4803a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.760+0000 7fde5cff9700 1 -- 192.168.123.100:0/1627101429 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7fde6004efc0 con 0x7fde48038500 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.762+0000 7fde5effd700 1 --2- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 0x7fde4803a9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.763+0000 7fde5effd700 1 --2- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 0x7fde4803a9b0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7fde58003de0 tx=0x7fde580073e0 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.763+0000 7fde5cff9700 1 -- 192.168.123.100:0/1627101429 <== mgr.14164 v2:192.168.123.100:6800/2 1 ==== 
command_reply(tid 0: 0 ) v1 ==== 8+0+6910 (secure 0 0 0) 0x7fde6004efc0 con 0x7fde48038500 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.767+0000 7fde64a57700 1 -- 192.168.123.100:0/1627101429 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7fde600876f0 con 0x7fde48038500 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.767+0000 7fde5cff9700 1 -- 192.168.123.100:0/1627101429 <== mgr.14164 v2:192.168.123.100:6800/2 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+52 (secure 0 0 0) 0x7fde600876f0 con 0x7fde48038500 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.767+0000 7fde4e7fc700 1 -- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 msgr2=0x7fde4803a9b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.767+0000 7fde4e7fc700 1 --2- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 0x7fde4803a9b0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7fde58003de0 tx=0x7fde580073e0 comp rx=0 tx=0).stop 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.767+0000 7fde4e7fc700 1 -- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fde60072ac0 msgr2=0x7fde60089950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.767+0000 7fde4e7fc700 1 --2- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fde60072ac0 0x7fde60089950 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 
crypto rx=0x7fde50019040 tx=0x7fde500047f0 comp rx=0 tx=0).stop 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.768+0000 7fde4e7fc700 1 -- 192.168.123.100:0/1627101429 shutdown_connections 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.768+0000 7fde4e7fc700 1 --2- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fde48038500 0x7fde4803a9b0 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.768+0000 7fde4e7fc700 1 --2- 192.168.123.100:0/1627101429 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fde60072ac0 0x7fde60089950 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.768+0000 7fde4e7fc700 1 -- 192.168.123.100:0/1627101429 >> 192.168.123.100:0/1627101429 conn(0x7fde6006c9d0 msgr2=0x7fde6006d6b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.768+0000 7fde4e7fc700 1 -- 192.168.123.100:0/1627101429 shutdown_connections 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.768+0000 7fde4e7fc700 1 -- 192.168.123.100:0/1627101429 wait complete. 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:mgr epoch 9 is available 2026-03-10T12:32:47.843 INFO:teuthology.orchestra.run.vm00.stdout:Generating a dashboard self-signed certificate... 
2026-03-10T12:32:48.128 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:47 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/mirror_snapshot_schedule"}]: dispatch 2026-03-10T12:32:48.128 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:47 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/trash_purge_schedule"}]: dispatch 2026-03-10T12:32:48.128 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:47 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:48.128 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:47 vm00 ceph-mon[50686]: mgrmap e11: vm00.nescmq(active, since 1.00785s) 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout Self-signed certificate created 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.995+0000 7f99fffff700 1 Processor -- start 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.996+0000 7f99fffff700 1 -- start start 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.996+0000 7f99fffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9a00071410 0x7f9a00071820 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.996+0000 7f99fffff700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9a00071d60 con 0x7f9a00071410 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.996+0000 7f99feffd700 1 --2- >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9a00071410 0x7f9a00071820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.996+0000 7f99feffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9a00071410 0x7f9a00071820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60978/0 (socket says 192.168.123.100:60978) 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.996+0000 7f99feffd700 1 -- 192.168.123.100:0/2530177072 learned_addr learned my addr 192.168.123.100:0/2530177072 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.996+0000 7f99feffd700 1 -- 192.168.123.100:0/2530177072 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9a00071ea0 con 0x7f9a00071410 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.996+0000 7f99feffd700 1 --2- 192.168.123.100:0/2530177072 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9a00071410 0x7f9a00071820 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f99f0009cf0 tx=0x7f99f000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=3b4625c107996ce3 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.997+0000 7f99fdffb700 1 -- 192.168.123.100:0/2530177072 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f99f0004030 con 0x7f9a00071410 2026-03-10T12:32:48.209 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.997+0000 7f99fdffb700 1 -- 192.168.123.100:0/2530177072 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f99f000b810 con 0x7f9a00071410 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.997+0000 7f99fffff700 1 -- 192.168.123.100:0/2530177072 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9a00071410 msgr2=0x7f9a00071820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.998+0000 7f99fffff700 1 --2- 192.168.123.100:0/2530177072 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9a00071410 0x7f9a00071820 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f99f0009cf0 tx=0x7f99f000b0e0 comp rx=0 tx=0).stop 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.998+0000 7f99fffff700 1 -- 192.168.123.100:0/2530177072 shutdown_connections 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.998+0000 7f99fffff700 1 --2- 192.168.123.100:0/2530177072 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9a00071410 0x7f9a00071820 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.998+0000 7f99fffff700 1 -- 192.168.123.100:0/2530177072 >> 192.168.123.100:0/2530177072 conn(0x7f9a0006c9d0 msgr2=0x7f9a0006ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.998+0000 7f99fffff700 1 -- 192.168.123.100:0/2530177072 shutdown_connections 2026-03-10T12:32:48.209 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.998+0000 7f99fffff700 1 -- 192.168.123.100:0/2530177072 wait complete. 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.998+0000 7f99fffff700 1 Processor -- start 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.998+0000 7f99fffff700 1 -- start start 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.998+0000 7f99fffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9a001123e0 0x7f9a001a2d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.998+0000 7f99fffff700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9a00071d60 con 0x7f9a001123e0 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.999+0000 7f99feffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9a001123e0 0x7f9a001a2d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.999+0000 7f99feffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9a001123e0 0x7f9a001a2d30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60990/0 (socket says 192.168.123.100:60990) 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.999+0000 7f99feffd700 1 -- 192.168.123.100:0/258924840 learned_addr 
learned my addr 192.168.123.100:0/258924840 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.999+0000 7f99feffd700 1 -- 192.168.123.100:0/258924840 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f99f0009740 con 0x7f9a001123e0 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.999+0000 7f99feffd700 1 --2- 192.168.123.100:0/258924840 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9a001123e0 0x7f9a001a2d30 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f99f0009cc0 tx=0x7f99f000bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.999+0000 7f99e7fff700 1 -- 192.168.123.100:0/258924840 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f99f0003950 con 0x7f9a001123e0 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.999+0000 7f99fffff700 1 -- 192.168.123.100:0/258924840 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9a001a3270 con 0x7f9a001123e0 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:47.999+0000 7f99fffff700 1 -- 192.168.123.100:0/258924840 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9a001a3710 con 0x7f9a001123e0 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.000+0000 7f99e7fff700 1 -- 192.168.123.100:0/258924840 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f99f00043c0 con 0x7f9a001123e0 2026-03-10T12:32:48.209 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.001+0000 7f99e7fff700 1 -- 192.168.123.100:0/258924840 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f99f001ac80 con 0x7f9a001123e0 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.002+0000 7f99fffff700 1 -- 192.168.123.100:0/258924840 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f99ec005320 con 0x7f9a001123e0 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.002+0000 7f99e7fff700 1 -- 192.168.123.100:0/258924840 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7f99f0011420 con 0x7f9a001123e0 2026-03-10T12:32:48.209 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.002+0000 7f99e7fff700 1 --2- 192.168.123.100:0/258924840 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f99e8038410 0x7f99e803a8c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.002+0000 7f99e7fff700 1 -- 192.168.123.100:0/258924840 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f99f004c620 con 0x7f9a001123e0 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.006+0000 7f99fe7fc700 1 --2- 192.168.123.100:0/258924840 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f99e8038410 0x7f99e803a8c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 
2026-03-10T12:32:48.006+0000 7f99e7fff700 1 -- 192.168.123.100:0/258924840 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f99f0022360 con 0x7f9a001123e0 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.006+0000 7f99fe7fc700 1 --2- 192.168.123.100:0/258924840 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f99e8038410 0x7f99e803a8c0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f99f800ad30 tx=0x7f99f80093f0 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.125+0000 7f99fffff700 1 -- 192.168.123.100:0/258924840 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}) v1 -- 0x7f99ec000bf0 con 0x7f99e8038410 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.162+0000 7f99e7fff700 1 -- 192.168.123.100:0/258924840 <== mgr.14164 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f99ec000bf0 con 0x7f99e8038410 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.164+0000 7f99e5ffb700 1 -- 192.168.123.100:0/258924840 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f99e8038410 msgr2=0x7f99e803a8c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.164+0000 7f99e5ffb700 1 --2- 192.168.123.100:0/258924840 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f99e8038410 0x7f99e803a8c0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f99f800ad30 
tx=0x7f99f80093f0 comp rx=0 tx=0).stop 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.165+0000 7f99e5ffb700 1 -- 192.168.123.100:0/258924840 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9a001123e0 msgr2=0x7f9a001a2d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.165+0000 7f99e5ffb700 1 --2- 192.168.123.100:0/258924840 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9a001123e0 0x7f9a001a2d30 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f99f0009cc0 tx=0x7f99f000bfa0 comp rx=0 tx=0).stop 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.166+0000 7f99e5ffb700 1 -- 192.168.123.100:0/258924840 shutdown_connections 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.166+0000 7f99e5ffb700 1 --2- 192.168.123.100:0/258924840 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f99e8038410 0x7f99e803a8c0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.166+0000 7f99e5ffb700 1 --2- 192.168.123.100:0/258924840 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9a001123e0 0x7f9a001a2d30 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.166+0000 7f99e5ffb700 1 -- 192.168.123.100:0/258924840 >> 192.168.123.100:0/258924840 conn(0x7f9a0006c9d0 msgr2=0x7f9a0006d680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.166+0000 7f99e5ffb700 1 -- 
192.168.123.100:0/258924840 shutdown_connections 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.167+0000 7f99e5ffb700 1 -- 192.168.123.100:0/258924840 wait complete. 2026-03-10T12:32:48.210 INFO:teuthology.orchestra.run.vm00.stdout:Creating initial admin user... 2026-03-10T12:32:48.747 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout {"username": "admin", "password": "$2b$12$B0j.DoWMJv/4vasklnDtc.lPADb4VhU2fHrWq3lv8PR28QUA7Taua", "roles": ["administrator"], "name": null, "email": null, "lastUpdate": 1773145968, "enabled": true, "pwdExpirationDate": null, "pwdUpdateRequired": true} 2026-03-10T12:32:48.747 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.405+0000 7f77a9b70700 1 Processor -- start 2026-03-10T12:32:48.747 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.405+0000 7f77a9b70700 1 -- start start 2026-03-10T12:32:48.747 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.405+0000 7f77a9b70700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77a4072a40 0x7f77a4071060 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:48.747 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.405+0000 7f77a9b70700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f77a40715a0 con 0x7f77a4072a40 2026-03-10T12:32:48.747 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.406+0000 7f77a8b6e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77a4072a40 0x7f77a4071060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:48.747 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 
2026-03-10T12:32:48.406+0000 7f77a8b6e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77a4072a40 0x7f77a4071060 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60996/0 (socket says 192.168.123.100:60996) 2026-03-10T12:32:48.747 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.406+0000 7f77a8b6e700 1 -- 192.168.123.100:0/2186267005 learned_addr learned my addr 192.168.123.100:0/2186267005 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:48.747 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.406+0000 7f77a8b6e700 1 -- 192.168.123.100:0/2186267005 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f77a40716e0 con 0x7f77a4072a40 2026-03-10T12:32:48.747 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.407+0000 7f77a8b6e700 1 --2- 192.168.123.100:0/2186267005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77a4072a40 0x7f77a4071060 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f7798009a90 tx=0x7f7798009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=42c9f716a7d33b1c server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:48.747 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.407+0000 7f77a37fe700 1 -- 192.168.123.100:0/2186267005 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7798004030 con 0x7f77a4072a40 2026-03-10T12:32:48.747 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.407+0000 7f77a37fe700 1 -- 192.168.123.100:0/2186267005 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f779800b7e0 con 0x7f77a4072a40 2026-03-10T12:32:48.747 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.407+0000 7f77a37fe700 1 -- 192.168.123.100:0/2186267005 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7798003ae0 con 0x7f77a4072a40 2026-03-10T12:32:48.747 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.408+0000 7f77a9b70700 1 -- 192.168.123.100:0/2186267005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77a4072a40 msgr2=0x7f77a4071060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:48.747 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.408+0000 7f77a9b70700 1 --2- 192.168.123.100:0/2186267005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77a4072a40 0x7f77a4071060 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f7798009a90 tx=0x7f7798009da0 comp rx=0 tx=0).stop 2026-03-10T12:32:48.747 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.408+0000 7f77a9b70700 1 -- 192.168.123.100:0/2186267005 shutdown_connections 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.408+0000 7f77a9b70700 1 --2- 192.168.123.100:0/2186267005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77a4072a40 0x7f77a4071060 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.408+0000 7f77a9b70700 1 -- 192.168.123.100:0/2186267005 >> 192.168.123.100:0/2186267005 conn(0x7f77a406c9d0 msgr2=0x7f77a406ee00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.408+0000 7f77a9b70700 1 -- 192.168.123.100:0/2186267005 shutdown_connections 2026-03-10T12:32:48.748 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.408+0000 7f77a9b70700 1 -- 192.168.123.100:0/2186267005 wait complete. 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.409+0000 7f77a9b70700 1 Processor -- start 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.409+0000 7f77a9b70700 1 -- start start 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.409+0000 7f77a9b70700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77a4072a40 0x7f77a411af80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.409+0000 7f77a9b70700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f77a411c930 con 0x7f77a4072a40 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.409+0000 7f77a8b6e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77a4072a40 0x7f77a411af80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.409+0000 7f77a8b6e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77a4072a40 0x7f77a411af80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:32768/0 (socket says 192.168.123.100:32768) 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.409+0000 7f77a8b6e700 1 -- 192.168.123.100:0/3066946095 learned_addr 
learned my addr 192.168.123.100:0/3066946095 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.409+0000 7f77a8b6e700 1 -- 192.168.123.100:0/3066946095 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7798009740 con 0x7f77a4072a40 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.409+0000 7f77a8b6e700 1 --2- 192.168.123.100:0/3066946095 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77a4072a40 0x7f77a411af80 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f7798000c00 tx=0x7f779800bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.409+0000 7f77a1ffb700 1 -- 192.168.123.100:0/3066946095 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7798004160 con 0x7f77a4072a40 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.410+0000 7f77a1ffb700 1 -- 192.168.123.100:0/3066946095 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f77980042c0 con 0x7f77a4072a40 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.410+0000 7f77a1ffb700 1 -- 192.168.123.100:0/3066946095 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7798011620 con 0x7f77a4072a40 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.410+0000 7f77a9b70700 1 -- 192.168.123.100:0/3066946095 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f77a411b4c0 con 0x7f77a4072a40 
2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.410+0000 7f77a9b70700 1 -- 192.168.123.100:0/3066946095 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f77a411b960 con 0x7f77a4072a40 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.410+0000 7f77a1ffb700 1 -- 192.168.123.100:0/3066946095 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7f7798028020 con 0x7f77a4072a40 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.411+0000 7f77a1ffb700 1 --2- 192.168.123.100:0/3066946095 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f77940383f0 0x7f779403a8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.411+0000 7f77a1ffb700 1 -- 192.168.123.100:0/3066946095 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f779804be60 con 0x7f77a4072a40 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.411+0000 7f77a3fff700 1 --2- 192.168.123.100:0/3066946095 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f77940383f0 0x7f779403a8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.411+0000 7f77a9b70700 1 -- 192.168.123.100:0/3066946095 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f77a404efc0 con 0x7f77a4072a40 2026-03-10T12:32:48.748 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.414+0000 7f77a1ffb700 1 -- 192.168.123.100:0/3066946095 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f779801aa80 con 0x7f77a4072a40 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.416+0000 7f77a3fff700 1 --2- 192.168.123.100:0/3066946095 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f77940383f0 0x7f779403a8a0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f7790009940 tx=0x7f7790006e30 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.558+0000 7f77a9b70700 1 -- 192.168.123.100:0/3066946095 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}) v1 -- 0x7f77a40621a0 con 0x7f77940383f0 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.717+0000 7f77a1ffb700 1 -- 192.168.123.100:0/3066946095 <== mgr.14164 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+252 (secure 0 0 0) 0x7f77a40621a0 con 0x7f77940383f0 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.720+0000 7f77a9b70700 1 -- 192.168.123.100:0/3066946095 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f77940383f0 msgr2=0x7f779403a8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.720+0000 7f77a9b70700 1 --2- 192.168.123.100:0/3066946095 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f77940383f0 0x7f779403a8a0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f7790009940 tx=0x7f7790006e30 comp rx=0 tx=0).stop 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.720+0000 7f77a9b70700 1 -- 192.168.123.100:0/3066946095 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77a4072a40 msgr2=0x7f77a411af80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.720+0000 7f77a9b70700 1 --2- 192.168.123.100:0/3066946095 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77a4072a40 0x7f77a411af80 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f7798000c00 tx=0x7f779800bfa0 comp rx=0 tx=0).stop 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.721+0000 7f77a9b70700 1 -- 192.168.123.100:0/3066946095 shutdown_connections 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.721+0000 7f77a9b70700 1 --2- 192.168.123.100:0/3066946095 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f77940383f0 0x7f779403a8a0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.721+0000 7f77a9b70700 1 --2- 192.168.123.100:0/3066946095 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f77a4072a40 0x7f77a411af80 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.721+0000 7f77a9b70700 1 -- 192.168.123.100:0/3066946095 >> 192.168.123.100:0/3066946095 conn(0x7f77a406c9d0 msgr2=0x7f77a406d6a0 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.721+0000 7f77a9b70700 1 -- 192.168.123.100:0/3066946095 shutdown_connections 2026-03-10T12:32:48.748 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.721+0000 7f77a9b70700 1 -- 192.168.123.100:0/3066946095 wait complete. 2026-03-10T12:32:48.749 INFO:teuthology.orchestra.run.vm00.stdout:Fetching dashboard port number... 2026-03-10T12:32:48.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:48 vm00 ceph-mon[50686]: [10/Mar/2026:12:32:46] ENGINE Bus STARTING 2026-03-10T12:32:48.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:48 vm00 ceph-mon[50686]: [10/Mar/2026:12:32:47] ENGINE Serving on http://192.168.123.100:8765 2026-03-10T12:32:48.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:48 vm00 ceph-mon[50686]: [10/Mar/2026:12:32:47] ENGINE Serving on https://192.168.123.100:7150 2026-03-10T12:32:48.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:48 vm00 ceph-mon[50686]: [10/Mar/2026:12:32:47] ENGINE Bus STARTED 2026-03-10T12:32:48.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:48 vm00 ceph-mon[50686]: from='client.14168 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch 2026-03-10T12:32:48.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:48 vm00 ceph-mon[50686]: from='client.14168 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch 2026-03-10T12:32:48.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:48 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:48.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:48 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:48.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:48 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' 
entity='mgr.vm00.nescmq' 2026-03-10T12:32:48.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:48 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stdout 8443 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.891+0000 7f211394b700 1 Processor -- start 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.891+0000 7f211394b700 1 -- start start 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.892+0000 7f211394b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f210c107f40 0x7f210c108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.892+0000 7f211394b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f210c108890 con 0x7f210c107f40 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.892+0000 7f21116e7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f210c107f40 0x7f210c108350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.892+0000 7f21116e7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f210c107f40 0x7f210c108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:32784/0 (socket says 192.168.123.100:32784) 2026-03-10T12:32:49.056 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.892+0000 7f21116e7700 1 -- 192.168.123.100:0/2527809696 learned_addr learned my addr 192.168.123.100:0/2527809696 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.893+0000 7f21116e7700 1 -- 192.168.123.100:0/2527809696 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f210c1089d0 con 0x7f210c107f40 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.893+0000 7f21116e7700 1 --2- 192.168.123.100:0/2527809696 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f210c107f40 0x7f210c108350 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f2100009cf0 tx=0x7f210000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=8f18632ed7d016e9 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.893+0000 7f20fffff700 1 -- 192.168.123.100:0/2527809696 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2100004030 con 0x7f210c107f40 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.893+0000 7f20fffff700 1 -- 192.168.123.100:0/2527809696 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f210000b810 con 0x7f210c107f40 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.893+0000 7f20fffff700 1 -- 192.168.123.100:0/2527809696 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2100003b10 con 0x7f210c107f40 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.894+0000 7f211394b700 1 -- 
192.168.123.100:0/2527809696 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f210c107f40 msgr2=0x7f210c108350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.894+0000 7f211394b700 1 --2- 192.168.123.100:0/2527809696 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f210c107f40 0x7f210c108350 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f2100009cf0 tx=0x7f210000b0e0 comp rx=0 tx=0).stop 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.895+0000 7f211394b700 1 -- 192.168.123.100:0/2527809696 shutdown_connections 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.895+0000 7f211394b700 1 --2- 192.168.123.100:0/2527809696 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f210c107f40 0x7f210c108350 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.895+0000 7f211394b700 1 -- 192.168.123.100:0/2527809696 >> 192.168.123.100:0/2527809696 conn(0x7f210c103770 msgr2=0x7f210c105b50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.895+0000 7f211394b700 1 -- 192.168.123.100:0/2527809696 shutdown_connections 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.895+0000 7f211394b700 1 -- 192.168.123.100:0/2527809696 wait complete. 
2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.895+0000 7f211394b700 1 Processor -- start 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.896+0000 7f211394b700 1 -- start start 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.896+0000 7f211394b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f210c19bea0 0x7f210c19c2b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.896+0000 7f211394b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f210c19c7f0 con 0x7f210c19bea0 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.896+0000 7f21116e7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f210c19bea0 0x7f210c19c2b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.896+0000 7f21116e7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f210c19bea0 0x7f210c19c2b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:32788/0 (socket says 192.168.123.100:32788) 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.896+0000 7f21116e7700 1 -- 192.168.123.100:0/3279192564 learned_addr learned my addr 192.168.123.100:0/3279192564 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:49.056 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.897+0000 7f21116e7700 1 -- 192.168.123.100:0/3279192564 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2100009740 con 0x7f210c19bea0 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.897+0000 7f21116e7700 1 --2- 192.168.123.100:0/3279192564 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f210c19bea0 0x7f210c19c2b0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f2100009cc0 tx=0x7f210001b800 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.897+0000 7f20fe7fc700 1 -- 192.168.123.100:0/3279192564 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f210001ba20 con 0x7f210c19bea0 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.897+0000 7f20fe7fc700 1 -- 192.168.123.100:0/3279192564 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f210001bb80 con 0x7f210c19bea0 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.897+0000 7f211394b700 1 -- 192.168.123.100:0/3279192564 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f210c19c9f0 con 0x7f210c19bea0 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.897+0000 7f20fe7fc700 1 -- 192.168.123.100:0/3279192564 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f210001c450 con 0x7f210c19bea0 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.897+0000 7f211394b700 1 
-- 192.168.123.100:0/3279192564 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f210c19f650 con 0x7f210c19bea0 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.898+0000 7f20fe7fc700 1 -- 192.168.123.100:0/3279192564 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7f2100022070 con 0x7f210c19bea0 2026-03-10T12:32:49.056 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.898+0000 7f20fe7fc700 1 --2- 192.168.123.100:0/3279192564 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f20f80383a0 0x7f20f803a850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.899+0000 7f20fe7fc700 1 -- 192.168.123.100:0/3279192564 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f210004d4f0 con 0x7f210c19bea0 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.899+0000 7f2110ee6700 1 --2- 192.168.123.100:0/3279192564 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f20f80383a0 0x7f20f803a850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.899+0000 7f211394b700 1 -- 192.168.123.100:0/3279192564 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f210c04f9e0 con 0x7f210c19bea0 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.903+0000 7f20fe7fc700 1 -- 192.168.123.100:0/3279192564 <== mon.0 
v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f210001c5b0 con 0x7f210c19bea0 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:48.904+0000 7f2110ee6700 1 --2- 192.168.123.100:0/3279192564 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f20f80383a0 0x7f20f803a850 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f2108006fd0 tx=0x7f2108006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.008+0000 7f211394b700 1 -- 192.168.123.100:0/3279192564 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"} v 0) v1 -- 0x7f210c19fd30 con 0x7f210c19bea0 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.009+0000 7f20fe7fc700 1 -- 192.168.123.100:0/3279192564 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]=0 v8) v1 ==== 112+0+5 (secure 0 0 0) 0x7f210004a030 con 0x7f210c19bea0 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.012+0000 7f211394b700 1 -- 192.168.123.100:0/3279192564 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f20f80383a0 msgr2=0x7f20f803a850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.012+0000 7f211394b700 1 --2- 192.168.123.100:0/3279192564 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f20f80383a0 0x7f20f803a850 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f2108006fd0 tx=0x7f2108006e40 comp 
rx=0 tx=0).stop 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.012+0000 7f211394b700 1 -- 192.168.123.100:0/3279192564 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f210c19bea0 msgr2=0x7f210c19c2b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.012+0000 7f211394b700 1 --2- 192.168.123.100:0/3279192564 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f210c19bea0 0x7f210c19c2b0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f2100009cc0 tx=0x7f210001b800 comp rx=0 tx=0).stop 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.013+0000 7f211394b700 1 -- 192.168.123.100:0/3279192564 shutdown_connections 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.013+0000 7f211394b700 1 --2- 192.168.123.100:0/3279192564 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f20f80383a0 0x7f20f803a850 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.013+0000 7f211394b700 1 --2- 192.168.123.100:0/3279192564 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f210c19bea0 0x7f210c19c2b0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.013+0000 7f211394b700 1 -- 192.168.123.100:0/3279192564 >> 192.168.123.100:0/3279192564 conn(0x7f210c103770 msgr2=0x7f210c105490 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.013+0000 7f211394b700 1 -- 
192.168.123.100:0/3279192564 shutdown_connections 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.013+0000 7f211394b700 1 -- 192.168.123.100:0/3279192564 wait complete. 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:firewalld does not appear to be present 2026-03-10T12:32:49.057 INFO:teuthology.orchestra.run.vm00.stdout:Not possible to open ports <[8443]>. firewalld.service is not available 2026-03-10T12:32:49.058 INFO:teuthology.orchestra.run.vm00.stdout:Ceph Dashboard is now available at: 2026-03-10T12:32:49.058 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:32:49.058 INFO:teuthology.orchestra.run.vm00.stdout: URL: https://vm00.local:8443/ 2026-03-10T12:32:49.058 INFO:teuthology.orchestra.run.vm00.stdout: User: admin 2026-03-10T12:32:49.058 INFO:teuthology.orchestra.run.vm00.stdout: Password: js9pikc612 2026-03-10T12:32:49.058 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:32:49.058 INFO:teuthology.orchestra.run.vm00.stdout:Saving cluster configuration to /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config directory 2026-03-10T12:32:49.059 INFO:teuthology.orchestra.run.vm00.stdout:Enabling autotune for osd_memory_target 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.201+0000 7f04a3146700 1 Processor -- start 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.201+0000 7f04a3146700 1 -- start start 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.202+0000 7f04a3146700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f049c107f40 0x7f049c108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.202+0000 7f04a3146700 1 -- --> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f049c108890 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.202+0000 7f04a0ee2700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f049c107f40 0x7f049c108350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.202+0000 7f04a0ee2700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f049c107f40 0x7f049c108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:32798/0 (socket says 192.168.123.100:32798) 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.202+0000 7f04a0ee2700 1 -- 192.168.123.100:0/1521387528 learned_addr learned my addr 192.168.123.100:0/1521387528 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.203+0000 7f04a0ee2700 1 -- 192.168.123.100:0/1521387528 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f049c1089d0 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.203+0000 7f04a0ee2700 1 --2- 192.168.123.100:0/1521387528 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f049c107f40 0x7f049c108350 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f0498009cf0 tx=0x7f049800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=97552b1253ab519f server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:49.373 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.203+0000 7f04937fe700 1 -- 192.168.123.100:0/1521387528 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0498004030 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.203+0000 7f04937fe700 1 -- 192.168.123.100:0/1521387528 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f049800b810 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.203+0000 7f04937fe700 1 -- 192.168.123.100:0/1521387528 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0498003b10 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.204+0000 7f04a3146700 1 -- 192.168.123.100:0/1521387528 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f049c107f40 msgr2=0x7f049c108350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.204+0000 7f04a3146700 1 --2- 192.168.123.100:0/1521387528 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f049c107f40 0x7f049c108350 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f0498009cf0 tx=0x7f049800b0e0 comp rx=0 tx=0).stop 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.204+0000 7f04a3146700 1 -- 192.168.123.100:0/1521387528 shutdown_connections 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.204+0000 7f04a3146700 1 --2- 192.168.123.100:0/1521387528 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f049c107f40 0x7f049c108350 unknown :-1 s=CLOSED pgs=88 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.204+0000 7f04a3146700 1 -- 192.168.123.100:0/1521387528 >> 192.168.123.100:0/1521387528 conn(0x7f049c103770 msgr2=0x7f049c105b50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.204+0000 7f04a3146700 1 -- 192.168.123.100:0/1521387528 shutdown_connections 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.204+0000 7f04a3146700 1 -- 192.168.123.100:0/1521387528 wait complete. 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.205+0000 7f04a3146700 1 Processor -- start 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.205+0000 7f04a3146700 1 -- start start 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.205+0000 7f04a3146700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f049c107f40 0x7f049c19bce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.205+0000 7f04a3146700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f049c19c220 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.206+0000 7f04a0ee2700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f049c107f40 0x7f049c19bce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:49.373 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.206+0000 7f04a0ee2700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f049c107f40 0x7f049c19bce0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:32808/0 (socket says 192.168.123.100:32808) 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.206+0000 7f04a0ee2700 1 -- 192.168.123.100:0/2171383366 learned_addr learned my addr 192.168.123.100:0/2171383366 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.206+0000 7f04a0ee2700 1 -- 192.168.123.100:0/2171383366 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0498009740 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.207+0000 7f04a0ee2700 1 --2- 192.168.123.100:0/2171383366 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f049c107f40 0x7f049c19bce0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f0498000c00 tx=0x7f0498011890 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.207+0000 7f0491ffb700 1 -- 192.168.123.100:0/2171383366 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0498011bc0 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.207+0000 7f0491ffb700 1 -- 192.168.123.100:0/2171383366 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0498011d20 con 0x7f049c107f40 
2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.207+0000 7f04a3146700 1 -- 192.168.123.100:0/2171383366 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f049c19c420 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.207+0000 7f0491ffb700 1 -- 192.168.123.100:0/2171383366 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f049801a590 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.207+0000 7f04a3146700 1 -- 192.168.123.100:0/2171383366 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f049c19c8c0 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.208+0000 7f0491ffb700 1 -- 192.168.123.100:0/2171383366 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7f049801a6f0 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.208+0000 7f0491ffb700 1 --2- 192.168.123.100:0/2171383366 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f04840383a0 0x7f048403a850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.208+0000 7f0491ffb700 1 -- 192.168.123.100:0/2171383366 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f049804d0b0 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.209+0000 7f0493fff700 1 --2- 192.168.123.100:0/2171383366 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f04840383a0 0x7f048403a850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.209+0000 7f04a3146700 1 -- 192.168.123.100:0/2171383366 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f049c04f9e0 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.212+0000 7f0493fff700 1 --2- 192.168.123.100:0/2171383366 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f04840383a0 0x7f048403a850 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f048c006fd0 tx=0x7f048c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.212+0000 7f0491ffb700 1 -- 192.168.123.100:0/2171383366 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f049801f080 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.313+0000 7f04a3146700 1 -- 192.168.123.100:0/2171383366 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command([{prefix=config set, name=osd_memory_target_autotune}] v 0) v1 -- 0x7f049c062380 con 0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.316+0000 7f0491ffb700 1 -- 192.168.123.100:0/2171383366 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{prefix=config set, name=osd_memory_target_autotune}]=0 v8) v1 ==== 127+0+0 (secure 0 0 0) 0x7f049c062380 con
0x7f049c107f40 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.319+0000 7f04a3146700 1 -- 192.168.123.100:0/2171383366 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f04840383a0 msgr2=0x7f048403a850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.319+0000 7f04a3146700 1 --2- 192.168.123.100:0/2171383366 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f04840383a0 0x7f048403a850 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f048c006fd0 tx=0x7f048c006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.319+0000 7f04a3146700 1 -- 192.168.123.100:0/2171383366 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f049c107f40 msgr2=0x7f049c19bce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.319+0000 7f04a3146700 1 --2- 192.168.123.100:0/2171383366 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f049c107f40 0x7f049c19bce0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f0498000c00 tx=0x7f0498011890 comp rx=0 tx=0).stop 2026-03-10T12:32:49.373 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.319+0000 7f04a3146700 1 -- 192.168.123.100:0/2171383366 shutdown_connections 2026-03-10T12:32:49.374 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.320+0000 7f04a3146700 1 --2- 192.168.123.100:0/2171383366 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f04840383a0 0x7f048403a850 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:49.374 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 
2026-03-10T12:32:49.320+0000 7f04a3146700 1 --2- 192.168.123.100:0/2171383366 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f049c107f40 0x7f049c19bce0 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:49.374 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.320+0000 7f04a3146700 1 -- 192.168.123.100:0/2171383366 >> 192.168.123.100:0/2171383366 conn(0x7f049c103770 msgr2=0x7f049c1053e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:49.374 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.320+0000 7f04a3146700 1 -- 192.168.123.100:0/2171383366 shutdown_connections 2026-03-10T12:32:49.374 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.320+0000 7f04a3146700 1 -- 192.168.123.100:0/2171383366 wait complete. 2026-03-10T12:32:49.750 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.506+0000 7f8d1be02700 1 Processor -- start 2026-03-10T12:32:49.750 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.507+0000 7f8d1be02700 1 -- start start 2026-03-10T12:32:49.750 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.507+0000 7f8d1be02700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8d14107f40 0x7f8d14108350 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:49.750 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.507+0000 7f8d1be02700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8d14108890 con 0x7f8d14107f40 2026-03-10T12:32:49.750 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.507+0000 7f8d19b9e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8d14107f40 0x7f8d14108350 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:49.750 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.507+0000 7f8d19b9e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8d14107f40 0x7f8d14108350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:32820/0 (socket says 192.168.123.100:32820) 2026-03-10T12:32:49.750 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.507+0000 7f8d19b9e700 1 -- 192.168.123.100:0/252656774 learned_addr learned my addr 192.168.123.100:0/252656774 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:49.750 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.508+0000 7f8d19b9e700 1 -- 192.168.123.100:0/252656774 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8d141089d0 con 0x7f8d14107f40 2026-03-10T12:32:49.750 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.508+0000 7f8d19b9e700 1 --2- 192.168.123.100:0/252656774 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8d14107f40 0x7f8d14108350 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f8d10009a90 tx=0x7f8d10009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a32c1cec3d8fd4c8 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:49.750 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.508+0000 7f8d18b9c700 1 -- 192.168.123.100:0/252656774 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8d10004030 con 0x7f8d14107f40 2026-03-10T12:32:49.750 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.508+0000 7f8d18b9c700 1 -- 
192.168.123.100:0/252656774 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8d1000b7e0 con 0x7f8d14107f40 2026-03-10T12:32:49.750 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.508+0000 7f8d18b9c700 1 -- 192.168.123.100:0/252656774 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8d10003ae0 con 0x7f8d14107f40 2026-03-10T12:32:49.750 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.509+0000 7f8d1be02700 1 -- 192.168.123.100:0/252656774 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8d14107f40 msgr2=0x7f8d14108350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.509+0000 7f8d1be02700 1 --2- 192.168.123.100:0/252656774 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8d14107f40 0x7f8d14108350 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f8d10009a90 tx=0x7f8d10009da0 comp rx=0 tx=0).stop 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.509+0000 7f8d1be02700 1 -- 192.168.123.100:0/252656774 shutdown_connections 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.509+0000 7f8d1be02700 1 --2- 192.168.123.100:0/252656774 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8d14107f40 0x7f8d14108350 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.509+0000 7f8d1be02700 1 -- 192.168.123.100:0/252656774 >> 192.168.123.100:0/252656774 conn(0x7f8d14103770 msgr2=0x7f8d14105b50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:49.751 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.509+0000 7f8d1be02700 1 -- 192.168.123.100:0/252656774 shutdown_connections 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.509+0000 7f8d1be02700 1 -- 192.168.123.100:0/252656774 wait complete. 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.510+0000 7f8d1be02700 1 Processor -- start 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.510+0000 7f8d1be02700 1 -- start start 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.510+0000 7f8d1be02700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8d14107f40 0x7f8d1419bc20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.510+0000 7f8d1be02700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8d1419c160 con 0x7f8d14107f40 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.510+0000 7f8d19b9e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8d14107f40 0x7f8d1419bc20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.510+0000 7f8d19b9e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8d14107f40 0x7f8d1419bc20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:32836/0 (socket says 
192.168.123.100:32836) 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.510+0000 7f8d19b9e700 1 -- 192.168.123.100:0/1217568413 learned_addr learned my addr 192.168.123.100:0/1217568413 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.511+0000 7f8d19b9e700 1 -- 192.168.123.100:0/1217568413 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8d10009740 con 0x7f8d14107f40 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.511+0000 7f8d19b9e700 1 --2- 192.168.123.100:0/1217568413 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8d14107f40 0x7f8d1419bc20 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f8d10000c00 tx=0x7f8d1000bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.511+0000 7f8d0affd700 1 -- 192.168.123.100:0/1217568413 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8d10004160 con 0x7f8d14107f40 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.511+0000 7f8d0affd700 1 -- 192.168.123.100:0/1217568413 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8d100042c0 con 0x7f8d14107f40 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.511+0000 7f8d1be02700 1 -- 192.168.123.100:0/1217568413 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8d1419c360 con 0x7f8d14107f40 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 
2026-03-10T12:32:49.511+0000 7f8d0affd700 1 -- 192.168.123.100:0/1217568413 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8d10011620 con 0x7f8d14107f40 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.511+0000 7f8d1be02700 1 -- 192.168.123.100:0/1217568413 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8d1419c800 con 0x7f8d14107f40 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.512+0000 7f8d0affd700 1 -- 192.168.123.100:0/1217568413 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 11) v1 ==== 44986+0+0 (secure 0 0 0) 0x7f8d10011890 con 0x7f8d14107f40 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.513+0000 7f8d0affd700 1 --2- 192.168.123.100:0/1217568413 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8d000383f0 0x7f8d0003a8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.514+0000 7f8d1939d700 1 --2- 192.168.123.100:0/1217568413 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8d000383f0 0x7f8d0003a8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.514+0000 7f8d0affd700 1 -- 192.168.123.100:0/1217568413 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f8d1004d060 con 0x7f8d14107f40 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.514+0000 7f8d1939d700 1 --2- 192.168.123.100:0/1217568413 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8d000383f0 0x7f8d0003a8a0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f8d04006fd0 tx=0x7f8d04006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.515+0000 7f8d1be02700 1 -- 192.168.123.100:0/1217568413 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8d1404f9e0 con 0x7f8d14107f40 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.518+0000 7f8d0affd700 1 -- 192.168.123.100:0/1217568413 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8d10011b40 con 0x7f8d14107f40 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.684+0000 7f8d1be02700 1 -- 192.168.123.100:0/1217568413 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command([{prefix=config-key set, key=mgr/dashboard/cluster/status}] v 0) v1 -- 0x7f8d14062380 con 0x7f8d14107f40 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.695+0000 7f8d0affd700 1 -- 192.168.123.100:0/1217568413 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{prefix=config-key set, key=mgr/dashboard/cluster/status}]=0 set mgr/dashboard/cluster/status v34) v1 ==== 153+0+0 (secure 0 0 0) 0x7f8d10018b40 con 0x7f8d14107f40 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.697+0000 7f8d1be02700 1 -- 192.168.123.100:0/1217568413 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8d000383f0 msgr2=0x7f8d0003a8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED
l=1).mark_down 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.697+0000 7f8d1be02700 1 --2- 192.168.123.100:0/1217568413 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8d000383f0 0x7f8d0003a8a0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f8d04006fd0 tx=0x7f8d04006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.698+0000 7f8d1be02700 1 -- 192.168.123.100:0/1217568413 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8d14107f40 msgr2=0x7f8d1419bc20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.698+0000 7f8d1be02700 1 --2- 192.168.123.100:0/1217568413 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8d14107f40 0x7f8d1419bc20 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f8d10000c00 tx=0x7f8d1000bfa0 comp rx=0 tx=0).stop 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.698+0000 7f8d1be02700 1 -- 192.168.123.100:0/1217568413 shutdown_connections 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.698+0000 7f8d1be02700 1 --2- 192.168.123.100:0/1217568413 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8d000383f0 0x7f8d0003a8a0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.698+0000 7f8d1be02700 1 --2- 192.168.123.100:0/1217568413 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8d14107f40 0x7f8d1419bc20 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:49.751 
INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.698+0000 7f8d1be02700 1 -- 192.168.123.100:0/1217568413 >> 192.168.123.100:0/1217568413 conn(0x7f8d14103770 msgr2=0x7f8d14105350 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.698+0000 7f8d1be02700 1 -- 192.168.123.100:0/1217568413 shutdown_connections 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr 2026-03-10T12:32:49.698+0000 7f8d1be02700 1 -- 192.168.123.100:0/1217568413 wait complete. 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:/usr/bin/ceph: stderr set mgr/dashboard/cluster/status 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:You can access the Ceph CLI as following in case of multi-cluster or non-default config: 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout: sudo /home/ubuntu/cephtest/cephadm shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:Or, if you are only running a single cluster on this host: 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout: sudo /home/ubuntu/cephtest/cephadm shell 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:32:49.751 INFO:teuthology.orchestra.run.vm00.stdout:Please consider enabling telemetry to help improve Ceph: 2026-03-10T12:32:49.752 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:32:49.752 INFO:teuthology.orchestra.run.vm00.stdout: ceph telemetry on 2026-03-10T12:32:49.752 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:32:49.752 
INFO:teuthology.orchestra.run.vm00.stdout:For more information see: 2026-03-10T12:32:49.752 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:32:49.752 INFO:teuthology.orchestra.run.vm00.stdout: https://docs.ceph.com/en/latest/mgr/telemetry/ 2026-03-10T12:32:49.752 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:32:49.752 INFO:teuthology.orchestra.run.vm00.stdout:Bootstrap complete. 2026-03-10T12:32:49.774 INFO:tasks.cephadm:Fetching config... 2026-03-10T12:32:49.774 DEBUG:teuthology.orchestra.run.vm00:> set -ex 2026-03-10T12:32:49.774 DEBUG:teuthology.orchestra.run.vm00:> dd if=/etc/ceph/ceph.conf of=/dev/stdout 2026-03-10T12:32:49.797 INFO:tasks.cephadm:Fetching client.admin keyring... 2026-03-10T12:32:49.797 DEBUG:teuthology.orchestra.run.vm00:> set -ex 2026-03-10T12:32:49.797 DEBUG:teuthology.orchestra.run.vm00:> dd if=/etc/ceph/ceph.client.admin.keyring of=/dev/stdout 2026-03-10T12:32:49.870 INFO:tasks.cephadm:Fetching mon keyring... 2026-03-10T12:32:49.870 DEBUG:teuthology.orchestra.run.vm00:> set -ex 2026-03-10T12:32:49.870 DEBUG:teuthology.orchestra.run.vm00:> sudo dd if=/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/keyring of=/dev/stdout 2026-03-10T12:32:49.938 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:49 vm00 ceph-mon[50686]: from='client.14176 -' entity='client.admin' cmd=[{"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:49.938 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:49 vm00 ceph-mon[50686]: from='client.14178 -' entity='client.admin' cmd=[{"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:49.938 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:49 vm00 ceph-mon[50686]: from='client.? 
192.168.123.100:0/3279192564' entity='client.admin' cmd=[{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]: dispatch 2026-03-10T12:32:49.938 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:49 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/1217568413' entity='client.admin' 2026-03-10T12:32:49.938 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:49 vm00 ceph-mon[50686]: mgrmap e12: vm00.nescmq(active, since 2s) 2026-03-10T12:32:49.943 INFO:tasks.cephadm:Fetching pub ssh key... 2026-03-10T12:32:49.943 DEBUG:teuthology.orchestra.run.vm00:> set -ex 2026-03-10T12:32:49.943 DEBUG:teuthology.orchestra.run.vm00:> dd if=/home/ubuntu/cephtest/ceph.pub of=/dev/stdout 2026-03-10T12:32:50.001 INFO:tasks.cephadm:Installing pub ssh key for root users... 2026-03-10T12:32:50.001 DEBUG:teuthology.orchestra.run.vm00:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCzZRQ37HtENKJEtU72puwFDv4yHsrS240AznofJBuhBEq2ma9FuKuvTUoszP+iPl2yAbg7pC+V8pZZgRBCE0y9VZ/f8xjXUFjHMeOlsftuDrI22fwtp7NeqcTjTvn+QbSCr2cs7fdQqakVHkQ60C5u60GW4vDCNYNhQv3Bb+h6XOGQGC3vt5Pu5OoVE6yalSWhMNKiAn/OWdrQdF92zM9EKRzTFt1MJwuspyqHZeBoDjmIqchQ7ZWchq562VuK9kAQo7s2ReydXWzWoI8sdoH41vHweJSx0MSHoRrmENAh9XCaMwzlJjl9SrjE3zq6HD4QxR3xSV0T+BobUuiMfQ+aQxLVK8YshnhGpa9hIeeiNizce6p/Q664pxGmt+m1qMyrtXp46uXe6X8SXM5NHIwWiV8T6FFxgUCQiVycYfRjU4iXQrj5bE+2jSy0aIrRNbZ2hQXMVHfKnoASUouJ9Jm7wGK6V4wbv9biuxHwFS/A3Gaua/oV5Y15aVM5u3cf5PE= ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-10T12:32:50.089 INFO:teuthology.orchestra.run.vm00.stdout:ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCzZRQ37HtENKJEtU72puwFDv4yHsrS240AznofJBuhBEq2ma9FuKuvTUoszP+iPl2yAbg7pC+V8pZZgRBCE0y9VZ/f8xjXUFjHMeOlsftuDrI22fwtp7NeqcTjTvn+QbSCr2cs7fdQqakVHkQ60C5u60GW4vDCNYNhQv3Bb+h6XOGQGC3vt5Pu5OoVE6yalSWhMNKiAn/OWdrQdF92zM9EKRzTFt1MJwuspyqHZeBoDjmIqchQ7ZWchq562VuK9kAQo7s2ReydXWzWoI8sdoH41vHweJSx0MSHoRrmENAh9XCaMwzlJjl9SrjE3zq6HD4QxR3xSV0T+BobUuiMfQ+aQxLVK8YshnhGpa9hIeeiNizce6p/Q664pxGmt+m1qMyrtXp46uXe6X8SXM5NHIwWiV8T6FFxgUCQiVycYfRjU4iXQrj5bE+2jSy0aIrRNbZ2hQXMVHfKnoASUouJ9Jm7wGK6V4wbv9biuxHwFS/A3Gaua/oV5Y15aVM5u3cf5PE= ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8 2026-03-10T12:32:50.104 DEBUG:teuthology.orchestra.run.vm07:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCzZRQ37HtENKJEtU72puwFDv4yHsrS240AznofJBuhBEq2ma9FuKuvTUoszP+iPl2yAbg7pC+V8pZZgRBCE0y9VZ/f8xjXUFjHMeOlsftuDrI22fwtp7NeqcTjTvn+QbSCr2cs7fdQqakVHkQ60C5u60GW4vDCNYNhQv3Bb+h6XOGQGC3vt5Pu5OoVE6yalSWhMNKiAn/OWdrQdF92zM9EKRzTFt1MJwuspyqHZeBoDjmIqchQ7ZWchq562VuK9kAQo7s2ReydXWzWoI8sdoH41vHweJSx0MSHoRrmENAh9XCaMwzlJjl9SrjE3zq6HD4QxR3xSV0T+BobUuiMfQ+aQxLVK8YshnhGpa9hIeeiNizce6p/Q664pxGmt+m1qMyrtXp46uXe6X8SXM5NHIwWiV8T6FFxgUCQiVycYfRjU4iXQrj5bE+2jSy0aIrRNbZ2hQXMVHfKnoASUouJ9Jm7wGK6V4wbv9biuxHwFS/A3Gaua/oV5Y15aVM5u3cf5PE= ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-10T12:32:50.138 INFO:teuthology.orchestra.run.vm07.stdout:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCzZRQ37HtENKJEtU72puwFDv4yHsrS240AznofJBuhBEq2ma9FuKuvTUoszP+iPl2yAbg7pC+V8pZZgRBCE0y9VZ/f8xjXUFjHMeOlsftuDrI22fwtp7NeqcTjTvn+QbSCr2cs7fdQqakVHkQ60C5u60GW4vDCNYNhQv3Bb+h6XOGQGC3vt5Pu5OoVE6yalSWhMNKiAn/OWdrQdF92zM9EKRzTFt1MJwuspyqHZeBoDjmIqchQ7ZWchq562VuK9kAQo7s2ReydXWzWoI8sdoH41vHweJSx0MSHoRrmENAh9XCaMwzlJjl9SrjE3zq6HD4QxR3xSV0T+BobUuiMfQ+aQxLVK8YshnhGpa9hIeeiNizce6p/Q664pxGmt+m1qMyrtXp46uXe6X8SXM5NHIwWiV8T6FFxgUCQiVycYfRjU4iXQrj5bE+2jSy0aIrRNbZ2hQXMVHfKnoASUouJ9Jm7wGK6V4wbv9biuxHwFS/A3Gaua/oV5Y15aVM5u3cf5PE= 
ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8 2026-03-10T12:32:50.148 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph config set mgr mgr/cephadm/allow_ptrace true 2026-03-10T12:32:50.320 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:32:50.673 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.671+0000 7facf178e700 1 -- 192.168.123.100:0/3950315207 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7facec074d80 msgr2=0x7facec0731e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:50.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.671+0000 7facf178e700 1 --2- 192.168.123.100:0/3950315207 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7facec074d80 0x7facec0731e0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7face0009b50 tx=0x7face0009e60 comp rx=0 tx=0).stop 2026-03-10T12:32:50.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.673+0000 7facf178e700 1 -- 192.168.123.100:0/3950315207 shutdown_connections 2026-03-10T12:32:50.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.673+0000 7facf178e700 1 --2- 192.168.123.100:0/3950315207 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7facec074d80 0x7facec0731e0 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:50.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.673+0000 7facf178e700 1 -- 192.168.123.100:0/3950315207 >> 192.168.123.100:0/3950315207 conn(0x7facec0fb430 msgr2=0x7facec0fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:50.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.673+0000 7facf178e700 1 -- 
192.168.123.100:0/3950315207 shutdown_connections 2026-03-10T12:32:50.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.673+0000 7facf178e700 1 -- 192.168.123.100:0/3950315207 wait complete. 2026-03-10T12:32:50.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.674+0000 7facf178e700 1 Processor -- start 2026-03-10T12:32:50.675 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.674+0000 7facf178e700 1 -- start start 2026-03-10T12:32:50.693 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.676+0000 7facf178e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7facec074d80 0x7facec19b710 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:50.693 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.676+0000 7facf178e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7facec19bc50 con 0x7facec074d80 2026-03-10T12:32:50.693 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.690+0000 7facebfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7facec074d80 0x7facec19b710 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:50.693 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.690+0000 7facebfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7facec074d80 0x7facec19b710 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:32852/0 (socket says 192.168.123.100:32852) 2026-03-10T12:32:50.693 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.690+0000 7facebfff700 1 -- 192.168.123.100:0/3243239348 learned_addr learned my addr 192.168.123.100:0/3243239348 (peer_addr_for_me v2:192.168.123.100:0/0) 
2026-03-10T12:32:50.693 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.691+0000 7facebfff700 1 -- 192.168.123.100:0/3243239348 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7face00097e0 con 0x7facec074d80 2026-03-10T12:32:50.693 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.691+0000 7facebfff700 1 --2- 192.168.123.100:0/3243239348 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7facec074d80 0x7facec19b710 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7face00047a0 tx=0x7face0003730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:50.693 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.691+0000 7face97fa700 1 -- 192.168.123.100:0/3243239348 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7face0004030 con 0x7facec074d80 2026-03-10T12:32:50.693 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.691+0000 7facf178e700 1 -- 192.168.123.100:0/3243239348 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7facec19be50 con 0x7facec074d80 2026-03-10T12:32:50.693 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.691+0000 7facf178e700 1 -- 192.168.123.100:0/3243239348 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7facec19c2f0 con 0x7facec074d80 2026-03-10T12:32:50.693 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.691+0000 7face97fa700 1 -- 192.168.123.100:0/3243239348 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7face0003c10 con 0x7facec074d80 2026-03-10T12:32:50.694 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.691+0000 7face97fa700 1 -- 192.168.123.100:0/3243239348 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 
0x7face00175a0 con 0x7facec074d80 2026-03-10T12:32:50.694 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.692+0000 7face97fa700 1 -- 192.168.123.100:0/3243239348 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7face0017700 con 0x7facec074d80 2026-03-10T12:32:50.694 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.693+0000 7face97fa700 1 --2- 192.168.123.100:0/3243239348 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7facd40384f0 0x7facd403a9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:50.694 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.693+0000 7face97fa700 1 -- 192.168.123.100:0/3243239348 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7face004d380 con 0x7facec074d80 2026-03-10T12:32:50.694 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.693+0000 7faceb7fe700 1 --2- 192.168.123.100:0/3243239348 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7facd40384f0 0x7facd403a9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:50.694 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.693+0000 7faceb7fe700 1 --2- 192.168.123.100:0/3243239348 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7facd40384f0 0x7facd403a9a0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7facdc006fd0 tx=0x7facdc006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:50.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.693+0000 7facf178e700 1 -- 192.168.123.100:0/3243239348 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7facd8005320 con 0x7facec074d80 
2026-03-10T12:32:50.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.696+0000 7face97fa700 1 -- 192.168.123.100:0/3243239348 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7face0027070 con 0x7facec074d80 2026-03-10T12:32:50.917 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.915+0000 7facf178e700 1 -- 192.168.123.100:0/3243239348 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/allow_ptrace}] v 0) v1 -- 0x7facd8005f70 con 0x7facec074d80 2026-03-10T12:32:50.929 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.925+0000 7face97fa700 1 -- 192.168.123.100:0/3243239348 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/allow_ptrace}]=0 v9) v1 ==== 125+0+0 (secure 0 0 0) 0x7face002c430 con 0x7facec074d80 2026-03-10T12:32:50.931 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.931+0000 7facd2ffd700 1 -- 192.168.123.100:0/3243239348 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7facd40384f0 msgr2=0x7facd403a9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:50.931 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.931+0000 7facd2ffd700 1 --2- 192.168.123.100:0/3243239348 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7facd40384f0 0x7facd403a9a0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7facdc006fd0 tx=0x7facdc006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:50.931 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.931+0000 7facd2ffd700 1 -- 192.168.123.100:0/3243239348 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7facec074d80 msgr2=0x7facec19b710 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:50.931 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.931+0000 
7facd2ffd700 1 --2- 192.168.123.100:0/3243239348 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7facec074d80 0x7facec19b710 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7face00047a0 tx=0x7face0003730 comp rx=0 tx=0).stop 2026-03-10T12:32:50.932 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.932+0000 7facd2ffd700 1 -- 192.168.123.100:0/3243239348 shutdown_connections 2026-03-10T12:32:50.932 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.932+0000 7facd2ffd700 1 --2- 192.168.123.100:0/3243239348 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7facd40384f0 0x7facd403a9a0 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:50.932 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.932+0000 7facd2ffd700 1 --2- 192.168.123.100:0/3243239348 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7facec074d80 0x7facec19b710 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:50.932 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.932+0000 7facd2ffd700 1 -- 192.168.123.100:0/3243239348 >> 192.168.123.100:0/3243239348 conn(0x7facec0fb430 msgr2=0x7facec0fc090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:50.932 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.932+0000 7facd2ffd700 1 -- 192.168.123.100:0/3243239348 shutdown_connections 2026-03-10T12:32:50.932 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:50.932+0000 7facd2ffd700 1 -- 192.168.123.100:0/3243239348 wait complete. 
2026-03-10T12:32:50.982 INFO:tasks.cephadm:Distributing conf and client.admin keyring to all hosts + 0755 2026-03-10T12:32:50.982 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph orch client-keyring set client.admin '*' --mode 0755 2026-03-10T12:32:51.148 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:32:51.442 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.440+0000 7ffa1163b700 1 -- 192.168.123.100:0/3397702687 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ffa0c067760 msgr2=0x7ffa0c0fe5f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:51.442 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.440+0000 7ffa1163b700 1 --2- 192.168.123.100:0/3397702687 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ffa0c067760 0x7ffa0c0fe5f0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7ffa00009b00 tx=0x7ffa00009e10 comp rx=0 tx=0).stop 2026-03-10T12:32:51.442 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.442+0000 7ffa1163b700 1 -- 192.168.123.100:0/3397702687 shutdown_connections 2026-03-10T12:32:51.442 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.442+0000 7ffa1163b700 1 --2- 192.168.123.100:0/3397702687 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ffa0c067760 0x7ffa0c0fe5f0 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:51.442 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.442+0000 7ffa1163b700 1 -- 192.168.123.100:0/3397702687 >> 192.168.123.100:0/3397702687 conn(0x7ffa0c0f9c30 msgr2=0x7ffa0c0fc060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:51.443 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.443+0000 7ffa1163b700 1 -- 192.168.123.100:0/3397702687 shutdown_connections 2026-03-10T12:32:51.443 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.443+0000 7ffa1163b700 1 -- 192.168.123.100:0/3397702687 wait complete. 2026-03-10T12:32:51.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.444+0000 7ffa1163b700 1 Processor -- start 2026-03-10T12:32:51.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.444+0000 7ffa1163b700 1 -- start start 2026-03-10T12:32:51.445 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.444+0000 7ffa1163b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ffa0c067760 0x7ffa0c19b790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:51.445 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.444+0000 7ffa1163b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffa0c19bcd0 con 0x7ffa0c067760 2026-03-10T12:32:51.445 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.444+0000 7ffa0affd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ffa0c067760 0x7ffa0c19b790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:51.445 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.444+0000 7ffa0affd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ffa0c067760 0x7ffa0c19b790 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:32868/0 (socket says 192.168.123.100:32868) 2026-03-10T12:32:51.445 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.444+0000 7ffa0affd700 1 -- 192.168.123.100:0/428391431 learned_addr learned my addr 
192.168.123.100:0/428391431 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:51.445 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.445+0000 7ffa0affd700 1 -- 192.168.123.100:0/428391431 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ffa000097e0 con 0x7ffa0c067760 2026-03-10T12:32:51.445 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.445+0000 7ffa0affd700 1 --2- 192.168.123.100:0/428391431 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ffa0c067760 0x7ffa0c19b790 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7ffa00004750 tx=0x7ffa00005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:51.445 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.445+0000 7ff9f3fff700 1 -- 192.168.123.100:0/428391431 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ffa0001c070 con 0x7ffa0c067760 2026-03-10T12:32:51.446 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.446+0000 7ffa1163b700 1 -- 192.168.123.100:0/428391431 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ffa0c19bed0 con 0x7ffa0c067760 2026-03-10T12:32:51.446 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.446+0000 7ffa1163b700 1 -- 192.168.123.100:0/428391431 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ffa0c19c370 con 0x7ffa0c067760 2026-03-10T12:32:51.447 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.447+0000 7ff9f3fff700 1 -- 192.168.123.100:0/428391431 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ffa00021470 con 0x7ffa0c067760 2026-03-10T12:32:51.447 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.447+0000 7ff9f3fff700 1 -- 192.168.123.100:0/428391431 <== mon.0 
v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ffa0000f460 con 0x7ffa0c067760 2026-03-10T12:32:51.447 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.447+0000 7ff9f3fff700 1 -- 192.168.123.100:0/428391431 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7ffa0000f680 con 0x7ffa0c067760 2026-03-10T12:32:51.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.447+0000 7ff9f3fff700 1 --2- 192.168.123.100:0/428391431 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff9f4038510 0x7ff9f403a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:51.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.447+0000 7ffa0a7fc700 1 --2- 192.168.123.100:0/428391431 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff9f4038510 0x7ff9f403a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:51.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.447+0000 7ff9f3fff700 1 -- 192.168.123.100:0/428391431 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7ffa0004d560 con 0x7ffa0c067760 2026-03-10T12:32:51.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.448+0000 7ffa0a7fc700 1 --2- 192.168.123.100:0/428391431 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff9f4038510 0x7ff9f403a9c0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7ffa0400ad30 tx=0x7ffa040093f0 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:51.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.448+0000 7ffa1163b700 1 -- 192.168.123.100:0/428391431 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7ff9f8005320 con 0x7ffa0c067760 2026-03-10T12:32:51.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.451+0000 7ff9f3fff700 1 -- 192.168.123.100:0/428391431 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7ffa00026070 con 0x7ffa0c067760 2026-03-10T12:32:51.590 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.589+0000 7ffa1163b700 1 -- 192.168.123.100:0/428391431 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}) v1 -- 0x7ff9f8000bf0 con 0x7ff9f4038510 2026-03-10T12:32:51.593 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.593+0000 7ff9f3fff700 1 -- 192.168.123.100:0/428391431 <== mgr.14164 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7ff9f8000bf0 con 0x7ff9f4038510 2026-03-10T12:32:51.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.595+0000 7ffa1163b700 1 -- 192.168.123.100:0/428391431 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff9f4038510 msgr2=0x7ff9f403a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:51.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.595+0000 7ffa1163b700 1 --2- 192.168.123.100:0/428391431 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff9f4038510 0x7ff9f403a9c0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7ffa0400ad30 tx=0x7ffa040093f0 comp rx=0 tx=0).stop 2026-03-10T12:32:51.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.596+0000 7ffa1163b700 1 -- 192.168.123.100:0/428391431 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ffa0c067760 msgr2=0x7ffa0c19b790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T12:32:51.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.596+0000 7ffa1163b700 1 --2- 192.168.123.100:0/428391431 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ffa0c067760 0x7ffa0c19b790 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7ffa00004750 tx=0x7ffa00005dc0 comp rx=0 tx=0).stop 2026-03-10T12:32:51.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.596+0000 7ffa1163b700 1 -- 192.168.123.100:0/428391431 shutdown_connections 2026-03-10T12:32:51.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.596+0000 7ffa1163b700 1 --2- 192.168.123.100:0/428391431 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff9f4038510 0x7ff9f403a9c0 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:51.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.596+0000 7ffa1163b700 1 --2- 192.168.123.100:0/428391431 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ffa0c067760 0x7ffa0c19b790 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:51.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.596+0000 7ffa1163b700 1 -- 192.168.123.100:0/428391431 >> 192.168.123.100:0/428391431 conn(0x7ffa0c0f9c30 msgr2=0x7ffa0c105010 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:51.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.596+0000 7ffa1163b700 1 -- 192.168.123.100:0/428391431 shutdown_connections 2026-03-10T12:32:51.597 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:51.596+0000 7ffa1163b700 1 -- 192.168.123.100:0/428391431 wait complete. 
2026-03-10T12:32:51.656 INFO:tasks.cephadm:Writing (initial) conf and keyring to vm07 2026-03-10T12:32:51.656 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-10T12:32:51.656 DEBUG:teuthology.orchestra.run.vm07:> dd of=/etc/ceph/ceph.conf 2026-03-10T12:32:51.672 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-10T12:32:51.672 DEBUG:teuthology.orchestra.run.vm07:> dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:32:51.727 INFO:tasks.cephadm:Adding host vm07 to orchestrator... 2026-03-10T12:32:51.727 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph orch host add vm07 2026-03-10T12:32:51.732 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:51 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:51.732 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:51 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:51.732 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:51 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:32:51.732 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:51 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:51.732 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:51 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm00", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T12:32:51.732 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:51 vm00 ceph-mon[50686]: from='mgr.14164 
192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm00", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T12:32:51.732 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:51 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:32:51.732 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:51 vm00 ceph-mon[50686]: Deploying daemon ceph-exporter.vm00 on vm00 2026-03-10T12:32:51.732 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:51 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/3243239348' entity='client.admin' 2026-03-10T12:32:51.977 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:32:52.442 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.440+0000 7f838ee36700 1 -- 192.168.123.100:0/2908611328 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8388071530 msgr2=0x7f8388071940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:52.442 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.440+0000 7f838ee36700 1 --2- 192.168.123.100:0/2908611328 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8388071530 0x7f8388071940 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f8378008790 tx=0x7f8378008aa0 comp rx=0 tx=0).stop 2026-03-10T12:32:52.442 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.440+0000 7f838ee36700 1 -- 192.168.123.100:0/2908611328 shutdown_connections 2026-03-10T12:32:52.442 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.440+0000 7f838ee36700 1 --2- 192.168.123.100:0/2908611328 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8388071530 0x7f8388071940 unknown :-1 s=CLOSED pgs=97 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:52.442 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.440+0000 7f838ee36700 1 -- 192.168.123.100:0/2908611328 >> 192.168.123.100:0/2908611328 conn(0x7f838806cd50 msgr2=0x7f838806f1a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:52.446 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.440+0000 7f838ee36700 1 -- 192.168.123.100:0/2908611328 shutdown_connections 2026-03-10T12:32:52.446 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.440+0000 7f838ee36700 1 -- 192.168.123.100:0/2908611328 wait complete. 2026-03-10T12:32:52.446 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.443+0000 7f838ee36700 1 Processor -- start 2026-03-10T12:32:52.446 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.443+0000 7f838ee36700 1 -- start start 2026-03-10T12:32:52.446 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.443+0000 7f838ee36700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f838807e0e0 0x7f838807e4f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:52.446 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.443+0000 7f838ee36700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8378016070 con 0x7f838807e0e0 2026-03-10T12:32:52.447 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.447+0000 7f838cbd2700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f838807e0e0 0x7f838807e4f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:52.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.447+0000 7f838cbd2700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f838807e0e0 0x7f838807e4f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:32900/0 (socket says 192.168.123.100:32900) 2026-03-10T12:32:52.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.447+0000 7f838cbd2700 1 -- 192.168.123.100:0/739434917 learned_addr learned my addr 192.168.123.100:0/739434917 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:52.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.448+0000 7f838cbd2700 1 -- 192.168.123.100:0/739434917 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8378008440 con 0x7f838807e0e0 2026-03-10T12:32:52.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.448+0000 7f838cbd2700 1 --2- 192.168.123.100:0/739434917 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f838807e0e0 0x7f838807e4f0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f8378005ef0 tx=0x7f837800bf60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:52.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.448+0000 7f8385ffb700 1 -- 192.168.123.100:0/739434917 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8378012660 con 0x7f838807e0e0 2026-03-10T12:32:52.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.448+0000 7f838ee36700 1 -- 192.168.123.100:0/739434917 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f838807ea30 con 0x7f838807e0e0 2026-03-10T12:32:52.449 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.449+0000 7f838ee36700 1 -- 192.168.123.100:0/739434917 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f838807c940 con 0x7f838807e0e0 2026-03-10T12:32:52.450 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.449+0000 7f838ee36700 1 -- 
192.168.123.100:0/739434917 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8388062380 con 0x7f838807e0e0 2026-03-10T12:32:52.450 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.449+0000 7f8385ffb700 1 -- 192.168.123.100:0/739434917 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8378012ca0 con 0x7f838807e0e0 2026-03-10T12:32:52.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.450+0000 7f8385ffb700 1 -- 192.168.123.100:0/739434917 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f837801b440 con 0x7f838807e0e0 2026-03-10T12:32:52.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.450+0000 7f8385ffb700 1 -- 192.168.123.100:0/739434917 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 12) v1 ==== 45092+0+0 (secure 0 0 0) 0x7f837801b660 con 0x7f838807e0e0 2026-03-10T12:32:52.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.450+0000 7f8385ffb700 1 --2- 192.168.123.100:0/739434917 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8370038500 0x7f837003a9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:52.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.452+0000 7f8387fff700 1 --2- 192.168.123.100:0/739434917 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8370038500 0x7f837003a9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:52.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.452+0000 7f8385ffb700 1 -- 192.168.123.100:0/739434917 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f8378003700 con 0x7f838807e0e0 2026-03-10T12:32:52.453 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.453+0000 7f8387fff700 1 --2- 192.168.123.100:0/739434917 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8370038500 0x7f837003a9b0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f838000ad30 tx=0x7f83800093f0 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:52.454 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.454+0000 7f8385ffb700 1 -- 192.168.123.100:0/739434917 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8378024400 con 0x7f838807e0e0 2026-03-10T12:32:52.584 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:52.581+0000 7f838ee36700 1 -- 192.168.123.100:0/739434917 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm07", "target": ["mon-mgr", ""]}) v1 -- 0x7f838807d410 con 0x7f8370038500 2026-03-10T12:32:52.837 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:52 vm00 ceph-mon[50686]: from='client.14188 -' entity='client.admin' cmd=[{"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:52.837 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:52 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:52.837 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:52 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:52.837 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:52 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:52.837 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:52 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 
2026-03-10T12:32:52.837 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:52 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:52.837 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:52 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm00", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T12:32:52.837 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:52 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm00", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished 2026-03-10T12:32:52.837 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:52 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:32:53.603 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:53.601+0000 7f8385ffb700 1 -- 192.168.123.100:0/739434917 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f837800a360 con 0x7f838807e0e0 2026-03-10T12:32:53.767 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:53 vm00 ceph-mon[50686]: Deploying daemon crash.vm00 on vm00 2026-03-10T12:32:53.767 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:53 vm00 ceph-mon[50686]: from='client.14191 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm07", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:53.768 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:53 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:53.768 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:53 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:53.768 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:53 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:53.768 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:53 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:54.445 INFO:teuthology.orchestra.run.vm00.stdout:Added host 'vm07' with addr '192.168.123.107' 2026-03-10T12:32:54.445 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.443+0000 7f8385ffb700 1 -- 192.168.123.100:0/739434917 <== mgr.14164 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7f838807d410 con 0x7f8370038500 2026-03-10T12:32:54.447 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.447+0000 7f838ee36700 1 -- 192.168.123.100:0/739434917 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8370038500 msgr2=0x7f837003a9b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:54.447 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.447+0000 7f838ee36700 1 --2- 192.168.123.100:0/739434917 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8370038500 0x7f837003a9b0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f838000ad30 tx=0x7f83800093f0 comp rx=0 tx=0).stop 2026-03-10T12:32:54.447 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.447+0000 7f838ee36700 1 -- 192.168.123.100:0/739434917 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f838807e0e0 msgr2=0x7f838807e4f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:54.447 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.447+0000 7f838ee36700 1 --2- 192.168.123.100:0/739434917 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f838807e0e0 0x7f838807e4f0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f8378005ef0 tx=0x7f837800bf60 comp rx=0 tx=0).stop 2026-03-10T12:32:54.447 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.447+0000 7f838ee36700 1 -- 192.168.123.100:0/739434917 shutdown_connections 2026-03-10T12:32:54.447 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.447+0000 7f838ee36700 1 --2- 192.168.123.100:0/739434917 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8370038500 0x7f837003a9b0 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:54.447 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.447+0000 7f838ee36700 1 --2- 192.168.123.100:0/739434917 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f838807e0e0 0x7f838807e4f0 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:54.447 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.447+0000 7f838ee36700 1 -- 192.168.123.100:0/739434917 >> 192.168.123.100:0/739434917 conn(0x7f838806cd50 msgr2=0x7f838806e7c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:54.447 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.447+0000 7f838ee36700 1 -- 192.168.123.100:0/739434917 shutdown_connections 2026-03-10T12:32:54.447 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.447+0000 7f838ee36700 1 -- 192.168.123.100:0/739434917 wait complete. 
2026-03-10T12:32:54.520 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph orch host ls --format=json 2026-03-10T12:32:54.714 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:32:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:54 vm00 ceph-mon[50686]: Deploying cephadm binary to vm07 2026-03-10T12:32:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:54 vm00 ceph-mon[50686]: Deploying daemon node-exporter.vm00 on vm00 2026-03-10T12:32:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:54 vm00 ceph-mon[50686]: mgrmap e13: vm00.nescmq(active, since 6s) 2026-03-10T12:32:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:54 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:54.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.974+0000 7f16915c3700 1 -- 192.168.123.100:0/221479266 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f168c100b60 msgr2=0x7f168c102f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:54.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.974+0000 7f16915c3700 1 --2- 192.168.123.100:0/221479266 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f168c100b60 0x7f168c102f40 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f1674009b00 tx=0x7f1674009e10 comp rx=0 tx=0).stop 2026-03-10T12:32:54.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.975+0000 7f16915c3700 1 -- 192.168.123.100:0/221479266 shutdown_connections 2026-03-10T12:32:54.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.975+0000 7f16915c3700 1 --2- 192.168.123.100:0/221479266 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f168c100b60 0x7f168c102f40 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:54.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.975+0000 7f16915c3700 1 -- 192.168.123.100:0/221479266 >> 192.168.123.100:0/221479266 conn(0x7f168c0fa4a0 msgr2=0x7f168c0fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:54.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.975+0000 7f16915c3700 1 -- 192.168.123.100:0/221479266 shutdown_connections 2026-03-10T12:32:54.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.975+0000 7f16915c3700 1 -- 192.168.123.100:0/221479266 wait complete. 2026-03-10T12:32:54.976 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.975+0000 7f16915c3700 1 Processor -- start 2026-03-10T12:32:54.976 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.976+0000 7f16915c3700 1 -- start start 2026-03-10T12:32:54.976 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.976+0000 7f16915c3700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f168c100b60 0x7f168c195160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:54.976 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.976+0000 7f16915c3700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f168c1956a0 con 0x7f168c100b60 2026-03-10T12:32:54.976 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.976+0000 7f168affd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f168c100b60 0x7f168c195160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:54.976 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.976+0000 7f168affd700 1 --2- >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f168c100b60 0x7f168c195160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:32924/0 (socket says 192.168.123.100:32924) 2026-03-10T12:32:54.976 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.976+0000 7f168affd700 1 -- 192.168.123.100:0/692735881 learned_addr learned my addr 192.168.123.100:0/692735881 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:54.977 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.976+0000 7f168affd700 1 -- 192.168.123.100:0/692735881 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f16740097e0 con 0x7f168c100b60 2026-03-10T12:32:54.977 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.977+0000 7f168affd700 1 --2- 192.168.123.100:0/692735881 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f168c100b60 0x7f168c195160 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f1674004f40 tx=0x7f1674005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:54.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.977+0000 7f1683fff700 1 -- 192.168.123.100:0/692735881 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f167401c070 con 0x7f168c100b60 2026-03-10T12:32:54.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.977+0000 7f1683fff700 1 -- 192.168.123.100:0/692735881 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f16740053b0 con 0x7f168c100b60 2026-03-10T12:32:54.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.977+0000 7f1683fff700 1 -- 192.168.123.100:0/692735881 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f167400f460 
con 0x7f168c100b60 2026-03-10T12:32:54.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.977+0000 7f16915c3700 1 -- 192.168.123.100:0/692735881 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f168c1958a0 con 0x7f168c100b60 2026-03-10T12:32:54.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.977+0000 7f16915c3700 1 -- 192.168.123.100:0/692735881 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f168c195d40 con 0x7f168c100b60 2026-03-10T12:32:54.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.978+0000 7f1683fff700 1 -- 192.168.123.100:0/692735881 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f167400f5e0 con 0x7f168c100b60 2026-03-10T12:32:54.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.978+0000 7f1683fff700 1 --2- 192.168.123.100:0/692735881 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1678038540 0x7f167803a9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:54.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.978+0000 7f1683fff700 1 -- 192.168.123.100:0/692735881 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f167404d4b0 con 0x7f168c100b60 2026-03-10T12:32:54.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.978+0000 7f168a7fc700 1 --2- 192.168.123.100:0/692735881 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1678038540 0x7f167803a9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:54.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.979+0000 7f16915c3700 1 -- 192.168.123.100:0/692735881 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f168c18ee70 con 0x7f168c100b60 2026-03-10T12:32:54.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.982+0000 7f168a7fc700 1 --2- 192.168.123.100:0/692735881 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1678038540 0x7f167803a9f0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f167c006fd0 tx=0x7f167c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:54.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:54.982+0000 7f1683fff700 1 -- 192.168.123.100:0/692735881 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f1674026070 con 0x7f168c100b60 2026-03-10T12:32:55.091 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.090+0000 7f16915c3700 1 -- 192.168.123.100:0/692735881 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f168c061190 con 0x7f1678038540 2026-03-10T12:32:55.092 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.092+0000 7f1683fff700 1 -- 192.168.123.100:0/692735881 <== mgr.14164 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+155 (secure 0 0 0) 0x7f168c061190 con 0x7f1678038540 2026-03-10T12:32:55.093 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:32:55.093 INFO:teuthology.orchestra.run.vm00.stdout:[{"addr": "192.168.123.100", "hostname": "vm00", "labels": [], "status": ""}, {"addr": "192.168.123.107", "hostname": "vm07", "labels": [], "status": ""}] 2026-03-10T12:32:55.095 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.095+0000 7f16915c3700 1 -- 192.168.123.100:0/692735881 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1678038540 msgr2=0x7f167803a9f0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:55.095 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.095+0000 7f16915c3700 1 --2- 192.168.123.100:0/692735881 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1678038540 0x7f167803a9f0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f167c006fd0 tx=0x7f167c006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:55.095 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.095+0000 7f16915c3700 1 -- 192.168.123.100:0/692735881 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f168c100b60 msgr2=0x7f168c195160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:55.096 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.096+0000 7f16915c3700 1 --2- 192.168.123.100:0/692735881 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f168c100b60 0x7f168c195160 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f1674004f40 tx=0x7f1674005e70 comp rx=0 tx=0).stop 2026-03-10T12:32:55.096 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.096+0000 7f16915c3700 1 -- 192.168.123.100:0/692735881 shutdown_connections 2026-03-10T12:32:55.096 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.096+0000 7f16915c3700 1 --2- 192.168.123.100:0/692735881 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1678038540 0x7f167803a9f0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:55.096 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.096+0000 7f16915c3700 1 --2- 192.168.123.100:0/692735881 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f168c100b60 0x7f168c195160 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:55.097 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.097+0000 7f16915c3700 1 -- 192.168.123.100:0/692735881 >> 192.168.123.100:0/692735881 
conn(0x7f168c0fa4a0 msgr2=0x7f168c0fb170 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:55.097 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.097+0000 7f16915c3700 1 -- 192.168.123.100:0/692735881 shutdown_connections 2026-03-10T12:32:55.097 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.097+0000 7f16915c3700 1 -- 192.168.123.100:0/692735881 wait complete. 2026-03-10T12:32:55.163 INFO:tasks.cephadm:Setting crush tunables to default 2026-03-10T12:32:55.164 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd crush tunables default 2026-03-10T12:32:55.315 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:32:55.571 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.570+0000 7f298891b700 1 -- 192.168.123.100:0/2339228490 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f29840713c0 msgr2=0x7f29840717d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:55.571 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.570+0000 7f298891b700 1 --2- 192.168.123.100:0/2339228490 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f29840713c0 0x7f29840717d0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f2974009b00 tx=0x7f2974009e10 comp rx=0 tx=0).stop 2026-03-10T12:32:55.571 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.571+0000 7f298891b700 1 -- 192.168.123.100:0/2339228490 shutdown_connections 2026-03-10T12:32:55.571 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.571+0000 7f298891b700 1 --2- 192.168.123.100:0/2339228490 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f29840713c0 0x7f29840717d0 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:55.571 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.571+0000 7f298891b700 1 -- 192.168.123.100:0/2339228490 >> 192.168.123.100:0/2339228490 conn(0x7f298406cd30 msgr2=0x7f298406f180 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:55.571 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.571+0000 7f298891b700 1 -- 192.168.123.100:0/2339228490 shutdown_connections 2026-03-10T12:32:55.571 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.571+0000 7f298891b700 1 -- 192.168.123.100:0/2339228490 wait complete. 2026-03-10T12:32:55.572 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.572+0000 7f298891b700 1 Processor -- start 2026-03-10T12:32:55.572 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.572+0000 7f298891b700 1 -- start start 2026-03-10T12:32:55.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.572+0000 7f298891b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f29840713c0 0x7f29841acab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:55.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.572+0000 7f298891b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f29841acff0 con 0x7f29840713c0 2026-03-10T12:32:55.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.572+0000 7f29837fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f29840713c0 0x7f29841acab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:55.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.572+0000 7f29837fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f29840713c0 0x7f29841acab0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:32934/0 (socket says 192.168.123.100:32934) 2026-03-10T12:32:55.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.572+0000 7f29837fe700 1 -- 192.168.123.100:0/3403791245 learned_addr learned my addr 192.168.123.100:0/3403791245 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:32:55.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.572+0000 7f29837fe700 1 -- 192.168.123.100:0/3403791245 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f29740097e0 con 0x7f29840713c0 2026-03-10T12:32:55.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.572+0000 7f29837fe700 1 --2- 192.168.123.100:0/3403791245 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f29840713c0 0x7f29841acab0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f2974004f40 tx=0x7f2974005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:55.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.573+0000 7f29817fa700 1 -- 192.168.123.100:0/3403791245 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f297401c070 con 0x7f29840713c0 2026-03-10T12:32:55.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.573+0000 7f29817fa700 1 -- 192.168.123.100:0/3403791245 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f29740053b0 con 0x7f29840713c0 2026-03-10T12:32:55.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.573+0000 7f29817fa700 1 -- 192.168.123.100:0/3403791245 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f297400f550 con 0x7f29840713c0 2026-03-10T12:32:55.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.573+0000 7f298891b700 1 -- 
192.168.123.100:0/3403791245 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f29841ad1f0 con 0x7f29840713c0 2026-03-10T12:32:55.575 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.574+0000 7f298891b700 1 -- 192.168.123.100:0/3403791245 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f29841ad690 con 0x7f29840713c0 2026-03-10T12:32:55.575 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.574+0000 7f29817fa700 1 -- 192.168.123.100:0/3403791245 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f2974005520 con 0x7f29840713c0 2026-03-10T12:32:55.575 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.575+0000 7f29817fa700 1 --2- 192.168.123.100:0/3403791245 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2970038470 0x7f297003a920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:55.575 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.575+0000 7f29817fa700 1 -- 192.168.123.100:0/3403791245 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f2974020c00 con 0x7f29840713c0 2026-03-10T12:32:55.575 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.575+0000 7f297adff700 1 --2- 192.168.123.100:0/3403791245 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2970038470 0x7f297003a920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:55.575 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.575+0000 7f297adff700 1 --2- 192.168.123.100:0/3403791245 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2970038470 0x7f297003a920 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f296c006fd0 tx=0x7f296c006e40 comp rx=0 
tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:55.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.576+0000 7f298891b700 1 -- 192.168.123.100:0/3403791245 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2984110500 con 0x7f29840713c0 2026-03-10T12:32:55.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.578+0000 7f29817fa700 1 -- 192.168.123.100:0/3403791245 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2974026070 con 0x7f29840713c0 2026-03-10T12:32:55.691 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:55.691+0000 7f298891b700 1 -- 192.168.123.100:0/3403791245 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "osd crush tunables", "profile": "default"} v 0) v1 -- 0x7f2984062380 con 0x7f29840713c0 2026-03-10T12:32:55.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:55 vm00 ceph-mon[50686]: Added host vm07 2026-03-10T12:32:56.609 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:56.608+0000 7f29817fa700 1 -- 192.168.123.100:0/3403791245 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "osd crush tunables", "profile": "default"}]=0 adjusted tunables profile to default v4) v1 ==== 124+0+0 (secure 0 0 0) 0x7f2974052050 con 0x7f29840713c0 2026-03-10T12:32:56.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:56.611+0000 7f298891b700 1 -- 192.168.123.100:0/3403791245 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2970038470 msgr2=0x7f297003a920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:56.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:56.611+0000 7f298891b700 1 --2- 192.168.123.100:0/3403791245 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] 
conn(0x7f2970038470 0x7f297003a920 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f296c006fd0 tx=0x7f296c006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:56.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:56.611+0000 7f298891b700 1 -- 192.168.123.100:0/3403791245 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f29840713c0 msgr2=0x7f29841acab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:56.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:56.611+0000 7f298891b700 1 --2- 192.168.123.100:0/3403791245 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f29840713c0 0x7f29841acab0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f2974004f40 tx=0x7f2974005e70 comp rx=0 tx=0).stop 2026-03-10T12:32:56.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:56.611+0000 7f298891b700 1 -- 192.168.123.100:0/3403791245 shutdown_connections 2026-03-10T12:32:56.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:56.611+0000 7f298891b700 1 --2- 192.168.123.100:0/3403791245 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2970038470 0x7f297003a920 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:56.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:56.612+0000 7f298891b700 1 --2- 192.168.123.100:0/3403791245 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f29840713c0 0x7f29841acab0 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:56.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:56.612+0000 7f298891b700 1 -- 192.168.123.100:0/3403791245 >> 192.168.123.100:0/3403791245 conn(0x7f298406cd30 msgr2=0x7f298406e8c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:56.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:56.612+0000 7f298891b700 1 -- 192.168.123.100:0/3403791245 shutdown_connections 
2026-03-10T12:32:56.613 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:32:56.612+0000 7f298891b700 1 -- 192.168.123.100:0/3403791245 wait complete. 2026-03-10T12:32:56.613 INFO:teuthology.orchestra.run.vm00.stderr:adjusted tunables profile to default 2026-03-10T12:32:56.648 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:56 vm00 ceph-mon[50686]: from='client.14193 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T12:32:56.648 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:56 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/3403791245' entity='client.admin' cmd=[{"prefix": "osd crush tunables", "profile": "default"}]: dispatch 2026-03-10T12:32:56.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:56 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:56.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:56 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:56.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:56 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:56.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:56 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:56.700 INFO:tasks.cephadm:Adding mon.vm00 on vm00 2026-03-10T12:32:56.700 INFO:tasks.cephadm:Adding mon.vm07 on vm07 2026-03-10T12:32:56.700 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph orch apply mon '2;vm00:192.168.123.100=vm00;vm07:192.168.123.107=vm07' 2026-03-10T12:32:56.838 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:32:56.872 
INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:32:57.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:57 vm00 ceph-mon[50686]: Deploying daemon alertmanager.vm00 on vm00 2026-03-10T12:32:57.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:57 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/3403791245' entity='client.admin' cmd='[{"prefix": "osd crush tunables", "profile": "default"}]': finished 2026-03-10T12:32:57.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:57 vm00 ceph-mon[50686]: osdmap e4: 0 total, 0 up, 0 in 2026-03-10T12:32:57.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:57 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:58.003 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.002+0000 7f41b7f0e700 1 -- 192.168.123.107:0/4242242875 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f41b0074d00 msgr2=0x7f41b0073160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:58.003 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.002+0000 7f41b7f0e700 1 --2- 192.168.123.107:0/4242242875 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f41b0074d00 0x7f41b0073160 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f41a0009b00 tx=0x7f41a0009e10 comp rx=0 tx=0).stop 2026-03-10T12:32:58.003 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.003+0000 7f41b7f0e700 1 -- 192.168.123.107:0/4242242875 shutdown_connections 2026-03-10T12:32:58.003 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.003+0000 7f41b7f0e700 1 --2- 192.168.123.107:0/4242242875 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f41b0074d00 0x7f41b0073160 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:58.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.003+0000 
7f41b7f0e700 1 -- 192.168.123.107:0/4242242875 >> 192.168.123.107:0/4242242875 conn(0x7f41b00fb5e0 msgr2=0x7f41b00fda10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:58.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.003+0000 7f41b7f0e700 1 -- 192.168.123.107:0/4242242875 shutdown_connections 2026-03-10T12:32:58.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.003+0000 7f41b7f0e700 1 -- 192.168.123.107:0/4242242875 wait complete. 2026-03-10T12:32:58.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.004+0000 7f41b7f0e700 1 Processor -- start 2026-03-10T12:32:58.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.004+0000 7f41b7f0e700 1 -- start start 2026-03-10T12:32:58.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.004+0000 7f41b7f0e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f41b0074d00 0x7f41b0197380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:58.004 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.004+0000 7f41b7f0e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f41b01978c0 con 0x7f41b0074d00 2026-03-10T12:32:58.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.004+0000 7f41b5caa700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f41b0074d00 0x7f41b0197380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:58.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.004+0000 7f41b5caa700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f41b0074d00 0x7f41b0197380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:48300/0 (socket says 
192.168.123.107:48300) 2026-03-10T12:32:58.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.004+0000 7f41b5caa700 1 -- 192.168.123.107:0/463507830 learned_addr learned my addr 192.168.123.107:0/463507830 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:32:58.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.005+0000 7f41b5caa700 1 -- 192.168.123.107:0/463507830 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f41a00097e0 con 0x7f41b0074d00 2026-03-10T12:32:58.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.005+0000 7f41b5caa700 1 --2- 192.168.123.107:0/463507830 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f41b0074d00 0x7f41b0197380 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f41a0004f40 tx=0x7f41a0005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:58.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.005+0000 7f41a6ffd700 1 -- 192.168.123.107:0/463507830 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f41a001c070 con 0x7f41b0074d00 2026-03-10T12:32:58.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.005+0000 7f41b7f0e700 1 -- 192.168.123.107:0/463507830 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f41b0197ac0 con 0x7f41b0074d00 2026-03-10T12:32:58.005 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.005+0000 7f41b7f0e700 1 -- 192.168.123.107:0/463507830 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f41b0197f60 con 0x7f41b0074d00 2026-03-10T12:32:58.006 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.005+0000 7f41a6ffd700 1 -- 192.168.123.107:0/463507830 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f41a00053b0 
con 0x7f41b0074d00 2026-03-10T12:32:58.006 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.005+0000 7f41a6ffd700 1 -- 192.168.123.107:0/463507830 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f41a000f460 con 0x7f41b0074d00 2026-03-10T12:32:58.007 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.006+0000 7f41a6ffd700 1 -- 192.168.123.107:0/463507830 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f41a0021470 con 0x7f41b0074d00 2026-03-10T12:32:58.007 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.006+0000 7f41a6ffd700 1 --2- 192.168.123.107:0/463507830 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f419c038510 0x7f419c03a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:58.007 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.006+0000 7f41a6ffd700 1 -- 192.168.123.107:0/463507830 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f41a004c320 con 0x7f41b0074d00 2026-03-10T12:32:58.007 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.006+0000 7f41b7f0e700 1 -- 192.168.123.107:0/463507830 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4194005320 con 0x7f41b0074d00 2026-03-10T12:32:58.007 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.007+0000 7f41b54a9700 1 --2- 192.168.123.107:0/463507830 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f419c038510 0x7f419c03a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:58.007 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.007+0000 7f41b54a9700 1 --2- 192.168.123.107:0/463507830 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f419c038510 0x7f419c03a9c0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f41ac006fd0 tx=0x7f41ac006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:58.010 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.009+0000 7f41a6ffd700 1 -- 192.168.123.107:0/463507830 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f41a0026070 con 0x7f41b0074d00 2026-03-10T12:32:58.126 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.126+0000 7f41b7f0e700 1 -- 192.168.123.107:0/463507830 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "placement": "2;vm00:192.168.123.100=vm00;vm07:192.168.123.107=vm07", "target": ["mon-mgr", ""]}) v1 -- 0x7f4194000c90 con 0x7f419c038510 2026-03-10T12:32:58.132 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.132+0000 7f41a6ffd700 1 -- 192.168.123.107:0/463507830 <== mgr.14164 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f4194000c90 con 0x7f419c038510 2026-03-10T12:32:58.132 INFO:teuthology.orchestra.run.vm07.stdout:Scheduled mon update... 
2026-03-10T12:32:58.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.134+0000 7f41b7f0e700 1 -- 192.168.123.107:0/463507830 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f419c038510 msgr2=0x7f419c03a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:58.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.134+0000 7f41b7f0e700 1 --2- 192.168.123.107:0/463507830 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f419c038510 0x7f419c03a9c0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f41ac006fd0 tx=0x7f41ac006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:58.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.134+0000 7f41b7f0e700 1 -- 192.168.123.107:0/463507830 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f41b0074d00 msgr2=0x7f41b0197380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:58.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.134+0000 7f41b7f0e700 1 --2- 192.168.123.107:0/463507830 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f41b0074d00 0x7f41b0197380 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f41a0004f40 tx=0x7f41a0005e70 comp rx=0 tx=0).stop 2026-03-10T12:32:58.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.134+0000 7f41b7f0e700 1 -- 192.168.123.107:0/463507830 shutdown_connections 2026-03-10T12:32:58.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.135+0000 7f41b7f0e700 1 --2- 192.168.123.107:0/463507830 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f419c038510 0x7f419c03a9c0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:58.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.135+0000 7f41b7f0e700 1 --2- 192.168.123.107:0/463507830 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f41b0074d00 0x7f41b0197380 unknown 
:-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:58.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.135+0000 7f41b7f0e700 1 -- 192.168.123.107:0/463507830 >> 192.168.123.107:0/463507830 conn(0x7f41b00fb5e0 msgr2=0x7f41b00fc290 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:58.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.135+0000 7f41b7f0e700 1 -- 192.168.123.107:0/463507830 shutdown_connections 2026-03-10T12:32:58.135 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.135+0000 7f41b7f0e700 1 -- 192.168.123.107:0/463507830 wait complete. 2026-03-10T12:32:58.200 DEBUG:teuthology.orchestra.run.vm07:mon.vm07> sudo journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mon.vm07.service 2026-03-10T12:32:58.201 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T12:32:58.201 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:32:58.388 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:32:58.428 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:32:58.719 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.717+0000 7ff81897a700 1 -- 192.168.123.107:0/1095387992 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff810102240 msgr2=0x7ff810102650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:58.719 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.717+0000 7ff81897a700 1 --2- 192.168.123.107:0/1095387992 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff810102240 0x7ff810102650 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7ff7fc009b00 tx=0x7ff7fc009e10 comp rx=0 tx=0).stop 
2026-03-10T12:32:58.719 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.719+0000 7ff81897a700 1 -- 192.168.123.107:0/1095387992 shutdown_connections 2026-03-10T12:32:58.719 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.719+0000 7ff81897a700 1 --2- 192.168.123.107:0/1095387992 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff810102240 0x7ff810102650 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:58.719 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.719+0000 7ff81897a700 1 -- 192.168.123.107:0/1095387992 >> 192.168.123.107:0/1095387992 conn(0x7ff8100fd8d0 msgr2=0x7ff8100ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:58.719 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.719+0000 7ff81897a700 1 -- 192.168.123.107:0/1095387992 shutdown_connections 2026-03-10T12:32:58.720 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.719+0000 7ff81897a700 1 -- 192.168.123.107:0/1095387992 wait complete. 
2026-03-10T12:32:58.720 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.720+0000 7ff81897a700 1 Processor -- start 2026-03-10T12:32:58.720 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.720+0000 7ff81897a700 1 -- start start 2026-03-10T12:32:58.721 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.720+0000 7ff81897a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff810102240 0x7ff810197450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:58.721 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.720+0000 7ff81897a700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff810197990 con 0x7ff810102240 2026-03-10T12:32:58.721 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.721+0000 7ff816716700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff810102240 0x7ff810197450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:58.721 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.721+0000 7ff816716700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff810102240 0x7ff810197450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:48316/0 (socket says 192.168.123.107:48316) 2026-03-10T12:32:58.721 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.721+0000 7ff816716700 1 -- 192.168.123.107:0/97186462 learned_addr learned my addr 192.168.123.107:0/97186462 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:32:58.721 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.721+0000 7ff816716700 1 -- 192.168.123.107:0/97186462 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff7fc0097e0 con 0x7ff810102240 2026-03-10T12:32:58.722 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.721+0000 7ff816716700 1 --2- 192.168.123.107:0/97186462 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff810102240 0x7ff810197450 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7ff7fc004d40 tx=0x7ff7fc004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:58.723 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.722+0000 7ff80b7fe700 1 -- 192.168.123.107:0/97186462 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff7fc01c070 con 0x7ff810102240 2026-03-10T12:32:58.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.722+0000 7ff80b7fe700 1 -- 192.168.123.107:0/97186462 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff7fc0056f0 con 0x7ff810102240 2026-03-10T12:32:58.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.722+0000 7ff80b7fe700 1 -- 192.168.123.107:0/97186462 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff7fc017440 con 0x7ff810102240 2026-03-10T12:32:58.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.722+0000 7ff81897a700 1 -- 192.168.123.107:0/97186462 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff810197b90 con 0x7ff810102240 2026-03-10T12:32:58.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.722+0000 7ff81897a700 1 -- 192.168.123.107:0/97186462 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff810198030 con 0x7ff810102240 2026-03-10T12:32:58.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.723+0000 7ff80b7fe700 1 -- 192.168.123.107:0/97186462 <== mon.0 v2:192.168.123.100:3300/0 4 ==== 
mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7ff7fc00f460 con 0x7ff810102240 2026-03-10T12:32:58.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.723+0000 7ff81897a700 1 -- 192.168.123.107:0/97186462 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff810191090 con 0x7ff810102240 2026-03-10T12:32:58.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.723+0000 7ff80b7fe700 1 --2- 192.168.123.107:0/97186462 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff800038510 0x7ff80003a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:32:58.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.723+0000 7ff80b7fe700 1 -- 192.168.123.107:0/97186462 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7ff7fc04bfe0 con 0x7ff810102240 2026-03-10T12:32:58.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.724+0000 7ff815f15700 1 --2- 192.168.123.107:0/97186462 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff800038510 0x7ff80003a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:32:58.724 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.724+0000 7ff815f15700 1 --2- 192.168.123.107:0/97186462 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff800038510 0x7ff80003a9c0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7ff804006fd0 tx=0x7ff804006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:32:58.727 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.726+0000 7ff80b7fe700 1 -- 192.168.123.107:0/97186462 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+177933 (secure 0 0 0) 0x7ff7fc00f920 con 0x7ff810102240 2026-03-10T12:32:58.877 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.876+0000 7ff81897a700 1 -- 192.168.123.107:0/97186462 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7ff810062380 con 0x7ff810102240 2026-03-10T12:32:58.877 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.877+0000 7ff80b7fe700 1 -- 192.168.123.107:0/97186462 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7ff7fc025070 con 0x7ff810102240 2026-03-10T12:32:58.878 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:32:58.878 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:32:58.880 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.879+0000 7ff81897a700 1 -- 192.168.123.107:0/97186462 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff800038510 msgr2=0x7ff80003a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:58.880 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.879+0000 7ff81897a700 1 --2- 
192.168.123.107:0/97186462 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff800038510 0x7ff80003a9c0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7ff804006fd0 tx=0x7ff804006e40 comp rx=0 tx=0).stop 2026-03-10T12:32:58.880 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.879+0000 7ff81897a700 1 -- 192.168.123.107:0/97186462 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff810102240 msgr2=0x7ff810197450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:32:58.880 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.879+0000 7ff81897a700 1 --2- 192.168.123.107:0/97186462 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff810102240 0x7ff810197450 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7ff7fc004d40 tx=0x7ff7fc004e20 comp rx=0 tx=0).stop 2026-03-10T12:32:58.880 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.880+0000 7ff81897a700 1 -- 192.168.123.107:0/97186462 shutdown_connections 2026-03-10T12:32:58.880 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.880+0000 7ff81897a700 1 --2- 192.168.123.107:0/97186462 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff800038510 0x7ff80003a9c0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:58.880 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.880+0000 7ff81897a700 1 --2- 192.168.123.107:0/97186462 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff810102240 0x7ff810197450 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:32:58.880 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.880+0000 7ff81897a700 1 -- 192.168.123.107:0/97186462 >> 192.168.123.107:0/97186462 conn(0x7ff8100fd8d0 msgr2=0x7ff8100fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:32:58.880 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.880+0000 7ff81897a700 1 -- 192.168.123.107:0/97186462 shutdown_connections 2026-03-10T12:32:58.880 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:32:58.880+0000 7ff81897a700 1 -- 192.168.123.107:0/97186462 wait complete. 2026-03-10T12:32:58.881 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:32:59.409 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:59 vm00 ceph-mon[50686]: from='client.14197 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "placement": "2;vm00:192.168.123.100=vm00;vm07:192.168.123.107=vm07", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:32:59.409 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:59 vm00 ceph-mon[50686]: Saving service mon spec with placement vm00:192.168.123.100=vm00;vm07:192.168.123.107=vm07;count:2 2026-03-10T12:32:59.409 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:59 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:32:59.409 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:32:59 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/97186462' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:32:59.949 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T12:32:59.949 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:00.082 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:00.116 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:00.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.426+0000 7fb9f98fc700 1 -- 192.168.123.107:0/2277891983 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f40fef00 msgr2=0x7fb9f40ff310 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:00.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.426+0000 7fb9f98fc700 1 --2- 192.168.123.107:0/2277891983 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f40fef00 0x7fb9f40ff310 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7fb9dc009b00 tx=0x7fb9dc009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:00.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.427+0000 7fb9f98fc700 1 -- 192.168.123.107:0/2277891983 shutdown_connections 2026-03-10T12:33:00.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.427+0000 7fb9f98fc700 1 --2- 192.168.123.107:0/2277891983 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f40fef00 0x7fb9f40ff310 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:00.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.427+0000 7fb9f98fc700 1 -- 192.168.123.107:0/2277891983 >> 192.168.123.107:0/2277891983 conn(0x7fb9f40fa490 msgr2=0x7fb9f40fc8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:00.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.428+0000 7fb9f98fc700 1 -- 192.168.123.107:0/2277891983 
shutdown_connections 2026-03-10T12:33:00.428 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.428+0000 7fb9f98fc700 1 -- 192.168.123.107:0/2277891983 wait complete. 2026-03-10T12:33:00.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.428+0000 7fb9f98fc700 1 Processor -- start 2026-03-10T12:33:00.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.429+0000 7fb9f98fc700 1 -- start start 2026-03-10T12:33:00.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.429+0000 7fb9f98fc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f40fef00 0x7fb9f4192f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:00.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.429+0000 7fb9f98fc700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb9f4193480 con 0x7fb9f40fef00 2026-03-10T12:33:00.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.429+0000 7fb9f2ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f40fef00 0x7fb9f4192f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:00.430 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.429+0000 7fb9f2ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f40fef00 0x7fb9f4192f40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:48338/0 (socket says 192.168.123.107:48338) 2026-03-10T12:33:00.430 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.429+0000 7fb9f2ffd700 1 -- 192.168.123.107:0/3671412154 learned_addr learned my addr 192.168.123.107:0/3671412154 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:00.430 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.430+0000 7fb9f2ffd700 1 -- 192.168.123.107:0/3671412154 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb9dc0097e0 con 0x7fb9f40fef00 2026-03-10T12:33:00.430 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.430+0000 7fb9f2ffd700 1 --2- 192.168.123.107:0/3671412154 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f40fef00 0x7fb9f4192f40 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7fb9dc004f40 tx=0x7fb9dc005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:00.431 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.430+0000 7fb9f88fa700 1 -- 192.168.123.107:0/3671412154 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb9dc01c070 con 0x7fb9f40fef00 2026-03-10T12:33:00.431 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.430+0000 7fb9f88fa700 1 -- 192.168.123.107:0/3671412154 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb9dc0053b0 con 0x7fb9f40fef00 2026-03-10T12:33:00.431 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.430+0000 7fb9f88fa700 1 -- 192.168.123.107:0/3671412154 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb9dc00f460 con 0x7fb9f40fef00 2026-03-10T12:33:00.431 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.430+0000 7fb9f98fc700 1 -- 192.168.123.107:0/3671412154 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb9f4193680 con 0x7fb9f40fef00 2026-03-10T12:33:00.431 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.430+0000 7fb9f98fc700 1 -- 192.168.123.107:0/3671412154 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb9f4193b20 con 
0x7fb9f40fef00 2026-03-10T12:33:00.431 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.431+0000 7fb9f88fa700 1 -- 192.168.123.107:0/3671412154 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fb9dc00f5e0 con 0x7fb9f40fef00 2026-03-10T12:33:00.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.431+0000 7fb9f88fa700 1 --2- 192.168.123.107:0/3671412154 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb9e0038510 0x7fb9e003a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:00.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.431+0000 7fb9f88fa700 1 -- 192.168.123.107:0/3671412154 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fb9dc04d320 con 0x7fb9f40fef00 2026-03-10T12:33:00.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.432+0000 7fb9f27fc700 1 --2- 192.168.123.107:0/3671412154 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb9e0038510 0x7fb9e003a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:00.432 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.432+0000 7fb9f98fc700 1 -- 192.168.123.107:0/3671412154 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb9f418cbc0 con 0x7fb9f40fef00 2026-03-10T12:33:00.433 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.432+0000 7fb9f27fc700 1 --2- 192.168.123.107:0/3671412154 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb9e0038510 0x7fb9e003a9c0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fb9e4006fd0 tx=0x7fb9e4006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:00.435 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.435+0000 7fb9f88fa700 1 -- 192.168.123.107:0/3671412154 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb9dc026070 con 0x7fb9f40fef00 2026-03-10T12:33:00.583 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.583+0000 7fb9f98fc700 1 -- 192.168.123.107:0/3671412154 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fb9f404f9e0 con 0x7fb9f40fef00 2026-03-10T12:33:00.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.583+0000 7fb9f88fa700 1 -- 192.168.123.107:0/3671412154 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fb9dc029950 con 0x7fb9f40fef00 2026-03-10T12:33:00.584 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:00.584 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:00.586 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.586+0000 7fb9f98fc700 1 -- 192.168.123.107:0/3671412154 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb9e0038510 msgr2=0x7fb9e003a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:00.586 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.586+0000 7fb9f98fc700 1 --2- 192.168.123.107:0/3671412154 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb9e0038510 0x7fb9e003a9c0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fb9e4006fd0 tx=0x7fb9e4006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:00.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.586+0000 7fb9f98fc700 1 -- 192.168.123.107:0/3671412154 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f40fef00 msgr2=0x7fb9f4192f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:00.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.586+0000 7fb9f98fc700 1 --2- 192.168.123.107:0/3671412154 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f40fef00 0x7fb9f4192f40 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7fb9dc004f40 tx=0x7fb9dc005e70 comp rx=0 tx=0).stop 2026-03-10T12:33:00.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.586+0000 7fb9f98fc700 1 -- 192.168.123.107:0/3671412154 shutdown_connections 2026-03-10T12:33:00.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.586+0000 7fb9f98fc700 1 --2- 192.168.123.107:0/3671412154 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb9e0038510 0x7fb9e003a9c0 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:00.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.586+0000 7fb9f98fc700 1 --2- 192.168.123.107:0/3671412154 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f40fef00 0x7fb9f4192f40 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:00.587 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.587+0000 7fb9f98fc700 1 -- 192.168.123.107:0/3671412154 >> 192.168.123.107:0/3671412154 conn(0x7fb9f40fa490 msgr2=0x7fb9f40fb160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:00.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.587+0000 7fb9f98fc700 1 -- 192.168.123.107:0/3671412154 shutdown_connections 2026-03-10T12:33:00.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:00.587+0000 7fb9f98fc700 1 -- 192.168.123.107:0/3671412154 wait complete. 2026-03-10T12:33:00.588 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:01.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:01 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:01.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:01 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:01.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:01 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:01.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:01 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:01.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:01 vm00 ceph-mon[50686]: Regenerating cephadm self-signed grafana TLS certificates 2026-03-10T12:33:01.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:01 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:01.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:01 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:01.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:01 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard 
set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-10T12:33:01.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:01 vm00 ceph-mon[50686]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-10T12:33:01.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:01 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:01.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:01 vm00 ceph-mon[50686]: Deploying daemon grafana.vm00 on vm00 2026-03-10T12:33:01.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:01 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/3671412154' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:01.628 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T12:33:01.628 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:01.773 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:01.810 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:02.068 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.067+0000 7f96bc9c5700 1 -- 192.168.123.107:0/13297695 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f96b40fe8a0 msgr2=0x7f96b40fecb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:02.068 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.067+0000 7f96bc9c5700 1 --2- 192.168.123.107:0/13297695 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f96b40fe8a0 0x7f96b40fecb0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f96a0009b00 tx=0x7f96a0009e10 comp rx=0 tx=0).stop 
2026-03-10T12:33:02.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.068+0000 7f96bc9c5700 1 -- 192.168.123.107:0/13297695 shutdown_connections 2026-03-10T12:33:02.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.068+0000 7f96bc9c5700 1 --2- 192.168.123.107:0/13297695 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f96b40fe8a0 0x7f96b40fecb0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:02.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.068+0000 7f96bc9c5700 1 -- 192.168.123.107:0/13297695 >> 192.168.123.107:0/13297695 conn(0x7f96b40fa4f0 msgr2=0x7f96b40fc900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:02.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.068+0000 7f96bc9c5700 1 -- 192.168.123.107:0/13297695 shutdown_connections 2026-03-10T12:33:02.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.068+0000 7f96bc9c5700 1 -- 192.168.123.107:0/13297695 wait complete. 
2026-03-10T12:33:02.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.069+0000 7f96bc9c5700 1 Processor -- start 2026-03-10T12:33:02.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.069+0000 7f96bc9c5700 1 -- start start 2026-03-10T12:33:02.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.069+0000 7f96bc9c5700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f96b40fe8a0 0x7f96b4197390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:02.069 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.069+0000 7f96bc9c5700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f96b41978d0 con 0x7f96b40fe8a0 2026-03-10T12:33:02.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.069+0000 7f96ba761700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f96b40fe8a0 0x7f96b4197390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:02.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.069+0000 7f96ba761700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f96b40fe8a0 0x7f96b4197390 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:48370/0 (socket says 192.168.123.107:48370) 2026-03-10T12:33:02.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.069+0000 7f96ba761700 1 -- 192.168.123.107:0/675337193 learned_addr learned my addr 192.168.123.107:0/675337193 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:02.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.070+0000 7f96ba761700 1 -- 192.168.123.107:0/675337193 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f96a00097e0 con 0x7f96b40fe8a0 2026-03-10T12:33:02.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.070+0000 7f96ba761700 1 --2- 192.168.123.107:0/675337193 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f96b40fe8a0 0x7f96b4197390 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f96a0004f40 tx=0x7f96a0005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:02.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.070+0000 7f96af7fe700 1 -- 192.168.123.107:0/675337193 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f96a001c070 con 0x7f96b40fe8a0 2026-03-10T12:33:02.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.070+0000 7f96bc9c5700 1 -- 192.168.123.107:0/675337193 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f96b4197ad0 con 0x7f96b40fe8a0 2026-03-10T12:33:02.070 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.070+0000 7f96bc9c5700 1 -- 192.168.123.107:0/675337193 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f96b4197f70 con 0x7f96b40fe8a0 2026-03-10T12:33:02.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.070+0000 7f96af7fe700 1 -- 192.168.123.107:0/675337193 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f96a00053b0 con 0x7f96b40fe8a0 2026-03-10T12:33:02.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.070+0000 7f96af7fe700 1 -- 192.168.123.107:0/675337193 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f96a000f460 con 0x7f96b40fe8a0 2026-03-10T12:33:02.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.071+0000 7f96af7fe700 1 -- 192.168.123.107:0/675337193 <== mon.0 v2:192.168.123.100:3300/0 4 
==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f96a0021470 con 0x7f96b40fe8a0 2026-03-10T12:33:02.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.071+0000 7f96af7fe700 1 --2- 192.168.123.107:0/675337193 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f96a4038510 0x7f96a403a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:02.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.071+0000 7f96af7fe700 1 -- 192.168.123.107:0/675337193 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f96a004c340 con 0x7f96b40fe8a0 2026-03-10T12:33:02.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.071+0000 7f96bc9c5700 1 -- 192.168.123.107:0/675337193 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9698005320 con 0x7f96b40fe8a0 2026-03-10T12:33:02.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.072+0000 7f96b9f60700 1 --2- 192.168.123.107:0/675337193 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f96a4038510 0x7f96a403a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:02.072 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.072+0000 7f96b9f60700 1 --2- 192.168.123.107:0/675337193 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f96a4038510 0x7f96a403a9c0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f96a8006fd0 tx=0x7f96a8006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:02.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.075+0000 7f96af7fe700 1 -- 192.168.123.107:0/675337193 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f96a0026070 con 0x7f96b40fe8a0 2026-03-10T12:33:02.221 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.219+0000 7f96bc9c5700 1 -- 192.168.123.107:0/675337193 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f9698005190 con 0x7f96b40fe8a0 2026-03-10T12:33:02.221 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.221+0000 7f96af7fe700 1 -- 192.168.123.107:0/675337193 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f96a0021770 con 0x7f96b40fe8a0 2026-03-10T12:33:02.221 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:02.221 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:02.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.223+0000 7f96bc9c5700 1 -- 192.168.123.107:0/675337193 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f96a4038510 msgr2=0x7f96a403a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:02.223 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.223+0000 7f96bc9c5700 1 --2- 192.168.123.107:0/675337193 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f96a4038510 0x7f96a403a9c0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f96a8006fd0 tx=0x7f96a8006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:02.223 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.223+0000 7f96bc9c5700 1 -- 192.168.123.107:0/675337193 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f96b40fe8a0 msgr2=0x7f96b4197390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:02.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.223+0000 7f96bc9c5700 1 --2- 192.168.123.107:0/675337193 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f96b40fe8a0 0x7f96b4197390 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f96a0004f40 tx=0x7f96a0005e70 comp rx=0 tx=0).stop 2026-03-10T12:33:02.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.223+0000 7f96bc9c5700 1 -- 192.168.123.107:0/675337193 shutdown_connections 2026-03-10T12:33:02.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.223+0000 7f96bc9c5700 1 --2- 192.168.123.107:0/675337193 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f96a4038510 0x7f96a403a9c0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:02.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.223+0000 7f96bc9c5700 1 --2- 192.168.123.107:0/675337193 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f96b40fe8a0 0x7f96b4197390 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:02.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.223+0000 7f96bc9c5700 1 -- 192.168.123.107:0/675337193 >> 192.168.123.107:0/675337193 conn(0x7f96b40fa4f0 msgr2=0x7f96b40fb150 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T12:33:02.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.223+0000 7f96bc9c5700 1 -- 192.168.123.107:0/675337193 shutdown_connections 2026-03-10T12:33:02.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:02.223+0000 7f96bc9c5700 1 -- 192.168.123.107:0/675337193 wait complete. 2026-03-10T12:33:02.224 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:03.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:02 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:03.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:02 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/675337193' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:03.286 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T12:33:03.286 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:03.439 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:03.478 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:03.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.769+0000 7fa9528ec700 1 -- 192.168.123.107:0/3034523316 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa94c102240 msgr2=0x7fa94c102650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:03.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.769+0000 7fa9528ec700 1 --2- 192.168.123.107:0/3034523316 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa94c102240 0x7fa94c102650 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7fa934009b00 tx=0x7fa934009e10 comp rx=0 
tx=0).stop 2026-03-10T12:33:03.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.769+0000 7fa9528ec700 1 -- 192.168.123.107:0/3034523316 shutdown_connections 2026-03-10T12:33:03.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.769+0000 7fa9528ec700 1 --2- 192.168.123.107:0/3034523316 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa94c102240 0x7fa94c102650 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:03.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.769+0000 7fa9528ec700 1 -- 192.168.123.107:0/3034523316 >> 192.168.123.107:0/3034523316 conn(0x7fa94c0fd8d0 msgr2=0x7fa94c0ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:03.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.769+0000 7fa9528ec700 1 -- 192.168.123.107:0/3034523316 shutdown_connections 2026-03-10T12:33:03.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.769+0000 7fa9528ec700 1 -- 192.168.123.107:0/3034523316 wait complete. 
2026-03-10T12:33:03.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.770+0000 7fa9528ec700 1 Processor -- start 2026-03-10T12:33:03.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.770+0000 7fa9528ec700 1 -- start start 2026-03-10T12:33:03.770 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.770+0000 7fa9528ec700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa94c102240 0x7fa94c197450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:03.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.770+0000 7fa9528ec700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa94c197990 con 0x7fa94c102240 2026-03-10T12:33:03.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.770+0000 7fa94bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa94c102240 0x7fa94c197450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:03.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.770+0000 7fa94bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa94c102240 0x7fa94c197450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:56124/0 (socket says 192.168.123.107:56124) 2026-03-10T12:33:03.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.770+0000 7fa94bfff700 1 -- 192.168.123.107:0/3351021103 learned_addr learned my addr 192.168.123.107:0/3351021103 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:03.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.771+0000 7fa94bfff700 1 -- 192.168.123.107:0/3351021103 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa9340097e0 con 0x7fa94c102240 2026-03-10T12:33:03.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.771+0000 7fa94bfff700 1 --2- 192.168.123.107:0/3351021103 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa94c102240 0x7fa94c197450 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fa934004d40 tx=0x7fa934004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:03.771 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.771+0000 7fa9497fa700 1 -- 192.168.123.107:0/3351021103 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa93401c070 con 0x7fa94c102240 2026-03-10T12:33:03.773 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.771+0000 7fa9497fa700 1 -- 192.168.123.107:0/3351021103 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa9340056f0 con 0x7fa94c102240 2026-03-10T12:33:03.773 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.771+0000 7fa9528ec700 1 -- 192.168.123.107:0/3351021103 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa94c197b90 con 0x7fa94c102240 2026-03-10T12:33:03.773 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.771+0000 7fa9528ec700 1 -- 192.168.123.107:0/3351021103 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa94c198030 con 0x7fa94c102240 2026-03-10T12:33:03.773 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.772+0000 7fa9497fa700 1 -- 192.168.123.107:0/3351021103 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa934017440 con 0x7fa94c102240 2026-03-10T12:33:03.773 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.772+0000 7fa9497fa700 1 -- 192.168.123.107:0/3351021103 <== mon.0 
v2:192.168.123.100:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fa9340176a0 con 0x7fa94c102240 2026-03-10T12:33:03.773 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.772+0000 7fa9528ec700 1 -- 192.168.123.107:0/3351021103 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa94c191090 con 0x7fa94c102240 2026-03-10T12:33:03.773 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.772+0000 7fa9497fa700 1 --2- 192.168.123.107:0/3351021103 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa938038250 0x7fa93803a700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:03.773 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.772+0000 7fa9497fa700 1 -- 192.168.123.107:0/3351021103 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fa93404d210 con 0x7fa94c102240 2026-03-10T12:33:03.773 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.773+0000 7fa94b7fe700 1 --2- 192.168.123.107:0/3351021103 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa938038250 0x7fa93803a700 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:03.774 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.773+0000 7fa94b7fe700 1 --2- 192.168.123.107:0/3351021103 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa938038250 0x7fa93803a700 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fa93c006fd0 tx=0x7fa93c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:03.775 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.775+0000 7fa9497fa700 1 -- 192.168.123.107:0/3351021103 <== mon.0 v2:192.168.123.100:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa93402a430 con 0x7fa94c102240 2026-03-10T12:33:03.920 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.917+0000 7fa9528ec700 1 -- 192.168.123.107:0/3351021103 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fa94c062380 con 0x7fa94c102240 2026-03-10T12:33:03.920 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.919+0000 7fa9497fa700 1 -- 192.168.123.107:0/3351021103 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fa934025030 con 0x7fa94c102240 2026-03-10T12:33:03.920 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:03.920 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:03.922 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.922+0000 7fa9528ec700 1 -- 192.168.123.107:0/3351021103 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa938038250 msgr2=0x7fa93803a700 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:03.923 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.922+0000 7fa9528ec700 1 --2- 192.168.123.107:0/3351021103 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa938038250 0x7fa93803a700 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fa93c006fd0 tx=0x7fa93c006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:03.923 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.922+0000 7fa9528ec700 1 -- 192.168.123.107:0/3351021103 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa94c102240 msgr2=0x7fa94c197450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:03.923 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.922+0000 7fa9528ec700 1 --2- 192.168.123.107:0/3351021103 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa94c102240 0x7fa94c197450 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fa934004d40 tx=0x7fa934004e20 comp rx=0 tx=0).stop 2026-03-10T12:33:03.923 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.922+0000 7fa9528ec700 1 -- 192.168.123.107:0/3351021103 shutdown_connections 2026-03-10T12:33:03.923 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.922+0000 7fa9528ec700 1 --2- 192.168.123.107:0/3351021103 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa938038250 0x7fa93803a700 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:03.923 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.922+0000 7fa9528ec700 1 --2- 192.168.123.107:0/3351021103 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa94c102240 0x7fa94c197450 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:03.923 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.922+0000 7fa9528ec700 1 -- 192.168.123.107:0/3351021103 >> 192.168.123.107:0/3351021103 conn(0x7fa94c0fd8d0 msgr2=0x7fa94c0fe530 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T12:33:03.923 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.923+0000 7fa9528ec700 1 -- 192.168.123.107:0/3351021103 shutdown_connections 2026-03-10T12:33:03.923 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:03.923+0000 7fa9528ec700 1 -- 192.168.123.107:0/3351021103 wait complete. 2026-03-10T12:33:03.924 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:04.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:04 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/3351021103' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:04.972 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T12:33:04.972 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:05.123 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:05.157 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:05.434 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.433+0000 7fc731c1c700 1 -- 192.168.123.107:0/360370495 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc72c102230 msgr2=0x7fc72c102640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:05.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.433+0000 7fc731c1c700 1 --2- 192.168.123.107:0/360370495 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc72c102230 0x7fc72c102640 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fc714009b00 tx=0x7fc714009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:05.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.434+0000 7fc731c1c700 1 -- 192.168.123.107:0/360370495 shutdown_connections 
2026-03-10T12:33:05.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.434+0000 7fc731c1c700 1 --2- 192.168.123.107:0/360370495 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc72c102230 0x7fc72c102640 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:05.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.434+0000 7fc731c1c700 1 -- 192.168.123.107:0/360370495 >> 192.168.123.107:0/360370495 conn(0x7fc72c0fd8d0 msgr2=0x7fc72c0ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:05.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.434+0000 7fc731c1c700 1 -- 192.168.123.107:0/360370495 shutdown_connections 2026-03-10T12:33:05.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.435+0000 7fc731c1c700 1 -- 192.168.123.107:0/360370495 wait complete. 2026-03-10T12:33:05.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.435+0000 7fc731c1c700 1 Processor -- start 2026-03-10T12:33:05.435 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.435+0000 7fc731c1c700 1 -- start start 2026-03-10T12:33:05.436 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.435+0000 7fc731c1c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc72c102230 0x7fc72c1973a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:05.436 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.435+0000 7fc731c1c700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc72c1978e0 con 0x7fc72c102230 2026-03-10T12:33:05.436 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.435+0000 7fc72b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc72c102230 0x7fc72c1973a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T12:33:05.436 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.436+0000 7fc72b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc72c102230 0x7fc72c1973a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:56140/0 (socket says 192.168.123.107:56140) 2026-03-10T12:33:05.436 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.436+0000 7fc72b7fe700 1 -- 192.168.123.107:0/4014511345 learned_addr learned my addr 192.168.123.107:0/4014511345 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:05.436 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.436+0000 7fc72b7fe700 1 -- 192.168.123.107:0/4014511345 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc7140097e0 con 0x7fc72c102230 2026-03-10T12:33:05.436 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.436+0000 7fc72b7fe700 1 --2- 192.168.123.107:0/4014511345 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc72c102230 0x7fc72c1973a0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fc714004f40 tx=0x7fc714005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:05.436 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.436+0000 7fc728ff9700 1 -- 192.168.123.107:0/4014511345 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc71401c070 con 0x7fc72c102230 2026-03-10T12:33:05.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.436+0000 7fc728ff9700 1 -- 192.168.123.107:0/4014511345 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc7140053b0 con 0x7fc72c102230 2026-03-10T12:33:05.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.436+0000 7fc731c1c700 1 -- 
192.168.123.107:0/4014511345 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc72c197ae0 con 0x7fc72c102230 2026-03-10T12:33:05.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.436+0000 7fc731c1c700 1 -- 192.168.123.107:0/4014511345 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc72c197f80 con 0x7fc72c102230 2026-03-10T12:33:05.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.437+0000 7fc728ff9700 1 -- 192.168.123.107:0/4014511345 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc71400f460 con 0x7fc72c102230 2026-03-10T12:33:05.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.437+0000 7fc731c1c700 1 -- 192.168.123.107:0/4014511345 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc70c005320 con 0x7fc72c102230 2026-03-10T12:33:05.438 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.437+0000 7fc728ff9700 1 -- 192.168.123.107:0/4014511345 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fc71400f6c0 con 0x7fc72c102230 2026-03-10T12:33:05.439 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.438+0000 7fc728ff9700 1 --2- 192.168.123.107:0/4014511345 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc720038510 0x7fc72003a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:05.439 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.438+0000 7fc728ff9700 1 -- 192.168.123.107:0/4014511345 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fc71404d540 con 0x7fc72c102230 2026-03-10T12:33:05.439 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.438+0000 7fc72affd700 1 --2- 192.168.123.107:0/4014511345 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc720038510 0x7fc72003a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:05.439 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.439+0000 7fc72affd700 1 --2- 192.168.123.107:0/4014511345 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc720038510 0x7fc72003a9c0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fc71c006fd0 tx=0x7fc71c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:05.441 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.441+0000 7fc728ff9700 1 -- 192.168.123.107:0/4014511345 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc714029bb0 con 0x7fc72c102230 2026-03-10T12:33:05.595 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.594+0000 7fc731c1c700 1 -- 192.168.123.107:0/4014511345 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fc70c005190 con 0x7fc72c102230 2026-03-10T12:33:05.595 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.595+0000 7fc728ff9700 1 -- 192.168.123.107:0/4014511345 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fc714026030 con 0x7fc72c102230 2026-03-10T12:33:05.595 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:05.595 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:05.597 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.597+0000 7fc731c1c700 1 -- 192.168.123.107:0/4014511345 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc720038510 msgr2=0x7fc72003a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:05.598 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.597+0000 7fc731c1c700 1 --2- 192.168.123.107:0/4014511345 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc720038510 0x7fc72003a9c0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fc71c006fd0 tx=0x7fc71c006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:05.598 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.597+0000 7fc731c1c700 1 -- 192.168.123.107:0/4014511345 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc72c102230 msgr2=0x7fc72c1973a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:05.598 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.597+0000 7fc731c1c700 1 --2- 192.168.123.107:0/4014511345 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc72c102230 0x7fc72c1973a0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fc714004f40 tx=0x7fc714005e70 comp rx=0 tx=0).stop 2026-03-10T12:33:05.598 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.597+0000 7fc731c1c700 1 -- 192.168.123.107:0/4014511345 shutdown_connections 2026-03-10T12:33:05.598 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.597+0000 7fc731c1c700 1 --2- 192.168.123.107:0/4014511345 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc720038510 0x7fc72003a9c0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:05.598 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.597+0000 7fc731c1c700 1 --2- 192.168.123.107:0/4014511345 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc72c102230 0x7fc72c1973a0 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:05.598 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.597+0000 7fc731c1c700 1 -- 192.168.123.107:0/4014511345 >> 192.168.123.107:0/4014511345 conn(0x7fc72c0fd8d0 msgr2=0x7fc72c0fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:05.598 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.598+0000 7fc731c1c700 1 -- 192.168.123.107:0/4014511345 shutdown_connections 2026-03-10T12:33:05.598 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:05.598+0000 7fc731c1c700 1 -- 192.168.123.107:0/4014511345 wait complete. 2026-03-10T12:33:05.599 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:05.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:05 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/4014511345' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:06.668 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T12:33:06.668 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:06.810 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:06.844 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:07.092 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.091+0000 7fba69bcb700 1 -- 192.168.123.107:0/1071313356 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fba64102230 msgr2=0x7fba64102640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:07.092 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.091+0000 7fba69bcb700 1 --2- 192.168.123.107:0/1071313356 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fba64102230 0x7fba64102640 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7fba54009b00 tx=0x7fba54009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:07.093 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.092+0000 7fba69bcb700 1 -- 192.168.123.107:0/1071313356 shutdown_connections 2026-03-10T12:33:07.093 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.092+0000 7fba69bcb700 1 --2- 192.168.123.107:0/1071313356 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fba64102230 0x7fba64102640 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:07.093 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.092+0000 7fba69bcb700 1 -- 192.168.123.107:0/1071313356 >> 192.168.123.107:0/1071313356 conn(0x7fba640fd8d0 msgr2=0x7fba640ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:07.093 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.092+0000 7fba69bcb700 1 -- 192.168.123.107:0/1071313356 
shutdown_connections 2026-03-10T12:33:07.093 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.092+0000 7fba69bcb700 1 -- 192.168.123.107:0/1071313356 wait complete. 2026-03-10T12:33:07.093 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.093+0000 7fba69bcb700 1 Processor -- start 2026-03-10T12:33:07.093 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.093+0000 7fba69bcb700 1 -- start start 2026-03-10T12:33:07.093 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.093+0000 7fba69bcb700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fba64102230 0x7fba64197380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:07.093 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.093+0000 7fba69bcb700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fba641978c0 con 0x7fba64102230 2026-03-10T12:33:07.094 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.093+0000 7fba637fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fba64102230 0x7fba64197380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:07.094 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.094+0000 7fba637fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fba64102230 0x7fba64197380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:56146/0 (socket says 192.168.123.107:56146) 2026-03-10T12:33:07.094 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.094+0000 7fba637fe700 1 -- 192.168.123.107:0/4195641515 learned_addr learned my addr 192.168.123.107:0/4195641515 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:07.094 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.094+0000 7fba637fe700 1 -- 192.168.123.107:0/4195641515 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fba540097e0 con 0x7fba64102230 2026-03-10T12:33:07.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.094+0000 7fba637fe700 1 --2- 192.168.123.107:0/4195641515 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fba64102230 0x7fba64197380 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7fba54004750 tx=0x7fba54005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:07.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.095+0000 7fba617fa700 1 -- 192.168.123.107:0/4195641515 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fba5401c070 con 0x7fba64102230 2026-03-10T12:33:07.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.095+0000 7fba69bcb700 1 -- 192.168.123.107:0/4195641515 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fba64197ac0 con 0x7fba64102230 2026-03-10T12:33:07.095 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.095+0000 7fba69bcb700 1 -- 192.168.123.107:0/4195641515 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fba64197f60 con 0x7fba64102230 2026-03-10T12:33:07.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.095+0000 7fba617fa700 1 -- 192.168.123.107:0/4195641515 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fba54021470 con 0x7fba64102230 2026-03-10T12:33:07.096 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.095+0000 7fba617fa700 1 -- 192.168.123.107:0/4195641515 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fba5400f460 con 
0x7fba64102230 2026-03-10T12:33:07.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.096+0000 7fba617fa700 1 -- 192.168.123.107:0/4195641515 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fba5400f6d0 con 0x7fba64102230 2026-03-10T12:33:07.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.096+0000 7fba617fa700 1 --2- 192.168.123.107:0/4195641515 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fba44038510 0x7fba4403a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:07.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.096+0000 7fba69bcb700 1 -- 192.168.123.107:0/4195641515 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fba64191080 con 0x7fba64102230 2026-03-10T12:33:07.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.096+0000 7fba617fa700 1 -- 192.168.123.107:0/4195641515 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fba5404d4a0 con 0x7fba64102230 2026-03-10T12:33:07.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.097+0000 7fba5bfff700 1 --2- 192.168.123.107:0/4195641515 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fba44038510 0x7fba4403a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:07.097 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.097+0000 7fba5bfff700 1 --2- 192.168.123.107:0/4195641515 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fba44038510 0x7fba4403a9c0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fba4c006fd0 tx=0x7fba4c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:07.100 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.099+0000 7fba617fa700 1 -- 192.168.123.107:0/4195641515 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fba54026070 con 0x7fba64102230 2026-03-10T12:33:07.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.245+0000 7fba69bcb700 1 -- 192.168.123.107:0/4195641515 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fba64062380 con 0x7fba64102230 2026-03-10T12:33:07.246 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.246+0000 7fba617fa700 1 -- 192.168.123.107:0/4195641515 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fba54030300 con 0x7fba64102230 2026-03-10T12:33:07.247 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:07.247 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:07.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.249+0000 7fba69bcb700 1 -- 192.168.123.107:0/4195641515 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fba44038510 msgr2=0x7fba4403a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:07.249 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.249+0000 7fba69bcb700 1 --2- 192.168.123.107:0/4195641515 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fba44038510 0x7fba4403a9c0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fba4c006fd0 tx=0x7fba4c006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:07.250 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.249+0000 7fba69bcb700 1 -- 192.168.123.107:0/4195641515 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fba64102230 msgr2=0x7fba64197380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:07.250 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.249+0000 7fba69bcb700 1 --2- 192.168.123.107:0/4195641515 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fba64102230 0x7fba64197380 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7fba54004750 tx=0x7fba54005dc0 comp rx=0 tx=0).stop 2026-03-10T12:33:07.250 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.250+0000 7fba69bcb700 1 -- 192.168.123.107:0/4195641515 shutdown_connections 2026-03-10T12:33:07.250 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.250+0000 7fba69bcb700 1 --2- 192.168.123.107:0/4195641515 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fba44038510 0x7fba4403a9c0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:07.250 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.250+0000 7fba69bcb700 1 --2- 192.168.123.107:0/4195641515 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fba64102230 0x7fba64197380 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:07.250 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.250+0000 7fba69bcb700 1 -- 192.168.123.107:0/4195641515 >> 192.168.123.107:0/4195641515 conn(0x7fba640fd8d0 msgr2=0x7fba640fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:07.251 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.250+0000 7fba69bcb700 1 -- 192.168.123.107:0/4195641515 shutdown_connections 2026-03-10T12:33:07.251 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:07.251+0000 7fba69bcb700 1 -- 192.168.123.107:0/4195641515 wait complete. 2026-03-10T12:33:07.252 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:07 vm00 ceph-mon[50686]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T12:33:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:07 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/4195641515' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:08.318 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T12:33:08.318 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:08.480 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:08.523 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:08.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.800+0000 7fc84683b700 1 -- 192.168.123.107:0/90335427 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc8400fe910 msgr2=0x7fc8400fed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:08.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.800+0000 7fc84683b700 1 --2- 192.168.123.107:0/90335427 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc8400fe910 0x7fc8400fed20 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7fc830009b00 tx=0x7fc830009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:08.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.801+0000 7fc84683b700 1 -- 192.168.123.107:0/90335427 shutdown_connections 2026-03-10T12:33:08.801 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.801+0000 7fc84683b700 1 --2- 192.168.123.107:0/90335427 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc8400fe910 0x7fc8400fed20 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:08.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.801+0000 7fc84683b700 1 -- 192.168.123.107:0/90335427 >> 192.168.123.107:0/90335427 conn(0x7fc8400fa4a0 msgr2=0x7fc8400fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:08.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.801+0000 7fc84683b700 1 -- 192.168.123.107:0/90335427 shutdown_connections 
2026-03-10T12:33:08.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.801+0000 7fc84683b700 1 -- 192.168.123.107:0/90335427 wait complete. 2026-03-10T12:33:08.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.802+0000 7fc84683b700 1 Processor -- start 2026-03-10T12:33:08.802 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.802+0000 7fc84683b700 1 -- start start 2026-03-10T12:33:08.803 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.802+0000 7fc84683b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc8400fe910 0x7fc8401973b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:08.803 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.802+0000 7fc84683b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc8401978f0 con 0x7fc8400fe910 2026-03-10T12:33:08.803 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.803+0000 7fc83ffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc8400fe910 0x7fc8401973b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:08.803 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.803+0000 7fc83ffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc8400fe910 0x7fc8401973b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:56170/0 (socket says 192.168.123.107:56170) 2026-03-10T12:33:08.803 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.803+0000 7fc83ffff700 1 -- 192.168.123.107:0/1815668206 learned_addr learned my addr 192.168.123.107:0/1815668206 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:08.803 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.803+0000 7fc83ffff700 1 -- 192.168.123.107:0/1815668206 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc8300097e0 con 0x7fc8400fe910 2026-03-10T12:33:08.804 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.803+0000 7fc83ffff700 1 --2- 192.168.123.107:0/1815668206 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc8400fe910 0x7fc8401973b0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fc830004f40 tx=0x7fc830005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:08.804 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.803+0000 7fc83dffb700 1 -- 192.168.123.107:0/1815668206 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc83001d070 con 0x7fc8400fe910 2026-03-10T12:33:08.804 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.804+0000 7fc84683b700 1 -- 192.168.123.107:0/1815668206 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc840197af0 con 0x7fc8400fe910 2026-03-10T12:33:08.804 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.804+0000 7fc83dffb700 1 -- 192.168.123.107:0/1815668206 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc830022470 con 0x7fc8400fe910 2026-03-10T12:33:08.805 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.804+0000 7fc84683b700 1 -- 192.168.123.107:0/1815668206 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc840197f90 con 0x7fc8400fe910 2026-03-10T12:33:08.805 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.804+0000 7fc83dffb700 1 -- 192.168.123.107:0/1815668206 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc83000f460 con 
0x7fc8400fe910 2026-03-10T12:33:08.806 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.805+0000 7fc84683b700 1 -- 192.168.123.107:0/1815668206 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc840191050 con 0x7fc8400fe910 2026-03-10T12:33:08.806 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.806+0000 7fc83dffb700 1 -- 192.168.123.107:0/1815668206 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7fc830022ae0 con 0x7fc8400fe910 2026-03-10T12:33:08.806 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.806+0000 7fc83dffb700 1 --2- 192.168.123.107:0/1815668206 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc820038550 0x7fc82003aa00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:08.806 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.806+0000 7fc83dffb700 1 -- 192.168.123.107:0/1815668206 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fc83004c310 con 0x7fc8400fe910 2026-03-10T12:33:08.806 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.806+0000 7fc837fff700 1 --2- 192.168.123.107:0/1815668206 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc820038550 0x7fc82003aa00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:08.807 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.806+0000 7fc837fff700 1 --2- 192.168.123.107:0/1815668206 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc820038550 0x7fc82003aa00 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fc828006fd0 tx=0x7fc828006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:08.809 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.809+0000 7fc83dffb700 1 -- 192.168.123.107:0/1815668206 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc83002a360 con 0x7fc8400fe910 2026-03-10T12:33:08.952 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.951+0000 7fc84683b700 1 -- 192.168.123.107:0/1815668206 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fc840062380 con 0x7fc8400fe910 2026-03-10T12:33:08.953 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.953+0000 7fc83dffb700 1 -- 192.168.123.107:0/1815668206 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fc830027070 con 0x7fc8400fe910 2026-03-10T12:33:08.954 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:08.954 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:08.956 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.955+0000 7fc84683b700 1 -- 192.168.123.107:0/1815668206 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc820038550 msgr2=0x7fc82003aa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:08.956 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.955+0000 7fc84683b700 1 --2- 192.168.123.107:0/1815668206 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc820038550 0x7fc82003aa00 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fc828006fd0 tx=0x7fc828006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:08.956 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.956+0000 7fc84683b700 1 -- 192.168.123.107:0/1815668206 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc8400fe910 msgr2=0x7fc8401973b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:08.956 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.956+0000 7fc84683b700 1 --2- 192.168.123.107:0/1815668206 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc8400fe910 0x7fc8401973b0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fc830004f40 tx=0x7fc830005e70 comp rx=0 tx=0).stop 2026-03-10T12:33:08.956 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.956+0000 7fc84683b700 1 -- 192.168.123.107:0/1815668206 shutdown_connections 2026-03-10T12:33:08.956 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.956+0000 7fc84683b700 1 --2- 192.168.123.107:0/1815668206 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc820038550 0x7fc82003aa00 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:08.957 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.956+0000 7fc84683b700 1 --2- 192.168.123.107:0/1815668206 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc8400fe910 0x7fc8401973b0 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:08.957 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.956+0000 7fc84683b700 1 -- 192.168.123.107:0/1815668206 >> 192.168.123.107:0/1815668206 conn(0x7fc8400fa4a0 msgr2=0x7fc8400fb170 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:08.957 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.956+0000 7fc84683b700 1 -- 192.168.123.107:0/1815668206 shutdown_connections 2026-03-10T12:33:08.957 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:08.956+0000 7fc84683b700 1 -- 192.168.123.107:0/1815668206 wait complete. 2026-03-10T12:33:08.958 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:10.031 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T12:33:10.031 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:10.184 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:10.225 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:10.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:09 vm00 ceph-mon[50686]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T12:33:10.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:09 vm00 ceph-mon[50686]: from='client.? 
192.168.123.107:0/1815668206' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:10.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.480+0000 7f276b671700 1 -- 192.168.123.107:0/380144024 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f27640fe8f0 msgr2=0x7f27640fed00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:10.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.480+0000 7f276b671700 1 --2- 192.168.123.107:0/380144024 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f27640fe8f0 0x7f27640fed00 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f2754009b00 tx=0x7f2754009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:10.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.481+0000 7f276b671700 1 -- 192.168.123.107:0/380144024 shutdown_connections 2026-03-10T12:33:10.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.481+0000 7f276b671700 1 --2- 192.168.123.107:0/380144024 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f27640fe8f0 0x7f27640fed00 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:10.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.481+0000 7f276b671700 1 -- 192.168.123.107:0/380144024 >> 192.168.123.107:0/380144024 conn(0x7f27640fa4a0 msgr2=0x7f27640fc8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:10.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.481+0000 7f276b671700 1 -- 192.168.123.107:0/380144024 shutdown_connections 2026-03-10T12:33:10.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.481+0000 7f276b671700 1 -- 192.168.123.107:0/380144024 wait complete. 
2026-03-10T12:33:10.482 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.481+0000 7f276b671700 1 Processor -- start 2026-03-10T12:33:10.482 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.482+0000 7f276b671700 1 -- start start 2026-03-10T12:33:10.482 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.482+0000 7f276b671700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f27640fe8f0 0x7f2764197380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:10.482 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.482+0000 7f276b671700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f27641978c0 con 0x7f27640fe8f0 2026-03-10T12:33:10.482 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.482+0000 7f276940d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f27640fe8f0 0x7f2764197380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:10.482 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.482+0000 7f276940d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f27640fe8f0 0x7f2764197380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:56200/0 (socket says 192.168.123.107:56200) 2026-03-10T12:33:10.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.482+0000 7f276940d700 1 -- 192.168.123.107:0/432512900 learned_addr learned my addr 192.168.123.107:0/432512900 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:10.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.482+0000 7f276940d700 1 -- 192.168.123.107:0/432512900 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f27540097e0 con 0x7f27640fe8f0 2026-03-10T12:33:10.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.482+0000 7f276940d700 1 --2- 192.168.123.107:0/432512900 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f27640fe8f0 0x7f2764197380 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f2754004f40 tx=0x7f2754005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:10.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.483+0000 7f275a7fc700 1 -- 192.168.123.107:0/432512900 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f275401c070 con 0x7f27640fe8f0 2026-03-10T12:33:10.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.483+0000 7f276b671700 1 -- 192.168.123.107:0/432512900 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2764197ac0 con 0x7f27640fe8f0 2026-03-10T12:33:10.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.483+0000 7f276b671700 1 -- 192.168.123.107:0/432512900 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2764197f60 con 0x7f27640fe8f0 2026-03-10T12:33:10.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.483+0000 7f275a7fc700 1 -- 192.168.123.107:0/432512900 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f27540053b0 con 0x7f27640fe8f0 2026-03-10T12:33:10.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.483+0000 7f275a7fc700 1 -- 192.168.123.107:0/432512900 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f275400f550 con 0x7f27640fe8f0 2026-03-10T12:33:10.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.484+0000 7f275a7fc700 1 -- 192.168.123.107:0/432512900 <== mon.0 v2:192.168.123.100:3300/0 4 
==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f275400f770 con 0x7f27640fe8f0 2026-03-10T12:33:10.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.484+0000 7f276b671700 1 -- 192.168.123.107:0/432512900 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2748005320 con 0x7f27640fe8f0 2026-03-10T12:33:10.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.484+0000 7f275a7fc700 1 --2- 192.168.123.107:0/432512900 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2750038510 0x7f275003a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:10.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.484+0000 7f275a7fc700 1 -- 192.168.123.107:0/432512900 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f275404d4b0 con 0x7f27640fe8f0 2026-03-10T12:33:10.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.484+0000 7f2768c0c700 1 --2- 192.168.123.107:0/432512900 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2750038510 0x7f275003a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:10.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.485+0000 7f2768c0c700 1 --2- 192.168.123.107:0/432512900 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2750038510 0x7f275003a9c0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f2760006fd0 tx=0x7f2760006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:10.487 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.487+0000 7f275a7fc700 1 -- 192.168.123.107:0/432512900 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2754029930 con 0x7f27640fe8f0 2026-03-10T12:33:10.628 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.627+0000 7f276b671700 1 -- 192.168.123.107:0/432512900 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f2748005190 con 0x7f27640fe8f0 2026-03-10T12:33:10.629 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.629+0000 7f275a7fc700 1 -- 192.168.123.107:0/432512900 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f2754026030 con 0x7f27640fe8f0 2026-03-10T12:33:10.629 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:10.629 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:10.632 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.631+0000 7f276b671700 1 -- 192.168.123.107:0/432512900 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2750038510 msgr2=0x7f275003a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:10.632 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.631+0000 7f276b671700 1 --2- 192.168.123.107:0/432512900 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2750038510 0x7f275003a9c0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f2760006fd0 tx=0x7f2760006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:10.632 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.631+0000 7f276b671700 1 -- 192.168.123.107:0/432512900 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f27640fe8f0 msgr2=0x7f2764197380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:10.632 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.631+0000 7f276b671700 1 --2- 192.168.123.107:0/432512900 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f27640fe8f0 0x7f2764197380 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f2754004f40 tx=0x7f2754005e70 comp rx=0 tx=0).stop 2026-03-10T12:33:10.632 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.632+0000 7f276b671700 1 -- 192.168.123.107:0/432512900 shutdown_connections 2026-03-10T12:33:10.632 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.632+0000 7f276b671700 1 --2- 192.168.123.107:0/432512900 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2750038510 0x7f275003a9c0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:10.632 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.632+0000 7f276b671700 1 --2- 192.168.123.107:0/432512900 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f27640fe8f0 0x7f2764197380 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:10.632 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.632+0000 7f276b671700 1 -- 192.168.123.107:0/432512900 >> 192.168.123.107:0/432512900 conn(0x7f27640fa4a0 msgr2=0x7f27640fb170 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T12:33:10.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.632+0000 7f276b671700 1 -- 192.168.123.107:0/432512900 shutdown_connections 2026-03-10T12:33:10.633 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:10.632+0000 7f276b671700 1 -- 192.168.123.107:0/432512900 wait complete. 2026-03-10T12:33:10.633 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:11.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:11 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/432512900' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:11.705 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T12:33:11.705 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:11.853 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:11.898 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:12.183 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.182+0000 7f189ccef700 1 -- 192.168.123.107:0/108629871 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1898102240 msgr2=0x7f1898102650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:12.183 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.182+0000 7f189ccef700 1 --2- 192.168.123.107:0/108629871 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1898102240 0x7f1898102650 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f1880009b00 tx=0x7f1880009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:12.183 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.182+0000 7f189ccef700 1 -- 192.168.123.107:0/108629871 shutdown_connections 
2026-03-10T12:33:12.183 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.182+0000 7f189ccef700 1 --2- 192.168.123.107:0/108629871 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1898102240 0x7f1898102650 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:12.183 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.182+0000 7f189ccef700 1 -- 192.168.123.107:0/108629871 >> 192.168.123.107:0/108629871 conn(0x7f18980fd8d0 msgr2=0x7f18980ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:12.183 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.183+0000 7f189ccef700 1 -- 192.168.123.107:0/108629871 shutdown_connections 2026-03-10T12:33:12.183 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.183+0000 7f189ccef700 1 -- 192.168.123.107:0/108629871 wait complete. 2026-03-10T12:33:12.184 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.183+0000 7f189ccef700 1 Processor -- start 2026-03-10T12:33:12.184 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.184+0000 7f189ccef700 1 -- start start 2026-03-10T12:33:12.184 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.184+0000 7f189ccef700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1898102240 0x7f1898197450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:12.184 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.184+0000 7f189ccef700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1898197990 con 0x7f1898102240 2026-03-10T12:33:12.184 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.184+0000 7f189659c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1898102240 0x7f1898197450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T12:33:12.185 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.184+0000 7f189659c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1898102240 0x7f1898197450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:56212/0 (socket says 192.168.123.107:56212) 2026-03-10T12:33:12.185 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.184+0000 7f189659c700 1 -- 192.168.123.107:0/211495757 learned_addr learned my addr 192.168.123.107:0/211495757 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:12.185 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.185+0000 7f189659c700 1 -- 192.168.123.107:0/211495757 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f18800097e0 con 0x7f1898102240 2026-03-10T12:33:12.185 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.185+0000 7f189659c700 1 --2- 192.168.123.107:0/211495757 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1898102240 0x7f1898197450 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f1880004d40 tx=0x7f1880004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:12.186 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.185+0000 7f188f7fe700 1 -- 192.168.123.107:0/211495757 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f188001c070 con 0x7f1898102240 2026-03-10T12:33:12.186 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.185+0000 7f188f7fe700 1 -- 192.168.123.107:0/211495757 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f18800056f0 con 0x7f1898102240 2026-03-10T12:33:12.186 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.185+0000 7f188f7fe700 1 -- 
192.168.123.107:0/211495757 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1880017440 con 0x7f1898102240 2026-03-10T12:33:12.187 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.185+0000 7f189ccef700 1 -- 192.168.123.107:0/211495757 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1898197b90 con 0x7f1898102240 2026-03-10T12:33:12.187 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.185+0000 7f189ccef700 1 -- 192.168.123.107:0/211495757 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1898198030 con 0x7f1898102240 2026-03-10T12:33:12.187 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.187+0000 7f188f7fe700 1 -- 192.168.123.107:0/211495757 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f18800175a0 con 0x7f1898102240 2026-03-10T12:33:12.187 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.187+0000 7f189ccef700 1 -- 192.168.123.107:0/211495757 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1898191090 con 0x7f1898102240 2026-03-10T12:33:12.187 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.187+0000 7f188f7fe700 1 --2- 192.168.123.107:0/211495757 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1884038510 0x7f188403a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:12.187 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.187+0000 7f188f7fe700 1 -- 192.168.123.107:0/211495757 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f188004d0e0 con 0x7f1898102240 2026-03-10T12:33:12.187 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.187+0000 7f1895d9b700 1 --2- 192.168.123.107:0/211495757 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1884038510 0x7f188403a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:12.188 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.188+0000 7f1895d9b700 1 --2- 192.168.123.107:0/211495757 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1884038510 0x7f188403a9c0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f1888006fd0 tx=0x7f1888006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:12.190 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.190+0000 7f188f7fe700 1 -- 192.168.123.107:0/211495757 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f1880028950 con 0x7f1898102240 2026-03-10T12:33:12.327 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:12 vm00 ceph-mon[50686]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T12:33:12.327 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:12 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:12.327 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:12 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:12.327 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:12 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:12.327 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:12 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:12.327 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:12 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:12.327 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:12 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:12.334 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.334+0000 7f189ccef700 1 -- 192.168.123.107:0/211495757 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f1898062380 con 0x7f1898102240 2026-03-10T12:33:12.335 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.334+0000 7f188f7fe700 1 -- 192.168.123.107:0/211495757 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f1880025070 con 0x7f1898102240 2026-03-10T12:33:12.335 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:12.335 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:12.337 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.337+0000 7f189ccef700 1 -- 192.168.123.107:0/211495757 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1884038510 msgr2=0x7f188403a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:12.338 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.337+0000 7f189ccef700 1 --2- 192.168.123.107:0/211495757 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1884038510 0x7f188403a9c0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f1888006fd0 tx=0x7f1888006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:12.338 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.337+0000 7f189ccef700 1 -- 192.168.123.107:0/211495757 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1898102240 msgr2=0x7f1898197450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:12.338 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.337+0000 7f189ccef700 1 --2- 192.168.123.107:0/211495757 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1898102240 0x7f1898197450 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f1880004d40 tx=0x7f1880004e20 comp rx=0 tx=0).stop 2026-03-10T12:33:12.338 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.337+0000 7f189ccef700 1 -- 192.168.123.107:0/211495757 shutdown_connections 2026-03-10T12:33:12.338 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.337+0000 7f189ccef700 1 --2- 192.168.123.107:0/211495757 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1884038510 0x7f188403a9c0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:12.338 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.337+0000 7f189ccef700 1 --2- 192.168.123.107:0/211495757 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1898102240 0x7f1898197450 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:12.338 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.337+0000 7f189ccef700 1 -- 192.168.123.107:0/211495757 >> 192.168.123.107:0/211495757 conn(0x7f18980fd8d0 msgr2=0x7f18980fe530 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T12:33:12.338 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.337+0000 7f189ccef700 1 -- 192.168.123.107:0/211495757 shutdown_connections 2026-03-10T12:33:12.338 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:12.337+0000 7f189ccef700 1 -- 192.168.123.107:0/211495757 wait complete. 2026-03-10T12:33:12.339 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:13.406 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T12:33:13.406 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:13.543 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:13.572 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:13.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:13 vm00 ceph-mon[50686]: Deploying daemon prometheus.vm00 on vm00 2026-03-10T12:33:13.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:13 vm00 ceph-mon[50686]: from='client.? 
192.168.123.107:0/211495757' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:13.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:13 vm00 ceph-mon[50686]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T12:33:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.851+0000 7f2cd952a700 1 -- 192.168.123.107:0/942742952 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2cd4102230 msgr2=0x7f2cd4102640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.851+0000 7f2cd952a700 1 --2- 192.168.123.107:0/942742952 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2cd4102230 0x7f2cd4102640 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f2cbc009b00 tx=0x7f2cbc009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.852+0000 7f2cd952a700 1 -- 192.168.123.107:0/942742952 shutdown_connections 2026-03-10T12:33:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.852+0000 7f2cd952a700 1 --2- 192.168.123.107:0/942742952 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2cd4102230 0x7f2cd4102640 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.852+0000 7f2cd952a700 1 -- 192.168.123.107:0/942742952 >> 192.168.123.107:0/942742952 conn(0x7f2cd40fd8d0 msgr2=0x7f2cd40ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.853+0000 7f2cd952a700 1 -- 192.168.123.107:0/942742952 shutdown_connections 2026-03-10T12:33:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.853+0000 7f2cd952a700 1 -- 192.168.123.107:0/942742952 wait complete. 
2026-03-10T12:33:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.853+0000 7f2cd952a700 1 Processor -- start 2026-03-10T12:33:13.853 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.853+0000 7f2cd952a700 1 -- start start 2026-03-10T12:33:13.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.853+0000 7f2cd952a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2cd4102230 0x7f2cd4197380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:13.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.853+0000 7f2cd952a700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2cd41978c0 con 0x7f2cd4102230 2026-03-10T12:33:13.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.854+0000 7f2cd2ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2cd4102230 0x7f2cd4197380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:13.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.854+0000 7f2cd2ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2cd4102230 0x7f2cd4197380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:53316/0 (socket says 192.168.123.107:53316) 2026-03-10T12:33:13.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.854+0000 7f2cd2ffd700 1 -- 192.168.123.107:0/2561108005 learned_addr learned my addr 192.168.123.107:0/2561108005 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:13.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.854+0000 7f2cd2ffd700 1 -- 192.168.123.107:0/2561108005 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2cbc0097e0 con 0x7f2cd4102230 2026-03-10T12:33:13.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.854+0000 7f2cd2ffd700 1 --2- 192.168.123.107:0/2561108005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2cd4102230 0x7f2cd4197380 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f2cbc004750 tx=0x7f2cbc005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:13.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.855+0000 7f2ccbfff700 1 -- 192.168.123.107:0/2561108005 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2cbc01c070 con 0x7f2cd4102230 2026-03-10T12:33:13.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.855+0000 7f2ccbfff700 1 -- 192.168.123.107:0/2561108005 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2cbc021470 con 0x7f2cd4102230 2026-03-10T12:33:13.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.855+0000 7f2ccbfff700 1 -- 192.168.123.107:0/2561108005 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2cbc00f460 con 0x7f2cd4102230 2026-03-10T12:33:13.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.855+0000 7f2cd952a700 1 -- 192.168.123.107:0/2561108005 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2cd4197ac0 con 0x7f2cd4102230 2026-03-10T12:33:13.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.855+0000 7f2cd952a700 1 -- 192.168.123.107:0/2561108005 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2cd4197f60 con 0x7f2cd4102230 2026-03-10T12:33:13.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.856+0000 7f2ccbfff700 1 -- 192.168.123.107:0/2561108005 <== mon.0 
v2:192.168.123.100:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f2cbc021ac0 con 0x7f2cd4102230 2026-03-10T12:33:13.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.856+0000 7f2ccbfff700 1 --2- 192.168.123.107:0/2561108005 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2cc0038510 0x7f2cc003a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:13.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.856+0000 7f2ccbfff700 1 -- 192.168.123.107:0/2561108005 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f2cbc04c3e0 con 0x7f2cd4102230 2026-03-10T12:33:13.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.856+0000 7f2cd27fc700 1 --2- 192.168.123.107:0/2561108005 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2cc0038510 0x7f2cc003a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:13.856 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.856+0000 7f2cd952a700 1 -- 192.168.123.107:0/2561108005 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2cd4191080 con 0x7f2cd4102230 2026-03-10T12:33:13.857 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.856+0000 7f2cd27fc700 1 --2- 192.168.123.107:0/2561108005 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2cc0038510 0x7f2cc003a9c0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f2cc4006fd0 tx=0x7f2cc4006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:13.860 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:13.860+0000 7f2ccbfff700 1 -- 192.168.123.107:0/2561108005 <== mon.0 v2:192.168.123.100:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2cbc026070 con 0x7f2cd4102230 2026-03-10T12:33:14.006 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:14.005+0000 7f2cd952a700 1 -- 192.168.123.107:0/2561108005 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f2cd4062380 con 0x7f2cd4102230 2026-03-10T12:33:14.006 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:14.006+0000 7f2ccbfff700 1 -- 192.168.123.107:0/2561108005 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f2cbc029540 con 0x7f2cd4102230 2026-03-10T12:33:14.007 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:14.008 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:14.009 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:14.009+0000 7f2cd952a700 1 -- 192.168.123.107:0/2561108005 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2cc0038510 msgr2=0x7f2cc003a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:14.010 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:14.009+0000 7f2cd952a700 1 --2- 192.168.123.107:0/2561108005 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2cc0038510 0x7f2cc003a9c0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f2cc4006fd0 tx=0x7f2cc4006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:14.010 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:14.009+0000 7f2cd952a700 1 -- 192.168.123.107:0/2561108005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2cd4102230 msgr2=0x7f2cd4197380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:14.010 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:14.009+0000 7f2cd952a700 1 --2- 192.168.123.107:0/2561108005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2cd4102230 0x7f2cd4197380 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f2cbc004750 tx=0x7f2cbc005dc0 comp rx=0 tx=0).stop 2026-03-10T12:33:14.010 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:14.010+0000 7f2cd952a700 1 -- 192.168.123.107:0/2561108005 shutdown_connections 2026-03-10T12:33:14.010 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:14.010+0000 7f2cd952a700 1 --2- 192.168.123.107:0/2561108005 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2cc0038510 0x7f2cc003a9c0 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:14.010 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:14.010+0000 7f2cd952a700 1 --2- 192.168.123.107:0/2561108005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2cd4102230 0x7f2cd4197380 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:14.010 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:14.010+0000 7f2cd952a700 1 -- 192.168.123.107:0/2561108005 >> 192.168.123.107:0/2561108005 conn(0x7f2cd40fd8d0 msgr2=0x7f2cd40fe530 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T12:33:14.010 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:14.010+0000 7f2cd952a700 1 -- 192.168.123.107:0/2561108005 shutdown_connections 2026-03-10T12:33:14.011 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:14.010+0000 7f2cd952a700 1 -- 192.168.123.107:0/2561108005 wait complete. 2026-03-10T12:33:14.011 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:14.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:14 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/2561108005' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:15.082 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T12:33:15.083 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:15.228 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:15.274 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:15.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.548+0000 7f10585da700 1 -- 192.168.123.107:0/1459273479 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1050100230 msgr2=0x7f1050100640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:15.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.548+0000 7f10585da700 1 --2- 192.168.123.107:0/1459273479 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1050100230 0x7f1050100640 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f104c009b00 tx=0x7f104c009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:15.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.549+0000 7f10585da700 1 -- 192.168.123.107:0/1459273479 
shutdown_connections 2026-03-10T12:33:15.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.549+0000 7f10585da700 1 --2- 192.168.123.107:0/1459273479 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1050100230 0x7f1050100640 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:15.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.549+0000 7f10585da700 1 -- 192.168.123.107:0/1459273479 >> 192.168.123.107:0/1459273479 conn(0x7f10500fb7e0 msgr2=0x7f10500fdc10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:15.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.549+0000 7f10585da700 1 -- 192.168.123.107:0/1459273479 shutdown_connections 2026-03-10T12:33:15.549 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.549+0000 7f10585da700 1 -- 192.168.123.107:0/1459273479 wait complete. 2026-03-10T12:33:15.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.550+0000 7f10585da700 1 Processor -- start 2026-03-10T12:33:15.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.550+0000 7f10585da700 1 -- start start 2026-03-10T12:33:15.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.550+0000 7f10585da700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10501977c0 0x7f1050197bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:15.550 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.550+0000 7f10585da700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1050198110 con 0x7f10501977c0 2026-03-10T12:33:15.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.550+0000 7f1056376700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10501977c0 0x7f1050197bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:15.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.550+0000 7f1056376700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10501977c0 0x7f1050197bd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:53332/0 (socket says 192.168.123.107:53332) 2026-03-10T12:33:15.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.550+0000 7f1056376700 1 -- 192.168.123.107:0/2010865510 learned_addr learned my addr 192.168.123.107:0/2010865510 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:15.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.551+0000 7f1056376700 1 -- 192.168.123.107:0/2010865510 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f104c0097e0 con 0x7f10501977c0 2026-03-10T12:33:15.551 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.551+0000 7f1056376700 1 --2- 192.168.123.107:0/2010865510 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10501977c0 0x7f1050197bd0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f104c00b5c0 tx=0x7f104c005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:15.552 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.551+0000 7f10477fe700 1 -- 192.168.123.107:0/2010865510 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f104c01c070 con 0x7f10501977c0 2026-03-10T12:33:15.552 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.551+0000 7f10585da700 1 -- 192.168.123.107:0/2010865510 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1050198310 con 0x7f10501977c0 2026-03-10T12:33:15.552 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.552+0000 7f10585da700 1 -- 192.168.123.107:0/2010865510 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f105019af70 con 0x7f10501977c0 2026-03-10T12:33:15.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.552+0000 7f10477fe700 1 -- 192.168.123.107:0/2010865510 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f104c021470 con 0x7f10501977c0 2026-03-10T12:33:15.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.552+0000 7f10477fe700 1 -- 192.168.123.107:0/2010865510 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f104c021e70 con 0x7f10501977c0 2026-03-10T12:33:15.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.553+0000 7f10477fe700 1 -- 192.168.123.107:0/2010865510 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f104c0052c0 con 0x7f10501977c0 2026-03-10T12:33:15.553 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.553+0000 7f10585da700 1 -- 192.168.123.107:0/2010865510 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1034005320 con 0x7f10501977c0 2026-03-10T12:33:15.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.553+0000 7f10477fe700 1 --2- 192.168.123.107:0/2010865510 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f103c0384c0 0x7f103c03a970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:15.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.553+0000 7f10477fe700 1 -- 192.168.123.107:0/2010865510 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f104c04c1d0 con 0x7f10501977c0 2026-03-10T12:33:15.554 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.553+0000 7f1055b75700 1 --2- 192.168.123.107:0/2010865510 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f103c0384c0 0x7f103c03a970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:15.554 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.554+0000 7f1055b75700 1 --2- 192.168.123.107:0/2010865510 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f103c0384c0 0x7f103c03a970 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f1040006fd0 tx=0x7f1040006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:15.556 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.556+0000 7f10477fe700 1 -- 192.168.123.107:0/2010865510 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f104c0215e0 con 0x7f10501977c0 2026-03-10T12:33:15.703 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.702+0000 7f10585da700 1 -- 192.168.123.107:0/2010865510 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f1034005190 con 0x7f10501977c0 2026-03-10T12:33:15.704 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.703+0000 7f10477fe700 1 -- 192.168.123.107:0/2010865510 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f104c026070 con 0x7f10501977c0 2026-03-10T12:33:15.705 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:15.705 
INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:15.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.706+0000 7f10585da700 1 -- 192.168.123.107:0/2010865510 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f103c0384c0 msgr2=0x7f103c03a970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:15.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.706+0000 7f10585da700 1 --2- 192.168.123.107:0/2010865510 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f103c0384c0 0x7f103c03a970 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f1040006fd0 tx=0x7f1040006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:15.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.707+0000 7f10585da700 1 -- 192.168.123.107:0/2010865510 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10501977c0 msgr2=0x7f1050197bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:15.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.707+0000 7f10585da700 1 --2- 192.168.123.107:0/2010865510 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10501977c0 0x7f1050197bd0 secure :-1 s=READY pgs=126 cs=0 
l=1 rev1=1 crypto rx=0x7f104c00b5c0 tx=0x7f104c005e70 comp rx=0 tx=0).stop 2026-03-10T12:33:15.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.707+0000 7f10585da700 1 -- 192.168.123.107:0/2010865510 shutdown_connections 2026-03-10T12:33:15.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.707+0000 7f10585da700 1 --2- 192.168.123.107:0/2010865510 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f103c0384c0 0x7f103c03a970 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:15.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.707+0000 7f10585da700 1 --2- 192.168.123.107:0/2010865510 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10501977c0 0x7f1050197bd0 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:15.708 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.707+0000 7f10585da700 1 -- 192.168.123.107:0/2010865510 >> 192.168.123.107:0/2010865510 conn(0x7f10500fb7e0 msgr2=0x7f10500fdc10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:15.708 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.708+0000 7f10585da700 1 -- 192.168.123.107:0/2010865510 shutdown_connections 2026-03-10T12:33:15.708 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:15.708+0000 7f10585da700 1 -- 192.168.123.107:0/2010865510 wait complete. 2026-03-10T12:33:15.709 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:15.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:15 vm00 ceph-mon[50686]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T12:33:16.757 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:16 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/2010865510' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:16.772 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T12:33:16.772 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:16.919 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:16.960 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:17.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.228+0000 7f6ed1364700 1 -- 192.168.123.107:0/839428670 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6ecc100010 msgr2=0x7f6ecc100420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:17.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.228+0000 7f6ed1364700 1 --2- 192.168.123.107:0/839428670 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6ecc100010 0x7f6ecc100420 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f6eb4009b00 tx=0x7f6eb4009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:17.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.229+0000 7f6ed1364700 1 -- 192.168.123.107:0/839428670 shutdown_connections 2026-03-10T12:33:17.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.229+0000 7f6ed1364700 1 --2- 192.168.123.107:0/839428670 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6ecc100010 0x7f6ecc100420 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:17.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.229+0000 7f6ed1364700 1 -- 192.168.123.107:0/839428670 >> 192.168.123.107:0/839428670 conn(0x7f6ecc0fb5a0 msgr2=0x7f6ecc0fd9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:17.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.229+0000 7f6ed1364700 1 -- 192.168.123.107:0/839428670 
shutdown_connections 2026-03-10T12:33:17.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.229+0000 7f6ed1364700 1 -- 192.168.123.107:0/839428670 wait complete. 2026-03-10T12:33:17.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.230+0000 7f6ed1364700 1 Processor -- start 2026-03-10T12:33:17.230 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.230+0000 7f6ed1364700 1 -- start start 2026-03-10T12:33:17.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.230+0000 7f6ed1364700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6ecc100010 0x7f6ecc1973a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:17.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.230+0000 7f6ed1364700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ecc1978e0 con 0x7f6ecc100010 2026-03-10T12:33:17.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.231+0000 7f6ecaffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6ecc100010 0x7f6ecc1973a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:17.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.231+0000 7f6ecaffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6ecc100010 0x7f6ecc1973a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:53358/0 (socket says 192.168.123.107:53358) 2026-03-10T12:33:17.231 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.231+0000 7f6ecaffd700 1 -- 192.168.123.107:0/2456479883 learned_addr learned my addr 192.168.123.107:0/2456479883 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:17.231 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.231+0000 7f6ecaffd700 1 -- 192.168.123.107:0/2456479883 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6eb40097e0 con 0x7f6ecc100010 2026-03-10T12:33:17.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.231+0000 7f6ecaffd700 1 --2- 192.168.123.107:0/2456479883 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6ecc100010 0x7f6ecc1973a0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f6eb4004750 tx=0x7f6eb4005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:17.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.232+0000 7f6ec3fff700 1 -- 192.168.123.107:0/2456479883 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6eb401c070 con 0x7f6ecc100010 2026-03-10T12:33:17.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.232+0000 7f6ed1364700 1 -- 192.168.123.107:0/2456479883 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6ecc197ae0 con 0x7f6ecc100010 2026-03-10T12:33:17.232 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.232+0000 7f6ed1364700 1 -- 192.168.123.107:0/2456479883 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6ecc197f80 con 0x7f6ecc100010 2026-03-10T12:33:17.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.232+0000 7f6ec3fff700 1 -- 192.168.123.107:0/2456479883 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6eb4021470 con 0x7f6ecc100010 2026-03-10T12:33:17.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.232+0000 7f6ec3fff700 1 -- 192.168.123.107:0/2456479883 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6eb400f460 con 
0x7f6ecc100010 2026-03-10T12:33:17.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.233+0000 7f6ec3fff700 1 -- 192.168.123.107:0/2456479883 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 13) v1 ==== 45138+0+0 (secure 0 0 0) 0x7f6eb400f600 con 0x7f6ecc100010 2026-03-10T12:33:17.233 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.233+0000 7f6ed1364700 1 -- 192.168.123.107:0/2456479883 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6ecc191260 con 0x7f6ecc100010 2026-03-10T12:33:17.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.233+0000 7f6ec3fff700 1 --2- 192.168.123.107:0/2456479883 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6eb8038510 0x7f6eb803a9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:17.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.233+0000 7f6ec3fff700 1 -- 192.168.123.107:0/2456479883 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f6eb404d4a0 con 0x7f6ecc100010 2026-03-10T12:33:17.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.234+0000 7f6eca7fc700 1 --2- 192.168.123.107:0/2456479883 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6eb8038510 0x7f6eb803a9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:17.234 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.234+0000 7f6eca7fc700 1 --2- 192.168.123.107:0/2456479883 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6eb8038510 0x7f6eb803a9c0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f6ebc006fd0 tx=0x7f6ebc006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:17.237 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.237+0000 7f6ec3fff700 1 -- 192.168.123.107:0/2456479883 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6eb4026080 con 0x7f6ecc100010 2026-03-10T12:33:17.382 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.381+0000 7f6ed1364700 1 -- 192.168.123.107:0/2456479883 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f6ecc062380 con 0x7f6ecc100010 2026-03-10T12:33:17.382 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.381+0000 7f6ec3fff700 1 -- 192.168.123.107:0/2456479883 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f6eb4029720 con 0x7f6ecc100010 2026-03-10T12:33:17.382 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:17.382 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:17.385 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.384+0000 7f6ed1364700 1 -- 192.168.123.107:0/2456479883 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6eb8038510 msgr2=0x7f6eb803a9c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:17.385 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.384+0000 7f6ed1364700 1 --2- 192.168.123.107:0/2456479883 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6eb8038510 0x7f6eb803a9c0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f6ebc006fd0 tx=0x7f6ebc006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:17.385 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.384+0000 7f6ed1364700 1 -- 192.168.123.107:0/2456479883 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6ecc100010 msgr2=0x7f6ecc1973a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:17.385 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.384+0000 7f6ed1364700 1 --2- 192.168.123.107:0/2456479883 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6ecc100010 0x7f6ecc1973a0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f6eb4004750 tx=0x7f6eb4005dc0 comp rx=0 tx=0).stop 2026-03-10T12:33:17.385 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.384+0000 7f6ed1364700 1 -- 192.168.123.107:0/2456479883 shutdown_connections 2026-03-10T12:33:17.385 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.384+0000 7f6ed1364700 1 --2- 192.168.123.107:0/2456479883 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6eb8038510 0x7f6eb803a9c0 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:17.385 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.384+0000 7f6ed1364700 1 --2- 192.168.123.107:0/2456479883 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6ecc100010 0x7f6ecc1973a0 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:17.385 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.384+0000 7f6ed1364700 1 -- 192.168.123.107:0/2456479883 >> 192.168.123.107:0/2456479883 conn(0x7f6ecc0fb5a0 msgr2=0x7f6ecc0fd9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:17.385 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.385+0000 7f6ed1364700 1 -- 192.168.123.107:0/2456479883 shutdown_connections 2026-03-10T12:33:17.385 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:17.385+0000 7f6ed1364700 1 -- 192.168.123.107:0/2456479883 wait complete. 2026-03-10T12:33:17.386 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:18.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:17 vm00 ceph-mon[50686]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T12:33:18.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:17 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:18.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:17 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:18.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:17 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:18.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:17 vm00 ceph-mon[50686]: from='client.? 
192.168.123.107:0/2456479883' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:18.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:17 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:18.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:17 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch 2026-03-10T12:33:18.436 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T12:33:18.436 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:18.602 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:18.640 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:18.900 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.899+0000 7fa14caea700 1 -- 192.168.123.107:0/2769964092 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa148102230 msgr2=0x7fa148102640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:18.900 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.899+0000 7fa14caea700 1 --2- 192.168.123.107:0/2769964092 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa148102230 0x7fa148102640 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fa130009b00 tx=0x7fa130009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:18.901 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.900+0000 7fa14caea700 1 -- 192.168.123.107:0/2769964092 shutdown_connections 2026-03-10T12:33:18.901 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.900+0000 7fa14caea700 1 --2- 
192.168.123.107:0/2769964092 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa148102230 0x7fa148102640 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:18.901 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.900+0000 7fa14caea700 1 -- 192.168.123.107:0/2769964092 >> 192.168.123.107:0/2769964092 conn(0x7fa1480fd8d0 msgr2=0x7fa1480ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:18.901 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.900+0000 7fa14caea700 1 -- 192.168.123.107:0/2769964092 shutdown_connections 2026-03-10T12:33:18.901 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.900+0000 7fa14caea700 1 -- 192.168.123.107:0/2769964092 wait complete. 2026-03-10T12:33:18.901 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.901+0000 7fa14caea700 1 Processor -- start 2026-03-10T12:33:18.901 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.901+0000 7fa14caea700 1 -- start start 2026-03-10T12:33:18.902 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.901+0000 7fa14caea700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa148102230 0x7fa148197380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:18.902 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.901+0000 7fa14caea700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa1481978c0 con 0x7fa148102230 2026-03-10T12:33:18.902 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.901+0000 7fa14659c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa148102230 0x7fa148197380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:18.902 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.901+0000 
7fa14659c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa148102230 0x7fa148197380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:53378/0 (socket says 192.168.123.107:53378) 2026-03-10T12:33:18.902 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.901+0000 7fa14659c700 1 -- 192.168.123.107:0/1358371231 learned_addr learned my addr 192.168.123.107:0/1358371231 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:18.902 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.902+0000 7fa14659c700 1 -- 192.168.123.107:0/1358371231 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa1300097e0 con 0x7fa148102230 2026-03-10T12:33:18.902 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.902+0000 7fa14659c700 1 --2- 192.168.123.107:0/1358371231 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa148102230 0x7fa148197380 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fa130004750 tx=0x7fa130005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:18.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.902+0000 7fa13f7fe700 1 -- 192.168.123.107:0/1358371231 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa13001c070 con 0x7fa148102230 2026-03-10T12:33:18.903 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.902+0000 7fa14caea700 1 -- 192.168.123.107:0/1358371231 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa148197ac0 con 0x7fa148102230 2026-03-10T12:33:18.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.902+0000 7fa14caea700 1 -- 192.168.123.107:0/1358371231 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7fa148197f60 con 0x7fa148102230 2026-03-10T12:33:18.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.902+0000 7fa13f7fe700 1 -- 192.168.123.107:0/1358371231 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa130021470 con 0x7fa148102230 2026-03-10T12:33:18.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.902+0000 7fa13f7fe700 1 -- 192.168.123.107:0/1358371231 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa13000f460 con 0x7fa148102230 2026-03-10T12:33:18.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.903+0000 7fa13f7fe700 1 -- 192.168.123.107:0/1358371231 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 14) v1 ==== 45152+0+0 (secure 0 0 0) 0x7fa13000f5c0 con 0x7fa148102230 2026-03-10T12:33:18.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.903+0000 7fa13f7fe700 1 --2- 192.168.123.107:0/1358371231 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa1340385b0 0x7fa13403aa60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:18.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.903+0000 7fa13f7fe700 1 -- 192.168.123.107:0/1358371231 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fa13004d470 con 0x7fa148102230 2026-03-10T12:33:18.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.904+0000 7fa145d9b700 1 -- 192.168.123.107:0/1358371231 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa1340385b0 msgr2=0x7fa13403aa60 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.100:6800/2 2026-03-10T12:33:18.904 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.904+0000 7fa145d9b700 1 --2- 192.168.123.107:0/1358371231 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] 
conn(0x7fa1340385b0 0x7fa13403aa60 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T12:33:18.905 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.905+0000 7fa14caea700 1 -- 192.168.123.107:0/1358371231 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa128005320 con 0x7fa148102230 2026-03-10T12:33:18.908 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:18.908+0000 7fa13f7fe700 1 -- 192.168.123.107:0/1358371231 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa130026070 con 0x7fa148102230 2026-03-10T12:33:19.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:19.053+0000 7fa14caea700 1 -- 192.168.123.107:0/1358371231 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fa128005190 con 0x7fa148102230 2026-03-10T12:33:19.056 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:19.056+0000 7fa13f7fe700 1 -- 192.168.123.107:0/1358371231 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fa130029360 con 0x7fa148102230 2026-03-10T12:33:19.057 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:19.057 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:19.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:19.058+0000 7fa14caea700 1 -- 192.168.123.107:0/1358371231 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa1340385b0 msgr2=0x7fa13403aa60 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:33:19.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:19.058+0000 7fa14caea700 1 --2- 192.168.123.107:0/1358371231 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa1340385b0 0x7fa13403aa60 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:19.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:19.058+0000 7fa14caea700 1 -- 192.168.123.107:0/1358371231 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa148102230 msgr2=0x7fa148197380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:19.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:19.058+0000 7fa14caea700 1 --2- 192.168.123.107:0/1358371231 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa148102230 0x7fa148197380 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fa130004750 tx=0x7fa130005dc0 comp rx=0 tx=0).stop 2026-03-10T12:33:19.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:19.058+0000 7fa14caea700 1 -- 192.168.123.107:0/1358371231 shutdown_connections 2026-03-10T12:33:19.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:19.058+0000 7fa14caea700 1 --2- 
192.168.123.107:0/1358371231 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa1340385b0 0x7fa13403aa60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:19.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:19.058+0000 7fa14caea700 1 --2- 192.168.123.107:0/1358371231 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa148102230 0x7fa148197380 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:19.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:19.058+0000 7fa14caea700 1 -- 192.168.123.107:0/1358371231 >> 192.168.123.107:0/1358371231 conn(0x7fa1480fd8d0 msgr2=0x7fa1480fe530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:19.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:19.059+0000 7fa14caea700 1 -- 192.168.123.107:0/1358371231 shutdown_connections 2026-03-10T12:33:19.059 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:19.059+0000 7fa14caea700 1 -- 192.168.123.107:0/1358371231 wait complete. 2026-03-10T12:33:19.060 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:19.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:19 vm00 ceph-mon[50686]: from='mgr.14164 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished 2026-03-10T12:33:19.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:19 vm00 ceph-mon[50686]: mgrmap e14: vm00.nescmq(active, since 31s) 2026-03-10T12:33:19.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:19 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/1358371231' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:20.112 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T12:33:20.112 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:20.260 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:20.301 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:20.592 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.591+0000 7f3cbdfcc700 1 -- 192.168.123.107:0/2804063849 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3cb8075700 msgr2=0x7f3cb8075b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:20.592 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.591+0000 7f3cbdfcc700 1 --2- 192.168.123.107:0/2804063849 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3cb8075700 0x7f3cb8075b10 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f3ca8009b00 tx=0x7f3ca8009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:20.592 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.592+0000 7f3cbdfcc700 1 -- 192.168.123.107:0/2804063849 shutdown_connections 2026-03-10T12:33:20.592 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.592+0000 7f3cbdfcc700 1 --2- 192.168.123.107:0/2804063849 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3cb8075700 0x7f3cb8075b10 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:20.592 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.592+0000 7f3cbdfcc700 1 -- 192.168.123.107:0/2804063849 >> 192.168.123.107:0/2804063849 conn(0x7f3cb80fd8d0 msgr2=0x7f3cb80ffcb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:20.592 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.592+0000 7f3cbdfcc700 1 -- 192.168.123.107:0/2804063849 
shutdown_connections 2026-03-10T12:33:20.592 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.592+0000 7f3cbdfcc700 1 -- 192.168.123.107:0/2804063849 wait complete. 2026-03-10T12:33:20.593 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.592+0000 7f3cbdfcc700 1 Processor -- start 2026-03-10T12:33:20.593 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.592+0000 7f3cbdfcc700 1 -- start start 2026-03-10T12:33:20.593 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.593+0000 7f3cbdfcc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3cb8075700 0x7f3cb819b780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:20.593 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.593+0000 7f3cbdfcc700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3cb819bcc0 con 0x7f3cb8075700 2026-03-10T12:33:20.593 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.593+0000 7f3cb77fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3cb8075700 0x7f3cb819b780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:20.593 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.593+0000 7f3cb77fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3cb8075700 0x7f3cb819b780 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:53392/0 (socket says 192.168.123.107:53392) 2026-03-10T12:33:20.593 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.593+0000 7f3cb77fe700 1 -- 192.168.123.107:0/1123212936 learned_addr learned my addr 192.168.123.107:0/1123212936 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:20.593 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.593+0000 7f3cb77fe700 1 -- 192.168.123.107:0/1123212936 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3ca80097e0 con 0x7f3cb8075700 2026-03-10T12:33:20.594 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.593+0000 7f3cb77fe700 1 --2- 192.168.123.107:0/1123212936 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3cb8075700 0x7f3cb819b780 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f3ca8004f40 tx=0x7f3ca8005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:20.594 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.594+0000 7f3cb4ff9700 1 -- 192.168.123.107:0/1123212936 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3ca801c070 con 0x7f3cb8075700 2026-03-10T12:33:20.594 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.594+0000 7f3cb4ff9700 1 -- 192.168.123.107:0/1123212936 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3ca80053b0 con 0x7f3cb8075700 2026-03-10T12:33:20.594 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.594+0000 7f3cbdfcc700 1 -- 192.168.123.107:0/1123212936 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3cb819bec0 con 0x7f3cb8075700 2026-03-10T12:33:20.596 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.594+0000 7f3cb4ff9700 1 -- 192.168.123.107:0/1123212936 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3ca800f460 con 0x7f3cb8075700 2026-03-10T12:33:20.596 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.594+0000 7f3cbdfcc700 1 -- 192.168.123.107:0/1123212936 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3cb819c360 con 
0x7f3cb8075700 2026-03-10T12:33:20.596 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.595+0000 7f3cb4ff9700 1 -- 192.168.123.107:0/1123212936 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 14) v1 ==== 45152+0+0 (secure 0 0 0) 0x7f3ca800f6f0 con 0x7f3cb8075700 2026-03-10T12:33:20.596 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.595+0000 7f3cbdfcc700 1 -- 192.168.123.107:0/1123212936 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3cb8195470 con 0x7f3cb8075700 2026-03-10T12:33:20.596 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.595+0000 7f3cb4ff9700 1 --2- 192.168.123.107:0/1123212936 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3ca4038560 0x7f3ca403aa10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:20.596 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.595+0000 7f3cb4ff9700 1 -- 192.168.123.107:0/1123212936 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f3ca804c3b0 con 0x7f3cb8075700 2026-03-10T12:33:20.596 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.595+0000 7f3cb6ffd700 1 -- 192.168.123.107:0/1123212936 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3ca4038560 msgr2=0x7f3ca403aa10 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.100:6800/2 2026-03-10T12:33:20.596 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.595+0000 7f3cb6ffd700 1 --2- 192.168.123.107:0/1123212936 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3ca4038560 0x7f3ca403aa10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T12:33:20.598 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.598+0000 7f3cb4ff9700 1 -- 192.168.123.107:0/1123212936 <== 
mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3ca8017490 con 0x7f3cb8075700 2026-03-10T12:33:20.745 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.744+0000 7f3cbdfcc700 1 -- 192.168.123.107:0/1123212936 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f3cb8062380 con 0x7f3cb8075700 2026-03-10T12:33:20.745 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.745+0000 7f3cb4ff9700 1 -- 192.168.123.107:0/1123212936 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f3ca8029360 con 0x7f3cb8075700 2026-03-10T12:33:20.746 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:20.746 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:20.748 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.748+0000 7f3cbdfcc700 1 -- 192.168.123.107:0/1123212936 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3ca4038560 msgr2=0x7f3ca403aa10 unknown :-1 s=STATE_CONNECTING l=1).mark_down 
2026-03-10T12:33:20.748 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.748+0000 7f3cbdfcc700 1 --2- 192.168.123.107:0/1123212936 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3ca4038560 0x7f3ca403aa10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:20.748 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.748+0000 7f3cbdfcc700 1 -- 192.168.123.107:0/1123212936 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3cb8075700 msgr2=0x7f3cb819b780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:20.749 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.748+0000 7f3cbdfcc700 1 --2- 192.168.123.107:0/1123212936 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3cb8075700 0x7f3cb819b780 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f3ca8004f40 tx=0x7f3ca8005e70 comp rx=0 tx=0).stop 2026-03-10T12:33:20.749 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.748+0000 7f3cbdfcc700 1 -- 192.168.123.107:0/1123212936 shutdown_connections 2026-03-10T12:33:20.749 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.749+0000 7f3cbdfcc700 1 --2- 192.168.123.107:0/1123212936 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3ca4038560 0x7f3ca403aa10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:20.749 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.749+0000 7f3cbdfcc700 1 --2- 192.168.123.107:0/1123212936 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3cb8075700 0x7f3cb819b780 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:20.749 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.749+0000 7f3cbdfcc700 1 -- 192.168.123.107:0/1123212936 >> 192.168.123.107:0/1123212936 conn(0x7f3cb80fd8d0 msgr2=0x7f3cb80fe530 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T12:33:20.750 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.749+0000 7f3cbdfcc700 1 -- 192.168.123.107:0/1123212936 shutdown_connections 2026-03-10T12:33:20.750 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:20.749+0000 7f3cbdfcc700 1 -- 192.168.123.107:0/1123212936 wait complete. 2026-03-10T12:33:20.751 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:20.898 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:20 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/1123212936' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:21.820 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T12:33:21.820 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:21.969 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:22.013 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:22.277 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.276+0000 7f8c21fc8700 1 -- 192.168.123.107:0/313022251 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c1c0fe2c0 msgr2=0x7f8c1c0fe6d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:22.277 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.276+0000 7f8c21fc8700 1 --2- 192.168.123.107:0/313022251 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c1c0fe2c0 0x7f8c1c0fe6d0 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f8c0c009b00 tx=0x7f8c0c009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:22.277 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.277+0000 7f8c21fc8700 1 -- 192.168.123.107:0/313022251 
shutdown_connections 2026-03-10T12:33:22.277 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.277+0000 7f8c21fc8700 1 --2- 192.168.123.107:0/313022251 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c1c0fe2c0 0x7f8c1c0fe6d0 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:22.277 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.277+0000 7f8c21fc8700 1 -- 192.168.123.107:0/313022251 >> 192.168.123.107:0/313022251 conn(0x7f8c1c0f9d10 msgr2=0x7f8c1c0fc120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:22.277 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.277+0000 7f8c21fc8700 1 -- 192.168.123.107:0/313022251 shutdown_connections 2026-03-10T12:33:22.277 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.277+0000 7f8c21fc8700 1 -- 192.168.123.107:0/313022251 wait complete. 2026-03-10T12:33:22.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.277+0000 7f8c21fc8700 1 Processor -- start 2026-03-10T12:33:22.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.278+0000 7f8c21fc8700 1 -- start start 2026-03-10T12:33:22.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.278+0000 7f8c21fc8700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c1c0fe2c0 0x7f8c1c192f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:22.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.278+0000 7f8c21fc8700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8c1c193490 con 0x7f8c1c0fe2c0 2026-03-10T12:33:22.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.278+0000 7f8c1b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c1c0fe2c0 0x7f8c1c192f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:22.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.278+0000 7f8c1b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c1c0fe2c0 0x7f8c1c192f50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:53414/0 (socket says 192.168.123.107:53414) 2026-03-10T12:33:22.278 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.278+0000 7f8c1b7fe700 1 -- 192.168.123.107:0/677545944 learned_addr learned my addr 192.168.123.107:0/677545944 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:22.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.279+0000 7f8c1b7fe700 1 -- 192.168.123.107:0/677545944 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8c0c0097e0 con 0x7f8c1c0fe2c0 2026-03-10T12:33:22.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.279+0000 7f8c1b7fe700 1 --2- 192.168.123.107:0/677545944 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c1c0fe2c0 0x7f8c1c192f50 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f8c0c00b5c0 tx=0x7f8c0c005ee0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:22.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.279+0000 7f8c197fa700 1 -- 192.168.123.107:0/677545944 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8c0c01e070 con 0x7f8c1c0fe2c0 2026-03-10T12:33:22.279 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.279+0000 7f8c197fa700 1 -- 192.168.123.107:0/677545944 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8c0c00f460 con 0x7f8c1c0fe2c0 2026-03-10T12:33:22.279 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.279+0000 7f8c21fc8700 1 -- 192.168.123.107:0/677545944 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8c1c193690 con 0x7f8c1c0fe2c0 2026-03-10T12:33:22.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.279+0000 7f8c197fa700 1 -- 192.168.123.107:0/677545944 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8c0c017440 con 0x7f8c1c0fe2c0 2026-03-10T12:33:22.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.279+0000 7f8c21fc8700 1 -- 192.168.123.107:0/677545944 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8c1c193b30 con 0x7f8c1c0fe2c0 2026-03-10T12:33:22.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.280+0000 7f8c197fa700 1 -- 192.168.123.107:0/677545944 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 14) v1 ==== 45152+0+0 (secure 0 0 0) 0x7f8c0c0175c0 con 0x7f8c1c0fe2c0 2026-03-10T12:33:22.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.280+0000 7f8c197fa700 1 --2- 192.168.123.107:0/677545944 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c0803c9a0 0x7f8c0803ee50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:22.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.280+0000 7f8c197fa700 1 -- 192.168.123.107:0/677545944 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f8c0c04d4a0 con 0x7f8c1c0fe2c0 2026-03-10T12:33:22.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.280+0000 7f8c12dff700 1 -- 192.168.123.107:0/677545944 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c0803c9a0 msgr2=0x7f8c0803ee50 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.100:6800/2 2026-03-10T12:33:22.281 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.280+0000 7f8c12dff700 1 --2- 192.168.123.107:0/677545944 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c0803c9a0 0x7f8c0803ee50 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T12:33:22.281 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.281+0000 7f8c21fc8700 1 -- 192.168.123.107:0/677545944 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8c1c18cb80 con 0x7f8c1c0fe2c0 2026-03-10T12:33:22.284 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.284+0000 7f8c197fa700 1 -- 192.168.123.107:0/677545944 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8c0c017910 con 0x7f8c1c0fe2c0 2026-03-10T12:33:22.429 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.429+0000 7f8c21fc8700 1 -- 192.168.123.107:0/677545944 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f8c1c062380 con 0x7f8c1c0fe2c0 2026-03-10T12:33:22.431 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.430+0000 7f8c197fa700 1 -- 192.168.123.107:0/677545944 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f8c0c021b60 con 0x7f8c1c0fe2c0 2026-03-10T12:33:22.431 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:22.431 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:22.433 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.433+0000 7f8c21fc8700 1 -- 192.168.123.107:0/677545944 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c0803c9a0 msgr2=0x7f8c0803ee50 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:33:22.433 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.433+0000 7f8c21fc8700 1 --2- 192.168.123.107:0/677545944 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c0803c9a0 0x7f8c0803ee50 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:22.433 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.433+0000 7f8c21fc8700 1 -- 192.168.123.107:0/677545944 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c1c0fe2c0 msgr2=0x7f8c1c192f50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:22.433 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.433+0000 7f8c21fc8700 1 --2- 192.168.123.107:0/677545944 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c1c0fe2c0 0x7f8c1c192f50 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f8c0c00b5c0 tx=0x7f8c0c005ee0 comp rx=0 tx=0).stop 2026-03-10T12:33:22.433 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.433+0000 7f8c21fc8700 1 -- 192.168.123.107:0/677545944 shutdown_connections 2026-03-10T12:33:22.433 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.433+0000 7f8c21fc8700 1 --2- 
192.168.123.107:0/677545944 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8c0803c9a0 0x7f8c0803ee50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:22.433 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.433+0000 7f8c21fc8700 1 --2- 192.168.123.107:0/677545944 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8c1c0fe2c0 0x7f8c1c192f50 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:22.433 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.433+0000 7f8c21fc8700 1 -- 192.168.123.107:0/677545944 >> 192.168.123.107:0/677545944 conn(0x7f8c1c0f9d10 msgr2=0x7f8c1c0fa970 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:22.434 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.433+0000 7f8c21fc8700 1 -- 192.168.123.107:0/677545944 shutdown_connections 2026-03-10T12:33:22.434 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:22.433+0000 7f8c21fc8700 1 -- 192.168.123.107:0/677545944 wait complete. 2026-03-10T12:33:22.434 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:22.733 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:22 vm00 ceph-mon[50686]: from='client.? 
192.168.123.107:0/677545944' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: Active manager daemon vm00.nescmq restarted 2026-03-10T12:33:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: Activating manager daemon vm00.nescmq 2026-03-10T12:33:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: osdmap e5: 0 total, 0 up, 0 in 2026-03-10T12:33:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: mgrmap e15: vm00.nescmq(active, starting, since 0.00508305s) 2026-03-10T12:33:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:33:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr metadata", "who": "vm00.nescmq", "id": "vm00.nescmq"}]: dispatch 2026-03-10T12:33:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T12:33:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T12:33:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T12:33:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: Manager daemon vm00.nescmq is now available 2026-03-10T12:33:23.484 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:33:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:33:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:33:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/mirror_snapshot_schedule"}]: dispatch 2026-03-10T12:33:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:23 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/trash_purge_schedule"}]: dispatch 2026-03-10T12:33:23.504 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T12:33:23.505 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:23.664 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:23.709 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:24.041 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.040+0000 7f9d9a35a700 1 -- 192.168.123.107:0/1262718483 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d940713c0 msgr2=0x7f9d940717d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:24.041 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.040+0000 7f9d9a35a700 1 --2- 192.168.123.107:0/1262718483 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d940713c0 0x7f9d940717d0 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f9d84009b00 tx=0x7f9d84009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:24.042 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.042+0000 7f9d9a35a700 1 -- 192.168.123.107:0/1262718483 shutdown_connections 2026-03-10T12:33:24.042 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.042+0000 7f9d9a35a700 1 --2- 192.168.123.107:0/1262718483 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d940713c0 0x7f9d940717d0 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:24.042 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.042+0000 7f9d9a35a700 1 -- 192.168.123.107:0/1262718483 >> 192.168.123.107:0/1262718483 conn(0x7f9d9406cd30 msgr2=0x7f9d9406f180 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:24.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.042+0000 7f9d9a35a700 1 -- 192.168.123.107:0/1262718483 
shutdown_connections 2026-03-10T12:33:24.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.042+0000 7f9d9a35a700 1 -- 192.168.123.107:0/1262718483 wait complete. 2026-03-10T12:33:24.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.043+0000 7f9d9a35a700 1 Processor -- start 2026-03-10T12:33:24.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.043+0000 7f9d9a35a700 1 -- start start 2026-03-10T12:33:24.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.043+0000 7f9d9a35a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d940713c0 0x7f9d941098c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:24.043 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.043+0000 7f9d9a35a700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d94109e00 con 0x7f9d940713c0 2026-03-10T12:33:24.045 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.044+0000 7f9d99358700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d940713c0 0x7f9d941098c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:24.045 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.044+0000 7f9d99358700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d940713c0 0x7f9d941098c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:53918/0 (socket says 192.168.123.107:53918) 2026-03-10T12:33:24.045 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.044+0000 7f9d99358700 1 -- 192.168.123.107:0/4127512154 learned_addr learned my addr 192.168.123.107:0/4127512154 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:24.045 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.044+0000 7f9d99358700 1 -- 192.168.123.107:0/4127512154 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9d840097e0 con 0x7f9d940713c0 2026-03-10T12:33:24.045 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.044+0000 7f9d99358700 1 --2- 192.168.123.107:0/4127512154 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d940713c0 0x7f9d941098c0 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f9d84004f40 tx=0x7f9d84005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:24.046 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.045+0000 7f9d8a7fc700 1 -- 192.168.123.107:0/4127512154 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9d8401c070 con 0x7f9d940713c0 2026-03-10T12:33:24.046 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.045+0000 7f9d8a7fc700 1 -- 192.168.123.107:0/4127512154 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9d840053b0 con 0x7f9d940713c0 2026-03-10T12:33:24.047 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.046+0000 7f9d8a7fc700 1 -- 192.168.123.107:0/4127512154 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9d8400f460 con 0x7f9d940713c0 2026-03-10T12:33:24.047 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.046+0000 7f9d9a35a700 1 -- 192.168.123.107:0/4127512154 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9d94107f10 con 0x7f9d940713c0 2026-03-10T12:33:24.047 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.046+0000 7f9d9a35a700 1 -- 192.168.123.107:0/4127512154 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9d941083b0 con 
0x7f9d940713c0 2026-03-10T12:33:24.048 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.047+0000 7f9d9a35a700 1 -- 192.168.123.107:0/4127512154 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9d9404efc0 con 0x7f9d940713c0 2026-03-10T12:33:24.048 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.048+0000 7f9d8a7fc700 1 -- 192.168.123.107:0/4127512154 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 15) v1 ==== 44873+0+0 (secure 0 0 0) 0x7f9d84005520 con 0x7f9d940713c0 2026-03-10T12:33:24.048 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.048+0000 7f9d8a7fc700 1 -- 192.168.123.107:0/4127512154 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f9d8404c090 con 0x7f9d940713c0 2026-03-10T12:33:24.051 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.050+0000 7f9d8a7fc700 1 -- 192.168.123.107:0/4127512154 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9d8402aa30 con 0x7f9d940713c0 2026-03-10T12:33:24.088 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.087+0000 7f9d8a7fc700 1 -- 192.168.123.107:0/4127512154 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mgrmap(e 16) v1 ==== 45000+0+0 (secure 0 0 0) 0x7f9d8402bd10 con 0x7f9d940713c0 2026-03-10T12:33:24.088 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.087+0000 7f9d8a7fc700 1 --2- 192.168.123.107:0/4127512154 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9d80038f90 0x7f9d8003b370 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:24.089 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.088+0000 7f9d98b57700 1 --2- 192.168.123.107:0/4127512154 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9d80038f90 0x7f9d8003b370 
unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:24.091 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.089+0000 7f9d98b57700 1 --2- 192.168.123.107:0/4127512154 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9d80038f90 0x7f9d8003b370 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7f9d8c00ad30 tx=0x7f9d8c0093f0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:24.207 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.206+0000 7f9d9a35a700 1 -- 192.168.123.107:0/4127512154 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f9d94062380 con 0x7f9d940713c0 2026-03-10T12:33:24.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.212+0000 7f9d8a7fc700 1 -- 192.168.123.107:0/4127512154 <== mon.0 v2:192.168.123.100:3300/0 8 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f9d84026030 con 0x7f9d940713c0 2026-03-10T12:33:24.212 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:24.212 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:24.216 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.214+0000 7f9d7ffff700 1 -- 192.168.123.107:0/4127512154 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9d80038f90 msgr2=0x7f9d8003b370 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:24.216 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.214+0000 7f9d7ffff700 1 --2- 192.168.123.107:0/4127512154 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9d80038f90 0x7f9d8003b370 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7f9d8c00ad30 tx=0x7f9d8c0093f0 comp rx=0 tx=0).stop 2026-03-10T12:33:24.216 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.214+0000 7f9d7ffff700 1 -- 192.168.123.107:0/4127512154 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d940713c0 msgr2=0x7f9d941098c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:24.216 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.214+0000 7f9d7ffff700 1 --2- 192.168.123.107:0/4127512154 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d940713c0 0x7f9d941098c0 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f9d84004f40 tx=0x7f9d84005e70 comp rx=0 tx=0).stop 2026-03-10T12:33:24.218 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.218+0000 7f9d7ffff700 1 -- 192.168.123.107:0/4127512154 shutdown_connections 2026-03-10T12:33:24.218 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.218+0000 
7f9d7ffff700 1 --2- 192.168.123.107:0/4127512154 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9d80038f90 0x7f9d8003b370 unknown :-1 s=CLOSED pgs=3 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:24.218 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.218+0000 7f9d7ffff700 1 --2- 192.168.123.107:0/4127512154 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d940713c0 0x7f9d941098c0 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:24.218 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.218+0000 7f9d7ffff700 1 -- 192.168.123.107:0/4127512154 >> 192.168.123.107:0/4127512154 conn(0x7f9d9406cd30 msgr2=0x7f9d9406e8c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:24.218 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.218+0000 7f9d7ffff700 1 -- 192.168.123.107:0/4127512154 shutdown_connections 2026-03-10T12:33:24.218 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:24.218+0000 7f9d7ffff700 1 -- 192.168.123.107:0/4127512154 wait complete. 
2026-03-10T12:33:24.226 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:24.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:24 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:24.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:24 vm00 ceph-mon[50686]: [10/Mar/2026:12:33:23] ENGINE Bus STARTING 2026-03-10T12:33:24.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:24 vm00 ceph-mon[50686]: [10/Mar/2026:12:33:23] ENGINE Serving on http://192.168.123.100:8765 2026-03-10T12:33:24.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:24 vm00 ceph-mon[50686]: [10/Mar/2026:12:33:23] ENGINE Serving on https://192.168.123.100:7150 2026-03-10T12:33:24.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:24 vm00 ceph-mon[50686]: [10/Mar/2026:12:33:23] ENGINE Bus STARTED 2026-03-10T12:33:24.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:24 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:24.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:24 vm00 ceph-mon[50686]: mgrmap e16: vm00.nescmq(active, since 1.01275s) 2026-03-10T12:33:24.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:24 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/4127512154' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:25.296 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T12:33:25.296 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:25.474 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:25.529 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T12:33:25.841 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.839+0000 7f44f1702700 1 -- 192.168.123.107:0/4050367518 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44ec0715f0 msgr2=0x7f44ec0719e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:25.842 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.839+0000 7f44f1702700 1 --2- 192.168.123.107:0/4050367518 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44ec0715f0 0x7f44ec0719e0 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f44dc009b00 tx=0x7f44dc009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:25.842 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.840+0000 7f44f1702700 1 -- 192.168.123.107:0/4050367518 shutdown_connections 2026-03-10T12:33:25.842 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.840+0000 7f44f1702700 1 --2- 192.168.123.107:0/4050367518 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44ec0715f0 0x7f44ec0719e0 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:25.842 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.840+0000 7f44f1702700 1 -- 192.168.123.107:0/4050367518 >> 192.168.123.107:0/4050367518 conn(0x7f44ec06cf00 msgr2=0x7f44ec06f350 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:25.842 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.841+0000 7f44f1702700 1 -- 192.168.123.107:0/4050367518 
shutdown_connections 2026-03-10T12:33:25.842 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.841+0000 7f44f1702700 1 -- 192.168.123.107:0/4050367518 wait complete. 2026-03-10T12:33:25.842 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.842+0000 7f44f1702700 1 Processor -- start 2026-03-10T12:33:25.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.842+0000 7f44f1702700 1 -- start start 2026-03-10T12:33:25.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.842+0000 7f44f1702700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44ec0715f0 0x7f44ec1a3f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:25.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.842+0000 7f44f1702700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f44ec1a4490 con 0x7f44ec0715f0 2026-03-10T12:33:25.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.842+0000 7f44eaffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44ec0715f0 0x7f44ec1a3f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:25.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.842+0000 7f44eaffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44ec0715f0 0x7f44ec1a3f50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:53944/0 (socket says 192.168.123.107:53944) 2026-03-10T12:33:25.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.842+0000 7f44eaffd700 1 -- 192.168.123.107:0/2595029931 learned_addr learned my addr 192.168.123.107:0/2595029931 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:25.843 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.843+0000 7f44eaffd700 1 -- 192.168.123.107:0/2595029931 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f44dc0097e0 con 0x7f44ec0715f0 2026-03-10T12:33:25.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.843+0000 7f44eaffd700 1 --2- 192.168.123.107:0/2595029931 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44ec0715f0 0x7f44ec1a3f50 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f44dc004f40 tx=0x7f44dc004740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:25.843 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.843+0000 7f44d3fff700 1 -- 192.168.123.107:0/2595029931 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f44dc02d070 con 0x7f44ec0715f0 2026-03-10T12:33:25.844 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.843+0000 7f44f1702700 1 -- 192.168.123.107:0/2595029931 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f44ec1a4690 con 0x7f44ec0715f0 2026-03-10T12:33:25.844 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.843+0000 7f44f1702700 1 -- 192.168.123.107:0/2595029931 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f44ec1a4b30 con 0x7f44ec0715f0 2026-03-10T12:33:25.844 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.844+0000 7f44d3fff700 1 -- 192.168.123.107:0/2595029931 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f44dc0053f0 con 0x7f44ec0715f0 2026-03-10T12:33:25.844 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.844+0000 7f44f1702700 1 -- 192.168.123.107:0/2595029931 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 
-- 0x7f44ec062380 con 0x7f44ec0715f0 2026-03-10T12:33:25.845 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.844+0000 7f44d3fff700 1 -- 192.168.123.107:0/2595029931 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f44dc00f550 con 0x7f44ec0715f0 2026-03-10T12:33:25.845 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.845+0000 7f44d3fff700 1 -- 192.168.123.107:0/2595029931 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7f44dc00f6b0 con 0x7f44ec0715f0 2026-03-10T12:33:25.845 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.845+0000 7f44d3fff700 1 --2- 192.168.123.107:0/2595029931 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f44d4038650 0x7f44d403ab00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:25.845 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.845+0000 7f44d3fff700 1 -- 192.168.123.107:0/2595029931 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f44dc05eea0 con 0x7f44ec0715f0 2026-03-10T12:33:25.848 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.845+0000 7f44ea7fc700 1 --2- 192.168.123.107:0/2595029931 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f44d4038650 0x7f44d403ab00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:25.857 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.848+0000 7f44ea7fc700 1 --2- 192.168.123.107:0/2595029931 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f44d4038650 0x7f44d403ab00 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f44e0006fd0 tx=0x7f44e0006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:25.857 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:25.848+0000 7f44d3fff700 1 -- 192.168.123.107:0/2595029931 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f44dc030960 con 0x7f44ec0715f0 2026-03-10T12:33:25.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:25 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:25.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:25 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:25.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:25 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:25.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:25 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:25.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:25 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:25.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:25 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:33:26.009 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:26.008+0000 7f44f1702700 1 -- 192.168.123.107:0/2595029931 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f44ec10e370 con 0x7f44ec0715f0 2026-03-10T12:33:26.010 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:26.009+0000 7f44d3fff700 1 -- 192.168.123.107:0/2595029931 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 
0x7f44dc037070 con 0x7f44ec0715f0 2026-03-10T12:33:26.010 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:26.010 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:26.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:26.012+0000 7f44d1ffb700 1 -- 192.168.123.107:0/2595029931 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f44d4038650 msgr2=0x7f44d403ab00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:26.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:26.012+0000 7f44d1ffb700 1 --2- 192.168.123.107:0/2595029931 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f44d4038650 0x7f44d403ab00 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f44e0006fd0 tx=0x7f44e0006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:26.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:26.012+0000 7f44d1ffb700 1 -- 192.168.123.107:0/2595029931 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44ec0715f0 msgr2=0x7f44ec1a3f50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:26.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:26.012+0000 7f44d1ffb700 1 --2- 192.168.123.107:0/2595029931 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44ec0715f0 0x7f44ec1a3f50 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f44dc004f40 tx=0x7f44dc004740 comp rx=0 tx=0).stop 2026-03-10T12:33:26.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:26.012+0000 7f44d1ffb700 1 -- 192.168.123.107:0/2595029931 shutdown_connections 2026-03-10T12:33:26.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:26.012+0000 7f44d1ffb700 1 --2- 192.168.123.107:0/2595029931 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f44d4038650 0x7f44d403ab00 secure :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f44e0006fd0 tx=0x7f44e0006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:26.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:26.012+0000 7f44d1ffb700 1 --2- 192.168.123.107:0/2595029931 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44ec0715f0 0x7f44ec1a3f50 secure :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f44dc004f40 tx=0x7f44dc004740 comp rx=0 tx=0).stop 2026-03-10T12:33:26.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:26.012+0000 7f44d1ffb700 1 -- 192.168.123.107:0/2595029931 >> 192.168.123.107:0/2595029931 conn(0x7f44ec06cf00 msgr2=0x7f44ec06fbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:26.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:26.013+0000 7f44d1ffb700 1 -- 192.168.123.107:0/2595029931 shutdown_connections 2026-03-10T12:33:26.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:26.013+0000 7f44d1ffb700 1 -- 192.168.123.107:0/2595029931 wait complete. 
2026-03-10T12:33:26.014 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:26.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:26 vm00 ceph-mon[50686]: mgrmap e17: vm00.nescmq(active, since 2s) 2026-03-10T12:33:26.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:26 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:26.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:26 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:26.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:26 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:26.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:26 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:26.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:26 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:33:26.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:26 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:33:26.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:26 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:33:26.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:26 vm00 ceph-mon[50686]: Updating vm00:/etc/ceph/ceph.conf 2026-03-10T12:33:26.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:26 vm00 ceph-mon[50686]: Updating vm07:/etc/ceph/ceph.conf 2026-03-10T12:33:26.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:26 vm00 ceph-mon[50686]: 
from='client.? 192.168.123.107:0/2595029931' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:26.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:26 vm00 ceph-mon[50686]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:33:26.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:26 vm00 ceph-mon[50686]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:33:27.077 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T12:33:27.078 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:27.253 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:33:27.581 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.579+0000 7fa65d6e7700 1 -- 192.168.123.107:0/3782114641 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa650093960 msgr2=0x7fa650093d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:27.581 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.579+0000 7fa65d6e7700 1 --2- 192.168.123.107:0/3782114641 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa650093960 0x7fa650093d30 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7fa64c009b00 tx=0x7fa64c009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:27.581 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.580+0000 7fa65d6e7700 1 -- 192.168.123.107:0/3782114641 shutdown_connections 2026-03-10T12:33:27.581 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.580+0000 7fa65d6e7700 1 --2- 192.168.123.107:0/3782114641 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7fa650093960 0x7fa650093d30 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:27.581 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.580+0000 7fa65d6e7700 1 -- 192.168.123.107:0/3782114641 >> 192.168.123.107:0/3782114641 conn(0x7fa650009520 msgr2=0x7fa650009920 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:27.581 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.580+0000 7fa65d6e7700 1 -- 192.168.123.107:0/3782114641 shutdown_connections 2026-03-10T12:33:27.581 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.580+0000 7fa65d6e7700 1 -- 192.168.123.107:0/3782114641 wait complete. 2026-03-10T12:33:27.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.581+0000 7fa65d6e7700 1 Processor -- start 2026-03-10T12:33:27.582 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.582+0000 7fa65d6e7700 1 -- start start 2026-03-10T12:33:27.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.582+0000 7fa65d6e7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa650093960 0x7fa65012a690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:27.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.582+0000 7fa65d6e7700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa65012abd0 con 0x7fa650093960 2026-03-10T12:33:27.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.582+0000 7fa657fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa650093960 0x7fa65012a690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:27.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.583+0000 7fa657fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7fa650093960 0x7fa65012a690 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:53958/0 (socket says 192.168.123.107:53958) 2026-03-10T12:33:27.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.583+0000 7fa657fff700 1 -- 192.168.123.107:0/1866026159 learned_addr learned my addr 192.168.123.107:0/1866026159 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:27.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.583+0000 7fa657fff700 1 -- 192.168.123.107:0/1866026159 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa64c0097e0 con 0x7fa650093960 2026-03-10T12:33:27.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.583+0000 7fa657fff700 1 --2- 192.168.123.107:0/1866026159 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa650093960 0x7fa65012a690 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7fa64c005e00 tx=0x7fa64c0050b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:27.584 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.583+0000 7fa6557fa700 1 -- 192.168.123.107:0/1866026159 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa64c01c070 con 0x7fa650093960 2026-03-10T12:33:27.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.583+0000 7fa6557fa700 1 -- 192.168.123.107:0/1866026159 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa64c0054a0 con 0x7fa650093960 2026-03-10T12:33:27.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.584+0000 7fa65d6e7700 1 -- 192.168.123.107:0/1866026159 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa65012add0 con 0x7fa650093960 2026-03-10T12:33:27.587 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.584+0000 7fa65d6e7700 1 -- 192.168.123.107:0/1866026159 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa65012b190 con 0x7fa650093960 2026-03-10T12:33:27.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.585+0000 7fa6557fa700 1 -- 192.168.123.107:0/1866026159 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa64c00f680 con 0x7fa650093960 2026-03-10T12:33:27.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.585+0000 7fa6557fa700 1 -- 192.168.123.107:0/1866026159 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7fa64c00f8f0 con 0x7fa650093960 2026-03-10T12:33:27.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.585+0000 7fa6557fa700 1 --2- 192.168.123.107:0/1866026159 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa6480386a0 0x7fa64803ab50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:27.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.585+0000 7fa6557fa700 1 -- 192.168.123.107:0/1866026159 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fa64c04d880 con 0x7fa650093960 2026-03-10T12:33:27.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.585+0000 7fa6577fe700 1 --2- 192.168.123.107:0/1866026159 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa6480386a0 0x7fa64803ab50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:27.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.586+0000 7fa6577fe700 1 --2- 192.168.123.107:0/1866026159 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa6480386a0 0x7fa64803ab50 secure :-1 
s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fa644006fd0 tx=0x7fa644006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:27.587 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.587+0000 7fa642ffd700 1 -- 192.168.123.107:0/1866026159 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa63c005320 con 0x7fa650093960 2026-03-10T12:33:27.592 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.591+0000 7fa6557fa700 1 -- 192.168.123.107:0/1866026159 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa64c030080 con 0x7fa650093960 2026-03-10T12:33:27.778 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.777+0000 7fa642ffd700 1 -- 192.168.123.107:0/1866026159 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fa63c005190 con 0x7fa650093960 2026-03-10T12:33:27.781 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.781+0000 7fa6557fa700 1 -- 192.168.123.107:0/1866026159 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fa64c026070 con 0x7fa650093960 2026-03-10T12:33:27.781 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:27.782 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:27.786 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.786+0000 7fa65d6e7700 1 -- 192.168.123.107:0/1866026159 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa6480386a0 msgr2=0x7fa64803ab50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:27.786 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.786+0000 7fa65d6e7700 1 --2- 192.168.123.107:0/1866026159 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa6480386a0 0x7fa64803ab50 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fa644006fd0 tx=0x7fa644006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:27.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.786+0000 7fa65d6e7700 1 -- 192.168.123.107:0/1866026159 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa650093960 msgr2=0x7fa65012a690 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:27.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.786+0000 7fa65d6e7700 1 --2- 192.168.123.107:0/1866026159 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa650093960 0x7fa65012a690 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7fa64c005e00 tx=0x7fa64c0050b0 comp rx=0 tx=0).stop 2026-03-10T12:33:27.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.786+0000 7fa65d6e7700 1 -- 192.168.123.107:0/1866026159 shutdown_connections 2026-03-10T12:33:27.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.786+0000 
7fa65d6e7700 1 --2- 192.168.123.107:0/1866026159 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa6480386a0 0x7fa64803ab50 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:27.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.786+0000 7fa65d6e7700 1 --2- 192.168.123.107:0/1866026159 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa650093960 0x7fa65012a690 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:27.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.786+0000 7fa65d6e7700 1 -- 192.168.123.107:0/1866026159 >> 192.168.123.107:0/1866026159 conn(0x7fa650009520 msgr2=0x7fa6500995e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:27.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.787+0000 7fa65d6e7700 1 -- 192.168.123.107:0/1866026159 shutdown_connections 2026-03-10T12:33:27.787 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:27.787+0000 7fa65d6e7700 1 -- 192.168.123.107:0/1866026159 wait complete. 
2026-03-10T12:33:27.789 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:28 vm00 ceph-mon[50686]: Updating vm00:/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:33:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:28 vm00 ceph-mon[50686]: Updating vm07:/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:33:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:28 vm00 ceph-mon[50686]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.client.admin.keyring 2026-03-10T12:33:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:28 vm00 ceph-mon[50686]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.client.admin.keyring 2026-03-10T12:33:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 
2026-03-10T12:33:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T12:33:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:33:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:28 vm00 ceph-mon[50686]: Deploying daemon ceph-exporter.vm07 on vm07 2026-03-10T12:33:28.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:28 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/1866026159' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:28.938 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T12:33:28.939 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:29.173 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:33:29.832 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.831+0000 7fc5e6cc2700 1 -- 192.168.123.107:0/1985465168 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5e0072730 msgr2=0x7fc5e010edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:29.832 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.831+0000 7fc5e6cc2700 1 --2- 192.168.123.107:0/1985465168 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5e0072730 0x7fc5e010edb0 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7fc5dc009b00 tx=0x7fc5dc009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:29.832 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.831+0000 7fc5e6cc2700 1 -- 192.168.123.107:0/1985465168 shutdown_connections 2026-03-10T12:33:29.832 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.831+0000 7fc5e6cc2700 1 --2- 192.168.123.107:0/1985465168 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5e0072730 0x7fc5e010edb0 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:29.832 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.831+0000 7fc5e6cc2700 1 -- 192.168.123.107:0/1985465168 >> 192.168.123.107:0/1985465168 conn(0x7fc5e006c410 msgr2=0x7fc5e006c810 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:29.832 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.831+0000 7fc5e6cc2700 1 -- 192.168.123.107:0/1985465168 shutdown_connections 2026-03-10T12:33:29.833 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.831+0000 7fc5e6cc2700 1 -- 192.168.123.107:0/1985465168 wait complete. 2026-03-10T12:33:29.833 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.833+0000 7fc5e6cc2700 1 Processor -- start 2026-03-10T12:33:29.833 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.833+0000 7fc5e6cc2700 1 -- start start 2026-03-10T12:33:29.833 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.833+0000 7fc5e6cc2700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5e01a4140 0x7fc5e01a4510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:29.833 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.833+0000 7fc5e6cc2700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc5dc012070 con 0x7fc5e01a4140 2026-03-10T12:33:29.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.833+0000 7fc5e5cc0700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5e01a4140 0x7fc5e01a4510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:29.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.833+0000 7fc5e5cc0700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5e01a4140 0x7fc5e01a4510 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:53992/0 (socket says 192.168.123.107:53992) 2026-03-10T12:33:29.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.833+0000 7fc5e5cc0700 1 -- 192.168.123.107:0/1373478890 learned_addr learned my addr 192.168.123.107:0/1373478890 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:29.834 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.834+0000 7fc5e5cc0700 1 -- 192.168.123.107:0/1373478890 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc5dc0097e0 con 0x7fc5e01a4140 2026-03-10T12:33:29.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.834+0000 7fc5e5cc0700 1 --2- 192.168.123.107:0/1373478890 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5e01a4140 0x7fc5e01a4510 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7fc5dc006010 tx=0x7fc5dc00bba0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:29.834 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.834+0000 7fc5d6ffd700 1 -- 192.168.123.107:0/1373478890 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc5dc01c070 con 0x7fc5e01a4140 2026-03-10T12:33:29.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.834+0000 7fc5e6cc2700 1 -- 192.168.123.107:0/1373478890 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc5e01a4ab0 con 0x7fc5e01a4140 2026-03-10T12:33:29.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.834+0000 7fc5e6cc2700 1 -- 192.168.123.107:0/1373478890 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc5e01a8af0 con 0x7fc5e01a4140 2026-03-10T12:33:29.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.835+0000 7fc5d6ffd700 1 -- 192.168.123.107:0/1373478890 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc5dc003d70 con 0x7fc5e01a4140 2026-03-10T12:33:29.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.835+0000 7fc5e6cc2700 1 -- 192.168.123.107:0/1373478890 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 
-- 0x7fc5e004efc0 con 0x7fc5e01a4140 2026-03-10T12:33:29.835 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.835+0000 7fc5d6ffd700 1 -- 192.168.123.107:0/1373478890 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc5dc017440 con 0x7fc5e01a4140 2026-03-10T12:33:29.836 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.836+0000 7fc5d6ffd700 1 -- 192.168.123.107:0/1373478890 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7fc5dc003890 con 0x7fc5e01a4140 2026-03-10T12:33:29.836 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.836+0000 7fc5d6ffd700 1 --2- 192.168.123.107:0/1373478890 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc5cc038640 0x7fc5cc03aaf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:29.836 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.836+0000 7fc5d6ffd700 1 -- 192.168.123.107:0/1373478890 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fc5dc02f080 con 0x7fc5e01a4140 2026-03-10T12:33:29.836 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.836+0000 7fc5e54bf700 1 --2- 192.168.123.107:0/1373478890 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc5cc038640 0x7fc5cc03aaf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:29.837 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.836+0000 7fc5e54bf700 1 --2- 192.168.123.107:0/1373478890 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc5cc038640 0x7fc5cc03aaf0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fc5d800ad30 tx=0x7fc5d80093f0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:29.838 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:29.838+0000 7fc5d6ffd700 1 -- 192.168.123.107:0/1373478890 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc5dc029b80 con 0x7fc5e01a4140 2026-03-10T12:33:30.001 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:30.001+0000 7fc5e6cc2700 1 -- 192.168.123.107:0/1373478890 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fc5e0062380 con 0x7fc5e01a4140 2026-03-10T12:33:30.002 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:30.001+0000 7fc5d6ffd700 1 -- 192.168.123.107:0/1373478890 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fc5dc029b80 con 0x7fc5e01a4140 2026-03-10T12:33:30.002 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:30.012 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:30.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:30.004+0000 7fc5e6cc2700 1 -- 192.168.123.107:0/1373478890 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc5cc038640 msgr2=0x7fc5cc03aaf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:30.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:30.004+0000 7fc5e6cc2700 1 --2- 192.168.123.107:0/1373478890 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc5cc038640 0x7fc5cc03aaf0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fc5d800ad30 tx=0x7fc5d80093f0 comp rx=0 tx=0).stop 2026-03-10T12:33:30.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:30.004+0000 7fc5e6cc2700 1 -- 192.168.123.107:0/1373478890 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5e01a4140 msgr2=0x7fc5e01a4510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:30.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:30.004+0000 7fc5e6cc2700 1 --2- 192.168.123.107:0/1373478890 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5e01a4140 0x7fc5e01a4510 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7fc5dc006010 tx=0x7fc5dc00bba0 comp rx=0 tx=0).stop 2026-03-10T12:33:30.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:30.004+0000 7fc5e6cc2700 1 -- 192.168.123.107:0/1373478890 shutdown_connections 2026-03-10T12:33:30.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:30.004+0000 7fc5e6cc2700 1 --2- 192.168.123.107:0/1373478890 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc5cc038640 0x7fc5cc03aaf0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:30.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:30.004+0000 7fc5e6cc2700 1 --2- 192.168.123.107:0/1373478890 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5e01a4140 0x7fc5e01a4510 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:30.013 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:30.004+0000 7fc5e6cc2700 1 -- 192.168.123.107:0/1373478890 >> 192.168.123.107:0/1373478890 conn(0x7fc5e006c410 msgr2=0x7fc5e010b740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:30.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:30.004+0000 7fc5e6cc2700 1 -- 192.168.123.107:0/1373478890 shutdown_connections 2026-03-10T12:33:30.013 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:30.004+0000 7fc5e6cc2700 1 -- 192.168.123.107:0/1373478890 wait complete. 2026-03-10T12:33:30.013 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:30.066 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:29 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:30.066 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:29 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:30.066 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:29 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:30.066 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:29 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:30.066 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:29 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T12:33:30.066 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:29 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished 2026-03-10T12:33:30.066 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:29 vm00 
ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:33:30.066 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:29 vm00 ceph-mon[50686]: Deploying daemon crash.vm07 on vm07 2026-03-10T12:33:31.091 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T12:33:31.091 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:31.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:30 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:31.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:30 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:31.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:30 vm00 ceph-mon[50686]: from='client.? 
192.168.123.107:0/1373478890' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:31.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:30 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:31.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:30 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:31.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:30 vm00 ceph-mon[50686]: Deploying daemon node-exporter.vm07 on vm07 2026-03-10T12:33:31.247 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:33:31.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.529+0000 7f602ba5e700 1 -- 192.168.123.107:0/2993913493 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f602410c660 msgr2=0x7f602410ca30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:31.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.529+0000 7f602ba5e700 1 --2- 192.168.123.107:0/2993913493 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f602410c660 0x7f602410ca30 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f6018009b00 tx=0x7f6018009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:31.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.530+0000 7f602ba5e700 1 -- 192.168.123.107:0/2993913493 shutdown_connections 2026-03-10T12:33:31.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.530+0000 7f602ba5e700 1 --2- 192.168.123.107:0/2993913493 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f602410c660 0x7f602410ca30 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:31.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.530+0000 7f602ba5e700 1 -- 
192.168.123.107:0/2993913493 >> 192.168.123.107:0/2993913493 conn(0x7f6024075e80 msgr2=0x7f6024078290 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:31.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.531+0000 7f602ba5e700 1 -- 192.168.123.107:0/2993913493 shutdown_connections 2026-03-10T12:33:31.531 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.531+0000 7f602ba5e700 1 -- 192.168.123.107:0/2993913493 wait complete. 2026-03-10T12:33:31.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.531+0000 7f602ba5e700 1 Processor -- start 2026-03-10T12:33:31.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.531+0000 7f602ba5e700 1 -- start start 2026-03-10T12:33:31.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.532+0000 7f602ba5e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f602410c660 0x7f60241a1960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:31.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.532+0000 7f602ba5e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f60241a1ea0 con 0x7f602410c660 2026-03-10T12:33:31.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.532+0000 7f60297fa700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f602410c660 0x7f60241a1960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:31.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.532+0000 7f60297fa700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f602410c660 0x7f60241a1960 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:54014/0 (socket says 192.168.123.107:54014) 
2026-03-10T12:33:31.532 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.532+0000 7f60297fa700 1 -- 192.168.123.107:0/3178063358 learned_addr learned my addr 192.168.123.107:0/3178063358 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:31.533 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.532+0000 7f60297fa700 1 -- 192.168.123.107:0/3178063358 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f60180097e0 con 0x7f602410c660 2026-03-10T12:33:31.533 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.532+0000 7f60297fa700 1 --2- 192.168.123.107:0/3178063358 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f602410c660 0x7f60241a1960 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f6018004d10 tx=0x7f6018004df0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:31.533 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.533+0000 7f60167fc700 1 -- 192.168.123.107:0/3178063358 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f601801c070 con 0x7f602410c660 2026-03-10T12:33:31.534 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.533+0000 7f60167fc700 1 -- 192.168.123.107:0/3178063358 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6018021470 con 0x7f602410c660 2026-03-10T12:33:31.534 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.533+0000 7f602ba5e700 1 -- 192.168.123.107:0/3178063358 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f60241a20a0 con 0x7f602410c660 2026-03-10T12:33:31.535 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.533+0000 7f602ba5e700 1 -- 192.168.123.107:0/3178063358 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f60241a24c0 con 
0x7f602410c660 2026-03-10T12:33:31.535 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.533+0000 7f60167fc700 1 -- 192.168.123.107:0/3178063358 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f601800f460 con 0x7f602410c660 2026-03-10T12:33:31.535 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.534+0000 7f60167fc700 1 -- 192.168.123.107:0/3178063358 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7f601800f680 con 0x7f602410c660 2026-03-10T12:33:31.535 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.534+0000 7f60167fc700 1 --2- 192.168.123.107:0/3178063358 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6010038650 0x7f601003ab00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:31.535 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.534+0000 7f60167fc700 1 -- 192.168.123.107:0/3178063358 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f601804d5a0 con 0x7f602410c660 2026-03-10T12:33:31.535 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.534+0000 7f6028ff9700 1 --2- 192.168.123.107:0/3178063358 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6010038650 0x7f601003ab00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:31.535 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.535+0000 7f602ba5e700 1 -- 192.168.123.107:0/3178063358 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f602404f9e0 con 0x7f602410c660 2026-03-10T12:33:31.536 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.536+0000 7f6028ff9700 1 --2- 192.168.123.107:0/3178063358 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6010038650 0x7f601003ab00 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f6020006fd0 tx=0x7f6020006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:31.538 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.538+0000 7f60167fc700 1 -- 192.168.123.107:0/3178063358 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6018026070 con 0x7f602410c660 2026-03-10T12:33:31.692 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.691+0000 7f602ba5e700 1 -- 192.168.123.107:0/3178063358 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f6024062380 con 0x7f602410c660 2026-03-10T12:33:31.692 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.692+0000 7f60167fc700 1 -- 192.168.123.107:0/3178063358 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f6018029540 con 0x7f602410c660 2026-03-10T12:33:31.693 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:31.693 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:31.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.694+0000 7f602ba5e700 1 -- 192.168.123.107:0/3178063358 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6010038650 msgr2=0x7f601003ab00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:31.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.695+0000 7f602ba5e700 1 --2- 192.168.123.107:0/3178063358 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6010038650 0x7f601003ab00 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f6020006fd0 tx=0x7f6020006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:31.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.695+0000 7f602ba5e700 1 -- 192.168.123.107:0/3178063358 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f602410c660 msgr2=0x7f60241a1960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:31.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.695+0000 7f602ba5e700 1 --2- 192.168.123.107:0/3178063358 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f602410c660 0x7f60241a1960 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f6018004d10 tx=0x7f6018004df0 comp rx=0 tx=0).stop 2026-03-10T12:33:31.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.695+0000 7f602ba5e700 1 -- 192.168.123.107:0/3178063358 shutdown_connections 2026-03-10T12:33:31.695 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.695+0000 
7f602ba5e700 1 --2- 192.168.123.107:0/3178063358 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6010038650 0x7f601003ab00 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:31.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.695+0000 7f602ba5e700 1 --2- 192.168.123.107:0/3178063358 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f602410c660 0x7f60241a1960 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:31.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.695+0000 7f602ba5e700 1 -- 192.168.123.107:0/3178063358 >> 192.168.123.107:0/3178063358 conn(0x7f6024075e80 msgr2=0x7f6024077b60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:31.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.696+0000 7f602ba5e700 1 -- 192.168.123.107:0/3178063358 shutdown_connections 2026-03-10T12:33:31.696 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:31.696+0000 7f602ba5e700 1 -- 192.168.123.107:0/3178063358 wait complete. 2026-03-10T12:33:31.697 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:32.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:31 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/3178063358' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:32.755 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T12:33:32.755 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:32.899 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:33:33.206 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.205+0000 7f2e37fff700 1 -- 192.168.123.107:0/2851760614 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2e38102320 msgr2=0x7f2e381026f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:33.206 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.205+0000 7f2e37fff700 1 --2- 192.168.123.107:0/2851760614 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2e38102320 0x7f2e381026f0 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7f2e28009b00 tx=0x7f2e28009e10 comp rx=0 tx=0).stop 2026-03-10T12:33:33.206 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.206+0000 7f2e37fff700 1 -- 192.168.123.107:0/2851760614 shutdown_connections 2026-03-10T12:33:33.206 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.206+0000 7f2e37fff700 1 --2- 192.168.123.107:0/2851760614 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2e38102320 0x7f2e381026f0 unknown :-1 s=CLOSED pgs=155 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:33.206 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.206+0000 7f2e37fff700 1 -- 192.168.123.107:0/2851760614 >> 192.168.123.107:0/2851760614 conn(0x7f2e380fdc00 msgr2=0x7f2e38100010 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:33.208 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.208+0000 7f2e37fff700 1 -- 192.168.123.107:0/2851760614 shutdown_connections 2026-03-10T12:33:33.209 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.208+0000 7f2e37fff700 1 -- 192.168.123.107:0/2851760614 wait complete. 2026-03-10T12:33:33.209 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.208+0000 7f2e37fff700 1 Processor -- start 2026-03-10T12:33:33.209 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.209+0000 7f2e37fff700 1 -- start start 2026-03-10T12:33:33.209 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.209+0000 7f2e37fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2e38102320 0x7f2e3806fe40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:33.209 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.209+0000 7f2e37fff700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2e38072e20 con 0x7f2e38102320 2026-03-10T12:33:33.209 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.209+0000 7f2e36ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2e38102320 0x7f2e3806fe40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:33.209 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.209+0000 7f2e36ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2e38102320 0x7f2e3806fe40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:57856/0 (socket says 192.168.123.107:57856) 2026-03-10T12:33:33.209 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.209+0000 7f2e36ffd700 1 -- 192.168.123.107:0/3766555537 learned_addr learned my addr 192.168.123.107:0/3766555537 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:33.210 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.209+0000 7f2e36ffd700 1 -- 192.168.123.107:0/3766555537 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2e280097e0 con 0x7f2e38102320 2026-03-10T12:33:33.210 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.210+0000 7f2e36ffd700 1 --2- 192.168.123.107:0/3766555537 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2e38102320 0x7f2e3806fe40 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f2e28006010 tx=0x7f2e28004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:33.210 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.210+0000 7f2e3c9d5700 1 -- 192.168.123.107:0/3766555537 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2e2801c070 con 0x7f2e38102320 2026-03-10T12:33:33.210 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.210+0000 7f2e3c9d5700 1 -- 192.168.123.107:0/3766555537 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2e28021470 con 0x7f2e38102320 2026-03-10T12:33:33.210 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.210+0000 7f2e3c9d5700 1 -- 192.168.123.107:0/3766555537 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2e2800f460 con 0x7f2e38102320 2026-03-10T12:33:33.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.210+0000 7f2e37fff700 1 -- 192.168.123.107:0/3766555537 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2e380703e0 con 0x7f2e38102320 2026-03-10T12:33:33.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.210+0000 7f2e37fff700 1 -- 192.168.123.107:0/3766555537 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2e38070860 con 
0x7f2e38102320 2026-03-10T12:33:33.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.211+0000 7f2e3c9d5700 1 -- 192.168.123.107:0/3766555537 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7f2e280215e0 con 0x7f2e38102320 2026-03-10T12:33:33.211 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.211+0000 7f2e3c9d5700 1 --2- 192.168.123.107:0/3766555537 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2e24038600 0x7f2e2403aab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:33.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.211+0000 7f2e3c9d5700 1 -- 192.168.123.107:0/3766555537 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f2e2804c3d0 con 0x7f2e38102320 2026-03-10T12:33:33.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.212+0000 7f2e2ffff700 1 --2- 192.168.123.107:0/3766555537 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2e24038600 0x7f2e2403aab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:33.212 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.211+0000 7f2e37fff700 1 -- 192.168.123.107:0/3766555537 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2e18005320 con 0x7f2e38102320 2026-03-10T12:33:33.215 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.215+0000 7f2e2ffff700 1 --2- 192.168.123.107:0/3766555537 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2e24038600 0x7f2e2403aab0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f2e20006fd0 tx=0x7f2e20006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:33.215 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.215+0000 7f2e3c9d5700 1 -- 192.168.123.107:0/3766555537 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2e28026070 con 0x7f2e38102320 2026-03-10T12:33:33.408 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.407+0000 7f2e37fff700 1 -- 192.168.123.107:0/3766555537 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f2e18005190 con 0x7f2e38102320 2026-03-10T12:33:33.409 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.408+0000 7f2e3c9d5700 1 -- 192.168.123.107:0/3766555537 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f2e28029540 con 0x7f2e38102320 2026-03-10T12:33:33.409 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:33.409 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:33.412 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.412+0000 7f2e2e7fc700 1 -- 192.168.123.107:0/3766555537 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2e24038600 msgr2=0x7f2e2403aab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:33.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.412+0000 7f2e2e7fc700 1 --2- 192.168.123.107:0/3766555537 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2e24038600 0x7f2e2403aab0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f2e20006fd0 tx=0x7f2e20006e40 comp rx=0 tx=0).stop 2026-03-10T12:33:33.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.412+0000 7f2e2e7fc700 1 -- 192.168.123.107:0/3766555537 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2e38102320 msgr2=0x7f2e3806fe40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:33.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.412+0000 7f2e2e7fc700 1 --2- 192.168.123.107:0/3766555537 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2e38102320 0x7f2e3806fe40 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f2e28006010 tx=0x7f2e28004dc0 comp rx=0 tx=0).stop 2026-03-10T12:33:33.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.413+0000 7f2e2e7fc700 1 -- 192.168.123.107:0/3766555537 shutdown_connections 2026-03-10T12:33:33.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.413+0000 7f2e2e7fc700 1 --2- 192.168.123.107:0/3766555537 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2e24038600 0x7f2e2403aab0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:33.413 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.413+0000 7f2e2e7fc700 1 --2- 192.168.123.107:0/3766555537 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2e38102320 0x7f2e3806fe40 unknown :-1 s=CLOSED pgs=156 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:33.413 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.413+0000 7f2e2e7fc700 1 -- 192.168.123.107:0/3766555537 >> 192.168.123.107:0/3766555537 conn(0x7f2e380fdc00 msgr2=0x7f2e381064b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:33.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.413+0000 7f2e2e7fc700 1 -- 192.168.123.107:0/3766555537 shutdown_connections 2026-03-10T12:33:33.414 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:33.414+0000 7f2e2e7fc700 1 -- 192.168.123.107:0/3766555537 wait complete. 2026-03-10T12:33:33.417 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:33 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:33 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:33 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:33 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:33 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.kfawlb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T12:33:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:33 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.vm07.kfawlb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished 2026-03-10T12:33:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 
10 12:33:33 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T12:33:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:33 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:33:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:33 vm00 ceph-mon[50686]: Deploying daemon mgr.vm07.kfawlb on vm07 2026-03-10T12:33:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:33 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:33 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/3766555537' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:34.502 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T12:33:34.503 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:34.913 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm07/config 2026-03-10T12:33:35.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:34 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:35.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:34 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:35.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:34 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:35.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:34 vm00 ceph-mon[50686]: 
from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:35.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:34 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T12:33:35.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:34 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:33:35.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:34 vm00 ceph-mon[50686]: Deploying daemon mon.vm07 on vm07 2026-03-10T12:33:35.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.478+0000 7f1f71ded700 1 -- 192.168.123.107:0/266853803 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1f6c10c540 msgr2=0x7f1f6c10c910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:35.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.478+0000 7f1f71ded700 1 --2- 192.168.123.107:0/266853803 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1f6c10c540 0x7f1f6c10c910 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f1f5c007780 tx=0x7f1f5c00c050 comp rx=0 tx=0).stop 2026-03-10T12:33:35.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.479+0000 7f1f71ded700 1 -- 192.168.123.107:0/266853803 shutdown_connections 2026-03-10T12:33:35.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.479+0000 7f1f71ded700 1 --2- 192.168.123.107:0/266853803 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1f6c10c540 0x7f1f6c10c910 unknown :-1 s=CLOSED pgs=159 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:35.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.479+0000 7f1f71ded700 1 -- 192.168.123.107:0/266853803 >> 192.168.123.107:0/266853803 conn(0x7f1f6c06b290 msgr2=0x7f1f6c06b690 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T12:33:35.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.479+0000 7f1f71ded700 1 -- 192.168.123.107:0/266853803 shutdown_connections 2026-03-10T12:33:35.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.479+0000 7f1f71ded700 1 -- 192.168.123.107:0/266853803 wait complete. 2026-03-10T12:33:35.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.480+0000 7f1f71ded700 1 Processor -- start 2026-03-10T12:33:35.480 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.480+0000 7f1f71ded700 1 -- start start 2026-03-10T12:33:35.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.480+0000 7f1f71ded700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1f6c0837a0 0x7f1f6c07c7c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:35.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.480+0000 7f1f71ded700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1f5c003680 con 0x7f1f6c0837a0 2026-03-10T12:33:35.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.480+0000 7f1f70deb700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1f6c0837a0 0x7f1f6c07c7c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:35.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.480+0000 7f1f70deb700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1f6c0837a0 0x7f1f6c07c7c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:57898/0 (socket says 192.168.123.107:57898) 2026-03-10T12:33:35.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.480+0000 7f1f70deb700 1 -- 
192.168.123.107:0/512672429 learned_addr learned my addr 192.168.123.107:0/512672429 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:35.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.480+0000 7f1f70deb700 1 -- 192.168.123.107:0/512672429 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1f5c007430 con 0x7f1f6c0837a0 2026-03-10T12:33:35.481 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.481+0000 7f1f70deb700 1 --2- 192.168.123.107:0/512672429 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1f6c0837a0 0x7f1f6c07c7c0 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f1f5c00a010 tx=0x7f1f5c00c7b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:35.483 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.483+0000 7f1f69ffb700 1 -- 192.168.123.107:0/512672429 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1f5c00f050 con 0x7f1f6c0837a0 2026-03-10T12:33:35.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.483+0000 7f1f69ffb700 1 -- 192.168.123.107:0/512672429 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1f5c00cb60 con 0x7f1f6c0837a0 2026-03-10T12:33:35.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.483+0000 7f1f69ffb700 1 -- 192.168.123.107:0/512672429 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1f5c008370 con 0x7f1f6c0837a0 2026-03-10T12:33:35.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.483+0000 7f1f71ded700 1 -- 192.168.123.107:0/512672429 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1f6c083c30 con 0x7f1f6c0837a0 2026-03-10T12:33:35.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.483+0000 7f1f71ded700 1 -- 
192.168.123.107:0/512672429 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1f6c07ced0 con 0x7f1f6c0837a0 2026-03-10T12:33:35.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.483+0000 7f1f537fe700 1 -- 192.168.123.107:0/512672429 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1f4c0052f0 con 0x7f1f6c0837a0 2026-03-10T12:33:35.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.484+0000 7f1f69ffb700 1 -- 192.168.123.107:0/512672429 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 17) v1 ==== 45198+0+0 (secure 0 0 0) 0x7f1f5c01a040 con 0x7f1f6c0837a0 2026-03-10T12:33:35.484 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.484+0000 7f1f69ffb700 1 --2- 192.168.123.107:0/512672429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1f54038670 0x7f1f5403ab20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:35.485 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.484+0000 7f1f69ffb700 1 -- 192.168.123.107:0/512672429 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f1f5c02a030 con 0x7f1f6c0837a0 2026-03-10T12:33:35.487 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.487+0000 7f1f6bfff700 1 --2- 192.168.123.107:0/512672429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1f54038670 0x7f1f5403ab20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:35.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.487+0000 7f1f69ffb700 1 -- 192.168.123.107:0/512672429 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f1f5c02e3c0 con 0x7f1f6c0837a0 
2026-03-10T12:33:35.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.492+0000 7f1f6bfff700 1 --2- 192.168.123.107:0/512672429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1f54038670 0x7f1f5403ab20 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f1f6400ad30 tx=0x7f1f640093f0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:35.640 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.639+0000 7f1f537fe700 1 -- 192.168.123.107:0/512672429 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f1f4c005160 con 0x7f1f6c0837a0 2026-03-10T12:33:35.644 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:35.644 INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":1,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:32:16.576749Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T12:33:35.644 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.642+0000 7f1f69ffb700 1 -- 192.168.123.107:0/512672429 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f1f5c020070 con 0x7f1f6c0837a0 2026-03-10T12:33:35.646 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.645+0000 7f1f537fe700 1 -- 192.168.123.107:0/512672429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1f54038670 msgr2=0x7f1f5403ab20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:35.646 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.645+0000 7f1f537fe700 1 --2- 192.168.123.107:0/512672429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1f54038670 0x7f1f5403ab20 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f1f6400ad30 tx=0x7f1f640093f0 comp rx=0 tx=0).stop 2026-03-10T12:33:35.646 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.645+0000 7f1f537fe700 1 -- 192.168.123.107:0/512672429 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1f6c0837a0 msgr2=0x7f1f6c07c7c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:35.646 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.645+0000 7f1f537fe700 1 --2- 192.168.123.107:0/512672429 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1f6c0837a0 0x7f1f6c07c7c0 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f1f5c00a010 tx=0x7f1f5c00c7b0 comp rx=0 tx=0).stop 2026-03-10T12:33:35.646 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.646+0000 7f1f537fe700 1 -- 192.168.123.107:0/512672429 shutdown_connections 2026-03-10T12:33:35.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.646+0000 7f1f537fe700 1 --2- 192.168.123.107:0/512672429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1f54038670 0x7f1f5403ab20 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:35.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.646+0000 7f1f537fe700 1 --2- 192.168.123.107:0/512672429 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1f6c0837a0 0x7f1f6c07c7c0 unknown :-1 s=CLOSED pgs=160 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:35.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.646+0000 7f1f537fe700 1 -- 192.168.123.107:0/512672429 >> 192.168.123.107:0/512672429 conn(0x7f1f6c06b290 msgr2=0x7f1f6c073780 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:35.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.646+0000 7f1f537fe700 1 -- 192.168.123.107:0/512672429 shutdown_connections 2026-03-10T12:33:35.647 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:35.646+0000 7f1f537fe700 1 -- 192.168.123.107:0/512672429 wait complete. 2026-03-10T12:33:35.648 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 1 2026-03-10T12:33:36.763 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T12:33:36.763 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mon dump -f json 2026-03-10T12:33:36.952 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm07/config 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.669+0000 7f2a7bfff700 1 -- 192.168.123.107:0/2488186833 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c071980 msgr2=0x7f2a64005610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.669+0000 7f2a7bfff700 1 --2- 192.168.123.107:0/2488186833 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c071980 0x7f2a64005610 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f2a6c005fa0 tx=0x7f2a6c00ff70 comp rx=0 tx=0).stop 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.670+0000 7f2a7bfff700 1 -- 192.168.123.107:0/2488186833 
shutdown_connections 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.670+0000 7f2a7bfff700 1 --2- 192.168.123.107:0/2488186833 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c071980 0x7f2a64005610 unknown :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.670+0000 7f2a7bfff700 1 -- 192.168.123.107:0/2488186833 >> 192.168.123.107:0/2488186833 conn(0x7f2a7c06b290 msgr2=0x7f2a7c06b690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.671+0000 7f2a7bfff700 1 -- 192.168.123.107:0/2488186833 shutdown_connections 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.671+0000 7f2a7bfff700 1 -- 192.168.123.107:0/2488186833 wait complete. 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.671+0000 7f2a7bfff700 1 Processor -- start 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.671+0000 7f2a7bfff700 1 -- start start 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.672+0000 7f2a7bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c071980 0x7f2a7c1a45d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.672+0000 7f2a7bfff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a7c1a4b10 0x7f2a7c1a8d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.672+0000 7f2a7bfff700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a7c1a92d0 con 0x7f2a7c071980 
2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.672+0000 7f2a7bfff700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a7c1a9440 con 0x7f2a7c1a4b10 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.672+0000 7f2a7a7fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a7c1a4b10 0x7f2a7c1a8d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.672+0000 7f2a7a7fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a7c1a4b10 0x7f2a7c1a8d90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:37690/0 (socket says 192.168.123.107:37690) 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.672+0000 7f2a7a7fc700 1 -- 192.168.123.107:0/2831985032 learned_addr learned my addr 192.168.123.107:0/2831985032 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.672+0000 7f2a7a7fc700 1 -- 192.168.123.107:0/2831985032 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a7c1a4b10 msgr2=0x7f2a7c1a8d90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 12 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.672+0000 7f2a7a7fc700 1 -- 192.168.123.107:0/2831985032 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a7c1a4b10 msgr2=0x7f2a7c1a8d90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.672+0000 7f2a7a7fc700 1 --2- 
192.168.123.107:0/2831985032 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a7c1a4b10 0x7f2a7c1a8d90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.672+0000 7f2a7a7fc700 1 --2- 192.168.123.107:0/2831985032 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a7c1a4b10 0x7f2a7c1a8d90 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.672+0000 7f2a7affd700 1 --2- 192.168.123.107:0/2831985032 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c071980 0x7f2a7c1a45d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.673+0000 7f2a7affd700 1 -- 192.168.123.107:0/2831985032 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a7c1a4b10 msgr2=0x7f2a7c1a8d90 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.673+0000 7f2a7affd700 1 --2- 192.168.123.107:0/2831985032 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a7c1a4b10 0x7f2a7c1a8d90 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.673+0000 7f2a7affd700 1 -- 192.168.123.107:0/2831985032 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2a6c00f970 con 0x7f2a7c071980 2026-03-10T12:33:41.673 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.673+0000 7f2a7affd700 1 --2- 
192.168.123.107:0/2831985032 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c071980 0x7f2a7c1a45d0 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f2a6c0046c0 tx=0x7f2a6c0057a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:41.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.674+0000 7f2a8088a700 1 -- 192.168.123.107:0/2831985032 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2a6c012df0 con 0x7f2a7c071980 2026-03-10T12:33:41.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.674+0000 7f2a7bfff700 1 -- 192.168.123.107:0/2831985032 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2a7c1a9580 con 0x7f2a7c071980 2026-03-10T12:33:41.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.674+0000 7f2a7bfff700 1 -- 192.168.123.107:0/2831985032 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2a7c1a9ad0 con 0x7f2a7c071980 2026-03-10T12:33:41.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.674+0000 7f2a8088a700 1 -- 192.168.123.107:0/2831985032 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2a6c005050 con 0x7f2a7c071980 2026-03-10T12:33:41.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.674+0000 7f2a8088a700 1 -- 192.168.123.107:0/2831985032 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2a6c004ba0 con 0x7f2a7c071980 2026-03-10T12:33:41.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.676+0000 7f2a8088a700 1 -- 192.168.123.107:0/2831985032 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f2a6c024070 con 0x7f2a7c071980 2026-03-10T12:33:41.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.676+0000 
7f2a8088a700 1 --2- 192.168.123.107:0/2831985032 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2a6806c820 0x7f2a6806ecd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:41.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.676+0000 7f2a7a7fc700 1 --2- 192.168.123.107:0/2831985032 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2a6806c820 0x7f2a6806ecd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:41.677 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.676+0000 7f2a8088a700 1 -- 192.168.123.107:0/2831985032 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f2a6c08e6d0 con 0x7f2a7c071980 2026-03-10T12:33:41.678 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.677+0000 7f2a7a7fc700 1 --2- 192.168.123.107:0/2831985032 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2a6806c820 0x7f2a6806ecd0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f2a74009400 tx=0x7f2a74007040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:41.678 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.677+0000 7f2a7bfff700 1 -- 192.168.123.107:0/2831985032 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2a64002170 con 0x7f2a7c071980 2026-03-10T12:33:41.680 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.680+0000 7f2a8088a700 1 -- 192.168.123.107:0/2831985032 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f2a6c059fa0 con 0x7f2a7c071980 2026-03-10T12:33:41.851 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:33:41.851 
INFO:teuthology.orchestra.run.vm07.stdout:{"epoch":2,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","modified":"2026-03-10T12:33:35.856080Z","created":"2026-03-10T12:32:16.576749Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm00","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:3300","nonce":0},{"type":"v1","addr":"192.168.123.100:6789","nonce":0}]},"addr":"192.168.123.100:6789/0","public_addr":"192.168.123.100:6789/0","priority":0,"weight":0,"crush_location":"{}"},{"rank":1,"name":"vm07","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:3300","nonce":0},{"type":"v1","addr":"192.168.123.107:6789","nonce":0}]},"addr":"192.168.123.107:6789/0","public_addr":"192.168.123.107:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0,1]} 2026-03-10T12:33:41.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.849+0000 7f2a7bfff700 1 -- 192.168.123.107:0/2831985032 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f2a64001fe0 con 0x7f2a7c071980 2026-03-10T12:33:41.852 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.849+0000 7f2a8088a700 1 -- 192.168.123.107:0/2831985032 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 2 v2) v1 ==== 95+0+1032 (secure 0 0 0) 0x7f2a6c02b020 con 0x7f2a7c071980 2026-03-10T12:33:41.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.853+0000 7f2a627fc700 1 -- 192.168.123.107:0/2831985032 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2a6806c820 msgr2=0x7f2a6806ecd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T12:33:41.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.853+0000 7f2a627fc700 1 --2- 192.168.123.107:0/2831985032 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2a6806c820 0x7f2a6806ecd0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f2a74009400 tx=0x7f2a74007040 comp rx=0 tx=0).stop 2026-03-10T12:33:41.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.853+0000 7f2a627fc700 1 -- 192.168.123.107:0/2831985032 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c071980 msgr2=0x7f2a7c1a45d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:41.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.853+0000 7f2a627fc700 1 --2- 192.168.123.107:0/2831985032 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c071980 0x7f2a7c1a45d0 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f2a6c0046c0 tx=0x7f2a6c0057a0 comp rx=0 tx=0).stop 2026-03-10T12:33:41.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.854+0000 7f2a627fc700 1 -- 192.168.123.107:0/2831985032 shutdown_connections 2026-03-10T12:33:41.854 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.854+0000 7f2a627fc700 1 --2- 192.168.123.107:0/2831985032 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f2a6806c820 0x7f2a6806ecd0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:41.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.854+0000 7f2a627fc700 1 --2- 192.168.123.107:0/2831985032 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c071980 0x7f2a7c1a45d0 secure :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f2a6c0046c0 tx=0x7f2a6c0057a0 comp rx=0 tx=0).stop 2026-03-10T12:33:41.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.854+0000 7f2a627fc700 1 --2- 192.168.123.107:0/2831985032 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a7c1a4b10 0x7f2a7c1a8d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:41.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.854+0000 7f2a627fc700 1 -- 192.168.123.107:0/2831985032 >> 192.168.123.107:0/2831985032 conn(0x7f2a7c06b290 msgr2=0x7f2a7c06f9c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:41.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.855+0000 7f2a627fc700 1 -- 192.168.123.107:0/2831985032 shutdown_connections 2026-03-10T12:33:41.855 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:33:41.855+0000 7f2a627fc700 1 -- 192.168.123.107:0/2831985032 wait complete. 2026-03-10T12:33:41.856 INFO:teuthology.orchestra.run.vm07.stderr:dumped monmap epoch 2 2026-03-10T12:33:41.912 INFO:tasks.cephadm:Generating final ceph.conf file... 2026-03-10T12:33:41.912 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph config generate-minimal-conf 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: mon.vm00 calling monitor election 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: 
dispatch 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: mon.vm07 calling monitor election 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: from='mgr.? 
192.168.123.107:0/3648174574' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.kfawlb/crt"}]: dispatch 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: mon.vm00 is new leader, mons vm00,vm07 in quorum (ranks 0,1) 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: monmap e2: 2 mons at {vm00=[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0],vm07=[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0]} removed_ranks: {} 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: fsmap 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: osdmap e5: 0 total, 0 up, 0 in 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: mgrmap e17: vm00.nescmq(active, since 18s) 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: Standby manager daemon vm07.kfawlb started 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: from='mgr.? 
192.168.123.107:0/3648174574' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: overall HEALTH_OK 2026-03-10T12:33:41.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: from='mgr.? 192.168.123.107:0/3648174574' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.kfawlb/key"}]: dispatch 2026-03-10T12:33:41.938 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: from='mgr.? 192.168.123.107:0/3648174574' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T12:33:41.938 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:41 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:42.078 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:33:42.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.647+0000 7fa5cd1f1700 1 -- 192.168.123.100:0/2506832358 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa5c806da10 msgr2=0x7fa5c8107d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:42.651 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.647+0000 7fa5cd1f1700 1 --2- 192.168.123.100:0/2506832358 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa5c806da10 0x7fa5c8107d50 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7fa5b8009940 tx=0x7fa5b8009c50 comp rx=0 tx=0).stop 2026-03-10T12:33:42.651 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.648+0000 7fa5cd1f1700 1 -- 192.168.123.100:0/2506832358 shutdown_connections 2026-03-10T12:33:42.651 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.648+0000 7fa5cd1f1700 1 --2- 
192.168.123.100:0/2506832358 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa5c806da10 0x7fa5c8107d50 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:42.652 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.648+0000 7fa5cd1f1700 1 -- 192.168.123.100:0/2506832358 >> 192.168.123.100:0/2506832358 conn(0x7fa5c806c5e0 msgr2=0x7fa5c806c9e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:42.652 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.650+0000 7fa5cd1f1700 1 -- 192.168.123.100:0/2506832358 shutdown_connections 2026-03-10T12:33:42.652 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.650+0000 7fa5cd1f1700 1 -- 192.168.123.100:0/2506832358 wait complete. 2026-03-10T12:33:42.652 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.650+0000 7fa5cd1f1700 1 Processor -- start 2026-03-10T12:33:42.653 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5cd1f1700 1 -- start start 2026-03-10T12:33:42.653 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5cd1f1700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa5c806da10 0x7fa5c807c830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:42.653 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5cd1f1700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa5c807cd70 0x7fa5c8077720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:42.653 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5cd1f1700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa5c8077cf0 con 0x7fa5c806da10 2026-03-10T12:33:42.653 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5cd1f1700 1 -- --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa5c8077e60 con 0x7fa5c807cd70 2026-03-10T12:33:42.653 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5c6d9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa5c806da10 0x7fa5c807c830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:42.653 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5c6d9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa5c806da10 0x7fa5c807c830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:53338/0 (socket says 192.168.123.100:53338) 2026-03-10T12:33:42.653 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5c6d9d700 1 -- 192.168.123.100:0/475549634 learned_addr learned my addr 192.168.123.100:0/475549634 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:33:42.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5c659c700 1 --2- 192.168.123.100:0/475549634 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa5c807cd70 0x7fa5c8077720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:42.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5c659c700 1 -- 192.168.123.100:0/475549634 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa5c807cd70 msgr2=0x7fa5c8077720 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 12 2026-03-10T12:33:42.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5c659c700 1 -- 192.168.123.100:0/475549634 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa5c807cd70 msgr2=0x7fa5c8077720 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-10T12:33:42.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5c659c700 1 --2- 192.168.123.100:0/475549634 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa5c807cd70 0x7fa5c8077720 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-10T12:33:42.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5c659c700 1 --2- 192.168.123.100:0/475549634 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa5c807cd70 0x7fa5c8077720 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T12:33:42.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5c6d9d700 1 -- 192.168.123.100:0/475549634 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa5c807cd70 msgr2=0x7fa5c8077720 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:33:42.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5c6d9d700 1 --2- 192.168.123.100:0/475549634 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa5c807cd70 0x7fa5c8077720 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:42.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.651+0000 7fa5c6d9d700 1 -- 192.168.123.100:0/475549634 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa5b80096b0 con 0x7fa5c806da10 2026-03-10T12:33:42.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.652+0000 7fa5c6d9d700 1 --2- 192.168.123.100:0/475549634 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa5c806da10 
0x7fa5c807c830 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7fa5b8005fd0 tx=0x7fa5b8004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:42.659 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.654+0000 7fa5affff700 1 -- 192.168.123.100:0/475549634 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa5b801c070 con 0x7fa5c806da10 2026-03-10T12:33:42.659 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.654+0000 7fa5cd1f1700 1 -- 192.168.123.100:0/475549634 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa5c80780e0 con 0x7fa5c806da10 2026-03-10T12:33:42.659 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.654+0000 7fa5cd1f1700 1 -- 192.168.123.100:0/475549634 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa5c80823b0 con 0x7fa5c806da10 2026-03-10T12:33:42.659 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.656+0000 7fa5affff700 1 -- 192.168.123.100:0/475549634 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa5b800cc50 con 0x7fa5c806da10 2026-03-10T12:33:42.659 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.656+0000 7fa5affff700 1 -- 192.168.123.100:0/475549634 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa5b800eb50 con 0x7fa5c806da10 2026-03-10T12:33:42.659 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.656+0000 7fa5cd1f1700 1 -- 192.168.123.100:0/475549634 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa5c80785d0 con 0x7fa5c806da10 2026-03-10T12:33:42.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.662+0000 7fa5affff700 1 -- 192.168.123.100:0/475549634 <== mon.0 v2:192.168.123.100:3300/0 4 
==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7fa5b8016620 con 0x7fa5c806da10 2026-03-10T12:33:42.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.663+0000 7fa5affff700 1 --2- 192.168.123.100:0/475549634 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa5b006c830 0x7fa5b006ece0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:42.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.664+0000 7fa5affff700 1 -- 192.168.123.100:0/475549634 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fa5b8012070 con 0x7fa5c806da10 2026-03-10T12:33:42.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.664+0000 7fa5affff700 1 -- 192.168.123.100:0/475549634 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa5b808c2b0 con 0x7fa5c806da10 2026-03-10T12:33:42.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.664+0000 7fa5c659c700 1 --2- 192.168.123.100:0/475549634 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa5b006c830 0x7fa5b006ece0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:42.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.664+0000 7fa5c659c700 1 --2- 192.168.123.100:0/475549634 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa5b006c830 0x7fa5b006ece0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fa5c00070b0 tx=0x7fa5c0007040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:42.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:42 vm00 ceph-mon[50686]: mgrmap e18: vm00.nescmq(active, since 18s), standbys: vm07.kfawlb 2026-03-10T12:33:42.734 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:42 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr metadata", "who": "vm07.kfawlb", "id": "vm07.kfawlb"}]: dispatch 2026-03-10T12:33:42.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:42 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/2831985032' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T12:33:42.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:42 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:33:42.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:42 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:42.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:42 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:42.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:42 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:33:42.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:42 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:33:42.796 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.795+0000 7fa5cd1f1700 1 -- 192.168.123.100:0/475549634 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7fa5c80623e0 con 0x7fa5c806da10 2026-03-10T12:33:42.798 INFO:teuthology.orchestra.run.vm00.stdout:# minimal ceph.conf for 1a52002a-1c7d-11f1-af82-51cdd81caea8 2026-03-10T12:33:42.798 INFO:teuthology.orchestra.run.vm00.stdout:[global] 2026-03-10T12:33:42.798 
INFO:teuthology.orchestra.run.vm00.stdout: fsid = 1a52002a-1c7d-11f1-af82-51cdd81caea8 2026-03-10T12:33:42.798 INFO:teuthology.orchestra.run.vm00.stdout: mon_host = [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 2026-03-10T12:33:42.798 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.797+0000 7fa5affff700 1 -- 192.168.123.100:0/475549634 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v10) v1 ==== 76+0+235 (secure 0 0 0) 0x7fa5b8025100 con 0x7fa5c806da10 2026-03-10T12:33:42.799 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.799+0000 7fa5cd1f1700 1 -- 192.168.123.100:0/475549634 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa5b006c830 msgr2=0x7fa5b006ece0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:42.799 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.799+0000 7fa5cd1f1700 1 --2- 192.168.123.100:0/475549634 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa5b006c830 0x7fa5b006ece0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fa5c00070b0 tx=0x7fa5c0007040 comp rx=0 tx=0).stop 2026-03-10T12:33:42.799 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.799+0000 7fa5cd1f1700 1 -- 192.168.123.100:0/475549634 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa5c806da10 msgr2=0x7fa5c807c830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:33:42.799 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.799+0000 7fa5cd1f1700 1 --2- 192.168.123.100:0/475549634 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa5c806da10 0x7fa5c807c830 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7fa5b8005fd0 tx=0x7fa5b8004970 comp rx=0 tx=0).stop 2026-03-10T12:33:42.799 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.799+0000 7fa5cd1f1700 1 -- 
192.168.123.100:0/475549634 shutdown_connections 2026-03-10T12:33:42.799 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.799+0000 7fa5cd1f1700 1 --2- 192.168.123.100:0/475549634 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa5b006c830 0x7fa5b006ece0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:42.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.799+0000 7fa5cd1f1700 1 --2- 192.168.123.100:0/475549634 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa5c806da10 0x7fa5c807c830 unknown :-1 s=CLOSED pgs=166 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:42.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.799+0000 7fa5cd1f1700 1 --2- 192.168.123.100:0/475549634 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa5c807cd70 0x7fa5c8077720 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:33:42.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.799+0000 7fa5cd1f1700 1 -- 192.168.123.100:0/475549634 >> 192.168.123.100:0/475549634 conn(0x7fa5c806c5e0 msgr2=0x7fa5c8070c50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:33:42.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.799+0000 7fa5cd1f1700 1 -- 192.168.123.100:0/475549634 shutdown_connections 2026-03-10T12:33:42.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:42.799+0000 7fa5cd1f1700 1 -- 192.168.123.100:0/475549634 wait complete. 2026-03-10T12:33:42.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:42 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:33:42.854 INFO:tasks.cephadm:Distributing (final) config and client.admin keyring... 
2026-03-10T12:33:42.854 DEBUG:teuthology.orchestra.run.vm00:> set -ex 2026-03-10T12:33:42.854 DEBUG:teuthology.orchestra.run.vm00:> sudo dd of=/etc/ceph/ceph.conf 2026-03-10T12:33:42.931 DEBUG:teuthology.orchestra.run.vm00:> set -ex 2026-03-10T12:33:42.931 DEBUG:teuthology.orchestra.run.vm00:> sudo dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:33:43.003 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-10T12:33:43.004 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/etc/ceph/ceph.conf 2026-03-10T12:33:43.033 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-10T12:33:43.034 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:33:43.106 INFO:tasks.cephadm:Deploying OSDs... 2026-03-10T12:33:43.106 DEBUG:teuthology.orchestra.run.vm00:> set -ex 2026-03-10T12:33:43.106 DEBUG:teuthology.orchestra.run.vm00:> dd if=/scratch_devs of=/dev/stdout 2026-03-10T12:33:43.126 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T12:33:43.126 DEBUG:teuthology.orchestra.run.vm00:> ls /dev/[sv]d? 
2026-03-10T12:33:43.183 INFO:teuthology.orchestra.run.vm00.stdout:/dev/vda 2026-03-10T12:33:43.183 INFO:teuthology.orchestra.run.vm00.stdout:/dev/vdb 2026-03-10T12:33:43.183 INFO:teuthology.orchestra.run.vm00.stdout:/dev/vdc 2026-03-10T12:33:43.183 INFO:teuthology.orchestra.run.vm00.stdout:/dev/vdd 2026-03-10T12:33:43.183 INFO:teuthology.orchestra.run.vm00.stdout:/dev/vde 2026-03-10T12:33:43.183 WARNING:teuthology.misc:Removing root device: /dev/vda from device list 2026-03-10T12:33:43.183 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde'] 2026-03-10T12:33:43.183 DEBUG:teuthology.orchestra.run.vm00:> stat /dev/vdb 2026-03-10T12:33:43.242 INFO:teuthology.orchestra.run.vm00.stdout: File: /dev/vdb 2026-03-10T12:33:43.242 INFO:teuthology.orchestra.run.vm00.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-10T12:33:43.242 INFO:teuthology.orchestra.run.vm00.stdout:Device: 6h/6d Inode: 249 Links: 1 Device type: fc,10 2026-03-10T12:33:43.242 INFO:teuthology.orchestra.run.vm00.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-10T12:33:43.242 INFO:teuthology.orchestra.run.vm00.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-10T12:33:43.242 INFO:teuthology.orchestra.run.vm00.stdout:Access: 2026-03-10 12:32:51.156876640 +0000 2026-03-10T12:33:43.242 INFO:teuthology.orchestra.run.vm00.stdout:Modify: 2026-03-10 12:32:51.046876466 +0000 2026-03-10T12:33:43.242 INFO:teuthology.orchestra.run.vm00.stdout:Change: 2026-03-10 12:32:51.046876466 +0000 2026-03-10T12:33:43.242 INFO:teuthology.orchestra.run.vm00.stdout: Birth: 2026-03-10 12:28:05.214000000 +0000 2026-03-10T12:33:43.242 DEBUG:teuthology.orchestra.run.vm00:> sudo dd if=/dev/vdb of=/dev/null count=1 2026-03-10T12:33:43.309 INFO:teuthology.orchestra.run.vm00.stderr:1+0 records in 2026-03-10T12:33:43.309 INFO:teuthology.orchestra.run.vm00.stderr:1+0 records out 2026-03-10T12:33:43.309 INFO:teuthology.orchestra.run.vm00.stderr:512 bytes copied, 
0.000171011 s, 3.0 MB/s 2026-03-10T12:33:43.311 DEBUG:teuthology.orchestra.run.vm00:> ! mount | grep -v devtmpfs | grep -q /dev/vdb 2026-03-10T12:33:43.373 DEBUG:teuthology.orchestra.run.vm00:> stat /dev/vdc 2026-03-10T12:33:43.434 INFO:teuthology.orchestra.run.vm00.stdout: File: /dev/vdc 2026-03-10T12:33:43.434 INFO:teuthology.orchestra.run.vm00.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-10T12:33:43.434 INFO:teuthology.orchestra.run.vm00.stdout:Device: 6h/6d Inode: 255 Links: 1 Device type: fc,20 2026-03-10T12:33:43.434 INFO:teuthology.orchestra.run.vm00.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-10T12:33:43.434 INFO:teuthology.orchestra.run.vm00.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-10T12:33:43.434 INFO:teuthology.orchestra.run.vm00.stdout:Access: 2026-03-10 12:32:51.214876733 +0000 2026-03-10T12:33:43.434 INFO:teuthology.orchestra.run.vm00.stdout:Modify: 2026-03-10 12:32:51.041876458 +0000 2026-03-10T12:33:43.435 INFO:teuthology.orchestra.run.vm00.stdout:Change: 2026-03-10 12:32:51.041876458 +0000 2026-03-10T12:33:43.435 INFO:teuthology.orchestra.run.vm00.stdout: Birth: 2026-03-10 12:28:05.218000000 +0000 2026-03-10T12:33:43.435 DEBUG:teuthology.orchestra.run.vm00:> sudo dd if=/dev/vdc of=/dev/null count=1 2026-03-10T12:33:43.501 INFO:teuthology.orchestra.run.vm00.stderr:1+0 records in 2026-03-10T12:33:43.501 INFO:teuthology.orchestra.run.vm00.stderr:1+0 records out 2026-03-10T12:33:43.501 INFO:teuthology.orchestra.run.vm00.stderr:512 bytes copied, 0.000169317 s, 3.0 MB/s 2026-03-10T12:33:43.503 DEBUG:teuthology.orchestra.run.vm00:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdc 2026-03-10T12:33:43.574 DEBUG:teuthology.orchestra.run.vm00:> stat /dev/vdd 2026-03-10T12:33:43.582 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: Updating vm00:/etc/ceph/ceph.conf 2026-03-10T12:33:43.582 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: Updating vm07:/etc/ceph/ceph.conf 2026-03-10T12:33:43.582 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/475549634' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:33:43.582 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:33:43.582 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:33:43.583 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T12:33:43.583 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:43.583 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:43.583 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:43.583 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:43.583 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 
2026-03-10T12:33:43.583 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: Reconfiguring mon.vm00 (unknown last config time)... 2026-03-10T12:33:43.583 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T12:33:43.583 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T12:33:43.583 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:33:43.583 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:43 vm00 ceph-mon[50686]: Reconfiguring daemon mon.vm00 on vm00 2026-03-10T12:33:43.623 INFO:teuthology.orchestra.run.vm00.stdout: File: /dev/vdd 2026-03-10T12:33:43.623 INFO:teuthology.orchestra.run.vm00.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-10T12:33:43.623 INFO:teuthology.orchestra.run.vm00.stdout:Device: 6h/6d Inode: 256 Links: 1 Device type: fc,30 2026-03-10T12:33:43.623 INFO:teuthology.orchestra.run.vm00.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-10T12:33:43.623 INFO:teuthology.orchestra.run.vm00.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-10T12:33:43.623 INFO:teuthology.orchestra.run.vm00.stdout:Access: 2026-03-10 12:32:51.293876858 +0000 2026-03-10T12:33:43.623 INFO:teuthology.orchestra.run.vm00.stdout:Modify: 2026-03-10 12:32:51.040876457 +0000 2026-03-10T12:33:43.623 INFO:teuthology.orchestra.run.vm00.stdout:Change: 2026-03-10 12:32:51.040876457 +0000 2026-03-10T12:33:43.623 INFO:teuthology.orchestra.run.vm00.stdout: Birth: 2026-03-10 12:28:05.227000000 +0000 
2026-03-10T12:33:43.623 DEBUG:teuthology.orchestra.run.vm00:> sudo dd if=/dev/vdd of=/dev/null count=1
2026-03-10T12:33:43.703 INFO:teuthology.orchestra.run.vm00.stderr:1+0 records in
2026-03-10T12:33:43.703 INFO:teuthology.orchestra.run.vm00.stderr:1+0 records out
2026-03-10T12:33:43.703 INFO:teuthology.orchestra.run.vm00.stderr:512 bytes copied, 0.000185669 s, 2.8 MB/s
2026-03-10T12:33:43.704 DEBUG:teuthology.orchestra.run.vm00:> ! mount | grep -v devtmpfs | grep -q /dev/vdd
2026-03-10T12:33:43.768 DEBUG:teuthology.orchestra.run.vm00:> stat /dev/vde
2026-03-10T12:33:43.828 INFO:teuthology.orchestra.run.vm00.stdout: File: /dev/vde
2026-03-10T12:33:43.828 INFO:teuthology.orchestra.run.vm00.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T12:33:43.828 INFO:teuthology.orchestra.run.vm00.stdout:Device: 6h/6d Inode: 257 Links: 1 Device type: fc,40
2026-03-10T12:33:43.828 INFO:teuthology.orchestra.run.vm00.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T12:33:43.828 INFO:teuthology.orchestra.run.vm00.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T12:33:43.828 INFO:teuthology.orchestra.run.vm00.stdout:Access: 2026-03-10 12:32:51.352876951 +0000
2026-03-10T12:33:43.828 INFO:teuthology.orchestra.run.vm00.stdout:Modify: 2026-03-10 12:32:51.040876457 +0000
2026-03-10T12:33:43.828 INFO:teuthology.orchestra.run.vm00.stdout:Change: 2026-03-10 12:32:51.040876457 +0000
2026-03-10T12:33:43.828 INFO:teuthology.orchestra.run.vm00.stdout: Birth: 2026-03-10 12:28:05.271000000 +0000
2026-03-10T12:33:43.828 DEBUG:teuthology.orchestra.run.vm00:> sudo dd if=/dev/vde of=/dev/null count=1
2026-03-10T12:33:43.894 INFO:teuthology.orchestra.run.vm00.stderr:1+0 records in
2026-03-10T12:33:43.894 INFO:teuthology.orchestra.run.vm00.stderr:1+0 records out
2026-03-10T12:33:43.894 INFO:teuthology.orchestra.run.vm00.stderr:512 bytes copied, 0.000115386 s, 4.4 MB/s
2026-03-10T12:33:43.895 DEBUG:teuthology.orchestra.run.vm00:> ! mount | grep -v devtmpfs | grep -q /dev/vde
2026-03-10T12:33:43.978 DEBUG:teuthology.orchestra.run.vm07:> set -ex
2026-03-10T12:33:43.978 DEBUG:teuthology.orchestra.run.vm07:> dd if=/scratch_devs of=/dev/stdout
2026-03-10T12:33:43.995 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T12:33:43.995 DEBUG:teuthology.orchestra.run.vm07:> ls /dev/[sv]d?
2026-03-10T12:33:44.060 INFO:teuthology.orchestra.run.vm07.stdout:/dev/vda
2026-03-10T12:33:44.060 INFO:teuthology.orchestra.run.vm07.stdout:/dev/vdb
2026-03-10T12:33:44.060 INFO:teuthology.orchestra.run.vm07.stdout:/dev/vdc
2026-03-10T12:33:44.060 INFO:teuthology.orchestra.run.vm07.stdout:/dev/vdd
2026-03-10T12:33:44.060 INFO:teuthology.orchestra.run.vm07.stdout:/dev/vde
2026-03-10T12:33:44.060 WARNING:teuthology.misc:Removing root device: /dev/vda from device list
2026-03-10T12:33:44.060 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde']
2026-03-10T12:33:44.060 DEBUG:teuthology.orchestra.run.vm07:> stat /dev/vdb
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: Updating vm00:/etc/ceph/ceph.conf
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: Updating vm07:/etc/ceph/ceph.conf
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/475549634' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: Reconfiguring mon.vm00 (unknown last config time)...
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:43 vm07 ceph-mon[58582]: Reconfiguring daemon mon.vm00 on vm00
2026-03-10T12:33:44.084 INFO:teuthology.orchestra.run.vm07.stdout: File: /dev/vdb
2026-03-10T12:33:44.084 INFO:teuthology.orchestra.run.vm07.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T12:33:44.084 INFO:teuthology.orchestra.run.vm07.stdout:Device: 6h/6d Inode: 254 Links: 1 Device type: fc,10
2026-03-10T12:33:44.084 INFO:teuthology.orchestra.run.vm07.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T12:33:44.084 INFO:teuthology.orchestra.run.vm07.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T12:33:44.084 INFO:teuthology.orchestra.run.vm07.stdout:Access: 2026-03-10 12:33:25.432564413 +0000
2026-03-10T12:33:44.084 INFO:teuthology.orchestra.run.vm07.stdout:Modify: 2026-03-10 12:33:25.346564266 +0000
2026-03-10T12:33:44.084 INFO:teuthology.orchestra.run.vm07.stdout:Change: 2026-03-10 12:33:25.346564266 +0000
2026-03-10T12:33:44.084 INFO:teuthology.orchestra.run.vm07.stdout: Birth: 2026-03-10 12:27:40.212000000 +0000
2026-03-10T12:33:44.085 DEBUG:teuthology.orchestra.run.vm07:> sudo dd if=/dev/vdb of=/dev/null count=1
2026-03-10T12:33:44.150 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records in
2026-03-10T12:33:44.151 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records out
2026-03-10T12:33:44.151 INFO:teuthology.orchestra.run.vm07.stderr:512 bytes copied, 0.000220243 s, 2.3 MB/s
2026-03-10T12:33:44.151 DEBUG:teuthology.orchestra.run.vm07:> ! mount | grep -v devtmpfs | grep -q /dev/vdb
2026-03-10T12:33:44.211 DEBUG:teuthology.orchestra.run.vm07:> stat /dev/vdc
2026-03-10T12:33:44.270 INFO:teuthology.orchestra.run.vm07.stdout: File: /dev/vdc
2026-03-10T12:33:44.270 INFO:teuthology.orchestra.run.vm07.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T12:33:44.270 INFO:teuthology.orchestra.run.vm07.stdout:Device: 6h/6d Inode: 255 Links: 1 Device type: fc,20
2026-03-10T12:33:44.270 INFO:teuthology.orchestra.run.vm07.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T12:33:44.270 INFO:teuthology.orchestra.run.vm07.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T12:33:44.270 INFO:teuthology.orchestra.run.vm07.stdout:Access: 2026-03-10 12:33:25.485564504 +0000
2026-03-10T12:33:44.270 INFO:teuthology.orchestra.run.vm07.stdout:Modify: 2026-03-10 12:33:25.344564262 +0000
2026-03-10T12:33:44.270 INFO:teuthology.orchestra.run.vm07.stdout:Change: 2026-03-10 12:33:25.344564262 +0000
2026-03-10T12:33:44.270 INFO:teuthology.orchestra.run.vm07.stdout: Birth: 2026-03-10 12:27:40.215000000 +0000
2026-03-10T12:33:44.270 DEBUG:teuthology.orchestra.run.vm07:> sudo dd if=/dev/vdc of=/dev/null count=1
2026-03-10T12:33:44.336 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records in
2026-03-10T12:33:44.336 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records out
2026-03-10T12:33:44.336 INFO:teuthology.orchestra.run.vm07.stderr:512 bytes copied, 0.000181149 s, 2.8 MB/s
2026-03-10T12:33:44.337 DEBUG:teuthology.orchestra.run.vm07:> ! mount | grep -v devtmpfs | grep -q /dev/vdc
2026-03-10T12:33:44.395 DEBUG:teuthology.orchestra.run.vm07:> stat /dev/vdd
2026-03-10T12:33:44.453 INFO:teuthology.orchestra.run.vm07.stdout: File: /dev/vdd
2026-03-10T12:33:44.453 INFO:teuthology.orchestra.run.vm07.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T12:33:44.453 INFO:teuthology.orchestra.run.vm07.stdout:Device: 6h/6d Inode: 256 Links: 1 Device type: fc,30
2026-03-10T12:33:44.453 INFO:teuthology.orchestra.run.vm07.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T12:33:44.453 INFO:teuthology.orchestra.run.vm07.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T12:33:44.453 INFO:teuthology.orchestra.run.vm07.stdout:Access: 2026-03-10 12:33:25.547564611 +0000
2026-03-10T12:33:44.453 INFO:teuthology.orchestra.run.vm07.stdout:Modify: 2026-03-10 12:33:25.362564293 +0000
2026-03-10T12:33:44.453 INFO:teuthology.orchestra.run.vm07.stdout:Change: 2026-03-10 12:33:25.362564293 +0000
2026-03-10T12:33:44.453 INFO:teuthology.orchestra.run.vm07.stdout: Birth: 2026-03-10 12:27:40.220000000 +0000
2026-03-10T12:33:44.453 DEBUG:teuthology.orchestra.run.vm07:> sudo dd if=/dev/vdd of=/dev/null count=1
2026-03-10T12:33:44.518 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records in
2026-03-10T12:33:44.519 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records out
2026-03-10T12:33:44.519 INFO:teuthology.orchestra.run.vm07.stderr:512 bytes copied, 0.000186179 s, 2.8 MB/s
2026-03-10T12:33:44.520 DEBUG:teuthology.orchestra.run.vm07:> ! mount | grep -v devtmpfs | grep -q /dev/vdd
2026-03-10T12:33:44.581 DEBUG:teuthology.orchestra.run.vm07:> stat /dev/vde
2026-03-10T12:33:44.639 INFO:teuthology.orchestra.run.vm07.stdout: File: /dev/vde
2026-03-10T12:33:44.639 INFO:teuthology.orchestra.run.vm07.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T12:33:44.639 INFO:teuthology.orchestra.run.vm07.stdout:Device: 6h/6d Inode: 257 Links: 1 Device type: fc,40
2026-03-10T12:33:44.639 INFO:teuthology.orchestra.run.vm07.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T12:33:44.639 INFO:teuthology.orchestra.run.vm07.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T12:33:44.639 INFO:teuthology.orchestra.run.vm07.stdout:Access: 2026-03-10 12:33:25.606564712 +0000
2026-03-10T12:33:44.639 INFO:teuthology.orchestra.run.vm07.stdout:Modify: 2026-03-10 12:33:25.367564302 +0000
2026-03-10T12:33:44.639 INFO:teuthology.orchestra.run.vm07.stdout:Change: 2026-03-10 12:33:25.367564302 +0000
2026-03-10T12:33:44.639 INFO:teuthology.orchestra.run.vm07.stdout: Birth: 2026-03-10 12:27:40.226000000 +0000
2026-03-10T12:33:44.640 DEBUG:teuthology.orchestra.run.vm07:> sudo dd if=/dev/vde of=/dev/null count=1
2026-03-10T12:33:44.707 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records in
2026-03-10T12:33:44.707 INFO:teuthology.orchestra.run.vm07.stderr:1+0 records out
2026-03-10T12:33:44.707 INFO:teuthology.orchestra.run.vm07.stderr:512 bytes copied, 0.000173845 s, 2.9 MB/s
2026-03-10T12:33:44.708 DEBUG:teuthology.orchestra.run.vm07:> ! mount | grep -v devtmpfs | grep -q /dev/vde
2026-03-10T12:33:44.767 INFO:tasks.cephadm:Deploying osd.0 on vm00 with /dev/vde...
2026-03-10T12:33:44.767 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- lvm zap /dev/vde
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: Reconfiguring mgr.vm00.nescmq (unknown last config time)...
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm00.nescmq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: Reconfiguring daemon mgr.vm00.nescmq on vm00
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: Reconfiguring ceph-exporter.vm00 (monmap changed)...
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm00", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: Reconfiguring daemon ceph-exporter.vm00 on vm00
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: Reconfiguring crash.vm00 (monmap changed)...
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm00", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:44 vm07 ceph-mon[58582]: Reconfiguring daemon crash.vm00 on vm00
2026-03-10T12:33:44.949 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: Reconfiguring mgr.vm00.nescmq (unknown last config time)...
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm00.nescmq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: Reconfiguring daemon mgr.vm00.nescmq on vm00
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: Reconfiguring ceph-exporter.vm00 (monmap changed)...
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm00", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: Reconfiguring daemon ceph-exporter.vm00 on vm00
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: Reconfiguring crash.vm00 (monmap changed)...
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm00", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:44.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:44 vm00 ceph-mon[50686]: Reconfiguring daemon crash.vm00 on vm00
2026-03-10T12:33:45.483 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:33:45.495 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph orch daemon add osd vm00:/dev/vde
2026-03-10T12:33:45.731 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:33:45.936 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:45 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:45.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:45 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:45.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:45 vm00 ceph-mon[50686]: Reconfiguring alertmanager.vm00 (dependencies changed)...
2026-03-10T12:33:45.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:45 vm00 ceph-mon[50686]: Reconfiguring daemon alertmanager.vm00 on vm00
2026-03-10T12:33:45.937 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:45 vm00 ceph-mon[50686]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T12:33:46.052 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.050+0000 7f02e0dfd700 1 -- 192.168.123.100:0/1677496003 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f02dc071a60 msgr2=0x7f02dc071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.050+0000 7f02e0dfd700 1 --2- 192.168.123.100:0/1677496003 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f02dc071a60 0x7f02dc071e70 secure :-1 s=READY pgs=167 cs=0 l=1 rev1=1 crypto rx=0x7f02cc009b00 tx=0x7f02cc009e10 comp rx=0 tx=0).stop
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.050+0000 7f02e0dfd700 1 -- 192.168.123.100:0/1677496003 shutdown_connections
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.050+0000 7f02e0dfd700 1 --2- 192.168.123.100:0/1677496003 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f02dc072440 0x7f02dc10be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.050+0000 7f02e0dfd700 1 --2- 192.168.123.100:0/1677496003 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f02dc071a60 0x7f02dc071e70 unknown :-1 s=CLOSED pgs=167 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.050+0000 7f02e0dfd700 1 -- 192.168.123.100:0/1677496003 >> 192.168.123.100:0/1677496003 conn(0x7f02dc06d1a0 msgr2=0x7f02dc06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.050+0000 7f02e0dfd700 1 -- 192.168.123.100:0/1677496003 shutdown_connections
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.050+0000 7f02e0dfd700 1 -- 192.168.123.100:0/1677496003 wait complete.
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.051+0000 7f02e0dfd700 1 Processor -- start
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.051+0000 7f02e0dfd700 1 -- start start
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.051+0000 7f02e0dfd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f02dc072440 0x7f02dc116a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.051+0000 7f02e0dfd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f02dc116f90 0x7f02dc1b2780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.051+0000 7f02e0dfd700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f02dc117490 con 0x7f02dc116f90
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.051+0000 7f02e0dfd700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f02dc117600 con 0x7f02dc072440
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.051+0000 7f02daffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f02dc116f90 0x7f02dc1b2780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.051+0000 7f02daffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f02dc116f90 0x7f02dc1b2780 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:53354/0 (socket says 192.168.123.100:53354)
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.051+0000 7f02daffd700 1 -- 192.168.123.100:0/2730830839 learned_addr learned my addr 192.168.123.100:0/2730830839 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.051+0000 7f02db7fe700 1 --2- 192.168.123.100:0/2730830839 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f02dc072440 0x7f02dc116a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.051+0000 7f02daffd700 1 -- 192.168.123.100:0/2730830839 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f02dc072440 msgr2=0x7f02dc116a50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.051+0000 7f02daffd700 1 --2- 192.168.123.100:0/2730830839 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f02dc072440 0x7f02dc116a50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.051+0000 7f02daffd700 1 -- 192.168.123.100:0/2730830839 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f02cc0097e0 con 0x7f02dc116f90
2026-03-10T12:33:46.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.052+0000 7f02daffd700 1 --2- 192.168.123.100:0/2730830839 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f02dc116f90 0x7f02dc1b2780 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7f02d4007f00 tx=0x7f02d400d3b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:33:46.055 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.054+0000 7f02d8ff9700 1 -- 192.168.123.100:0/2730830839 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f02d400dcf0 con 0x7f02dc116f90
2026-03-10T12:33:46.058 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.058+0000 7f02e0dfd700 1 -- 192.168.123.100:0/2730830839 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f02dc1b2d20 con 0x7f02dc116f90
2026-03-10T12:33:46.058 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.058+0000 7f02e0dfd700 1 -- 192.168.123.100:0/2730830839 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f02dc1b3240 con 0x7f02dc116f90
2026-03-10T12:33:46.058 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.058+0000 7f02d8ff9700 1 -- 192.168.123.100:0/2730830839 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f02d400f040 con 0x7f02dc116f90
2026-03-10T12:33:46.058 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.058+0000 7f02d8ff9700 1 -- 192.168.123.100:0/2730830839 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f02d40127c0 con 0x7f02dc116f90
2026-03-10T12:33:46.059 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.058+0000 7f02e0dfd700 1 -- 192.168.123.100:0/2730830839 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f02dc04ea50 con 0x7f02dc116f90
2026-03-10T12:33:46.061 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.060+0000 7f02d8ff9700 1 -- 192.168.123.100:0/2730830839 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f02d4004ad0 con 0x7f02dc116f90
2026-03-10T12:33:46.061 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.060+0000 7f02d8ff9700 1 --2- 192.168.123.100:0/2730830839 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f02c406c650 0x7f02c406eb00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:33:46.061 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.060+0000 7f02d8ff9700 1 -- 192.168.123.100:0/2730830839 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f02d4089e20 con 0x7f02dc116f90
2026-03-10T12:33:46.061 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.061+0000 7f02db7fe700 1 --2- 192.168.123.100:0/2730830839 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f02c406c650 0x7f02c406eb00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:33:46.061 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.061+0000 7f02db7fe700 1 --2- 192.168.123.100:0/2730830839 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f02c406c650 0x7f02c406eb00 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f02cc009ad0 tx=0x7f02cc011040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:33:46.063 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.063+0000 7f02d8ff9700 1 -- 192.168.123.100:0/2730830839 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f02d4059180 con 0x7f02dc116f90
2026-03-10T12:33:46.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:45 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:46.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:45 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:46.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:45 vm07 ceph-mon[58582]: Reconfiguring alertmanager.vm00 (dependencies changed)...
2026-03-10T12:33:46.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:45 vm07 ceph-mon[58582]: Reconfiguring daemon alertmanager.vm00 on vm00
2026-03-10T12:33:46.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:45 vm07 ceph-mon[58582]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T12:33:46.193 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:46.193+0000 7f02e0dfd700 1 -- 192.168.123.100:0/2730830839 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm00:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7f02dc1b3540 con 0x7f02c406c650
2026-03-10T12:33:47.106 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:46 vm00 ceph-mon[50686]: from='client.14264 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm00:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T12:33:47.106 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:46 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-10T12:33:47.106 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:46 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-10T12:33:47.106 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:46 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:47.106 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:46 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:47.106 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:46 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:47.106 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:46 vm00 ceph-mon[50686]: Reconfiguring grafana.vm00 (dependencies changed)...
2026-03-10T12:33:47.106 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:46 vm00 ceph-mon[50686]: Reconfiguring daemon grafana.vm00 on vm00
2026-03-10T12:33:47.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:46 vm07 ceph-mon[58582]: from='client.14264 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm00:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T12:33:47.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:46 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-10T12:33:47.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:46 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-10T12:33:47.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:46 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:47.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:46 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:47.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:46 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:47.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:46 vm07 ceph-mon[58582]: Reconfiguring grafana.vm00 (dependencies changed)...
2026-03-10T12:33:47.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:46 vm07 ceph-mon[58582]: Reconfiguring daemon grafana.vm00 on vm00
2026-03-10T12:33:48.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:48 vm00 ceph-mon[50686]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T12:33:48.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:48 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:48.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:48 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:48.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:48 vm00 ceph-mon[50686]: Reconfiguring prometheus.vm00 (dependencies changed)...
2026-03-10T12:33:48.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:48 vm00 ceph-mon[50686]: Reconfiguring daemon prometheus.vm00 on vm00
2026-03-10T12:33:48.565 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:48 vm07 ceph-mon[58582]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T12:33:48.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:48 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:48.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:48 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:48.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:48 vm07 ceph-mon[58582]: Reconfiguring prometheus.vm00 (dependencies changed)...
2026-03-10T12:33:48.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:48 vm07 ceph-mon[58582]: Reconfiguring daemon prometheus.vm00 on vm00
2026-03-10T12:33:49.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:49 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/1262128534' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0f6cb3f2-3337-4851-ba13-f08c9574062c"}]: dispatch
2026-03-10T12:33:49.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:49 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/1262128534' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "0f6cb3f2-3337-4851-ba13-f08c9574062c"}]': finished
2026-03-10T12:33:49.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:49 vm00 ceph-mon[50686]: osdmap e6: 1 total, 0 up, 1 in
2026-03-10T12:33:49.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:49 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-10T12:33:49.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:49 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/3460040055' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-10T12:33:49.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:49 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/1262128534' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0f6cb3f2-3337-4851-ba13-f08c9574062c"}]: dispatch
2026-03-10T12:33:49.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:49 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/1262128534' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "0f6cb3f2-3337-4851-ba13-f08c9574062c"}]': finished
2026-03-10T12:33:49.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:49 vm07 ceph-mon[58582]: osdmap e6: 1 total, 0 up, 1 in
2026-03-10T12:33:49.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:49 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-10T12:33:49.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:49 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/3460040055' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-10T12:33:50.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:50 vm00 ceph-mon[50686]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T12:33:50.565 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:50 vm07 ceph-mon[58582]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T12:33:52.440 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:52 vm00 ceph-mon[50686]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T12:33:52.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:52 vm07 ceph-mon[58582]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T12:33:53.637 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:53.637 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:53.637 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: Reconfiguring ceph-exporter.vm07 (monmap changed)...
2026-03-10T12:33:53.637 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T12:33:53.637 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:53.637 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: Reconfiguring daemon ceph-exporter.vm07 on vm07
2026-03-10T12:33:53.637 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:53.638 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:53.638 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-10T12:33:53.638 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:53.638 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T12:33:53.638 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:53.638 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:33:53.638 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:53.638 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:53.638 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.kfawlb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T12:33:53.638 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T12:33:53.638 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: Reconfiguring ceph-exporter.vm07 (monmap changed)...
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: Reconfiguring daemon ceph-exporter.vm07 on vm07
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.kfawlb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T12:33:53.712 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:54.427 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: Deploying daemon osd.0 on vm00
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: Deploying daemon osd.0 on vm00
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: Reconfiguring crash.vm07 (monmap changed)...
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: Reconfiguring daemon crash.vm07 on vm07
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: Reconfiguring mgr.vm07.kfawlb (monmap changed)...
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: Reconfiguring daemon mgr.vm07.kfawlb on vm07
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm00.local:9093"}]: dispatch
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm00.local:3000"}]: dispatch
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm00.local:9095"}]: dispatch
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:54.631 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:54 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: Reconfiguring crash.vm07 (monmap changed)...
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: Reconfiguring daemon crash.vm07 on vm07
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: Reconfiguring mgr.vm07.kfawlb (monmap changed)...
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: Reconfiguring daemon mgr.vm07.kfawlb on vm07
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm00.local:9093"}]: dispatch
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm00.local:3000"}]: dispatch
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm00.local:9095"}]: dispatch
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:54 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:33:55.570 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:55 vm00 ceph-mon[50686]: Reconfiguring mon.vm07 (monmap changed)...
2026-03-10T12:33:55.570 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:55 vm00 ceph-mon[50686]: Reconfiguring daemon mon.vm07 on vm07
2026-03-10T12:33:55.570 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:55 vm00 ceph-mon[50686]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-10T12:33:55.570 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:55 vm00 ceph-mon[50686]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm00.local:9093"}]: dispatch
2026-03-10T12:33:55.570 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:55 vm00 ceph-mon[50686]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-10T12:33:55.570 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:55 vm00 ceph-mon[50686]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm00.local:3000"}]: dispatch
2026-03-10T12:33:55.570 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:55 vm00 ceph-mon[50686]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T12:33:55.570 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:55 vm00 ceph-mon[50686]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm00.local:9095"}]: dispatch
2026-03-10T12:33:55.570 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:55 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:55.570 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:55 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:55.570 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:55 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:55.570 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:55 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:55 vm07 ceph-mon[58582]: Reconfiguring mon.vm07 (monmap changed)...
2026-03-10T12:33:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:55 vm07 ceph-mon[58582]: Reconfiguring daemon mon.vm07 on vm07
2026-03-10T12:33:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:55 vm07 ceph-mon[58582]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-10T12:33:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:55 vm07 ceph-mon[58582]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm00.local:9093"}]: dispatch
2026-03-10T12:33:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:55 vm07 ceph-mon[58582]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-10T12:33:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:55 vm07 ceph-mon[58582]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm00.local:3000"}]: dispatch
2026-03-10T12:33:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:55 vm07 ceph-mon[58582]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T12:33:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:55 vm07 ceph-mon[58582]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm00.local:9095"}]: dispatch
2026-03-10T12:33:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:55 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:55 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:55 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:55 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:55.855 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:55.854+0000 7f02d8ff9700 1 -- 192.168.123.100:0/2730830839 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f02dc1b3540 con 0x7f02c406c650
2026-03-10T12:33:55.855 INFO:teuthology.orchestra.run.vm00.stdout:Created osd(s) 0 on host 'vm00'
2026-03-10T12:33:55.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:55.857+0000 7f02e0dfd700 1 -- 192.168.123.100:0/2730830839 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f02c406c650 msgr2=0x7f02c406eb00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:33:55.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:55.857+0000 7f02e0dfd700 1 --2- 192.168.123.100:0/2730830839 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f02c406c650 0x7f02c406eb00 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f02cc009ad0 tx=0x7f02cc011040 comp rx=0 tx=0).stop
2026-03-10T12:33:55.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:55.857+0000 7f02e0dfd700 1 -- 192.168.123.100:0/2730830839 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f02dc116f90 msgr2=0x7f02dc1b2780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:33:55.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:55.857+0000 7f02e0dfd700 1 --2- 192.168.123.100:0/2730830839 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f02dc116f90 0x7f02dc1b2780 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7f02d4007f00 tx=0x7f02d400d3b0 comp rx=0 tx=0).stop
2026-03-10T12:33:55.858 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:55.857+0000 7f02e0dfd700 1 -- 192.168.123.100:0/2730830839 shutdown_connections
2026-03-10T12:33:55.858 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:55.857+0000 7f02e0dfd700 1 --2- 192.168.123.100:0/2730830839 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f02c406c650 0x7f02c406eb00 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:33:55.858 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:55.857+0000 7f02e0dfd700 1 --2- 192.168.123.100:0/2730830839 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f02dc072440 0x7f02dc116a50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:33:55.858 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:55.857+0000 7f02e0dfd700 1 --2- 192.168.123.100:0/2730830839 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f02dc116f90 0x7f02dc1b2780 unknown :-1 s=CLOSED pgs=168 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:33:55.858 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:55.857+0000 7f02e0dfd700 1 -- 192.168.123.100:0/2730830839 >> 192.168.123.100:0/2730830839 conn(0x7f02dc06d1a0 msgr2=0x7f02dc0705d0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:33:55.858 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:55.857+0000 7f02e0dfd700 1 -- 192.168.123.100:0/2730830839 shutdown_connections
2026-03-10T12:33:55.858 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:55.857+0000 7f02e0dfd700 1 -- 192.168.123.100:0/2730830839 wait complete.
2026-03-10T12:33:55.900 DEBUG:teuthology.orchestra.run.vm00:osd.0> sudo journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.0.service
2026-03-10T12:33:55.907 INFO:tasks.cephadm:Deploying osd.1 on vm00 with /dev/vdd...
2026-03-10T12:33:55.907 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- lvm zap /dev/vdd
2026-03-10T12:33:56.149 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:33:56.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:56 vm07 ceph-mon[58582]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T12:33:56.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:56 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:56.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:56 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:56.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:56 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:56.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:56 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:33:56.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:56 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:56.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:56 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:56.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:56 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:56.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:56 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:33:56.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:56 vm00 ceph-mon[50686]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T12:33:56.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:56 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:56.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:56 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:56.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:56 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:33:56.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:56 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:33:56.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:56 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:56.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:56 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:56.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:56 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:33:56.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:56 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:33:57.023 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:33:57.056 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph orch daemon add osd vm00:/dev/vdd
2026-03-10T12:33:57.307 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:33:57.412 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:33:57 vm00 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0[67549]: 2026-03-10T12:33:57.236+0000 7f69ca7fc640 -1 osd.0 0 log_to_monitors true
2026-03-10T12:33:57.619 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.618+0000 7f70d3ca0700 1 -- 192.168.123.100:0/1361123074 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f70c4095420 msgr2=0x7f70c4095830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:33:57.619 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.618+0000 7f70d3ca0700 1 --2- 192.168.123.100:0/1361123074 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f70c4095420 0x7f70c4095830 secure :-1 s=READY pgs=175 cs=0 l=1 rev1=1 crypto rx=0x7f70c8009b00 tx=0x7f70c8009e10 comp rx=0 tx=0).stop
2026-03-10T12:33:57.619 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.618+0000 7f70d3ca0700 1 -- 192.168.123.100:0/1361123074 shutdown_connections
2026-03-10T12:33:57.619 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.618+0000 7f70d3ca0700 1 --2- 192.168.123.100:0/1361123074 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f70c4096620 0x7f70c4096a70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:33:57.619 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.618+0000 7f70d3ca0700 1 --2- 192.168.123.100:0/1361123074 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f70c4095420 0x7f70c4095830 unknown :-1 s=CLOSED pgs=175 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:33:57.619 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.618+0000 7f70d3ca0700 1 -- 192.168.123.100:0/1361123074 >> 192.168.123.100:0/1361123074 conn(0x7f70c40909d0 msgr2=0x7f70c4092e00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:33:57.619 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.618+0000 7f70d3ca0700 1 -- 192.168.123.100:0/1361123074 shutdown_connections
2026-03-10T12:33:57.619 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.618+0000 7f70d3ca0700 1 -- 192.168.123.100:0/1361123074 wait complete.
2026-03-10T12:33:57.620 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.620+0000 7f70d3ca0700 1 Processor -- start
2026-03-10T12:33:57.620 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.620+0000 7f70d3ca0700 1 -- start start
2026-03-10T12:33:57.621 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.621+0000 7f70d3ca0700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f70c4095420 0x7f70c412ab40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:33:57.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.621+0000 7f70d1a3c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f70c4095420 0x7f70c412ab40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:33:57.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.621+0000 7f70d1a3c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f70c4095420 0x7f70c412ab40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:42890/0 (socket says 192.168.123.100:42890)
2026-03-10T12:33:57.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.622+0000 7f70d1a3c700 1 -- 192.168.123.100:0/115953103 learned_addr learned my addr 192.168.123.100:0/115953103 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:33:57.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.622+0000 7f70d3ca0700 1 --2- 192.168.123.100:0/115953103 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f70c4096620 0x7f70c412b080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:33:57.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.622+0000 7f70d3ca0700 1 -- 192.168.123.100:0/115953103 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f70c412b6a0 con 0x7f70c4095420
2026-03-10T12:33:57.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.622+0000 7f70d3ca0700 1 -- 192.168.123.100:0/115953103 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f70c412b7e0 con 0x7f70c4096620
2026-03-10T12:33:57.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.622+0000 7f70d123b700 1 --2- 192.168.123.100:0/115953103 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f70c4096620 0x7f70c412b080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:33:57.623 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.622+0000 7f70d1a3c700 1 -- 192.168.123.100:0/115953103 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f70c4096620 msgr2=0x7f70c412b080 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:33:57.623 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.623+0000 7f70d1a3c700 1 --2- 192.168.123.100:0/115953103 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f70c4096620 0x7f70c412b080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:33:57.623
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.623+0000 7f70d1a3c700 1 -- 192.168.123.100:0/115953103 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f70c80097e0 con 0x7f70c4095420 2026-03-10T12:33:57.623 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.623+0000 7f70d1a3c700 1 --2- 192.168.123.100:0/115953103 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f70c4095420 0x7f70c412ab40 secure :-1 s=READY pgs=176 cs=0 l=1 rev1=1 crypto rx=0x7f70c8006010 tx=0x7f70c8004c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:57.623 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.623+0000 7f70c2ffd700 1 -- 192.168.123.100:0/115953103 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f70c801d070 con 0x7f70c4095420 2026-03-10T12:33:57.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.623+0000 7f70c2ffd700 1 -- 192.168.123.100:0/115953103 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f70c800bd00 con 0x7f70c4095420 2026-03-10T12:33:57.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.623+0000 7f70c2ffd700 1 -- 192.168.123.100:0/115953103 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f70c800f9c0 con 0x7f70c4095420 2026-03-10T12:33:57.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.623+0000 7f70d3ca0700 1 -- 192.168.123.100:0/115953103 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f70c4130230 con 0x7f70c4095420 2026-03-10T12:33:57.625 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.624+0000 7f70d3ca0700 1 -- 192.168.123.100:0/115953103 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f70c4130720 con 0x7f70c4095420 
2026-03-10T12:33:57.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.625+0000 7f70c2ffd700 1 -- 192.168.123.100:0/115953103 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f70c800fb20 con 0x7f70c4095420 2026-03-10T12:33:57.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.626+0000 7f70c2ffd700 1 --2- 192.168.123.100:0/115953103 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f70b806c600 0x7f70b806eab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:33:57.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.626+0000 7f70c2ffd700 1 -- 192.168.123.100:0/115953103 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(6..6 src has 1..6) v4 ==== 1313+0+0 (secure 0 0 0) 0x7f70c808c160 con 0x7f70c4095420 2026-03-10T12:33:57.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.626+0000 7f70d123b700 1 --2- 192.168.123.100:0/115953103 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f70b806c600 0x7f70b806eab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:33:57.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.626+0000 7f70d3ca0700 1 -- 192.168.123.100:0/115953103 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f70c4006120 con 0x7f70c4095420 2026-03-10T12:33:57.629 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.629+0000 7f70c2ffd700 1 -- 192.168.123.100:0/115953103 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f70c8027080 con 0x7f70c4095420 2026-03-10T12:33:57.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.629+0000 7f70d123b700 1 --2- 192.168.123.100:0/115953103 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f70b806c600 0x7f70b806eab0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f70bc009c00 tx=0x7f70bc009380 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:33:57.772 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:33:57.772+0000 7f70d3ca0700 1 -- 192.168.123.100:0/115953103 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm00:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7f70c409ac10 con 0x7f70b806c600 2026-03-10T12:33:57.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:57 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:57.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:57 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:57.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:57 vm00 ceph-mon[50686]: pgmap v11: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T12:33:57.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:57 vm00 ceph-mon[50686]: from='osd.0 [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T12:33:57.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:57 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T12:33:57.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:57 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T12:33:57.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:57 vm00 ceph-mon[50686]: 
from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:33:57.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:57 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:57.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:57 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:58.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:57 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:58.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:57 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:58.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:57 vm07 ceph-mon[58582]: pgmap v11: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T12:33:58.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:57 vm07 ceph-mon[58582]: from='osd.0 [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T12:33:58.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:57 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T12:33:58.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:57 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T12:33:58.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:57 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:33:58.316 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:57 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:58.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:57 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:33:59.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:58 vm00 ceph-mon[50686]: from='client.14280 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm00:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:33:59.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:58 vm00 ceph-mon[50686]: from='osd.0 [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T12:33:59.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:58 vm00 ceph-mon[50686]: osdmap e7: 1 total, 0 up, 1 in 2026-03-10T12:33:59.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:58 vm00 ceph-mon[50686]: from='osd.0 [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm00", "root=default"]}]: dispatch 2026-03-10T12:33:59.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:58 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T12:33:59.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:58 vm00 ceph-mon[50686]: from='client.? 
192.168.123.100:0/1256854114' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "bbe7c7dc-9287-4bf7-b4aa-5e00c56a2a7b"}]: dispatch 2026-03-10T12:33:59.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:58 vm00 ceph-mon[50686]: from='osd.0 [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm00", "root=default"]}]': finished 2026-03-10T12:33:59.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:58 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/1256854114' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "bbe7c7dc-9287-4bf7-b4aa-5e00c56a2a7b"}]': finished 2026-03-10T12:33:59.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:58 vm00 ceph-mon[50686]: osdmap e8: 2 total, 0 up, 2 in 2026-03-10T12:33:59.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:58 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T12:33:59.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:58 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:33:59.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:33:58 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T12:33:59.123 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:33:58 vm00 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0[67549]: 2026-03-10T12:33:58.756+0000 7f69c0e75700 -1 osd.0 0 waiting for initial osdmap 2026-03-10T12:33:59.123 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:33:58 vm00 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0[67549]: 2026-03-10T12:33:58.765+0000 7f69ba465700 -1 osd.0 8 set_numa_affinity unable to identify public interface '' 
numa node: (2) No such file or directory 2026-03-10T12:33:59.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:58 vm07 ceph-mon[58582]: from='client.14280 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm00:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:33:59.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:58 vm07 ceph-mon[58582]: from='osd.0 [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T12:33:59.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:58 vm07 ceph-mon[58582]: osdmap e7: 1 total, 0 up, 1 in 2026-03-10T12:33:59.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:58 vm07 ceph-mon[58582]: from='osd.0 [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm00", "root=default"]}]: dispatch 2026-03-10T12:33:59.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:58 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T12:33:59.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:58 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/1256854114' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "bbe7c7dc-9287-4bf7-b4aa-5e00c56a2a7b"}]: dispatch 2026-03-10T12:33:59.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:58 vm07 ceph-mon[58582]: from='osd.0 [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm00", "root=default"]}]': finished 2026-03-10T12:33:59.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:58 vm07 ceph-mon[58582]: from='client.? 
192.168.123.100:0/1256854114' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "bbe7c7dc-9287-4bf7-b4aa-5e00c56a2a7b"}]': finished 2026-03-10T12:33:59.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:58 vm07 ceph-mon[58582]: osdmap e8: 2 total, 0 up, 2 in 2026-03-10T12:33:59.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:58 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T12:33:59.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:58 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:33:59.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:33:58 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T12:34:00.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:00 vm00 ceph-mon[50686]: pgmap v14: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T12:34:00.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:00 vm00 ceph-mon[50686]: from='client.? 
192.168.123.100:0/148741221' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T12:34:00.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:00 vm00 ceph-mon[50686]: osd.0 [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101] boot 2026-03-10T12:34:00.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:00 vm00 ceph-mon[50686]: osdmap e9: 2 total, 1 up, 2 in 2026-03-10T12:34:00.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:00 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T12:34:00.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:00 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:34:00.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:00 vm07 ceph-mon[58582]: pgmap v14: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T12:34:00.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:00 vm07 ceph-mon[58582]: from='client.? 
192.168.123.100:0/148741221' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T12:34:00.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:00 vm07 ceph-mon[58582]: osd.0 [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101] boot 2026-03-10T12:34:00.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:00 vm07 ceph-mon[58582]: osdmap e9: 2 total, 1 up, 2 in 2026-03-10T12:34:00.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:00 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T12:34:00.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:00 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:34:01.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:01 vm00 ceph-mon[50686]: purged_snaps scrub starts 2026-03-10T12:34:01.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:01 vm00 ceph-mon[50686]: purged_snaps scrub ok 2026-03-10T12:34:01.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:01 vm07 ceph-mon[58582]: purged_snaps scrub starts 2026-03-10T12:34:01.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:01 vm07 ceph-mon[58582]: purged_snaps scrub ok 2026-03-10T12:34:02.406 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:02 vm00 ceph-mon[50686]: pgmap v16: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T12:34:02.406 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:02 vm00 ceph-mon[50686]: osdmap e10: 2 total, 1 up, 2 in 2026-03-10T12:34:02.406 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:02 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:34:02.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:02 vm07 ceph-mon[58582]: 
pgmap v16: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T12:34:02.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:02 vm07 ceph-mon[58582]: osdmap e10: 2 total, 1 up, 2 in 2026-03-10T12:34:02.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:02 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:34:03.924 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:03 vm07 ceph-mon[58582]: Detected new or changed devices on vm00 2026-03-10T12:34:03.924 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:03 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:03.924 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:03 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:03.924 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:03 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:34:03.924 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:03 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:03.924 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:03 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:03.924 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:03 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:03.925 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:03 vm07 ceph-mon[58582]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T12:34:03.925 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:03 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T12:34:03.925 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:03 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:03.925 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:03 vm07 ceph-mon[58582]: Deploying daemon osd.1 on vm00 2026-03-10T12:34:04.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:03 vm00 ceph-mon[50686]: Detected new or changed devices on vm00 2026-03-10T12:34:04.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:03 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:04.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:03 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:04.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:03 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:34:04.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:03 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:04.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:03 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:04.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:03 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:04.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:03 
vm00 ceph-mon[50686]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T12:34:04.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:03 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T12:34:04.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:03 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:04.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:03 vm00 ceph-mon[50686]: Deploying daemon osd.1 on vm00 2026-03-10T12:34:06.006 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:06 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:06.006 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:06 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:06.006 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:06 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:34:06.006 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:06 vm07 ceph-mon[58582]: pgmap v19: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T12:34:06.006 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:06 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:06.006 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:06 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:06.117 INFO:teuthology.orchestra.run.vm00.stdout:Created osd(s) 1 on host 'vm00' 2026-03-10T12:34:06.117 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:06.116+0000 7f70c2ffd700 1 -- 192.168.123.100:0/115953103 <== mgr.14223 
v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f70c409ac10 con 0x7f70b806c600 2026-03-10T12:34:06.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:06.118+0000 7f70d3ca0700 1 -- 192.168.123.100:0/115953103 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f70b806c600 msgr2=0x7f70b806eab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:06.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:06.118+0000 7f70d3ca0700 1 --2- 192.168.123.100:0/115953103 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f70b806c600 0x7f70b806eab0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f70bc009c00 tx=0x7f70bc009380 comp rx=0 tx=0).stop 2026-03-10T12:34:06.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:06.118+0000 7f70d3ca0700 1 -- 192.168.123.100:0/115953103 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f70c4095420 msgr2=0x7f70c412ab40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:06.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:06.118+0000 7f70d3ca0700 1 --2- 192.168.123.100:0/115953103 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f70c4095420 0x7f70c412ab40 secure :-1 s=READY pgs=176 cs=0 l=1 rev1=1 crypto rx=0x7f70c8006010 tx=0x7f70c8004c30 comp rx=0 tx=0).stop 2026-03-10T12:34:06.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:06.118+0000 7f70d3ca0700 1 -- 192.168.123.100:0/115953103 shutdown_connections 2026-03-10T12:34:06.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:06.118+0000 7f70d3ca0700 1 --2- 192.168.123.100:0/115953103 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f70b806c600 0x7f70b806eab0 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:06.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:06.118+0000 7f70d3ca0700 1 --2- 
192.168.123.100:0/115953103 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f70c4095420 0x7f70c412ab40 unknown :-1 s=CLOSED pgs=176 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:06.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:06.118+0000 7f70d3ca0700 1 --2- 192.168.123.100:0/115953103 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f70c4096620 0x7f70c412b080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:06.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:06.118+0000 7f70d3ca0700 1 -- 192.168.123.100:0/115953103 >> 192.168.123.100:0/115953103 conn(0x7f70c40909d0 msgr2=0x7f70c4092c60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:06.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:06.119+0000 7f70d3ca0700 1 -- 192.168.123.100:0/115953103 shutdown_connections 2026-03-10T12:34:06.120 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:06.119+0000 7f70d3ca0700 1 -- 192.168.123.100:0/115953103 wait complete. 2026-03-10T12:34:06.174 DEBUG:teuthology.orchestra.run.vm00:osd.1> sudo journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.1.service 2026-03-10T12:34:06.175 INFO:tasks.cephadm:Deploying osd.2 on vm00 with /dev/vdc... 
2026-03-10T12:34:06.175 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- lvm zap /dev/vdc 2026-03-10T12:34:06.256 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:06 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:06.257 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:06 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:06.257 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:06 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:34:06.257 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:06 vm00 ceph-mon[50686]: pgmap v19: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T12:34:06.257 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:06 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:06.257 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:06 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:06.406 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:06.975 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:34:06 vm00 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1[73253]: 2026-03-10T12:34:06.674+0000 7f9d29c67640 -1 osd.1 0 log_to_monitors true 2026-03-10T12:34:07.014 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:34:07.030 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 
1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph orch daemon add osd vm00:/dev/vdc 2026-03-10T12:34:07.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:07 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:07.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:07 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:07.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:07 vm00 ceph-mon[50686]: from='osd.1 [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T12:34:07.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:07 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:07.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:07 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:07.250 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:07 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:07 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:07 vm07 ceph-mon[58582]: from='osd.1 [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T12:34:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:07 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:07.566 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:07 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:07.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.578+0000 7f237c174700 1 -- 192.168.123.100:0/2534878232 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f237406ac60 msgr2=0x7f237406b070 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:07.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.578+0000 7f237c174700 1 --2- 192.168.123.100:0/2534878232 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f237406ac60 0x7f237406b070 secure :-1 s=READY pgs=184 cs=0 l=1 rev1=1 crypto rx=0x7f2370009b00 tx=0x7f2370009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:07.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.578+0000 7f237c174700 1 -- 192.168.123.100:0/2534878232 shutdown_connections 2026-03-10T12:34:07.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.578+0000 7f237c174700 1 --2- 192.168.123.100:0/2534878232 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f237406b5b0 0x7f237406bfb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:07.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.578+0000 7f237c174700 1 --2- 192.168.123.100:0/2534878232 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f237406ac60 0x7f237406b070 unknown :-1 s=CLOSED pgs=184 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:07.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.578+0000 7f237c174700 1 -- 192.168.123.100:0/2534878232 >> 192.168.123.100:0/2534878232 conn(0x7f23740faa70 msgr2=0x7f23740fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:07.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.576+0000 7f2378f0e700 1 -- 192.168.123.100:0/2534878232 <== mon.0 
v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2370005600 con 0x7f237406ac60 2026-03-10T12:34:07.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.578+0000 7f237c174700 1 -- 192.168.123.100:0/2534878232 shutdown_connections 2026-03-10T12:34:07.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.578+0000 7f237c174700 1 -- 192.168.123.100:0/2534878232 wait complete. 2026-03-10T12:34:07.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.578+0000 7f237c174700 1 Processor -- start 2026-03-10T12:34:07.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.578+0000 7f237c174700 1 -- start start 2026-03-10T12:34:07.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.578+0000 7f237c174700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f237406b5b0 0x7f237418b120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:07.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.578+0000 7f237c174700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f237418b660 0x7f23741906d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:07.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.578+0000 7f237c174700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f237418bb60 con 0x7f237418b660 2026-03-10T12:34:07.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.578+0000 7f237c174700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f237418bcd0 con 0x7f237406b5b0 2026-03-10T12:34:07.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.579+0000 7f237970f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f237418b660 0x7f23741906d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:07.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.579+0000 7f237970f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f237418b660 0x7f23741906d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:38512/0 (socket says 192.168.123.100:38512) 2026-03-10T12:34:07.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.579+0000 7f237970f700 1 -- 192.168.123.100:0/1660754968 learned_addr learned my addr 192.168.123.100:0/1660754968 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:07.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.580+0000 7f237970f700 1 -- 192.168.123.100:0/1660754968 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f237406b5b0 msgr2=0x7f237418b120 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T12:34:07.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.580+0000 7f237970f700 1 --2- 192.168.123.100:0/1660754968 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f237406b5b0 0x7f237418b120 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:07.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.580+0000 7f237970f700 1 -- 192.168.123.100:0/1660754968 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f23700097e0 con 0x7f237418b660 2026-03-10T12:34:07.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.580+0000 7f237970f700 1 --2- 192.168.123.100:0/1660754968 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f237418b660 0x7f23741906d0 secure :-1 s=READY pgs=185 cs=0 l=1 rev1=1 crypto rx=0x7f236c00c390 tx=0x7f236c00c750 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:07.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.581+0000 7f236affd700 1 -- 192.168.123.100:0/1660754968 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f236c0090d0 con 0x7f237418b660 2026-03-10T12:34:07.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.581+0000 7f237c174700 1 -- 192.168.123.100:0/1660754968 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2374190c10 con 0x7f237418b660 2026-03-10T12:34:07.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.581+0000 7f237c174700 1 -- 192.168.123.100:0/1660754968 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2374191130 con 0x7f237418b660 2026-03-10T12:34:07.582 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.581+0000 7f236affd700 1 -- 192.168.123.100:0/1660754968 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f236c00f040 con 0x7f237418b660 2026-03-10T12:34:07.582 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.581+0000 7f236affd700 1 -- 192.168.123.100:0/1660754968 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f236c014670 con 0x7f237418b660 2026-03-10T12:34:07.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.582+0000 7f236affd700 1 -- 192.168.123.100:0/1660754968 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f236c0147d0 con 0x7f237418b660 2026-03-10T12:34:07.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.582+0000 7f236affd700 1 --2- 192.168.123.100:0/1660754968 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f236008e720 0x7f2360090bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:07.583 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.583+0000 7f2379f10700 1 --2- 192.168.123.100:0/1660754968 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f236008e720 0x7f2360090bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:07.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.583+0000 7f236affd700 1 -- 192.168.123.100:0/1660754968 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(11..11 src has 1..11) v4 ==== 1936+0+0 (secure 0 0 0) 0x7f236c08bcf0 con 0x7f237418b660 2026-03-10T12:34:07.584 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.584+0000 7f2379f10700 1 --2- 192.168.123.100:0/1660754968 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f236008e720 0x7f2360090bd0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f2370009ad0 tx=0x7f2370000bc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:07.587 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.587+0000 7f237c174700 1 -- 192.168.123.100:0/1660754968 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2358005320 con 0x7f237418b660 2026-03-10T12:34:07.591 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.591+0000 7f236affd700 1 -- 192.168.123.100:0/1660754968 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f236c05ad90 con 0x7f237418b660 2026-03-10T12:34:07.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:07.701+0000 7f237c174700 1 -- 192.168.123.100:0/1660754968 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm00:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7f2358000bf0 con 
0x7f236008e720 2026-03-10T12:34:08.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:08 vm00 ceph-mon[50686]: pgmap v20: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T12:34:08.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:08 vm00 ceph-mon[50686]: from='osd.1 [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T12:34:08.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:08 vm00 ceph-mon[50686]: osdmap e11: 2 total, 1 up, 2 in 2026-03-10T12:34:08.123 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:08 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:34:08.123 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:08 vm00 ceph-mon[50686]: from='osd.1 [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm00", "root=default"]}]: dispatch 2026-03-10T12:34:08.123 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:08 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T12:34:08.123 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:08 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T12:34:08.123 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:08 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:08.123 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:08 vm00 ceph-mon[50686]: from='mgr.14223 
192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:08.123 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:08 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:08.123 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:08 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:34:08.123 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:08 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:08.123 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:08 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:08.123 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:08 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:08.123 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:34:08 vm00 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1[73253]: 2026-03-10T12:34:08.100+0000 7f9d1eadd700 -1 osd.1 0 waiting for initial osdmap 2026-03-10T12:34:08.123 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:34:08 vm00 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1[73253]: 2026-03-10T12:34:08.119+0000 7f9d188ce700 -1 osd.1 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T12:34:08.517 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:08 vm07 ceph-mon[58582]: pgmap v20: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T12:34:08.517 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:08 vm07 ceph-mon[58582]: from='osd.1 [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750]' entity='osd.1' cmd='[{"prefix": "osd 
crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T12:34:08.517 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:08 vm07 ceph-mon[58582]: osdmap e11: 2 total, 1 up, 2 in 2026-03-10T12:34:08.517 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:08 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:34:08.517 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:08 vm07 ceph-mon[58582]: from='osd.1 [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm00", "root=default"]}]: dispatch 2026-03-10T12:34:08.517 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:08 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T12:34:08.517 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:08 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T12:34:08.517 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:08 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:08.517 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:08 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:08.517 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:08 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:08.517 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:08 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": 
"osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:34:08.517 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:08 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:08.517 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:08 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:08.517 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:08 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:09.325 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:09 vm00 ceph-mon[50686]: from='client.14298 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm00:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:34:09.325 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:09 vm00 ceph-mon[50686]: from='osd.1 [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm00", "root=default"]}]': finished 2026-03-10T12:34:09.325 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:09 vm00 ceph-mon[50686]: osdmap e12: 2 total, 1 up, 2 in 2026-03-10T12:34:09.325 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:34:09.325 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:34:09.325 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 
cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:34:09.325 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:34:09.325 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:09 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/2733582595' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "3de6b811-dbac-419f-abf8-afd0bec7a47f"}]: dispatch 2026-03-10T12:34:09.325 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:09 vm00 ceph-mon[50686]: osd.1 [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750] boot 2026-03-10T12:34:09.325 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:09 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/2733582595' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "3de6b811-dbac-419f-abf8-afd0bec7a47f"}]': finished 2026-03-10T12:34:09.325 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:09 vm00 ceph-mon[50686]: osdmap e13: 3 total, 2 up, 3 in 2026-03-10T12:34:09.325 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:34:09.325 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:34:09.325 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:09.325 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:09 
vm07 ceph-mon[58582]: from='client.14298 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm00:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:34:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:09 vm07 ceph-mon[58582]: from='osd.1 [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm00", "root=default"]}]': finished 2026-03-10T12:34:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:09 vm07 ceph-mon[58582]: osdmap e12: 2 total, 1 up, 2 in 2026-03-10T12:34:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:34:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:34:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:34:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:34:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:09 vm07 ceph-mon[58582]: from='client.? 
192.168.123.100:0/2733582595' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "3de6b811-dbac-419f-abf8-afd0bec7a47f"}]: dispatch 2026-03-10T12:34:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:09 vm07 ceph-mon[58582]: osd.1 [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750] boot 2026-03-10T12:34:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:09 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/2733582595' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "3de6b811-dbac-419f-abf8-afd0bec7a47f"}]': finished 2026-03-10T12:34:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:09 vm07 ceph-mon[58582]: osdmap e13: 3 total, 2 up, 3 in 2026-03-10T12:34:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:34:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:34:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:10 vm00 ceph-mon[50686]: purged_snaps scrub starts 2026-03-10T12:34:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:10 vm00 ceph-mon[50686]: purged_snaps scrub ok 2026-03-10T12:34:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:10 vm00 ceph-mon[50686]: pgmap v24: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T12:34:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 
10 12:34:10 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/1100357554' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T12:34:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:10 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:10 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:10 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:10 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:10 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:10.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:10 vm07 ceph-mon[58582]: purged_snaps scrub starts 2026-03-10T12:34:10.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:10 vm07 ceph-mon[58582]: purged_snaps scrub ok 2026-03-10T12:34:10.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:10 vm07 ceph-mon[58582]: pgmap v24: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T12:34:10.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:10 vm07 ceph-mon[58582]: from='client.? 
192.168.123.100:0/1100357554' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T12:34:10.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:10 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:10.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:10 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:10.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:10 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:10.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:10 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:10.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:10 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:11.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:11 vm00 ceph-mon[50686]: osdmap e14: 3 total, 2 up, 3 in 2026-03-10T12:34:11.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:11 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:34:11.565 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:11 vm07 ceph-mon[58582]: osdmap e14: 3 total, 2 up, 3 in 2026-03-10T12:34:11.565 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:11 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:34:12.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:12 vm00 ceph-mon[50686]: pgmap v26: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T12:34:12.565 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:12 vm07 ceph-mon[58582]: pgmap v26: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T12:34:13.378 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:13 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T12:34:13.378 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:13 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:13.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:13 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T12:34:13.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:13 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:14.273 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:14 vm00 ceph-mon[50686]: Deploying daemon osd.2 on vm00 2026-03-10T12:34:14.273 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:14 vm00 ceph-mon[50686]: pgmap v27: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T12:34:14.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:14 vm07 ceph-mon[58582]: Deploying daemon osd.2 on vm00 2026-03-10T12:34:14.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:14 vm07 ceph-mon[58582]: pgmap v27: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T12:34:15.939 INFO:teuthology.orchestra.run.vm00.stdout:Created osd(s) 2 on host 'vm00' 2026-03-10T12:34:15.939 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:15.936+0000 7f236affd700 1 -- 192.168.123.100:0/1660754968 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 
0x7f2358000bf0 con 0x7f236008e720 2026-03-10T12:34:15.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:15.941+0000 7f237c174700 1 -- 192.168.123.100:0/1660754968 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f236008e720 msgr2=0x7f2360090bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:15.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:15.941+0000 7f237c174700 1 --2- 192.168.123.100:0/1660754968 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f236008e720 0x7f2360090bd0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f2370009ad0 tx=0x7f2370000bc0 comp rx=0 tx=0).stop 2026-03-10T12:34:15.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:15.941+0000 7f237c174700 1 -- 192.168.123.100:0/1660754968 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f237418b660 msgr2=0x7f23741906d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:15.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:15.941+0000 7f237c174700 1 --2- 192.168.123.100:0/1660754968 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f237418b660 0x7f23741906d0 secure :-1 s=READY pgs=185 cs=0 l=1 rev1=1 crypto rx=0x7f236c00c390 tx=0x7f236c00c750 comp rx=0 tx=0).stop 2026-03-10T12:34:15.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:15.942+0000 7f237c174700 1 -- 192.168.123.100:0/1660754968 shutdown_connections 2026-03-10T12:34:15.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:15.942+0000 7f237c174700 1 --2- 192.168.123.100:0/1660754968 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f236008e720 0x7f2360090bd0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:15.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:15.942+0000 7f237c174700 1 --2- 192.168.123.100:0/1660754968 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f237406b5b0 0x7f237418b120 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:15.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:15.942+0000 7f237c174700 1 --2- 192.168.123.100:0/1660754968 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f237418b660 0x7f23741906d0 unknown :-1 s=CLOSED pgs=185 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:15.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:15.942+0000 7f237c174700 1 -- 192.168.123.100:0/1660754968 >> 192.168.123.100:0/1660754968 conn(0x7f23740faa70 msgr2=0x7f23740fb130 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:15.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:15.944+0000 7f237c174700 1 -- 192.168.123.100:0/1660754968 shutdown_connections 2026-03-10T12:34:15.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:15.944+0000 7f237c174700 1 -- 192.168.123.100:0/1660754968 wait complete. 2026-03-10T12:34:16.008 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:15 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:16.008 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:15 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:16.008 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:15 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:34:16.008 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:15 vm00 ceph-mon[50686]: pgmap v28: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T12:34:16.009 DEBUG:teuthology.orchestra.run.vm00:osd.2> sudo journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.2.service 2026-03-10T12:34:16.019 INFO:tasks.cephadm:Deploying osd.3 on vm07 with /dev/vde... 
2026-03-10T12:34:16.019 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- lvm zap /dev/vde 2026-03-10T12:34:16.153 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm07/config 2026-03-10T12:34:16.178 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:15 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:16.178 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:15 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:16.178 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:15 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:34:16.178 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:15 vm07 ceph-mon[58582]: pgmap v28: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T12:34:16.656 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:34:16.671 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph orch daemon add osd vm07:/dev/vde 2026-03-10T12:34:16.823 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm07/config 2026-03-10T12:34:17.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.073+0000 7f66395b6700 1 -- 192.168.123.107:0/2691786990 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6634103680 msgr2=0x7f6634105ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:17.075 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.073+0000 7f66395b6700 1 --2- 192.168.123.107:0/2691786990 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6634103680 0x7f6634105ac0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f6624009b00 tx=0x7f6624009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:17.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.074+0000 7f66395b6700 1 -- 192.168.123.107:0/2691786990 shutdown_connections 2026-03-10T12:34:17.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.074+0000 7f66395b6700 1 --2- 192.168.123.107:0/2691786990 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6634103680 0x7f6634105ac0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:17.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.074+0000 7f66395b6700 1 --2- 192.168.123.107:0/2691786990 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6634069180 0x7f6634103140 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:17.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.075+0000 7f66395b6700 1 -- 192.168.123.107:0/2691786990 >> 192.168.123.107:0/2691786990 conn(0x7f66340faa70 msgr2=0x7f66340fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:17.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.075+0000 7f66395b6700 1 -- 192.168.123.107:0/2691786990 shutdown_connections 2026-03-10T12:34:17.075 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.075+0000 7f66395b6700 1 -- 192.168.123.107:0/2691786990 wait complete. 
2026-03-10T12:34:17.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.075+0000 7f66395b6700 1 Processor -- start 2026-03-10T12:34:17.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.076+0000 7f66395b6700 1 -- start start 2026-03-10T12:34:17.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.076+0000 7f66395b6700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6634069180 0x7f6634198070 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:17.076 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.076+0000 7f66395b6700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6634103680 0x7f66341985b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:17.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.076+0000 7f66327fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6634103680 0x7f66341985b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:17.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.076+0000 7f66327fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6634103680 0x7f66341985b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.107:33932/0 (socket says 192.168.123.107:33932) 2026-03-10T12:34:17.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.076+0000 7f66327fc700 1 -- 192.168.123.107:0/264103864 learned_addr learned my addr 192.168.123.107:0/264103864 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:34:17.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.076+0000 7f66395b6700 1 -- 192.168.123.107:0/264103864 --> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6634198bd0 con 0x7f6634069180 2026-03-10T12:34:17.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.076+0000 7f66395b6700 1 -- 192.168.123.107:0/264103864 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6634198d10 con 0x7f6634103680 2026-03-10T12:34:17.077 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.077+0000 7f6632ffd700 1 --2- 192.168.123.107:0/264103864 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6634069180 0x7f6634198070 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:17.078 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.077+0000 7f66327fc700 1 -- 192.168.123.107:0/264103864 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6634069180 msgr2=0x7f6634198070 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:17.078 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.077+0000 7f66327fc700 1 --2- 192.168.123.107:0/264103864 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6634069180 0x7f6634198070 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:17.078 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.077+0000 7f66327fc700 1 -- 192.168.123.107:0/264103864 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f66240097e0 con 0x7f6634103680 2026-03-10T12:34:17.078 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.077+0000 7f6632ffd700 1 --2- 192.168.123.107:0/264103864 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6634069180 0x7f6634198070 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T12:34:17.078 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.077+0000 7f66327fc700 1 --2- 192.168.123.107:0/264103864 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6634103680 0x7f66341985b0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f6624005f50 tx=0x7f6624004a60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:17.078 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.077+0000 7f662bfff700 1 -- 192.168.123.107:0/264103864 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f662401d070 con 0x7f6634103680 2026-03-10T12:34:17.078 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.077+0000 7f662bfff700 1 -- 192.168.123.107:0/264103864 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f662400bc50 con 0x7f6634103680 2026-03-10T12:34:17.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.077+0000 7f66395b6700 1 -- 192.168.123.107:0/264103864 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f663419d760 con 0x7f6634103680 2026-03-10T12:34:17.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.077+0000 7f66395b6700 1 -- 192.168.123.107:0/264103864 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f663419dc50 con 0x7f6634103680 2026-03-10T12:34:17.079 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.078+0000 7f662bfff700 1 -- 192.168.123.107:0/264103864 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6624022620 con 0x7f6634103680 2026-03-10T12:34:17.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.079+0000 7f662bfff700 1 -- 192.168.123.107:0/264103864 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 18) v1 ==== 89854+0+0 (secure 0 0 0) 0x7f6624022890 con 
0x7f6634103680 2026-03-10T12:34:17.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.079+0000 7f66395b6700 1 -- 192.168.123.107:0/264103864 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f66340fc670 con 0x7f6634103680 2026-03-10T12:34:17.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.079+0000 7f662bfff700 1 --2- 192.168.123.107:0/264103864 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f662006c600 0x7f662006eab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:17.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.079+0000 7f662bfff700 1 -- 192.168.123.107:0/264103864 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(14..14 src has 1..14) v4 ==== 2347+0+0 (secure 0 0 0) 0x7f662408c630 con 0x7f6634103680 2026-03-10T12:34:17.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.080+0000 7f6632ffd700 1 --2- 192.168.123.107:0/264103864 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f662006c600 0x7f662006eab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:17.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.080+0000 7f6632ffd700 1 --2- 192.168.123.107:0/264103864 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f662006c600 0x7f662006eab0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f6634068eb0 tx=0x7f661c008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:17.083 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.083+0000 7f662bfff700 1 -- 192.168.123.107:0/264103864 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f662405b4f0 con 
0x7f6634103680 2026-03-10T12:34:17.184 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:16 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:17.184 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:16 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:17.184 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:16 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:17.184 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:16 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:17.199 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:17.199+0000 7f66395b6700 1 -- 192.168.123.107:0/264103864 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7f6634061190 con 0x7f662006c600 2026-03-10T12:34:17.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:16 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:17.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:16 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:17.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:16 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:17.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:16 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:17.236 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:34:17 vm00 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2[79470]: 2026-03-10T12:34:17.022+0000 7fc19af15640 -1 osd.2 0 log_to_monitors true 2026-03-10T12:34:17.898 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:17 vm00 
ceph-mon[50686]: from='osd.2 [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T12:34:17.898 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:17 vm00 ceph-mon[50686]: pgmap v29: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T12:34:17.898 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:17 vm00 ceph-mon[50686]: from='client.24139 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:34:17.898 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:17 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T12:34:17.898 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:17 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T12:34:18.206 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:17 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:18.206 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:17 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:18.206 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:17 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:18.206 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:17 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:34:18.207 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 
10 12:34:17 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:18.207 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:17 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:18.207 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:17 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:18.207 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:17 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:34:18.207 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:34:18 vm00 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2[79470]: 2026-03-10T12:34:18.061+0000 7fc19158e700 -1 osd.2 0 waiting for initial osdmap 2026-03-10T12:34:18.207 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:34:18 vm00 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2[79470]: 2026-03-10T12:34:18.067+0000 7fc189b7c700 -1 osd.2 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T12:34:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:17 vm07 ceph-mon[58582]: from='osd.2 [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T12:34:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:17 vm07 ceph-mon[58582]: pgmap v29: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T12:34:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:17 vm07 ceph-mon[58582]: from='client.24139 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vde", "target": ["mon-mgr", ""]}]: 
dispatch 2026-03-10T12:34:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:17 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T12:34:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:17 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T12:34:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:17 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:17 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:17 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:17 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:34:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:17 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:17 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:17 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:18.316 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:17 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:34:19.169 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:19 vm00 ceph-mon[50686]: Detected new or changed devices on vm00 2026-03-10T12:34:19.169 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:19 vm00 ceph-mon[50686]: from='osd.2 [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T12:34:19.169 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:19 vm00 ceph-mon[50686]: osdmap e15: 3 total, 2 up, 3 in 2026-03-10T12:34:19.169 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:19 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:34:19.169 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:19 vm00 ceph-mon[50686]: from='osd.2 [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm00", "root=default"]}]: dispatch 2026-03-10T12:34:19.169 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:19 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/744502405' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "cd850d3a-e99e-4292-9600-f18ed81a7d18"}]: dispatch 2026-03-10T12:34:19.169 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:19 vm00 ceph-mon[50686]: from='client.? 
' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "cd850d3a-e99e-4292-9600-f18ed81a7d18"}]: dispatch 2026-03-10T12:34:19.169 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:19 vm00 ceph-mon[50686]: from='osd.2 [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm00", "root=default"]}]': finished 2026-03-10T12:34:19.169 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:19 vm00 ceph-mon[50686]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "cd850d3a-e99e-4292-9600-f18ed81a7d18"}]': finished 2026-03-10T12:34:19.169 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:19 vm00 ceph-mon[50686]: osdmap e16: 4 total, 2 up, 4 in 2026-03-10T12:34:19.169 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:19 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:34:19.169 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:19 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:34:19.169 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:19 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:34:19.169 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:19 vm00 ceph-mon[50686]: from='client.? 
192.168.123.107:0/3032327171' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T12:34:19.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:19 vm07 ceph-mon[58582]: Detected new or changed devices on vm00 2026-03-10T12:34:19.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:19 vm07 ceph-mon[58582]: from='osd.2 [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T12:34:19.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:19 vm07 ceph-mon[58582]: osdmap e15: 3 total, 2 up, 3 in 2026-03-10T12:34:19.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:19 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:34:19.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:19 vm07 ceph-mon[58582]: from='osd.2 [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm00", "root=default"]}]: dispatch 2026-03-10T12:34:19.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:19 vm07 ceph-mon[58582]: from='client.? 192.168.123.107:0/744502405' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "cd850d3a-e99e-4292-9600-f18ed81a7d18"}]: dispatch 2026-03-10T12:34:19.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:19 vm07 ceph-mon[58582]: from='client.? 
' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "cd850d3a-e99e-4292-9600-f18ed81a7d18"}]: dispatch 2026-03-10T12:34:19.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:19 vm07 ceph-mon[58582]: from='osd.2 [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm00", "root=default"]}]': finished 2026-03-10T12:34:19.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:19 vm07 ceph-mon[58582]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "cd850d3a-e99e-4292-9600-f18ed81a7d18"}]': finished 2026-03-10T12:34:19.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:19 vm07 ceph-mon[58582]: osdmap e16: 4 total, 2 up, 4 in 2026-03-10T12:34:19.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:19 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:34:19.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:19 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:34:19.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:19 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:34:19.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:19 vm07 ceph-mon[58582]: from='client.? 
192.168.123.107:0/3032327171' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T12:34:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:20 vm00 ceph-mon[50686]: purged_snaps scrub starts 2026-03-10T12:34:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:20 vm00 ceph-mon[50686]: purged_snaps scrub ok 2026-03-10T12:34:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:20 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:34:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:20 vm00 ceph-mon[50686]: pgmap v32: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T12:34:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:20 vm00 ceph-mon[50686]: osd.2 [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659] boot 2026-03-10T12:34:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:20 vm00 ceph-mon[50686]: osdmap e17: 4 total, 3 up, 4 in 2026-03-10T12:34:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:20 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:34:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:20 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:34:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:20 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:20 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:20 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' 
entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:20 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:20 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:20.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:20 vm07 ceph-mon[58582]: purged_snaps scrub starts 2026-03-10T12:34:20.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:20 vm07 ceph-mon[58582]: purged_snaps scrub ok 2026-03-10T12:34:20.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:20 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:34:20.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:20 vm07 ceph-mon[58582]: pgmap v32: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T12:34:20.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:20 vm07 ceph-mon[58582]: osd.2 [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659] boot 2026-03-10T12:34:20.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:20 vm07 ceph-mon[58582]: osdmap e17: 4 total, 3 up, 4 in 2026-03-10T12:34:20.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:20 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:34:20.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:20 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:34:20.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:20 vm07 ceph-mon[58582]: 
from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:20.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:20 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:20.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:20 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:20.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:20 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:20.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:20 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:21.815 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:21 vm07 ceph-mon[58582]: osdmap e18: 4 total, 3 up, 4 in 2026-03-10T12:34:21.815 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:21 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:34:21.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:21 vm00 ceph-mon[50686]: osdmap e18: 4 total, 3 up, 4 in 2026-03-10T12:34:21.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:21 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:34:21.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:21 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch 2026-03-10T12:34:22.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:21 vm07 ceph-mon[58582]: 
from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch 2026-03-10T12:34:22.634 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:22 vm07 ceph-mon[58582]: pgmap v35: 0 pgs: ; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T12:34:22.899 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:22 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished 2026-03-10T12:34:22.899 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:22 vm07 ceph-mon[58582]: osdmap e19: 4 total, 3 up, 4 in 2026-03-10T12:34:22.899 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:22 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:34:22.899 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:22 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch 2026-03-10T12:34:22.899 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:22 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T12:34:22.899 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:22 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:22 vm00 ceph-mon[50686]: pgmap v35: 0 pgs: ; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 
2026-03-10T12:34:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:22 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished 2026-03-10T12:34:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:22 vm00 ceph-mon[50686]: osdmap e19: 4 total, 3 up, 4 in 2026-03-10T12:34:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:22 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:34:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:22 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch 2026-03-10T12:34:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:22 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T12:34:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:22 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:23.734 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83014]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vde 2026-03-10T12:34:23.734 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83014]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T12:34:23.734 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83014]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-10T12:34:23.734 
INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83014]: pam_unix(sudo:session): session closed for user root 2026-03-10T12:34:23.734 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83017]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vdd 2026-03-10T12:34:23.734 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83017]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T12:34:23.734 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83017]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-10T12:34:23.734 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83017]: pam_unix(sudo:session): session closed for user root 2026-03-10T12:34:24.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:23 vm07 ceph-mon[58582]: Deploying daemon osd.3 on vm07 2026-03-10T12:34:24.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:23 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 2026-03-10T12:34:24.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:23 vm07 ceph-mon[58582]: osdmap e20: 4 total, 3 up, 4 in 2026-03-10T12:34:24.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:23 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:34:24.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:23 vm07 ceph-mon[58582]: pgmap v38: 1 pgs: 1 unknown; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T12:34:24.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:23 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: 
dispatch 2026-03-10T12:34:24.234 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83020]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vdc 2026-03-10T12:34:24.234 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83020]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T12:34:24.234 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83020]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-10T12:34:24.234 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83020]: pam_unix(sudo:session): session closed for user root 2026-03-10T12:34:24.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:23 vm00 ceph-mon[50686]: Deploying daemon osd.3 on vm07 2026-03-10T12:34:24.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:23 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 2026-03-10T12:34:24.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:23 vm00 ceph-mon[50686]: osdmap e20: 4 total, 3 up, 4 in 2026-03-10T12:34:24.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:23 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:34:24.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:23 vm00 ceph-mon[50686]: pgmap v38: 1 pgs: 1 unknown; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T12:34:24.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:23 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:34:24.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83023]: ceph : 
TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vda 2026-03-10T12:34:24.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83023]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T12:34:24.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83023]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-10T12:34:24.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:23 vm00 sudo[83023]: pam_unix(sudo:session): session closed for user root 2026-03-10T12:34:24.380 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 sudo[64317]: ceph : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/sbin/smartctl -x --json=o /dev/vda 2026-03-10T12:34:24.380 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 sudo[64317]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-10T12:34:24.380 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 sudo[64317]: pam_unix(sudo:session): session opened for user root by (uid=0) 2026-03-10T12:34:24.380 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 sudo[64317]: pam_unix(sudo:session): session closed for user root 2026-03-10T12:34:24.949 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 ceph-mon[58582]: osdmap e21: 4 total, 3 up, 4 in 2026-03-10T12:34:24.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:34:24.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 ceph-mon[58582]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T12:34:24.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 ceph-mon[58582]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 
2026-03-10T12:34:24.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:34:24.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:34:24.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 ceph-mon[58582]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T12:34:24.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:34:24.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:34:24.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 ceph-mon[58582]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T12:34:24.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:24.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:24.950 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:24 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:34:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:24 vm00 ceph-mon[50686]: osdmap e21: 4 total, 3 up, 4 in 2026-03-10T12:34:25.234 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:24 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:34:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:24 vm00 ceph-mon[50686]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T12:34:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:24 vm00 ceph-mon[50686]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T12:34:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:24 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:34:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:24 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:34:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:24 vm00 ceph-mon[50686]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T12:34:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:24 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:34:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:24 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:34:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:24 vm00 ceph-mon[50686]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T12:34:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:24 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:25.234 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:24 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:25.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:24 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:34:25.441 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:25.440+0000 7f662bfff700 1 -- 192.168.123.107:0/264103864 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f6634061190 con 0x7f662006c600 2026-03-10T12:34:25.443 INFO:teuthology.orchestra.run.vm07.stdout:Created osd(s) 3 on host 'vm07' 2026-03-10T12:34:25.443 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:25.443+0000 7f66395b6700 1 -- 192.168.123.107:0/264103864 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f662006c600 msgr2=0x7f662006eab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:25.444 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:25.443+0000 7f66395b6700 1 --2- 192.168.123.107:0/264103864 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f662006c600 0x7f662006eab0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f6634068eb0 tx=0x7f661c008040 comp rx=0 tx=0).stop 2026-03-10T12:34:25.444 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:25.443+0000 7f66395b6700 1 -- 192.168.123.107:0/264103864 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6634103680 msgr2=0x7f66341985b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:25.444 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:25.443+0000 7f66395b6700 1 --2- 192.168.123.107:0/264103864 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6634103680 0x7f66341985b0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f6624005f50 tx=0x7f6624004a60 
comp rx=0 tx=0).stop 2026-03-10T12:34:25.444 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:25.443+0000 7f66395b6700 1 -- 192.168.123.107:0/264103864 shutdown_connections 2026-03-10T12:34:25.444 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:25.443+0000 7f66395b6700 1 --2- 192.168.123.107:0/264103864 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f662006c600 0x7f662006eab0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:25.444 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:25.443+0000 7f66395b6700 1 --2- 192.168.123.107:0/264103864 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6634069180 0x7f6634198070 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:25.444 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:25.443+0000 7f66395b6700 1 --2- 192.168.123.107:0/264103864 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6634103680 0x7f66341985b0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:25.444 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:25.443+0000 7f66395b6700 1 -- 192.168.123.107:0/264103864 >> 192.168.123.107:0/264103864 conn(0x7f66340faa70 msgr2=0x7f66341042e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:25.444 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:25.443+0000 7f66395b6700 1 -- 192.168.123.107:0/264103864 shutdown_connections 2026-03-10T12:34:25.444 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:25.443+0000 7f66395b6700 1 -- 192.168.123.107:0/264103864 wait complete. 2026-03-10T12:34:25.499 DEBUG:teuthology.orchestra.run.vm07:osd.3> sudo journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.3.service 2026-03-10T12:34:25.501 INFO:tasks.cephadm:Deploying osd.4 on vm07 with /dev/vdd... 
2026-03-10T12:34:25.501 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- lvm zap /dev/vdd 2026-03-10T12:34:25.718 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm07/config 2026-03-10T12:34:26.298 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:34:26.329 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph orch daemon add osd vm07:/dev/vdd 2026-03-10T12:34:26.545 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:34:26 vm07 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3[64470]: 2026-03-10T12:34:26.318+0000 7f54b7edb640 -1 osd.3 0 log_to_monitors true 2026-03-10T12:34:27.092 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm07/config 2026-03-10T12:34:27.138 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:26 vm07 ceph-mon[58582]: pgmap v40: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail 2026-03-10T12:34:27.138 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:26 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:27.138 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:26 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:27.138 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:26 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:27.138 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:26 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 
2026-03-10T12:34:27.138 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:26 vm07 ceph-mon[58582]: mgrmap e19: vm00.nescmq(active, since 62s), standbys: vm07.kfawlb 2026-03-10T12:34:27.138 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:26 vm07 ceph-mon[58582]: from='osd.3 [v2:192.168.123.107:6800/698977351,v1:192.168.123.107:6801/698977351]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T12:34:27.138 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:26 vm07 ceph-mon[58582]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T12:34:27.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:26 vm00 ceph-mon[50686]: pgmap v40: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail 2026-03-10T12:34:27.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:26 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:27.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:26 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:27.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:26 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:27.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:26 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:27.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:26 vm00 ceph-mon[50686]: mgrmap e19: vm00.nescmq(active, since 62s), standbys: vm07.kfawlb 2026-03-10T12:34:27.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:26 vm00 ceph-mon[50686]: from='osd.3 [v2:192.168.123.107:6800/698977351,v1:192.168.123.107:6801/698977351]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 
2026-03-10T12:34:27.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:26 vm00 ceph-mon[50686]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T12:34:27.700 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.698+0000 7fb6bd70e700 1 -- 192.168.123.107:0/3255668195 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb6b80ff7d0 msgr2=0x7fb6b80ffc40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:27.700 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.698+0000 7fb6bd70e700 1 --2- 192.168.123.107:0/3255668195 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb6b80ff7d0 0x7fb6b80ffc40 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fb6a8009b00 tx=0x7fb6a8009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:27.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.702+0000 7fb6bd70e700 1 -- 192.168.123.107:0/3255668195 shutdown_connections 2026-03-10T12:34:27.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.702+0000 7fb6bd70e700 1 --2- 192.168.123.107:0/3255668195 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb6b80ff7d0 0x7fb6b80ffc40 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:27.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.702+0000 7fb6bd70e700 1 --2- 192.168.123.107:0/3255668195 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb6b80fee80 0x7fb6b80ff290 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:27.702 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.702+0000 7fb6bd70e700 1 -- 192.168.123.107:0/3255668195 >> 192.168.123.107:0/3255668195 conn(0x7fb6b80faa70 msgr2=0x7fb6b80fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:27.703 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.702+0000 7fb6bd70e700 1 -- 192.168.123.107:0/3255668195 shutdown_connections 2026-03-10T12:34:27.703 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.702+0000 7fb6bd70e700 1 -- 192.168.123.107:0/3255668195 wait complete. 2026-03-10T12:34:27.703 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.703+0000 7fb6bd70e700 1 Processor -- start 2026-03-10T12:34:27.704 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.703+0000 7fb6bd70e700 1 -- start start 2026-03-10T12:34:27.704 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.703+0000 7fb6bd70e700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb6b80fee80 0x7fb6b8193bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:27.704 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.703+0000 7fb6bd70e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb6b80ff7d0 0x7fb6b8194110 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:27.704 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.703+0000 7fb6bd70e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb6b81946e0 con 0x7fb6b80ff7d0 2026-03-10T12:34:27.704 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.703+0000 7fb6bd70e700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb6b8194820 con 0x7fb6b80fee80 2026-03-10T12:34:27.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.704+0000 7fb6b6ffd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb6b80fee80 0x7fb6b8193bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:27.705 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.704+0000 7fb6b67fc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb6b80ff7d0 0x7fb6b8194110 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:27.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.705+0000 7fb6b67fc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb6b80ff7d0 0x7fb6b8194110 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:34666/0 (socket says 192.168.123.107:34666) 2026-03-10T12:34:27.705 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.705+0000 7fb6b67fc700 1 -- 192.168.123.107:0/855245429 learned_addr learned my addr 192.168.123.107:0/855245429 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:34:27.706 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.705+0000 7fb6b67fc700 1 -- 192.168.123.107:0/855245429 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb6b80fee80 msgr2=0x7fb6b8193bd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:27.706 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.705+0000 7fb6b67fc700 1 --2- 192.168.123.107:0/855245429 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb6b80fee80 0x7fb6b8193bd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:27.706 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.705+0000 7fb6b67fc700 1 -- 192.168.123.107:0/855245429 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb6a80097e0 con 0x7fb6b80ff7d0 2026-03-10T12:34:27.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.706+0000 7fb6b67fc700 1 --2- 
192.168.123.107:0/855245429 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb6b80ff7d0 0x7fb6b8194110 secure :-1 s=READY pgs=193 cs=0 l=1 rev1=1 crypto rx=0x7fb6a8000c00 tx=0x7fb6a80048c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:27.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.707+0000 7fb6affff700 1 -- 192.168.123.107:0/855245429 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb6a801d070 con 0x7fb6b80ff7d0 2026-03-10T12:34:27.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.707+0000 7fb6bd70e700 1 -- 192.168.123.107:0/855245429 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb6b806a830 con 0x7fb6b80ff7d0 2026-03-10T12:34:27.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.707+0000 7fb6bd70e700 1 -- 192.168.123.107:0/855245429 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb6b806ad20 con 0x7fb6b80ff7d0 2026-03-10T12:34:27.707 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.707+0000 7fb6affff700 1 -- 192.168.123.107:0/855245429 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb6a800bb40 con 0x7fb6b80ff7d0 2026-03-10T12:34:27.708 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.707+0000 7fb6affff700 1 -- 192.168.123.107:0/855245429 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb6a800f670 con 0x7fb6b80ff7d0 2026-03-10T12:34:27.710 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.709+0000 7fb6affff700 1 -- 192.168.123.107:0/855245429 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb6a800f7d0 con 0x7fb6b80ff7d0 2026-03-10T12:34:27.710 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.709+0000 7fb6affff700 
1 --2- 192.168.123.107:0/855245429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb6a406c7a0 0x7fb6a406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:34:27.710 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.710+0000 7fb6affff700 1 -- 192.168.123.107:0/855245429 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(22..22 src has 1..22) v4 ==== 3186+0+0 (secure 0 0 0) 0x7fb6a808d7b0 con 0x7fb6b80ff7d0
2026-03-10T12:34:27.711 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.710+0000 7fb6b6ffd700 1 --2- 192.168.123.107:0/855245429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb6a406c7a0 0x7fb6a406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:34:27.711 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.710+0000 7fb6b6ffd700 1 --2- 192.168.123.107:0/855245429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb6a406c7a0 0x7fb6a406ec50 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fb6b8068f50 tx=0x7fb6a0009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:34:27.711 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.711+0000 7fb6bd70e700 1 -- 192.168.123.107:0/855245429 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb698005320 con 0x7fb6b80ff7d0
2026-03-10T12:34:27.715 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.714+0000 7fb6affff700 1 -- 192.168.123.107:0/855245429 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb6a8027080 con 0x7fb6b80ff7d0
2026-03-10T12:34:27.838 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:27.837+0000 7fb6bd70e700 1 -- 192.168.123.107:0/855245429 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7fb698000bf0 con 0x7fb6a406c7a0
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: osdmap e22: 4 total, 3 up, 4 in
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: from='osd.3 [v2:192.168.123.107:6800/698977351,v1:192.168.123.107:6801/698977351]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: pgmap v42: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-10T12:34:28.070 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:28 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:34:28.322 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:34:28 vm07 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3[64470]: 2026-03-10T12:34:28.070+0000 7f54acd51700 -1 osd.3 0 waiting for initial osdmap
2026-03-10T12:34:28.322 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:34:28 vm07 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3[64470]: 2026-03-10T12:34:28.077+0000 7f54a9347700 -1 osd.3 23 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: osdmap e22: 4 total, 3 up, 4 in
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: from='osd.3 [v2:192.168.123.107:6800/698977351,v1:192.168.123.107:6801/698977351]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: pgmap v42: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-10T12:34:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:28 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: Detected new or changed devices on vm07
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: from='client.14328 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm07", "root=default"]}]': finished
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: osdmap e23: 4 total, 3 up, 4 in
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: from='client.? 192.168.123.107:0/1101640381' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "ff03ddab-6945-46b4-b19b-30775ca85618"}]: dispatch
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "ff03ddab-6945-46b4-b19b-30775ca85618"}]: dispatch
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: osd.3 [v2:192.168.123.107:6800/698977351,v1:192.168.123.107:6801/698977351] boot
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "ff03ddab-6945-46b4-b19b-30775ca85618"}]': finished
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: osdmap e24: 5 total, 4 up, 5 in
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T12:34:29.402 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:29 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: Detected new or changed devices on vm07
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: from='client.14328 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm07", "root=default"]}]': finished
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: osdmap e23: 4 total, 3 up, 4 in
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/1101640381' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "ff03ddab-6945-46b4-b19b-30775ca85618"}]: dispatch
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "ff03ddab-6945-46b4-b19b-30775ca85618"}]: dispatch
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: osd.3 [v2:192.168.123.107:6800/698977351,v1:192.168.123.107:6801/698977351] boot
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "ff03ddab-6945-46b4-b19b-30775ca85618"}]': finished
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: osdmap e24: 5 total, 4 up, 5 in
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T12:34:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:29 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T12:34:30.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:30 vm00 ceph-mon[50686]: purged_snaps scrub starts
2026-03-10T12:34:30.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:30 vm00 ceph-mon[50686]: purged_snaps scrub ok
2026-03-10T12:34:30.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:30 vm00 ceph-mon[50686]: pgmap v45: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail
2026-03-10T12:34:30.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:30 vm00 ceph-mon[50686]: osdmap e25: 5 total, 4 up, 5 in
2026-03-10T12:34:30.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:30 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T12:34:30.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:30 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/3456835836' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-10T12:34:30.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:30 vm07 ceph-mon[58582]: purged_snaps scrub starts
2026-03-10T12:34:30.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:30 vm07 ceph-mon[58582]: purged_snaps scrub ok
2026-03-10T12:34:30.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:30 vm07 ceph-mon[58582]: pgmap v45: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail
2026-03-10T12:34:30.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:30 vm07 ceph-mon[58582]: osdmap e25: 5 total, 4 up, 5 in
2026-03-10T12:34:30.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:30 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T12:34:30.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:30 vm07 ceph-mon[58582]: from='client.? 192.168.123.107:0/3456835836' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-10T12:34:31.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:31 vm00 ceph-mon[50686]: osdmap e26: 5 total, 4 up, 5 in
2026-03-10T12:34:31.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:31 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T12:34:31.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:31 vm07 ceph-mon[58582]: osdmap e26: 5 total, 4 up, 5 in
2026-03-10T12:34:31.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:31 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T12:34:32.513 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:32 vm07 ceph-mon[58582]: pgmap v48: 1 pgs: 1 remapped+peering; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 2.2 KiB/s rd, 65 KiB/s wr, 5 op/s
2026-03-10T12:34:32.641 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:32 vm00 ceph-mon[50686]: pgmap v48: 1 pgs: 1 remapped+peering; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 2.2 KiB/s rd, 65 KiB/s wr, 5 op/s
2026-03-10T12:34:34.427 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:34 vm07 ceph-mon[58582]: pgmap v49: 1 pgs: 1 remapped+peering; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 1.8 KiB/s rd, 52 KiB/s wr, 4 op/s
2026-03-10T12:34:34.428 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:34 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch
2026-03-10T12:34:34.428 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:34 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:34:34.428 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:34 vm07 ceph-mon[58582]: Deploying daemon osd.4 on vm07
2026-03-10T12:34:34.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:34 vm00 ceph-mon[50686]: pgmap v49: 1 pgs: 1 remapped+peering; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 1.8 KiB/s rd, 52 KiB/s wr, 4 op/s
2026-03-10T12:34:34.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:34 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch
2026-03-10T12:34:34.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:34 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:34:34.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:34 vm00 ceph-mon[50686]: Deploying daemon osd.4 on vm07
2026-03-10T12:34:35.503 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:35 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:34:35.503 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:35 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:35.503 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:35 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:35.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:35 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:34:35.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:35 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:35.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:35 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:35.978 INFO:teuthology.orchestra.run.vm07.stdout:Created osd(s) 4 on host 'vm07'
2026-03-10T12:34:35.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:35.976+0000 7fb6affff700 1 -- 192.168.123.107:0/855245429 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7fb698000bf0 con 0x7fb6a406c7a0
2026-03-10T12:34:35.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:35.978+0000 7fb6bd70e700 1 -- 192.168.123.107:0/855245429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb6a406c7a0 msgr2=0x7fb6a406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:34:35.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:35.978+0000 7fb6bd70e700 1 --2- 192.168.123.107:0/855245429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb6a406c7a0 0x7fb6a406ec50 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fb6b8068f50 tx=0x7fb6a0009450 comp rx=0 tx=0).stop
2026-03-10T12:34:35.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:35.978+0000 7fb6bd70e700 1 -- 192.168.123.107:0/855245429 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb6b80ff7d0 msgr2=0x7fb6b8194110 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:34:35.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:35.978+0000 7fb6bd70e700 1 --2- 192.168.123.107:0/855245429 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb6b80ff7d0 0x7fb6b8194110 secure :-1 s=READY pgs=193 cs=0 l=1 rev1=1 crypto rx=0x7fb6a8000c00 tx=0x7fb6a80048c0 comp rx=0 tx=0).stop
2026-03-10T12:34:35.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:35.978+0000 7fb6bd70e700 1 -- 192.168.123.107:0/855245429 shutdown_connections
2026-03-10T12:34:35.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:35.978+0000 7fb6bd70e700 1 --2- 192.168.123.107:0/855245429 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb6a406c7a0 0x7fb6a406ec50 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:35.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:35.978+0000 7fb6bd70e700 1 --2- 192.168.123.107:0/855245429 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb6b80fee80 0x7fb6b8193bd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:35.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:35.979+0000 7fb6bd70e700 1 --2- 192.168.123.107:0/855245429 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb6b80ff7d0 0x7fb6b8194110 unknown :-1 s=CLOSED pgs=193 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:35.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:35.979+0000 7fb6bd70e700 1 -- 192.168.123.107:0/855245429 >> 192.168.123.107:0/855245429 conn(0x7fb6b80faa70 msgr2=0x7fb6b81030a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:34:35.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:35.979+0000 7fb6bd70e700 1 -- 192.168.123.107:0/855245429 shutdown_connections
2026-03-10T12:34:35.979 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:35.979+0000 7fb6bd70e700 1 -- 192.168.123.107:0/855245429 wait complete.
2026-03-10T12:34:36.026 DEBUG:teuthology.orchestra.run.vm07:osd.4> sudo journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.4.service
2026-03-10T12:34:36.028 INFO:tasks.cephadm:Deploying osd.5 on vm07 with /dev/vdc...
2026-03-10T12:34:36.028 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- lvm zap /dev/vdc
2026-03-10T12:34:36.242 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm07/config
2026-03-10T12:34:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:36 vm07 ceph-mon[58582]: pgmap v50: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 1.5 KiB/s rd, 42 KiB/s wr, 3 op/s; 72 KiB/s, 0 objects/s recovering
2026-03-10T12:34:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:36 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:36 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:36 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:36 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:36.733 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:36 vm00 ceph-mon[50686]: pgmap v50: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 1.5 KiB/s rd, 42 KiB/s wr, 3 op/s; 72 KiB/s, 0 objects/s recovering
2026-03-10T12:34:36.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:36 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:36.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:36 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:36.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:36 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:36.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:36 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:36.773 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:34:36.787 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph orch daemon add osd vm07:/dev/vdc
2026-03-10T12:34:37.035 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm07/config
2026-03-10T12:34:37.338 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.337+0000 7f6eb466d700 1 -- 192.168.123.107:0/471352594 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6eac071a90 msgr2=0x7f6eac071ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:34:37.338 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.337+0000 7f6eb466d700 1 --2- 192.168.123.107:0/471352594 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6eac071a90 0x7f6eac071ea0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f6ea8009b00 tx=0x7f6ea8009e10 comp rx=0 tx=0).stop
2026-03-10T12:34:37.338 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.337+0000 7f6eb466d700 1 -- 192.168.123.107:0/471352594 shutdown_connections
2026-03-10T12:34:37.338 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.337+0000 7f6eb466d700 1 --2- 192.168.123.107:0/471352594 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6eac072470 0x7f6eac109ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:37.338 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.337+0000 7f6eb466d700 1 --2- 192.168.123.107:0/471352594 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6eac071a90 0x7f6eac071ea0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:37.338 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.337+0000 7f6eb466d700 1 -- 192.168.123.107:0/471352594 >> 192.168.123.107:0/471352594 conn(0x7f6eac06d1a0 msgr2=0x7f6eac06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:34:37.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.338+0000 7f6eb466d700 1 -- 192.168.123.107:0/471352594 shutdown_connections
2026-03-10T12:34:37.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.338+0000 7f6eb466d700 1 -- 192.168.123.107:0/471352594 wait complete.
2026-03-10T12:34:37.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.339+0000 7f6eb466d700 1 Processor -- start
2026-03-10T12:34:37.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.339+0000 7f6eb466d700 1 -- start start
2026-03-10T12:34:37.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.339+0000 7f6eb466d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6eac071a90 0x7f6eac1a0550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:34:37.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.339+0000 7f6eb466d700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6eac072470 0x7f6eac1a0a90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:34:37.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.339+0000 7f6eb466d700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6eac1a10b0 con 0x7f6eac071a90
2026-03-10T12:34:37.339 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.339+0000 7f6eb466d700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6eac1a11f0 con 0x7f6eac072470
2026-03-10T12:34:37.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.339+0000 7f6eb1c08700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6eac072470 0x7f6eac1a0a90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:34:37.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.339+0000 7f6eb2409700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6eac071a90 0x7f6eac1a0550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:34:37.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.339+0000 7f6eb2409700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6eac071a90 0x7f6eac1a0550 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.107:45816/0 (socket says 192.168.123.107:45816)
2026-03-10T12:34:37.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.339+0000 7f6eb2409700 1 -- 192.168.123.107:0/1887225971 learned_addr learned my addr 192.168.123.107:0/1887225971 (peer_addr_for_me v2:192.168.123.107:0/0)
2026-03-10T12:34:37.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.340+0000 7f6eb2409700 1 -- 192.168.123.107:0/1887225971 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6eac072470 msgr2=0x7f6eac1a0a90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:34:37.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.340+0000 7f6eb2409700 1 --2- 192.168.123.107:0/1887225971 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6eac072470 0x7f6eac1a0a90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:37.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.340+0000 7f6eb2409700 1 -- 192.168.123.107:0/1887225971 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6ea80097e0 con 0x7f6eac071a90
2026-03-10T12:34:37.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.340+0000 7f6eb2409700 1 --2- 192.168.123.107:0/1887225971 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6eac071a90 0x7f6eac1a0550 secure :-1 s=READY pgs=195 cs=0 l=1 rev1=1 crypto rx=0x7f6ea800bac0 tx=0x7f6ea800baf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:34:37.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.340+0000 7f6e9f7fe700 1 -- 192.168.123.107:0/1887225971 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6ea801d070 con 0x7f6eac071a90
2026-03-10T12:34:37.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.340+0000 7f6eb466d700 1 -- 192.168.123.107:0/1887225971 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6eac1a5c40 con 0x7f6eac071a90
2026-03-10T12:34:37.340 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.340+0000 7f6eb466d700 1 -- 192.168.123.107:0/1887225971 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6eac1a6130 con 0x7f6eac071a90
2026-03-10T12:34:37.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.342+0000 7f6e9f7fe700 1 -- 192.168.123.107:0/1887225971 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6ea800fb30 con 0x7f6eac071a90
2026-03-10T12:34:37.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.342+0000 7f6e9f7fe700 1 -- 192.168.123.107:0/1887225971 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6ea8022c90 con 0x7f6eac071a90
2026-03-10T12:34:37.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.342+0000 7f6eb466d700 1 -- 192.168.123.107:0/1887225971 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6eac19a850 con 0x7f6eac071a90
2026-03-10T12:34:37.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.343+0000 7f6e9f7fe700 1 -- 192.168.123.107:0/1887225971 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f6ea800f650 con 0x7f6eac071a90
2026-03-10T12:34:37.343 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.343+0000 7f6e9f7fe700 1 --2- 192.168.123.107:0/1887225971 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6e9806c7a0 0x7f6e9806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:34:37.344 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.343+0000 7f6e9f7fe700 1 -- 192.168.123.107:0/1887225971 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(26..26 src has 1..26) v4 ==== 3697+0+0 (secure 0 0 0) 0x7f6ea808cea0 con 0x7f6eac071a90
2026-03-10T12:34:37.346 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.346+0000 7f6eb1c08700 1 --2- 192.168.123.107:0/1887225971 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6e9806c7a0 0x7f6e9806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:34:37.347 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.346+0000 7f6e9f7fe700 1 -- 192.168.123.107:0/1887225971 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6ea805b750 con 0x7f6eac071a90
2026-03-10T12:34:37.347 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.346+0000 7f6eb1c08700 1 --2- 192.168.123.107:0/1887225971 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6e9806c7a0 0x7f6e9806ec50 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f6ea0006fd0 tx=0x7f6ea0009380 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:34:37.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:37.491+0000 7f6eb466d700 1 -- 192.168.123.107:0/1887225971 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7f6eac061190 con 0x7f6e9806c7a0
2026-03-10T12:34:37.811 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:34:37 vm07 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4[69961]: 2026-03-10T12:34:37.609+0000 7f18de5c2640 -1 osd.4 0 log_to_monitors true
2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: pgmap v51: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 1.1 KiB/s rd, 33 KiB/s wr, 2 op/s; 56 KiB/s, 0 objects/s recovering
2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch
2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='osd.4 [v2:192.168.123.107:6808/4106671248,v1:192.168.123.107:6809/4106671248]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch
2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch
2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='mgr.14223
192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:38.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:38 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: pgmap v51: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 1.1 KiB/s rd, 33 KiB/s wr, 2 op/s; 56 KiB/s, 0 objects/s recovering 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: from='mgr.14223 
192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: from='osd.4 [v2:192.168.123.107:6808/4106671248,v1:192.168.123.107:6809/4106671248]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: 
from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:38 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:34:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:39 vm07 ceph-mon[58582]: from='client.14338 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:34:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:39 vm07 ceph-mon[58582]: Detected new or changed devices on vm07 2026-03-10T12:34:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:39 vm07 ceph-mon[58582]: from='client.? 192.168.123.107:0/1172887639' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "d91585d7-d879-48f1-8fdf-f6c88a82428a"}]: dispatch 2026-03-10T12:34:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:39 vm07 ceph-mon[58582]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T12:34:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:39 vm07 ceph-mon[58582]: from='client.? 
192.168.123.107:0/1172887639' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "d91585d7-d879-48f1-8fdf-f6c88a82428a"}]': finished 2026-03-10T12:34:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:39 vm07 ceph-mon[58582]: osdmap e27: 6 total, 4 up, 6 in 2026-03-10T12:34:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:39 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:34:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:39 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:39 vm07 ceph-mon[58582]: from='osd.4 [v2:192.168.123.107:6808/4106671248,v1:192.168.123.107:6809/4106671248]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:34:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:39 vm07 ceph-mon[58582]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:34:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:39 vm07 ceph-mon[58582]: from='client.? 
192.168.123.107:0/1852610371' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T12:34:39.566 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:34:39 vm07 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4[69961]: 2026-03-10T12:34:39.398+0000 7f18d3438700 -1 osd.4 0 waiting for initial osdmap 2026-03-10T12:34:39.566 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:34:39 vm07 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4[69961]: 2026-03-10T12:34:39.416+0000 7f18cfa2e700 -1 osd.4 28 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T12:34:39.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:39 vm00 ceph-mon[50686]: from='client.14338 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm07:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:34:39.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:39 vm00 ceph-mon[50686]: Detected new or changed devices on vm07 2026-03-10T12:34:39.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:39 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/1172887639' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "d91585d7-d879-48f1-8fdf-f6c88a82428a"}]: dispatch 2026-03-10T12:34:39.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:39 vm00 ceph-mon[50686]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T12:34:39.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:39 vm00 ceph-mon[50686]: from='client.? 
192.168.123.107:0/1172887639' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "d91585d7-d879-48f1-8fdf-f6c88a82428a"}]': finished 2026-03-10T12:34:39.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:39 vm00 ceph-mon[50686]: osdmap e27: 6 total, 4 up, 6 in 2026-03-10T12:34:39.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:39 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:34:39.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:39 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:39.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:39 vm00 ceph-mon[50686]: from='osd.4 [v2:192.168.123.107:6808/4106671248,v1:192.168.123.107:6809/4106671248]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:34:39.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:39 vm00 ceph-mon[50686]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:34:39.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:39 vm00 ceph-mon[50686]: from='client.? 
192.168.123.107:0/1852610371' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T12:34:40.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:40 vm00 ceph-mon[50686]: pgmap v53: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 51 KiB/s, 0 objects/s recovering 2026-03-10T12:34:40.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:40 vm00 ceph-mon[50686]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm07", "root=default"]}]': finished 2026-03-10T12:34:40.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:40 vm00 ceph-mon[50686]: osdmap e28: 6 total, 4 up, 6 in 2026-03-10T12:34:40.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:40 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:34:40.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:40 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:40.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:40 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:34:40.815 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:40 vm07 ceph-mon[58582]: pgmap v53: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 51 KiB/s, 0 objects/s recovering 2026-03-10T12:34:40.815 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:40 vm07 ceph-mon[58582]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm07", "root=default"]}]': finished 2026-03-10T12:34:40.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:40 vm07 ceph-mon[58582]: osdmap e28: 6 total, 4 up, 6 in 
2026-03-10T12:34:40.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:40 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:34:40.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:40 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:40.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:40 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:34:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:41 vm00 ceph-mon[50686]: purged_snaps scrub ok 2026-03-10T12:34:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:41 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:34:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:41 vm00 ceph-mon[50686]: osd.4 [v2:192.168.123.107:6808/4106671248,v1:192.168.123.107:6809/4106671248] boot 2026-03-10T12:34:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:41 vm00 ceph-mon[50686]: osdmap e29: 6 total, 5 up, 6 in 2026-03-10T12:34:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:41 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:34:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:41 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:41.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:41 vm07 ceph-mon[58582]: purged_snaps scrub ok 2026-03-10T12:34:41.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:41 vm07 ceph-mon[58582]: 
from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:34:41.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:41 vm07 ceph-mon[58582]: osd.4 [v2:192.168.123.107:6808/4106671248,v1:192.168.123.107:6809/4106671248] boot 2026-03-10T12:34:41.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:41 vm07 ceph-mon[58582]: osdmap e29: 6 total, 5 up, 6 in 2026-03-10T12:34:41.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:41 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:34:41.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:41 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:42.407 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:42 vm07 ceph-mon[58582]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-10T12:34:42.407 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:42 vm07 ceph-mon[58582]: osdmap e30: 6 total, 5 up, 6 in 2026-03-10T12:34:42.407 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:42 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:42.691 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:42 vm00 ceph-mon[50686]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-10T12:34:42.691 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:42 vm00 ceph-mon[50686]: osdmap e30: 6 total, 5 up, 6 in 2026-03-10T12:34:42.691 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:42 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:43.626 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:43 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T12:34:43.626 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:43 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:43.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:43 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T12:34:43.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:43 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:44.687 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:44 vm07 ceph-mon[58582]: Deploying daemon osd.5 on vm07 2026-03-10T12:34:44.687 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:44 vm07 ceph-mon[58582]: pgmap v58: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T12:34:44.687 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:34:44.687 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:44.687 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:44 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:44.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:44 vm00 ceph-mon[50686]: Deploying daemon osd.5 on vm07 2026-03-10T12:34:44.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:44 vm00 ceph-mon[50686]: 
pgmap v58: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T12:34:44.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:34:44.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:44.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:44 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:45.221 INFO:teuthology.orchestra.run.vm07.stdout:Created osd(s) 5 on host 'vm07' 2026-03-10T12:34:45.221 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:45.221+0000 7f6e9f7fe700 1 -- 192.168.123.107:0/1887225971 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f6eac061190 con 0x7f6e9806c7a0 2026-03-10T12:34:45.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:45.223+0000 7f6eb466d700 1 -- 192.168.123.107:0/1887225971 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6e9806c7a0 msgr2=0x7f6e9806ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:45.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:45.223+0000 7f6eb466d700 1 --2- 192.168.123.107:0/1887225971 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6e9806c7a0 0x7f6e9806ec50 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f6ea0006fd0 tx=0x7f6ea0009380 comp rx=0 tx=0).stop 2026-03-10T12:34:45.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:45.223+0000 7f6eb466d700 1 -- 192.168.123.107:0/1887225971 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6eac071a90 msgr2=0x7f6eac1a0550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:45.224 
INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:45.223+0000 7f6eb466d700 1 --2- 192.168.123.107:0/1887225971 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6eac071a90 0x7f6eac1a0550 secure :-1 s=READY pgs=195 cs=0 l=1 rev1=1 crypto rx=0x7f6ea800bac0 tx=0x7f6ea800baf0 comp rx=0 tx=0).stop 2026-03-10T12:34:45.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:45.223+0000 7f6eb466d700 1 -- 192.168.123.107:0/1887225971 shutdown_connections 2026-03-10T12:34:45.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:45.223+0000 7f6eb466d700 1 --2- 192.168.123.107:0/1887225971 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6e9806c7a0 0x7f6e9806ec50 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:45.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:45.223+0000 7f6eb466d700 1 --2- 192.168.123.107:0/1887225971 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6eac071a90 0x7f6eac1a0550 unknown :-1 s=CLOSED pgs=195 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:45.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:45.223+0000 7f6eb466d700 1 --2- 192.168.123.107:0/1887225971 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6eac072470 0x7f6eac1a0a90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:45.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:45.223+0000 7f6eb466d700 1 -- 192.168.123.107:0/1887225971 >> 192.168.123.107:0/1887225971 conn(0x7f6eac06d1a0 msgr2=0x7f6eac110440 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:45.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:45.223+0000 7f6eb466d700 1 -- 192.168.123.107:0/1887225971 shutdown_connections 2026-03-10T12:34:45.224 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:45.223+0000 7f6eb466d700 1 -- 192.168.123.107:0/1887225971 
wait complete. 2026-03-10T12:34:45.281 DEBUG:teuthology.orchestra.run.vm07:osd.5> sudo journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.5.service 2026-03-10T12:34:45.283 INFO:tasks.cephadm:Waiting for 6 OSDs to come up... 2026-03-10T12:34:45.283 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd stat -f json 2026-03-10T12:34:45.454 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:45.723 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.722+0000 7fe40c41b700 1 -- 192.168.123.100:0/456574862 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe404101730 msgr2=0x7fe404101b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:45.723 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.722+0000 7fe40c41b700 1 --2- 192.168.123.100:0/456574862 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe404101730 0x7fe404101b80 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7fe3f4009b00 tx=0x7fe3f4009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:45.723 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.723+0000 7fe40c41b700 1 -- 192.168.123.100:0/456574862 shutdown_connections 2026-03-10T12:34:45.723 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.723+0000 7fe40c41b700 1 --2- 192.168.123.100:0/456574862 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe404101730 0x7fe404101b80 unknown :-1 s=CLOSED pgs=197 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:45.723 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.723+0000 7fe40c41b700 1 --2- 192.168.123.100:0/456574862 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe404100530 
0x7fe404100940 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:45.723 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.723+0000 7fe40c41b700 1 -- 192.168.123.100:0/456574862 >> 192.168.123.100:0/456574862 conn(0x7fe4040fbaa0 msgr2=0x7fe4040fdf10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:45.723 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.723+0000 7fe40c41b700 1 -- 192.168.123.100:0/456574862 shutdown_connections 2026-03-10T12:34:45.723 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.723+0000 7fe40c41b700 1 -- 192.168.123.100:0/456574862 wait complete. 2026-03-10T12:34:45.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.724+0000 7fe40c41b700 1 Processor -- start 2026-03-10T12:34:45.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.724+0000 7fe40c41b700 1 -- start start 2026-03-10T12:34:45.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.724+0000 7fe40c41b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe404100530 0x7fe404197ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:45.725 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.724+0000 7fe40c41b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe404101730 0x7fe404198530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:45.725 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.724+0000 7fe40c41b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe404198b50 con 0x7fe404101730 2026-03-10T12:34:45.725 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.724+0000 7fe4099b6700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe404101730 0x7fe404198530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:45.725 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.724+0000 7fe4099b6700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe404101730 0x7fe404198530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33514/0 (socket says 192.168.123.100:33514) 2026-03-10T12:34:45.725 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.724+0000 7fe4099b6700 1 -- 192.168.123.100:0/2949747041 learned_addr learned my addr 192.168.123.100:0/2949747041 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:45.725 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.724+0000 7fe40c41b700 1 -- 192.168.123.100:0/2949747041 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe404198c90 con 0x7fe404100530 2026-03-10T12:34:45.725 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.725+0000 7fe4099b6700 1 -- 192.168.123.100:0/2949747041 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe404100530 msgr2=0x7fe404197ff0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:34:45.725 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.725+0000 7fe4099b6700 1 --2- 192.168.123.100:0/2949747041 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe404100530 0x7fe404197ff0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:45.725 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.725+0000 7fe4099b6700 1 -- 192.168.123.100:0/2949747041 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe3f40097e0 con 0x7fe404101730 2026-03-10T12:34:45.726 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.725+0000 7fe4099b6700 1 
--2- 192.168.123.100:0/2949747041 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe404101730 0x7fe404198530 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7fe3f4009ad0 tx=0x7fe3f4005070 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:45.727 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.725+0000 7fe3fb7fe700 1 -- 192.168.123.100:0/2949747041 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe3f401d070 con 0x7fe404101730 2026-03-10T12:34:45.727 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.725+0000 7fe3fb7fe700 1 -- 192.168.123.100:0/2949747041 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe3f400bc50 con 0x7fe404101730 2026-03-10T12:34:45.727 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.725+0000 7fe3fb7fe700 1 -- 192.168.123.100:0/2949747041 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe3f400f8b0 con 0x7fe404101730 2026-03-10T12:34:45.727 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.726+0000 7fe40c41b700 1 -- 192.168.123.100:0/2949747041 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe40419d6e0 con 0x7fe404101730 2026-03-10T12:34:45.727 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.726+0000 7fe40c41b700 1 -- 192.168.123.100:0/2949747041 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe40418ebc0 con 0x7fe404101730 2026-03-10T12:34:45.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.727+0000 7fe40c41b700 1 -- 192.168.123.100:0/2949747041 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe404105810 con 0x7fe404101730 2026-03-10T12:34:45.731 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.727+0000 7fe3fb7fe700 1 -- 192.168.123.100:0/2949747041 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fe3f400fa10 con 0x7fe404101730 2026-03-10T12:34:45.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.728+0000 7fe3fb7fe700 1 --2- 192.168.123.100:0/2949747041 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe3f006c7a0 0x7fe3f006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:45.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.728+0000 7fe3fb7fe700 1 -- 192.168.123.100:0/2949747041 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(30..30 src has 1..30) v4 ==== 4129+0+0 (secure 0 0 0) 0x7fe3f408cb90 con 0x7fe404101730 2026-03-10T12:34:45.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.730+0000 7fe40a1b7700 1 --2- 192.168.123.100:0/2949747041 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe3f006c7a0 0x7fe3f006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:45.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.731+0000 7fe3fb7fe700 1 -- 192.168.123.100:0/2949747041 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fe3f405b350 con 0x7fe404101730 2026-03-10T12:34:45.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.731+0000 7fe40a1b7700 1 --2- 192.168.123.100:0/2949747041 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe3f006c7a0 0x7fe3f006ec50 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7fe404101590 tx=0x7fe400005c30 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:45.835 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.835+0000 7fe40c41b700 1 -- 192.168.123.100:0/2949747041 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7fe404066e40 con 0x7fe404101730 2026-03-10T12:34:45.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.836+0000 7fe3fb7fe700 1 -- 192.168.123.100:0/2949747041 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v30) v1 ==== 74+0+130 (secure 0 0 0) 0x7fe3f405aee0 con 0x7fe404101730 2026-03-10T12:34:45.836 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:34:45.838 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.838+0000 7fe40c41b700 1 -- 192.168.123.100:0/2949747041 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe3f006c7a0 msgr2=0x7fe3f006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:45.838 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.838+0000 7fe40c41b700 1 --2- 192.168.123.100:0/2949747041 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe3f006c7a0 0x7fe3f006ec50 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7fe404101590 tx=0x7fe400005c30 comp rx=0 tx=0).stop 2026-03-10T12:34:45.838 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.838+0000 7fe40c41b700 1 -- 192.168.123.100:0/2949747041 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe404101730 msgr2=0x7fe404198530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:45.838 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.838+0000 7fe40c41b700 1 --2- 192.168.123.100:0/2949747041 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe404101730 0x7fe404198530 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7fe3f4009ad0 tx=0x7fe3f4005070 comp rx=0 tx=0).stop 2026-03-10T12:34:45.839 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.838+0000 7fe40c41b700 1 -- 192.168.123.100:0/2949747041 shutdown_connections 2026-03-10T12:34:45.839 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.838+0000 7fe40c41b700 1 --2- 192.168.123.100:0/2949747041 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe3f006c7a0 0x7fe3f006ec50 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:45.839 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.838+0000 7fe40c41b700 1 --2- 192.168.123.100:0/2949747041 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe404100530 0x7fe404197ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:45.839 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.838+0000 7fe40c41b700 1 --2- 192.168.123.100:0/2949747041 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe404101730 0x7fe404198530 unknown :-1 s=CLOSED pgs=198 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:45.839 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.839+0000 7fe40c41b700 1 -- 192.168.123.100:0/2949747041 >> 192.168.123.100:0/2949747041 conn(0x7fe4040fbaa0 msgr2=0x7fe404102950 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:45.839 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.839+0000 7fe40c41b700 1 -- 192.168.123.100:0/2949747041 shutdown_connections 2026-03-10T12:34:45.839 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:45.839+0000 7fe40c41b700 1 -- 192.168.123.100:0/2949747041 wait complete. 
2026-03-10T12:34:45.909 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":30,"num_osds":6,"num_up_osds":5,"osd_up_since":1773146080,"num_in_osds":6,"osd_in_since":1773146078,"num_remapped_pgs":0} 2026-03-10T12:34:46.170 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:46 vm00 ceph-mon[50686]: pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T12:34:46.171 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:46 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:46.171 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:46 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:46.171 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:46 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:46.171 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:46 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:46.171 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:46 vm00 ceph-mon[50686]: from='client.? 
192.168.123.100:0/2949747041' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T12:34:46.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:46 vm07 ceph-mon[58582]: pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T12:34:46.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:46 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:46.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:46 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:46.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:46 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:46.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:46 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:46.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:46 vm07 ceph-mon[58582]: from='client.? 
192.168.123.100:0/2949747041' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T12:34:46.910 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd stat -f json 2026-03-10T12:34:47.069 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:47.243 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:34:46 vm07 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[75277]: 2026-03-10T12:34:46.836+0000 7f5267adf640 -1 osd.5 0 log_to_monitors true 2026-03-10T12:34:47.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.349+0000 7f9d418d9700 1 -- 192.168.123.100:0/4117087610 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d3c1038f0 msgr2=0x7f9d3c105cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:47.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.349+0000 7f9d418d9700 1 --2- 192.168.123.100:0/4117087610 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d3c1038f0 0x7f9d3c105cd0 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7f9d2c009b00 tx=0x7f9d2c009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:47.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.350+0000 7f9d418d9700 1 -- 192.168.123.100:0/4117087610 shutdown_connections 2026-03-10T12:34:47.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.350+0000 7f9d418d9700 1 --2- 192.168.123.100:0/4117087610 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d3c1038f0 0x7f9d3c105cd0 unknown :-1 s=CLOSED pgs=199 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:47.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.350+0000 7f9d418d9700 1 --2- 
192.168.123.100:0/4117087610 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9d3c100fd0 0x7f9d3c1033b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:47.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.350+0000 7f9d418d9700 1 -- 192.168.123.100:0/4117087610 >> 192.168.123.100:0/4117087610 conn(0x7f9d3c0fa9b0 msgr2=0x7f9d3c0fce20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:47.351 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.350+0000 7f9d418d9700 1 -- 192.168.123.100:0/4117087610 shutdown_connections 2026-03-10T12:34:47.351 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.350+0000 7f9d418d9700 1 -- 192.168.123.100:0/4117087610 wait complete. 2026-03-10T12:34:47.351 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.351+0000 7f9d418d9700 1 Processor -- start 2026-03-10T12:34:47.351 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.351+0000 7f9d418d9700 1 -- start start 2026-03-10T12:34:47.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.351+0000 7f9d418d9700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9d3c100fd0 0x7f9d3c071cd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:47.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.351+0000 7f9d418d9700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d3c1038f0 0x7f9d3c072210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:47.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.351+0000 7f9d418d9700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d3c072830 con 0x7f9d3c1038f0 2026-03-10T12:34:47.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.351+0000 7f9d418d9700 1 -- --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d3c1a3890 con 0x7f9d3c100fd0 2026-03-10T12:34:47.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.351+0000 7f9d3affd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9d3c100fd0 0x7f9d3c071cd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:47.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.352+0000 7f9d3affd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9d3c100fd0 0x7f9d3c071cd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:49556/0 (socket says 192.168.123.100:49556) 2026-03-10T12:34:47.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.352+0000 7f9d3affd700 1 -- 192.168.123.100:0/3849178340 learned_addr learned my addr 192.168.123.100:0/3849178340 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:47.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.352+0000 7f9d3a7fc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d3c1038f0 0x7f9d3c072210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:47.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.352+0000 7f9d3affd700 1 -- 192.168.123.100:0/3849178340 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d3c1038f0 msgr2=0x7f9d3c072210 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:47.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.352+0000 7f9d3affd700 1 --2- 192.168.123.100:0/3849178340 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d3c1038f0 0x7f9d3c072210 
unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:47.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.352+0000 7f9d3affd700 1 -- 192.168.123.100:0/3849178340 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9d2c0097e0 con 0x7f9d3c100fd0 2026-03-10T12:34:47.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.352+0000 7f9d3a7fc700 1 --2- 192.168.123.100:0/3849178340 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d3c1038f0 0x7f9d3c072210 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:34:47.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.352+0000 7f9d3affd700 1 --2- 192.168.123.100:0/3849178340 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9d3c100fd0 0x7f9d3c071cd0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f9d2400eb10 tx=0x7f9d2400eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:47.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.352+0000 7f9d408d7700 1 -- 192.168.123.100:0/3849178340 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d2400cca0 con 0x7f9d3c100fd0 2026-03-10T12:34:47.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.353+0000 7f9d418d9700 1 -- 192.168.123.100:0/3849178340 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9d3c1a3b70 con 0x7f9d3c100fd0 2026-03-10T12:34:47.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.353+0000 7f9d418d9700 1 -- 192.168.123.100:0/3849178340 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9d3c1a40c0 con 0x7f9d3c100fd0 2026-03-10T12:34:47.353 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.353+0000 7f9d408d7700 1 -- 192.168.123.100:0/3849178340 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9d2400ce00 con 0x7f9d3c100fd0 2026-03-10T12:34:47.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.353+0000 7f9d408d7700 1 -- 192.168.123.100:0/3849178340 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d24018910 con 0x7f9d3c100fd0 2026-03-10T12:34:47.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.354+0000 7f9d408d7700 1 -- 192.168.123.100:0/3849178340 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f9d24018b50 con 0x7f9d3c100fd0 2026-03-10T12:34:47.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.354+0000 7f9d408d7700 1 --2- 192.168.123.100:0/3849178340 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9d2806c680 0x7f9d2806eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:47.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.355+0000 7f9d408d7700 1 -- 192.168.123.100:0/3849178340 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(30..30 src has 1..30) v4 ==== 4129+0+0 (secure 0 0 0) 0x7f9d24014070 con 0x7f9d3c100fd0 2026-03-10T12:34:47.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.355+0000 7f9d3a7fc700 1 --2- 192.168.123.100:0/3849178340 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9d2806c680 0x7f9d2806eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:47.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.355+0000 7f9d418d9700 1 -- 192.168.123.100:0/3849178340 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f9d1c005320 con 0x7f9d3c100fd0 2026-03-10T12:34:47.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.355+0000 7f9d3a7fc700 1 --2- 192.168.123.100:0/3849178340 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9d2806c680 0x7f9d2806eb30 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f9d2c006010 tx=0x7f9d2c00b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:47.358 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.358+0000 7f9d408d7700 1 -- 192.168.123.100:0/3849178340 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9d24057b60 con 0x7f9d3c100fd0 2026-03-10T12:34:47.469 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.468+0000 7f9d418d9700 1 -- 192.168.123.100:0/3849178340 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f9d1c005190 con 0x7f9d3c100fd0 2026-03-10T12:34:47.469 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.469+0000 7f9d408d7700 1 -- 192.168.123.100:0/3849178340 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v31) v1 ==== 74+0+130 (secure 0 0 0) 0x7f9d2405b180 con 0x7f9d3c100fd0 2026-03-10T12:34:47.469 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:34:47.471 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.471+0000 7f9d418d9700 1 -- 192.168.123.100:0/3849178340 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9d2806c680 msgr2=0x7f9d2806eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:47.471 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.471+0000 7f9d418d9700 1 --2- 192.168.123.100:0/3849178340 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] 
conn(0x7f9d2806c680 0x7f9d2806eb30 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f9d2c006010 tx=0x7f9d2c00b540 comp rx=0 tx=0).stop 2026-03-10T12:34:47.471 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.471+0000 7f9d418d9700 1 -- 192.168.123.100:0/3849178340 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9d3c100fd0 msgr2=0x7f9d3c071cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:47.471 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.471+0000 7f9d418d9700 1 --2- 192.168.123.100:0/3849178340 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9d3c100fd0 0x7f9d3c071cd0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f9d2400eb10 tx=0x7f9d2400eed0 comp rx=0 tx=0).stop 2026-03-10T12:34:47.472 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.471+0000 7f9d418d9700 1 -- 192.168.123.100:0/3849178340 shutdown_connections 2026-03-10T12:34:47.472 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.471+0000 7f9d418d9700 1 --2- 192.168.123.100:0/3849178340 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9d2806c680 0x7f9d2806eb30 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:47.472 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.471+0000 7f9d418d9700 1 --2- 192.168.123.100:0/3849178340 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9d3c100fd0 0x7f9d3c071cd0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:47.472 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.471+0000 7f9d418d9700 1 --2- 192.168.123.100:0/3849178340 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d3c1038f0 0x7f9d3c072210 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:47.472 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.471+0000 
7f9d418d9700 1 -- 192.168.123.100:0/3849178340 >> 192.168.123.100:0/3849178340 conn(0x7f9d3c0fa9b0 msgr2=0x7f9d3c0fce20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:47.472 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.472+0000 7f9d418d9700 1 -- 192.168.123.100:0/3849178340 shutdown_connections 2026-03-10T12:34:47.472 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:47.472+0000 7f9d418d9700 1 -- 192.168.123.100:0/3849178340 wait complete. 2026-03-10T12:34:47.481 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:47 vm00 ceph-mon[50686]: Detected new or changed devices on vm07 2026-03-10T12:34:47.481 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:47 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:47.481 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:47 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:47.481 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:47 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:34:47.481 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:47 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:47.481 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:47 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:47.481 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:47 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:47.481 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:47 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' 
entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:34:47.481 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:47 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:47.481 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:47 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:47.481 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:47 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:47.481 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:47 vm00 ceph-mon[50686]: from='osd.5 [v2:192.168.123.107:6816/4252230828,v1:192.168.123.107:6817/4252230828]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T12:34:47.481 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:47 vm00 ceph-mon[50686]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T12:34:47.530 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":31,"num_osds":6,"num_up_osds":5,"osd_up_since":1773146080,"num_in_osds":6,"osd_in_since":1773146078,"num_remapped_pgs":0} 2026-03-10T12:34:47.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:47 vm07 ceph-mon[58582]: Detected new or changed devices on vm07 2026-03-10T12:34:47.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:47 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:47.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:47 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:47.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:47 vm07 ceph-mon[58582]: 
from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:34:47.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:47 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:47.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:47 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:47.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:47 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:47.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:47 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:34:47.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:47 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:34:47.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:47 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:34:47.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:47 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:34:47.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:47 vm07 ceph-mon[58582]: from='osd.5 [v2:192.168.123.107:6816/4252230828,v1:192.168.123.107:6817/4252230828]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T12:34:47.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:47 
vm07 ceph-mon[58582]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T12:34:48.531 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd stat -f json 2026-03-10T12:34:48.693 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:48.732 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:34:48 vm07 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[75277]: 2026-03-10T12:34:48.374+0000 7f525c955700 -1 osd.5 0 waiting for initial osdmap 2026-03-10T12:34:48.732 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:34:48 vm07 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[75277]: 2026-03-10T12:34:48.387+0000 7f5258f4b700 -1 osd.5 32 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T12:34:48.732 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:48 vm07 ceph-mon[58582]: pgmap v60: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T12:34:48.732 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:48 vm07 ceph-mon[58582]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T12:34:48.732 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:48 vm07 ceph-mon[58582]: osdmap e31: 6 total, 5 up, 6 in 2026-03-10T12:34:48.732 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:48 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:48.732 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:48 vm07 ceph-mon[58582]: from='osd.5 
[v2:192.168.123.107:6816/4252230828,v1:192.168.123.107:6817/4252230828]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:34:48.732 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:48 vm07 ceph-mon[58582]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:34:48.732 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:48 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/3849178340' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T12:34:48.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:48 vm00 ceph-mon[50686]: pgmap v60: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T12:34:48.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:48 vm00 ceph-mon[50686]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T12:34:48.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:48 vm00 ceph-mon[50686]: osdmap e31: 6 total, 5 up, 6 in 2026-03-10T12:34:48.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:48 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:48.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:48 vm00 ceph-mon[50686]: from='osd.5 [v2:192.168.123.107:6816/4252230828,v1:192.168.123.107:6817/4252230828]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:34:48.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:48 vm00 ceph-mon[50686]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": 
["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:34:48.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:48 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/3849178340' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T12:34:49.464 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:49 vm00 ceph-mon[50686]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm07", "root=default"]}]': finished 2026-03-10T12:34:49.465 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:49 vm00 ceph-mon[50686]: osdmap e32: 6 total, 5 up, 6 in 2026-03-10T12:34:49.465 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:49 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:49.465 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:49 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:49.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.553+0000 7f91fffce700 1 -- 192.168.123.100:0/3846066892 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f91f81039b0 msgr2=0x7f91f8105d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:49.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.553+0000 7f91fffce700 1 --2- 192.168.123.100:0/3846066892 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f91f81039b0 0x7f91f8105d90 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7f91f4009b00 tx=0x7f91f4009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:49.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.554+0000 7f91fffce700 1 -- 192.168.123.100:0/3846066892 shutdown_connections 2026-03-10T12:34:49.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.554+0000 
7f91fffce700 1 --2- 192.168.123.100:0/3846066892 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f91f81039b0 0x7f91f8105d90 unknown :-1 s=CLOSED pgs=200 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:49.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.554+0000 7f91fffce700 1 --2- 192.168.123.100:0/3846066892 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91f8101090 0x7f91f8103470 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:49.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.554+0000 7f91fffce700 1 -- 192.168.123.100:0/3846066892 >> 192.168.123.100:0/3846066892 conn(0x7f91f80faa70 msgr2=0x7f91f80fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:49.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.554+0000 7f91fffce700 1 -- 192.168.123.100:0/3846066892 shutdown_connections 2026-03-10T12:34:49.555 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.554+0000 7f91fffce700 1 -- 192.168.123.100:0/3846066892 wait complete. 
2026-03-10T12:34:49.555 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.555+0000 7f91fffce700 1 Processor -- start 2026-03-10T12:34:49.555 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.555+0000 7f91fffce700 1 -- start start 2026-03-10T12:34:49.555 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.555+0000 7f91fffce700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91f8101090 0x7f91f8197fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:49.556 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.555+0000 7f91fffce700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f91f81039b0 0x7f91f8198520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:49.556 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.555+0000 7f91fffce700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f91f8198b40 con 0x7f91f81039b0 2026-03-10T12:34:49.556 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.555+0000 7f91fffce700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f91f8198c80 con 0x7f91f8101090 2026-03-10T12:34:49.556 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.556+0000 7f91fd569700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f91f81039b0 0x7f91f8198520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:49.556 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.556+0000 7f91fd569700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f91f81039b0 0x7f91f8198520 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:38854/0 (socket says 192.168.123.100:38854) 2026-03-10T12:34:49.556 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.556+0000 7f91fd569700 1 -- 192.168.123.100:0/1721533531 learned_addr learned my addr 192.168.123.100:0/1721533531 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:49.556 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.556+0000 7f91fd569700 1 -- 192.168.123.100:0/1721533531 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91f8101090 msgr2=0x7f91f8197fe0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:34:49.556 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.556+0000 7f91fdd6a700 1 --2- 192.168.123.100:0/1721533531 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91f8101090 0x7f91f8197fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:49.556 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.556+0000 7f91fd569700 1 --2- 192.168.123.100:0/1721533531 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91f8101090 0x7f91f8197fe0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:49.556 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.556+0000 7f91fd569700 1 -- 192.168.123.100:0/1721533531 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f91f40097e0 con 0x7f91f81039b0 2026-03-10T12:34:49.557 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.556+0000 7f91fd569700 1 --2- 192.168.123.100:0/1721533531 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f91f81039b0 0x7f91f8198520 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7f91f400bb70 tx=0x7f91f4004690 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:34:49.558 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.556+0000 7f91eeffd700 1 -- 192.168.123.100:0/1721533531 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f91f401d070 con 0x7f91f81039b0 2026-03-10T12:34:49.558 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.556+0000 7f91eeffd700 1 -- 192.168.123.100:0/1721533531 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f91f4004d60 con 0x7f91f81039b0 2026-03-10T12:34:49.558 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.556+0000 7f91eeffd700 1 -- 192.168.123.100:0/1721533531 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f91f400f740 con 0x7f91f81039b0 2026-03-10T12:34:49.558 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.557+0000 7f91fffce700 1 -- 192.168.123.100:0/1721533531 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f91f819d6d0 con 0x7f91f81039b0 2026-03-10T12:34:49.558 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.557+0000 7f91fffce700 1 -- 192.168.123.100:0/1721533531 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f91f819dbc0 con 0x7f91f81039b0 2026-03-10T12:34:49.558 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.557+0000 7f91fdd6a700 1 --2- 192.168.123.100:0/1721533531 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91f8101090 0x7f91f8197fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T12:34:49.562 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.558+0000 7f91fffce700 1 -- 192.168.123.100:0/1721533531 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f91f80fc670 con 0x7f91f81039b0 2026-03-10T12:34:49.562 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.559+0000 7f91eeffd700 1 -- 192.168.123.100:0/1721533531 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f91f4004ed0 con 0x7f91f81039b0 2026-03-10T12:34:49.562 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.559+0000 7f91eeffd700 1 --2- 192.168.123.100:0/1721533531 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f91e406c750 0x7f91e406ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:49.562 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.559+0000 7f91eeffd700 1 -- 192.168.123.100:0/1721533531 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f91f408cc90 con 0x7f91f81039b0 2026-03-10T12:34:49.562 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.560+0000 7f91fdd6a700 1 --2- 192.168.123.100:0/1721533531 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f91e406c750 0x7f91e406ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:49.562 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.560+0000 7f91fdd6a700 1 --2- 192.168.123.100:0/1721533531 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f91e406c750 0x7f91e406ec00 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f91f8068f50 tx=0x7f91e8009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:49.562 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.562+0000 7f91eeffd700 1 -- 192.168.123.100:0/1721533531 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f91f405b260 con 0x7f91f81039b0 2026-03-10T12:34:49.673 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.673+0000 7f91fffce700 1 -- 192.168.123.100:0/1721533531 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f91f8066e40 con 0x7f91f81039b0 2026-03-10T12:34:49.673 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.673+0000 7f91eeffd700 1 -- 192.168.123.100:0/1721533531 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v33) v1 ==== 74+0+130 (secure 0 0 0) 0x7f91f40270a0 con 0x7f91f81039b0 2026-03-10T12:34:49.673 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:34:49.675 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.675+0000 7f91fffce700 1 -- 192.168.123.100:0/1721533531 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f91e406c750 msgr2=0x7f91e406ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:49.675 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.675+0000 7f91fffce700 1 --2- 192.168.123.100:0/1721533531 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f91e406c750 0x7f91e406ec00 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f91f8068f50 tx=0x7f91e8009450 comp rx=0 tx=0).stop 2026-03-10T12:34:49.675 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.675+0000 7f91fffce700 1 -- 192.168.123.100:0/1721533531 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f91f81039b0 msgr2=0x7f91f8198520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:49.676 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.675+0000 7f91fffce700 1 --2- 192.168.123.100:0/1721533531 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f91f81039b0 0x7f91f8198520 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7f91f400bb70 tx=0x7f91f4004690 comp rx=0 tx=0).stop 2026-03-10T12:34:49.676 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.676+0000 7f91fffce700 1 -- 192.168.123.100:0/1721533531 shutdown_connections 2026-03-10T12:34:49.676 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.676+0000 7f91fffce700 1 --2- 192.168.123.100:0/1721533531 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f91e406c750 0x7f91e406ec00 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:49.676 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.676+0000 7f91fffce700 1 --2- 192.168.123.100:0/1721533531 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f91f8101090 0x7f91f8197fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:49.676 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.676+0000 7f91fffce700 1 --2- 192.168.123.100:0/1721533531 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f91f81039b0 0x7f91f8198520 unknown :-1 s=CLOSED pgs=201 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:49.676 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.676+0000 7f91fffce700 1 -- 192.168.123.100:0/1721533531 >> 192.168.123.100:0/1721533531 conn(0x7f91f80faa70 msgr2=0x7f91f80ff710 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:49.676 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.676+0000 7f91fffce700 1 -- 192.168.123.100:0/1721533531 shutdown_connections 2026-03-10T12:34:49.676 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:49.676+0000 7f91fffce700 1 -- 192.168.123.100:0/1721533531 
wait complete. 2026-03-10T12:34:49.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:49 vm07 ceph-mon[58582]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm07", "root=default"]}]': finished 2026-03-10T12:34:49.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:49 vm07 ceph-mon[58582]: osdmap e32: 6 total, 5 up, 6 in 2026-03-10T12:34:49.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:49 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:49.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:49 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:49.840 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":33,"num_osds":6,"num_up_osds":6,"osd_up_since":1773146089,"num_in_osds":6,"osd_in_since":1773146078,"num_remapped_pgs":0} 2026-03-10T12:34:49.840 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd dump --format=json 2026-03-10T12:34:50.021 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:50.312 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.310+0000 7f1e95341700 1 -- 192.168.123.100:0/389267612 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1e9010ed80 msgr2=0x7f1e9006d260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:50.312 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.310+0000 7f1e95341700 1 --2- 192.168.123.100:0/389267612 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1e9010ed80 0x7f1e9006d260 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7f1e80009b00 
tx=0x7f1e80009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:50.312 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.311+0000 7f1e95341700 1 -- 192.168.123.100:0/389267612 shutdown_connections 2026-03-10T12:34:50.312 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.311+0000 7f1e95341700 1 --2- 192.168.123.100:0/389267612 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1e9006d7a0 0x7f1e9006dc10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:50.312 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.311+0000 7f1e95341700 1 --2- 192.168.123.100:0/389267612 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1e9010ed80 0x7f1e9006d260 unknown :-1 s=CLOSED pgs=202 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:50.312 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.311+0000 7f1e95341700 1 -- 192.168.123.100:0/389267612 >> 192.168.123.100:0/389267612 conn(0x7f1e9006c830 msgr2=0x7f1e90071830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:50.312 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.311+0000 7f1e95341700 1 -- 192.168.123.100:0/389267612 shutdown_connections 2026-03-10T12:34:50.313 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.311+0000 7f1e95341700 1 -- 192.168.123.100:0/389267612 wait complete. 
2026-03-10T12:34:50.313 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.312+0000 7f1e95341700 1 Processor -- start 2026-03-10T12:34:50.313 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.312+0000 7f1e95341700 1 -- start start 2026-03-10T12:34:50.313 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.312+0000 7f1e95341700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1e9006d7a0 0x7f1e901a4e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:50.313 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.312+0000 7f1e95341700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1e9010ed80 0x7f1e901a53c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:50.313 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.312+0000 7f1e95341700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1e901a5aa0 con 0x7f1e9010ed80 2026-03-10T12:34:50.313 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.312+0000 7f1e95341700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1e901a9830 con 0x7f1e9006d7a0 2026-03-10T12:34:50.313 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.313+0000 7f1e865ff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1e9010ed80 0x7f1e901a53c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:50.313 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.313+0000 7f1e8effd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1e9006d7a0 0x7f1e901a4e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T12:34:50.313 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.313+0000 7f1e865ff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1e9010ed80 0x7f1e901a53c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:38868/0 (socket says 192.168.123.100:38868) 2026-03-10T12:34:50.313 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.313+0000 7f1e865ff700 1 -- 192.168.123.100:0/1256070250 learned_addr learned my addr 192.168.123.100:0/1256070250 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:50.314 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.313+0000 7f1e865ff700 1 -- 192.168.123.100:0/1256070250 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1e9006d7a0 msgr2=0x7f1e901a4e80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:50.314 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.313+0000 7f1e865ff700 1 --2- 192.168.123.100:0/1256070250 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1e9006d7a0 0x7f1e901a4e80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:50.314 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.313+0000 7f1e865ff700 1 -- 192.168.123.100:0/1256070250 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1e800097e0 con 0x7f1e9010ed80 2026-03-10T12:34:50.314 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.314+0000 7f1e865ff700 1 --2- 192.168.123.100:0/1256070250 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1e9010ed80 0x7f1e901a53c0 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7f1e7800d8d0 tx=0x7f1e7800dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:50.314 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.314+0000 7f1e8cff9700 1 -- 192.168.123.100:0/1256070250 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1e78009880 con 0x7f1e9010ed80 2026-03-10T12:34:50.314 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.314+0000 7f1e8cff9700 1 -- 192.168.123.100:0/1256070250 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1e78010460 con 0x7f1e9010ed80 2026-03-10T12:34:50.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.314+0000 7f1e8cff9700 1 -- 192.168.123.100:0/1256070250 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1e7800f5d0 con 0x7f1e9010ed80 2026-03-10T12:34:50.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.314+0000 7f1e95341700 1 -- 192.168.123.100:0/1256070250 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1e901a9a30 con 0x7f1e9010ed80 2026-03-10T12:34:50.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.314+0000 7f1e95341700 1 -- 192.168.123.100:0/1256070250 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1e901a9f80 con 0x7f1e9010ed80 2026-03-10T12:34:50.316 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.316+0000 7f1e8cff9700 1 -- 192.168.123.100:0/1256070250 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f1e7800f730 con 0x7f1e9010ed80 2026-03-10T12:34:50.316 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.316+0000 7f1e95341700 1 -- 192.168.123.100:0/1256070250 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1e90109e10 con 0x7f1e9010ed80 2026-03-10T12:34:50.320 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.316+0000 7f1e8cff9700 1 --2- 
192.168.123.100:0/1256070250 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1e7c06c6d0 0x7f1e7c06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:50.320 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.316+0000 7f1e8cff9700 1 -- 192.168.123.100:0/1256070250 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f1e7808b130 con 0x7f1e9010ed80 2026-03-10T12:34:50.320 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.317+0000 7f1e8effd700 1 --2- 192.168.123.100:0/1256070250 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1e7c06c6d0 0x7f1e7c06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:50.320 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.317+0000 7f1e8effd700 1 --2- 192.168.123.100:0/1256070250 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1e7c06c6d0 0x7f1e7c06eb80 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f1e8000b5c0 tx=0x7f1e80005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:50.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.320+0000 7f1e8cff9700 1 -- 192.168.123.100:0/1256070250 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f1e78056db0 con 0x7f1e9010ed80 2026-03-10T12:34:50.449 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.448+0000 7f1e95341700 1 -- 192.168.123.100:0/1256070250 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f1e9004ea50 con 0x7f1e9010ed80 2026-03-10T12:34:50.449 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.449+0000 7f1e8cff9700 1 -- 
192.168.123.100:0/1256070250 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v34) v1 ==== 74+0+11271 (secure 0 0 0) 0x7f1e7805a3d0 con 0x7f1e9010ed80 2026-03-10T12:34:50.449 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:34:50.450 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":34,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","created":"2026-03-10T12:32:17.778471+0000","modified":"2026-03-10T12:34:50.381363+0000","last_up_change":"2026-03-10T12:34:49.373861+0000","last_in_change":"2026-03-10T12:34:38.384036+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-10T12:34:21.281104+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"21","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none
","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"0f6cb3f2-3337-4851-ba13-f08c9574062c","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6802","nonce":3160210101},{"type":"v1","addr":"192.168.123.100:6803","nonce":3160210101}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6804","nonce":3160210101},{"type":"v1","addr":"192.168.123.100:6805","nonce":3160210101}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6808","nonce":3160210101},{"type":"v1","addr":"192.168.123.100:6809","nonce":3160210101}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6806","nonce":3160210101},{"type":"v1","addr":"192.168.123.100:6807","nonce":3160210101}]},"public_addr":"192.168.123.100:6803/3160210101","cluster_addr":"192.168.123.100:6805/3160210101","heartbeat_back_addr":"192.168.123.100:6809/3160210101","heartbeat_front_addr":"192.168.123.100:6807/3160210101","state":["exists","up"]},{"osd":1,"uuid":"bbe7c7dc-9287-4bf7-b4aa-5e00c56a2a7b","up":1,"in":1,"weight":1,"primary_affinity":1,"last
_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":25,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6810","nonce":4135684750},{"type":"v1","addr":"192.168.123.100:6811","nonce":4135684750}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6812","nonce":4135684750},{"type":"v1","addr":"192.168.123.100:6813","nonce":4135684750}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6816","nonce":4135684750},{"type":"v1","addr":"192.168.123.100:6817","nonce":4135684750}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6814","nonce":4135684750},{"type":"v1","addr":"192.168.123.100:6815","nonce":4135684750}]},"public_addr":"192.168.123.100:6811/4135684750","cluster_addr":"192.168.123.100:6813/4135684750","heartbeat_back_addr":"192.168.123.100:6817/4135684750","heartbeat_front_addr":"192.168.123.100:6815/4135684750","state":["exists","up"]},{"osd":2,"uuid":"3de6b811-dbac-419f-abf8-afd0bec7a47f","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6818","nonce":3405786659},{"type":"v1","addr":"192.168.123.100:6819","nonce":3405786659}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6820","nonce":3405786659},{"type":"v1","addr":"192.168.123.100:6821","nonce":3405786659}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6824","nonce":3405786659},{"type":"v1","addr":"192.168.123.100:6825","nonce":3405786659}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6822","nonce":3405786659},{"type":"v1","addr":"192.168.123.100:6823","nonce":3405786659}]},"public_addr":"192.168.123.100:6819/3405786659","cluster_addr":"192.168.123.100:6821/3405786659","heartbeat_back_addr":"192.168.123.100:6825/3405786659","heartbeat_front_addr":"192.168.123.100:6823/3405786659","sta
te":["exists","up"]},{"osd":3,"uuid":"cd850d3a-e99e-4292-9600-f18ed81a7d18","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":24,"up_thru":28,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6800","nonce":698977351},{"type":"v1","addr":"192.168.123.107:6801","nonce":698977351}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6802","nonce":698977351},{"type":"v1","addr":"192.168.123.107:6803","nonce":698977351}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6806","nonce":698977351},{"type":"v1","addr":"192.168.123.107:6807","nonce":698977351}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6804","nonce":698977351},{"type":"v1","addr":"192.168.123.107:6805","nonce":698977351}]},"public_addr":"192.168.123.107:6801/698977351","cluster_addr":"192.168.123.107:6803/698977351","heartbeat_back_addr":"192.168.123.107:6807/698977351","heartbeat_front_addr":"192.168.123.107:6805/698977351","state":["exists","up"]},{"osd":4,"uuid":"ff03ddab-6945-46b4-b19b-30775ca85618","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":29,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6808","nonce":4106671248},{"type":"v1","addr":"192.168.123.107:6809","nonce":4106671248}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6810","nonce":4106671248},{"type":"v1","addr":"192.168.123.107:6811","nonce":4106671248}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6814","nonce":4106671248},{"type":"v1","addr":"192.168.123.107:6815","nonce":4106671248}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6812","nonce":4106671248},{"type":"v1","addr":"192.168.123.107:6813","nonce":4106671248}]},"public_addr":"192.168.123.107:6809/4106671248","cluster_addr":"192.168.123.107:6811/4106671248","h
eartbeat_back_addr":"192.168.123.107:6815/4106671248","heartbeat_front_addr":"192.168.123.107:6813/4106671248","state":["exists","up"]},{"osd":5,"uuid":"d91585d7-d879-48f1-8fdf-f6c88a82428a","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":33,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6816","nonce":4252230828},{"type":"v1","addr":"192.168.123.107:6817","nonce":4252230828}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6818","nonce":4252230828},{"type":"v1","addr":"192.168.123.107:6819","nonce":4252230828}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6822","nonce":4252230828},{"type":"v1","addr":"192.168.123.107:6823","nonce":4252230828}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6820","nonce":4252230828},{"type":"v1","addr":"192.168.123.107:6821","nonce":4252230828}]},"public_addr":"192.168.123.107:6817/4252230828","cluster_addr":"192.168.123.107:6819/4252230828","heartbeat_back_addr":"192.168.123.107:6823/4252230828","heartbeat_front_addr":"192.168.123.107:6821/4252230828","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:33:58.292397+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:34:07.673082+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:34:18.026191+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:34:27.370020+0000","dead_epoch":0},{"o
sd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:34:38.625936+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:34:47.790663+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.100:0/2792084710":"2026-03-11T12:33:23.075938+0000","192.168.123.100:0/69960775":"2026-03-11T12:33:23.075938+0000","192.168.123.100:0/1015166415":"2026-03-11T12:32:46.753519+0000","192.168.123.100:0/2753083811":"2026-03-11T12:32:46.753519+0000","192.168.123.100:6800/2":"2026-03-11T12:32:32.101116+0000","192.168.123.100:6801/2":"2026-03-11T12:32:32.101116+0000","192.168.123.100:0/4113305903":"2026-03-11T12:32:32.101116+0000","192.168.123.100:0/385023950":"2026-03-11T12:33:23.075938+0000","192.168.123.100:0/1487901880":"2026-03-11T12:32:32.101116+0000","192.168.123.100:0/3472231466":"2026-03-11T12:32:46.753519+0000","192.168.123.100:0/1442998252":"2026-03-11T12:32:32.101116+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-10T12:34:50.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.451+0000 7f1e95341700 1 -- 192.168.123.100:0/1256070250 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1e7c06c6d0 msgr2=0x7f1e7c06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:50.451 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.451+0000 7f1e95341700 1 --2- 192.168.123.100:0/1256070250 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1e7c06c6d0 0x7f1e7c06eb80 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f1e8000b5c0 tx=0x7f1e80005fb0 comp rx=0 tx=0).stop 2026-03-10T12:34:50.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.451+0000 7f1e95341700 1 -- 192.168.123.100:0/1256070250 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1e9010ed80 msgr2=0x7f1e901a53c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:50.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.451+0000 7f1e95341700 1 --2- 192.168.123.100:0/1256070250 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1e9010ed80 0x7f1e901a53c0 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7f1e7800d8d0 tx=0x7f1e7800dbe0 comp rx=0 tx=0).stop 2026-03-10T12:34:50.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.451+0000 7f1e95341700 1 -- 192.168.123.100:0/1256070250 shutdown_connections 2026-03-10T12:34:50.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.451+0000 7f1e95341700 1 --2- 192.168.123.100:0/1256070250 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f1e7c06c6d0 0x7f1e7c06eb80 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:50.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.451+0000 7f1e95341700 1 --2- 192.168.123.100:0/1256070250 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1e9006d7a0 0x7f1e901a4e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:50.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.451+0000 7f1e95341700 1 --2- 192.168.123.100:0/1256070250 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1e9010ed80 0x7f1e901a53c0 unknown 
:-1 s=CLOSED pgs=203 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:50.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.451+0000 7f1e95341700 1 -- 192.168.123.100:0/1256070250 >> 192.168.123.100:0/1256070250 conn(0x7f1e9006c830 msgr2=0x7f1e901189b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:50.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.452+0000 7f1e95341700 1 -- 192.168.123.100:0/1256070250 shutdown_connections 2026-03-10T12:34:50.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:50.452+0000 7f1e95341700 1 -- 192.168.123.100:0/1256070250 wait complete. 2026-03-10T12:34:50.534 INFO:tasks.cephadm.ceph_manager.ceph:[{'pool': 1, 'pool_name': '.mgr', 'create_time': '2026-03-10T12:34:21.281104+0000', 'flags': 1, 'flags_names': 'hashpspool', 'type': 1, 'size': 3, 'min_size': 2, 'crush_rule': 0, 'peering_crush_bucket_count': 0, 'peering_crush_bucket_target': 0, 'peering_crush_bucket_barrier': 0, 'peering_crush_bucket_mandatory_member': 2147483647, 'object_hash': 2, 'pg_autoscale_mode': 'off', 'pg_num': 1, 'pg_placement_num': 1, 'pg_placement_num_target': 1, 'pg_num_target': 1, 'pg_num_pending': 1, 'last_pg_merge_meta': {'source_pgid': '0.0', 'ready_epoch': 0, 'last_epoch_started': 0, 'last_epoch_clean': 0, 'source_version': "0'0", 'target_version': "0'0"}, 'last_change': '21', 'last_force_op_resend': '0', 'last_force_op_resend_prenautilus': '0', 'last_force_op_resend_preluminous': '0', 'auid': 0, 'snap_mode': 'selfmanaged', 'snap_seq': 0, 'snap_epoch': 0, 'pool_snaps': [], 'removed_snaps': '[]', 'quota_max_bytes': 0, 'quota_max_objects': 0, 'tiers': [], 'tier_of': -1, 'read_tier': -1, 'write_tier': -1, 'cache_mode': 'none', 'target_max_bytes': 0, 'target_max_objects': 0, 'cache_target_dirty_ratio_micro': 400000, 'cache_target_dirty_high_ratio_micro': 600000, 'cache_target_full_ratio_micro': 800000, 'cache_min_flush_age': 0, 'cache_min_evict_age': 0, 'erasure_code_profile': '', 
'hit_set_params': {'type': 'none'}, 'hit_set_period': 0, 'hit_set_count': 0, 'use_gmt_hitset': True, 'min_read_recency_for_promote': 0, 'min_write_recency_for_promote': 0, 'hit_set_grade_decay_rate': 0, 'hit_set_search_last_n': 0, 'grade_table': [], 'stripe_width': 0, 'expected_num_objects': 0, 'fast_read': False, 'options': {'pg_num_max': 32, 'pg_num_min': 1}, 'application_metadata': {'mgr': {}}, 'read_balance': {'score_acting': 6, 'score_stable': 6, 'optimal_score': 0.5, 'raw_score_acting': 3, 'raw_score_stable': 3, 'primary_affinity_weighted': 1, 'average_primary_affinity': 1, 'average_primary_affinity_weighted': 1}}] 2026-03-10T12:34:50.534 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd pool get .mgr pg_num 2026-03-10T12:34:50.690 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:50.720 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:50 vm00 ceph-mon[50686]: purged_snaps scrub starts 2026-03-10T12:34:50.720 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:50 vm00 ceph-mon[50686]: purged_snaps scrub ok 2026-03-10T12:34:50.720 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:50 vm00 ceph-mon[50686]: pgmap v63: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T12:34:50.720 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:50 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:50.720 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:50 vm00 ceph-mon[50686]: osd.5 [v2:192.168.123.107:6816/4252230828,v1:192.168.123.107:6817/4252230828] boot 2026-03-10T12:34:50.720 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:50 vm00 ceph-mon[50686]: osdmap e33: 6 total, 6 up, 6 in 
2026-03-10T12:34:50.720 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:50 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:50.720 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:50 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/1721533531' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T12:34:50.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:50 vm07 ceph-mon[58582]: purged_snaps scrub starts 2026-03-10T12:34:50.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:50 vm07 ceph-mon[58582]: purged_snaps scrub ok 2026-03-10T12:34:50.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:50 vm07 ceph-mon[58582]: pgmap v63: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T12:34:50.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:50 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:50.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:50 vm07 ceph-mon[58582]: osd.5 [v2:192.168.123.107:6816/4252230828,v1:192.168.123.107:6817/4252230828] boot 2026-03-10T12:34:50.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:50 vm07 ceph-mon[58582]: osdmap e33: 6 total, 6 up, 6 in 2026-03-10T12:34:50.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:50 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:34:50.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:50 vm07 ceph-mon[58582]: from='client.? 
192.168.123.100:0/1721533531' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T12:34:51.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.254+0000 7f25abe5e700 1 -- 192.168.123.100:0/593169749 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f25a410cbb0 msgr2=0x7f25a410cf80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:51.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.254+0000 7f25abe5e700 1 --2- 192.168.123.100:0/593169749 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f25a410cbb0 0x7f25a410cf80 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7f2594009b00 tx=0x7f2594009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:51.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.255+0000 7f25abe5e700 1 -- 192.168.123.100:0/593169749 shutdown_connections 2026-03-10T12:34:51.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.255+0000 7f25abe5e700 1 --2- 192.168.123.100:0/593169749 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a4106ba0 0x7f25a4107010 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:51.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.255+0000 7f25abe5e700 1 --2- 192.168.123.100:0/593169749 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f25a410cbb0 0x7f25a410cf80 unknown :-1 s=CLOSED pgs=204 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:51.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.255+0000 7f25abe5e700 1 -- 192.168.123.100:0/593169749 >> 192.168.123.100:0/593169749 conn(0x7f25a4074b10 msgr2=0x7f25a4076f20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:51.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.256+0000 7f25abe5e700 1 -- 192.168.123.100:0/593169749 shutdown_connections 2026-03-10T12:34:51.256 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.256+0000 7f25abe5e700 1 -- 192.168.123.100:0/593169749 wait complete. 2026-03-10T12:34:51.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.256+0000 7f25abe5e700 1 Processor -- start 2026-03-10T12:34:51.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.256+0000 7f25abe5e700 1 -- start start 2026-03-10T12:34:51.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.257+0000 7f25abe5e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f25a4106ba0 0x7f25a41a26a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:51.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.257+0000 7f25abe5e700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a410cbb0 0x7f25a41a2be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:51.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.257+0000 7f25abe5e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f25a419c690 con 0x7f25a4106ba0 2026-03-10T12:34:51.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.257+0000 7f25abe5e700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f25a419c800 con 0x7f25a410cbb0 2026-03-10T12:34:51.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.257+0000 7f25a93f9700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a410cbb0 0x7f25a41a2be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:51.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.257+0000 7f25a93f9700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a410cbb0 0x7f25a41a2be0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:38346/0 (socket says 192.168.123.100:38346) 2026-03-10T12:34:51.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.257+0000 7f25a93f9700 1 -- 192.168.123.100:0/2217187760 learned_addr learned my addr 192.168.123.100:0/2217187760 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:51.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.257+0000 7f25a9bfa700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f25a4106ba0 0x7f25a41a26a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:51.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.257+0000 7f25a93f9700 1 -- 192.168.123.100:0/2217187760 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f25a4106ba0 msgr2=0x7f25a41a26a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:51.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.257+0000 7f25a93f9700 1 --2- 192.168.123.100:0/2217187760 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f25a4106ba0 0x7f25a41a26a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:51.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.257+0000 7f25a93f9700 1 -- 192.168.123.100:0/2217187760 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f25940097e0 con 0x7f25a410cbb0 2026-03-10T12:34:51.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.257+0000 7f25a9bfa700 1 --2- 192.168.123.100:0/2217187760 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f25a4106ba0 0x7f25a41a26a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_auth_done state changed! 2026-03-10T12:34:51.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.257+0000 7f25a93f9700 1 --2- 192.168.123.100:0/2217187760 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a410cbb0 0x7f25a41a2be0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f25a000d900 tx=0x7f25a000dc10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:51.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.258+0000 7f259affd700 1 -- 192.168.123.100:0/2217187760 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f25a00041d0 con 0x7f25a410cbb0 2026-03-10T12:34:51.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.258+0000 7f25abe5e700 1 -- 192.168.123.100:0/2217187760 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f25a419cae0 con 0x7f25a410cbb0 2026-03-10T12:34:51.259 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.258+0000 7f25abe5e700 1 -- 192.168.123.100:0/2217187760 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f25a419d030 con 0x7f25a410cbb0 2026-03-10T12:34:51.259 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.258+0000 7f259affd700 1 -- 192.168.123.100:0/2217187760 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f25a0004d10 con 0x7f25a410cbb0 2026-03-10T12:34:51.259 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.258+0000 7f259affd700 1 -- 192.168.123.100:0/2217187760 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f25a000b750 con 0x7f25a410cbb0 2026-03-10T12:34:51.260 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.260+0000 7f259affd700 1 -- 192.168.123.100:0/2217187760 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 
89910+0+0 (secure 0 0 0) 0x7f25a0005020 con 0x7f25a410cbb0 2026-03-10T12:34:51.260 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.260+0000 7f259affd700 1 --2- 192.168.123.100:0/2217187760 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f259006c630 0x7f259006eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:51.260 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.260+0000 7f25a9bfa700 1 --2- 192.168.123.100:0/2217187760 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f259006c630 0x7f259006eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:51.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.261+0000 7f25abe5e700 1 -- 192.168.123.100:0/2217187760 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2588005320 con 0x7f25a410cbb0 2026-03-10T12:34:51.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.261+0000 7f259affd700 1 -- 192.168.123.100:0/2217187760 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f25a008b120 con 0x7f25a410cbb0 2026-03-10T12:34:51.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.261+0000 7f25a9bfa700 1 --2- 192.168.123.100:0/2217187760 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f259006c630 0x7f259006eae0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f2594006010 tx=0x7f259400b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:51.265 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.264+0000 7f259affd700 1 -- 192.168.123.100:0/2217187760 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+177933 (secure 0 0 0) 0x7f25a0016080 con 0x7f25a410cbb0 2026-03-10T12:34:51.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.374+0000 7f25abe5e700 1 -- 192.168.123.100:0/2217187760 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"} v 0) v1 -- 0x7f2588005f70 con 0x7f25a410cbb0 2026-03-10T12:34:51.375 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.374+0000 7f259affd700 1 -- 192.168.123.100:0/2217187760 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]=0 v34) v1 ==== 93+0+10 (secure 0 0 0) 0x7f25a0055be0 con 0x7f25a410cbb0 2026-03-10T12:34:51.375 INFO:teuthology.orchestra.run.vm00.stdout:pg_num: 1 2026-03-10T12:34:51.377 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.377+0000 7f25abe5e700 1 -- 192.168.123.100:0/2217187760 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f259006c630 msgr2=0x7f259006eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:51.377 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.377+0000 7f25abe5e700 1 --2- 192.168.123.100:0/2217187760 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f259006c630 0x7f259006eae0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f2594006010 tx=0x7f259400b540 comp rx=0 tx=0).stop 2026-03-10T12:34:51.377 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.377+0000 7f25abe5e700 1 -- 192.168.123.100:0/2217187760 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a410cbb0 msgr2=0x7f25a41a2be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:51.377 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.377+0000 7f25abe5e700 1 --2- 192.168.123.100:0/2217187760 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a410cbb0 0x7f25a41a2be0 secure :-1 s=READY pgs=33 cs=0 l=1 
rev1=1 crypto rx=0x7f25a000d900 tx=0x7f25a000dc10 comp rx=0 tx=0).stop 2026-03-10T12:34:51.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.377+0000 7f25abe5e700 1 -- 192.168.123.100:0/2217187760 shutdown_connections 2026-03-10T12:34:51.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.377+0000 7f25abe5e700 1 --2- 192.168.123.100:0/2217187760 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f259006c630 0x7f259006eae0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:51.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.377+0000 7f25abe5e700 1 --2- 192.168.123.100:0/2217187760 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f25a4106ba0 0x7f25a41a26a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:51.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.377+0000 7f25abe5e700 1 --2- 192.168.123.100:0/2217187760 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f25a410cbb0 0x7f25a41a2be0 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:51.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.377+0000 7f25abe5e700 1 -- 192.168.123.100:0/2217187760 >> 192.168.123.100:0/2217187760 conn(0x7f25a4074b10 msgr2=0x7f25a4076490 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:51.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.377+0000 7f25abe5e700 1 -- 192.168.123.100:0/2217187760 shutdown_connections 2026-03-10T12:34:51.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:51.378+0000 7f25abe5e700 1 -- 192.168.123.100:0/2217187760 wait complete. 2026-03-10T12:34:52.391 INFO:tasks.cephadm:Setting up client nodes... 
2026-03-10T12:34:52.392 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph auth get-or-create client.0 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-10T12:34:52.556 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:52.582 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:52 vm00 ceph-mon[50686]: osdmap e34: 6 total, 6 up, 6 in 2026-03-10T12:34:52.583 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:52 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/1256070250' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T12:34:52.583 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:52 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/2217187760' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-10T12:34:52.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:52 vm07 ceph-mon[58582]: osdmap e34: 6 total, 6 up, 6 in 2026-03-10T12:34:52.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:52 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/1256070250' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T12:34:52.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:52 vm07 ceph-mon[58582]: from='client.? 
192.168.123.100:0/2217187760' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-10T12:34:52.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.829+0000 7fb97dc91700 1 -- 192.168.123.100:0/1554182140 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb978074d80 msgr2=0x7fb9780731e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:52.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.829+0000 7fb97dc91700 1 --2- 192.168.123.100:0/1554182140 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb978074d80 0x7fb9780731e0 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7fb968009b30 tx=0x7fb968009e40 comp rx=0 tx=0).stop 2026-03-10T12:34:52.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.831+0000 7fb97dc91700 1 -- 192.168.123.100:0/1554182140 shutdown_connections 2026-03-10T12:34:52.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.831+0000 7fb97dc91700 1 --2- 192.168.123.100:0/1554182140 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9780737b0 0x7fb978073c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:52.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.831+0000 7fb97dc91700 1 --2- 192.168.123.100:0/1554182140 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb978074d80 0x7fb9780731e0 unknown :-1 s=CLOSED pgs=205 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:52.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.831+0000 7fb97dc91700 1 -- 192.168.123.100:0/1554182140 >> 192.168.123.100:0/1554182140 conn(0x7fb9780fbaa0 msgr2=0x7fb9780fdf10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:52.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.831+0000 7fb97dc91700 1 -- 192.168.123.100:0/1554182140 shutdown_connections 
2026-03-10T12:34:52.832 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.831+0000 7fb97dc91700 1 -- 192.168.123.100:0/1554182140 wait complete. 2026-03-10T12:34:52.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.831+0000 7fb97dc91700 1 Processor -- start 2026-03-10T12:34:52.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.832+0000 7fb97dc91700 1 -- start start 2026-03-10T12:34:52.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.833+0000 7fb97dc91700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9780737b0 0x7fb978071d10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:52.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.833+0000 7fb97dc91700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb978074d80 0x7fb978072250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:52.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.833+0000 7fb97dc91700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb978072870 con 0x7fb978074d80 2026-03-10T12:34:52.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.833+0000 7fb97dc91700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb9781a5e00 con 0x7fb9780737b0 2026-03-10T12:34:52.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.833+0000 7fb96edff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb978074d80 0x7fb978072250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:52.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.833+0000 7fb96edff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb978074d80 
0x7fb978072250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:38900/0 (socket says 192.168.123.100:38900) 2026-03-10T12:34:52.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.833+0000 7fb96edff700 1 -- 192.168.123.100:0/368561156 learned_addr learned my addr 192.168.123.100:0/368561156 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:52.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.833+0000 7fb9777fe700 1 --2- 192.168.123.100:0/368561156 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9780737b0 0x7fb978071d10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:52.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.833+0000 7fb96edff700 1 -- 192.168.123.100:0/368561156 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9780737b0 msgr2=0x7fb978071d10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:52.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.833+0000 7fb96edff700 1 --2- 192.168.123.100:0/368561156 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9780737b0 0x7fb978071d10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:52.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.833+0000 7fb96edff700 1 -- 192.168.123.100:0/368561156 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb9680097e0 con 0x7fb978074d80 2026-03-10T12:34:52.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.834+0000 7fb96edff700 1 --2- 192.168.123.100:0/368561156 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb978074d80 0x7fb978072250 secure :-1 s=READY pgs=206 cs=0 
l=1 rev1=1 crypto rx=0x7fb96000cc60 tx=0x7fb9600074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:52.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.835+0000 7fb9757fa700 1 -- 192.168.123.100:0/368561156 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb960007af0 con 0x7fb978074d80 2026-03-10T12:34:52.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.835+0000 7fb97dc91700 1 -- 192.168.123.100:0/368561156 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb9781a5fa0 con 0x7fb978074d80 2026-03-10T12:34:52.837 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.835+0000 7fb97dc91700 1 -- 192.168.123.100:0/368561156 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb9781a6470 con 0x7fb978074d80 2026-03-10T12:34:52.837 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.837+0000 7fb9757fa700 1 -- 192.168.123.100:0/368561156 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb960004d10 con 0x7fb978074d80 2026-03-10T12:34:52.840 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.837+0000 7fb97dc91700 1 -- 192.168.123.100:0/368561156 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb978066e40 con 0x7fb978074d80 2026-03-10T12:34:52.840 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.838+0000 7fb9757fa700 1 -- 192.168.123.100:0/368561156 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb9600056e0 con 0x7fb978074d80 2026-03-10T12:34:52.840 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.838+0000 7fb9757fa700 1 -- 192.168.123.100:0/368561156 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 
0) 0x7fb96000f4b0 con 0x7fb978074d80 2026-03-10T12:34:52.840 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.838+0000 7fb9757fa700 1 --2- 192.168.123.100:0/368561156 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb96406c820 0x7fb96406ecd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:52.840 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.838+0000 7fb9757fa700 1 -- 192.168.123.100:0/368561156 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fb96008c170 con 0x7fb978074d80 2026-03-10T12:34:52.840 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.840+0000 7fb9777fe700 1 --2- 192.168.123.100:0/368561156 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb96406c820 0x7fb96406ecd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:52.840 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.840+0000 7fb9757fa700 1 -- 192.168.123.100:0/368561156 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb96005a740 con 0x7fb978074d80 2026-03-10T12:34:52.841 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.841+0000 7fb9777fe700 1 --2- 192.168.123.100:0/368561156 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb96406c820 0x7fb96406ecd0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fb9680052a0 tx=0x7fb968005ab0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:52.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.985+0000 7fb97dc91700 1 -- 192.168.123.100:0/368561156 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": 
"client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7fb9781a68b0 con 0x7fb978074d80 2026-03-10T12:34:52.990 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.990+0000 7fb9757fa700 1 -- 192.168.123.100:0/368561156 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v16) v1 ==== 170+0+59 (secure 0 0 0) 0x7fb96005a2d0 con 0x7fb978074d80 2026-03-10T12:34:52.990 INFO:teuthology.orchestra.run.vm00.stdout:[client.0] 2026-03-10T12:34:52.990 INFO:teuthology.orchestra.run.vm00.stdout: key = AQDsD7BpI2TVOhAAvTv64Axiq3ZkxbVVdAiozg== 2026-03-10T12:34:52.992 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.992+0000 7fb97dc91700 1 -- 192.168.123.100:0/368561156 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb96406c820 msgr2=0x7fb96406ecd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:52.992 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.992+0000 7fb97dc91700 1 --2- 192.168.123.100:0/368561156 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb96406c820 0x7fb96406ecd0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fb9680052a0 tx=0x7fb968005ab0 comp rx=0 tx=0).stop 2026-03-10T12:34:52.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.992+0000 7fb97dc91700 1 -- 192.168.123.100:0/368561156 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb978074d80 msgr2=0x7fb978072250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:52.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.992+0000 7fb97dc91700 1 --2- 192.168.123.100:0/368561156 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb978074d80 0x7fb978072250 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7fb96000cc60 tx=0x7fb9600074a0 
comp rx=0 tx=0).stop 2026-03-10T12:34:52.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.993+0000 7fb97dc91700 1 -- 192.168.123.100:0/368561156 shutdown_connections 2026-03-10T12:34:52.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.993+0000 7fb97dc91700 1 --2- 192.168.123.100:0/368561156 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb96406c820 0x7fb96406ecd0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:52.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.993+0000 7fb97dc91700 1 --2- 192.168.123.100:0/368561156 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9780737b0 0x7fb978071d10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:52.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.993+0000 7fb97dc91700 1 --2- 192.168.123.100:0/368561156 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb978074d80 0x7fb978072250 unknown :-1 s=CLOSED pgs=206 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:52.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.993+0000 7fb97dc91700 1 -- 192.168.123.100:0/368561156 >> 192.168.123.100:0/368561156 conn(0x7fb9780fbaa0 msgr2=0x7fb978106310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:52.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.993+0000 7fb97dc91700 1 -- 192.168.123.100:0/368561156 shutdown_connections 2026-03-10T12:34:52.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:52.993+0000 7fb97dc91700 1 -- 192.168.123.100:0/368561156 wait complete. 
2026-03-10T12:34:53.039 DEBUG:teuthology.orchestra.run.vm00:> set -ex 2026-03-10T12:34:53.039 DEBUG:teuthology.orchestra.run.vm00:> sudo dd of=/etc/ceph/ceph.client.0.keyring 2026-03-10T12:34:53.039 DEBUG:teuthology.orchestra.run.vm00:> sudo chmod 0644 /etc/ceph/ceph.client.0.keyring 2026-03-10T12:34:53.079 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph auth get-or-create client.1 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-10T12:34:53.223 INFO:teuthology.orchestra.run.vm07.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm07/config 2026-03-10T12:34:53.367 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:53 vm07 ceph-mon[58582]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:34:53.367 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:53 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/368561156' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T12:34:53.368 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:53 vm07 ceph-mon[58582]: from='client.? 
192.168.123.100:0/368561156' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T12:34:53.368 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:53 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:34:53.488 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.487+0000 7f8f74bd3700 1 -- 192.168.123.107:0/2734802652 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70102760 msgr2=0x7f8f70102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:53.488 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.487+0000 7f8f74bd3700 1 --2- 192.168.123.107:0/2734802652 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70102760 0x7f8f70102b70 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f8f58009b00 tx=0x7f8f58009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:53.488 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.488+0000 7f8f74bd3700 1 -- 192.168.123.107:0/2734802652 shutdown_connections 2026-03-10T12:34:53.488 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.488+0000 7f8f74bd3700 1 --2- 192.168.123.107:0/2734802652 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8f70103960 0x7f8f70103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:53.488 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.488+0000 7f8f74bd3700 1 --2- 192.168.123.107:0/2734802652 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70102760 0x7f8f70102b70 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:53.488 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.488+0000 7f8f74bd3700 1 -- 
192.168.123.107:0/2734802652 >> 192.168.123.107:0/2734802652 conn(0x7f8f700fdcf0 msgr2=0x7f8f70100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:53.489 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.488+0000 7f8f74bd3700 1 -- 192.168.123.107:0/2734802652 shutdown_connections 2026-03-10T12:34:53.489 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.488+0000 7f8f74bd3700 1 -- 192.168.123.107:0/2734802652 wait complete. 2026-03-10T12:34:53.489 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.489+0000 7f8f74bd3700 1 Processor -- start 2026-03-10T12:34:53.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.489+0000 7f8f74bd3700 1 -- start start 2026-03-10T12:34:53.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.490+0000 7f8f74bd3700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8f70102760 0x7f8f70198040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:53.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.490+0000 7f8f74bd3700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70103960 0x7f8f70198580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:53.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.490+0000 7f8f6dd9b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70103960 0x7f8f70198580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:53.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.490+0000 7f8f6dd9b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70103960 0x7f8f70198580 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.107:57032/0 (socket says 192.168.123.107:57032) 2026-03-10T12:34:53.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.490+0000 7f8f6dd9b700 1 -- 192.168.123.107:0/2242280749 learned_addr learned my addr 192.168.123.107:0/2242280749 (peer_addr_for_me v2:192.168.123.107:0/0) 2026-03-10T12:34:53.490 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.490+0000 7f8f74bd3700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8f70198ba0 con 0x7f8f70102760 2026-03-10T12:34:53.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.490+0000 7f8f74bd3700 1 -- 192.168.123.107:0/2242280749 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8f70198ce0 con 0x7f8f70103960 2026-03-10T12:34:53.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.491+0000 7f8f6e59c700 1 --2- 192.168.123.107:0/2242280749 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8f70102760 0x7f8f70198040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:53.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.491+0000 7f8f6dd9b700 1 -- 192.168.123.107:0/2242280749 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8f70102760 msgr2=0x7f8f70198040 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:53.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.491+0000 7f8f6dd9b700 1 --2- 192.168.123.107:0/2242280749 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8f70102760 0x7f8f70198040 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:53.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.491+0000 7f8f6dd9b700 1 -- 192.168.123.107:0/2242280749 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
-- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8f580097e0 con 0x7f8f70103960 2026-03-10T12:34:53.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.491+0000 7f8f6e59c700 1 --2- 192.168.123.107:0/2242280749 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8f70102760 0x7f8f70198040 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:34:53.491 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.491+0000 7f8f6dd9b700 1 --2- 192.168.123.107:0/2242280749 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70103960 0x7f8f70198580 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f8f6000d8d0 tx=0x7f8f6000dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:53.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.491+0000 7f8f677fe700 1 -- 192.168.123.107:0/2242280749 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8f60009940 con 0x7f8f70103960 2026-03-10T12:34:53.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.491+0000 7f8f677fe700 1 -- 192.168.123.107:0/2242280749 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8f60010460 con 0x7f8f70103960 2026-03-10T12:34:53.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.491+0000 7f8f677fe700 1 -- 192.168.123.107:0/2242280749 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8f6000f5d0 con 0x7f8f70103960 2026-03-10T12:34:53.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.491+0000 7f8f74bd3700 1 -- 192.168.123.107:0/2242280749 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8f7019d790 con 0x7f8f70103960 2026-03-10T12:34:53.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.491+0000 
7f8f74bd3700 1 -- 192.168.123.107:0/2242280749 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8f7019dce0 con 0x7f8f70103960 2026-03-10T12:34:53.494 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.492+0000 7f8f74bd3700 1 -- 192.168.123.107:0/2242280749 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8f7010b440 con 0x7f8f70103960 2026-03-10T12:34:53.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.495+0000 7f8f677fe700 1 -- 192.168.123.107:0/2242280749 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f8f600105d0 con 0x7f8f70103960 2026-03-10T12:34:53.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.495+0000 7f8f677fe700 1 --2- 192.168.123.107:0/2242280749 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8f5c06c750 0x7f8f5c06ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:53.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.495+0000 7f8f677fe700 1 -- 192.168.123.107:0/2242280749 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f8f6008b940 con 0x7f8f70103960 2026-03-10T12:34:53.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.496+0000 7f8f677fe700 1 -- 192.168.123.107:0/2242280749 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8f600b77c0 con 0x7f8f70103960 2026-03-10T12:34:53.496 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.496+0000 7f8f6e59c700 1 --2- 192.168.123.107:0/2242280749 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8f5c06c750 0x7f8f5c06ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:53.499 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.499+0000 7f8f6e59c700 1 --2- 192.168.123.107:0/2242280749 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8f5c06c750 0x7f8f5c06ec00 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f8f58009fd0 tx=0x7f8f58005fd0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:53.648 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.647+0000 7f8f74bd3700 1 -- 192.168.123.107:0/2242280749 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7f8f7010b630 con 0x7f8f70103960 2026-03-10T12:34:53.654 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.654+0000 7f8f677fe700 1 -- 192.168.123.107:0/2242280749 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v17) v1 ==== 170+0+59 (secure 0 0 0) 0x7f8f60059c80 con 0x7f8f70103960 2026-03-10T12:34:53.655 INFO:teuthology.orchestra.run.vm07.stdout:[client.1] 2026-03-10T12:34:53.655 INFO:teuthology.orchestra.run.vm07.stdout: key = AQDtD7Bp5TvQJhAAtpH4iWN69oIQOzxULtdUsQ== 2026-03-10T12:34:53.657 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.656+0000 7f8f74bd3700 1 -- 192.168.123.107:0/2242280749 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8f5c06c750 msgr2=0x7f8f5c06ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:53.657 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.656+0000 7f8f74bd3700 1 --2- 192.168.123.107:0/2242280749 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8f5c06c750 
0x7f8f5c06ec00 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f8f58009fd0 tx=0x7f8f58005fd0 comp rx=0 tx=0).stop 2026-03-10T12:34:53.657 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.656+0000 7f8f74bd3700 1 -- 192.168.123.107:0/2242280749 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70103960 msgr2=0x7f8f70198580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:53.657 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.656+0000 7f8f74bd3700 1 --2- 192.168.123.107:0/2242280749 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70103960 0x7f8f70198580 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f8f6000d8d0 tx=0x7f8f6000dc90 comp rx=0 tx=0).stop 2026-03-10T12:34:53.657 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.657+0000 7f8f74bd3700 1 -- 192.168.123.107:0/2242280749 shutdown_connections 2026-03-10T12:34:53.657 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.657+0000 7f8f74bd3700 1 --2- 192.168.123.107:0/2242280749 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8f5c06c750 0x7f8f5c06ec00 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:53.657 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.657+0000 7f8f74bd3700 1 --2- 192.168.123.107:0/2242280749 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8f70102760 0x7f8f70198040 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:53.657 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.657+0000 7f8f74bd3700 1 --2- 192.168.123.107:0/2242280749 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8f70103960 0x7f8f70198580 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:53.657 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.657+0000 7f8f74bd3700 1 -- 
192.168.123.107:0/2242280749 >> 192.168.123.107:0/2242280749 conn(0x7f8f700fdcf0 msgr2=0x7f8f70106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:53.657 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.657+0000 7f8f74bd3700 1 -- 192.168.123.107:0/2242280749 shutdown_connections 2026-03-10T12:34:53.657 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:34:53.657+0000 7f8f74bd3700 1 -- 192.168.123.107:0/2242280749 wait complete. 2026-03-10T12:34:53.702 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-10T12:34:53.702 DEBUG:teuthology.orchestra.run.vm07:> sudo dd of=/etc/ceph/ceph.client.1.keyring 2026-03-10T12:34:53.702 DEBUG:teuthology.orchestra.run.vm07:> sudo chmod 0644 /etc/ceph/ceph.client.1.keyring 2026-03-10T12:34:53.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:53 vm00 ceph-mon[50686]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:34:53.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:53 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/368561156' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T12:34:53.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:53 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/368561156' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T12:34:53.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:53 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:34:53.742 INFO:tasks.ceph:Waiting until ceph daemons up and pgs clean... 
2026-03-10T12:34:53.742 INFO:tasks.cephadm.ceph_manager.ceph:waiting for mgr available 2026-03-10T12:34:53.742 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mgr dump --format=json 2026-03-10T12:34:53.896 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:54.295 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.294+0000 7fd3e2065700 1 -- 192.168.123.100:0/3787213001 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3dc073a00 msgr2=0x7fd3dc111040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:54.295 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.294+0000 7fd3e2065700 1 --2- 192.168.123.100:0/3787213001 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3dc073a00 0x7fd3dc111040 secure :-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7fd3cc009b00 tx=0x7fd3cc009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:54.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.295+0000 7fd3e2065700 1 -- 192.168.123.100:0/3787213001 shutdown_connections 2026-03-10T12:34:54.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.295+0000 7fd3e2065700 1 --2- 192.168.123.100:0/3787213001 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3dc073a00 0x7fd3dc111040 unknown :-1 s=CLOSED pgs=207 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:54.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.295+0000 7fd3e2065700 1 --2- 192.168.123.100:0/3787213001 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd3dc0730f0 0x7fd3dc0734c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:54.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.295+0000 7fd3e2065700 1 
-- 192.168.123.100:0/3787213001 >> 192.168.123.100:0/3787213001 conn(0x7fd3dc0fc090 msgr2=0x7fd3dc0fe4a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:54.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.295+0000 7fd3e2065700 1 -- 192.168.123.100:0/3787213001 shutdown_connections 2026-03-10T12:34:54.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.296+0000 7fd3e2065700 1 -- 192.168.123.100:0/3787213001 wait complete. 2026-03-10T12:34:54.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.296+0000 7fd3e2065700 1 Processor -- start 2026-03-10T12:34:54.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.296+0000 7fd3e2065700 1 -- start start 2026-03-10T12:34:54.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.296+0000 7fd3e2065700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3dc0730f0 0x7fd3dc1a24f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:54.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.297+0000 7fd3e2065700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd3dc073a00 0x7fd3dc1a2a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:54.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.297+0000 7fd3e2065700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd3dc1a30c0 con 0x7fd3dc0730f0 2026-03-10T12:34:54.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.297+0000 7fd3e2065700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd3dc19c570 con 0x7fd3dc073a00 2026-03-10T12:34:54.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.297+0000 7fd3daffd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd3dc073a00 0x7fd3dc1a2a30 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:54.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.297+0000 7fd3daffd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd3dc073a00 0x7fd3dc1a2a30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:38382/0 (socket says 192.168.123.100:38382) 2026-03-10T12:34:54.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.297+0000 7fd3daffd700 1 -- 192.168.123.100:0/661836783 learned_addr learned my addr 192.168.123.100:0/661836783 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:54.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.297+0000 7fd3daffd700 1 -- 192.168.123.100:0/661836783 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3dc0730f0 msgr2=0x7fd3dc1a24f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:34:54.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.297+0000 7fd3db7fe700 1 --2- 192.168.123.100:0/661836783 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3dc0730f0 0x7fd3dc1a24f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:54.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.297+0000 7fd3daffd700 1 --2- 192.168.123.100:0/661836783 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3dc0730f0 0x7fd3dc1a24f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:54.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.297+0000 7fd3daffd700 1 -- 192.168.123.100:0/661836783 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) 
v3 -- 0x7fd3cc0097e0 con 0x7fd3dc073a00 2026-03-10T12:34:54.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.297+0000 7fd3db7fe700 1 --2- 192.168.123.100:0/661836783 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3dc0730f0 0x7fd3dc1a24f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:34:54.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.297+0000 7fd3daffd700 1 --2- 192.168.123.100:0/661836783 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd3dc073a00 0x7fd3dc1a2a30 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fd3cc009ad0 tx=0x7fd3cc0052e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:54.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.298+0000 7fd3d8ff9700 1 -- 192.168.123.100:0/661836783 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd3cc01d070 con 0x7fd3dc073a00 2026-03-10T12:34:54.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.298+0000 7fd3d8ff9700 1 -- 192.168.123.100:0/661836783 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd3cc00bc50 con 0x7fd3dc073a00 2026-03-10T12:34:54.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.298+0000 7fd3e2065700 1 -- 192.168.123.100:0/661836783 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd3dc19c850 con 0x7fd3dc073a00 2026-03-10T12:34:54.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.298+0000 7fd3e2065700 1 -- 192.168.123.100:0/661836783 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd3dc19cda0 con 0x7fd3dc073a00 2026-03-10T12:34:54.299 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.298+0000 7fd3d8ff9700 1 -- 192.168.123.100:0/661836783 
<== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd3cc022620 con 0x7fd3dc073a00 2026-03-10T12:34:54.299 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.299+0000 7fd3e2065700 1 -- 192.168.123.100:0/661836783 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd3dc10e7c0 con 0x7fd3dc073a00 2026-03-10T12:34:54.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.299+0000 7fd3d8ff9700 1 -- 192.168.123.100:0/661836783 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd3cc022890 con 0x7fd3dc073a00 2026-03-10T12:34:54.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.299+0000 7fd3d8ff9700 1 --2- 192.168.123.100:0/661836783 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd3c806c6d0 0x7fd3c806eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:54.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.300+0000 7fd3d8ff9700 1 -- 192.168.123.100:0/661836783 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fd3cc048020 con 0x7fd3dc073a00 2026-03-10T12:34:54.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.300+0000 7fd3db7fe700 1 --2- 192.168.123.100:0/661836783 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd3c806c6d0 0x7fd3c806eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:54.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.300+0000 7fd3db7fe700 1 --2- 192.168.123.100:0/661836783 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd3c806c6d0 0x7fd3c806eb80 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fd3c4005fd0 tx=0x7fd3c4005dc0 comp rx=0 tx=0).ready 
entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:54.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.301+0000 7fd3d8ff9700 1 -- 192.168.123.100:0/661836783 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd3cc05b7f0 con 0x7fd3dc073a00 2026-03-10T12:34:54.436 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.435+0000 7fd3e2065700 1 -- 192.168.123.100:0/661836783 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "mgr dump", "format": "json"} v 0) v1 -- 0x7fd3dc04ea50 con 0x7fd3dc073a00 2026-03-10T12:34:54.438 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.437+0000 7fd3d8ff9700 1 -- 192.168.123.100:0/661836783 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "mgr dump", "format": "json"}]=0 v19) v1 ==== 74+0+172845 (secure 0 0 0) 0x7fd3cc05b380 con 0x7fd3dc073a00 2026-03-10T12:34:54.438 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:34:54.440 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.440+0000 7fd3e2065700 1 -- 192.168.123.100:0/661836783 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd3c806c6d0 msgr2=0x7fd3c806eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:54.440 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.440+0000 7fd3e2065700 1 --2- 192.168.123.100:0/661836783 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd3c806c6d0 0x7fd3c806eb80 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fd3c4005fd0 tx=0x7fd3c4005dc0 comp rx=0 tx=0).stop 2026-03-10T12:34:54.440 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.440+0000 7fd3e2065700 1 -- 192.168.123.100:0/661836783 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd3dc073a00 msgr2=0x7fd3dc1a2a30 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T12:34:54.440 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.440+0000 7fd3e2065700 1 --2- 192.168.123.100:0/661836783 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd3dc073a00 0x7fd3dc1a2a30 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fd3cc009ad0 tx=0x7fd3cc0052e0 comp rx=0 tx=0).stop 2026-03-10T12:34:54.440 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.440+0000 7fd3e2065700 1 -- 192.168.123.100:0/661836783 shutdown_connections 2026-03-10T12:34:54.440 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.440+0000 7fd3e2065700 1 --2- 192.168.123.100:0/661836783 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd3c806c6d0 0x7fd3c806eb80 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:54.440 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.440+0000 7fd3e2065700 1 --2- 192.168.123.100:0/661836783 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3dc0730f0 0x7fd3dc1a24f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:54.440 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.440+0000 7fd3e2065700 1 --2- 192.168.123.100:0/661836783 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd3dc073a00 0x7fd3dc1a2a30 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:54.440 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.440+0000 7fd3e2065700 1 -- 192.168.123.100:0/661836783 >> 192.168.123.100:0/661836783 conn(0x7fd3dc0fc090 msgr2=0x7fd3dc102b50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:54.440 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.440+0000 7fd3e2065700 1 -- 192.168.123.100:0/661836783 shutdown_connections 2026-03-10T12:34:54.440 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:54.440+0000 7fd3e2065700 1 -- 
192.168.123.100:0/661836783 wait complete. 2026-03-10T12:34:54.502 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":19,"active_gid":14223,"active_name":"vm00.nescmq","active_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6800","nonce":2},{"type":"v1","addr":"192.168.123.100:6801","nonce":2}]},"active_addr":"192.168.123.100:6801/2","active_change":"2026-03-10T12:33:23.076050+0000","active_mgr_features":4540138322906710015,"available":true,"standbys":[{"gid":14250,"name":"vm07.kfawlb","mgr_features":4540138322906710015,"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts 
to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format 
HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which 
no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_password":{"name":"alertmanager_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web password","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_user":{"name":"alertmanager_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web user","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on 
ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph:v18","min":"","max":"","enum_allowed":[],"desc":"Container image 
name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) 
inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"secs","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"docker.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host 
check","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage 
/etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"prometheus_web_password":{"name":"prometheus_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web password","long_desc":"","tags":[],"see_also":[]},"prometheus_web_user":{"name":"prometheus_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web user","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no 
TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send 
metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by 
cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"
FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","
max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"def
ault_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health 
metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","fl
ags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":
[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local pool","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level 
for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also
":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"
log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"noautoscale":{"name":"noautoscale","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"global autoscale flag","long_desc":"Option to turn on/off the autoscaler for all pools","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. 
Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP 
requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","m
in":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"serv
er_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"drive_group_interval":{"name":"drive_group_interval","type":"float","level":"advanced","flags":0,"default_value":"300.0","min":"","max":"","enum_allowed":[],"desc":"interval in seconds between re-application of applied 
drive_groups","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[
],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"",
"min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","mi
n":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_t
o_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options"
:{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay 
seconds","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"defau
lt_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}]}],"modules":["cephadm","dashboard","iostat","nfs","prometheus","restful"],"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP 
server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which 
no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_password":{"name":"alertmanager_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web password","long_desc":"","tags":[],"see_also":[]},"alertmanager_web_user":{"name":"alertmanager_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Alertmanager web user","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on 
ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this option can allow debugging of daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Alertmanager container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph:v18","min":"","max":"","enum_allowed":[],"desc":"Container image 
name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Grafana container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) 
inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"secs","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"docker.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host 
check","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage 
/etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"prometheus_web_password":{"name":"prometheus_web_password","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web password","long_desc":"","tags":[],"see_also":[]},"prometheus_web_user":{"name":"prometheus_web_user","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"Prometheus web user","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no 
TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send 
metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by 
cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"
FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","
max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"def
ault_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health 
metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","fl
ags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":
[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local pool","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level 
for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also
":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"
log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"noautoscale":{"name":"noautoscale","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"global autoscale flag","long_desc":"Option to turn on/off the autoscaler for all pools","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. 
Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP 
requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","m
in":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"serv
er_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"drive_group_interval":{"name":"drive_group_interval","type":"float","level":"advanced","flags":0,"default_value":"300.0","min":"","max":"","enum_allowed":[],"desc":"interval in seconds between re-application of applied 
drive_groups","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[
],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"",
"min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","mi
n":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_t
o_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options"
:{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay 
seconds","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"defau
lt_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}],"services":{"dashboard":"https://192.168.123.100:8443/","prometheus":"http://192.168.123.100:9283/"},"always_on_modules":{"octopus":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"pacific":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"quincy":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"reef":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"]},"last_failure_osd_epoch":5,"active_clients":[{"name":"devicehealth","addrvec":[{"type":"v2","addr":"192.168.123.100:0","nonce":2255478548}]},{"name":"libcephsqlite","addrvec":[{"type":"v2","addr":"192.168.123.100:0","nonce":3513500591}]},{"name":"rbd_support","addrvec":[{"type":"v2","addr":"192.168.123.100:0","nonce":4057224011}]},{"name":"volumes","addrvec":[{"type":"v2","addr":"192.168.123.100:0","nonce":3579075241}]}]} 2026-03-10T12:34:54.503 INFO:tasks.cephadm.ceph_manager.ceph:mgr available! 
2026-03-10T12:34:54.503 INFO:tasks.cephadm.ceph_manager.ceph:waiting for all up 2026-03-10T12:34:54.504 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd dump --format=json 2026-03-10T12:34:54.650 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:54.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:54 vm00 ceph-mon[50686]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:34:54.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:54 vm00 ceph-mon[50686]: from='client.? 192.168.123.107:0/2242280749' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T12:34:54.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:54 vm00 ceph-mon[50686]: from='client.? ' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T12:34:54.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:54 vm00 ceph-mon[50686]: from='client.? ' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T12:34:54.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:54 vm07 ceph-mon[58582]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:34:54.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:54 vm07 ceph-mon[58582]: from='client.? 
192.168.123.107:0/2242280749' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T12:34:54.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:54 vm07 ceph-mon[58582]: from='client.? ' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T12:34:54.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:54 vm07 ceph-mon[58582]: from='client.? ' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T12:34:55.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.176+0000 7f19d9673700 1 -- 192.168.123.100:0/944992640 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f19d4102780 msgr2=0x7f19d4102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:55.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.176+0000 7f19d9673700 1 --2- 192.168.123.100:0/944992640 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f19d4102780 0x7f19d4102bf0 secure :-1 s=READY pgs=208 cs=0 l=1 rev1=1 crypto rx=0x7f19c4009b00 tx=0x7f19c4009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:55.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.177+0000 7f19d9673700 1 -- 192.168.123.100:0/944992640 shutdown_connections 2026-03-10T12:34:55.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.177+0000 7f19d9673700 1 --2- 192.168.123.100:0/944992640 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f19d4102780 0x7f19d4102bf0 unknown :-1 s=CLOSED pgs=208 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:55.177 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.177+0000 7f19d9673700 1 --2- 192.168.123.100:0/944992640 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f19d4108780 0x7f19d4108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:55.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.177+0000 7f19d9673700 1 -- 192.168.123.100:0/944992640 >> 192.168.123.100:0/944992640 conn(0x7f19d40fe280 msgr2=0x7f19d4100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:55.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.177+0000 7f19d9673700 1 -- 192.168.123.100:0/944992640 shutdown_connections 2026-03-10T12:34:55.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.177+0000 7f19d9673700 1 -- 192.168.123.100:0/944992640 wait complete. 2026-03-10T12:34:55.178 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.178+0000 7f19d9673700 1 Processor -- start 2026-03-10T12:34:55.178 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.178+0000 7f19d9673700 1 -- start start 2026-03-10T12:34:55.178 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.178+0000 7f19d9673700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f19d4102780 0x7f19d419ce80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:55.178 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.178+0000 7f19d9673700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f19d4108780 0x7f19d4078280 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:55.178 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.178+0000 7f19d9673700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f19d419d480 con 0x7f19d4102780 2026-03-10T12:34:55.178 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.178+0000 7f19d9673700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f19d419d5f0 con 0x7f19d4108780 2026-03-10T12:34:55.178 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.178+0000 7f19d2ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f19d4102780 0x7f19d419ce80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:55.178 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.178+0000 7f19d27fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f19d4108780 0x7f19d4078280 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:55.179 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.178+0000 7f19d27fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f19d4108780 0x7f19d4078280 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:38406/0 (socket says 192.168.123.100:38406) 2026-03-10T12:34:55.179 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.178+0000 7f19d27fc700 1 -- 192.168.123.100:0/1375808022 learned_addr learned my addr 192.168.123.100:0/1375808022 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:55.179 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.179+0000 7f19d27fc700 1 -- 192.168.123.100:0/1375808022 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f19d4102780 msgr2=0x7f19d419ce80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:55.179 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.179+0000 7f19d27fc700 1 --2- 192.168.123.100:0/1375808022 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f19d4102780 0x7f19d419ce80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:55.179 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.179+0000 7f19d27fc700 1 -- 192.168.123.100:0/1375808022 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f19c40097e0 con 0x7f19d4108780 2026-03-10T12:34:55.179 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.179+0000 7f19d2ffd700 1 --2- 192.168.123.100:0/1375808022 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f19d4102780 0x7f19d419ce80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T12:34:55.179 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.179+0000 7f19d27fc700 1 --2- 192.168.123.100:0/1375808022 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f19d4108780 0x7f19d4078280 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f19c4009ad0 tx=0x7f19c4004c30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:55.179 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.179+0000 7f19cbfff700 1 -- 192.168.123.100:0/1375808022 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f19c401d070 con 0x7f19d4108780 2026-03-10T12:34:55.180 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.179+0000 7f19d9673700 1 -- 192.168.123.100:0/1375808022 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f19d40787c0 con 0x7f19d4108780 2026-03-10T12:34:55.180 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.179+0000 7f19d9673700 1 -- 192.168.123.100:0/1375808022 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f19d4078cb0 
con 0x7f19d4108780 2026-03-10T12:34:55.180 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.180+0000 7f19cbfff700 1 -- 192.168.123.100:0/1375808022 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f19c400bc50 con 0x7f19d4108780 2026-03-10T12:34:55.180 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.180+0000 7f19d9673700 1 -- 192.168.123.100:0/1375808022 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f19d404ea50 con 0x7f19d4108780 2026-03-10T12:34:55.180 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.180+0000 7f19cbfff700 1 -- 192.168.123.100:0/1375808022 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f19c400f630 con 0x7f19d4108780 2026-03-10T12:34:55.181 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.181+0000 7f19cbfff700 1 -- 192.168.123.100:0/1375808022 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f19c400f790 con 0x7f19d4108780 2026-03-10T12:34:55.182 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.181+0000 7f19cbfff700 1 --2- 192.168.123.100:0/1375808022 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f19c006c6d0 0x7f19c006eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:55.182 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.182+0000 7f19cbfff700 1 -- 192.168.123.100:0/1375808022 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f19c408d300 con 0x7f19d4108780 2026-03-10T12:34:55.182 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.182+0000 7f19d2ffd700 1 --2- 192.168.123.100:0/1375808022 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f19c006c6d0 0x7f19c006eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:55.182 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.182+0000 7f19d2ffd700 1 --2- 192.168.123.100:0/1375808022 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f19c006c6d0 0x7f19c006eb80 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f19d41038c0 tx=0x7f19bc00a3b0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:55.183 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.183+0000 7f19cbfff700 1 -- 192.168.123.100:0/1375808022 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f19c405bad0 con 0x7f19d4108780 2026-03-10T12:34:55.284 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.284+0000 7f19d9673700 1 -- 192.168.123.100:0/1375808022 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f19d4195330 con 0x7f19d4108780 2026-03-10T12:34:55.286 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.286+0000 7f19cbfff700 1 -- 192.168.123.100:0/1375808022 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v34) v1 ==== 74+0+11271 (secure 0 0 0) 0x7f19c4027090 con 0x7f19d4108780 2026-03-10T12:34:55.287 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:34:55.287 
INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":34,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","created":"2026-03-10T12:32:17.778471+0000","modified":"2026-03-10T12:34:50.381363+0000","last_up_change":"2026-03-10T12:34:49.373861+0000","last_in_change":"2026-03-10T12:34:38.384036+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-10T12:34:21.281104+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"21","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"
hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"0f6cb3f2-3337-4851-ba13-f08c9574062c","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6802","nonce":3160210101},{"type":"v1","addr":"192.168.123.100:6803","nonce":3160210101}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6804","nonce":3160210101},{"type":"v1","addr":"192.168.123.100:6805","nonce":3160210101}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6808","nonce":3160210101},{"type":"v1","addr":"192.168.123.100:6809","nonce":3160210101}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6806","nonce":3160210101},{"type":"v1","addr":"192.168.123.100:6807","nonce":3160210101}]},"public_addr":"192.168.123.100:6803/3160210101","cluster_addr":"192.168.123.100:6805/3160210101","heartbeat_back_addr":"192.168.123.100:6809/3160210101","heartbeat_front_addr":"192.168.123.100:6807/3160210101","state":["exists","up"]},{"osd":1,"uuid":"bbe7c7dc-9287-4bf7-b4aa-5e00c56a2a7b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":25,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6810","nonce":4135684750},{"type":"v1","addr":"192.168.123.100:6811","nonce":4135684750}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.
123.100:6812","nonce":4135684750},{"type":"v1","addr":"192.168.123.100:6813","nonce":4135684750}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6816","nonce":4135684750},{"type":"v1","addr":"192.168.123.100:6817","nonce":4135684750}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6814","nonce":4135684750},{"type":"v1","addr":"192.168.123.100:6815","nonce":4135684750}]},"public_addr":"192.168.123.100:6811/4135684750","cluster_addr":"192.168.123.100:6813/4135684750","heartbeat_back_addr":"192.168.123.100:6817/4135684750","heartbeat_front_addr":"192.168.123.100:6815/4135684750","state":["exists","up"]},{"osd":2,"uuid":"3de6b811-dbac-419f-abf8-afd0bec7a47f","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6818","nonce":3405786659},{"type":"v1","addr":"192.168.123.100:6819","nonce":3405786659}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6820","nonce":3405786659},{"type":"v1","addr":"192.168.123.100:6821","nonce":3405786659}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6824","nonce":3405786659},{"type":"v1","addr":"192.168.123.100:6825","nonce":3405786659}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6822","nonce":3405786659},{"type":"v1","addr":"192.168.123.100:6823","nonce":3405786659}]},"public_addr":"192.168.123.100:6819/3405786659","cluster_addr":"192.168.123.100:6821/3405786659","heartbeat_back_addr":"192.168.123.100:6825/3405786659","heartbeat_front_addr":"192.168.123.100:6823/3405786659","state":["exists","up"]},{"osd":3,"uuid":"cd850d3a-e99e-4292-9600-f18ed81a7d18","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":24,"up_thru":28,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6800","nonce":6989773
51},{"type":"v1","addr":"192.168.123.107:6801","nonce":698977351}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6802","nonce":698977351},{"type":"v1","addr":"192.168.123.107:6803","nonce":698977351}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6806","nonce":698977351},{"type":"v1","addr":"192.168.123.107:6807","nonce":698977351}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6804","nonce":698977351},{"type":"v1","addr":"192.168.123.107:6805","nonce":698977351}]},"public_addr":"192.168.123.107:6801/698977351","cluster_addr":"192.168.123.107:6803/698977351","heartbeat_back_addr":"192.168.123.107:6807/698977351","heartbeat_front_addr":"192.168.123.107:6805/698977351","state":["exists","up"]},{"osd":4,"uuid":"ff03ddab-6945-46b4-b19b-30775ca85618","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":29,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6808","nonce":4106671248},{"type":"v1","addr":"192.168.123.107:6809","nonce":4106671248}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6810","nonce":4106671248},{"type":"v1","addr":"192.168.123.107:6811","nonce":4106671248}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6814","nonce":4106671248},{"type":"v1","addr":"192.168.123.107:6815","nonce":4106671248}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6812","nonce":4106671248},{"type":"v1","addr":"192.168.123.107:6813","nonce":4106671248}]},"public_addr":"192.168.123.107:6809/4106671248","cluster_addr":"192.168.123.107:6811/4106671248","heartbeat_back_addr":"192.168.123.107:6815/4106671248","heartbeat_front_addr":"192.168.123.107:6813/4106671248","state":["exists","up"]},{"osd":5,"uuid":"d91585d7-d879-48f1-8fdf-f6c88a82428a","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":33,"up_thr
u":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6816","nonce":4252230828},{"type":"v1","addr":"192.168.123.107:6817","nonce":4252230828}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6818","nonce":4252230828},{"type":"v1","addr":"192.168.123.107:6819","nonce":4252230828}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6822","nonce":4252230828},{"type":"v1","addr":"192.168.123.107:6823","nonce":4252230828}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6820","nonce":4252230828},{"type":"v1","addr":"192.168.123.107:6821","nonce":4252230828}]},"public_addr":"192.168.123.107:6817/4252230828","cluster_addr":"192.168.123.107:6819/4252230828","heartbeat_back_addr":"192.168.123.107:6823/4252230828","heartbeat_front_addr":"192.168.123.107:6821/4252230828","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:33:58.292397+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:34:07.673082+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:34:18.026191+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:34:27.370020+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:34:38.625936+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015
,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:34:47.790663+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.100:0/2792084710":"2026-03-11T12:33:23.075938+0000","192.168.123.100:0/69960775":"2026-03-11T12:33:23.075938+0000","192.168.123.100:0/1015166415":"2026-03-11T12:32:46.753519+0000","192.168.123.100:0/2753083811":"2026-03-11T12:32:46.753519+0000","192.168.123.100:6800/2":"2026-03-11T12:32:32.101116+0000","192.168.123.100:6801/2":"2026-03-11T12:32:32.101116+0000","192.168.123.100:0/4113305903":"2026-03-11T12:32:32.101116+0000","192.168.123.100:0/385023950":"2026-03-11T12:33:23.075938+0000","192.168.123.100:0/1487901880":"2026-03-11T12:32:32.101116+0000","192.168.123.100:0/3472231466":"2026-03-11T12:32:46.753519+0000","192.168.123.100:0/1442998252":"2026-03-11T12:32:32.101116+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-10T12:34:55.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.289+0000 7f19d9673700 1 -- 192.168.123.100:0/1375808022 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f19c006c6d0 msgr2=0x7f19c006eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:55.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.289+0000 7f19d9673700 1 --2- 192.168.123.100:0/1375808022 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f19c006c6d0 0x7f19c006eb80 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f19d41038c0 tx=0x7f19bc00a3b0 comp rx=0 tx=0).stop 2026-03-10T12:34:55.289 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.289+0000 7f19d9673700 1 -- 192.168.123.100:0/1375808022 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f19d4108780 msgr2=0x7f19d4078280 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:55.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.289+0000 7f19d9673700 1 --2- 192.168.123.100:0/1375808022 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f19d4108780 0x7f19d4078280 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f19c4009ad0 tx=0x7f19c4004c30 comp rx=0 tx=0).stop 2026-03-10T12:34:55.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.289+0000 7f19d9673700 1 -- 192.168.123.100:0/1375808022 shutdown_connections 2026-03-10T12:34:55.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.289+0000 7f19d9673700 1 --2- 192.168.123.100:0/1375808022 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f19c006c6d0 0x7f19c006eb80 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:55.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.289+0000 7f19d9673700 1 --2- 192.168.123.100:0/1375808022 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f19d4102780 0x7f19d419ce80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:55.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.289+0000 7f19d9673700 1 --2- 192.168.123.100:0/1375808022 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f19d4108780 0x7f19d4078280 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:55.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.289+0000 7f19d9673700 1 -- 192.168.123.100:0/1375808022 >> 192.168.123.100:0/1375808022 conn(0x7f19d40fe280 msgr2=0x7f19d40ffbd0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T12:34:55.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.289+0000 7f19d9673700 1 -- 192.168.123.100:0/1375808022 shutdown_connections 2026-03-10T12:34:55.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.289+0000 7f19d9673700 1 -- 192.168.123.100:0/1375808022 wait complete. 2026-03-10T12:34:55.347 INFO:tasks.cephadm.ceph_manager.ceph:all up! 2026-03-10T12:34:55.347 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd dump --format=json 2026-03-10T12:34:55.486 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:55.536 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:55 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/661836783' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-10T12:34:55.536 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:55 vm00 ceph-mon[50686]: from='client.? 
192.168.123.100:0/1375808022' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T12:34:55.711 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.710+0000 7f079136b700 1 -- 192.168.123.100:0/1745624689 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f078c102790 msgr2=0x7f078c102c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:55.711 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.710+0000 7f079136b700 1 --2- 192.168.123.100:0/1745624689 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f078c102790 0x7f078c102c00 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7f077c009b00 tx=0x7f077c009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:55.711 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.711+0000 7f079136b700 1 -- 192.168.123.100:0/1745624689 shutdown_connections 2026-03-10T12:34:55.711 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.711+0000 7f079136b700 1 --2- 192.168.123.100:0/1745624689 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f078c102790 0x7f078c102c00 unknown :-1 s=CLOSED pgs=209 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:55.711 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.711+0000 7f079136b700 1 --2- 192.168.123.100:0/1745624689 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f078c108790 0x7f078c108b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:55.711 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.711+0000 7f079136b700 1 -- 192.168.123.100:0/1745624689 >> 192.168.123.100:0/1745624689 conn(0x7f078c0fe2b0 msgr2=0x7f078c1006c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:55.711 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.711+0000 7f079136b700 1 -- 192.168.123.100:0/1745624689 shutdown_connections 2026-03-10T12:34:55.711 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.711+0000 7f079136b700 1 -- 192.168.123.100:0/1745624689 wait complete. 2026-03-10T12:34:55.711 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.711+0000 7f079136b700 1 Processor -- start 2026-03-10T12:34:55.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.711+0000 7f079136b700 1 -- start start 2026-03-10T12:34:55.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.712+0000 7f079136b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f078c102790 0x7f078c198400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:55.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.712+0000 7f079136b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f078c108790 0x7f078c198940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:55.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.712+0000 7f079136b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f078c199020 con 0x7f078c102790 2026-03-10T12:34:55.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.712+0000 7f079136b700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f078c19cdb0 con 0x7f078c108790 2026-03-10T12:34:55.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.712+0000 7f078a7fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f078c108790 0x7f078c198940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:55.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.712+0000 7f078a7fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f078c108790 0x7f078c198940 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:38424/0 (socket says 192.168.123.100:38424) 2026-03-10T12:34:55.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.712+0000 7f078a7fc700 1 -- 192.168.123.100:0/1413614912 learned_addr learned my addr 192.168.123.100:0/1413614912 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:55.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.712+0000 7f078affd700 1 --2- 192.168.123.100:0/1413614912 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f078c102790 0x7f078c198400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:55.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.712+0000 7f078a7fc700 1 -- 192.168.123.100:0/1413614912 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f078c102790 msgr2=0x7f078c198400 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:55.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.712+0000 7f078a7fc700 1 --2- 192.168.123.100:0/1413614912 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f078c102790 0x7f078c198400 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:55.713 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.712+0000 7f078a7fc700 1 -- 192.168.123.100:0/1413614912 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f077c0097e0 con 0x7f078c108790 2026-03-10T12:34:55.713 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.712+0000 7f078affd700 1 --2- 192.168.123.100:0/1413614912 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f078c102790 0x7f078c198400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:34:55.713 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.712+0000 7f078a7fc700 1 --2- 192.168.123.100:0/1413614912 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f078c108790 0x7f078c198940 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f077c009ad0 tx=0x7f077c0052e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:55.713 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.713+0000 7f0783fff700 1 -- 192.168.123.100:0/1413614912 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f077c01d070 con 0x7f078c108790 2026-03-10T12:34:55.713 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.713+0000 7f079136b700 1 -- 192.168.123.100:0/1413614912 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f078c19d030 con 0x7f078c108790 2026-03-10T12:34:55.713 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.713+0000 7f079136b700 1 -- 192.168.123.100:0/1413614912 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f078c19d520 con 0x7f078c108790 2026-03-10T12:34:55.713 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.713+0000 7f0783fff700 1 -- 192.168.123.100:0/1413614912 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f077c00bc50 con 0x7f078c108790 2026-03-10T12:34:55.713 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.713+0000 7f0783fff700 1 -- 192.168.123.100:0/1413614912 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f077c00e5f0 con 0x7f078c108790 2026-03-10T12:34:55.714 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.713+0000 7f079136b700 1 -- 192.168.123.100:0/1413614912 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f078c04ea50 con 0x7f078c108790 2026-03-10T12:34:55.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.714+0000 7f0783fff700 1 -- 192.168.123.100:0/1413614912 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f077c00f460 con 0x7f078c108790 2026-03-10T12:34:55.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.714+0000 7f0783fff700 1 --2- 192.168.123.100:0/1413614912 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f077806c680 0x7f077806eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:55.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.715+0000 7f0783fff700 1 -- 192.168.123.100:0/1413614912 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f077c08cb10 con 0x7f078c108790 2026-03-10T12:34:55.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.715+0000 7f078affd700 1 --2- 192.168.123.100:0/1413614912 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f077806c680 0x7f077806eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:55.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.715+0000 7f078affd700 1 --2- 192.168.123.100:0/1413614912 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f077806c680 0x7f077806eb30 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f078c1038d0 tx=0x7f0774008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:55.717 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.717+0000 7f0783fff700 1 -- 192.168.123.100:0/1413614912 <== mon.1 v2:192.168.123.107:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f077c057700 con 0x7f078c108790 2026-03-10T12:34:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:55 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/661836783' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-10T12:34:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:55 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/1375808022' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T12:34:55.821 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.819+0000 7f079136b700 1 -- 192.168.123.100:0/1413614912 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f078c066e40 con 0x7f078c108790 2026-03-10T12:34:55.822 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.822+0000 7f0783fff700 1 -- 192.168.123.100:0/1413614912 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v34) v1 ==== 74+0+11271 (secure 0 0 0) 0x7f077c05ad20 con 0x7f078c108790 2026-03-10T12:34:55.822 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:34:55.823 
INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":34,"fsid":"1a52002a-1c7d-11f1-af82-51cdd81caea8","created":"2026-03-10T12:32:17.778471+0000","modified":"2026-03-10T12:34:50.381363+0000","last_up_change":"2026-03-10T12:34:49.373861+0000","last_in_change":"2026-03-10T12:34:38.384036+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-10T12:34:21.281104+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"21","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"
hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"0f6cb3f2-3337-4851-ba13-f08c9574062c","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6802","nonce":3160210101},{"type":"v1","addr":"192.168.123.100:6803","nonce":3160210101}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6804","nonce":3160210101},{"type":"v1","addr":"192.168.123.100:6805","nonce":3160210101}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6808","nonce":3160210101},{"type":"v1","addr":"192.168.123.100:6809","nonce":3160210101}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6806","nonce":3160210101},{"type":"v1","addr":"192.168.123.100:6807","nonce":3160210101}]},"public_addr":"192.168.123.100:6803/3160210101","cluster_addr":"192.168.123.100:6805/3160210101","heartbeat_back_addr":"192.168.123.100:6809/3160210101","heartbeat_front_addr":"192.168.123.100:6807/3160210101","state":["exists","up"]},{"osd":1,"uuid":"bbe7c7dc-9287-4bf7-b4aa-5e00c56a2a7b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":25,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6810","nonce":4135684750},{"type":"v1","addr":"192.168.123.100:6811","nonce":4135684750}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.
123.100:6812","nonce":4135684750},{"type":"v1","addr":"192.168.123.100:6813","nonce":4135684750}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6816","nonce":4135684750},{"type":"v1","addr":"192.168.123.100:6817","nonce":4135684750}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6814","nonce":4135684750},{"type":"v1","addr":"192.168.123.100:6815","nonce":4135684750}]},"public_addr":"192.168.123.100:6811/4135684750","cluster_addr":"192.168.123.100:6813/4135684750","heartbeat_back_addr":"192.168.123.100:6817/4135684750","heartbeat_front_addr":"192.168.123.100:6815/4135684750","state":["exists","up"]},{"osd":2,"uuid":"3de6b811-dbac-419f-abf8-afd0bec7a47f","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6818","nonce":3405786659},{"type":"v1","addr":"192.168.123.100:6819","nonce":3405786659}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6820","nonce":3405786659},{"type":"v1","addr":"192.168.123.100:6821","nonce":3405786659}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6824","nonce":3405786659},{"type":"v1","addr":"192.168.123.100:6825","nonce":3405786659}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6822","nonce":3405786659},{"type":"v1","addr":"192.168.123.100:6823","nonce":3405786659}]},"public_addr":"192.168.123.100:6819/3405786659","cluster_addr":"192.168.123.100:6821/3405786659","heartbeat_back_addr":"192.168.123.100:6825/3405786659","heartbeat_front_addr":"192.168.123.100:6823/3405786659","state":["exists","up"]},{"osd":3,"uuid":"cd850d3a-e99e-4292-9600-f18ed81a7d18","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":24,"up_thru":28,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6800","nonce":6989773
51},{"type":"v1","addr":"192.168.123.107:6801","nonce":698977351}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6802","nonce":698977351},{"type":"v1","addr":"192.168.123.107:6803","nonce":698977351}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6806","nonce":698977351},{"type":"v1","addr":"192.168.123.107:6807","nonce":698977351}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6804","nonce":698977351},{"type":"v1","addr":"192.168.123.107:6805","nonce":698977351}]},"public_addr":"192.168.123.107:6801/698977351","cluster_addr":"192.168.123.107:6803/698977351","heartbeat_back_addr":"192.168.123.107:6807/698977351","heartbeat_front_addr":"192.168.123.107:6805/698977351","state":["exists","up"]},{"osd":4,"uuid":"ff03ddab-6945-46b4-b19b-30775ca85618","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":29,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6808","nonce":4106671248},{"type":"v1","addr":"192.168.123.107:6809","nonce":4106671248}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6810","nonce":4106671248},{"type":"v1","addr":"192.168.123.107:6811","nonce":4106671248}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6814","nonce":4106671248},{"type":"v1","addr":"192.168.123.107:6815","nonce":4106671248}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6812","nonce":4106671248},{"type":"v1","addr":"192.168.123.107:6813","nonce":4106671248}]},"public_addr":"192.168.123.107:6809/4106671248","cluster_addr":"192.168.123.107:6811/4106671248","heartbeat_back_addr":"192.168.123.107:6815/4106671248","heartbeat_front_addr":"192.168.123.107:6813/4106671248","state":["exists","up"]},{"osd":5,"uuid":"d91585d7-d879-48f1-8fdf-f6c88a82428a","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":33,"up_thr
u":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6816","nonce":4252230828},{"type":"v1","addr":"192.168.123.107:6817","nonce":4252230828}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6818","nonce":4252230828},{"type":"v1","addr":"192.168.123.107:6819","nonce":4252230828}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6822","nonce":4252230828},{"type":"v1","addr":"192.168.123.107:6823","nonce":4252230828}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6820","nonce":4252230828},{"type":"v1","addr":"192.168.123.107:6821","nonce":4252230828}]},"public_addr":"192.168.123.107:6817/4252230828","cluster_addr":"192.168.123.107:6819/4252230828","heartbeat_back_addr":"192.168.123.107:6823/4252230828","heartbeat_front_addr":"192.168.123.107:6821/4252230828","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:33:58.292397+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:34:07.673082+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:34:18.026191+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:34:27.370020+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:34:38.625936+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015
,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T12:34:47.790663+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.100:0/2792084710":"2026-03-11T12:33:23.075938+0000","192.168.123.100:0/69960775":"2026-03-11T12:33:23.075938+0000","192.168.123.100:0/1015166415":"2026-03-11T12:32:46.753519+0000","192.168.123.100:0/2753083811":"2026-03-11T12:32:46.753519+0000","192.168.123.100:6800/2":"2026-03-11T12:32:32.101116+0000","192.168.123.100:6801/2":"2026-03-11T12:32:32.101116+0000","192.168.123.100:0/4113305903":"2026-03-11T12:32:32.101116+0000","192.168.123.100:0/385023950":"2026-03-11T12:33:23.075938+0000","192.168.123.100:0/1487901880":"2026-03-11T12:32:32.101116+0000","192.168.123.100:0/3472231466":"2026-03-11T12:32:46.753519+0000","192.168.123.100:0/1442998252":"2026-03-11T12:32:32.101116+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-10T12:34:55.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.824+0000 7f079136b700 1 -- 192.168.123.100:0/1413614912 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f077806c680 msgr2=0x7f077806eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:55.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.824+0000 7f079136b700 1 --2- 192.168.123.100:0/1413614912 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f077806c680 0x7f077806eb30 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f078c1038d0 tx=0x7f0774008040 comp rx=0 tx=0).stop 2026-03-10T12:34:55.825 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.825+0000 7f079136b700 1 -- 192.168.123.100:0/1413614912 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f078c108790 msgr2=0x7f078c198940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:55.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.825+0000 7f079136b700 1 --2- 192.168.123.100:0/1413614912 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f078c108790 0x7f078c198940 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f077c009ad0 tx=0x7f077c0052e0 comp rx=0 tx=0).stop 2026-03-10T12:34:55.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.825+0000 7f079136b700 1 -- 192.168.123.100:0/1413614912 shutdown_connections 2026-03-10T12:34:55.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.825+0000 7f079136b700 1 --2- 192.168.123.100:0/1413614912 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f077806c680 0x7f077806eb30 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:55.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.825+0000 7f079136b700 1 --2- 192.168.123.100:0/1413614912 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f078c102790 0x7f078c198400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:55.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.825+0000 7f079136b700 1 --2- 192.168.123.100:0/1413614912 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f078c108790 0x7f078c198940 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:55.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.825+0000 7f079136b700 1 -- 192.168.123.100:0/1413614912 >> 192.168.123.100:0/1413614912 conn(0x7f078c0fe2b0 msgr2=0x7f078c0ff9c0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T12:34:55.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.825+0000 7f079136b700 1 -- 192.168.123.100:0/1413614912 shutdown_connections 2026-03-10T12:34:55.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:55.825+0000 7f079136b700 1 -- 192.168.123.100:0/1413614912 wait complete. 2026-03-10T12:34:55.880 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph tell osd.0 flush_pg_stats 2026-03-10T12:34:55.880 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph tell osd.1 flush_pg_stats 2026-03-10T12:34:55.880 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph tell osd.2 flush_pg_stats 2026-03-10T12:34:55.880 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph tell osd.3 flush_pg_stats 2026-03-10T12:34:55.881 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph tell osd.4 flush_pg_stats 2026-03-10T12:34:55.881 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph tell osd.5 flush_pg_stats 2026-03-10T12:34:56.305 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:56.307 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:56.308 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config 
/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:56.315 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:56.324 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:56.489 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:34:56.609 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:56 vm00 ceph-mon[50686]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:34:56.609 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:56 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/1413614912' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T12:34:56.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:56 vm07 ceph-mon[58582]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:34:56.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:56 vm07 ceph-mon[58582]: from='client.? 
192.168.123.100:0/1413614912' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T12:34:56.869 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.867+0000 7f163ff27700 1 -- 192.168.123.100:0/844719628 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f163810ac90 msgr2=0x7f163806d260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:56.869 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.867+0000 7f163ff27700 1 --2- 192.168.123.100:0/844719628 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f163810ac90 0x7f163806d260 secure :-1 s=READY pgs=210 cs=0 l=1 rev1=1 crypto rx=0x7f1634009b00 tx=0x7f1634009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:56.870 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.869+0000 7f163ff27700 1 -- 192.168.123.100:0/844719628 shutdown_connections 2026-03-10T12:34:56.870 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.869+0000 7f163ff27700 1 --2- 192.168.123.100:0/844719628 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f163806d830 0x7f163806dca0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:56.870 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.869+0000 7f163ff27700 1 --2- 192.168.123.100:0/844719628 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f163810ac90 0x7f163806d260 unknown :-1 s=CLOSED pgs=210 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:56.870 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.869+0000 7f163ff27700 1 -- 192.168.123.100:0/844719628 >> 192.168.123.100:0/844719628 conn(0x7f163806c830 msgr2=0x7f1638071830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:56.870 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.869+0000 7f163ff27700 1 -- 192.168.123.100:0/844719628 shutdown_connections 2026-03-10T12:34:56.870 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.869+0000 7f163ff27700 1 -- 192.168.123.100:0/844719628 wait complete. 2026-03-10T12:34:56.870 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.870+0000 7f163ff27700 1 Processor -- start 2026-03-10T12:34:56.870 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.870+0000 7f163ff27700 1 -- start start 2026-03-10T12:34:56.871 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.870+0000 7f163ff27700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f163806d830 0x7f16381a6800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:56.871 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.870+0000 7f163ff27700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16381a6d40 0x7f16381a0880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:56.871 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.870+0000 7f163ff27700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f16381a7300 con 0x7f16381a6d40 2026-03-10T12:34:56.871 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.870+0000 7f163ff27700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f16381a0dc0 con 0x7f163806d830 2026-03-10T12:34:56.871 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.870+0000 7f163d4c2700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16381a6d40 0x7f16381a0880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:56.871 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.870+0000 7f163d4c2700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16381a6d40 0x7f16381a0880 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:38958/0 (socket says 192.168.123.100:38958) 2026-03-10T12:34:56.871 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.870+0000 7f163d4c2700 1 -- 192.168.123.100:0/2412387330 learned_addr learned my addr 192.168.123.100:0/2412387330 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:56.871 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.870+0000 7f163dcc3700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f163806d830 0x7f16381a6800 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:56.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.871+0000 7f163d4c2700 1 -- 192.168.123.100:0/2412387330 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f163806d830 msgr2=0x7f16381a6800 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:56.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.871+0000 7f163d4c2700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f163806d830 0x7f16381a6800 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:56.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.871+0000 7f163d4c2700 1 -- 192.168.123.100:0/2412387330 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f16340097e0 con 0x7f16381a6d40 2026-03-10T12:34:56.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.871+0000 7f163d4c2700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16381a6d40 0x7f16381a0880 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto 
rx=0x7f162800eb10 tx=0x7f162800ee20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:56.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.872+0000 7f162effd700 1 -- 192.168.123.100:0/2412387330 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f162800cc40 con 0x7f16381a6d40 2026-03-10T12:34:56.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.872+0000 7f163ff27700 1 -- 192.168.123.100:0/2412387330 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f16381a10a0 con 0x7f16381a6d40 2026-03-10T12:34:56.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.872+0000 7f163ff27700 1 -- 192.168.123.100:0/2412387330 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f16381a15f0 con 0x7f16381a6d40 2026-03-10T12:34:56.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.872+0000 7f162effd700 1 -- 192.168.123.100:0/2412387330 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f162800cda0 con 0x7f16381a6d40 2026-03-10T12:34:56.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.872+0000 7f162effd700 1 -- 192.168.123.100:0/2412387330 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1628018810 con 0x7f16381a6d40 2026-03-10T12:34:56.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.873+0000 7f163ff27700 1 -- 192.168.123.100:0/2412387330 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f161c000ff0 con 0x7f16381a6d40 2026-03-10T12:34:56.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.873+0000 7f162effd700 1 -- 192.168.123.100:0/2412387330 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f1628018a50 con 
0x7f16381a6d40 2026-03-10T12:34:56.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.874+0000 7f162effd700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f162406c7a0 0x7f162406ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:56.876 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.874+0000 7f162effd700 1 -- 192.168.123.100:0/2412387330 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f1628014070 con 0x7f16381a6d40 2026-03-10T12:34:56.876 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.874+0000 7f163dcc3700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f162406c7a0 0x7f162406ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:56.876 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.874+0000 7f163dcc3700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f162406c7a0 0x7f162406ec50 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f163400b5c0 tx=0x7f16340051f0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:56.877 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.874+0000 7f162effd700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.107:6800/698977351,v1:192.168.123.107:6801/698977351] conn(0x7f1624072330 0x7f1624074740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:56.877 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.875+0000 7f162effd700 1 -- 192.168.123.100:0/2412387330 --> [v2:192.168.123.107:6800/698977351,v1:192.168.123.107:6801/698977351] -- command(tid 1: {"prefix": 
"get_command_descriptions"}) v1 -- 0x7f1624074df0 con 0x7f1624072330 2026-03-10T12:34:56.877 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.875+0000 7f162effd700 1 -- 192.168.123.100:0/2412387330 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_get_version_reply(handle=1 version=34) v2 ==== 24+0+0 (secure 0 0 0) 0x7f1628010480 con 0x7f16381a6d40 2026-03-10T12:34:56.877 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.875+0000 7f163e4c4700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.107:6800/698977351,v1:192.168.123.107:6801/698977351] conn(0x7f1624072330 0x7f1624074740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:56.877 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.876+0000 7f163e4c4700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.107:6800/698977351,v1:192.168.123.107:6801/698977351] conn(0x7f1624072330 0x7f1624074740 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.3 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:56.881 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.880+0000 7f162effd700 1 -- 192.168.123.100:0/2412387330 <== osd.3 v2:192.168.123.107:6800/698977351 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f1624074df0 con 0x7f1624072330 2026-03-10T12:34:56.965 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.930+0000 7f163ff27700 1 -- 192.168.123.100:0/2412387330 --> [v2:192.168.123.107:6800/698977351,v1:192.168.123.107:6801/698977351] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f161c002d60 con 0x7f1624072330 2026-03-10T12:34:56.965 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.953+0000 7f162effd700 1 -- 192.168.123.100:0/2412387330 <== osd.3 v2:192.168.123.107:6800/698977351 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f161c002d60 con 
0x7f1624072330 2026-03-10T12:34:56.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.982+0000 7f162cff9700 1 -- 192.168.123.100:0/2412387330 >> [v2:192.168.123.107:6800/698977351,v1:192.168.123.107:6801/698977351] conn(0x7f1624072330 msgr2=0x7f1624074740 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:56.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.982+0000 7f162cff9700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.107:6800/698977351,v1:192.168.123.107:6801/698977351] conn(0x7f1624072330 0x7f1624074740 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:56.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.983+0000 7f162cff9700 1 -- 192.168.123.100:0/2412387330 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f162406c7a0 msgr2=0x7f162406ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:56.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.983+0000 7f162cff9700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f162406c7a0 0x7f162406ec50 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f163400b5c0 tx=0x7f16340051f0 comp rx=0 tx=0).stop 2026-03-10T12:34:56.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.983+0000 7f162cff9700 1 -- 192.168.123.100:0/2412387330 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16381a6d40 msgr2=0x7f16381a0880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:56.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.983+0000 7f162cff9700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16381a6d40 0x7f16381a0880 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7f162800eb10 tx=0x7f162800ee20 comp rx=0 tx=0).stop 2026-03-10T12:34:56.987 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.987+0000 7f162cff9700 1 -- 192.168.123.100:0/2412387330 shutdown_connections 2026-03-10T12:34:56.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.987+0000 7f162cff9700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.107:6800/698977351,v1:192.168.123.107:6801/698977351] conn(0x7f1624072330 0x7f1624074740 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:56.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.987+0000 7f162cff9700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f162406c7a0 0x7f162406ec50 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:56.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.987+0000 7f162cff9700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f163806d830 0x7f16381a6800 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:56.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.987+0000 7f162cff9700 1 --2- 192.168.123.100:0/2412387330 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16381a6d40 0x7f16381a0880 unknown :-1 s=CLOSED pgs=211 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:56.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.987+0000 7f162cff9700 1 -- 192.168.123.100:0/2412387330 >> 192.168.123.100:0/2412387330 conn(0x7f163806c830 msgr2=0x7f163806fb50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:56.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.987+0000 7f162cff9700 1 -- 192.168.123.100:0/2412387330 shutdown_connections 2026-03-10T12:34:56.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:56.987+0000 7f162cff9700 1 -- 192.168.123.100:0/2412387330 wait 
complete. 2026-03-10T12:34:57.080 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.079+0000 7f80d8d68700 1 -- 192.168.123.100:0/3148621878 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80d410e9e0 msgr2=0x7f80d410edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.080 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.079+0000 7f80d8d68700 1 --2- 192.168.123.100:0/3148621878 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80d410e9e0 0x7f80d410edb0 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7f80c4009b50 tx=0x7f80c4009e60 comp rx=0 tx=0).stop 2026-03-10T12:34:57.082 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.081+0000 7f80d8d68700 1 -- 192.168.123.100:0/3148621878 shutdown_connections 2026-03-10T12:34:57.082 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.081+0000 7f80d8d68700 1 --2- 192.168.123.100:0/3148621878 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80d4071b60 0x7f80d4071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.082 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.081+0000 7f80d8d68700 1 --2- 192.168.123.100:0/3148621878 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80d410e9e0 0x7f80d410edb0 unknown :-1 s=CLOSED pgs=212 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.082 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.081+0000 7f80d8d68700 1 -- 192.168.123.100:0/3148621878 >> 192.168.123.100:0/3148621878 conn(0x7f80d406c6c0 msgr2=0x7f80d406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:57.082 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.082+0000 7f80d8d68700 1 -- 192.168.123.100:0/3148621878 shutdown_connections 2026-03-10T12:34:57.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.082+0000 7f80d8d68700 1 -- 
192.168.123.100:0/3148621878 wait complete. 2026-03-10T12:34:57.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.082+0000 7f80d8d68700 1 Processor -- start 2026-03-10T12:34:57.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.083+0000 7f80d8d68700 1 -- start start 2026-03-10T12:34:57.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.083+0000 7f80d8d68700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80d4071b60 0x7f80d4119620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.083+0000 7f80d8d68700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80d410e9e0 0x7f80d4114620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.083+0000 7f80d8d68700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80d4114bf0 con 0x7f80d410e9e0 2026-03-10T12:34:57.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.083+0000 7f80d8d68700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80d4114d30 con 0x7f80d4071b60 2026-03-10T12:34:57.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.083+0000 7f80d2ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80d410e9e0 0x7f80d4114620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.083+0000 7f80d2ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80d410e9e0 0x7f80d4114620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:38982/0 (socket says 192.168.123.100:38982) 2026-03-10T12:34:57.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.083+0000 7f80d2ffd700 1 -- 192.168.123.100:0/708035315 learned_addr learned my addr 192.168.123.100:0/708035315 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:57.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.083+0000 7f80d37fe700 1 --2- 192.168.123.100:0/708035315 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80d4071b60 0x7f80d4119620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.084+0000 7f80d2ffd700 1 -- 192.168.123.100:0/708035315 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80d4071b60 msgr2=0x7f80d4119620 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.084+0000 7f80d2ffd700 1 --2- 192.168.123.100:0/708035315 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80d4071b60 0x7f80d4119620 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.084+0000 7f80d2ffd700 1 -- 192.168.123.100:0/708035315 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f80c40097e0 con 0x7f80d410e9e0 2026-03-10T12:34:57.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.084+0000 7f80d2ffd700 1 --2- 192.168.123.100:0/708035315 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80d410e9e0 0x7f80d4114620 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7f80c800ba70 tx=0x7f80c800bd80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:57.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.084+0000 7f80d0ff9700 1 -- 192.168.123.100:0/708035315 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80c800c700 con 0x7f80d410e9e0 2026-03-10T12:34:57.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.084+0000 7f80d8d68700 1 -- 192.168.123.100:0/708035315 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f80d4115010 con 0x7f80d410e9e0 2026-03-10T12:34:57.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.084+0000 7f80d8d68700 1 -- 192.168.123.100:0/708035315 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f80d41b7b00 con 0x7f80d410e9e0 2026-03-10T12:34:57.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.084+0000 7f80d0ff9700 1 -- 192.168.123.100:0/708035315 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f80c800cd40 con 0x7f80d410e9e0 2026-03-10T12:34:57.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.084+0000 7f80d0ff9700 1 -- 192.168.123.100:0/708035315 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80c8012340 con 0x7f80d410e9e0 2026-03-10T12:34:57.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.085+0000 7f80d8d68700 1 -- 192.168.123.100:0/708035315 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f80c0000ff0 con 0x7f80d410e9e0 2026-03-10T12:34:57.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.087+0000 7f80d0ff9700 1 -- 192.168.123.100:0/708035315 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f80c800c860 con 0x7f80d410e9e0 2026-03-10T12:34:57.088 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.087+0000 7f80d0ff9700 1 --2- 192.168.123.100:0/708035315 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f80bc06c690 0x7f80bc06eb40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.088+0000 7f80d0ff9700 1 -- 192.168.123.100:0/708035315 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f80c808b310 con 0x7f80d410e9e0 2026-03-10T12:34:57.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.088+0000 7f80d37fe700 1 --2- 192.168.123.100:0/708035315 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f80bc06c690 0x7f80bc06eb40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.088+0000 7f80d0ff9700 1 --2- 192.168.123.100:0/708035315 >> [v2:192.168.123.107:6816/4252230828,v1:192.168.123.107:6817/4252230828] conn(0x7f80bc072220 0x7f80bc074630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.088+0000 7f80d0ff9700 1 -- 192.168.123.100:0/708035315 --> [v2:192.168.123.107:6816/4252230828,v1:192.168.123.107:6817/4252230828] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f80bc074ce0 con 0x7f80bc072220 2026-03-10T12:34:57.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.088+0000 7f80d0ff9700 1 -- 192.168.123.100:0/708035315 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_get_version_reply(handle=1 version=34) v2 ==== 24+0+0 (secure 0 0 0) 0x7f80c808b6d0 con 0x7f80d410e9e0 2026-03-10T12:34:57.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.088+0000 7f80d37fe700 1 
--2- 192.168.123.100:0/708035315 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f80bc06c690 0x7f80bc06eb40 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f80c400b5c0 tx=0x7f80c4005fd0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:57.091 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.088+0000 7f80d3fff700 1 --2- 192.168.123.100:0/708035315 >> [v2:192.168.123.107:6816/4252230828,v1:192.168.123.107:6817/4252230828] conn(0x7f80bc072220 0x7f80bc074630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.091 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.091+0000 7f80d3fff700 1 --2- 192.168.123.100:0/708035315 >> [v2:192.168.123.107:6816/4252230828,v1:192.168.123.107:6817/4252230828] conn(0x7f80bc072220 0x7f80bc074630 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.5 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:57.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.098+0000 7f80d0ff9700 1 -- 192.168.123.100:0/708035315 <== osd.5 v2:192.168.123.107:6816/4252230828 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f80bc074ce0 con 0x7f80bc072220 2026-03-10T12:34:57.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.109+0000 7fca4ebde700 1 -- 192.168.123.100:0/3228483723 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fca4810e9e0 msgr2=0x7fca4810edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.109+0000 7fca4ebde700 1 --2- 192.168.123.100:0/3228483723 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fca4810e9e0 0x7fca4810edb0 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7fca44009b50 tx=0x7fca44009e60 comp rx=0 
tx=0).stop 2026-03-10T12:34:57.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.110+0000 7fca4ebde700 1 -- 192.168.123.100:0/3228483723 shutdown_connections 2026-03-10T12:34:57.112 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.110+0000 7fca4ebde700 1 --2- 192.168.123.100:0/3228483723 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fca48071b60 0x7fca48071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.112 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.110+0000 7fca4ebde700 1 --2- 192.168.123.100:0/3228483723 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fca4810e9e0 0x7fca4810edb0 unknown :-1 s=CLOSED pgs=214 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.112 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.110+0000 7fca4ebde700 1 -- 192.168.123.100:0/3228483723 >> 192.168.123.100:0/3228483723 conn(0x7fca4806c6c0 msgr2=0x7fca4806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:57.112 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.112+0000 7fca4ebde700 1 -- 192.168.123.100:0/3228483723 shutdown_connections 2026-03-10T12:34:57.113 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.112+0000 7fca4ebde700 1 -- 192.168.123.100:0/3228483723 wait complete. 
2026-03-10T12:34:57.113 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.112+0000 7fca4ebde700 1 Processor -- start 2026-03-10T12:34:57.115 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.112+0000 7fca4ebde700 1 -- start start 2026-03-10T12:34:57.115 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.112+0000 7fca4ebde700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fca48071b60 0x7fca481195b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.115 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.112+0000 7fca4ebde700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fca481145b0 0x7fca48114a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.115 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.112+0000 7fca4ebde700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca48114f60 con 0x7fca481145b0 2026-03-10T12:34:57.115 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.112+0000 7fca4ebde700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca481150d0 con 0x7fca48071b60 2026-03-10T12:34:57.115 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.115+0000 7fca4d3db700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fca481145b0 0x7fca48114a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.115 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.115+0000 7fca4d3db700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fca481145b0 0x7fca48114a20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:39006/0 (socket says 192.168.123.100:39006) 2026-03-10T12:34:57.115 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.115+0000 7fca4d3db700 1 -- 192.168.123.100:0/445870784 learned_addr learned my addr 192.168.123.100:0/445870784 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:57.116 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.116+0000 7fca4d3db700 1 -- 192.168.123.100:0/445870784 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fca48071b60 msgr2=0x7fca481195b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:34:57.116 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.116+0000 7fca4d3db700 1 --2- 192.168.123.100:0/445870784 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fca48071b60 0x7fca481195b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.116 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.116+0000 7fca4d3db700 1 -- 192.168.123.100:0/445870784 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fca440097e0 con 0x7fca481145b0 2026-03-10T12:34:57.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.119+0000 7fca4d3db700 1 --2- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fca481145b0 0x7fca48114a20 secure :-1 s=READY pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7fca3800ed70 tx=0x7fca3800c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:57.127 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.126+0000 7fca3effd700 1 -- 192.168.123.100:0/445870784 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca38009980 con 0x7fca481145b0 2026-03-10T12:34:57.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.129+0000 7fca4ebde700 1 -- 
192.168.123.100:0/445870784 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fca48115350 con 0x7fca481145b0 2026-03-10T12:34:57.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.129+0000 7fca4ebde700 1 -- 192.168.123.100:0/445870784 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fca481a5c80 con 0x7fca481145b0 2026-03-10T12:34:57.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.132+0000 7fca3effd700 1 -- 192.168.123.100:0/445870784 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fca3800cd70 con 0x7fca481145b0 2026-03-10T12:34:57.136 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.136+0000 7fca3effd700 1 -- 192.168.123.100:0/445870784 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca380189c0 con 0x7fca481145b0 2026-03-10T12:34:57.144 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.144+0000 7fca3effd700 1 -- 192.168.123.100:0/445870784 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fca38018be0 con 0x7fca481145b0 2026-03-10T12:34:57.148 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.148+0000 7fca3effd700 1 --2- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fca3406c870 0x7fca3406ed20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.148 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.148+0000 7fca4dbdc700 1 --2- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fca3406c870 0x7fca3406ed20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.153 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.151+0000 7fca3cff9700 1 -- 192.168.123.100:0/445870784 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7fca2c000ff0 con 0x7fca481145b0 2026-03-10T12:34:57.154 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.154+0000 7fca4dbdc700 1 --2- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fca3406c870 0x7fca3406ed20 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7fca44009b20 tx=0x7fca44005bc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:57.154 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.154+0000 7fca3effd700 1 -- 192.168.123.100:0/445870784 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fca38014070 con 0x7fca481145b0 2026-03-10T12:34:57.154 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.154+0000 7fca3effd700 1 --2- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750] conn(0x7fca34072400 0x7fca34074810 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.155 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.154+0000 7fca3effd700 1 -- 192.168.123.100:0/445870784 --> [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7fca34074ec0 con 0x7fca34072400 2026-03-10T12:34:57.156 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.155+0000 7fca4e3dd700 1 --2- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750] conn(0x7fca34072400 0x7fca34074810 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T12:34:57.156 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.155+0000 7fca3effd700 1 -- 192.168.123.100:0/445870784 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_get_version_reply(handle=1 version=34) v2 ==== 24+0+0 (secure 0 0 0) 0x7fca38010480 con 0x7fca481145b0 2026-03-10T12:34:57.156 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.156+0000 7fca4e3dd700 1 --2- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750] conn(0x7fca34072400 0x7fca34074810 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:57.156 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.156+0000 7f80d8d68700 1 -- 192.168.123.100:0/708035315 --> [v2:192.168.123.107:6816/4252230828,v1:192.168.123.107:6817/4252230828] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f80c0002ce0 con 0x7f80bc072220 2026-03-10T12:34:57.157 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.156+0000 7fca3effd700 1 -- 192.168.123.100:0/445870784 <== osd.1 v2:192.168.123.100:6810/4135684750 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7fca34074ec0 con 0x7fca34072400 2026-03-10T12:34:57.157 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.156+0000 7f80d0ff9700 1 -- 192.168.123.100:0/708035315 <== osd.5 v2:192.168.123.107:6816/4252230828 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f80c0002ce0 con 0x7f80bc072220 2026-03-10T12:34:57.158 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.158+0000 7f80d8d68700 1 -- 192.168.123.100:0/708035315 >> [v2:192.168.123.107:6816/4252230828,v1:192.168.123.107:6817/4252230828] conn(0x7f80bc072220 msgr2=0x7f80bc074630 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.159 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.158+0000 7f80d8d68700 1 --2- 
192.168.123.100:0/708035315 >> [v2:192.168.123.107:6816/4252230828,v1:192.168.123.107:6817/4252230828] conn(0x7f80bc072220 0x7f80bc074630 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.159 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.158+0000 7f80d8d68700 1 -- 192.168.123.100:0/708035315 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f80bc06c690 msgr2=0x7f80bc06eb40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.159 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.158+0000 7f80d8d68700 1 --2- 192.168.123.100:0/708035315 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f80bc06c690 0x7f80bc06eb40 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f80c400b5c0 tx=0x7f80c4005fd0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.159+0000 7fb54b59e700 1 -- 192.168.123.100:0/1387290229 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb54c10e9e0 msgr2=0x7fb54c10edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.159+0000 7f80d8d68700 1 -- 192.168.123.100:0/708035315 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80d410e9e0 msgr2=0x7f80d4114620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.159+0000 7f80d8d68700 1 --2- 192.168.123.100:0/708035315 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80d410e9e0 0x7f80d4114620 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7f80c800ba70 tx=0x7f80c800bd80 comp rx=0 tx=0).stop 2026-03-10T12:34:57.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.160+0000 7f80d8d68700 1 -- 192.168.123.100:0/708035315 shutdown_connections 2026-03-10T12:34:57.160 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.160+0000 7f80d8d68700 1 --2- 192.168.123.100:0/708035315 >> [v2:192.168.123.107:6816/4252230828,v1:192.168.123.107:6817/4252230828] conn(0x7f80bc072220 0x7f80bc074630 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.160+0000 7f80d8d68700 1 --2- 192.168.123.100:0/708035315 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f80bc06c690 0x7f80bc06eb40 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.160+0000 7f80d8d68700 1 --2- 192.168.123.100:0/708035315 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80d4071b60 0x7f80d4119620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.160+0000 7f80d8d68700 1 --2- 192.168.123.100:0/708035315 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80d410e9e0 0x7f80d4114620 unknown :-1 s=CLOSED pgs=213 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.160+0000 7f80d8d68700 1 -- 192.168.123.100:0/708035315 >> 192.168.123.100:0/708035315 conn(0x7f80d406c6c0 msgr2=0x7f80d406cf90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.159+0000 7fb54b59e700 1 --2- 192.168.123.100:0/1387290229 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb54c10e9e0 0x7fb54c10edb0 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7fb53c009b50 tx=0x7fb53c009e60 comp rx=0 tx=0).stop 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.165+0000 7fb54b59e700 1 -- 192.168.123.100:0/1387290229 
shutdown_connections 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.165+0000 7fb54b59e700 1 --2- 192.168.123.100:0/1387290229 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb54c071b60 0x7fb54c071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.165+0000 7fb54b59e700 1 --2- 192.168.123.100:0/1387290229 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb54c10e9e0 0x7fb54c10edb0 unknown :-1 s=CLOSED pgs=216 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.165+0000 7fb54b59e700 1 -- 192.168.123.100:0/1387290229 >> 192.168.123.100:0/1387290229 conn(0x7fb54c06c6c0 msgr2=0x7fb54c06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.165+0000 7fb54b59e700 1 -- 192.168.123.100:0/1387290229 shutdown_connections 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.165+0000 7fb54b59e700 1 -- 192.168.123.100:0/1387290229 wait complete. 
2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.165+0000 7fb54b59e700 1 Processor -- start 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.165+0000 7fb54b59e700 1 -- start start 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.166+0000 7fb54b59e700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb54c071b60 0x7fb54c1195b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.166+0000 7fb54b59e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb54c1145b0 0x7fb54c114a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.166+0000 7fb54b59e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb54c114f60 con 0x7fb54c1145b0 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.166+0000 7fb54b59e700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb54c1150d0 con 0x7fb54c071b60 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.166+0000 7fb549d9b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb54c1145b0 0x7fb54c114a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.166+0000 7fb549d9b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb54c1145b0 0x7fb54c114a20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:39024/0 (socket says 192.168.123.100:39024) 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.166+0000 7fb549d9b700 1 -- 192.168.123.100:0/273775388 learned_addr learned my addr 192.168.123.100:0/273775388 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.166+0000 7fb549d9b700 1 -- 192.168.123.100:0/273775388 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb54c071b60 msgr2=0x7fb54c1195b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.166+0000 7fb549d9b700 1 --2- 192.168.123.100:0/273775388 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb54c071b60 0x7fb54c1195b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.166+0000 7fb549d9b700 1 -- 192.168.123.100:0/273775388 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb53c0097e0 con 0x7fb54c1145b0 2026-03-10T12:34:57.169 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.166+0000 7fb549d9b700 1 --2- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb54c1145b0 0x7fb54c114a20 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7fb54000ed70 tx=0x7fb54000c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:57.169 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.169+0000 7fb53b7fe700 1 -- 192.168.123.100:0/273775388 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb540009980 con 0x7fb54c1145b0 2026-03-10T12:34:57.169 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.169+0000 7fb54b59e700 1 -- 
192.168.123.100:0/273775388 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb54c115350 con 0x7fb54c1145b0 2026-03-10T12:34:57.169 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.169+0000 7fb54b59e700 1 -- 192.168.123.100:0/273775388 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb54c1b7ca0 con 0x7fb54c1145b0 2026-03-10T12:34:57.171 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.164+0000 7f80d8d68700 1 -- 192.168.123.100:0/708035315 shutdown_connections 2026-03-10T12:34:57.171 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.164+0000 7f80d8d68700 1 -- 192.168.123.100:0/708035315 wait complete. 2026-03-10T12:34:57.182 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.179+0000 7fb54b59e700 1 -- 192.168.123.100:0/273775388 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7fb54c11c670 con 0x7fb54c1145b0 2026-03-10T12:34:57.182 INFO:teuthology.orchestra.run.vm00.stdout:103079215111 2026-03-10T12:34:57.183 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd last-stat-seq osd.3 2026-03-10T12:34:57.198 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.191+0000 7fb53b7fe700 1 -- 192.168.123.100:0/273775388 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb54000cd70 con 0x7fb54c1145b0 2026-03-10T12:34:57.198 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.191+0000 7fb53b7fe700 1 -- 192.168.123.100:0/273775388 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb5400189c0 con 0x7fb54c1145b0 2026-03-10T12:34:57.198 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.191+0000 7fb53b7fe700 1 -- 192.168.123.100:0/273775388 <== mon.0 
v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb540018be0 con 0x7fb54c1145b0 2026-03-10T12:34:57.198 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.191+0000 7fb53b7fe700 1 --2- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb53406c870 0x7fb53406ed20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.198 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.191+0000 7fb53b7fe700 1 -- 192.168.123.100:0/273775388 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fb540014070 con 0x7fb54c1145b0 2026-03-10T12:34:57.198 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.191+0000 7fb53b7fe700 1 --2- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659] conn(0x7fb534072400 0x7fb534074810 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.198 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.191+0000 7fb53b7fe700 1 -- 192.168.123.100:0/273775388 --> [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7fb534074ec0 con 0x7fb534072400 2026-03-10T12:34:57.198 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.191+0000 7fb53b7fe700 1 -- 192.168.123.100:0/273775388 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_get_version_reply(handle=1 version=34) v2 ==== 24+0+0 (secure 0 0 0) 0x7fb54008dfc0 con 0x7fb54c1145b0 2026-03-10T12:34:57.198 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.194+0000 7fb54ad9d700 1 --2- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659] conn(0x7fb534072400 0x7fb534074810 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.198 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.195+0000 7fb54a59c700 1 --2- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb53406c870 0x7fb53406ed20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.198 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.195+0000 7fb54ad9d700 1 --2- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659] conn(0x7fb534072400 0x7fb534074810 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.2 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:57.198 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.195+0000 7fb54a59c700 1 --2- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb53406c870 0x7fb53406ed20 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fb53c009b20 tx=0x7fb53c005bc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:57.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.200+0000 7fca3cff9700 1 -- 192.168.123.100:0/445870784 --> [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7fca2c002ce0 con 0x7fca34072400 2026-03-10T12:34:57.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.202+0000 7fca3effd700 1 -- 192.168.123.100:0/445870784 <== osd.1 v2:192.168.123.100:6810/4135684750 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7fca2c002ce0 con 0x7fca34072400 2026-03-10T12:34:57.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.200+0000 7fb53b7fe700 1 -- 192.168.123.100:0/273775388 <== osd.2 
v2:192.168.123.100:6818/3405786659 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7fb534074ec0 con 0x7fb534072400 2026-03-10T12:34:57.207 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.205+0000 7fca4ebde700 1 -- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750] conn(0x7fca34072400 msgr2=0x7fca34074810 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.207 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.205+0000 7fca4ebde700 1 --2- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750] conn(0x7fca34072400 0x7fca34074810 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.207 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.205+0000 7fca4ebde700 1 -- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fca3406c870 msgr2=0x7fca3406ed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.207 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.205+0000 7fca4ebde700 1 --2- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fca3406c870 0x7fca3406ed20 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7fca44009b20 tx=0x7fca44005bc0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.207 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.205+0000 7fca4ebde700 1 -- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fca481145b0 msgr2=0x7fca48114a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.207 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.205+0000 7fca4ebde700 1 --2- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fca481145b0 0x7fca48114a20 secure :-1 s=READY pgs=215 cs=0 l=1 rev1=1 crypto 
rx=0x7fca3800ed70 tx=0x7fca3800c5b0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.210 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.210+0000 7fca4ebde700 1 -- 192.168.123.100:0/445870784 shutdown_connections 2026-03-10T12:34:57.211 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.210+0000 7fca4ebde700 1 --2- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:6810/4135684750,v1:192.168.123.100:6811/4135684750] conn(0x7fca34072400 0x7fca34074810 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.211 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.210+0000 7fca4ebde700 1 --2- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fca3406c870 0x7fca3406ed20 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.211 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.210+0000 7fca4ebde700 1 --2- 192.168.123.100:0/445870784 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fca48071b60 0x7fca481195b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.211 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.210+0000 7fca4ebde700 1 --2- 192.168.123.100:0/445870784 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fca481145b0 0x7fca48114a20 unknown :-1 s=CLOSED pgs=215 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.211 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.210+0000 7fca4ebde700 1 -- 192.168.123.100:0/445870784 >> 192.168.123.100:0/445870784 conn(0x7fca4806c6c0 msgr2=0x7fca4806f5b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:57.211 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.210+0000 7fca4ebde700 1 -- 192.168.123.100:0/445870784 shutdown_connections 2026-03-10T12:34:57.211 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.210+0000 7fca4ebde700 1 -- 192.168.123.100:0/445870784 wait complete. 2026-03-10T12:34:57.274 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.272+0000 7f790c117700 1 -- 192.168.123.100:0/2682796049 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7904071e40 msgr2=0x7f79040722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.274 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.272+0000 7f790c117700 1 --2- 192.168.123.100:0/2682796049 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7904071e40 0x7f79040722b0 secure :-1 s=READY pgs=218 cs=0 l=1 rev1=1 crypto rx=0x7f78fc009230 tx=0x7f78fc009260 comp rx=0 tx=0).stop 2026-03-10T12:34:57.274 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.272+0000 7f790c117700 1 -- 192.168.123.100:0/2682796049 shutdown_connections 2026-03-10T12:34:57.274 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.273+0000 7f790c117700 1 --2- 192.168.123.100:0/2682796049 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7904071e40 0x7f79040722b0 unknown :-1 s=CLOSED pgs=218 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.274 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.273+0000 7f790c117700 1 --2- 192.168.123.100:0/2682796049 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f790410c8b0 0x7f790410cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.274 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.273+0000 7f790c117700 1 -- 192.168.123.100:0/2682796049 >> 192.168.123.100:0/2682796049 conn(0x7f790406c6c0 msgr2=0x7f790406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.273+0000 7f790c117700 1 -- 192.168.123.100:0/2682796049 shutdown_connections 
2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.273+0000 7f790c117700 1 -- 192.168.123.100:0/2682796049 wait complete. 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.274+0000 7f790c117700 1 Processor -- start 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.274+0000 7f790c117700 1 -- start start 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.274+0000 7f790c117700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f790410c8b0 0x7f790407cef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.274+0000 7f790c117700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f790407d430 0x7f790407d8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.274+0000 7f790c117700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7904081a70 con 0x7f790407d430 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.274+0000 7f790c117700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7904081be0 con 0x7f790410c8b0 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.274+0000 7f79096b2700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f790407d430 0x7f790407d8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.274+0000 7f79096b2700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f790407d430 
0x7f790407d8a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:39052/0 (socket says 192.168.123.100:39052) 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.274+0000 7f79096b2700 1 -- 192.168.123.100:0/4125194302 learned_addr learned my addr 192.168.123.100:0/4125194302 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.275+0000 7f7909eb3700 1 --2- 192.168.123.100:0/4125194302 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f790410c8b0 0x7f790407cef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.275+0000 7f79096b2700 1 -- 192.168.123.100:0/4125194302 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f790410c8b0 msgr2=0x7f790407cef0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.275+0000 7f79096b2700 1 --2- 192.168.123.100:0/4125194302 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f790410c8b0 0x7f790407cef0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.275+0000 7f79096b2700 1 -- 192.168.123.100:0/4125194302 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f78fc008ee0 con 0x7f790407d430 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.275+0000 7f79096b2700 1 --2- 192.168.123.100:0/4125194302 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f790407d430 0x7f790407d8a0 secure :-1 s=READY 
pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7f78fc00b9e0 tx=0x7f78fc008e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.275+0000 7f78faffd700 1 -- 192.168.123.100:0/4125194302 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f78fc022ae0 con 0x7f790407d430 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.275+0000 7f790c117700 1 -- 192.168.123.100:0/4125194302 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7904081e60 con 0x7f790407d430 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.275+0000 7f790c117700 1 -- 192.168.123.100:0/4125194302 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7904082350 con 0x7f790407d430 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.276+0000 7f78faffd700 1 -- 192.168.123.100:0/4125194302 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f78fc007dd0 con 0x7f790407d430 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.276+0000 7f78faffd700 1 -- 192.168.123.100:0/4125194302 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f78fc0050d0 con 0x7f790407d430 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.277+0000 7f78faffd700 1 -- 192.168.123.100:0/4125194302 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f78fc028030 con 0x7f790407d430 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.277+0000 7f78faffd700 1 --2- 192.168.123.100:0/4125194302 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f78f006c7a0 0x7f78f006ec50 
unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.277+0000 7f7909eb3700 1 --2- 192.168.123.100:0/4125194302 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f78f006c7a0 0x7f78f006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.277+0000 7f78faffd700 1 -- 192.168.123.100:0/4125194302 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f78fc094ff0 con 0x7f790407d430 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.278+0000 7f7909eb3700 1 --2- 192.168.123.100:0/4125194302 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f78f006c7a0 0x7f78f006ec50 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f79000089d0 tx=0x7f790000f040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.278+0000 7f790c117700 1 --2- 192.168.123.100:0/4125194302 >> [v2:192.168.123.107:6808/4106671248,v1:192.168.123.107:6809/4106671248] conn(0x7f78e8001610 0x7f78e8003ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.278+0000 7f790c117700 1 -- 192.168.123.100:0/4125194302 --> [v2:192.168.123.107:6808/4106671248,v1:192.168.123.107:6809/4106671248] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f78e8006bf0 con 0x7f78e8001610 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.278+0000 7f790a6b4700 1 --2- 192.168.123.100:0/4125194302 >> 
[v2:192.168.123.107:6808/4106671248,v1:192.168.123.107:6809/4106671248] conn(0x7f78e8001610 0x7f78e8003ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.278+0000 7f790a6b4700 1 --2- 192.168.123.100:0/4125194302 >> [v2:192.168.123.107:6808/4106671248,v1:192.168.123.107:6809/4106671248] conn(0x7f78e8001610 0x7f78e8003ac0 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.4 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:57.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.279+0000 7f78faffd700 1 -- 192.168.123.100:0/4125194302 <== osd.4 v2:192.168.123.107:6808/4106671248 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f78e8006bf0 con 0x7f78e8001610 2026-03-10T12:34:57.287 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.285+0000 7fb54b59e700 1 -- 192.168.123.100:0/273775388 --> [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7fb54c04f2a0 con 0x7fb534072400 2026-03-10T12:34:57.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.296+0000 7f790c117700 1 -- 192.168.123.100:0/4125194302 --> [v2:192.168.123.107:6808/4106671248,v1:192.168.123.107:6809/4106671248] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f78e8005cd0 con 0x7f78e8001610 2026-03-10T12:34:57.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.292+0000 7fb53b7fe700 1 -- 192.168.123.100:0/273775388 <== osd.2 v2:192.168.123.100:6818/3405786659 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7fb54c04f2a0 con 0x7fb534072400 2026-03-10T12:34:57.299 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.297+0000 7f78faffd700 1 -- 192.168.123.100:0/4125194302 <== osd.4 v2:192.168.123.107:6808/4106671248 
2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f78e8005cd0 con 0x7f78e8001610 2026-03-10T12:34:57.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.299+0000 7fb54b59e700 1 -- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659] conn(0x7fb534072400 msgr2=0x7fb534074810 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.299+0000 7fb54b59e700 1 --2- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659] conn(0x7fb534072400 0x7fb534074810 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.301+0000 7fb54b59e700 1 -- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb53406c870 msgr2=0x7fb53406ed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.301+0000 7fb54b59e700 1 --2- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb53406c870 0x7fb53406ed20 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fb53c009b20 tx=0x7fb53c005bc0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.301+0000 7fb54b59e700 1 -- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb54c1145b0 msgr2=0x7fb54c114a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.301+0000 7fb54b59e700 1 --2- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb54c1145b0 0x7fb54c114a20 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7fb54000ed70 tx=0x7fb54000c5b0 comp rx=0 
tx=0).stop 2026-03-10T12:34:57.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.301+0000 7fb54b59e700 1 -- 192.168.123.100:0/273775388 shutdown_connections 2026-03-10T12:34:57.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.301+0000 7fb54b59e700 1 --2- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:6818/3405786659,v1:192.168.123.100:6819/3405786659] conn(0x7fb534072400 0x7fb534074810 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.301+0000 7fb54b59e700 1 --2- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb53406c870 0x7fb53406ed20 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.301+0000 7fb54b59e700 1 --2- 192.168.123.100:0/273775388 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb54c071b60 0x7fb54c1195b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.301+0000 7fb54b59e700 1 --2- 192.168.123.100:0/273775388 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb54c1145b0 0x7fb54c114a20 unknown :-1 s=CLOSED pgs=217 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.301+0000 7fb54b59e700 1 -- 192.168.123.100:0/273775388 >> 192.168.123.100:0/273775388 conn(0x7fb54c06c6c0 msgr2=0x7fb54c06f5b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:57.303 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.303+0000 7fb54b59e700 1 -- 192.168.123.100:0/273775388 shutdown_connections 2026-03-10T12:34:57.303 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.303+0000 7fb54b59e700 1 -- 
192.168.123.100:0/273775388 wait complete. 2026-03-10T12:34:57.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.312+0000 7f790c117700 1 -- 192.168.123.100:0/4125194302 >> [v2:192.168.123.107:6808/4106671248,v1:192.168.123.107:6809/4106671248] conn(0x7f78e8001610 msgr2=0x7f78e8003ac0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.312+0000 7f790c117700 1 --2- 192.168.123.100:0/4125194302 >> [v2:192.168.123.107:6808/4106671248,v1:192.168.123.107:6809/4106671248] conn(0x7f78e8001610 0x7f78e8003ac0 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.312+0000 7f790c117700 1 -- 192.168.123.100:0/4125194302 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f78f006c7a0 msgr2=0x7f78f006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.312+0000 7f790c117700 1 --2- 192.168.123.100:0/4125194302 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f78f006c7a0 0x7f78f006ec50 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f79000089d0 tx=0x7f790000f040 comp rx=0 tx=0).stop 2026-03-10T12:34:57.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.312+0000 7f790c117700 1 -- 192.168.123.100:0/4125194302 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f790407d430 msgr2=0x7f790407d8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.312+0000 7f790c117700 1 --2- 192.168.123.100:0/4125194302 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f790407d430 0x7f790407d8a0 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7f78fc00b9e0 tx=0x7f78fc008e70 comp rx=0 tx=0).stop 2026-03-10T12:34:57.315 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.312+0000 7f790c117700 1 -- 192.168.123.100:0/4125194302 shutdown_connections 2026-03-10T12:34:57.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.312+0000 7f790c117700 1 --2- 192.168.123.100:0/4125194302 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f78f006c7a0 0x7f78f006ec50 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.312+0000 7f790c117700 1 --2- 192.168.123.100:0/4125194302 >> [v2:192.168.123.107:6808/4106671248,v1:192.168.123.107:6809/4106671248] conn(0x7f78e8001610 0x7f78e8003ac0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.312+0000 7f790c117700 1 --2- 192.168.123.100:0/4125194302 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f790410c8b0 0x7f790407cef0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.312+0000 7f790c117700 1 --2- 192.168.123.100:0/4125194302 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f790407d430 0x7f790407d8a0 unknown :-1 s=CLOSED pgs=219 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.312+0000 7f790c117700 1 -- 192.168.123.100:0/4125194302 >> 192.168.123.100:0/4125194302 conn(0x7f790406c6c0 msgr2=0x7f790406fff0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:57.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.312+0000 7f790c117700 1 -- 192.168.123.100:0/4125194302 shutdown_connections 2026-03-10T12:34:57.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.312+0000 7f790c117700 1 -- 192.168.123.100:0/4125194302 wait 
complete. 2026-03-10T12:34:57.315 INFO:teuthology.orchestra.run.vm00.stdout:141733920771 2026-03-10T12:34:57.316 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd last-stat-seq osd.5 2026-03-10T12:34:57.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.341+0000 7f57e7e06700 1 -- 192.168.123.100:0/2272763138 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57e0071b60 msgr2=0x7f57e0071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.341+0000 7f57e7e06700 1 --2- 192.168.123.100:0/2272763138 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57e0071b60 0x7f57e0071fd0 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f57d0009b00 tx=0x7f57d0009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:57.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.341+0000 7f57e7e06700 1 -- 192.168.123.100:0/2272763138 shutdown_connections 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.341+0000 7f57e7e06700 1 --2- 192.168.123.100:0/2272763138 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57e0071b60 0x7f57e0071fd0 unknown :-1 s=CLOSED pgs=220 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.341+0000 7f57e7e06700 1 --2- 192.168.123.100:0/2272763138 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57e010eab0 0x7f57e010ee80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.341+0000 7f57e7e06700 1 -- 192.168.123.100:0/2272763138 >> 192.168.123.100:0/2272763138 conn(0x7f57e006c6c0 msgr2=0x7f57e006cac0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.343+0000 7f57e7e06700 1 -- 192.168.123.100:0/2272763138 shutdown_connections 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.343+0000 7f57e7e06700 1 -- 192.168.123.100:0/2272763138 wait complete. 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.343+0000 7f57e7e06700 1 Processor -- start 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.343+0000 7f57e7e06700 1 -- start start 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.343+0000 7f57e7e06700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57e010eab0 0x7f57e0118670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.343+0000 7f57e7e06700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57e0113670 0x7f57e0113ae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.343+0000 7f57e7e06700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f57e0114020 con 0x7f57e0113670 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.343+0000 7f57e7e06700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f57e0114190 con 0x7f57e010eab0 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.345+0000 7f57e5ba2700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57e010eab0 0x7f57e0118670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.349 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.345+0000 7f57e5ba2700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57e010eab0 0x7f57e0118670 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:38516/0 (socket says 192.168.123.100:38516) 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.345+0000 7f57e5ba2700 1 -- 192.168.123.100:0/2862518742 learned_addr learned my addr 192.168.123.100:0/2862518742 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.345+0000 7f57e53a1700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57e0113670 0x7f57e0113ae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.345+0000 7f57e5ba2700 1 -- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57e0113670 msgr2=0x7f57e0113ae0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.345+0000 7f57e5ba2700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57e0113670 0x7f57e0113ae0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.345+0000 7f57e5ba2700 1 -- 192.168.123.100:0/2862518742 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f57d00097e0 con 0x7f57e010eab0 2026-03-10T12:34:57.349 INFO:teuthology.orchestra.run.vm00.stdout:55834574859 
2026-03-10T12:34:57.349 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd last-stat-seq osd.1 2026-03-10T12:34:57.351 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.349+0000 7f57e5ba2700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57e010eab0 0x7f57e0118670 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f57dc00eb10 tx=0x7f57dc00eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:57.351 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.349+0000 7f57d6ffd700 1 -- 192.168.123.100:0/2862518742 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f57dc00cca0 con 0x7f57e010eab0 2026-03-10T12:34:57.351 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.349+0000 7f57e7e06700 1 -- 192.168.123.100:0/2862518742 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f57e0114470 con 0x7f57e010eab0 2026-03-10T12:34:57.351 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.349+0000 7f57e7e06700 1 -- 192.168.123.100:0/2862518742 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f57e01b7cc0 con 0x7f57e010eab0 2026-03-10T12:34:57.351 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.350+0000 7f57d6ffd700 1 -- 192.168.123.100:0/2862518742 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f57dc00ce00 con 0x7f57e010eab0 2026-03-10T12:34:57.351 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.350+0000 7f57d6ffd700 1 -- 192.168.123.100:0/2862518742 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f57dc0189c0 con 0x7f57e010eab0 2026-03-10T12:34:57.359 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.353+0000 7f57d6ffd700 1 -- 192.168.123.100:0/2862518742 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f57dc018b20 con 0x7f57e010eab0 2026-03-10T12:34:57.359 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.353+0000 7f57d6ffd700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f57cc06c7a0 0x7f57cc06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.359 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.354+0000 7f57e53a1700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f57cc06c7a0 0x7f57cc06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.359 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.354+0000 7f57d6ffd700 1 -- 192.168.123.100:0/2862518742 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f57dc014070 con 0x7f57e010eab0 2026-03-10T12:34:57.359 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.355+0000 7f57e53a1700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f57cc06c7a0 0x7f57cc06ec50 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f57d000b5c0 tx=0x7f57d0011040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:57.359 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.355+0000 7f57d4ff9700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101] conn(0x7f57c4001610 0x7f57c4003ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:57.359 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.356+0000 7f57e63a3700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101] conn(0x7f57c4001610 0x7f57c4003ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:57.359 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.356+0000 7f57d4ff9700 1 -- 192.168.123.100:0/2862518742 --> [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f57c4006bf0 con 0x7f57c4001610 2026-03-10T12:34:57.370 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.359+0000 7f57e63a3700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101] conn(0x7f57c4001610 0x7f57c4003ac0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:57.370 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.360+0000 7f57d6ffd700 1 -- 192.168.123.100:0/2862518742 <== osd.0 v2:192.168.123.100:6802/3160210101 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f57c4006bf0 con 0x7f57c4001610 2026-03-10T12:34:57.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.400+0000 7f57d4ff9700 1 -- 192.168.123.100:0/2862518742 --> [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f57c4005cd0 con 0x7f57c4001610 2026-03-10T12:34:57.404 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.403+0000 7f57d6ffd700 1 -- 192.168.123.100:0/2862518742 <== osd.0 v2:192.168.123.100:6802/3160210101 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f57c4005cd0 con 0x7f57c4001610 2026-03-10T12:34:57.404 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.403+0000 7f57e7e06700 1 -- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101] conn(0x7f57c4001610 msgr2=0x7f57c4003ac0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.404 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.404+0000 7f57e7e06700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101] conn(0x7f57c4001610 0x7f57c4003ac0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.404 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.404+0000 7f57e7e06700 1 -- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f57cc06c7a0 msgr2=0x7f57cc06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.404 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.404+0000 7f57e7e06700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f57cc06c7a0 0x7f57cc06ec50 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f57d000b5c0 tx=0x7f57d0011040 comp rx=0 tx=0).stop 2026-03-10T12:34:57.404 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.404+0000 7f57e7e06700 1 -- 192.168.123.100:0/2862518742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57e010eab0 msgr2=0x7f57e0118670 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:57.405 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.404+0000 7f57e7e06700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57e010eab0 0x7f57e0118670 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f57dc00eb10 tx=0x7f57dc00eed0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.405 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.404+0000 
7f57e7e06700 1 -- 192.168.123.100:0/2862518742 shutdown_connections 2026-03-10T12:34:57.405 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.404+0000 7f57e7e06700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:6802/3160210101,v1:192.168.123.100:6803/3160210101] conn(0x7f57c4001610 0x7f57c4003ac0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.405 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.405+0000 7f57e7e06700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f57cc06c7a0 0x7f57cc06ec50 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.405 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.405+0000 7f57e7e06700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f57e010eab0 0x7f57e0118670 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.406 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.405+0000 7f57e7e06700 1 --2- 192.168.123.100:0/2862518742 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f57e0113670 0x7f57e0113ae0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:57.406 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.405+0000 7f57e7e06700 1 -- 192.168.123.100:0/2862518742 >> 192.168.123.100:0/2862518742 conn(0x7f57e006c6c0 msgr2=0x7f57e0070050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:57.406 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.405+0000 7f57e7e06700 1 -- 192.168.123.100:0/2862518742 shutdown_connections 2026-03-10T12:34:57.406 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:57.405+0000 7f57e7e06700 1 -- 192.168.123.100:0/2862518742 wait complete. 
2026-03-10T12:34:57.469 INFO:teuthology.orchestra.run.vm00.stdout:73014444041
2026-03-10T12:34:57.469 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd last-stat-seq osd.2
2026-03-10T12:34:57.496 INFO:teuthology.orchestra.run.vm00.stdout:124554051590
2026-03-10T12:34:57.496 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd last-stat-seq osd.4
2026-03-10T12:34:57.499 INFO:teuthology.orchestra.run.vm00.stdout:38654705677
2026-03-10T12:34:57.499 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd last-stat-seq osd.0
2026-03-10T12:34:57.652 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:34:57.835 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:34:57.985 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:34:58.009 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:34:58.147 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:34:58.244 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.242+0000 7fb44a8de700 1 -- 192.168.123.100:0/1649391929 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb444071e40 msgr2=0x7fb4440722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:34:58.244 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.242+0000 7fb44a8de700 1 --2- 192.168.123.100:0/1649391929 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb444071e40 0x7fb4440722b0 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7fb43c00d3f0 tx=0x7fb43c00d700 comp rx=0 tx=0).stop
2026-03-10T12:34:58.244 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.243+0000 7fb44a8de700 1 -- 192.168.123.100:0/1649391929 shutdown_connections
2026-03-10T12:34:58.244 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.243+0000 7fb44a8de700 1 --2- 192.168.123.100:0/1649391929 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb444071e40 0x7fb4440722b0 unknown :-1 s=CLOSED pgs=221 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.244 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.243+0000 7fb44a8de700 1 --2- 192.168.123.100:0/1649391929 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb44410c8f0 0x7fb44410ccc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.243+0000 7fb44a8de700 1 -- 192.168.123.100:0/1649391929 >> 192.168.123.100:0/1649391929 conn(0x7fb44406c6c0 msgr2=0x7fb44406cac0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:34:58.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.244+0000 7fb44a8de700 1 -- 192.168.123.100:0/1649391929 shutdown_connections
2026-03-10T12:34:58.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.244+0000 7fb44a8de700 1 -- 192.168.123.100:0/1649391929 wait complete.
2026-03-10T12:34:58.246 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.244+0000 7fb44a8de700 1 Processor -- start
2026-03-10T12:34:58.246 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.244+0000 7fb44a8de700 1 -- start start
2026-03-10T12:34:58.246 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.245+0000 7fb44a8de700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb44410c8f0 0x7fb44407ceb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:34:58.246 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.245+0000 7fb44a8de700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb44407d3f0 0x7fb44407d860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:34:58.246 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.245+0000 7fb44a8de700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb444081a30 con 0x7fb44410c8f0
2026-03-10T12:34:58.246 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.245+0000 7fb44a8de700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb444081ba0 con 0x7fb44407d3f0
2026-03-10T12:34:58.246 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.245+0000 7fb4490db700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb44407d3f0 0x7fb44407d860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:34:58.246 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.245+0000 7fb4490db700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb44407d3f0 0x7fb44407d860 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:48624/0 (socket says 192.168.123.100:48624)
2026-03-10T12:34:58.247 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.245+0000 7fb4490db700 1 -- 192.168.123.100:0/2919413526 learned_addr learned my addr 192.168.123.100:0/2919413526 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:34:58.247 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.245+0000 7fb4498dc700 1 --2- 192.168.123.100:0/2919413526 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb44410c8f0 0x7fb44407ceb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:34:58.247 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.245+0000 7fb4490db700 1 -- 192.168.123.100:0/2919413526 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb44410c8f0 msgr2=0x7fb44407ceb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:34:58.247 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.245+0000 7fb4490db700 1 --2- 192.168.123.100:0/2919413526 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb44410c8f0 0x7fb44407ceb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.247 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.245+0000 7fb4490db700 1 -- 192.168.123.100:0/2919413526 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb43c007ed0 con 0x7fb44407d3f0
2026-03-10T12:34:58.247 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.246+0000 7fb4490db700 1 --2- 192.168.123.100:0/2919413526 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb44407d3f0 0x7fb44407d860 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fb43c003c60 tx=0x7fb43c003c90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:34:58.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.247+0000 7fb43affd700 1 -- 192.168.123.100:0/2919413526 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb43c01c070 con 0x7fb44407d3f0
2026-03-10T12:34:58.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.247+0000 7fb44a8de700 1 -- 192.168.123.100:0/2919413526 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb444081e20 con 0x7fb44407d3f0
2026-03-10T12:34:58.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.247+0000 7fb44a8de700 1 -- 192.168.123.100:0/2919413526 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb444082310 con 0x7fb44407d3f0
2026-03-10T12:34:58.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.248+0000 7fb43affd700 1 -- 192.168.123.100:0/2919413526 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb43c00fb40 con 0x7fb44407d3f0
2026-03-10T12:34:58.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.248+0000 7fb43affd700 1 -- 192.168.123.100:0/2919413526 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb43c017c40 con 0x7fb44407d3f0
2026-03-10T12:34:58.250 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.249+0000 7fb43affd700 1 -- 192.168.123.100:0/2919413526 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb43c02a430 con 0x7fb44407d3f0
2026-03-10T12:34:58.250 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.250+0000 7fb43affd700 1 --2- 192.168.123.100:0/2919413526 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb43006c7a0 0x7fb43006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:34:58.250 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.250+0000 7fb43affd700 1 -- 192.168.123.100:0/2919413526 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fb43c013070 con 0x7fb44407d3f0
2026-03-10T12:34:58.251 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.250+0000 7fb4498dc700 1 --2- 192.168.123.100:0/2919413526 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb43006c7a0 0x7fb43006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:34:58.251 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.251+0000 7fb4498dc700 1 --2- 192.168.123.100:0/2919413526 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb43006c7a0 0x7fb43006ec50 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fb44000d440 tx=0x7fb440008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:34:58.251 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.251+0000 7fb44a8de700 1 -- 192.168.123.100:0/2919413526 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb428005320 con 0x7fb44407d3f0
2026-03-10T12:34:58.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.259+0000 7fb43affd700 1 -- 192.168.123.100:0/2919413526 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb43c058be0 con 0x7fb44407d3f0
2026-03-10T12:34:58.389 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:34:58.463 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:58 vm00 ceph-mon[50686]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-10T12:34:58.473 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.467+0000 7f140b4aa700 1 -- 192.168.123.100:0/2293700049 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1404071e40 msgr2=0x7f14040722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:34:58.473 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.467+0000 7f140b4aa700 1 --2- 192.168.123.100:0/2293700049 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1404071e40 0x7f14040722b0 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f1400007780 tx=0x7f1400007a90 comp rx=0 tx=0).stop
2026-03-10T12:34:58.473 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.468+0000 7f140b4aa700 1 -- 192.168.123.100:0/2293700049 shutdown_connections
2026-03-10T12:34:58.473 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.468+0000 7f140b4aa700 1 --2- 192.168.123.100:0/2293700049 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1404071e40 0x7f14040722b0 unknown :-1 s=CLOSED pgs=222 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.473 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.468+0000 7f140b4aa700 1 --2- 192.168.123.100:0/2293700049 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f140410c8b0 0x7f140410cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.473 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.468+0000 7f140b4aa700 1 -- 192.168.123.100:0/2293700049 >> 192.168.123.100:0/2293700049 conn(0x7f140406c6c0 msgr2=0x7f140406cac0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:34:58.473 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.468+0000 7f140b4aa700 1 -- 192.168.123.100:0/2293700049 shutdown_connections
2026-03-10T12:34:58.474 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.468+0000 7f140b4aa700 1 -- 192.168.123.100:0/2293700049 wait complete.
2026-03-10T12:34:58.474 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.468+0000 7f140b4aa700 1 Processor -- start
2026-03-10T12:34:58.474 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.468+0000 7f140b4aa700 1 -- start start
2026-03-10T12:34:58.474 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.468+0000 7f140b4aa700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f140410c8b0 0x7f140407ce50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:34:58.474 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.468+0000 7f140b4aa700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f140407d390 0x7f140407d800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:34:58.474 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.468+0000 7f140b4aa700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f14040819d0 con 0x7f140410c8b0
2026-03-10T12:34:58.474 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.468+0000 7f140b4aa700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1404081b40 con 0x7f140407d390
2026-03-10T12:34:58.474 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.470+0000 7f1409246700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f140410c8b0 0x7f140407ce50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:34:58.474 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.470+0000 7f1409246700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f140410c8b0 0x7f140407ce50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33340/0 (socket says 192.168.123.100:33340)
2026-03-10T12:34:58.474 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.470+0000 7f1409246700 1 -- 192.168.123.100:0/2693312242 learned_addr learned my addr 192.168.123.100:0/2693312242 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:34:58.474 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.470+0000 7f1408a45700 1 --2- 192.168.123.100:0/2693312242 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f140407d390 0x7f140407d800 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:34:58.475 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.474+0000 7f1409246700 1 -- 192.168.123.100:0/2693312242 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f140407d390 msgr2=0x7f140407d800 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:34:58.475 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.474+0000 7f1409246700 1 --2- 192.168.123.100:0/2693312242 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f140407d390 0x7f140407d800 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.475 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.474+0000 7f1409246700 1 -- 192.168.123.100:0/2693312242 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1400007430 con 0x7f140410c8b0
2026-03-10T12:34:58.475 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.474+0000 7f1409246700 1 --2- 192.168.123.100:0/2693312242 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f140410c8b0 0x7f140407ce50 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7f13fc00c390 tx=0x7f13fc00c6a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:34:58.475 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.475+0000 7f13fa7fc700 1 -- 192.168.123.100:0/2693312242 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f13fc00e030 con 0x7f140410c8b0
2026-03-10T12:34:58.478 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.475+0000 7f140b4aa700 1 -- 192.168.123.100:0/2693312242 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1404081e20 con 0x7f140410c8b0
2026-03-10T12:34:58.478 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.475+0000 7f140b4aa700 1 -- 192.168.123.100:0/2693312242 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1404082370 con 0x7f140410c8b0
2026-03-10T12:34:58.478 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.477+0000 7f13fa7fc700 1 -- 192.168.123.100:0/2693312242 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f13fc00f040 con 0x7f140410c8b0
2026-03-10T12:34:58.478 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.477+0000 7f13fa7fc700 1 -- 192.168.123.100:0/2693312242 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f13fc014690 con 0x7f140410c8b0
2026-03-10T12:34:58.478 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.478+0000 7f13fa7fc700 1 -- 192.168.123.100:0/2693312242 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f13fc0147f0 con 0x7f140410c8b0
2026-03-10T12:34:58.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.484+0000 7f13fa7fc700 1 --2- 192.168.123.100:0/2693312242 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f13f006c7a0 0x7f13f006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:34:58.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.485+0000 7f13fa7fc700 1 -- 192.168.123.100:0/2693312242 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f13fc08c570 con 0x7f140410c8b0
2026-03-10T12:34:58.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.485+0000 7f13effff700 1 -- 192.168.123.100:0/2693312242 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f140404f2a0 con 0x7f140410c8b0
2026-03-10T12:34:58.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.485+0000 7f1408a45700 1 --2- 192.168.123.100:0/2693312242 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f13f006c7a0 0x7f13f006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:34:58.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.485+0000 7f1408a45700 1 --2- 192.168.123.100:0/2693312242 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f13f006c7a0 0x7f13f006ec50 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f1400007e60 tx=0x7f14000058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:34:58.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.488+0000 7f13fa7fc700 1 -- 192.168.123.100:0/2693312242 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f13fc0570b0 con 0x7f140410c8b0
2026-03-10T12:34:58.556 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.555+0000 7fb44a8de700 1 -- 192.168.123.100:0/2919413526 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 3} v 0) v1 -- 0x7fb428005190 con 0x7fb44407d3f0
2026-03-10T12:34:58.558 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.556+0000 7fb43affd700 1 -- 192.168.123.100:0/2919413526 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 3}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7fb43c025090 con 0x7fb44407d3f0
2026-03-10T12:34:58.560 INFO:teuthology.orchestra.run.vm00.stdout:103079215111
2026-03-10T12:34:58.561 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.560+0000 7fb44a8de700 1 -- 192.168.123.100:0/2919413526 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb43006c7a0 msgr2=0x7fb43006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:34:58.561 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.560+0000 7fb44a8de700 1 --2- 192.168.123.100:0/2919413526 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb43006c7a0 0x7fb43006ec50 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fb44000d440 tx=0x7fb440008040 comp rx=0 tx=0).stop
2026-03-10T12:34:58.561 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.560+0000 7fb44a8de700 1 -- 192.168.123.100:0/2919413526 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb44407d3f0 msgr2=0x7fb44407d860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:34:58.561 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.560+0000 7fb44a8de700 1 --2- 192.168.123.100:0/2919413526 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb44407d3f0 0x7fb44407d860 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fb43c003c60 tx=0x7fb43c003c90 comp rx=0 tx=0).stop
2026-03-10T12:34:58.561 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.560+0000 7fb44a8de700 1 -- 192.168.123.100:0/2919413526 shutdown_connections
2026-03-10T12:34:58.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.576+0000 7fb44a8de700 1 --2- 192.168.123.100:0/2919413526 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb43006c7a0 0x7fb43006ec50 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.576+0000 7fb44a8de700 1 --2- 192.168.123.100:0/2919413526 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb44410c8f0 0x7fb44407ceb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.576+0000 7fb44a8de700 1 --2- 192.168.123.100:0/2919413526 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb44407d3f0 0x7fb44407d860 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.576+0000 7fb44a8de700 1 -- 192.168.123.100:0/2919413526 >> 192.168.123.100:0/2919413526 conn(0x7fb44406c6c0 msgr2=0x7fb44406ff90 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:34:58.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.579+0000 7fb44a8de700 1 -- 192.168.123.100:0/2919413526 shutdown_connections
2026-03-10T12:34:58.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.579+0000 7fb44a8de700 1 -- 192.168.123.100:0/2919413526 wait complete.
2026-03-10T12:34:58.737 INFO:tasks.cephadm.ceph_manager.ceph:need seq 103079215111 got 103079215111 for osd.3
2026-03-10T12:34:58.737 DEBUG:teuthology.parallel:result is None
2026-03-10T12:34:58.778 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:58 vm07 ceph-mon[58582]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-10T12:34:58.814 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.812+0000 7f9ceffff700 1 -- 192.168.123.100:0/832638100 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9cf01024e0 msgr2=0x7f9cf0102950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:34:58.814 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.812+0000 7f9ceffff700 1 --2- 192.168.123.100:0/832638100 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9cf01024e0 0x7f9cf0102950 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7f9ce0009b00 tx=0x7f9ce0009e10 comp rx=0 tx=0).stop
2026-03-10T12:34:58.816 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.815+0000 7f9ceffff700 1 -- 192.168.123.100:0/832638100 shutdown_connections
2026-03-10T12:34:58.816 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.815+0000 7f9ceffff700 1 --2- 192.168.123.100:0/832638100 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9cf01024e0 0x7f9cf0102950 unknown :-1 s=CLOSED pgs=224 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.816 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.815+0000 7f9ceffff700 1 --2- 192.168.123.100:0/832638100 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cf01084e0 0x7f9cf01088b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.816 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.815+0000 7f9ceffff700 1 -- 192.168.123.100:0/832638100 >> 192.168.123.100:0/832638100 conn(0x7f9cf00fe000 msgr2=0x7f9cf0100410 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:34:58.817 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.816+0000 7f9ceffff700 1 -- 192.168.123.100:0/832638100 shutdown_connections
2026-03-10T12:34:58.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.821+0000 7f9ceffff700 1 -- 192.168.123.100:0/832638100 wait complete.
2026-03-10T12:34:58.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.821+0000 7f9ceffff700 1 Processor -- start
2026-03-10T12:34:58.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.826+0000 7f9ceffff700 1 -- start start
2026-03-10T12:34:58.832 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.829+0000 7f9ceffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9cf01024e0 0x7f9cf019ccc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:34:58.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.831+0000 7f9ceffff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cf01084e0 0x7f9cf0078270 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:34:58.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.831+0000 7f9ceffff700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9cf019d350 con 0x7f9cf01024e0
2026-03-10T12:34:58.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.831+0000 7f9ceffff700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9cf019d4c0 con 0x7f9cf01084e0
2026-03-10T12:34:58.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.832+0000 7f9cee7fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cf01084e0 0x7f9cf0078270 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:34:58.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.832+0000 7f9cee7fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cf01084e0 0x7f9cf0078270 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:48656/0 (socket says 192.168.123.100:48656)
2026-03-10T12:34:58.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.832+0000 7f9cee7fc700 1 -- 192.168.123.100:0/620039444 learned_addr learned my addr 192.168.123.100:0/620039444 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:34:58.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.833+0000 7f9cee7fc700 1 -- 192.168.123.100:0/620039444 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9cf01024e0 msgr2=0x7f9cf019ccc0 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-10T12:34:58.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.834+0000 7f9ceeffd700 1 --2- 192.168.123.100:0/620039444 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9cf01024e0 0x7f9cf019ccc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:34:58.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.834+0000 7f9cee7fc700 1 --2- 192.168.123.100:0/620039444 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9cf01024e0 0x7f9cf019ccc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.834+0000 7f9cee7fc700 1 -- 192.168.123.100:0/620039444 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9ce00097e0 con 0x7f9cf01084e0
2026-03-10T12:34:58.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.834+0000 7f9ceeffd700 1 --2- 192.168.123.100:0/620039444 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9cf01024e0 0x7f9cf019ccc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-10T12:34:58.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.834+0000 7f9cee7fc700 1 --2- 192.168.123.100:0/620039444 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cf01084e0 0x7f9cf0078270 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f9ce0004900 tx=0x7f9ce00049e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:34:58.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.834+0000 7f9cf4966700 1 -- 192.168.123.100:0/620039444 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ce001d070 con 0x7f9cf01084e0
2026-03-10T12:34:58.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.835+0000 7f9cf4966700 1 -- 192.168.123.100:0/620039444 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9ce000bc50 con 0x7f9cf01084e0
2026-03-10T12:34:58.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.835+0000 7f9ceffff700 1 -- 192.168.123.100:0/620039444 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9cf00787b0 con 0x7f9cf01084e0
2026-03-10T12:34:58.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.835+0000 7f9cf4966700 1 -- 192.168.123.100:0/620039444 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ce000f870 con 0x7f9cf01084e0
2026-03-10T12:34:58.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.835+0000 7f9ceffff700 1 -- 192.168.123.100:0/620039444 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9cf0078d00 con 0x7f9cf01084e0
2026-03-10T12:34:58.839 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.836+0000 7f9cf4966700 1 -- 192.168.123.100:0/620039444 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f9ce000f9d0 con 0x7f9cf01084e0
2026-03-10T12:34:58.839 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.836+0000 7f9cf4966700 1 --2- 192.168.123.100:0/620039444 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9cdc06c680 0x7f9cdc06eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:34:58.839 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.836+0000 7f9cf4966700 1 -- 192.168.123.100:0/620039444 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f9ce008dd00 con 0x7f9cf01084e0
2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stdout:141733920770
2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.844+0000 7f13effff700 1 -- 192.168.123.100:0/2693312242 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 5} v 0) v1 -- 0x7f140404ea50 con 0x7f140410c8b0
2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.844+0000 7f13fa7fc700 1 -- 192.168.123.100:0/2693312242 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 5}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f13fc05a6d0 con 0x7f140410c8b0
2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.847+0000 7f13effff700 1 -- 192.168.123.100:0/2693312242 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f13f006c7a0 msgr2=0x7f13f006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.847+0000 7f13effff700 1 --2- 192.168.123.100:0/2693312242 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f13f006c7a0 0x7f13f006ec50 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f1400007e60 tx=0x7f14000058e0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.847+0000 7f13effff700 1 -- 192.168.123.100:0/2693312242 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f140410c8b0 msgr2=0x7f140407ce50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.847+0000 7f13effff700 1 --2- 192.168.123.100:0/2693312242 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f140410c8b0 0x7f140407ce50 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7f13fc00c390 tx=0x7f13fc00c6a0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.847+0000 7f13effff700 1 -- 192.168.123.100:0/2693312242 shutdown_connections
2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.847+0000 7f13effff700 1 --2- 192.168.123.100:0/2693312242 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f13f006c7a0 0x7f13f006ec50 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.847+0000 7f13effff700 1 --2- 192.168.123.100:0/2693312242 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f140410c8b0 0x7f140407ce50 unknown :-1 s=CLOSED pgs=223 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.847+0000 7f13effff700 1 --2- 192.168.123.100:0/2693312242 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f140407d390 0x7f140407d800
unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.847+0000 7f13effff700 1 -- 192.168.123.100:0/2693312242 >> 192.168.123.100:0/2693312242 conn(0x7f140406c6c0 msgr2=0x7f1404070850 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.847+0000 7f13effff700 1 -- 192.168.123.100:0/2693312242 shutdown_connections 2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.847+0000 7f13effff700 1 -- 192.168.123.100:0/2693312242 wait complete. 2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.845+0000 7f9ceeffd700 1 --2- 192.168.123.100:0/620039444 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9cdc06c680 0x7f9cdc06eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.845+0000 7f9ceffff700 1 -- 192.168.123.100:0/620039444 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9cd0005320 con 0x7f9cf01084e0 2026-03-10T12:34:58.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.848+0000 7f9ceeffd700 1 --2- 192.168.123.100:0/620039444 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9cdc06c680 0x7f9cdc06eb30 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f9cd8006fd0 tx=0x7f9cd8008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:58.852 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.850+0000 7f9cf4966700 1 -- 192.168.123.100:0/620039444 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+177933 (secure 0 0 0) 0x7f9ce0027080 con 0x7f9cf01084e0 2026-03-10T12:34:58.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.855+0000 7fc0a53b2700 1 -- 192.168.123.100:0/4190352733 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0a0071b60 msgr2=0x7fc0a0071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:58.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.855+0000 7fc0a53b2700 1 --2- 192.168.123.100:0/4190352733 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0a0071b60 0x7fc0a0071fd0 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7fc094009ab0 tx=0x7fc094009dc0 comp rx=0 tx=0).stop 2026-03-10T12:34:58.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.856+0000 7fc0a53b2700 1 -- 192.168.123.100:0/4190352733 shutdown_connections 2026-03-10T12:34:58.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.856+0000 7fc0a53b2700 1 --2- 192.168.123.100:0/4190352733 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0a0071b60 0x7fc0a0071fd0 unknown :-1 s=CLOSED pgs=225 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:58.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.856+0000 7fc0a53b2700 1 --2- 192.168.123.100:0/4190352733 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0a010e9e0 0x7fc0a010edb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:58.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.856+0000 7fc0a53b2700 1 -- 192.168.123.100:0/4190352733 >> 192.168.123.100:0/4190352733 conn(0x7fc0a006c6c0 msgr2=0x7fc0a006cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:58.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.856+0000 7fc0a53b2700 1 -- 192.168.123.100:0/4190352733 shutdown_connections 2026-03-10T12:34:58.857 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.856+0000 7fc0a53b2700 1 -- 192.168.123.100:0/4190352733 wait complete. 2026-03-10T12:34:58.858 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.857+0000 7fc0a53b2700 1 Processor -- start 2026-03-10T12:34:58.859 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.858+0000 7fc0a53b2700 1 -- start start 2026-03-10T12:34:58.861 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.858+0000 7fc0a53b2700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0a010e9e0 0x7fc0a0119540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:58.861 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.858+0000 7fc0a53b2700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0a0114540 0x7fc0a01149b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:58.861 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.858+0000 7fc0a53b2700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0a0114ef0 con 0x7fc0a010e9e0 2026-03-10T12:34:58.861 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.858+0000 7fc0a53b2700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0a0115060 con 0x7fc0a0114540 2026-03-10T12:34:58.862 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.861+0000 7fc09ffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0a010e9e0 0x7fc0a0119540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:58.866 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.861+0000 7fc09ffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0a010e9e0 0x7fc0a0119540 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33362/0 (socket says 192.168.123.100:33362) 2026-03-10T12:34:58.866 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.861+0000 7fc09ffff700 1 -- 192.168.123.100:0/3425715771 learned_addr learned my addr 192.168.123.100:0/3425715771 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:58.866 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.864+0000 7fc09ffff700 1 -- 192.168.123.100:0/3425715771 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0a0114540 msgr2=0x7fc0a01149b0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T12:34:58.866 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.864+0000 7fc09ffff700 1 --2- 192.168.123.100:0/3425715771 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0a0114540 0x7fc0a01149b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:58.866 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.864+0000 7fc09ffff700 1 -- 192.168.123.100:0/3425715771 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc094009710 con 0x7fc0a010e9e0 2026-03-10T12:34:58.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.864+0000 7fc09ffff700 1 --2- 192.168.123.100:0/3425715771 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0a010e9e0 0x7fc0a0119540 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7fc09000ec90 tx=0x7fc09000c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:58.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.866+0000 7fc09d7fa700 1 -- 192.168.123.100:0/3425715771 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc09000cbc0 con 
0x7fc0a010e9e0 2026-03-10T12:34:58.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.867+0000 7fc0a53b2700 1 -- 192.168.123.100:0/3425715771 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc0a0115340 con 0x7fc0a010e9e0 2026-03-10T12:34:58.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.867+0000 7fc0a53b2700 1 -- 192.168.123.100:0/3425715771 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc0a01b7c50 con 0x7fc0a010e9e0 2026-03-10T12:34:58.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.868+0000 7fc0a53b2700 1 -- 192.168.123.100:0/3425715771 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc0a004f2a0 con 0x7fc0a010e9e0 2026-03-10T12:34:58.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.872+0000 7fc09d7fa700 1 -- 192.168.123.100:0/3425715771 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc09000cd20 con 0x7fc0a010e9e0 2026-03-10T12:34:58.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.873+0000 7fc09d7fa700 1 -- 192.168.123.100:0/3425715771 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc090010640 con 0x7fc0a010e9e0 2026-03-10T12:34:58.878 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.877+0000 7fc09d7fa700 1 -- 192.168.123.100:0/3425715771 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fc090010820 con 0x7fc0a010e9e0 2026-03-10T12:34:58.881 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.880+0000 7fc09d7fa700 1 --2- 192.168.123.100:0/3425715771 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc08806c870 0x7fc08806ed20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:58.889 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.886+0000 7fc09f7fe700 1 --2- 192.168.123.100:0/3425715771 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc08806c870 0x7fc08806ed20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:58.889 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.887+0000 7fc09f7fe700 1 --2- 192.168.123.100:0/3425715771 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc08806c870 0x7fc08806ed20 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fc094000c00 tx=0x7fc094003800 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:58.889 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.887+0000 7fc09d7fa700 1 -- 192.168.123.100:0/3425715771 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fc090014070 con 0x7fc0a010e9e0 2026-03-10T12:34:58.889 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.887+0000 7fc09d7fa700 1 -- 192.168.123.100:0/3425715771 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc09008edc0 con 0x7fc0a010e9e0 2026-03-10T12:34:58.930 INFO:tasks.cephadm.ceph_manager.ceph:need seq 141733920771 got 141733920770 for osd.5 2026-03-10T12:34:58.983 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.977+0000 7fc846eba700 1 -- 192.168.123.100:0/1279895073 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc83809d650 msgr2=0x7fc83809da20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:58.984 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.977+0000 7fc846eba700 1 --2- 192.168.123.100:0/1279895073 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc83809d650 
0x7fc83809da20 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7fc83c009b00 tx=0x7fc83c009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:58.990 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.990+0000 7fc846eba700 1 -- 192.168.123.100:0/1279895073 shutdown_connections 2026-03-10T12:34:58.990 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.990+0000 7fc846eba700 1 --2- 192.168.123.100:0/1279895073 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc838097650 0x7fc838097ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:58.990 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.990+0000 7fc846eba700 1 --2- 192.168.123.100:0/1279895073 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc83809d650 0x7fc83809da20 unknown :-1 s=CLOSED pgs=227 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:58.990 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.990+0000 7fc846eba700 1 -- 192.168.123.100:0/1279895073 >> 192.168.123.100:0/1279895073 conn(0x7fc838093150 msgr2=0x7fc838095560 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:58.990 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.990+0000 7fc846eba700 1 -- 192.168.123.100:0/1279895073 shutdown_connections 2026-03-10T12:34:58.990 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.990+0000 7fc846eba700 1 -- 192.168.123.100:0/1279895073 wait complete. 
2026-03-10T12:34:58.991 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.991+0000 7fc846eba700 1 Processor -- start 2026-03-10T12:34:58.991 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.991+0000 7fc846eba700 1 -- start start 2026-03-10T12:34:58.991 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.991+0000 7fc846eba700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc838097650 0x7fc83812d220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:58.992 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.991+0000 7fc844c56700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc838097650 0x7fc83812d220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:58.992 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.992+0000 7fc844c56700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc838097650 0x7fc83812d220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33378/0 (socket says 192.168.123.100:33378) 2026-03-10T12:34:58.992 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.992+0000 7fc844c56700 1 -- 192.168.123.100:0/3122328170 learned_addr learned my addr 192.168.123.100:0/3122328170 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:58.992 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.992+0000 7fc846eba700 1 --2- 192.168.123.100:0/3122328170 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc83809d650 0x7fc83812d760 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:58.992 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.992+0000 7fc846eba700 1 -- 192.168.123.100:0/3122328170 
--> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc83812de40 con 0x7fc838097650 2026-03-10T12:34:58.992 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.992+0000 7fc846eba700 1 -- 192.168.123.100:0/3122328170 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc838131bd0 con 0x7fc83809d650 2026-03-10T12:34:58.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.992+0000 7fc837fff700 1 --2- 192.168.123.100:0/3122328170 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc83809d650 0x7fc83812d760 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:58.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.992+0000 7fc837fff700 1 -- 192.168.123.100:0/3122328170 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc838097650 msgr2=0x7fc83812d220 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:58.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.992+0000 7fc837fff700 1 --2- 192.168.123.100:0/3122328170 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc838097650 0x7fc83812d220 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:58.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.992+0000 7fc837fff700 1 -- 192.168.123.100:0/3122328170 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc83c0097e0 con 0x7fc83809d650 2026-03-10T12:34:58.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.992+0000 7fc837fff700 1 --2- 192.168.123.100:0/3122328170 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc83809d650 0x7fc83812d760 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7fc83000b700 tx=0x7fc83000bac0 comp rx=0 
tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:58.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.993+0000 7fc844c56700 1 --2- 192.168.123.100:0/3122328170 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc838097650 0x7fc83812d220 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:34:58.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.993+0000 7fc835ffb700 1 -- 192.168.123.100:0/3122328170 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc830011840 con 0x7fc83809d650 2026-03-10T12:34:58.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.993+0000 7fc846eba700 1 -- 192.168.123.100:0/3122328170 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc838131e50 con 0x7fc83809d650 2026-03-10T12:34:58.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.993+0000 7fc846eba700 1 -- 192.168.123.100:0/3122328170 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc8381323a0 con 0x7fc83809d650 2026-03-10T12:34:58.994 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.993+0000 7fc835ffb700 1 -- 192.168.123.100:0/3122328170 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc830011e80 con 0x7fc83809d650 2026-03-10T12:34:58.994 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.993+0000 7fc835ffb700 1 -- 192.168.123.100:0/3122328170 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc83000f550 con 0x7fc83809d650 2026-03-10T12:34:58.995 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:58.995+0000 7fc846eba700 1 -- 192.168.123.100:0/3122328170 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7fc82c005320 con 0x7fc83809d650 2026-03-10T12:34:59.004 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.004+0000 7fc835ffb700 1 -- 192.168.123.100:0/3122328170 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fc8300119a0 con 0x7fc83809d650 2026-03-10T12:34:59.005 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.004+0000 7fc835ffb700 1 --2- 192.168.123.100:0/3122328170 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc82806c750 0x7fc82806ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:59.005 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.005+0000 7fc844c56700 1 --2- 192.168.123.100:0/3122328170 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc82806c750 0x7fc82806ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:59.005 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.005+0000 7fc844c56700 1 --2- 192.168.123.100:0/3122328170 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc82806c750 0x7fc82806ec00 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fc83c00b5c0 tx=0x7fc83c005960 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:59.005 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.005+0000 7fc835ffb700 1 -- 192.168.123.100:0/3122328170 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fc83008b0f0 con 0x7fc83809d650 2026-03-10T12:34:59.006 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.005+0000 7fc835ffb700 1 -- 192.168.123.100:0/3122328170 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 
0 0) 0x7fc83008d680 con 0x7fc83809d650 2026-03-10T12:34:59.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.067+0000 7f9fe5c0c700 1 -- 192.168.123.100:0/3853889782 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fe0108780 msgr2=0x7f9fe0108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:59.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.067+0000 7f9fe5c0c700 1 --2- 192.168.123.100:0/3853889782 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fe0108780 0x7f9fe0108b50 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f9fd0009b00 tx=0x7f9fd0009e10 comp rx=0 tx=0).stop 2026-03-10T12:34:59.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.068+0000 7f9fe5c0c700 1 -- 192.168.123.100:0/3853889782 shutdown_connections 2026-03-10T12:34:59.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.068+0000 7f9fe5c0c700 1 --2- 192.168.123.100:0/3853889782 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fe0102780 0x7f9fe0102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.068+0000 7f9fe5c0c700 1 --2- 192.168.123.100:0/3853889782 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fe0108780 0x7f9fe0108b50 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.068+0000 7f9fe5c0c700 1 -- 192.168.123.100:0/3853889782 >> 192.168.123.100:0/3853889782 conn(0x7f9fe00fe280 msgr2=0x7f9fe0100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:59.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.068+0000 7f9fe5c0c700 1 -- 192.168.123.100:0/3853889782 shutdown_connections 2026-03-10T12:34:59.072 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.072+0000 
7f9fe5c0c700 1 -- 192.168.123.100:0/3853889782 wait complete. 2026-03-10T12:34:59.072 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.072+0000 7f9fe5c0c700 1 Processor -- start 2026-03-10T12:34:59.072 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.072+0000 7f9fe5c0c700 1 -- start start 2026-03-10T12:34:59.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.072+0000 7f9fe5c0c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fe0102780 0x7f9fe0075260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:59.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.072+0000 7f9fe5c0c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fe0108780 0x7f9fe00757a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:59.074 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.072+0000 7f9fe5c0c700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9fe00793f0 con 0x7f9fe0108780 2026-03-10T12:34:59.074 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.072+0000 7f9fe5c0c700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9fe0075ce0 con 0x7f9fe0102780 2026-03-10T12:34:59.074 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.072+0000 7f9fdf7fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fe0102780 0x7f9fe0075260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:59.074 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.072+0000 7f9fdf7fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fe0102780 0x7f9fe0075260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:48716/0 (socket says 192.168.123.100:48716) 2026-03-10T12:34:59.074 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.072+0000 7f9fdf7fe700 1 -- 192.168.123.100:0/2454614849 learned_addr learned my addr 192.168.123.100:0/2454614849 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:34:59.074 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.072+0000 7f9fd6dff700 1 --2- 192.168.123.100:0/2454614849 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fe0108780 0x7f9fe00757a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:59.074 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.073+0000 7f9fdf7fe700 1 -- 192.168.123.100:0/2454614849 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fe0108780 msgr2=0x7f9fe00757a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:59.074 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.073+0000 7f9fdf7fe700 1 --2- 192.168.123.100:0/2454614849 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fe0108780 0x7f9fe00757a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.074 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.073+0000 7f9fdf7fe700 1 -- 192.168.123.100:0/2454614849 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9fc8009710 con 0x7f9fe0102780 2026-03-10T12:34:59.074 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.073+0000 7f9fd6dff700 1 --2- 192.168.123.100:0/2454614849 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fe0108780 0x7f9fe00757a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T12:34:59.074 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.073+0000 7f9fdf7fe700 1 --2- 192.168.123.100:0/2454614849 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fe0102780 0x7f9fe0075260 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f9fd000ba30 tx=0x7f9fd000ba60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:59.074 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.074+0000 7f9fdd7fa700 1 -- 192.168.123.100:0/2454614849 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9fd001d070 con 0x7f9fe0102780 2026-03-10T12:34:59.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.074+0000 7f9fe5c0c700 1 -- 192.168.123.100:0/2454614849 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9fd00097e0 con 0x7f9fe0102780 2026-03-10T12:34:59.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.074+0000 7f9fe5c0c700 1 -- 192.168.123.100:0/2454614849 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9fe01a6af0 con 0x7f9fe0102780 2026-03-10T12:34:59.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.074+0000 7f9fe5c0c700 1 -- 192.168.123.100:0/2454614849 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9fe004ea50 con 0x7f9fe0102780 2026-03-10T12:34:59.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.075+0000 7f9fdd7fa700 1 -- 192.168.123.100:0/2454614849 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9fd000f460 con 0x7f9fe0102780 2026-03-10T12:34:59.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.075+0000 7f9fdd7fa700 1 -- 192.168.123.100:0/2454614849 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 
(secure 0 0 0) 0x7f9fd0005170 con 0x7f9fe0102780 2026-03-10T12:34:59.080 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.080+0000 7f9fdd7fa700 1 -- 192.168.123.100:0/2454614849 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f9fd000f5d0 con 0x7f9fe0102780 2026-03-10T12:34:59.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.084+0000 7f9fdd7fa700 1 --2- 192.168.123.100:0/2454614849 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9fcc06c630 0x7f9fcc06eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:34:59.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.084+0000 7f9fdd7fa700 1 -- 192.168.123.100:0/2454614849 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f9fd008cd30 con 0x7f9fe0102780 2026-03-10T12:34:59.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.084+0000 7f9fd6dff700 1 --2- 192.168.123.100:0/2454614849 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9fcc06c630 0x7f9fcc06eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:34:59.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.084+0000 7f9fdd7fa700 1 -- 192.168.123.100:0/2454614849 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9fd008d1c0 con 0x7f9fe0102780 2026-03-10T12:34:59.124 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.122+0000 7fc846eba700 1 -- 192.168.123.100:0/3122328170 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 2} v 0) v1 -- 0x7fc82c005190 con 0x7fc83809d650 2026-03-10T12:34:59.129 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.128+0000 
7fc835ffb700 1 -- 192.168.123.100:0/3122328170 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 2}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7fc830055bb0 con 0x7fc83809d650 2026-03-10T12:34:59.129 INFO:teuthology.orchestra.run.vm00.stdout:73014444040 2026-03-10T12:34:59.129 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.129+0000 7f9fd6dff700 1 --2- 192.168.123.100:0/2454614849 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9fcc06c630 0x7f9fcc06eae0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f9fe0076b00 tx=0x7f9fc8009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:34:59.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.131+0000 7fc8277fe700 1 -- 192.168.123.100:0/3122328170 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc82806c750 msgr2=0x7fc82806ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:59.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.131+0000 7fc8277fe700 1 --2- 192.168.123.100:0/3122328170 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc82806c750 0x7fc82806ec00 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fc83c00b5c0 tx=0x7fc83c005960 comp rx=0 tx=0).stop 2026-03-10T12:34:59.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.131+0000 7fc8277fe700 1 -- 192.168.123.100:0/3122328170 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc83809d650 msgr2=0x7fc83812d760 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:59.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.131+0000 7fc8277fe700 1 --2- 192.168.123.100:0/3122328170 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc83809d650 0x7fc83812d760 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7fc83000b700 tx=0x7fc83000bac0 comp rx=0 tx=0).stop 
2026-03-10T12:34:59.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.131+0000 7fc8277fe700 1 -- 192.168.123.100:0/3122328170 shutdown_connections 2026-03-10T12:34:59.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.131+0000 7fc8277fe700 1 --2- 192.168.123.100:0/3122328170 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc82806c750 0x7fc82806ec00 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.131+0000 7fc8277fe700 1 --2- 192.168.123.100:0/3122328170 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc838097650 0x7fc83812d220 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.131+0000 7fc8277fe700 1 --2- 192.168.123.100:0/3122328170 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc83809d650 0x7fc83812d760 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.131+0000 7fc8277fe700 1 -- 192.168.123.100:0/3122328170 >> 192.168.123.100:0/3122328170 conn(0x7fc838093150 msgr2=0x7fc8380948d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:59.134 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.134+0000 7fc8277fe700 1 -- 192.168.123.100:0/3122328170 shutdown_connections 2026-03-10T12:34:59.134 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.134+0000 7fc8277fe700 1 -- 192.168.123.100:0/3122328170 wait complete. 
2026-03-10T12:34:59.172 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.171+0000 7f9ceffff700 1 -- 192.168.123.100:0/620039444 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 1} v 0) v1 -- 0x7f9cd0005190 con 0x7f9cf01084e0 2026-03-10T12:34:59.173 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.173+0000 7f9cf4966700 1 -- 192.168.123.100:0/620039444 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 1}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f9ce005c300 con 0x7f9cf01084e0 2026-03-10T12:34:59.173 INFO:teuthology.orchestra.run.vm00.stdout:55834574858 2026-03-10T12:34:59.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.177+0000 7f9ce67fc700 1 -- 192.168.123.100:0/620039444 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9cdc06c680 msgr2=0x7f9cdc06eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:59.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.177+0000 7f9ce67fc700 1 --2- 192.168.123.100:0/620039444 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9cdc06c680 0x7f9cdc06eb30 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f9cd8006fd0 tx=0x7f9cd8008040 comp rx=0 tx=0).stop 2026-03-10T12:34:59.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.177+0000 7f9ce67fc700 1 -- 192.168.123.100:0/620039444 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cf01084e0 msgr2=0x7f9cf0078270 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:59.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.177+0000 7f9ce67fc700 1 --2- 192.168.123.100:0/620039444 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cf01084e0 0x7f9cf0078270 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f9ce0004900 tx=0x7f9ce00049e0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.183 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.182+0000 7f9ce67fc700 1 -- 192.168.123.100:0/620039444 shutdown_connections 2026-03-10T12:34:59.183 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.182+0000 7f9ce67fc700 1 --2- 192.168.123.100:0/620039444 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9cdc06c680 0x7f9cdc06eb30 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.183 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.182+0000 7f9ce67fc700 1 --2- 192.168.123.100:0/620039444 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9cf01024e0 0x7f9cf019ccc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.183 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.182+0000 7f9ce67fc700 1 --2- 192.168.123.100:0/620039444 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9cf01084e0 0x7f9cf0078270 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.183 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.182+0000 7f9ce67fc700 1 -- 192.168.123.100:0/620039444 >> 192.168.123.100:0/620039444 conn(0x7f9cf00fe000 msgr2=0x7f9cf00fe800 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:59.184 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.183+0000 7f9ce67fc700 1 -- 192.168.123.100:0/620039444 shutdown_connections 2026-03-10T12:34:59.184 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.183+0000 7f9ce67fc700 1 -- 192.168.123.100:0/620039444 wait complete. 
2026-03-10T12:34:59.206 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.205+0000 7fc0a53b2700 1 -- 192.168.123.100:0/3425715771 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 4} v 0) v1 -- 0x7fc0a004ea50 con 0x7fc0a010e9e0 2026-03-10T12:34:59.206 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.206+0000 7fc09d7fa700 1 -- 192.168.123.100:0/3425715771 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 4}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7fc090057580 con 0x7fc0a010e9e0 2026-03-10T12:34:59.206 INFO:teuthology.orchestra.run.vm00.stdout:124554051589 2026-03-10T12:34:59.210 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.209+0000 7fc086ffd700 1 -- 192.168.123.100:0/3425715771 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc08806c870 msgr2=0x7fc08806ed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:59.210 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.210+0000 7fc086ffd700 1 --2- 192.168.123.100:0/3425715771 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc08806c870 0x7fc08806ed20 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7fc094000c00 tx=0x7fc094003800 comp rx=0 tx=0).stop 2026-03-10T12:34:59.210 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.210+0000 7fc086ffd700 1 -- 192.168.123.100:0/3425715771 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0a010e9e0 msgr2=0x7fc0a0119540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:59.210 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.210+0000 7fc086ffd700 1 --2- 192.168.123.100:0/3425715771 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0a010e9e0 0x7fc0a0119540 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7fc09000ec90 tx=0x7fc09000c5b0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.211 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.211+0000 7fc086ffd700 1 -- 192.168.123.100:0/3425715771 shutdown_connections 2026-03-10T12:34:59.211 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.211+0000 7fc086ffd700 1 --2- 192.168.123.100:0/3425715771 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc08806c870 0x7fc08806ed20 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.211 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.211+0000 7fc086ffd700 1 --2- 192.168.123.100:0/3425715771 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0a010e9e0 0x7fc0a0119540 unknown :-1 s=CLOSED pgs=226 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.211 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.211+0000 7fc086ffd700 1 --2- 192.168.123.100:0/3425715771 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0a0114540 0x7fc0a01149b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.211 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.211+0000 7fc086ffd700 1 -- 192.168.123.100:0/3425715771 >> 192.168.123.100:0/3425715771 conn(0x7fc0a006c6c0 msgr2=0x7fc0a006ccc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:59.212 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.211+0000 7fc086ffd700 1 -- 192.168.123.100:0/3425715771 shutdown_connections 2026-03-10T12:34:59.212 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.212+0000 7fc086ffd700 1 -- 192.168.123.100:0/3425715771 wait complete. 
2026-03-10T12:34:59.274 INFO:tasks.cephadm.ceph_manager.ceph:need seq 73014444041 got 73014444040 for osd.2 2026-03-10T12:34:59.291 INFO:tasks.cephadm.ceph_manager.ceph:need seq 124554051590 got 124554051589 for osd.4 2026-03-10T12:34:59.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.307+0000 7f9fe5c0c700 1 -- 192.168.123.100:0/2454614849 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 0} v 0) v1 -- 0x7f9fe0066e40 con 0x7f9fe0102780 2026-03-10T12:34:59.309 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.309+0000 7f9fdd7fa700 1 -- 192.168.123.100:0/2454614849 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 0}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f9fe0066e40 con 0x7f9fe0102780 2026-03-10T12:34:59.310 INFO:teuthology.orchestra.run.vm00.stdout:38654705676 2026-03-10T12:34:59.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.320+0000 7f9fe5c0c700 1 -- 192.168.123.100:0/2454614849 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9fcc06c630 msgr2=0x7f9fcc06eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:59.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.320+0000 7f9fe5c0c700 1 --2- 192.168.123.100:0/2454614849 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9fcc06c630 0x7f9fcc06eae0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f9fe0076b00 tx=0x7f9fc8009450 comp rx=0 tx=0).stop 2026-03-10T12:34:59.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.320+0000 7f9fe5c0c700 1 -- 192.168.123.100:0/2454614849 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fe0102780 msgr2=0x7f9fe0075260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:34:59.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.320+0000 7f9fe5c0c700 1 --2- 192.168.123.100:0/2454614849 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fe0102780 0x7f9fe0075260 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f9fd000ba30 tx=0x7f9fd000ba60 comp rx=0 tx=0).stop 2026-03-10T12:34:59.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.320+0000 7f9fe5c0c700 1 -- 192.168.123.100:0/2454614849 shutdown_connections 2026-03-10T12:34:59.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.320+0000 7f9fe5c0c700 1 --2- 192.168.123.100:0/2454614849 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9fcc06c630 0x7f9fcc06eae0 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.320+0000 7f9fe5c0c700 1 --2- 192.168.123.100:0/2454614849 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fe0102780 0x7f9fe0075260 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.320+0000 7f9fe5c0c700 1 --2- 192.168.123.100:0/2454614849 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fe0108780 0x7f9fe00757a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:34:59.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.320+0000 7f9fe5c0c700 1 -- 192.168.123.100:0/2454614849 >> 192.168.123.100:0/2454614849 conn(0x7f9fe00fe280 msgr2=0x7f9fe00ffaf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:34:59.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.320+0000 7f9fe5c0c700 1 -- 192.168.123.100:0/2454614849 shutdown_connections 2026-03-10T12:34:59.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:34:59.321+0000 7f9fe5c0c700 1 -- 192.168.123.100:0/2454614849 wait complete. 
2026-03-10T12:34:59.328 INFO:tasks.cephadm.ceph_manager.ceph:need seq 55834574859 got 55834574858 for osd.1 2026-03-10T12:34:59.388 INFO:tasks.cephadm.ceph_manager.ceph:need seq 38654705677 got 38654705676 for osd.0 2026-03-10T12:34:59.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:59 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/2919413526' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T12:34:59.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:59 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/2693312242' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-10T12:34:59.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:59 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/3122328170' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-10T12:34:59.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:59 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/620039444' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-10T12:34:59.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:59 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/3425715771' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-10T12:34:59.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:34:59 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/2454614849' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T12:34:59.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:59 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/2919413526' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T12:34:59.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:59 vm07 ceph-mon[58582]: from='client.? 
192.168.123.100:0/2693312242' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-10T12:34:59.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:59 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/3122328170' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-10T12:34:59.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:59 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/620039444' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-10T12:34:59.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:59 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/3425715771' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-10T12:34:59.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:34:59 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/2454614849' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T12:34:59.931 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd last-stat-seq osd.5 2026-03-10T12:35:00.081 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:00.274 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd last-stat-seq osd.2 2026-03-10T12:35:00.293 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd last-stat-seq osd.4 2026-03-10T12:35:00.328 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph 
osd last-stat-seq osd.1 2026-03-10T12:35:00.343 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.342+0000 7f73eed5c700 1 -- 192.168.123.100:0/388753761 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f73e8071b60 msgr2=0x7f73e8071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:00.343 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.342+0000 7f73eed5c700 1 --2- 192.168.123.100:0/388753761 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f73e8071b60 0x7f73e8071fd0 secure :-1 s=READY pgs=228 cs=0 l=1 rev1=1 crypto rx=0x7f73e4009b50 tx=0x7f73e4009e60 comp rx=0 tx=0).stop 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.343+0000 7f73eed5c700 1 -- 192.168.123.100:0/388753761 shutdown_connections 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.343+0000 7f73eed5c700 1 --2- 192.168.123.100:0/388753761 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f73e8071b60 0x7f73e8071fd0 unknown :-1 s=CLOSED pgs=228 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.343+0000 7f73eed5c700 1 --2- 192.168.123.100:0/388753761 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f73e810e9e0 0x7f73e810edb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.343+0000 7f73eed5c700 1 -- 192.168.123.100:0/388753761 >> 192.168.123.100:0/388753761 conn(0x7f73e806c6c0 msgr2=0x7f73e806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.343+0000 7f73eed5c700 1 -- 192.168.123.100:0/388753761 shutdown_connections 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.343+0000 7f73eed5c700 1 -- 
192.168.123.100:0/388753761 wait complete. 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.344+0000 7f73eed5c700 1 Processor -- start 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.344+0000 7f73eed5c700 1 -- start start 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.344+0000 7f73eed5c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f73e8071b60 0x7f73e8119590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.344+0000 7f73eed5c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f73e810e9e0 0x7f73e8114590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.344+0000 7f73eed5c700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f73e8114ad0 con 0x7f73e8071b60 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.344+0000 7f73eed5c700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f73e8114c10 con 0x7f73e810e9e0 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.345+0000 7f73edd5a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f73e8071b60 0x7f73e8119590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.345+0000 7f73ed559700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f73e810e9e0 0x7f73e8114590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.345+0000 7f73edd5a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f73e8071b60 0x7f73e8119590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33390/0 (socket says 192.168.123.100:33390) 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.345+0000 7f73edd5a700 1 -- 192.168.123.100:0/2173396806 learned_addr learned my addr 192.168.123.100:0/2173396806 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.345+0000 7f73edd5a700 1 -- 192.168.123.100:0/2173396806 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f73e810e9e0 msgr2=0x7f73e8114590 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.345+0000 7f73edd5a700 1 --2- 192.168.123.100:0/2173396806 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f73e810e9e0 0x7f73e8114590 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.345+0000 7f73edd5a700 1 -- 192.168.123.100:0/2173396806 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f73e40097e0 con 0x7f73e8071b60 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.345+0000 7f73edd5a700 1 --2- 192.168.123.100:0/2173396806 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f73e8071b60 0x7f73e8119590 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f73d800ba70 tx=0x7f73d800bd80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.346+0000 7f73deffd700 1 -- 192.168.123.100:0/2173396806 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f73d800c700 con 0x7f73e8071b60 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.346+0000 7f73eed5c700 1 -- 192.168.123.100:0/2173396806 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f73e8114ef0 con 0x7f73e8071b60 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.346+0000 7f73eed5c700 1 -- 192.168.123.100:0/2173396806 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f73e81b7b20 con 0x7f73e8071b60 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.346+0000 7f73deffd700 1 -- 192.168.123.100:0/2173396806 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f73d800cd40 con 0x7f73e8071b60 2026-03-10T12:35:00.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.346+0000 7f73deffd700 1 -- 192.168.123.100:0/2173396806 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f73d8012340 con 0x7f73e8071b60 2026-03-10T12:35:00.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.348+0000 7f73deffd700 1 -- 192.168.123.100:0/2173396806 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f73d8014440 con 0x7f73e8071b60 2026-03-10T12:35:00.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.349+0000 7f73dcff9700 1 -- 192.168.123.100:0/2173396806 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f73cc005320 con 0x7f73e8071b60 2026-03-10T12:35:00.352 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.349+0000 7f73deffd700 1 --2- 192.168.123.100:0/2173396806 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f73d406c6d0 0x7f73d406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:00.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.349+0000 7f73deffd700 1 -- 192.168.123.100:0/2173396806 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f73d808a400 con 0x7f73e8071b60 2026-03-10T12:35:00.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.352+0000 7f73ed559700 1 --2- 192.168.123.100:0/2173396806 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f73d406c6d0 0x7f73d406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:00.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.352+0000 7f73deffd700 1 -- 192.168.123.100:0/2173396806 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f73d804e720 con 0x7f73e8071b60 2026-03-10T12:35:00.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.352+0000 7f73ed559700 1 --2- 192.168.123.100:0/2173396806 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f73d406c6d0 0x7f73d406eb80 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f73e8115d80 tx=0x7f73e400b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:00.389 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd last-stat-seq osd.0 2026-03-10T12:35:00.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:00 vm00 ceph-mon[50686]: pgmap 
v70: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:00.496 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.495+0000 7f73dcff9700 1 -- 192.168.123.100:0/2173396806 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 5} v 0) v1 -- 0x7f73cc005190 con 0x7f73e8071b60 2026-03-10T12:35:00.496 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.496+0000 7f73deffd700 1 -- 192.168.123.100:0/2173396806 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 5}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f73d8054f70 con 0x7f73e8071b60 2026-03-10T12:35:00.497 INFO:teuthology.orchestra.run.vm00.stdout:141733920771 2026-03-10T12:35:00.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.500+0000 7f73eed5c700 1 -- 192.168.123.100:0/2173396806 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f73d406c6d0 msgr2=0x7f73d406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:00.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.500+0000 7f73eed5c700 1 --2- 192.168.123.100:0/2173396806 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f73d406c6d0 0x7f73d406eb80 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f73e8115d80 tx=0x7f73e400b540 comp rx=0 tx=0).stop 2026-03-10T12:35:00.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.500+0000 7f73eed5c700 1 -- 192.168.123.100:0/2173396806 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f73e8071b60 msgr2=0x7f73e8119590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:00.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.500+0000 7f73eed5c700 1 --2- 192.168.123.100:0/2173396806 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f73e8071b60 0x7f73e8119590 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto 
rx=0x7f73d800ba70 tx=0x7f73d800bd80 comp rx=0 tx=0).stop 2026-03-10T12:35:00.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.500+0000 7f73eed5c700 1 -- 192.168.123.100:0/2173396806 shutdown_connections 2026-03-10T12:35:00.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.500+0000 7f73eed5c700 1 --2- 192.168.123.100:0/2173396806 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f73d406c6d0 0x7f73d406eb80 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:00.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.500+0000 7f73eed5c700 1 --2- 192.168.123.100:0/2173396806 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f73e8071b60 0x7f73e8119590 unknown :-1 s=CLOSED pgs=229 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:00.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.500+0000 7f73eed5c700 1 --2- 192.168.123.100:0/2173396806 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f73e810e9e0 0x7f73e8114590 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:00.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.500+0000 7f73eed5c700 1 -- 192.168.123.100:0/2173396806 >> 192.168.123.100:0/2173396806 conn(0x7f73e806c6c0 msgr2=0x7f73e806cf10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:00.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.500+0000 7f73eed5c700 1 -- 192.168.123.100:0/2173396806 shutdown_connections 2026-03-10T12:35:00.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:00.500+0000 7f73eed5c700 1 -- 192.168.123.100:0/2173396806 wait complete. 
2026-03-10T12:35:00.576 INFO:tasks.cephadm.ceph_manager.ceph:need seq 141733920771 got 141733920771 for osd.5 2026-03-10T12:35:00.576 DEBUG:teuthology.parallel:result is None 2026-03-10T12:35:00.644 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:00.682 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:00.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:00 vm07 ceph-mon[58582]: pgmap v70: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:01.125 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:01.180 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:01.389 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.388+0000 7f9dc9a7a700 1 -- 192.168.123.100:0/475375162 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dc410caf0 msgr2=0x7f9dc410cec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:01.389 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.388+0000 7f9dc9a7a700 1 --2- 192.168.123.100:0/475375162 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dc410caf0 0x7f9dc410cec0 secure :-1 s=READY pgs=230 cs=0 l=1 rev1=1 crypto rx=0x7f9db40099c0 tx=0x7f9db4009cd0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.393 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.393+0000 7f9dc9a7a700 1 -- 192.168.123.100:0/475375162 shutdown_connections 2026-03-10T12:35:01.393 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.393+0000 7f9dc9a7a700 1 --2- 192.168.123.100:0/475375162 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9dc406d520 0x7f9dc406d990 
unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.393 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.393+0000 7f9dc9a7a700 1 --2- 192.168.123.100:0/475375162 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dc410caf0 0x7f9dc410cec0 unknown :-1 s=CLOSED pgs=230 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.393 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.393+0000 7f9dc9a7a700 1 -- 192.168.123.100:0/475375162 >> 192.168.123.100:0/475375162 conn(0x7f9dc406c830 msgr2=0x7f9dc4071830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:01.393 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.393+0000 7f9dc9a7a700 1 -- 192.168.123.100:0/475375162 shutdown_connections 2026-03-10T12:35:01.393 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.393+0000 7f9dc9a7a700 1 -- 192.168.123.100:0/475375162 wait complete. 2026-03-10T12:35:01.395 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.394+0000 7f9dc9a7a700 1 Processor -- start 2026-03-10T12:35:01.395 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.394+0000 7f9dc9a7a700 1 -- start start 2026-03-10T12:35:01.395 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.394+0000 7f9dc9a7a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dc406d520 0x7f9dc4083200 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:01.395 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.394+0000 7f9dc9a7a700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9dc410caf0 0x7f9dc4083740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:01.395 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.394+0000 7f9dc9a7a700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 
-- 0x7f9dc40812e0 con 0x7f9dc406d520 2026-03-10T12:35:01.395 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.394+0000 7f9dc9a7a700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9dc4081450 con 0x7f9dc410caf0 2026-03-10T12:35:01.395 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.395+0000 7f9dc37fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dc406d520 0x7f9dc4083200 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:01.395 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.395+0000 7f9dc37fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dc406d520 0x7f9dc4083200 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33408/0 (socket says 192.168.123.100:33408) 2026-03-10T12:35:01.395 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.395+0000 7f9dc37fe700 1 -- 192.168.123.100:0/1960425344 learned_addr learned my addr 192.168.123.100:0/1960425344 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:01.395 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.395+0000 7f9dc2ffd700 1 --2- 192.168.123.100:0/1960425344 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9dc410caf0 0x7f9dc4083740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:01.396 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.396+0000 7f9dc37fe700 1 -- 192.168.123.100:0/1960425344 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9dc410caf0 msgr2=0x7f9dc4083740 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:01.396 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.396+0000 7f9dc37fe700 1 --2- 192.168.123.100:0/1960425344 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9dc410caf0 0x7f9dc4083740 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.396 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.396+0000 7f9dc37fe700 1 -- 192.168.123.100:0/1960425344 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9db40096b0 con 0x7f9dc406d520 2026-03-10T12:35:01.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.399+0000 7f9dc37fe700 1 --2- 192.168.123.100:0/1960425344 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dc406d520 0x7f9dc4083200 secure :-1 s=READY pgs=231 cs=0 l=1 rev1=1 crypto rx=0x7f9db400c6d0 tx=0x7f9db4005800 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:01.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.399+0000 7f9dc0ff9700 1 -- 192.168.123.100:0/1960425344 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9db400c9a0 con 0x7f9dc406d520 2026-03-10T12:35:01.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.399+0000 7f9dc0ff9700 1 -- 192.168.123.100:0/1960425344 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9db400e450 con 0x7f9dc406d520 2026-03-10T12:35:01.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.399+0000 7f9dc0ff9700 1 -- 192.168.123.100:0/1960425344 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9db4016600 con 0x7f9dc406d520 2026-03-10T12:35:01.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.399+0000 7f9dc9a7a700 1 -- 192.168.123.100:0/1960425344 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_subscribe({mgrmap=0+}) v3 -- 0x7f9dc40816d0 con 0x7f9dc406d520 2026-03-10T12:35:01.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.400+0000 7f9dc9a7a700 1 -- 192.168.123.100:0/1960425344 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9dc4081ab0 con 0x7f9dc406d520 2026-03-10T12:35:01.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.401+0000 7f9dc0ff9700 1 -- 192.168.123.100:0/1960425344 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f9db4016760 con 0x7f9dc406d520 2026-03-10T12:35:01.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.401+0000 7f9daa7fc700 1 -- 192.168.123.100:0/1960425344 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9dc404ea50 con 0x7f9dc406d520 2026-03-10T12:35:01.406 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.405+0000 7f9dc0ff9700 1 --2- 192.168.123.100:0/1960425344 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9dac06c7a0 0x7f9dac06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:01.406 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.405+0000 7f9dc0ff9700 1 -- 192.168.123.100:0/1960425344 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f9db4012070 con 0x7f9dc406d520 2026-03-10T12:35:01.406 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.405+0000 7f9dc0ff9700 1 -- 192.168.123.100:0/1960425344 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9db4093050 con 0x7f9dc406d520 2026-03-10T12:35:01.406 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.406+0000 7f9dc2ffd700 1 --2- 192.168.123.100:0/1960425344 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9dac06c7a0 0x7f9dac06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:01.411 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.411+0000 7f9dc2ffd700 1 --2- 192.168.123.100:0/1960425344 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9dac06c7a0 0x7f9dac06ec50 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f9dc4082ad0 tx=0x7f9db800b410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:01.516 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:01 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/2173396806' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-10T12:35:01.534 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.523+0000 7fa429a9e700 1 -- 192.168.123.100:0/327670593 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa424071e40 msgr2=0x7fa4240722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:01.534 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.523+0000 7fa429a9e700 1 --2- 192.168.123.100:0/327670593 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa424071e40 0x7fa4240722b0 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7fa41c00d3f0 tx=0x7fa41c00d700 comp rx=0 tx=0).stop 2026-03-10T12:35:01.537 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.535+0000 7fa429a9e700 1 -- 192.168.123.100:0/327670593 shutdown_connections 2026-03-10T12:35:01.537 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.535+0000 7fa429a9e700 1 --2- 192.168.123.100:0/327670593 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa424071e40 0x7fa4240722b0 unknown :-1 s=CLOSED pgs=232 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:35:01.537 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.535+0000 7fa429a9e700 1 --2- 192.168.123.100:0/327670593 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa42410c8b0 0x7fa42410cc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.537 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.535+0000 7fa429a9e700 1 -- 192.168.123.100:0/327670593 >> 192.168.123.100:0/327670593 conn(0x7fa42406c6c0 msgr2=0x7fa42406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:01.547 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.547+0000 7fa429a9e700 1 -- 192.168.123.100:0/327670593 shutdown_connections 2026-03-10T12:35:01.547 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.547+0000 7fa429a9e700 1 -- 192.168.123.100:0/327670593 wait complete. 2026-03-10T12:35:01.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.549+0000 7fa429a9e700 1 Processor -- start 2026-03-10T12:35:01.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.549+0000 7fa429a9e700 1 -- start start 2026-03-10T12:35:01.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.549+0000 7fa429a9e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa424071e40 0x7fa424137840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:01.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.549+0000 7fa429a9e700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa42410c8b0 0x7fa424132840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:01.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.549+0000 7fa429a9e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa424132e10 con 0x7fa424071e40 2026-03-10T12:35:01.551 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.549+0000 7fa429a9e700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa424132f80 con 0x7fa42410c8b0 2026-03-10T12:35:01.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.550+0000 7fa422ffd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa42410c8b0 0x7fa424132840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:01.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.550+0000 7fa422ffd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa42410c8b0 0x7fa424132840 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:48788/0 (socket says 192.168.123.100:48788) 2026-03-10T12:35:01.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.550+0000 7fa422ffd700 1 -- 192.168.123.100:0/614491502 learned_addr learned my addr 192.168.123.100:0/614491502 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:01.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.550+0000 7fa422ffd700 1 -- 192.168.123.100:0/614491502 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa424071e40 msgr2=0x7fa424137840 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:01.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.550+0000 7fa422ffd700 1 --2- 192.168.123.100:0/614491502 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa424071e40 0x7fa424137840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.550+0000 7fa422ffd700 1 -- 192.168.123.100:0/614491502 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa41c007ed0 con 0x7fa42410c8b0 2026-03-10T12:35:01.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.550+0000 7fa422ffd700 1 --2- 192.168.123.100:0/614491502 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa42410c8b0 0x7fa424132840 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fa41c004180 tx=0x7fa41c0043a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:01.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.550+0000 7fa420ff9700 1 -- 192.168.123.100:0/614491502 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa41c01d070 con 0x7fa42410c8b0 2026-03-10T12:35:01.552 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.551+0000 7fa429a9e700 1 -- 192.168.123.100:0/614491502 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa424133200 con 0x7fa42410c8b0 2026-03-10T12:35:01.552 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.551+0000 7fa429a9e700 1 -- 192.168.123.100:0/614491502 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa42407ee60 con 0x7fa42410c8b0 2026-03-10T12:35:01.552 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.551+0000 7fa420ff9700 1 -- 192.168.123.100:0/614491502 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa41c0049b0 con 0x7fa42410c8b0 2026-03-10T12:35:01.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.552+0000 7fa420ff9700 1 -- 192.168.123.100:0/614491502 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa41c017c90 con 0x7fa42410c8b0 2026-03-10T12:35:01.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.552+0000 7fa420ff9700 1 -- 
192.168.123.100:0/614491502 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fa41c02b430 con 0x7fa42410c8b0 2026-03-10T12:35:01.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.553+0000 7fa420ff9700 1 --2- 192.168.123.100:0/614491502 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa40c06c550 0x7fa40c06ea00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:01.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.553+0000 7fa420ff9700 1 -- 192.168.123.100:0/614491502 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fa41c013070 con 0x7fa42410c8b0 2026-03-10T12:35:01.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.553+0000 7fa429a9e700 1 -- 192.168.123.100:0/614491502 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa410005320 con 0x7fa42410c8b0 2026-03-10T12:35:01.570 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.554+0000 7fa4237fe700 1 --2- 192.168.123.100:0/614491502 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa40c06c550 0x7fa40c06ea00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:01.570 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.555+0000 7fa4237fe700 1 --2- 192.168.123.100:0/614491502 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa40c06c550 0x7fa40c06ea00 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fa414009800 tx=0x7fa414006d20 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:01.570 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.558+0000 7fa420ff9700 1 -- 192.168.123.100:0/614491502 <== mon.1 
v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa41c058ca0 con 0x7fa42410c8b0 2026-03-10T12:35:01.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.626+0000 7f9daa7fc700 1 -- 192.168.123.100:0/1960425344 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 2} v 0) v1 -- 0x7f9dc4066e40 con 0x7f9dc406d520 2026-03-10T12:35:01.627 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.627+0000 7f9dc0ff9700 1 -- 192.168.123.100:0/1960425344 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 2}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f9db4023070 con 0x7f9dc406d520 2026-03-10T12:35:01.629 INFO:teuthology.orchestra.run.vm00.stdout:73014444042 2026-03-10T12:35:01.644 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.644+0000 7f9daa7fc700 1 -- 192.168.123.100:0/1960425344 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9dac06c7a0 msgr2=0x7f9dac06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:01.644 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.644+0000 7f9daa7fc700 1 --2- 192.168.123.100:0/1960425344 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9dac06c7a0 0x7f9dac06ec50 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f9dc4082ad0 tx=0x7f9db800b410 comp rx=0 tx=0).stop 2026-03-10T12:35:01.644 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.644+0000 7f9daa7fc700 1 -- 192.168.123.100:0/1960425344 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dc406d520 msgr2=0x7f9dc4083200 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:01.644 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.644+0000 7f9daa7fc700 1 --2- 192.168.123.100:0/1960425344 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7f9dc406d520 0x7f9dc4083200 secure :-1 s=READY pgs=231 cs=0 l=1 rev1=1 crypto rx=0x7f9db400c6d0 tx=0x7f9db4005800 comp rx=0 tx=0).stop 2026-03-10T12:35:01.645 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.644+0000 7f9daa7fc700 1 -- 192.168.123.100:0/1960425344 shutdown_connections 2026-03-10T12:35:01.645 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.644+0000 7f9daa7fc700 1 --2- 192.168.123.100:0/1960425344 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9dac06c7a0 0x7f9dac06ec50 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.645 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.644+0000 7f9daa7fc700 1 --2- 192.168.123.100:0/1960425344 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dc406d520 0x7f9dc4083200 unknown :-1 s=CLOSED pgs=231 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.645 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.644+0000 7f9daa7fc700 1 --2- 192.168.123.100:0/1960425344 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9dc410caf0 0x7f9dc4083740 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.645 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.644+0000 7f9daa7fc700 1 -- 192.168.123.100:0/1960425344 >> 192.168.123.100:0/1960425344 conn(0x7f9dc406c830 msgr2=0x7f9dc410b2d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:01.645 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.645+0000 7f9daa7fc700 1 -- 192.168.123.100:0/1960425344 shutdown_connections 2026-03-10T12:35:01.645 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.645+0000 7f9daa7fc700 1 -- 192.168.123.100:0/1960425344 wait complete. 
2026-03-10T12:35:01.652 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.651+0000 7fec27795700 1 -- 192.168.123.100:0/74847663 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fec2010e9e0 msgr2=0x7fec2010edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:01.652 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.651+0000 7fec27795700 1 --2- 192.168.123.100:0/74847663 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fec2010e9e0 0x7fec2010edb0 secure :-1 s=READY pgs=233 cs=0 l=1 rev1=1 crypto rx=0x7fec14009b50 tx=0x7fec14009e60 comp rx=0 tx=0).stop 2026-03-10T12:35:01.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.654+0000 7fec27795700 1 -- 192.168.123.100:0/74847663 shutdown_connections 2026-03-10T12:35:01.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.654+0000 7fec27795700 1 --2- 192.168.123.100:0/74847663 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fec20071b60 0x7fec20071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.654+0000 7fec27795700 1 --2- 192.168.123.100:0/74847663 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fec2010e9e0 0x7fec2010edb0 unknown :-1 s=CLOSED pgs=233 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.654+0000 7fec27795700 1 -- 192.168.123.100:0/74847663 >> 192.168.123.100:0/74847663 conn(0x7fec2006c6c0 msgr2=0x7fec2006cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:01.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.654+0000 7fec27795700 1 -- 192.168.123.100:0/74847663 shutdown_connections 2026-03-10T12:35:01.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.654+0000 7fec27795700 1 -- 192.168.123.100:0/74847663 wait complete. 
2026-03-10T12:35:01.655 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.654+0000 7fec27795700 1 Processor -- start 2026-03-10T12:35:01.655 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.654+0000 7fec27795700 1 -- start start 2026-03-10T12:35:01.655 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.654+0000 7fec27795700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fec20071b60 0x7fec20119160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:01.655 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.654+0000 7fec27795700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fec2010e9e0 0x7fec201196a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:01.655 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.654+0000 7fec27795700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fec20116210 con 0x7fec20071b60 2026-03-10T12:35:01.655 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.654+0000 7fec27795700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fec20116380 con 0x7fec2010e9e0 2026-03-10T12:35:01.657 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.655+0000 7fec26793700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fec20071b60 0x7fec20119160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:01.657 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.655+0000 7fec26793700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fec20071b60 0x7fec20119160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:33448/0 (socket says 192.168.123.100:33448) 2026-03-10T12:35:01.657 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.655+0000 7fec26793700 1 -- 192.168.123.100:0/3685537998 learned_addr learned my addr 192.168.123.100:0/3685537998 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:01.657 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.655+0000 7fec25f92700 1 --2- 192.168.123.100:0/3685537998 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fec2010e9e0 0x7fec201196a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:01.657 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.655+0000 7fec26793700 1 -- 192.168.123.100:0/3685537998 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fec2010e9e0 msgr2=0x7fec201196a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:01.657 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.655+0000 7fec26793700 1 --2- 192.168.123.100:0/3685537998 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fec2010e9e0 0x7fec201196a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.657 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.655+0000 7fec26793700 1 -- 192.168.123.100:0/3685537998 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fec140097e0 con 0x7fec20071b60 2026-03-10T12:35:01.657 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.655+0000 7fec26793700 1 --2- 192.168.123.100:0/3685537998 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fec20071b60 0x7fec20119160 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7fec14005950 tx=0x7fec14004ef0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:35:01.657 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.656+0000 7fec1b7fe700 1 -- 192.168.123.100:0/3685537998 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fec1401d070 con 0x7fec20071b60 2026-03-10T12:35:01.657 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.656+0000 7fec1b7fe700 1 -- 192.168.123.100:0/3685537998 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fec1400bbb0 con 0x7fec20071b60 2026-03-10T12:35:01.658 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.656+0000 7fec1b7fe700 1 -- 192.168.123.100:0/3685537998 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fec1400f700 con 0x7fec20071b60 2026-03-10T12:35:01.658 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.657+0000 7fec27795700 1 -- 192.168.123.100:0/3685537998 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fec20116600 con 0x7fec20071b60 2026-03-10T12:35:01.658 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.657+0000 7fec27795700 1 -- 192.168.123.100:0/3685537998 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fec20116a70 con 0x7fec20071b60 2026-03-10T12:35:01.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.658+0000 7fec1b7fe700 1 -- 192.168.123.100:0/3685537998 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fec1400bd20 con 0x7fec20071b60 2026-03-10T12:35:01.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.659+0000 7fec1b7fe700 1 --2- 192.168.123.100:0/3685537998 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fec1006c520 0x7fec1006e9d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:01.662 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.659+0000 7fec25f92700 1 --2- 192.168.123.100:0/3685537998 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fec1006c520 0x7fec1006e9d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:01.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.659+0000 7fec25f92700 1 --2- 192.168.123.100:0/3685537998 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fec1006c520 0x7fec1006e9d0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fec20117970 tx=0x7fec0c009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:01.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.659+0000 7fec1b7fe700 1 -- 192.168.123.100:0/3685537998 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fec1408cba0 con 0x7fec20071b60 2026-03-10T12:35:01.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.659+0000 7fec27795700 1 -- 192.168.123.100:0/3685537998 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fec2004f2a0 con 0x7fec20071b60 2026-03-10T12:35:01.676 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.662+0000 7fec1b7fe700 1 -- 192.168.123.100:0/3685537998 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fec140577a0 con 0x7fec20071b60 2026-03-10T12:35:01.694 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.693+0000 7f6afdd46700 1 -- 192.168.123.100:0/4056267411 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6af81005a0 msgr2=0x7f6af8100a10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:01.694 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.693+0000 7f6afdd46700 1 --2- 192.168.123.100:0/4056267411 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6af81005a0 0x7f6af8100a10 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f6ae0009a60 tx=0x7f6ae0009d70 comp rx=0 tx=0).stop 2026-03-10T12:35:01.694 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.694+0000 7f6afdd46700 1 -- 192.168.123.100:0/4056267411 shutdown_connections 2026-03-10T12:35:01.694 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.694+0000 7f6afdd46700 1 --2- 192.168.123.100:0/4056267411 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6af81005a0 0x7f6af8100a10 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.694 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.694+0000 7f6afdd46700 1 --2- 192.168.123.100:0/4056267411 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af81065c0 0x7f6af8106990 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.694 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.694+0000 7f6afdd46700 1 -- 192.168.123.100:0/4056267411 >> 192.168.123.100:0/4056267411 conn(0x7f6af8078550 msgr2=0x7f6af8078950 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:01.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.697+0000 7f6afdd46700 1 -- 192.168.123.100:0/4056267411 shutdown_connections 2026-03-10T12:35:01.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.697+0000 7f6afdd46700 1 -- 192.168.123.100:0/4056267411 wait complete. 
2026-03-10T12:35:01.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.697+0000 7f6afdd46700 1 Processor -- start 2026-03-10T12:35:01.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.697+0000 7f6afdd46700 1 -- start start 2026-03-10T12:35:01.698 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.698+0000 7f6afdd46700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af81005a0 0x7f6af81960f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:01.699 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.699+0000 7f6afdd46700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6af81065c0 0x7f6af8196630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:01.699 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.699+0000 7f6afdd46700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6af8196d10 con 0x7f6af81005a0 2026-03-10T12:35:01.699 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.699+0000 7f6afdd46700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6af819aaa0 con 0x7f6af81065c0 2026-03-10T12:35:01.699 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.699+0000 7f6af6ffd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6af81065c0 0x7f6af8196630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:01.699 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.699+0000 7f6af6ffd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6af81065c0 0x7f6af8196630 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:48832/0 (socket says 192.168.123.100:48832) 2026-03-10T12:35:01.699 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.699+0000 7f6af6ffd700 1 -- 192.168.123.100:0/3669257160 learned_addr learned my addr 192.168.123.100:0/3669257160 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:01.699 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.699+0000 7f6af6ffd700 1 -- 192.168.123.100:0/3669257160 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af81005a0 msgr2=0x7f6af81960f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:01.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.699+0000 7f6af77fe700 1 --2- 192.168.123.100:0/3669257160 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af81005a0 0x7f6af81960f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:01.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.703+0000 7f6af6ffd700 1 --2- 192.168.123.100:0/3669257160 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af81005a0 0x7f6af81960f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.703+0000 7f6af6ffd700 1 -- 192.168.123.100:0/3669257160 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6ae0009710 con 0x7f6af81065c0 2026-03-10T12:35:01.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.703+0000 7f6af6ffd700 1 --2- 192.168.123.100:0/3669257160 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6af81065c0 0x7f6af8196630 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f6ae000f690 tx=0x7f6ae000f770 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:35:01.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.703+0000 7f6af77fe700 1 --2- 192.168.123.100:0/3669257160 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af81005a0 0x7f6af81960f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:35:01.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.703+0000 7f6af4ff9700 1 -- 192.168.123.100:0/3669257160 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6ae001d070 con 0x7f6af81065c0 2026-03-10T12:35:01.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.703+0000 7f6af4ff9700 1 -- 192.168.123.100:0/3669257160 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6ae000bb40 con 0x7f6af81065c0 2026-03-10T12:35:01.704 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.704+0000 7f6af4ff9700 1 -- 192.168.123.100:0/3669257160 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6ae0017710 con 0x7f6af81065c0 2026-03-10T12:35:01.704 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.704+0000 7f6afdd46700 1 -- 192.168.123.100:0/3669257160 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6af819ad20 con 0x7f6af81065c0 2026-03-10T12:35:01.704 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.704+0000 7f6afdd46700 1 -- 192.168.123.100:0/3669257160 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6af819b190 con 0x7f6af81065c0 2026-03-10T12:35:01.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.705+0000 7f6af4ff9700 1 -- 192.168.123.100:0/3669257160 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f6ae0017870 con 0x7f6af81065c0 2026-03-10T12:35:01.706 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.705+0000 7f6afdd46700 1 -- 192.168.123.100:0/3669257160 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6af804ea50 con 0x7f6af81065c0 2026-03-10T12:35:01.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.705+0000 7f6af4ff9700 1 --2- 192.168.123.100:0/3669257160 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6ae406c480 0x7f6ae406e930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:01.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.705+0000 7f6af4ff9700 1 -- 192.168.123.100:0/3669257160 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f6ae008cc30 con 0x7f6af81065c0 2026-03-10T12:35:01.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.706+0000 7f6af77fe700 1 --2- 192.168.123.100:0/3669257160 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6ae406c480 0x7f6ae406e930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:01.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.706+0000 7f6af77fe700 1 --2- 192.168.123.100:0/3669257160 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6ae406c480 0x7f6ae406e930 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f6ae8005e50 tx=0x7f6ae8005a90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:01.708 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.708+0000 7f6af4ff9700 1 -- 192.168.123.100:0/3669257160 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6ae00576f0 con 0x7f6af81065c0 2026-03-10T12:35:01.778 
INFO:tasks.cephadm.ceph_manager.ceph:need seq 73014444041 got 73014444042 for osd.2 2026-03-10T12:35:01.778 DEBUG:teuthology.parallel:result is None 2026-03-10T12:35:01.814 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.814+0000 7fa429a9e700 1 -- 192.168.123.100:0/614491502 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 4} v 0) v1 -- 0x7fa4100059f0 con 0x7fa42410c8b0 2026-03-10T12:35:01.815 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.815+0000 7fa420ff9700 1 -- 192.168.123.100:0/614491502 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 4}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7fa41c026020 con 0x7fa42410c8b0 2026-03-10T12:35:01.815 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:01 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/2173396806' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-10T12:35:01.815 INFO:teuthology.orchestra.run.vm00.stdout:124554051590 2026-03-10T12:35:01.823 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.821+0000 7fa429a9e700 1 -- 192.168.123.100:0/614491502 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa40c06c550 msgr2=0x7fa40c06ea00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:01.823 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.821+0000 7fa429a9e700 1 --2- 192.168.123.100:0/614491502 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa40c06c550 0x7fa40c06ea00 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fa414009800 tx=0x7fa414006d20 comp rx=0 tx=0).stop 2026-03-10T12:35:01.823 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.821+0000 7fa429a9e700 1 -- 192.168.123.100:0/614491502 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa42410c8b0 msgr2=0x7fa424132840 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T12:35:01.823 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.821+0000 7fa429a9e700 1 --2- 192.168.123.100:0/614491502 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa42410c8b0 0x7fa424132840 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fa41c004180 tx=0x7fa41c0043a0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.823 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.823+0000 7fa429a9e700 1 -- 192.168.123.100:0/614491502 shutdown_connections 2026-03-10T12:35:01.823 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.823+0000 7fa429a9e700 1 --2- 192.168.123.100:0/614491502 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa40c06c550 0x7fa40c06ea00 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.823 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.823+0000 7fa429a9e700 1 --2- 192.168.123.100:0/614491502 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa424071e40 0x7fa424137840 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.823+0000 7fa429a9e700 1 --2- 192.168.123.100:0/614491502 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa42410c8b0 0x7fa424132840 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.823+0000 7fa429a9e700 1 -- 192.168.123.100:0/614491502 >> 192.168.123.100:0/614491502 conn(0x7fa42406c6c0 msgr2=0x7fa424070320 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:01.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.823+0000 7fa429a9e700 1 -- 192.168.123.100:0/614491502 shutdown_connections 2026-03-10T12:35:01.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.823+0000 7fa429a9e700 1 -- 
192.168.123.100:0/614491502 wait complete. 2026-03-10T12:35:01.837 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.837+0000 7f6afdd46700 1 -- 192.168.123.100:0/3669257160 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 1} v 0) v1 -- 0x7f6af8197510 con 0x7f6af81065c0 2026-03-10T12:35:01.838 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.838+0000 7f6af4ff9700 1 -- 192.168.123.100:0/3669257160 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 1}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f6ae0027020 con 0x7f6af81065c0 2026-03-10T12:35:01.839 INFO:teuthology.orchestra.run.vm00.stdout:55834574860 2026-03-10T12:35:01.841 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.841+0000 7f6aee7fc700 1 -- 192.168.123.100:0/3669257160 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6ae406c480 msgr2=0x7f6ae406e930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:01.841 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.841+0000 7f6aee7fc700 1 --2- 192.168.123.100:0/3669257160 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6ae406c480 0x7f6ae406e930 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f6ae8005e50 tx=0x7f6ae8005a90 comp rx=0 tx=0).stop 2026-03-10T12:35:01.842 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.841+0000 7f6aee7fc700 1 -- 192.168.123.100:0/3669257160 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6af81065c0 msgr2=0x7f6af8196630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:01.842 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.841+0000 7f6aee7fc700 1 --2- 192.168.123.100:0/3669257160 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6af81065c0 0x7f6af8196630 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f6ae000f690 tx=0x7f6ae000f770 comp 
rx=0 tx=0).stop 2026-03-10T12:35:01.842 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.842+0000 7f6aee7fc700 1 -- 192.168.123.100:0/3669257160 shutdown_connections 2026-03-10T12:35:01.842 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.842+0000 7f6aee7fc700 1 --2- 192.168.123.100:0/3669257160 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f6ae406c480 0x7f6ae406e930 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.842 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.842+0000 7f6aee7fc700 1 --2- 192.168.123.100:0/3669257160 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6af81005a0 0x7f6af81960f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.842 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.842+0000 7f6aee7fc700 1 --2- 192.168.123.100:0/3669257160 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6af81065c0 0x7f6af8196630 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.842 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.842+0000 7f6aee7fc700 1 -- 192.168.123.100:0/3669257160 >> 192.168.123.100:0/3669257160 conn(0x7f6af8078550 msgr2=0x7f6af80fed00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:01.843 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.842+0000 7f6aee7fc700 1 -- 192.168.123.100:0/3669257160 shutdown_connections 2026-03-10T12:35:01.843 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.843+0000 7f6aee7fc700 1 -- 192.168.123.100:0/3669257160 wait complete. 
2026-03-10T12:35:01.884 INFO:tasks.cephadm.ceph_manager.ceph:need seq 55834574859 got 55834574860 for osd.1 2026-03-10T12:35:01.884 DEBUG:teuthology.parallel:result is None 2026-03-10T12:35:01.903 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.903+0000 7fec27795700 1 -- 192.168.123.100:0/3685537998 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 0} v 0) v1 -- 0x7fec201176b0 con 0x7fec20071b60 2026-03-10T12:35:01.904 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.904+0000 7fec1b7fe700 1 -- 192.168.123.100:0/3685537998 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 0}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7fec14027030 con 0x7fec20071b60 2026-03-10T12:35:01.911 INFO:teuthology.orchestra.run.vm00.stdout:38654705678 2026-03-10T12:35:01.915 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.915+0000 7fec27795700 1 -- 192.168.123.100:0/3685537998 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fec1006c520 msgr2=0x7fec1006e9d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:01.915 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.915+0000 7fec27795700 1 --2- 192.168.123.100:0/3685537998 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fec1006c520 0x7fec1006e9d0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fec20117970 tx=0x7fec0c009450 comp rx=0 tx=0).stop 2026-03-10T12:35:01.915 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.915+0000 7fec27795700 1 -- 192.168.123.100:0/3685537998 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fec20071b60 msgr2=0x7fec20119160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:01.916 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.915+0000 7fec27795700 1 --2- 192.168.123.100:0/3685537998 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fec20071b60 0x7fec20119160 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7fec14005950 tx=0x7fec14004ef0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.916 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.915+0000 7fec27795700 1 -- 192.168.123.100:0/3685537998 shutdown_connections 2026-03-10T12:35:01.916 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.915+0000 7fec27795700 1 --2- 192.168.123.100:0/3685537998 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fec1006c520 0x7fec1006e9d0 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.916 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.915+0000 7fec27795700 1 --2- 192.168.123.100:0/3685537998 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fec20071b60 0x7fec20119160 unknown :-1 s=CLOSED pgs=234 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.916 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.915+0000 7fec27795700 1 --2- 192.168.123.100:0/3685537998 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fec2010e9e0 0x7fec201196a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:01.916 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.915+0000 7fec27795700 1 -- 192.168.123.100:0/3685537998 >> 192.168.123.100:0/3685537998 conn(0x7fec2006c6c0 msgr2=0x7fec2006cfb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:01.916 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.916+0000 7fec27795700 1 -- 192.168.123.100:0/3685537998 shutdown_connections 2026-03-10T12:35:01.916 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:01.916+0000 7fec27795700 1 -- 192.168.123.100:0/3685537998 wait complete. 
2026-03-10T12:35:01.920 INFO:tasks.cephadm.ceph_manager.ceph:need seq 124554051590 got 124554051590 for osd.4 2026-03-10T12:35:01.920 DEBUG:teuthology.parallel:result is None 2026-03-10T12:35:02.010 INFO:tasks.cephadm.ceph_manager.ceph:need seq 38654705677 got 38654705678 for osd.0 2026-03-10T12:35:02.010 DEBUG:teuthology.parallel:result is None 2026-03-10T12:35:02.010 INFO:tasks.cephadm.ceph_manager.ceph:waiting for clean 2026-03-10T12:35:02.010 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph pg dump --format=json 2026-03-10T12:35:02.159 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:02.413 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.412+0000 7f4851a53700 1 -- 192.168.123.100:0/1549910995 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f484c100550 msgr2=0x7f484c1009c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:02.413 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.412+0000 7f4851a53700 1 --2- 192.168.123.100:0/1549910995 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f484c100550 0x7f484c1009c0 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7f4834009b50 tx=0x7f4834009e60 comp rx=0 tx=0).stop 2026-03-10T12:35:02.413 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.412+0000 7f4851a53700 1 -- 192.168.123.100:0/1549910995 shutdown_connections 2026-03-10T12:35:02.413 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.412+0000 7f4851a53700 1 --2- 192.168.123.100:0/1549910995 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f484c100550 0x7f484c1009c0 unknown :-1 s=CLOSED pgs=235 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:02.413 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.412+0000 7f4851a53700 1 --2- 192.168.123.100:0/1549910995 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f484c106570 0x7f484c106940 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:02.413 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.412+0000 7f4851a53700 1 -- 192.168.123.100:0/1549910995 >> 192.168.123.100:0/1549910995 conn(0x7f484c0fbff0 msgr2=0x7f484c0fe400 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:02.413 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.413+0000 7f4851a53700 1 -- 192.168.123.100:0/1549910995 shutdown_connections 2026-03-10T12:35:02.413 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.413+0000 7f4851a53700 1 -- 192.168.123.100:0/1549910995 wait complete. 2026-03-10T12:35:02.413 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.413+0000 7f4851a53700 1 Processor -- start 2026-03-10T12:35:02.414 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.413+0000 7f4851a53700 1 -- start start 2026-03-10T12:35:02.414 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.414+0000 7f4851a53700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f484c100550 0x7f484c193f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:02.414 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.414+0000 7f4851a53700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f484c106570 0x7f484c1944a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:02.414 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.414+0000 7f4851a53700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f484c194b80 con 0x7f484c106570 2026-03-10T12:35:02.414 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.414+0000 7f4851a53700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f484c198910 con 0x7f484c100550 2026-03-10T12:35:02.414 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.414+0000 7f4843fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f484c106570 0x7f484c1944a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:02.414 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.414+0000 7f4843fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f484c106570 0x7f484c1944a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33480/0 (socket says 192.168.123.100:33480) 2026-03-10T12:35:02.414 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.414+0000 7f4843fff700 1 -- 192.168.123.100:0/1038917301 learned_addr learned my addr 192.168.123.100:0/1038917301 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:02.415 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.414+0000 7f4843fff700 1 -- 192.168.123.100:0/1038917301 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f484c100550 msgr2=0x7f484c193f60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:02.415 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.414+0000 7f4843fff700 1 --2- 192.168.123.100:0/1038917301 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f484c100550 0x7f484c193f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:02.415 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.414+0000 7f4843fff700 1 -- 192.168.123.100:0/1038917301 --> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f483c009710 con 0x7f484c106570 2026-03-10T12:35:02.415 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.415+0000 7f4843fff700 1 --2- 192.168.123.100:0/1038917301 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f484c106570 0x7f484c1944a0 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7f48340094d0 tx=0x7f4834004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:02.415 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.415+0000 7f4850a51700 1 -- 192.168.123.100:0/1038917301 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f483401d070 con 0x7f484c106570 2026-03-10T12:35:02.416 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.415+0000 7f4850a51700 1 -- 192.168.123.100:0/1038917301 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f483400bc50 con 0x7f484c106570 2026-03-10T12:35:02.416 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.415+0000 7f4851a53700 1 -- 192.168.123.100:0/1038917301 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f48340097e0 con 0x7f484c106570 2026-03-10T12:35:02.416 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.415+0000 7f4851a53700 1 -- 192.168.123.100:0/1038917301 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f484c198ef0 con 0x7f484c106570 2026-03-10T12:35:02.416 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.416+0000 7f4850a51700 1 -- 192.168.123.100:0/1038917301 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4834022620 con 0x7f484c106570 2026-03-10T12:35:02.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.417+0000 7f4851a53700 1 -- 
192.168.123.100:0/1038917301 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f484c04ea50 con 0x7f484c106570 2026-03-10T12:35:02.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.417+0000 7f4850a51700 1 -- 192.168.123.100:0/1038917301 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f4834022840 con 0x7f484c106570 2026-03-10T12:35:02.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.418+0000 7f4850a51700 1 --2- 192.168.123.100:0/1038917301 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4838070a90 0x7f4838072f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:02.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.418+0000 7f4850a51700 1 -- 192.168.123.100:0/1038917301 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f483408d770 con 0x7f484c106570 2026-03-10T12:35:02.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.420+0000 7f4850a51700 1 -- 192.168.123.100:0/1038917301 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4834058230 con 0x7f484c106570 2026-03-10T12:35:02.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.420+0000 7f484affd700 1 --2- 192.168.123.100:0/1038917301 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4838070a90 0x7f4838072f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:02.421 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.421+0000 7f484affd700 1 --2- 192.168.123.100:0/1038917301 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4838070a90 0x7f4838072f40 secure :-1 s=READY 
pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f483c005d90 tx=0x7f483c005ce0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:02.523 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.522+0000 7f4851a53700 1 -- 192.168.123.100:0/1038917301 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f484c1991d0 con 0x7f4838070a90 2026-03-10T12:35:02.523 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.523+0000 7f4850a51700 1 -- 192.168.123.100:0/1038917301 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19152 (secure 0 0 0) 0x7f484c1991d0 con 0x7f4838070a90 2026-03-10T12:35:02.523 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:35:02.525 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.525+0000 7f4851a53700 1 -- 192.168.123.100:0/1038917301 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4838070a90 msgr2=0x7f4838072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:02.525 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.525+0000 7f4851a53700 1 --2- 192.168.123.100:0/1038917301 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4838070a90 0x7f4838072f40 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f483c005d90 tx=0x7f483c005ce0 comp rx=0 tx=0).stop 2026-03-10T12:35:02.525 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.525+0000 7f4851a53700 1 -- 192.168.123.100:0/1038917301 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f484c106570 msgr2=0x7f484c1944a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:02.525 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.525+0000 7f4851a53700 1 --2- 192.168.123.100:0/1038917301 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7f484c106570 0x7f484c1944a0 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7f48340094d0 tx=0x7f4834004970 comp rx=0 tx=0).stop 2026-03-10T12:35:02.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.526+0000 7f4851a53700 1 -- 192.168.123.100:0/1038917301 shutdown_connections 2026-03-10T12:35:02.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.526+0000 7f4851a53700 1 --2- 192.168.123.100:0/1038917301 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4838070a90 0x7f4838072f40 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:02.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.526+0000 7f4851a53700 1 --2- 192.168.123.100:0/1038917301 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f484c100550 0x7f484c193f60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:02.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.526+0000 7f4851a53700 1 --2- 192.168.123.100:0/1038917301 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f484c106570 0x7f484c1944a0 unknown :-1 s=CLOSED pgs=236 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:02.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.526+0000 7f4851a53700 1 -- 192.168.123.100:0/1038917301 >> 192.168.123.100:0/1038917301 conn(0x7f484c0fbff0 msgr2=0x7f484c0fd9b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:02.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.526+0000 7f4851a53700 1 -- 192.168.123.100:0/1038917301 shutdown_connections 2026-03-10T12:35:02.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.526+0000 7f4851a53700 1 -- 192.168.123.100:0/1038917301 wait complete. 
2026-03-10T12:35:02.527 INFO:teuthology.orchestra.run.vm00.stderr:dumped all 2026-03-10T12:35:02.568 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:02 vm00 ceph-mon[50686]: pgmap v71: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:02.568 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:02 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/1960425344' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-10T12:35:02.568 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:02 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/614491502' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-10T12:35:02.568 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:02 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/3669257160' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-10T12:35:02.568 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:02 vm00 ceph-mon[50686]: from='client.? 
192.168.123.100:0/3685537998' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T12:35:02.574 INFO:teuthology.orchestra.run.vm00.stdout:{"pg_ready":true,"pg_map":{"version":71,"stamp":"2026-03-10T12:35:01.095394+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":163692,"kb_used_data":3132,"kb_used_omap":0,"kb_used_meta":160512,"kb_avail":125640852,"statfs":{"total":128823853056,"available":128656232448,"internally_reserved":0,"allocated":3207168,"data_stored":2059902,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":164364288},"hb_peers":[],"snap_trim_qu
eue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.693788"},"pg_stats":[{"pgid":"1.0","version":"21'76","reported_seq":138,"reported_epoch":33,"state":"active+clean","last_fresh":"2026-03-10T12:34:49.383187+0000","last_change":"2026-03-10T12:34:40.405853+0000","last_active":"2026-03-10T12:34:49.383187+0000","last_peered":"2026-03-10T12:34:49.383187+0000","last_clean":"2026-03-10T12:34:49.383187+0000","last_became_active":"2026-03-10T12:34:40.405698+0000","last_became_peered":"2026-03-10T12:34:40.405698+0000","last_unstale":"2026-03-10T12:34:49.383187+0000","last_undegraded":"2026-03-10T12:
34:49.383187+0000","last_fullsized":"2026-03-10T12:34:49.383187+0000","mapping_epoch":28,"log_start":"0'0","ondisk_log_start":"0'0","created":19,"last_epoch_clean":29,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-10T12:34:21.411891+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-10T12:34:21.411891+0000","last_clean_scrub_stamp":"2026-03-10T12:34:21.411891+0000","objects_scrubbed":0,"log_size":76,"log_dups_size":0,"ondisk_log_size":76,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-11T14:35:14.438014+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects"
:2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":33,"seq":141733920772,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27056,"kb_used_data":296,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940368,"statfs":{"total":21470642176,"available":21442936832,"internally_reserved":0,"allocated":303104,"data_stored":113677,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.432}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.41299999999999998}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.47899999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.72899999999999998}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59899999999999998}]}]},{"osd":4,"up_from":29,"seq":124554051590,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27056,"kb_used_data":296,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940368,"statfs":{"total":21470642176,"available":21442936832,"internally_reserved":0,"allocated":303104,"data_stored":113677,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.51100000000000001}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.35799999999999998}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.49299999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.52100000000000002}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48799999999999999}]}]},{"osd":3,"up_from":24,"seq":103079215112,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27508,"kb_used_data":748,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939916,"statfs":{"total":21470642176,"available":21442473984,"internally_reserved":0,"allocated":765952,"data_stored":572957,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66700000000000004}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42199999999999999}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66000000000000003}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.63200000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.32400000000000001}]}]},{"osd":2,"up_from":17,"seq":73014444042,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27056,"kb_used_data":296,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940368,"statfs":{"total":21470642176,"available":21442936832,"internally_reserved":0,"allocated":303104,"data_stored":113677,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42199999999999999}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56699999999999995}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.68500000000000005}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.55000000000000004}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.58299999999999996}]}]},{"osd":0,"up_from":9,"seq":38654705678,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27508,"kb_used_data":748,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939916,"statfs":{"total":21470642176,"available":21442473984,"internally_reserved":0,"allocated":765952,"data_stored":572957,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42599999999999999}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.41999999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.53200000000000003}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56799999999999995}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.50800000000000001}]}]},{"osd":1,"up_from":13,"seq":55834574860,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27508,"kb_used_data":748,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939916,"statfs":{"total":21470642176,"available":21442473984,"internally_reserved":0,"allocated":765952,"data_stored":572957,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.629}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64000000000000001}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65900000000000003}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.67800000000000005}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.879}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 
2026-03-10T12:35:02.574 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph pg dump --format=json 2026-03-10T12:35:02.717 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:02.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:02 vm07 ceph-mon[58582]: pgmap v71: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:02.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:02 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/1960425344' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-10T12:35:02.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:02 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/614491502' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-10T12:35:02.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:02 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/3669257160' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-10T12:35:02.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:02 vm07 ceph-mon[58582]: from='client.? 
192.168.123.100:0/3685537998' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T12:35:02.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.967+0000 7fc46e2cc700 1 -- 192.168.123.100:0/4201548513 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc468068490 msgr2=0x7fc468068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:02.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.967+0000 7fc46e2cc700 1 --2- 192.168.123.100:0/4201548513 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc468068490 0x7fc468068900 secure :-1 s=READY pgs=237 cs=0 l=1 rev1=1 crypto rx=0x7fc458009b00 tx=0x7fc458009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:02.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.968+0000 7fc46e2cc700 1 -- 192.168.123.100:0/4201548513 shutdown_connections 2026-03-10T12:35:02.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.968+0000 7fc46e2cc700 1 --2- 192.168.123.100:0/4201548513 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc468068490 0x7fc468068900 unknown :-1 s=CLOSED pgs=237 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:02.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.968+0000 7fc46e2cc700 1 --2- 192.168.123.100:0/4201548513 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc4681013c0 0x7fc468101790 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:02.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.968+0000 7fc46e2cc700 1 -- 192.168.123.100:0/4201548513 >> 192.168.123.100:0/4201548513 conn(0x7fc4680754a0 msgr2=0x7fc4680758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:02.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.968+0000 7fc46e2cc700 1 -- 192.168.123.100:0/4201548513 shutdown_connections 2026-03-10T12:35:02.969 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.968+0000 7fc46e2cc700 1 -- 192.168.123.100:0/4201548513 wait complete. 2026-03-10T12:35:02.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.969+0000 7fc46e2cc700 1 Processor -- start 2026-03-10T12:35:02.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.969+0000 7fc46e2cc700 1 -- start start 2026-03-10T12:35:02.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.969+0000 7fc46e2cc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc468068490 0x7fc468198320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:02.970 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.969+0000 7fc46e2cc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc4681013c0 0x7fc468198860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:02.970 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.969+0000 7fc46e2cc700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc468198f40 con 0x7fc4681013c0 2026-03-10T12:35:02.970 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.969+0000 7fc46e2cc700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc46819ccd0 con 0x7fc468068490 2026-03-10T12:35:02.971 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.970+0000 7fc4677fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc4681013c0 0x7fc468198860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:02.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.970+0000 7fc467fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc468068490 0x7fc468198320 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:02.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.970+0000 7fc467fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc468068490 0x7fc468198320 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:48874/0 (socket says 192.168.123.100:48874) 2026-03-10T12:35:02.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.970+0000 7fc467fff700 1 -- 192.168.123.100:0/1276014381 learned_addr learned my addr 192.168.123.100:0/1276014381 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.970+0000 7fc4677fe700 1 -- 192.168.123.100:0/1276014381 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc468068490 msgr2=0x7fc468198320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.970+0000 7fc4677fe700 1 --2- 192.168.123.100:0/1276014381 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc468068490 0x7fc468198320 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.970+0000 7fc4677fe700 1 -- 192.168.123.100:0/1276014381 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc4580097e0 con 0x7fc4681013c0 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.970+0000 7fc4677fe700 1 --2- 192.168.123.100:0/1276014381 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc4681013c0 0x7fc468198860 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7fc458009ad0 
tx=0x7fc4580048c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.970+0000 7fc4657fa700 1 -- 192.168.123.100:0/1276014381 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc45801d070 con 0x7fc4681013c0 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.970+0000 7fc46e2cc700 1 -- 192.168.123.100:0/1276014381 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc46819cef0 con 0x7fc4681013c0 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.970+0000 7fc46e2cc700 1 -- 192.168.123.100:0/1276014381 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc46819d440 con 0x7fc4681013c0 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.970+0000 7fc4657fa700 1 -- 192.168.123.100:0/1276014381 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc458022470 con 0x7fc4681013c0 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.970+0000 7fc4657fa700 1 -- 192.168.123.100:0/1276014381 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc45800f670 con 0x7fc4681013c0 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.971+0000 7fc4657fa700 1 -- 192.168.123.100:0/1276014381 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fc45800baa0 con 0x7fc4681013c0 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.972+0000 7fc4657fa700 1 --2- 192.168.123.100:0/1276014381 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc45006c6d0 0x7fc45006eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 
tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.972+0000 7fc4657fa700 1 -- 192.168.123.100:0/1276014381 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fc45808c910 con 0x7fc4681013c0 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.972+0000 7fc467fff700 1 --2- 192.168.123.100:0/1276014381 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc45006c6d0 0x7fc45006eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.972+0000 7fc46e2cc700 1 -- 192.168.123.100:0/1276014381 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc454005320 con 0x7fc4681013c0 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.972+0000 7fc467fff700 1 --2- 192.168.123.100:0/1276014381 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc45006c6d0 0x7fc45006eb80 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7fc45c005950 tx=0x7fc45c00b500 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:02.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:02.975+0000 7fc4657fa700 1 -- 192.168.123.100:0/1276014381 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc458057480 con 0x7fc4681013c0 2026-03-10T12:35:03.077 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.077+0000 7fc46e2cc700 1 -- 192.168.123.100:0/1276014381 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": 
"json"}) v1 -- 0x7fc454000bf0 con 0x7fc45006c6d0 2026-03-10T12:35:03.078 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.078+0000 7fc4657fa700 1 -- 192.168.123.100:0/1276014381 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19152 (secure 0 0 0) 0x7fc454000bf0 con 0x7fc45006c6d0 2026-03-10T12:35:03.078 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:35:03.080 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.080+0000 7fc46e2cc700 1 -- 192.168.123.100:0/1276014381 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc45006c6d0 msgr2=0x7fc45006eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:03.080 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.080+0000 7fc46e2cc700 1 --2- 192.168.123.100:0/1276014381 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc45006c6d0 0x7fc45006eb80 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7fc45c005950 tx=0x7fc45c00b500 comp rx=0 tx=0).stop 2026-03-10T12:35:03.081 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.080+0000 7fc46e2cc700 1 -- 192.168.123.100:0/1276014381 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc4681013c0 msgr2=0x7fc468198860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:03.081 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.080+0000 7fc46e2cc700 1 --2- 192.168.123.100:0/1276014381 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc4681013c0 0x7fc468198860 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7fc458009ad0 tx=0x7fc4580048c0 comp rx=0 tx=0).stop 2026-03-10T12:35:03.081 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.081+0000 7fc46e2cc700 1 -- 192.168.123.100:0/1276014381 shutdown_connections 2026-03-10T12:35:03.081 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.081+0000 7fc46e2cc700 1 --2- 192.168.123.100:0/1276014381 
>> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fc45006c6d0 0x7fc45006eb80 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:03.081 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.081+0000 7fc46e2cc700 1 --2- 192.168.123.100:0/1276014381 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc468068490 0x7fc468198320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:03.081 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.081+0000 7fc46e2cc700 1 --2- 192.168.123.100:0/1276014381 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc4681013c0 0x7fc468198860 unknown :-1 s=CLOSED pgs=238 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:03.081 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.081+0000 7fc46e2cc700 1 -- 192.168.123.100:0/1276014381 >> 192.168.123.100:0/1276014381 conn(0x7fc4680754a0 msgr2=0x7fc4680fdd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:03.081 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.081+0000 7fc46e2cc700 1 -- 192.168.123.100:0/1276014381 shutdown_connections 2026-03-10T12:35:03.081 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.081+0000 7fc46e2cc700 1 -- 192.168.123.100:0/1276014381 wait complete. 
2026-03-10T12:35:03.082 INFO:teuthology.orchestra.run.vm00.stderr:dumped all 2026-03-10T12:35:03.144 INFO:teuthology.orchestra.run.vm00.stdout:{"pg_ready":true,"pg_map":{"version":71,"stamp":"2026-03-10T12:35:01.095394+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":163692,"kb_used_data":3132,"kb_used_omap":0,"kb_used_meta":160512,"kb_avail":125640852,"statfs":{"total":128823853056,"available":128656232448,"internally_reserved":0,"allocated":3207168,"data_stored":2059902,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":164364288},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0
,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.693788"},"pg_stats":[{"pgid":"1.0","version":"21'76","reported_seq":138,"reported_epoch":33,"state":"active+clean","last_fresh":"2026-03-10T12:34:49.383187+0000","last_change":"2026-03-10T12:34:40.405853+0000","last_active":"2026-03-10T12:34:49.383187+0000","last_peered":"2026-03-10T12:34:49.383187+0000","last_clean":"2026-03-10T12:34:49.383187+0000","last_became_active":"2026-03-10T12:34:40.405698+0000","last_became_peered":"2026-03-10T12:34:40.405698+0000","last_unstale":"2026-03-10T12:34:49.383187+0000","last_undegraded":"2026-03-10T12:34:49.383187+0000","last_fullsiz
ed":"2026-03-10T12:34:49.383187+0000","mapping_epoch":28,"log_start":"0'0","ondisk_log_start":"0'0","created":19,"last_epoch_clean":29,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-10T12:34:21.411891+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-10T12:34:21.411891+0000","last_clean_scrub_stamp":"2026-03-10T12:34:21.411891+0000","objects_scrubbed":0,"log_size":76,"log_dups_size":0,"ondisk_log_size":76,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-11T14:35:14.438014+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_ob
ject_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":33,"seq":141733920772,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27056,"kb_used_data":296,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940368,"statfs":{"total":21470642176,"available":21442936832,"internally_reserved":0,"allocated":303104,"data_stored":113677,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.432}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.41299999999999998}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.47899999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.72899999999999998}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59899999999999998}]}]},{"osd":4,"up_from":29,"seq":124554051590,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27056,"kb_used_data":296,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940368,"statfs":{"total":21470642176,"available":21442936832,"internally_reserved":0,"allocated":303104,"data_stored":113677,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.51100000000000001}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.35799999999999998}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.49299999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.52100000000000002}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48799999999999999}]}]},{"osd":3,"up_from":24,"seq":103079215112,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27508,"kb_used_data":748,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939916,"statfs":{"total":21470642176,"available":21442473984,"internally_reserved":0,"allocated":765952,"data_stored":572957,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66700000000000004}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42199999999999999}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66000000000000003}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.63200000000000001}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.32400000000000001}]}]},{"osd":2,"up_from":17,"seq":73014444042,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27056,"kb_used_data":296,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940368,"statfs":{"total":21470642176,"available":21442936832,"internally_reserved":0,"allocated":303104,"data_stored":113677,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42199999999999999}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56699999999999995}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.68500000000000005}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.55000000000000004}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.58299999999999996}]}]},{"osd":0,"up_from":9,"seq":38654705678,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27508,"kb_used_data":748,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939916,"statfs":{"total":21470642176,"available":21442473984,"internally_reserved":0,"allocated":765952,"data_stored":572957,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42599999999999999}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.41999999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.53200000000000003}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56799999999999995}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.50800000000000001}]}]},{"osd":1,"up_from":13,"seq":55834574860,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27508,"kb_used_data":748,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939916,"statfs":{"total":21470642176,"available":21442473984,"internally_reserved":0,"allocated":765952,"data_stored":572957,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.629}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64000000000000001}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65900000000000003}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.67800000000000005}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.879}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 
2026-03-10T12:35:03.144 INFO:tasks.cephadm.ceph_manager.ceph:clean! 2026-03-10T12:35:03.144 INFO:tasks.ceph:Waiting until ceph cluster ceph is healthy... 2026-03-10T12:35:03.144 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy 2026-03-10T12:35:03.144 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph health --format=json 2026-03-10T12:35:03.284 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:03.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.507+0000 7fa76718e700 1 -- 192.168.123.100:0/4013483828 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa760102790 msgr2=0x7fa760102c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:03.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.507+0000 7fa76718e700 1 --2- 192.168.123.100:0/4013483828 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa760102790 0x7fa760102c00 secure :-1 s=READY pgs=239 cs=0 l=1 rev1=1 crypto rx=0x7fa754009b00 tx=0x7fa754009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:03.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.507+0000 7fa76718e700 1 -- 192.168.123.100:0/4013483828 shutdown_connections 2026-03-10T12:35:03.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.507+0000 7fa76718e700 1 --2- 192.168.123.100:0/4013483828 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa760102790 0x7fa760102c00 unknown :-1 s=CLOSED pgs=239 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:03.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.507+0000 7fa76718e700 1 --2- 192.168.123.100:0/4013483828 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa760108790 0x7fa760108b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:03.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.507+0000 7fa76718e700 1 -- 192.168.123.100:0/4013483828 >> 192.168.123.100:0/4013483828 conn(0x7fa7600fe2b0 msgr2=0x7fa7601006c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:03.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.508+0000 7fa76718e700 1 -- 192.168.123.100:0/4013483828 shutdown_connections 2026-03-10T12:35:03.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.508+0000 7fa76718e700 1 -- 192.168.123.100:0/4013483828 wait complete. 2026-03-10T12:35:03.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.508+0000 7fa76718e700 1 Processor -- start 2026-03-10T12:35:03.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.508+0000 7fa76718e700 1 -- start start 2026-03-10T12:35:03.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.508+0000 7fa76718e700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa760102790 0x7fa7601983a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:03.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.509+0000 7fa76718e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa760108790 0x7fa7601988e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:03.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.509+0000 7fa76718e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa760198fc0 con 0x7fa760108790 2026-03-10T12:35:03.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.509+0000 7fa76718e700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa76019cd50 con 0x7fa760102790 2026-03-10T12:35:03.509 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.509+0000 7fa75ffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa760108790 0x7fa7601988e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:03.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.509+0000 7fa75ffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa760108790 0x7fa7601988e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33528/0 (socket says 192.168.123.100:33528) 2026-03-10T12:35:03.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.509+0000 7fa75ffff700 1 -- 192.168.123.100:0/1436316044 learned_addr learned my addr 192.168.123.100:0/1436316044 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:03.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.509+0000 7fa75ffff700 1 -- 192.168.123.100:0/1436316044 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa760102790 msgr2=0x7fa7601983a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:03.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.509+0000 7fa764f2a700 1 --2- 192.168.123.100:0/1436316044 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa760102790 0x7fa7601983a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:03.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.509+0000 7fa75ffff700 1 --2- 192.168.123.100:0/1436316044 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa760102790 0x7fa7601983a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:03.509 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.509+0000 7fa75ffff700 1 -- 192.168.123.100:0/1436316044 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa7540097e0 con 0x7fa760108790 2026-03-10T12:35:03.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.509+0000 7fa764f2a700 1 --2- 192.168.123.100:0/1436316044 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa760102790 0x7fa7601983a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:35:03.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.509+0000 7fa75ffff700 1 --2- 192.168.123.100:0/1436316044 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa760108790 0x7fa7601988e0 secure :-1 s=READY pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7fa75400b5c0 tx=0x7fa7540049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:03.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.510+0000 7fa75dffb700 1 -- 192.168.123.100:0/1436316044 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa75401d070 con 0x7fa760108790 2026-03-10T12:35:03.511 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.510+0000 7fa75dffb700 1 -- 192.168.123.100:0/1436316044 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa75400bc50 con 0x7fa760108790 2026-03-10T12:35:03.511 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.510+0000 7fa76718e700 1 -- 192.168.123.100:0/1436316044 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa76019cfd0 con 0x7fa760108790 2026-03-10T12:35:03.511 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.510+0000 7fa76718e700 1 -- 192.168.123.100:0/1436316044 --> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa76019d4c0 con 0x7fa760108790 2026-03-10T12:35:03.511 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.511+0000 7fa76718e700 1 -- 192.168.123.100:0/1436316044 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa76004ea50 con 0x7fa760108790 2026-03-10T12:35:03.511 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.511+0000 7fa75dffb700 1 -- 192.168.123.100:0/1436316044 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa754022620 con 0x7fa760108790 2026-03-10T12:35:03.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.512+0000 7fa75dffb700 1 -- 192.168.123.100:0/1436316044 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fa754022780 con 0x7fa760108790 2026-03-10T12:35:03.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.512+0000 7fa75dffb700 1 --2- 192.168.123.100:0/1436316044 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa74806c680 0x7fa74806eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:03.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.512+0000 7fa75dffb700 1 -- 192.168.123.100:0/1436316044 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fa75408d8f0 con 0x7fa760108790 2026-03-10T12:35:03.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.514+0000 7fa764f2a700 1 --2- 192.168.123.100:0/1436316044 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa74806c680 0x7fa74806eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:03.515 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.514+0000 7fa764f2a700 1 --2- 192.168.123.100:0/1436316044 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa74806c680 0x7fa74806eb30 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7fa750005fd0 tx=0x7fa750005de0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:03.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.515+0000 7fa75dffb700 1 -- 192.168.123.100:0/1436316044 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa7540583b0 con 0x7fa760108790 2026-03-10T12:35:03.641 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.641+0000 7fa76718e700 1 -- 192.168.123.100:0/1436316044 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "health", "format": "json"} v 0) v1 -- 0x7fa760066e40 con 0x7fa760108790 2026-03-10T12:35:03.644 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.643+0000 7fa75dffb700 1 -- 192.168.123.100:0/1436316044 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "health", "format": "json"}]=0 v0) v1 ==== 72+0+46 (secure 0 0 0) 0x7fa754027080 con 0x7fa760108790 2026-03-10T12:35:03.644 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:35:03.644 INFO:teuthology.orchestra.run.vm00.stdout:{"status":"HEALTH_OK","checks":{},"mutes":[]} 2026-03-10T12:35:03.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.646+0000 7fa76718e700 1 -- 192.168.123.100:0/1436316044 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa74806c680 msgr2=0x7fa74806eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:03.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.646+0000 7fa76718e700 1 --2- 192.168.123.100:0/1436316044 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] 
conn(0x7fa74806c680 0x7fa74806eb30 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7fa750005fd0 tx=0x7fa750005de0 comp rx=0 tx=0).stop 2026-03-10T12:35:03.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.646+0000 7fa76718e700 1 -- 192.168.123.100:0/1436316044 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa760108790 msgr2=0x7fa7601988e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:03.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.646+0000 7fa76718e700 1 --2- 192.168.123.100:0/1436316044 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa760108790 0x7fa7601988e0 secure :-1 s=READY pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7fa75400b5c0 tx=0x7fa7540049e0 comp rx=0 tx=0).stop 2026-03-10T12:35:03.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.646+0000 7fa76718e700 1 -- 192.168.123.100:0/1436316044 shutdown_connections 2026-03-10T12:35:03.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.646+0000 7fa76718e700 1 --2- 192.168.123.100:0/1436316044 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa74806c680 0x7fa74806eb30 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:03.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.646+0000 7fa76718e700 1 --2- 192.168.123.100:0/1436316044 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa760102790 0x7fa7601983a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:03.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.646+0000 7fa76718e700 1 --2- 192.168.123.100:0/1436316044 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa760108790 0x7fa7601988e0 unknown :-1 s=CLOSED pgs=240 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:03.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.646+0000 
7fa76718e700 1 -- 192.168.123.100:0/1436316044 >> 192.168.123.100:0/1436316044 conn(0x7fa7600fe2b0 msgr2=0x7fa7600ffb90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:03.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.646+0000 7fa76718e700 1 -- 192.168.123.100:0/1436316044 shutdown_connections 2026-03-10T12:35:03.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:03.646+0000 7fa76718e700 1 -- 192.168.123.100:0/1436316044 wait complete. 2026-03-10T12:35:03.702 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy done 2026-03-10T12:35:03.702 INFO:tasks.cephadm:Setup complete, yielding 2026-03-10T12:35:03.702 INFO:teuthology.run_tasks:Running task print... 2026-03-10T12:35:03.704 INFO:teuthology.task.print:**** done end installing v18.2.0 cephadm ... 2026-03-10T12:35:03.704 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T12:35:03.706 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm00.local 2026-03-10T12:35:03.706 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- bash -c 'ceph config set mgr mgr/cephadm/use_repo_digest true --force' 2026-03-10T12:35:03.844 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:04.070 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.068+0000 7f956fb93700 1 -- 192.168.123.100:0/308916709 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9568069180 msgr2=0x7f9568103140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:04.070 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.068+0000 7f956fb93700 1 --2- 192.168.123.100:0/308916709 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9568069180 0x7f9568103140 secure :-1 s=READY 
pgs=241 cs=0 l=1 rev1=1 crypto rx=0x7f9558009b00 tx=0x7f9558009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:04.070 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.069+0000 7f956fb93700 1 -- 192.168.123.100:0/308916709 shutdown_connections 2026-03-10T12:35:04.070 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.069+0000 7f956fb93700 1 --2- 192.168.123.100:0/308916709 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9568103680 0x7f9568105ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:04.070 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.069+0000 7f956fb93700 1 --2- 192.168.123.100:0/308916709 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9568069180 0x7f9568103140 unknown :-1 s=CLOSED pgs=241 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:04.070 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.069+0000 7f956fb93700 1 -- 192.168.123.100:0/308916709 >> 192.168.123.100:0/308916709 conn(0x7f95680faa70 msgr2=0x7f95680fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:04.070 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.069+0000 7f956fb93700 1 -- 192.168.123.100:0/308916709 shutdown_connections 2026-03-10T12:35:04.070 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.069+0000 7f956fb93700 1 -- 192.168.123.100:0/308916709 wait complete. 
2026-03-10T12:35:04.070 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.070+0000 7f956fb93700 1 Processor -- start 2026-03-10T12:35:04.070 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.070+0000 7f956fb93700 1 -- start start 2026-03-10T12:35:04.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.070+0000 7f956fb93700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9568069180 0x7f9568198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:04.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.070+0000 7f956fb93700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9568103680 0x7f9568198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:04.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.070+0000 7f956fb93700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9568198b80 con 0x7f9568069180 2026-03-10T12:35:04.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.070+0000 7f956fb93700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9568198cc0 con 0x7f9568103680 2026-03-10T12:35:04.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.070+0000 7f956d92f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9568069180 0x7f9568198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:04.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.070+0000 7f956d92f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9568069180 0x7f9568198020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:33544/0 (socket says 192.168.123.100:33544) 2026-03-10T12:35:04.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.070+0000 7f956d92f700 1 -- 192.168.123.100:0/3462828409 learned_addr learned my addr 192.168.123.100:0/3462828409 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:04.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.070+0000 7f956d12e700 1 --2- 192.168.123.100:0/3462828409 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9568103680 0x7f9568198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:04.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.070+0000 7f956d92f700 1 -- 192.168.123.100:0/3462828409 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9568103680 msgr2=0x7f9568198560 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:04.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.071+0000 7f956d92f700 1 --2- 192.168.123.100:0/3462828409 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9568103680 0x7f9568198560 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:04.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.071+0000 7f956d92f700 1 -- 192.168.123.100:0/3462828409 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f95580097e0 con 0x7f9568069180 2026-03-10T12:35:04.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.071+0000 7f956d92f700 1 --2- 192.168.123.100:0/3462828409 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9568069180 0x7f9568198020 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7f9558000c00 tx=0x7f955800bb80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:35:04.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.071+0000 7f955effd700 1 -- 192.168.123.100:0/3462828409 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f955801d070 con 0x7f9568069180 2026-03-10T12:35:04.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.071+0000 7f956fb93700 1 -- 192.168.123.100:0/3462828409 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f956818e800 con 0x7f9568069180 2026-03-10T12:35:04.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.071+0000 7f956fb93700 1 -- 192.168.123.100:0/3462828409 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f956818ecf0 con 0x7f9568069180 2026-03-10T12:35:04.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.071+0000 7f955effd700 1 -- 192.168.123.100:0/3462828409 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9558022470 con 0x7f9568069180 2026-03-10T12:35:04.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.071+0000 7f955effd700 1 -- 192.168.123.100:0/3462828409 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f955800f650 con 0x7f9568069180 2026-03-10T12:35:04.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.072+0000 7f955effd700 1 -- 192.168.123.100:0/3462828409 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f955800f7b0 con 0x7f9568069180 2026-03-10T12:35:04.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.073+0000 7f956d12e700 1 --2- 192.168.123.100:0/3462828409 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9568103680 0x7f9568198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T12:35:04.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.073+0000 7f955effd700 1 --2- 192.168.123.100:0/3462828409 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f955406c680 0x7f955406eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:04.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.073+0000 7f955effd700 1 -- 192.168.123.100:0/3462828409 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f955808c920 con 0x7f9568069180 2026-03-10T12:35:04.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.073+0000 7f956d12e700 1 --2- 192.168.123.100:0/3462828409 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f955406c680 0x7f955406eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:04.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.073+0000 7f956fb93700 1 -- 192.168.123.100:0/3462828409 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f954c005320 con 0x7f9568069180 2026-03-10T12:35:04.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.076+0000 7f956d12e700 1 --2- 192.168.123.100:0/3462828409 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f955406c680 0x7f955406eb30 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f9564009ea0 tx=0x7f9564009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:04.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.076+0000 7f955effd700 1 -- 192.168.123.100:0/3462828409 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f955805b0e0 con 0x7f9568069180 
2026-03-10T12:35:04.183 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.181+0000 7f956fb93700 1 -- 192.168.123.100:0/3462828409 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/use_repo_digest}] v 0) v1 -- 0x7f954c005190 con 0x7f9568069180 2026-03-10T12:35:04.188 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.188+0000 7f955effd700 1 -- 192.168.123.100:0/3462828409 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/use_repo_digest}]=0 v14) v1 ==== 143+0+0 (secure 0 0 0) 0x7f9558027020 con 0x7f9568069180 2026-03-10T12:35:04.194 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.193+0000 7f956fb93700 1 -- 192.168.123.100:0/3462828409 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f955406c680 msgr2=0x7f955406eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:04.194 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.193+0000 7f956fb93700 1 --2- 192.168.123.100:0/3462828409 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f955406c680 0x7f955406eb30 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f9564009ea0 tx=0x7f9564009450 comp rx=0 tx=0).stop 2026-03-10T12:35:04.194 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.193+0000 7f956fb93700 1 -- 192.168.123.100:0/3462828409 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9568069180 msgr2=0x7f9568198020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:04.194 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.193+0000 7f956fb93700 1 --2- 192.168.123.100:0/3462828409 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9568069180 0x7f9568198020 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7f9558000c00 tx=0x7f955800bb80 comp rx=0 tx=0).stop 2026-03-10T12:35:04.194 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.194+0000 7f956fb93700 1 -- 192.168.123.100:0/3462828409 shutdown_connections 2026-03-10T12:35:04.194 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.194+0000 7f956fb93700 1 --2- 192.168.123.100:0/3462828409 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f955406c680 0x7f955406eb30 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:04.194 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.194+0000 7f956fb93700 1 --2- 192.168.123.100:0/3462828409 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9568069180 0x7f9568198020 unknown :-1 s=CLOSED pgs=242 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:04.194 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.194+0000 7f956fb93700 1 --2- 192.168.123.100:0/3462828409 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9568103680 0x7f9568198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:04.194 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.194+0000 7f956fb93700 1 -- 192.168.123.100:0/3462828409 >> 192.168.123.100:0/3462828409 conn(0x7f95680faa70 msgr2=0x7f95680ff7b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:04.194 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.194+0000 7f956fb93700 1 -- 192.168.123.100:0/3462828409 shutdown_connections 2026-03-10T12:35:04.194 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.194+0000 7f956fb93700 1 -- 192.168.123.100:0/3462828409 wait complete. 2026-03-10T12:35:04.245 INFO:teuthology.run_tasks:Running task print... 2026-03-10T12:35:04.247 INFO:teuthology.task.print:**** done cephadm.shell ceph config set mgr... 2026-03-10T12:35:04.247 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T12:35:04.249 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm00.local 2026-03-10T12:35:04.249 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- bash -c 'ceph orch status' 2026-03-10T12:35:04.410 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:04.660 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.659+0000 7f7a6a441700 1 -- 192.168.123.100:0/739408894 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7a641036a0 msgr2=0x7f7a64105ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:04.660 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.659+0000 7f7a6a441700 1 --2- 192.168.123.100:0/739408894 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7a641036a0 0x7f7a64105ac0 secure :-1 s=READY pgs=243 cs=0 l=1 rev1=1 crypto rx=0x7f7a54009b00 tx=0x7f7a54009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:04.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.660+0000 7f7a6a441700 1 -- 192.168.123.100:0/739408894 shutdown_connections 2026-03-10T12:35:04.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.660+0000 7f7a6a441700 1 --2- 192.168.123.100:0/739408894 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7a641036a0 0x7f7a64105ac0 unknown :-1 s=CLOSED pgs=243 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:04.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.660+0000 7f7a6a441700 1 --2- 192.168.123.100:0/739408894 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7a64069160 0x7f7a64103160 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:04.661 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.660+0000 7f7a6a441700 1 -- 192.168.123.100:0/739408894 >> 192.168.123.100:0/739408894 conn(0x7f7a640faa70 msgr2=0x7f7a640fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:04.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.661+0000 7f7a6a441700 1 -- 192.168.123.100:0/739408894 shutdown_connections 2026-03-10T12:35:04.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.661+0000 7f7a6a441700 1 -- 192.168.123.100:0/739408894 wait complete. 2026-03-10T12:35:04.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.661+0000 7f7a6a441700 1 Processor -- start 2026-03-10T12:35:04.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.661+0000 7f7a6a441700 1 -- start start 2026-03-10T12:35:04.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.661+0000 7f7a6a441700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7a64069160 0x7f7a64195de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:04.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.662+0000 7f7a6a441700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7a641036a0 0x7f7a64196320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:04.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.662+0000 7f7a6a441700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7a64196940 con 0x7f7a641036a0 2026-03-10T12:35:04.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.662+0000 7f7a6a441700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7a64196a80 con 0x7f7a64069160 2026-03-10T12:35:04.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.662+0000 7f7a63fff700 1 --2- >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7a64069160 0x7f7a64195de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:04.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.662+0000 7f7a637fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7a641036a0 0x7f7a64196320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:04.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.662+0000 7f7a637fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7a641036a0 0x7f7a64196320 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33574/0 (socket says 192.168.123.100:33574) 2026-03-10T12:35:04.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.662+0000 7f7a637fe700 1 -- 192.168.123.100:0/3669690918 learned_addr learned my addr 192.168.123.100:0/3669690918 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:04.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.662+0000 7f7a637fe700 1 -- 192.168.123.100:0/3669690918 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7a64069160 msgr2=0x7f7a64195de0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:04.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.662+0000 7f7a637fe700 1 --2- 192.168.123.100:0/3669690918 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7a64069160 0x7f7a64195de0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:04.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.662+0000 7f7a637fe700 1 -- 192.168.123.100:0/3669690918 --> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7a540097e0 con 0x7f7a641036a0 2026-03-10T12:35:04.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.662+0000 7f7a637fe700 1 --2- 192.168.123.100:0/3669690918 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7a641036a0 0x7f7a64196320 secure :-1 s=READY pgs=244 cs=0 l=1 rev1=1 crypto rx=0x7f7a5400bb70 tx=0x7f7a5400bba0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:04.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.662+0000 7f7a617fa700 1 -- 192.168.123.100:0/3669690918 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7a5401d070 con 0x7f7a641036a0 2026-03-10T12:35:04.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.663+0000 7f7a617fa700 1 -- 192.168.123.100:0/3669690918 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7a54022470 con 0x7f7a641036a0 2026-03-10T12:35:04.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.663+0000 7f7a617fa700 1 -- 192.168.123.100:0/3669690918 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7a5400f670 con 0x7f7a641036a0 2026-03-10T12:35:04.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.663+0000 7f7a6a441700 1 -- 192.168.123.100:0/3669690918 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7a6419b4d0 con 0x7f7a641036a0 2026-03-10T12:35:04.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.663+0000 7f7a6a441700 1 -- 192.168.123.100:0/3669690918 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7a6419b9c0 con 0x7f7a641036a0 2026-03-10T12:35:04.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.664+0000 7f7a6a441700 1 -- 
192.168.123.100:0/3669690918 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7a6418ffa0 con 0x7f7a641036a0 2026-03-10T12:35:04.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.664+0000 7f7a617fa700 1 -- 192.168.123.100:0/3669690918 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f7a540225e0 con 0x7f7a641036a0 2026-03-10T12:35:04.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.665+0000 7f7a617fa700 1 --2- 192.168.123.100:0/3669690918 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7a5006c680 0x7f7a5006eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:04.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.665+0000 7f7a617fa700 1 -- 192.168.123.100:0/3669690918 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f7a5408cc40 con 0x7f7a641036a0 2026-03-10T12:35:04.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.665+0000 7f7a63fff700 1 --2- 192.168.123.100:0/3669690918 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7a5006c680 0x7f7a5006eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:04.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.665+0000 7f7a63fff700 1 --2- 192.168.123.100:0/3669690918 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7a5006c680 0x7f7a5006eb30 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f7a64068e90 tx=0x7f7a4c009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:04.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.668+0000 7f7a617fa700 1 -- 192.168.123.100:0/3669690918 <== mon.0 
v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7a5405b190 con 0x7f7a641036a0 2026-03-10T12:35:04.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:04 vm00 ceph-mon[50686]: from='client.14442 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T12:35:04.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:04 vm00 ceph-mon[50686]: from='client.14446 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T12:35:04.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:04 vm00 ceph-mon[50686]: pgmap v72: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:04.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:04 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/1436316044' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-10T12:35:04.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:04 vm00 ceph-mon[50686]: from='client.? 
192.168.123.100:0/3462828409' entity='client.admin' 2026-03-10T12:35:04.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:04 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:35:04.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:04 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:04.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:04 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:35:04.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:04 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:04.779 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.778+0000 7f7a6a441700 1 -- 192.168.123.100:0/3669690918 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7a64061190 con 0x7f7a5006c680 2026-03-10T12:35:04.780 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.780+0000 7f7a617fa700 1 -- 192.168.123.100:0/3669690918 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+43 (secure 0 0 0) 0x7f7a64061190 con 0x7f7a5006c680 2026-03-10T12:35:04.780 INFO:teuthology.orchestra.run.vm00.stdout:Backend: cephadm 2026-03-10T12:35:04.780 INFO:teuthology.orchestra.run.vm00.stdout:Available: Yes 2026-03-10T12:35:04.780 INFO:teuthology.orchestra.run.vm00.stdout:Paused: No 2026-03-10T12:35:04.782 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.782+0000 7f7a6a441700 1 -- 192.168.123.100:0/3669690918 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7a5006c680 msgr2=0x7f7a5006eb30 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:04.782 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.782+0000 7f7a6a441700 1 --2- 192.168.123.100:0/3669690918 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7a5006c680 0x7f7a5006eb30 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f7a64068e90 tx=0x7f7a4c009450 comp rx=0 tx=0).stop 2026-03-10T12:35:04.782 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.782+0000 7f7a6a441700 1 -- 192.168.123.100:0/3669690918 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7a641036a0 msgr2=0x7f7a64196320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:04.782 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.782+0000 7f7a6a441700 1 --2- 192.168.123.100:0/3669690918 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7a641036a0 0x7f7a64196320 secure :-1 s=READY pgs=244 cs=0 l=1 rev1=1 crypto rx=0x7f7a5400bb70 tx=0x7f7a5400bba0 comp rx=0 tx=0).stop 2026-03-10T12:35:04.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.782+0000 7f7a6a441700 1 -- 192.168.123.100:0/3669690918 shutdown_connections 2026-03-10T12:35:04.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.782+0000 7f7a6a441700 1 --2- 192.168.123.100:0/3669690918 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7a5006c680 0x7f7a5006eb30 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:04.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.782+0000 7f7a6a441700 1 --2- 192.168.123.100:0/3669690918 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7a64069160 0x7f7a64195de0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:04.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.782+0000 7f7a6a441700 1 --2- 192.168.123.100:0/3669690918 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7a641036a0 0x7f7a64196320 unknown :-1 s=CLOSED pgs=244 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:04.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.782+0000 7f7a6a441700 1 -- 192.168.123.100:0/3669690918 >> 192.168.123.100:0/3669690918 conn(0x7f7a640faa70 msgr2=0x7f7a640fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:04.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.783+0000 7f7a6a441700 1 -- 192.168.123.100:0/3669690918 shutdown_connections 2026-03-10T12:35:04.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:04.783+0000 7f7a6a441700 1 -- 192.168.123.100:0/3669690918 wait complete. 2026-03-10T12:35:04.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:04 vm07 ceph-mon[58582]: from='client.14442 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T12:35:04.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:04 vm07 ceph-mon[58582]: from='client.14446 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T12:35:04.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:04 vm07 ceph-mon[58582]: pgmap v72: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:04.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:04 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/1436316044' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-10T12:35:04.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:04 vm07 ceph-mon[58582]: from='client.? 
192.168.123.100:0/3462828409' entity='client.admin' 2026-03-10T12:35:04.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:04 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:35:04.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:04 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:04.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:04 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:35:04.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:04 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:04.822 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- bash -c 'ceph orch ps' 2026-03-10T12:35:04.968 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:05.458 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.457+0000 7f8469ffb700 1 -- 192.168.123.100:0/3808637635 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f845c00f460 con 0x7f846c101710 2026-03-10T12:35:05.459 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.458+0000 7f84712c6700 1 -- 192.168.123.100:0/3808637635 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f846c101710 msgr2=0x7f846c101b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:05.459 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.458+0000 7f84712c6700 1 --2- 
192.168.123.100:0/3808637635 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f846c101710 0x7f846c101b60 secure :-1 s=READY pgs=245 cs=0 l=1 rev1=1 crypto rx=0x7f845c009b50 tx=0x7f845c009e60 comp rx=0 tx=0).stop 2026-03-10T12:35:05.459 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.459+0000 7f84712c6700 1 -- 192.168.123.100:0/3808637635 shutdown_connections 2026-03-10T12:35:05.459 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.459+0000 7f84712c6700 1 --2- 192.168.123.100:0/3808637635 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f846c101710 0x7f846c101b60 unknown :-1 s=CLOSED pgs=245 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:05.459 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.459+0000 7f84712c6700 1 --2- 192.168.123.100:0/3808637635 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f846c100510 0x7f846c100920 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:05.459 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.459+0000 7f84712c6700 1 -- 192.168.123.100:0/3808637635 >> 192.168.123.100:0/3808637635 conn(0x7f846c0fba80 msgr2=0x7f846c0fdef0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:05.459 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.459+0000 7f84712c6700 1 -- 192.168.123.100:0/3808637635 shutdown_connections 2026-03-10T12:35:05.459 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.459+0000 7f84712c6700 1 -- 192.168.123.100:0/3808637635 wait complete. 
2026-03-10T12:35:05.460 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.460+0000 7f84712c6700 1 Processor -- start 2026-03-10T12:35:05.460 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.460+0000 7f84712c6700 1 -- start start 2026-03-10T12:35:05.460 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.460+0000 7f84712c6700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f846c100510 0x7f846c197ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:05.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.460+0000 7f84712c6700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f846c101710 0x7f846c198530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:05.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.460+0000 7f84712c6700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f846c198b50 con 0x7f846c100510 2026-03-10T12:35:05.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.460+0000 7f84712c6700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f846c198c90 con 0x7f846c101710 2026-03-10T12:35:05.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.460+0000 7f846a7fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f846c101710 0x7f846c198530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:05.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.460+0000 7f846a7fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f846c101710 0x7f846c198530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:48924/0 (socket says 192.168.123.100:48924) 2026-03-10T12:35:05.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.460+0000 7f846a7fc700 1 -- 192.168.123.100:0/2113606090 learned_addr learned my addr 192.168.123.100:0/2113606090 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:05.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.461+0000 7f846a7fc700 1 -- 192.168.123.100:0/2113606090 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f846c100510 msgr2=0x7f846c197ff0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:35:05.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.461+0000 7f846affd700 1 --2- 192.168.123.100:0/2113606090 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f846c100510 0x7f846c197ff0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:05.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.461+0000 7f846a7fc700 1 --2- 192.168.123.100:0/2113606090 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f846c100510 0x7f846c197ff0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:05.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.461+0000 7f846a7fc700 1 -- 192.168.123.100:0/2113606090 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f845c0097e0 con 0x7f846c101710 2026-03-10T12:35:05.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.461+0000 7f846affd700 1 --2- 192.168.123.100:0/2113606090 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f846c100510 0x7f846c197ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T12:35:05.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.461+0000 7f846a7fc700 1 --2- 192.168.123.100:0/2113606090 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f846c101710 0x7f846c198530 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f845c0094d0 tx=0x7f845c0057f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:05.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.461+0000 7f8463fff700 1 -- 192.168.123.100:0/2113606090 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f845c01d070 con 0x7f846c101710 2026-03-10T12:35:05.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.462+0000 7f84712c6700 1 -- 192.168.123.100:0/2113606090 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f846c19d6e0 con 0x7f846c101710 2026-03-10T12:35:05.463 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.462+0000 7f8463fff700 1 -- 192.168.123.100:0/2113606090 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f845c004e80 con 0x7f846c101710 2026-03-10T12:35:05.463 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.462+0000 7f8463fff700 1 -- 192.168.123.100:0/2113606090 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f845c00f700 con 0x7f846c101710 2026-03-10T12:35:05.463 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.462+0000 7f84712c6700 1 -- 192.168.123.100:0/2113606090 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f846c19dbd0 con 0x7f846c101710 2026-03-10T12:35:05.463 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.463+0000 7f8461ffb700 1 -- 192.168.123.100:0/2113606090 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f84500052f0 con 0x7f846c101710 2026-03-10T12:35:05.464 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.463+0000 7f8463fff700 1 -- 192.168.123.100:0/2113606090 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f845c00bc30 con 0x7f846c101710 2026-03-10T12:35:05.464 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.464+0000 7f8463fff700 1 --2- 192.168.123.100:0/2113606090 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f845806c680 0x7f845806eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:05.464 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.464+0000 7f8463fff700 1 -- 192.168.123.100:0/2113606090 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f845c08d800 con 0x7f846c101710 2026-03-10T12:35:05.465 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.464+0000 7f846affd700 1 --2- 192.168.123.100:0/2113606090 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f845806c680 0x7f845806eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:05.465 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.464+0000 7f846affd700 1 --2- 192.168.123.100:0/2113606090 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f845806c680 0x7f845806eb30 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f846c101570 tx=0x7f8454009670 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:05.466 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.466+0000 7f8463fff700 1 -- 192.168.123.100:0/2113606090 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 
0 0) 0x7f845c05bd50 con 0x7f846c101710 2026-03-10T12:35:05.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.572+0000 7f8461ffb700 1 -- 192.168.123.100:0/2113606090 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f8450000bc0 con 0x7f845806c680 2026-03-10T12:35:05.577 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.577+0000 7f8463fff700 1 -- 192.168.123.100:0/2113606090 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+2640 (secure 0 0 0) 0x7f8450000bc0 con 0x7f845806c680 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (79s) 46s ago 2m 20.7M - 0.25.0 c8568f914cd2 1897443e8fdf 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (2m) 46s ago 2m 7792k - 18.2.0 dc2bc1663786 d9c35bbdf4cd 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (97s) 20s ago 96s 7977k - 18.2.0 dc2bc1663786 2a98961ae9ca 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (2m) 46s ago 2m 7407k - 18.2.0 dc2bc1663786 4726e39e7eb0 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (95s) 20s ago 95s 7402k - 18.2.0 dc2bc1663786 f917dac1f418 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (78s) 46s ago 113s 75.5M - 9.4.7 954c08fa6188 16c12a6ce1fc 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:9283,8765,8443 running (2m) 46s ago 2m 486M - 18.2.0 dc2bc1663786 8dc0a869be20 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running 
(91s) 20s ago 91s 447M - 18.2.0 dc2bc1663786 1662ba2e507c 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (2m) 46s ago 2m 44.9M 2048M 18.2.0 dc2bc1663786 c8d836b38502 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (90s) 20s ago 89s 41.4M 2048M 18.2.0 dc2bc1663786 7712955135fc 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (2m) 46s ago 2m 13.7M - 1.5.0 0da6a335fe13 7a5c14d6ba46 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (92s) 20s ago 92s 12.0M - 1.5.0 0da6a335fe13 2fac2415b763 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (70s) 46s ago 70s 38.0M 4096M 18.2.0 dc2bc1663786 d5b05007694d 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (60s) 46s ago 60s 37.8M 4096M 18.2.0 dc2bc1663786 5bc971fe4d49 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (50s) 46s ago 50s 35.2M 4096M 18.2.0 dc2bc1663786 a5a89ccf847e 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (41s) 20s ago 41s 41.1M 4096M 18.2.0 dc2bc1663786 0c6249fe3951 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (30s) 20s ago 30s 39.8M 4096M 18.2.0 dc2bc1663786 5c66bfb63a83 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (21s) 20s ago 21s 11.5M 4096M 18.2.0 dc2bc1663786 277bdfe4ec55 2026-03-10T12:35:05.578 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (73s) 46s ago 108s 32.4M - 2.43.0 a07b618ecd1d 5d567c813f4b 2026-03-10T12:35:05.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.579+0000 7f8461ffb700 1 -- 192.168.123.100:0/2113606090 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f845806c680 msgr2=0x7f845806eb30 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:05.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.579+0000 7f8461ffb700 1 --2- 192.168.123.100:0/2113606090 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f845806c680 0x7f845806eb30 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f846c101570 tx=0x7f8454009670 comp rx=0 tx=0).stop 2026-03-10T12:35:05.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.580+0000 7f8461ffb700 1 -- 192.168.123.100:0/2113606090 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f846c101710 msgr2=0x7f846c198530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:05.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.580+0000 7f8461ffb700 1 --2- 192.168.123.100:0/2113606090 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f846c101710 0x7f846c198530 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f845c0094d0 tx=0x7f845c0057f0 comp rx=0 tx=0).stop 2026-03-10T12:35:05.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.580+0000 7f8461ffb700 1 -- 192.168.123.100:0/2113606090 shutdown_connections 2026-03-10T12:35:05.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.580+0000 7f8461ffb700 1 --2- 192.168.123.100:0/2113606090 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f845806c680 0x7f845806eb30 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:05.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.580+0000 7f8461ffb700 1 --2- 192.168.123.100:0/2113606090 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f846c100510 0x7f846c197ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:05.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.580+0000 7f8461ffb700 1 --2- 192.168.123.100:0/2113606090 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f846c101710 0x7f846c198530 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:05.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.580+0000 7f8461ffb700 1 -- 192.168.123.100:0/2113606090 >> 192.168.123.100:0/2113606090 conn(0x7f846c0fba80 msgr2=0x7f846c0fd5c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:05.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.580+0000 7f8461ffb700 1 -- 192.168.123.100:0/2113606090 shutdown_connections 2026-03-10T12:35:05.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.580+0000 7f8461ffb700 1 -- 192.168.123.100:0/2113606090 wait complete. 2026-03-10T12:35:05.622 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- bash -c 'ceph orch ls' 2026-03-10T12:35:05.763 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:05.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.985+0000 7f3b00eb9700 1 -- 192.168.123.100:0/1484063625 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3afc102760 msgr2=0x7f3afc102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:05.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.985+0000 7f3b00eb9700 1 --2- 192.168.123.100:0/1484063625 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3afc102760 0x7f3afc102b70 secure :-1 s=READY pgs=246 cs=0 l=1 rev1=1 crypto rx=0x7f3ae4009b00 tx=0x7f3ae4009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:05.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.986+0000 7f3b00eb9700 1 -- 192.168.123.100:0/1484063625 shutdown_connections 2026-03-10T12:35:05.986 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.986+0000 7f3b00eb9700 1 --2- 192.168.123.100:0/1484063625 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3afc103960 0x7f3afc103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:05.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.986+0000 7f3b00eb9700 1 --2- 192.168.123.100:0/1484063625 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3afc102760 0x7f3afc102b70 unknown :-1 s=CLOSED pgs=246 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:05.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.986+0000 7f3b00eb9700 1 -- 192.168.123.100:0/1484063625 >> 192.168.123.100:0/1484063625 conn(0x7f3afc0fdcf0 msgr2=0x7f3afc100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:05.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.986+0000 7f3b00eb9700 1 -- 192.168.123.100:0/1484063625 shutdown_connections 2026-03-10T12:35:05.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.987+0000 7f3b00eb9700 1 -- 192.168.123.100:0/1484063625 wait complete. 
2026-03-10T12:35:05.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.987+0000 7f3b00eb9700 1 Processor -- start 2026-03-10T12:35:05.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.987+0000 7f3b00eb9700 1 -- start start 2026-03-10T12:35:05.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.987+0000 7f3b00eb9700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3afc102760 0x7f3afc198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:05.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.987+0000 7f3b00eb9700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3afc103960 0x7f3afc198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:05.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.987+0000 7f3b00eb9700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3afc198b80 con 0x7f3afc103960 2026-03-10T12:35:05.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.987+0000 7f3b00eb9700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3afc198cc0 con 0x7f3afc102760 2026-03-10T12:35:05.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.987+0000 7f3afa59c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3afc102760 0x7f3afc198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:05.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.987+0000 7f3afa59c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3afc102760 0x7f3afc198020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:48950/0 (socket says 192.168.123.100:48950) 2026-03-10T12:35:05.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.987+0000 7f3afa59c700 1 -- 192.168.123.100:0/2704193031 learned_addr learned my addr 192.168.123.100:0/2704193031 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:05.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.988+0000 7f3afa59c700 1 -- 192.168.123.100:0/2704193031 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3afc103960 msgr2=0x7f3afc198560 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:35:05.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.988+0000 7f3af9d9b700 1 --2- 192.168.123.100:0/2704193031 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3afc103960 0x7f3afc198560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:05.989 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.988+0000 7f3afa59c700 1 --2- 192.168.123.100:0/2704193031 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3afc103960 0x7f3afc198560 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:05.989 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.988+0000 7f3afa59c700 1 -- 192.168.123.100:0/2704193031 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3ae40097e0 con 0x7f3afc102760 2026-03-10T12:35:05.989 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.988+0000 7f3afa59c700 1 --2- 192.168.123.100:0/2704193031 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3afc102760 0x7f3afc198020 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f3ae400bb40 tx=0x7f3ae400bc20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:35:05.989 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.988+0000 7f3af37fe700 1 -- 192.168.123.100:0/2704193031 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3ae401d070 con 0x7f3afc102760 2026-03-10T12:35:05.989 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.988+0000 7f3af37fe700 1 -- 192.168.123.100:0/2704193031 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3ae4004cf0 con 0x7f3afc102760 2026-03-10T12:35:05.989 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.988+0000 7f3af9d9b700 1 --2- 192.168.123.100:0/2704193031 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3afc103960 0x7f3afc198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:35:05.989 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.988+0000 7f3b00eb9700 1 -- 192.168.123.100:0/2704193031 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3afc19d710 con 0x7f3afc102760 2026-03-10T12:35:05.989 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.988+0000 7f3b00eb9700 1 -- 192.168.123.100:0/2704193031 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3afc075190 con 0x7f3afc102760 2026-03-10T12:35:05.989 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.988+0000 7f3af37fe700 1 -- 192.168.123.100:0/2704193031 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3ae400f650 con 0x7f3afc102760 2026-03-10T12:35:05.990 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.989+0000 7f3af37fe700 1 -- 192.168.123.100:0/2704193031 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f3ae40229e0 con 0x7f3afc102760 2026-03-10T12:35:05.990 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.990+0000 7f3b00eb9700 1 -- 192.168.123.100:0/2704193031 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3afc066e40 con 0x7f3afc102760 2026-03-10T12:35:05.990 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.990+0000 7f3af37fe700 1 --2- 192.168.123.100:0/2704193031 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3ae806c750 0x7f3ae806ec00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:05.991 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.990+0000 7f3af37fe700 1 -- 192.168.123.100:0/2704193031 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f3ae408cac0 con 0x7f3afc102760 2026-03-10T12:35:05.991 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.990+0000 7f3af9d9b700 1 --2- 192.168.123.100:0/2704193031 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3ae806c750 0x7f3ae806ec00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:05.991 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.991+0000 7f3af9d9b700 1 --2- 192.168.123.100:0/2704193031 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3ae806c750 0x7f3ae806ec00 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f3aec009dd0 tx=0x7f3aec009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:05.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:05.992+0000 7f3af37fe700 1 -- 192.168.123.100:0/2704193031 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3ae405b140 con 0x7f3afc102760 2026-03-10T12:35:06.097 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.096+0000 7f3b00eb9700 1 -- 192.168.123.100:0/2704193031 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f3afc1082b0 con 0x7f3ae806c750 2026-03-10T12:35:06.099 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.099+0000 7f3af37fe700 1 -- 192.168.123.100:0/2704193031 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1150 (secure 0 0 0) 0x7f3afc1082b0 con 0x7f3ae806c750 2026-03-10T12:35:06.099 INFO:teuthology.orchestra.run.vm00.stdout:NAME PORTS RUNNING REFRESHED AGE PLACEMENT 2026-03-10T12:35:06.099 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager ?:9093,9094 1/1 46s ago 2m count:1 2026-03-10T12:35:06.100 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter 2/2 46s ago 2m * 2026-03-10T12:35:06.100 INFO:teuthology.orchestra.run.vm00.stdout:crash 2/2 46s ago 2m * 2026-03-10T12:35:06.100 INFO:teuthology.orchestra.run.vm00.stdout:grafana ?:3000 1/1 46s ago 2m count:1 2026-03-10T12:35:06.100 INFO:teuthology.orchestra.run.vm00.stdout:mgr 2/2 46s ago 2m count:2 2026-03-10T12:35:06.100 INFO:teuthology.orchestra.run.vm00.stdout:mon 2/2 46s ago 2m vm00:192.168.123.100=vm00;vm07:192.168.123.107=vm07;count:2 2026-03-10T12:35:06.100 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter ?:9100 2/2 46s ago 2m * 2026-03-10T12:35:06.100 INFO:teuthology.orchestra.run.vm00.stdout:osd 6 46s ago - 2026-03-10T12:35:06.100 INFO:teuthology.orchestra.run.vm00.stdout:prometheus ?:9095 1/1 46s ago 2m count:1 2026-03-10T12:35:06.101 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.101+0000 7f3b00eb9700 1 -- 192.168.123.100:0/2704193031 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3ae806c750 msgr2=0x7f3ae806ec00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:06.101 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.101+0000 7f3b00eb9700 1 --2- 192.168.123.100:0/2704193031 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3ae806c750 0x7f3ae806ec00 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f3aec009dd0 tx=0x7f3aec009450 comp rx=0 tx=0).stop 2026-03-10T12:35:06.101 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.101+0000 7f3b00eb9700 1 -- 192.168.123.100:0/2704193031 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3afc102760 msgr2=0x7f3afc198020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:06.101 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.101+0000 7f3b00eb9700 1 --2- 192.168.123.100:0/2704193031 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3afc102760 0x7f3afc198020 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f3ae400bb40 tx=0x7f3ae400bc20 comp rx=0 tx=0).stop 2026-03-10T12:35:06.101 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.101+0000 7f3b00eb9700 1 -- 192.168.123.100:0/2704193031 shutdown_connections 2026-03-10T12:35:06.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.101+0000 7f3b00eb9700 1 --2- 192.168.123.100:0/2704193031 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3ae806c750 0x7f3ae806ec00 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:06.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.101+0000 7f3b00eb9700 1 --2- 192.168.123.100:0/2704193031 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3afc102760 0x7f3afc198020 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:06.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.101+0000 7f3b00eb9700 1 --2- 192.168.123.100:0/2704193031 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3afc103960 0x7f3afc198560 unknown 
:-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:06.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.101+0000 7f3b00eb9700 1 -- 192.168.123.100:0/2704193031 >> 192.168.123.100:0/2704193031 conn(0x7f3afc0fdcf0 msgr2=0x7f3afc106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:06.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.102+0000 7f3b00eb9700 1 -- 192.168.123.100:0/2704193031 shutdown_connections 2026-03-10T12:35:06.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.102+0000 7f3b00eb9700 1 -- 192.168.123.100:0/2704193031 wait complete. 2026-03-10T12:35:06.150 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- bash -c 'ceph orch host ls' 2026-03-10T12:35:06.299 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:06.549 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.548+0000 7f7c1fc1c700 1 -- 192.168.123.100:0/1852586109 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7c18074d80 msgr2=0x7f7c180731e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:06.549 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.548+0000 7f7c1fc1c700 1 --2- 192.168.123.100:0/1852586109 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7c18074d80 0x7f7c180731e0 secure :-1 s=READY pgs=247 cs=0 l=1 rev1=1 crypto rx=0x7f7c14009b00 tx=0x7f7c14009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:06.549 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.549+0000 7f7c1fc1c700 1 -- 192.168.123.100:0/1852586109 shutdown_connections 2026-03-10T12:35:06.549 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.549+0000 7f7c1fc1c700 1 --2- 
192.168.123.100:0/1852586109 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7c180737b0 0x7f7c18073c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:06.549 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.549+0000 7f7c1fc1c700 1 --2- 192.168.123.100:0/1852586109 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7c18074d80 0x7f7c180731e0 unknown :-1 s=CLOSED pgs=247 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:06.549 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.549+0000 7f7c1fc1c700 1 -- 192.168.123.100:0/1852586109 >> 192.168.123.100:0/1852586109 conn(0x7f7c180fbaa0 msgr2=0x7f7c180fdf10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:06.549 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.549+0000 7f7c1fc1c700 1 -- 192.168.123.100:0/1852586109 shutdown_connections 2026-03-10T12:35:06.549 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.549+0000 7f7c1fc1c700 1 -- 192.168.123.100:0/1852586109 wait complete. 
2026-03-10T12:35:06.550 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.550+0000 7f7c1fc1c700 1 Processor -- start 2026-03-10T12:35:06.550 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.550+0000 7f7c1fc1c700 1 -- start start 2026-03-10T12:35:06.550 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.550+0000 7f7c1fc1c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7c180737b0 0x7f7c1819c3b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:06.550 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.550+0000 7f7c1fc1c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7c18074d80 0x7f7c1819c8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:06.550 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.550+0000 7f7c1fc1c700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7c1819cf10 con 0x7f7c180737b0 2026-03-10T12:35:06.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.550+0000 7f7c1fc1c700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7c1819d050 con 0x7f7c18074d80 2026-03-10T12:35:06.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.551+0000 7f7c1d9b8700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7c180737b0 0x7f7c1819c3b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:06.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.551+0000 7f7c1d9b8700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7c180737b0 0x7f7c1819c3b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:33620/0 (socket says 192.168.123.100:33620) 2026-03-10T12:35:06.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.551+0000 7f7c1d9b8700 1 -- 192.168.123.100:0/1064616509 learned_addr learned my addr 192.168.123.100:0/1064616509 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:06.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.551+0000 7f7c1d1b7700 1 --2- 192.168.123.100:0/1064616509 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7c18074d80 0x7f7c1819c8f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:06.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.551+0000 7f7c1d9b8700 1 -- 192.168.123.100:0/1064616509 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7c18074d80 msgr2=0x7f7c1819c8f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:06.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.551+0000 7f7c1d9b8700 1 --2- 192.168.123.100:0/1064616509 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7c18074d80 0x7f7c1819c8f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:06.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.551+0000 7f7c1d9b8700 1 -- 192.168.123.100:0/1064616509 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7c140097e0 con 0x7f7c180737b0 2026-03-10T12:35:06.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.551+0000 7f7c1d1b7700 1 --2- 192.168.123.100:0/1064616509 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7c18074d80 0x7f7c1819c8f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T12:35:06.552 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.551+0000 7f7c1d9b8700 1 --2- 192.168.123.100:0/1064616509 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7c180737b0 0x7f7c1819c3b0 secure :-1 s=READY pgs=248 cs=0 l=1 rev1=1 crypto rx=0x7f7c1400bb70 tx=0x7f7c1400bba0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:06.553 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.552+0000 7f7c0affd700 1 -- 192.168.123.100:0/1064616509 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7c1401d070 con 0x7f7c180737b0 2026-03-10T12:35:06.553 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.552+0000 7f7c0affd700 1 -- 192.168.123.100:0/1064616509 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7c14022470 con 0x7f7c180737b0 2026-03-10T12:35:06.553 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.552+0000 7f7c0affd700 1 -- 192.168.123.100:0/1064616509 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7c1400f650 con 0x7f7c180737b0 2026-03-10T12:35:06.553 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.552+0000 7f7c1fc1c700 1 -- 192.168.123.100:0/1064616509 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7c181a1aa0 con 0x7f7c180737b0 2026-03-10T12:35:06.553 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.552+0000 7f7c1fc1c700 1 -- 192.168.123.100:0/1064616509 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7c181a1f90 con 0x7f7c180737b0 2026-03-10T12:35:06.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.553+0000 7f7c0affd700 1 -- 192.168.123.100:0/1064616509 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f7c1400f7b0 con 
0x7f7c180737b0 2026-03-10T12:35:06.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.553+0000 7f7c1fc1c700 1 -- 192.168.123.100:0/1064616509 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7c18066e40 con 0x7f7c180737b0 2026-03-10T12:35:06.556 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.556+0000 7f7c0affd700 1 --2- 192.168.123.100:0/1064616509 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7c0406c630 0x7f7c0406eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:06.556 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.556+0000 7f7c1d1b7700 1 --2- 192.168.123.100:0/1064616509 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7c0406c630 0x7f7c0406eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:06.557 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.556+0000 7f7c0affd700 1 -- 192.168.123.100:0/1064616509 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f7c1408caa0 con 0x7f7c180737b0 2026-03-10T12:35:06.557 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.557+0000 7f7c1d1b7700 1 --2- 192.168.123.100:0/1064616509 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7c0406c630 0x7f7c0406eae0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f7c0c009fd0 tx=0x7f7c0c009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:06.557 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.557+0000 7f7c0affd700 1 -- 192.168.123.100:0/1064616509 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7c1405b810 con 
0x7f7c180737b0 2026-03-10T12:35:06.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.662+0000 7f7c1fc1c700 1 -- 192.168.123.100:0/1064616509 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f7c181035e0 con 0x7f7c0406c630 2026-03-10T12:35:06.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.663+0000 7f7c0affd700 1 -- 192.168.123.100:0/1064616509 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+139 (secure 0 0 0) 0x7f7c181035e0 con 0x7f7c0406c630 2026-03-10T12:35:06.663 INFO:teuthology.orchestra.run.vm00.stdout:HOST ADDR LABELS STATUS 2026-03-10T12:35:06.663 INFO:teuthology.orchestra.run.vm00.stdout:vm00 192.168.123.100 2026-03-10T12:35:06.663 INFO:teuthology.orchestra.run.vm00.stdout:vm07 192.168.123.107 2026-03-10T12:35:06.664 INFO:teuthology.orchestra.run.vm00.stdout:2 hosts in cluster 2026-03-10T12:35:06.665 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.665+0000 7f7c1fc1c700 1 -- 192.168.123.100:0/1064616509 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7c0406c630 msgr2=0x7f7c0406eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:06.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.665+0000 7f7c1fc1c700 1 --2- 192.168.123.100:0/1064616509 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7c0406c630 0x7f7c0406eae0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f7c0c009fd0 tx=0x7f7c0c009450 comp rx=0 tx=0).stop 2026-03-10T12:35:06.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.665+0000 7f7c1fc1c700 1 -- 192.168.123.100:0/1064616509 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7c180737b0 msgr2=0x7f7c1819c3b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:06.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.665+0000 
7f7c1fc1c700 1 --2- 192.168.123.100:0/1064616509 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7c180737b0 0x7f7c1819c3b0 secure :-1 s=READY pgs=248 cs=0 l=1 rev1=1 crypto rx=0x7f7c1400bb70 tx=0x7f7c1400bba0 comp rx=0 tx=0).stop 2026-03-10T12:35:06.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.665+0000 7f7c1fc1c700 1 -- 192.168.123.100:0/1064616509 shutdown_connections 2026-03-10T12:35:06.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.665+0000 7f7c1fc1c700 1 --2- 192.168.123.100:0/1064616509 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7c0406c630 0x7f7c0406eae0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:06.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.665+0000 7f7c1fc1c700 1 --2- 192.168.123.100:0/1064616509 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7c180737b0 0x7f7c1819c3b0 secure :-1 s=CLOSED pgs=248 cs=0 l=1 rev1=1 crypto rx=0x7f7c1400bb70 tx=0x7f7c1400bba0 comp rx=0 tx=0).stop 2026-03-10T12:35:06.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.665+0000 7f7c1fc1c700 1 --2- 192.168.123.100:0/1064616509 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7c18074d80 0x7f7c1819c8f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:06.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.665+0000 7f7c1fc1c700 1 -- 192.168.123.100:0/1064616509 >> 192.168.123.100:0/1064616509 conn(0x7f7c180fbaa0 msgr2=0x7f7c18101ec0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:06.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.666+0000 7f7c1fc1c700 1 -- 192.168.123.100:0/1064616509 shutdown_connections 2026-03-10T12:35:06.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:06.666+0000 7f7c1fc1c700 1 -- 192.168.123.100:0/1064616509 wait complete. 
2026-03-10T12:35:06.699 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:06 vm00 ceph-mon[50686]: from='client.14458 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:06.699 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:06 vm00 ceph-mon[50686]: pgmap v73: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:06.732 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- bash -c 'ceph orch device ls' 2026-03-10T12:35:06.815 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:06 vm07 ceph-mon[58582]: from='client.14458 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:06.815 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:06 vm07 ceph-mon[58582]: pgmap v73: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:06.871 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:07.094 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.093+0000 7fd8f7d30700 1 -- 192.168.123.100:0/772702974 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd8f0101730 msgr2=0x7fd8f0101b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:07.094 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.093+0000 7fd8f7d30700 1 --2- 192.168.123.100:0/772702974 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd8f0101730 0x7fd8f0101b80 secure :-1 s=READY pgs=249 cs=0 l=1 rev1=1 crypto rx=0x7fd8ec009b00 tx=0x7fd8ec009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:07.094 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.094+0000 7fd8f7d30700 1 -- 
192.168.123.100:0/772702974 shutdown_connections 2026-03-10T12:35:07.094 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.094+0000 7fd8f7d30700 1 --2- 192.168.123.100:0/772702974 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd8f0101730 0x7fd8f0101b80 unknown :-1 s=CLOSED pgs=249 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:07.094 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.094+0000 7fd8f7d30700 1 --2- 192.168.123.100:0/772702974 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd8f0100530 0x7fd8f0100940 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:07.094 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.094+0000 7fd8f7d30700 1 -- 192.168.123.100:0/772702974 >> 192.168.123.100:0/772702974 conn(0x7fd8f00fbaa0 msgr2=0x7fd8f00fdf10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:07.094 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.094+0000 7fd8f7d30700 1 -- 192.168.123.100:0/772702974 shutdown_connections 2026-03-10T12:35:07.094 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.094+0000 7fd8f7d30700 1 -- 192.168.123.100:0/772702974 wait complete. 
2026-03-10T12:35:07.096 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.095+0000 7fd8f7d30700 1 Processor -- start 2026-03-10T12:35:07.096 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.096+0000 7fd8f7d30700 1 -- start start 2026-03-10T12:35:07.096 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.096+0000 7fd8f7d30700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd8f0100530 0x7fd8f0197ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:07.096 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.096+0000 7fd8f7d30700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd8f0101730 0x7fd8f0198530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:07.096 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.096+0000 7fd8f7d30700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd8f0198b50 con 0x7fd8f0101730 2026-03-10T12:35:07.096 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.096+0000 7fd8f7d30700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd8f0198c90 con 0x7fd8f0100530 2026-03-10T12:35:07.096 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.096+0000 7fd8f52cb700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd8f0101730 0x7fd8f0198530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:07.096 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.096+0000 7fd8f52cb700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd8f0101730 0x7fd8f0198530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:33630/0 (socket says 192.168.123.100:33630) 2026-03-10T12:35:07.096 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.096+0000 7fd8f52cb700 1 -- 192.168.123.100:0/111359646 learned_addr learned my addr 192.168.123.100:0/111359646 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:07.097 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.096+0000 7fd8f52cb700 1 -- 192.168.123.100:0/111359646 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd8f0100530 msgr2=0x7fd8f0197ff0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T12:35:07.097 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.096+0000 7fd8f52cb700 1 --2- 192.168.123.100:0/111359646 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd8f0100530 0x7fd8f0197ff0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:07.097 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.096+0000 7fd8f52cb700 1 -- 192.168.123.100:0/111359646 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd8ec0097e0 con 0x7fd8f0101730 2026-03-10T12:35:07.097 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.097+0000 7fd8f52cb700 1 --2- 192.168.123.100:0/111359646 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd8f0101730 0x7fd8f0198530 secure :-1 s=READY pgs=250 cs=0 l=1 rev1=1 crypto rx=0x7fd8ec009fd0 tx=0x7fd8ec004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:07.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.097+0000 7fd8e6ffd700 1 -- 192.168.123.100:0/111359646 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd8ec01d070 con 0x7fd8f0101730 2026-03-10T12:35:07.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.097+0000 7fd8e6ffd700 1 -- 
192.168.123.100:0/111359646 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd8ec00bb40 con 0x7fd8f0101730 2026-03-10T12:35:07.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.097+0000 7fd8e6ffd700 1 -- 192.168.123.100:0/111359646 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd8ec00f670 con 0x7fd8f0101730 2026-03-10T12:35:07.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.097+0000 7fd8f7d30700 1 -- 192.168.123.100:0/111359646 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd8f019d6e0 con 0x7fd8f0101730 2026-03-10T12:35:07.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.097+0000 7fd8f7d30700 1 -- 192.168.123.100:0/111359646 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd8f019dbd0 con 0x7fd8f0101730 2026-03-10T12:35:07.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.098+0000 7fd8f7d30700 1 -- 192.168.123.100:0/111359646 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd8f0105810 con 0x7fd8f0101730 2026-03-10T12:35:07.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.099+0000 7fd8e6ffd700 1 -- 192.168.123.100:0/111359646 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd8ec004d50 con 0x7fd8f0101730 2026-03-10T12:35:07.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.099+0000 7fd8e6ffd700 1 --2- 192.168.123.100:0/111359646 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd8dc06c7a0 0x7fd8dc06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:07.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.099+0000 7fd8e6ffd700 1 -- 192.168.123.100:0/111359646 <== mon.0 
v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fd8ec08cb60 con 0x7fd8f0101730 2026-03-10T12:35:07.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.100+0000 7fd8f5acc700 1 --2- 192.168.123.100:0/111359646 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd8dc06c7a0 0x7fd8dc06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:07.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.100+0000 7fd8f5acc700 1 --2- 192.168.123.100:0/111359646 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd8dc06c7a0 0x7fd8dc06ec50 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fd8f0101590 tx=0x7fd8e0009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:07.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.102+0000 7fd8e6ffd700 1 -- 192.168.123.100:0/111359646 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd8ec05b130 con 0x7fd8f0101730 2026-03-10T12:35:07.209 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.207+0000 7fd8f7d30700 1 -- 192.168.123.100:0/111359646 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch device ls", "target": ["mon-mgr", ""]}) v1 -- 0x7fd8f0061190 con 0x7fd8dc06c7a0 2026-03-10T12:35:07.211 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.211+0000 7fd8e6ffd700 1 -- 192.168.123.100:0/111359646 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1188 (secure 0 0 0) 0x7fd8f0061190 con 0x7fd8dc06c7a0 2026-03-10T12:35:07.211 INFO:teuthology.orchestra.run.vm00.stdout:HOST PATH TYPE DEVICE ID SIZE AVAILABLE REFRESHED REJECT REASONS 2026-03-10T12:35:07.211 
INFO:teuthology.orchestra.run.vm00.stdout:vm00 /dev/vdb hdd DWNBRSTVMM00001 20.0G Yes 49s ago 2026-03-10T12:35:07.211 INFO:teuthology.orchestra.run.vm00.stdout:vm00 /dev/vdc hdd DWNBRSTVMM00002 20.0G No 49s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-10T12:35:07.211 INFO:teuthology.orchestra.run.vm00.stdout:vm00 /dev/vdd hdd DWNBRSTVMM00003 20.0G No 49s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-10T12:35:07.211 INFO:teuthology.orchestra.run.vm00.stdout:vm00 /dev/vde hdd DWNBRSTVMM00004 20.0G No 49s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-10T12:35:07.211 INFO:teuthology.orchestra.run.vm00.stdout:vm07 /dev/vdb hdd DWNBRSTVMM07001 20.0G Yes 20s ago 2026-03-10T12:35:07.211 INFO:teuthology.orchestra.run.vm00.stdout:vm07 /dev/vdc hdd DWNBRSTVMM07002 20.0G No 20s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-10T12:35:07.211 INFO:teuthology.orchestra.run.vm00.stdout:vm07 /dev/vdd hdd DWNBRSTVMM07003 20.0G No 20s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-10T12:35:07.211 INFO:teuthology.orchestra.run.vm00.stdout:vm07 /dev/vde hdd DWNBRSTVMM07004 20.0G No 20s ago Insufficient space (<10 extents) on vgs, LVM detected, locked 2026-03-10T12:35:07.213 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.213+0000 7fd8f7d30700 1 -- 192.168.123.100:0/111359646 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd8dc06c7a0 msgr2=0x7fd8dc06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:07.213 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.213+0000 7fd8f7d30700 1 --2- 192.168.123.100:0/111359646 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd8dc06c7a0 0x7fd8dc06ec50 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fd8f0101590 tx=0x7fd8e0009450 comp rx=0 tx=0).stop 2026-03-10T12:35:07.213 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.213+0000 7fd8f7d30700 1 -- 192.168.123.100:0/111359646 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd8f0101730 msgr2=0x7fd8f0198530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:07.213 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.213+0000 7fd8f7d30700 1 --2- 192.168.123.100:0/111359646 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd8f0101730 0x7fd8f0198530 secure :-1 s=READY pgs=250 cs=0 l=1 rev1=1 crypto rx=0x7fd8ec009fd0 tx=0x7fd8ec004970 comp rx=0 tx=0).stop 2026-03-10T12:35:07.213 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.213+0000 7fd8f7d30700 1 -- 192.168.123.100:0/111359646 shutdown_connections 2026-03-10T12:35:07.213 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.213+0000 7fd8f7d30700 1 --2- 192.168.123.100:0/111359646 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd8dc06c7a0 0x7fd8dc06ec50 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:07.213 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.213+0000 7fd8f7d30700 1 --2- 192.168.123.100:0/111359646 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd8f0100530 0x7fd8f0197ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:07.214 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.213+0000 7fd8f7d30700 1 --2- 192.168.123.100:0/111359646 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd8f0101730 0x7fd8f0198530 unknown :-1 s=CLOSED pgs=250 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:07.214 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.213+0000 7fd8f7d30700 1 -- 192.168.123.100:0/111359646 >> 192.168.123.100:0/111359646 conn(0x7fd8f00fbaa0 msgr2=0x7fd8f0102950 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T12:35:07.214 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.214+0000 7fd8f7d30700 1 -- 192.168.123.100:0/111359646 shutdown_connections 2026-03-10T12:35:07.214 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.214+0000 7fd8f7d30700 1 -- 192.168.123.100:0/111359646 wait complete. 2026-03-10T12:35:07.275 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T12:35:07.277 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm00.local 2026-03-10T12:35:07.277 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- bash -c 'ceph fs volume create cephfs --placement=4' 2026-03-10T12:35:07.419 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:07.471 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:07 vm00 ceph-mon[50686]: from='client.24287 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:07.471 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:07 vm00 ceph-mon[50686]: from='client.24291 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:07.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.665+0000 7f4e169d8700 1 -- 192.168.123.100:0/3809088509 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e10101710 msgr2=0x7f4e10101b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:07.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.665+0000 7f4e169d8700 1 --2- 192.168.123.100:0/3809088509 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e10101710 0x7f4e10101b60 secure :-1 s=READY pgs=251 cs=0 l=1 rev1=1 crypto rx=0x7f4e00009b00 tx=0x7f4e00009e10 comp rx=0 
tx=0).stop 2026-03-10T12:35:07.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.666+0000 7f4e169d8700 1 -- 192.168.123.100:0/3809088509 shutdown_connections 2026-03-10T12:35:07.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.666+0000 7f4e169d8700 1 --2- 192.168.123.100:0/3809088509 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e10101710 0x7f4e10101b60 unknown :-1 s=CLOSED pgs=251 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:07.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.666+0000 7f4e169d8700 1 --2- 192.168.123.100:0/3809088509 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4e10100510 0x7f4e10100920 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:07.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.666+0000 7f4e169d8700 1 -- 192.168.123.100:0/3809088509 >> 192.168.123.100:0/3809088509 conn(0x7f4e100fba80 msgr2=0x7f4e100fdef0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:07.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.666+0000 7f4e169d8700 1 -- 192.168.123.100:0/3809088509 shutdown_connections 2026-03-10T12:35:07.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.666+0000 7f4e169d8700 1 -- 192.168.123.100:0/3809088509 wait complete. 
2026-03-10T12:35:07.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.666+0000 7f4e169d8700 1 Processor -- start 2026-03-10T12:35:07.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.666+0000 7f4e169d8700 1 -- start start 2026-03-10T12:35:07.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.667+0000 7f4e169d8700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4e10100510 0x7f4e10193bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:07.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.667+0000 7f4e169d8700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e10101710 0x7f4e10194100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:07.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.667+0000 7f4e169d8700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4e10194720 con 0x7f4e10101710 2026-03-10T12:35:07.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.667+0000 7f4e169d8700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4e10194860 con 0x7f4e10100510 2026-03-10T12:35:07.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.667+0000 7f4e0ffff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4e10100510 0x7f4e10193bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:07.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.667+0000 7f4e0f7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e10101710 0x7f4e10194100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T12:35:07.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.667+0000 7f4e0ffff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4e10100510 0x7f4e10193bc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:49020/0 (socket says 192.168.123.100:49020) 2026-03-10T12:35:07.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.667+0000 7f4e0f7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e10101710 0x7f4e10194100 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33642/0 (socket says 192.168.123.100:33642) 2026-03-10T12:35:07.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.667+0000 7f4e0ffff700 1 -- 192.168.123.100:0/48658909 learned_addr learned my addr 192.168.123.100:0/48658909 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:07.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.667+0000 7f4e0f7fe700 1 -- 192.168.123.100:0/48658909 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4e10100510 msgr2=0x7f4e10193bc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:07.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.667+0000 7f4e0f7fe700 1 --2- 192.168.123.100:0/48658909 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4e10100510 0x7f4e10193bc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:07.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.667+0000 7f4e0f7fe700 1 -- 192.168.123.100:0/48658909 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4e000097e0 con 0x7f4e10101710 2026-03-10T12:35:07.668 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.668+0000 7f4e0f7fe700 1 --2- 192.168.123.100:0/48658909 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e10101710 0x7f4e10194100 secure :-1 s=READY pgs=252 cs=0 l=1 rev1=1 crypto rx=0x7f4e00009ad0 tx=0x7f4e00004ab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:07.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.668+0000 7f4e0d7fa700 1 -- 192.168.123.100:0/48658909 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4e0001d070 con 0x7f4e10101710 2026-03-10T12:35:07.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.668+0000 7f4e169d8700 1 -- 192.168.123.100:0/48658909 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4e101992b0 con 0x7f4e10101710 2026-03-10T12:35:07.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.668+0000 7f4e0d7fa700 1 -- 192.168.123.100:0/48658909 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4e0000bc50 con 0x7f4e10101710 2026-03-10T12:35:07.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.668+0000 7f4e169d8700 1 -- 192.168.123.100:0/48658909 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4e101997a0 con 0x7f4e10101710 2026-03-10T12:35:07.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.669+0000 7f4e0d7fa700 1 -- 192.168.123.100:0/48658909 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4e00022470 con 0x7f4e10101710 2026-03-10T12:35:07.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.669+0000 7f4e169d8700 1 -- 192.168.123.100:0/48658909 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4e10066e40 con 
0x7f4e10101710 2026-03-10T12:35:07.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.673+0000 7f4e0d7fa700 1 -- 192.168.123.100:0/48658909 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f4e000225d0 con 0x7f4e10101710 2026-03-10T12:35:07.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.673+0000 7f4e0d7fa700 1 --2- 192.168.123.100:0/48658909 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4dfc070ae0 0x7f4dfc072f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:07.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.673+0000 7f4e0d7fa700 1 -- 192.168.123.100:0/48658909 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(34..34 src has 1..34) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f4e0008d950 con 0x7f4e10101710 2026-03-10T12:35:07.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.674+0000 7f4e0ffff700 1 --2- 192.168.123.100:0/48658909 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4dfc070ae0 0x7f4dfc072f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:07.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.674+0000 7f4e0d7fa700 1 -- 192.168.123.100:0/48658909 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4e000b97c0 con 0x7f4e10101710 2026-03-10T12:35:07.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.674+0000 7f4e0ffff700 1 --2- 192.168.123.100:0/48658909 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4dfc070ae0 0x7f4dfc072f90 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f4df800ac30 tx=0x7f4df800a5c0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:07.791 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:07.790+0000 7f4e169d8700 1 -- 192.168.123.100:0/48658909 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}) v1 -- 0x7f4e10199a80 con 0x7f4dfc070ae0 2026-03-10T12:35:07.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:07 vm07 ceph-mon[58582]: from='client.24287 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:07.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:07 vm07 ceph-mon[58582]: from='client.24291 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:08.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:08 vm00 ceph-mon[50686]: from='client.14466 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:08.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:08 vm00 ceph-mon[50686]: pgmap v74: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:08.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:08 vm00 ceph-mon[50686]: from='client.14470 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:08.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:08 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch 2026-03-10T12:35:08.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:08 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:35:09.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:08 vm07 ceph-mon[58582]: from='client.14466 -' 
entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:09.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:08 vm07 ceph-mon[58582]: pgmap v74: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:09.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:08 vm07 ceph-mon[58582]: from='client.14470 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:09.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:08 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch 2026-03-10T12:35:09.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:08 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:35:09.511 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:09.511+0000 7f4e0d7fa700 1 -- 192.168.123.100:0/48658909 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f4e10199a80 con 0x7f4dfc070ae0 2026-03-10T12:35:09.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:09.513+0000 7f4e169d8700 1 -- 192.168.123.100:0/48658909 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4dfc070ae0 msgr2=0x7f4dfc072f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:09.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:09.513+0000 7f4e169d8700 1 --2- 192.168.123.100:0/48658909 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4dfc070ae0 0x7f4dfc072f90 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f4df800ac30 tx=0x7f4df800a5c0 comp rx=0 tx=0).stop 2026-03-10T12:35:09.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:09.513+0000 7f4e169d8700 1 
-- 192.168.123.100:0/48658909 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e10101710 msgr2=0x7f4e10194100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:09.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:09.513+0000 7f4e169d8700 1 --2- 192.168.123.100:0/48658909 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e10101710 0x7f4e10194100 secure :-1 s=READY pgs=252 cs=0 l=1 rev1=1 crypto rx=0x7f4e00009ad0 tx=0x7f4e00004ab0 comp rx=0 tx=0).stop 2026-03-10T12:35:09.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:09.513+0000 7f4e169d8700 1 -- 192.168.123.100:0/48658909 shutdown_connections 2026-03-10T12:35:09.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:09.513+0000 7f4e169d8700 1 --2- 192.168.123.100:0/48658909 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4dfc070ae0 0x7f4dfc072f90 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:09.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:09.513+0000 7f4e169d8700 1 --2- 192.168.123.100:0/48658909 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4e10100510 0x7f4e10193bc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:09.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:09.513+0000 7f4e169d8700 1 --2- 192.168.123.100:0/48658909 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e10101710 0x7f4e10194100 unknown :-1 s=CLOSED pgs=252 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:09.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:09.513+0000 7f4e169d8700 1 -- 192.168.123.100:0/48658909 >> 192.168.123.100:0/48658909 conn(0x7f4e100fba80 msgr2=0x7f4e10104940 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:09.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:09.514+0000 7f4e169d8700 1 -- 
192.168.123.100:0/48658909 shutdown_connections 2026-03-10T12:35:09.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:09.514+0000 7f4e169d8700 1 -- 192.168.123.100:0/48658909 wait complete. 2026-03-10T12:35:09.564 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- bash -c 'ceph fs dump' 2026-03-10T12:35:09.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00[50682]: 2026-03-10T12:35:09.476+0000 7f7485dbf700 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T12:35:09.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: from='client.14474 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:09.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished 2026-03-10T12:35:09.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: osdmap e35: 6 total, 6 up, 6 in 2026-03-10T12:35:09.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch 2026-03-10T12:35:09.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: pgmap v76: 33 pgs: 9 creating+peering, 23 unknown, 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:09.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' 
entity='mgr.vm00.nescmq' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished 2026-03-10T12:35:09.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: osdmap e36: 6 total, 6 up, 6 in 2026-03-10T12:35:09.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch 2026-03-10T12:35:09.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T12:35:09.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-10T12:35:09.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: osdmap e37: 6 total, 6 up, 6 in 2026-03-10T12:35:09.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished 2026-03-10T12:35:09.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: fsmap cephfs:0 2026-03-10T12:35:09.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:09.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:35:09.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 
cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:09.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:35:09.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:09.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm00.lnokoe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:35:09.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm00.lnokoe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T12:35:09.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:09 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:09.776 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: from='client.14474 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': 
finished 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: osdmap e35: 6 total, 6 up, 6 in 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: pgmap v76: 33 pgs: 9 creating+peering, 23 unknown, 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: osdmap e36: 6 total, 6 up, 6 in 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: osdmap e37: 6 total, 6 up, 6 in 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "fs new", "fs_name": "cephfs", 
"metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: fsmap cephfs:0 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm00.lnokoe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm00.lnokoe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T12:35:10.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:09 vm07 ceph-mon[58582]: from='mgr.14223 
192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:10.097 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.096+0000 7f1219c75700 1 -- 192.168.123.100:0/1283887917 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f12141080e0 msgr2=0x7f12141084f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:10.097 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.096+0000 7f1219c75700 1 --2- 192.168.123.100:0/1283887917 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f12141080e0 0x7f12141084f0 secure :-1 s=READY pgs=253 cs=0 l=1 rev1=1 crypto rx=0x7f120c007780 tx=0x7f120c00c050 comp rx=0 tx=0).stop 2026-03-10T12:35:10.097 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.097+0000 7f1219c75700 1 -- 192.168.123.100:0/1283887917 shutdown_connections 2026-03-10T12:35:10.097 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.097+0000 7f1219c75700 1 --2- 192.168.123.100:0/1283887917 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1214071960 0x7f1214071dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:10.097 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.097+0000 7f1219c75700 1 --2- 192.168.123.100:0/1283887917 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f12141080e0 0x7f12141084f0 unknown :-1 s=CLOSED pgs=253 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:10.097 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.097+0000 7f1219c75700 1 -- 192.168.123.100:0/1283887917 >> 192.168.123.100:0/1283887917 conn(0x7f121406d3e0 msgr2=0x7f121406f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:10.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.102+0000 7f1219c75700 1 -- 192.168.123.100:0/1283887917 shutdown_connections 2026-03-10T12:35:10.105 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.103+0000 7f1219c75700 1 -- 192.168.123.100:0/1283887917 wait complete. 2026-03-10T12:35:10.105 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.103+0000 7f1219c75700 1 Processor -- start 2026-03-10T12:35:10.105 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.105+0000 7f1219c75700 1 -- start start 2026-03-10T12:35:10.105 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.105+0000 7f1219c75700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1214071960 0x7f12141b7260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:10.105 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.105+0000 7f1219c75700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f12141080e0 0x7f12141b77a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:10.105 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.105+0000 7f1219c75700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f12141b7dc0 con 0x7f12141080e0 2026-03-10T12:35:10.105 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.105+0000 7f1219c75700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f12141b7f00 con 0x7f1214071960 2026-03-10T12:35:10.105 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.105+0000 7f1212ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f12141080e0 0x7f12141b77a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:10.105 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.105+0000 7f1212ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f12141080e0 0x7f12141b77a0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:52010/0 (socket says 192.168.123.100:52010) 2026-03-10T12:35:10.105 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.105+0000 7f1212ffd700 1 -- 192.168.123.100:0/1355152081 learned_addr learned my addr 192.168.123.100:0/1355152081 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:10.106 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.105+0000 7f12137fe700 1 --2- 192.168.123.100:0/1355152081 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1214071960 0x7f12141b7260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:10.106 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.106+0000 7f12137fe700 1 -- 192.168.123.100:0/1355152081 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f12141080e0 msgr2=0x7f12141b77a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:10.106 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.106+0000 7f12137fe700 1 --2- 192.168.123.100:0/1355152081 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f12141080e0 0x7f12141b77a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:10.106 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.106+0000 7f12137fe700 1 -- 192.168.123.100:0/1355152081 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f120c007430 con 0x7f1214071960 2026-03-10T12:35:10.107 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.106+0000 7f12137fe700 1 --2- 192.168.123.100:0/1355152081 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1214071960 0x7f12141b7260 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto 
rx=0x7f120c00ceb0 tx=0x7f120c00cc70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:10.107 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.107+0000 7f1210ff9700 1 -- 192.168.123.100:0/1355152081 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f120c022070 con 0x7f1214071960 2026-03-10T12:35:10.107 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.107+0000 7f1210ff9700 1 -- 192.168.123.100:0/1355152081 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f120c00a7c0 con 0x7f1214071960 2026-03-10T12:35:10.108 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.107+0000 7f1210ff9700 1 -- 192.168.123.100:0/1355152081 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f120c00f040 con 0x7f1214071960 2026-03-10T12:35:10.109 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.109+0000 7f1219c75700 1 -- 192.168.123.100:0/1355152081 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f121407e610 con 0x7f1214071960 2026-03-10T12:35:10.109 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.109+0000 7f1219c75700 1 -- 192.168.123.100:0/1355152081 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f121407ead0 con 0x7f1214071960 2026-03-10T12:35:10.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.109+0000 7f1219c75700 1 -- 192.168.123.100:0/1355152081 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1214066e40 con 0x7f1214071960 2026-03-10T12:35:10.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.112+0000 7f1210ff9700 1 -- 192.168.123.100:0/1355152081 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 
0x7f120c01ed50 con 0x7f1214071960 2026-03-10T12:35:10.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.112+0000 7f1210ff9700 1 --2- 192.168.123.100:0/1355152081 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f120006c4d0 0x7f120006e980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:10.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.113+0000 7f1212ffd700 1 --2- 192.168.123.100:0/1355152081 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f120006c4d0 0x7f120006e980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:10.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.113+0000 7f1212ffd700 1 --2- 192.168.123.100:0/1355152081 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f120006c4d0 0x7f120006e980 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f11fc005950 tx=0x7f11fc009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:10.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.113+0000 7f1210ff9700 1 -- 192.168.123.100:0/1355152081 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f120c08ca00 con 0x7f1214071960 2026-03-10T12:35:10.115 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.115+0000 7f1210ff9700 1 -- 192.168.123.100:0/1355152081 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f120c05ad00 con 0x7f1214071960 2026-03-10T12:35:10.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.258+0000 7f1219c75700 1 -- 192.168.123.100:0/1355152081 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f121407ed80 
con 0x7f1214071960 2026-03-10T12:35:10.259 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.259+0000 7f1210ff9700 1 -- 192.168.123.100:0/1355152081 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 2 v2) v1 ==== 75+0+1093 (secure 0 0 0) 0x7f120c018460 con 0x7f1214071960 2026-03-10T12:35:10.260 INFO:teuthology.orchestra.run.vm00.stdout:e2 2026-03-10T12:35:10.260 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:35:10.260 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:35:10.260 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 2026-03-10T12:35:10.260 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:35:10.260 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:35:10.260 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 2026-03-10T12:35:10.260 INFO:teuthology.orchestra.run.vm00.stdout:epoch 2 2026-03-10T12:35:10.260 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:35:10.260 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:35:09.477807+0000 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 
2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 1 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:in 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:up {} 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:failed 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 0 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:35:10.261 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:35:10.263 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.263+0000 7f120a7fc700 1 -- 192.168.123.100:0/1355152081 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f120006c4d0 msgr2=0x7f120006e980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:10.263 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.263+0000 7f120a7fc700 1 --2- 192.168.123.100:0/1355152081 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f120006c4d0 0x7f120006e980 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f11fc005950 tx=0x7f11fc009450 comp rx=0 tx=0).stop 2026-03-10T12:35:10.263 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.263+0000 7f120a7fc700 1 -- 192.168.123.100:0/1355152081 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1214071960 msgr2=0x7f12141b7260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:10.263 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.263+0000 7f120a7fc700 1 --2- 192.168.123.100:0/1355152081 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1214071960 0x7f12141b7260 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f120c00ceb0 tx=0x7f120c00cc70 comp rx=0 tx=0).stop 2026-03-10T12:35:10.265 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.265+0000 7f120a7fc700 1 -- 192.168.123.100:0/1355152081 shutdown_connections 2026-03-10T12:35:10.265 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.265+0000 7f120a7fc700 1 --2- 192.168.123.100:0/1355152081 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f120006c4d0 0x7f120006e980 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:10.265 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.265+0000 7f120a7fc700 1 --2- 192.168.123.100:0/1355152081 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1214071960 0x7f12141b7260 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:10.265 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.265+0000 7f120a7fc700 1 --2- 192.168.123.100:0/1355152081 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f12141080e0 0x7f12141b77a0 unknown 
:-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:10.265 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.265+0000 7f120a7fc700 1 -- 192.168.123.100:0/1355152081 >> 192.168.123.100:0/1355152081 conn(0x7f121406d3e0 msgr2=0x7f1214074f70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:10.267 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.267+0000 7f120a7fc700 1 -- 192.168.123.100:0/1355152081 shutdown_connections 2026-03-10T12:35:10.267 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.267+0000 7f120a7fc700 1 -- 192.168.123.100:0/1355152081 wait complete. 2026-03-10T12:35:10.268 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 2 2026-03-10T12:35:10.518 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T12:35:10.521 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm00.local 2026-03-10T12:35:10.521 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- bash -c 'ceph fs set cephfs max_mds 2' 2026-03-10T12:35:10.709 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:10.878 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:10 vm00 ceph-mon[50686]: Saving service mds.cephfs spec with placement count:4 2026-03-10T12:35:10.878 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:10 vm00 ceph-mon[50686]: Deploying daemon mds.cephfs.vm00.lnokoe on vm00 2026-03-10T12:35:10.878 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:10 vm00 ceph-mon[50686]: from='client.? 
192.168.123.100:0/1355152081' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:35:10.878 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:10 vm00 ceph-mon[50686]: osdmap e38: 6 total, 6 up, 6 in 2026-03-10T12:35:10.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.978+0000 7f741eb86700 1 -- 192.168.123.100:0/4057835267 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7418105a60 msgr2=0x7f7418107e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:10.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.978+0000 7f741eb86700 1 --2- 192.168.123.100:0/4057835267 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7418105a60 0x7f7418107e40 secure :-1 s=READY pgs=254 cs=0 l=1 rev1=1 crypto rx=0x7f740c009b00 tx=0x7f740c009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:10.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.979+0000 7f741eb86700 1 -- 192.168.123.100:0/4057835267 shutdown_connections 2026-03-10T12:35:10.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.979+0000 7f741eb86700 1 --2- 192.168.123.100:0/4057835267 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7418105a60 0x7f7418107e40 unknown :-1 s=CLOSED pgs=254 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:10.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.979+0000 7f741eb86700 1 --2- 192.168.123.100:0/4057835267 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f74180691a0 0x7f7418105520 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:10.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.979+0000 7f741eb86700 1 -- 192.168.123.100:0/4057835267 >> 192.168.123.100:0/4057835267 conn(0x7f74180faa70 msgr2=0x7f74180fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:10.979 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.979+0000 7f741eb86700 1 -- 192.168.123.100:0/4057835267 shutdown_connections 2026-03-10T12:35:10.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.979+0000 7f741eb86700 1 -- 192.168.123.100:0/4057835267 wait complete. 2026-03-10T12:35:10.980 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.979+0000 7f741eb86700 1 Processor -- start 2026-03-10T12:35:10.980 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.980+0000 7f741eb86700 1 -- start start 2026-03-10T12:35:10.980 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.980+0000 7f741eb86700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f74180691a0 0x7f7418198050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:10.980 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.980+0000 7f741eb86700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7418105a60 0x7f7418198590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:10.980 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.980+0000 7f741eb86700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7418198bb0 con 0x7f74180691a0 2026-03-10T12:35:10.980 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.980+0000 7f741eb86700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7418198cf0 con 0x7f7418105a60 2026-03-10T12:35:10.980 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.980+0000 7f7417fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7418105a60 0x7f7418198590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:10.980 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.980+0000 7f7417fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7418105a60 0x7f7418198590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:35816/0 (socket says 192.168.123.100:35816) 2026-03-10T12:35:10.980 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.980+0000 7f7417fff700 1 -- 192.168.123.100:0/2676233917 learned_addr learned my addr 192.168.123.100:0/2676233917 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:10.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.980+0000 7f741c922700 1 --2- 192.168.123.100:0/2676233917 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f74180691a0 0x7f7418198050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:10.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.980+0000 7f7417fff700 1 -- 192.168.123.100:0/2676233917 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f74180691a0 msgr2=0x7f7418198050 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:10.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.980+0000 7f7417fff700 1 --2- 192.168.123.100:0/2676233917 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f74180691a0 0x7f7418198050 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:10.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.980+0000 7f7417fff700 1 -- 192.168.123.100:0/2676233917 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f740c0097e0 con 0x7f7418105a60 2026-03-10T12:35:10.981 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.980+0000 7f741c922700 1 --2- 192.168.123.100:0/2676233917 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f74180691a0 0x7f7418198050 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:35:10.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.981+0000 7f7417fff700 1 --2- 192.168.123.100:0/2676233917 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7418105a60 0x7f7418198590 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f740c005230 tx=0x7f740c005790 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:10.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.981+0000 7f7415ffb700 1 -- 192.168.123.100:0/2676233917 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f740c01d070 con 0x7f7418105a60 2026-03-10T12:35:10.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.981+0000 7f7415ffb700 1 -- 192.168.123.100:0/2676233917 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f740c00be00 con 0x7f7418105a60 2026-03-10T12:35:10.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.981+0000 7f741eb86700 1 -- 192.168.123.100:0/2676233917 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f741819d740 con 0x7f7418105a60 2026-03-10T12:35:10.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.981+0000 7f7415ffb700 1 -- 192.168.123.100:0/2676233917 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f740c00f460 con 0x7f7418105a60 2026-03-10T12:35:10.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.981+0000 7f741eb86700 1 -- 192.168.123.100:0/2676233917 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
-- mon_subscribe({osdmap=0}) v3 -- 0x7f741819dbd0 con 0x7f7418105a60 2026-03-10T12:35:10.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.982+0000 7f741eb86700 1 -- 192.168.123.100:0/2676233917 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7418192240 con 0x7f7418105a60 2026-03-10T12:35:10.985 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.984+0000 7f7415ffb700 1 -- 192.168.123.100:0/2676233917 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f740c003780 con 0x7f7418105a60 2026-03-10T12:35:10.985 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.985+0000 7f7415ffb700 1 --2- 192.168.123.100:0/2676233917 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f740006c680 0x7f740006eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:10.985 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.985+0000 7f7415ffb700 1 -- 192.168.123.100:0/2676233917 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f740c08d2f0 con 0x7f7418105a60 2026-03-10T12:35:10.985 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.985+0000 7f741c922700 1 --2- 192.168.123.100:0/2676233917 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f740006c680 0x7f740006eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:10.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.985+0000 7f741c922700 1 --2- 192.168.123.100:0/2676233917 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f740006c680 0x7f740006eb30 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f7418069570 tx=0x7f7408008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-10T12:35:10.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:10.986+0000 7f7415ffb700 1 -- 192.168.123.100:0/2676233917 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f740c05b580 con 0x7f7418105a60 2026-03-10T12:35:11.044 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:10 vm07 ceph-mon[58582]: Saving service mds.cephfs spec with placement count:4 2026-03-10T12:35:11.044 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:10 vm07 ceph-mon[58582]: Deploying daemon mds.cephfs.vm00.lnokoe on vm00 2026-03-10T12:35:11.044 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:10 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/1355152081' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:35:11.044 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:10 vm07 ceph-mon[58582]: osdmap e38: 6 total, 6 up, 6 in 2026-03-10T12:35:11.117 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:11.116+0000 7f741eb86700 1 -- 192.168.123.100:0/2676233917 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"} v 0) v1 -- 0x7f741802cc30 con 0x7f7418105a60 2026-03-10T12:35:11.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:11.643+0000 7f7415ffb700 1 -- 192.168.123.100:0/2676233917 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]=0 v3) v1 ==== 105+0+0 (secure 0 0 0) 0x7f740c005c00 con 0x7f7418105a60 2026-03-10T12:35:11.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:11.645+0000 7f741eb86700 1 -- 192.168.123.100:0/2676233917 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f740006c680 msgr2=0x7f740006eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:11.647 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:11.645+0000 7f741eb86700 1 --2- 192.168.123.100:0/2676233917 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f740006c680 0x7f740006eb30 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f7418069570 tx=0x7f7408008040 comp rx=0 tx=0).stop 2026-03-10T12:35:11.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:11.645+0000 7f741eb86700 1 -- 192.168.123.100:0/2676233917 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7418105a60 msgr2=0x7f7418198590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:11.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:11.645+0000 7f741eb86700 1 --2- 192.168.123.100:0/2676233917 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7418105a60 0x7f7418198590 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f740c005230 tx=0x7f740c005790 comp rx=0 tx=0).stop 2026-03-10T12:35:11.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:11.645+0000 7f741eb86700 1 -- 192.168.123.100:0/2676233917 shutdown_connections 2026-03-10T12:35:11.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:11.645+0000 7f741eb86700 1 --2- 192.168.123.100:0/2676233917 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f740006c680 0x7f740006eb30 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:11.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:11.645+0000 7f741eb86700 1 --2- 192.168.123.100:0/2676233917 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f74180691a0 0x7f7418198050 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:11.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:11.645+0000 7f741eb86700 1 --2- 192.168.123.100:0/2676233917 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7418105a60 0x7f7418198590 unknown 
:-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:11.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:11.645+0000 7f741eb86700 1 -- 192.168.123.100:0/2676233917 >> 192.168.123.100:0/2676233917 conn(0x7f74180faa70 msgr2=0x7f74180fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:11.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:11.646+0000 7f741eb86700 1 -- 192.168.123.100:0/2676233917 shutdown_connections 2026-03-10T12:35:11.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:11.646+0000 7f741eb86700 1 -- 192.168.123.100:0/2676233917 wait complete. 2026-03-10T12:35:11.699 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T12:35:11.701 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm00.local 2026-03-10T12:35:11.701 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- bash -c 'ceph fs set cephfs allow_standby_replay false' 2026-03-10T12:35:11.874 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:11.905 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:11.905 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:11.905 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:11.905 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": 
"mds.cephfs.vm07.wznhgu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:35:11.905 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.wznhgu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T12:35:11.905 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:11.905 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: Deploying daemon mds.cephfs.vm07.wznhgu on vm07 2026-03-10T12:35:11.905 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: pgmap v80: 65 pgs: 1 creating+activating, 24 creating+peering, 17 unknown, 23 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:11.905 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/2676233917' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]: dispatch 2026-03-10T12:35:11.905 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='client.? 
' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]: dispatch 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: osdmap e39: 6 total, 6 up, 6 in 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm00.wdwvcu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm00.wdwvcu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: mds.? [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] up:boot 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]': finished 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: mds.? [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] up:boot 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: daemon mds.cephfs.vm07.wznhgu assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: fsmap cephfs:0 2 up:standby 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.lnokoe"}]: dispatch 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.wznhgu"}]: dispatch 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:creating} 1 up:standby 2026-03-10T12:35:11.906 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:11 vm00 ceph-mon[50686]: daemon mds.cephfs.vm07.wznhgu is now active in filesystem cephfs as rank 0 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:12.066 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.wznhgu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.wznhgu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: Deploying daemon mds.cephfs.vm07.wznhgu on vm07 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: pgmap v80: 65 pgs: 1 creating+activating, 24 creating+peering, 17 unknown, 23 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/2676233917' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]: dispatch 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='client.? 
' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]: dispatch 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: osdmap e39: 6 total, 6 up, 6 in 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm00.wdwvcu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm00.wdwvcu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: mds.? [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] up:boot 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]': finished 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: mds.? [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] up:boot 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: daemon mds.cephfs.vm07.wznhgu assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: fsmap cephfs:0 2 up:standby 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.lnokoe"}]: dispatch 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.wznhgu"}]: dispatch 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:creating} 1 up:standby 2026-03-10T12:35:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:11 vm07 ceph-mon[58582]: daemon mds.cephfs.vm07.wznhgu is now active in filesystem cephfs as rank 0 2026-03-10T12:35:12.301 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.300+0000 7fab01daa700 1 -- 192.168.123.100:0/3389304590 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faafc072440 msgr2=0x7faafc10be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:12.302 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.300+0000 7fab01daa700 1 --2- 192.168.123.100:0/3389304590 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faafc072440 0x7faafc10be90 secure :-1 s=READY pgs=255 cs=0 l=1 rev1=1 crypto rx=0x7faae8009b00 tx=0x7faae8009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:12.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.300+0000 7fab01daa700 1 -- 192.168.123.100:0/3389304590 shutdown_connections 2026-03-10T12:35:12.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.300+0000 7fab01daa700 1 --2- 192.168.123.100:0/3389304590 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faafc072440 0x7faafc10be90 unknown :-1 s=CLOSED pgs=255 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:12.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.300+0000 7fab01daa700 1 --2- 192.168.123.100:0/3389304590 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faafc071a60 0x7faafc071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:12.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.300+0000 7fab01daa700 1 -- 192.168.123.100:0/3389304590 >> 192.168.123.100:0/3389304590 conn(0x7faafc06d1a0 msgr2=0x7faafc06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:12.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.302+0000 7fab01daa700 1 -- 192.168.123.100:0/3389304590 shutdown_connections 2026-03-10T12:35:12.303 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.303+0000 7fab01daa700 1 -- 192.168.123.100:0/3389304590 wait complete. 
2026-03-10T12:35:12.303 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.303+0000 7fab01daa700 1 Processor -- start 2026-03-10T12:35:12.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.303+0000 7fab01daa700 1 -- start start 2026-03-10T12:35:12.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.303+0000 7fab01daa700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faafc071a60 0x7faafc1a49f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:12.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.303+0000 7fab01daa700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faafc072440 0x7faafc1a4f30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:12.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.303+0000 7fab01daa700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faafc1a5550 con 0x7faafc071a60 2026-03-10T12:35:12.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.303+0000 7fab01daa700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faafc1a5690 con 0x7faafc072440 2026-03-10T12:35:12.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.303+0000 7faafbfff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faafc072440 0x7faafc1a4f30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:12.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.303+0000 7faafbfff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faafc072440 0x7faafc1a4f30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:35836/0 (socket says 192.168.123.100:35836) 2026-03-10T12:35:12.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.303+0000 7faafbfff700 1 -- 192.168.123.100:0/3027898609 learned_addr learned my addr 192.168.123.100:0/3027898609 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:12.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.303+0000 7fab00da8700 1 --2- 192.168.123.100:0/3027898609 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faafc071a60 0x7faafc1a49f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:12.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.304+0000 7fab00da8700 1 -- 192.168.123.100:0/3027898609 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faafc072440 msgr2=0x7faafc1a4f30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:12.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.304+0000 7fab00da8700 1 --2- 192.168.123.100:0/3027898609 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faafc072440 0x7faafc1a4f30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:12.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.304+0000 7fab00da8700 1 -- 192.168.123.100:0/3027898609 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faae80097e0 con 0x7faafc071a60 2026-03-10T12:35:12.305 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.304+0000 7fab00da8700 1 --2- 192.168.123.100:0/3027898609 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faafc071a60 0x7faafc1a49f0 secure :-1 s=READY pgs=256 cs=0 l=1 rev1=1 crypto rx=0x7faaf000d6c0 tx=0x7faaf000d9d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:35:12.305 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.304+0000 7faaf9ffb700 1 -- 192.168.123.100:0/3027898609 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faaf00041d0 con 0x7faafc071a60 2026-03-10T12:35:12.305 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.304+0000 7fab01daa700 1 -- 192.168.123.100:0/3027898609 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faafc10f620 con 0x7faafc071a60 2026-03-10T12:35:12.305 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.304+0000 7fab01daa700 1 -- 192.168.123.100:0/3027898609 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faafc10fb70 con 0x7faafc071a60 2026-03-10T12:35:12.306 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.305+0000 7faaf9ffb700 1 -- 192.168.123.100:0/3027898609 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7faaf0004330 con 0x7faafc071a60 2026-03-10T12:35:12.306 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.305+0000 7faaf9ffb700 1 -- 192.168.123.100:0/3027898609 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faaf0003d80 con 0x7faafc071a60 2026-03-10T12:35:12.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.306+0000 7fab01daa700 1 -- 192.168.123.100:0/3027898609 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faae0005320 con 0x7faafc071a60 2026-03-10T12:35:12.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.306+0000 7faaf9ffb700 1 -- 192.168.123.100:0/3027898609 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7faaf000f690 con 0x7faafc071a60 2026-03-10T12:35:12.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.306+0000 
7faaf9ffb700 1 --2- 192.168.123.100:0/3027898609 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7faaec06c5a0 0x7faaec06ea50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:12.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.306+0000 7faafbfff700 1 --2- 192.168.123.100:0/3027898609 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7faaec06c5a0 0x7faaec06ea50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:12.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.307+0000 7faaf9ffb700 1 -- 192.168.123.100:0/3027898609 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7faaf008b020 con 0x7faafc071a60 2026-03-10T12:35:12.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.307+0000 7faafbfff700 1 --2- 192.168.123.100:0/3027898609 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7faaec06c5a0 0x7faaec06ea50 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7faae800b5c0 tx=0x7faae8005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:12.309 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.309+0000 7faaf9ffb700 1 -- 192.168.123.100:0/3027898609 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7faaf00592b0 con 0x7faafc071a60 2026-03-10T12:35:12.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.443+0000 7fab01daa700 1 -- 192.168.123.100:0/3027898609 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"} v 0) v1 -- 0x7faae0005f70 con 0x7faafc071a60 2026-03-10T12:35:12.659 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.658+0000 7faaf9ffb700 1 -- 192.168.123.100:0/3027898609 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]=0 v5) v1 ==== 122+0+0 (secure 0 0 0) 0x7faaf0058e40 con 0x7faafc071a60 2026-03-10T12:35:12.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.661+0000 7fab01daa700 1 -- 192.168.123.100:0/3027898609 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7faaec06c5a0 msgr2=0x7faaec06ea50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:12.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.661+0000 7fab01daa700 1 --2- 192.168.123.100:0/3027898609 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7faaec06c5a0 0x7faaec06ea50 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7faae800b5c0 tx=0x7faae8005fb0 comp rx=0 tx=0).stop 2026-03-10T12:35:12.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.661+0000 7fab01daa700 1 -- 192.168.123.100:0/3027898609 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faafc071a60 msgr2=0x7faafc1a49f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:12.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.661+0000 7fab01daa700 1 --2- 192.168.123.100:0/3027898609 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faafc071a60 0x7faafc1a49f0 secure :-1 s=READY pgs=256 cs=0 l=1 rev1=1 crypto rx=0x7faaf000d6c0 tx=0x7faaf000d9d0 comp rx=0 tx=0).stop 2026-03-10T12:35:12.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.661+0000 7fab01daa700 1 -- 192.168.123.100:0/3027898609 shutdown_connections 2026-03-10T12:35:12.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.661+0000 7fab01daa700 1 --2- 192.168.123.100:0/3027898609 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] 
conn(0x7faaec06c5a0 0x7faaec06ea50 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:12.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.661+0000 7fab01daa700 1 --2- 192.168.123.100:0/3027898609 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faafc071a60 0x7faafc1a49f0 unknown :-1 s=CLOSED pgs=256 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:12.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.661+0000 7fab01daa700 1 --2- 192.168.123.100:0/3027898609 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faafc072440 0x7faafc1a4f30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:12.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.661+0000 7fab01daa700 1 -- 192.168.123.100:0/3027898609 >> 192.168.123.100:0/3027898609 conn(0x7faafc06d1a0 msgr2=0x7faafc10a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:12.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.661+0000 7fab01daa700 1 -- 192.168.123.100:0/3027898609 shutdown_connections 2026-03-10T12:35:12.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:12.661+0000 7fab01daa700 1 -- 192.168.123.100:0/3027898609 wait complete. 2026-03-10T12:35:12.827 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T12:35:12.829 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm00.local 2026-03-10T12:35:12.829 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- bash -c 'ceph fs set cephfs inline_data false' 2026-03-10T12:35:12.987 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:13.057 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:12 vm00 ceph-mon[50686]: Deploying daemon mds.cephfs.vm00.wdwvcu on vm00 2026-03-10T12:35:13.057 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:12 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/3027898609' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-10T12:35:13.057 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:12 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:13.058 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:12 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:13.058 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:12 vm00 ceph-mon[50686]: mds.? [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] up:active 2026-03-10T12:35:13.058 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:12 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/3027898609' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]': finished 2026-03-10T12:35:13.058 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:12 vm00 ceph-mon[50686]: mds.? 
[v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] up:boot 2026-03-10T12:35:13.058 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:12 vm00 ceph-mon[50686]: daemon mds.cephfs.vm00.lnokoe assigned to filesystem cephfs as rank 1 (now has 2 ranks) 2026-03-10T12:35:13.058 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:12 vm00 ceph-mon[50686]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-10T12:35:13.058 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:12 vm00 ceph-mon[50686]: Cluster is now healthy 2026-03-10T12:35:13.058 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:12 vm00 ceph-mon[50686]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:active} 2 up:standby 2026-03-10T12:35:13.058 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:12 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.wdwvcu"}]: dispatch 2026-03-10T12:35:13.058 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:12 vm00 ceph-mon[50686]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:creating} 1 up:standby 2026-03-10T12:35:13.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:12 vm07 ceph-mon[58582]: Deploying daemon mds.cephfs.vm00.wdwvcu on vm00 2026-03-10T12:35:13.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:12 vm07 ceph-mon[58582]: from='client.? 
192.168.123.100:0/3027898609' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-10T12:35:13.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:12 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:13.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:12 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:13.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:12 vm07 ceph-mon[58582]: mds.? [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] up:active 2026-03-10T12:35:13.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:12 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/3027898609' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]': finished 2026-03-10T12:35:13.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:12 vm07 ceph-mon[58582]: mds.? 
[v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] up:boot 2026-03-10T12:35:13.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:12 vm07 ceph-mon[58582]: daemon mds.cephfs.vm00.lnokoe assigned to filesystem cephfs as rank 1 (now has 2 ranks) 2026-03-10T12:35:13.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:12 vm07 ceph-mon[58582]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-10T12:35:13.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:12 vm07 ceph-mon[58582]: Cluster is now healthy 2026-03-10T12:35:13.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:12 vm07 ceph-mon[58582]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:active} 2 up:standby 2026-03-10T12:35:13.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:12 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.wdwvcu"}]: dispatch 2026-03-10T12:35:13.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:12 vm07 ceph-mon[58582]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:creating} 1 up:standby 2026-03-10T12:35:13.276 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.275+0000 7f132a942700 1 -- 192.168.123.100:0/3724349637 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1324102760 msgr2=0x7f1324102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:13.276 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.275+0000 7f1322ffd700 1 -- 192.168.123.100:0/3724349637 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f131800ba40 con 0x7f1324102760 2026-03-10T12:35:13.276 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.275+0000 7f132a942700 1 --2- 192.168.123.100:0/3724349637 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1324102760 
0x7f1324102b70 secure :-1 s=READY pgs=259 cs=0 l=1 rev1=1 crypto rx=0x7f1318009b00 tx=0x7f1318009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:13.276 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.276+0000 7f132a942700 1 -- 192.168.123.100:0/3724349637 shutdown_connections 2026-03-10T12:35:13.276 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.276+0000 7f132a942700 1 --2- 192.168.123.100:0/3724349637 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1324103960 0x7f1324103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:13.276 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.276+0000 7f132a942700 1 --2- 192.168.123.100:0/3724349637 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1324102760 0x7f1324102b70 unknown :-1 s=CLOSED pgs=259 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:13.276 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.276+0000 7f132a942700 1 -- 192.168.123.100:0/3724349637 >> 192.168.123.100:0/3724349637 conn(0x7f13240fdcf0 msgr2=0x7f1324100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:13.276 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.276+0000 7f132a942700 1 -- 192.168.123.100:0/3724349637 shutdown_connections 2026-03-10T12:35:13.276 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.276+0000 7f132a942700 1 -- 192.168.123.100:0/3724349637 wait complete. 
2026-03-10T12:35:13.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.276+0000 7f132a942700 1 Processor -- start
2026-03-10T12:35:13.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.277+0000 7f132a942700 1 -- start start
2026-03-10T12:35:13.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.277+0000 7f132a942700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1324102760 0x7f1324198050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:35:13.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.277+0000 7f132a942700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1324103960 0x7f1324198590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:35:13.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.277+0000 7f132a942700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1324198bb0 con 0x7f1324103960
2026-03-10T12:35:13.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.277+0000 7f132a942700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1324198cf0 con 0x7f1324102760
2026-03-10T12:35:13.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.277+0000 7f13237fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1324103960 0x7f1324198590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:35:13.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.277+0000 7f13237fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1324103960 0x7f1324198590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:52114/0 (socket says 192.168.123.100:52114)
2026-03-10T12:35:13.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.277+0000 7f13237fe700 1 -- 192.168.123.100:0/2979628120 learned_addr learned my addr 192.168.123.100:0/2979628120 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:35:13.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.277+0000 7f1323fff700 1 --2- 192.168.123.100:0/2979628120 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1324102760 0x7f1324198050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:35:13.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.277+0000 7f13237fe700 1 -- 192.168.123.100:0/2979628120 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1324102760 msgr2=0x7f1324198050 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:13.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.277+0000 7f13237fe700 1 --2- 192.168.123.100:0/2979628120 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1324102760 0x7f1324198050 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:13.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.277+0000 7f13237fe700 1 -- 192.168.123.100:0/2979628120 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f13180097e0 con 0x7f1324103960
2026-03-10T12:35:13.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.278+0000 7f13237fe700 1 --2- 192.168.123.100:0/2979628120 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1324103960 0x7f1324198590 secure :-1 s=READY pgs=260 cs=0 l=1 rev1=1 crypto rx=0x7f131400b700 tx=0x7f131400bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:35:13.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.278+0000 7f13217fa700 1 -- 192.168.123.100:0/2979628120 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1314010840 con 0x7f1324103960
2026-03-10T12:35:13.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.278+0000 7f13217fa700 1 -- 192.168.123.100:0/2979628120 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1314010e80 con 0x7f1324103960
2026-03-10T12:35:13.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.278+0000 7f13217fa700 1 -- 192.168.123.100:0/2979628120 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f131400d590 con 0x7f1324103960
2026-03-10T12:35:13.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.278+0000 7f132a942700 1 -- 192.168.123.100:0/2979628120 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f132419d7a0 con 0x7f1324103960
2026-03-10T12:35:13.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.278+0000 7f132a942700 1 -- 192.168.123.100:0/2979628120 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1324075360 con 0x7f1324103960
2026-03-10T12:35:13.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.279+0000 7f13217fa700 1 -- 192.168.123.100:0/2979628120 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f131400d6f0 con 0x7f1324103960
2026-03-10T12:35:13.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.280+0000 7f132a942700 1 -- 192.168.123.100:0/2979628120 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1324066e40 con 0x7f1324103960
2026-03-10T12:35:13.281 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.281+0000 7f13217fa700 1 --2- 192.168.123.100:0/2979628120 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f130c06c7a0 0x7f130c06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:35:13.281 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.281+0000 7f13217fa700 1 -- 192.168.123.100:0/2979628120 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f131408c3b0 con 0x7f1324103960
2026-03-10T12:35:13.281 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.281+0000 7f1323fff700 1 --2- 192.168.123.100:0/2979628120 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f130c06c7a0 0x7f130c06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:35:13.281 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.281+0000 7f1323fff700 1 --2- 192.168.123.100:0/2979628120 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f130c06c7a0 0x7f130c06ec50 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f13180052a0 tx=0x7f131800b580 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:35:13.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.283+0000 7f13217fa700 1 -- 192.168.123.100:0/2979628120 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f131405a5c0 con 0x7f1324103960
2026-03-10T12:35:13.414 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.413+0000 7f132a942700 1 -- 192.168.123.100:0/2979628120 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"} v 0) v1 -- 0x7f132419d930 con 0x7f1324103960
2026-03-10T12:35:13.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.667+0000 7f13217fa700 1 -- 192.168.123.100:0/2979628120 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]=0 inline data disabled v7) v1 ==== 133+0+0 (secure 0 0 0) 0x7f132419d930 con 0x7f1324103960
2026-03-10T12:35:13.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.670+0000 7f132a942700 1 -- 192.168.123.100:0/2979628120 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f130c06c7a0 msgr2=0x7f130c06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:13.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.670+0000 7f132a942700 1 --2- 192.168.123.100:0/2979628120 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f130c06c7a0 0x7f130c06ec50 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f13180052a0 tx=0x7f131800b580 comp rx=0 tx=0).stop
2026-03-10T12:35:13.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.670+0000 7f132a942700 1 -- 192.168.123.100:0/2979628120 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1324103960 msgr2=0x7f1324198590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:13.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.670+0000 7f132a942700 1 --2- 192.168.123.100:0/2979628120 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1324103960 0x7f1324198590 secure :-1 s=READY pgs=260 cs=0 l=1 rev1=1 crypto rx=0x7f131400b700 tx=0x7f131400bac0 comp rx=0 tx=0).stop
2026-03-10T12:35:13.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.671+0000 7f132a942700 1 -- 192.168.123.100:0/2979628120 shutdown_connections
2026-03-10T12:35:13.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.671+0000 7f132a942700 1 --2- 192.168.123.100:0/2979628120 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f130c06c7a0 0x7f130c06ec50 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:13.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.671+0000 7f132a942700 1 --2- 192.168.123.100:0/2979628120 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1324102760 0x7f1324198050 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:13.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.671+0000 7f132a942700 1 --2- 192.168.123.100:0/2979628120 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1324103960 0x7f1324198590 unknown :-1 s=CLOSED pgs=260 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:13.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.671+0000 7f132a942700 1 -- 192.168.123.100:0/2979628120 >> 192.168.123.100:0/2979628120 conn(0x7f13240fdcf0 msgr2=0x7f1324106b90 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:35:13.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.671+0000 7f132a942700 1 -- 192.168.123.100:0/2979628120 shutdown_connections
2026-03-10T12:35:13.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:13.671+0000 7f132a942700 1 -- 192.168.123.100:0/2979628120 wait complete.
2026-03-10T12:35:13.672 INFO:teuthology.orchestra.run.vm00.stderr:inline data disabled
2026-03-10T12:35:13.729 INFO:teuthology.run_tasks:Running task cephadm.shell...
2026-03-10T12:35:13.731 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm00.local
2026-03-10T12:35:13.731 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- bash -c 'ceph fs dump'
2026-03-10T12:35:13.899 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:35:13.927 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: daemon mds.cephfs.vm00.lnokoe is now active in filesystem cephfs as rank 1
2026-03-10T12:35:13.927 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:35:13.927 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.rhzwnr", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-10T12:35:13.927 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.rhzwnr", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
2026-03-10T12:35:13.927 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:35:13.927 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: Deploying daemon mds.cephfs.vm07.rhzwnr on vm07
2026-03-10T12:35:13.927 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: pgmap v82: 65 pgs: 1 creating+activating, 24 creating+peering, 11 unknown, 29 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-10T12:35:13.927 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/2979628120' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]: dispatch
2026-03-10T12:35:13.927 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: mds.? [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] up:active
2026-03-10T12:35:13.927 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/2979628120' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]': finished
2026-03-10T12:35:13.928 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 1 up:standby
2026-03-10T12:35:13.928 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:35:13.928 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:35:13.928 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:35:13.928 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:35:13.928 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:13 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: daemon mds.cephfs.vm00.lnokoe is now active in filesystem cephfs as rank 1
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.rhzwnr", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.rhzwnr", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: Deploying daemon mds.cephfs.vm07.rhzwnr on vm07
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: pgmap v82: 65 pgs: 1 creating+activating, 24 creating+peering, 11 unknown, 29 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/2979628120' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]: dispatch
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: mds.? [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] up:active
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/2979628120' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "false"}]': finished
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 1 up:standby
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:35:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:13 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.183+0000 7f82a547b700 1 -- 192.168.123.100:0/3327819805 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f82a0072470 msgr2=0x7f82a010beb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.183+0000 7f82a547b700 1 --2- 192.168.123.100:0/3327819805 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f82a0072470 0x7f82a010beb0 secure :-1 s=READY pgs=261 cs=0 l=1 rev1=1 crypto rx=0x7f8290009b00 tx=0x7f8290009e10 comp rx=0 tx=0).stop
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.184+0000 7f82a547b700 1 -- 192.168.123.100:0/3327819805 shutdown_connections
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.184+0000 7f82a547b700 1 --2- 192.168.123.100:0/3327819805 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f82a0072470 0x7f82a010beb0 unknown :-1 s=CLOSED pgs=261 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.184+0000 7f82a547b700 1 --2- 192.168.123.100:0/3327819805 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f82a0071a90 0x7f82a0071ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.184+0000 7f82a547b700 1 -- 192.168.123.100:0/3327819805 >> 192.168.123.100:0/3327819805 conn(0x7f82a006d1a0 msgr2=0x7f82a006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.184+0000 7f82a547b700 1 -- 192.168.123.100:0/3327819805 shutdown_connections
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.184+0000 7f82a547b700 1 -- 192.168.123.100:0/3327819805 wait complete.
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.185+0000 7f82a547b700 1 Processor -- start
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.185+0000 7f82a547b700 1 -- start start
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.185+0000 7f82a547b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f82a0071a90 0x7f82a0116aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.185+0000 7f82a547b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f82a0116fe0 0x7f82a01b27d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.185+0000 7f82a547b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f82a01174e0 con 0x7f82a0116fe0
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.185+0000 7f82a547b700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f82a0117650 con 0x7f82a0071a90
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.185+0000 7f829effd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f82a0071a90 0x7f82a0116aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.185+0000 7f829effd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f82a0071a90 0x7f82a0116aa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:35886/0 (socket says 192.168.123.100:35886)
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.185+0000 7f829effd700 1 -- 192.168.123.100:0/106985939 learned_addr learned my addr 192.168.123.100:0/106985939 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.186+0000 7f829effd700 1 -- 192.168.123.100:0/106985939 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f82a0116fe0 msgr2=0x7f82a01b27d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:14.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.186+0000 7f829e7fc700 1 --2- 192.168.123.100:0/106985939 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f82a0116fe0 0x7f82a01b27d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:35:14.188 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.186+0000 7f829effd700 1 --2- 192.168.123.100:0/106985939 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f82a0116fe0 0x7f82a01b27d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:14.188 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.186+0000 7f829effd700 1 -- 192.168.123.100:0/106985939 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f82900097e0 con 0x7f82a0071a90
2026-03-10T12:35:14.188 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.186+0000 7f829e7fc700 1 --2- 192.168.123.100:0/106985939 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f82a0116fe0 0x7f82a01b27d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-10T12:35:14.188 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.186+0000 7f829effd700 1 --2- 192.168.123.100:0/106985939 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f82a0071a90 0x7f82a0116aa0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f829800c390 tx=0x7f829800c750 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:35:14.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.187+0000 7f8287fff700 1 -- 192.168.123.100:0/106985939 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f829800e030 con 0x7f82a0071a90
2026-03-10T12:35:14.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.187+0000 7f8287fff700 1 -- 192.168.123.100:0/106985939 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f829800f040 con 0x7f82a0071a90
2026-03-10T12:35:14.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.187+0000 7f8287fff700 1 -- 192.168.123.100:0/106985939 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8298014700 con 0x7f82a0071a90
2026-03-10T12:35:14.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.187+0000 7f82a547b700 1 -- 192.168.123.100:0/106985939 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f82a01b2d70 con 0x7f82a0071a90
2026-03-10T12:35:14.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.187+0000 7f82a547b700 1 -- 192.168.123.100:0/106985939 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f82a01b3230 con 0x7f82a0071a90
2026-03-10T12:35:14.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.188+0000 7f8287fff700 1 -- 192.168.123.100:0/106985939 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f8298009280 con 0x7f82a0071a90
2026-03-10T12:35:14.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.188+0000 7f82a547b700 1 -- 192.168.123.100:0/106985939 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f828c005320 con 0x7f82a0071a90
2026-03-10T12:35:14.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.189+0000 7f8287fff700 1 --2- 192.168.123.100:0/106985939 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f828806c680 0x7f828806eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:35:14.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.189+0000 7f8287fff700 1 -- 192.168.123.100:0/106985939 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f829808b870 con 0x7f82a0071a90
2026-03-10T12:35:14.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.189+0000 7f829e7fc700 1 --2- 192.168.123.100:0/106985939 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f828806c680 0x7f828806eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:35:14.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.189+0000 7f829e7fc700 1 --2- 192.168.123.100:0/106985939 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f828806c680 0x7f828806eb30 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f829000b5c0 tx=0x7f8290011040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:35:14.192 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.192+0000 7f8287fff700 1 -- 192.168.123.100:0/106985939 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8298059a80 con 0x7f82a0071a90
2026-03-10T12:35:14.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.323+0000 7f82a547b700 1 -- 192.168.123.100:0/106985939 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f828c006200 con 0x7f82a0071a90
2026-03-10T12:35:14.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.323+0000 7f8287fff700 1 -- 192.168.123.100:0/106985939 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 7 v7) v1 ==== 75+0+1618 (secure 0 0 0) 0x7f8298059610 con 0x7f82a0071a90
2026-03-10T12:35:14.325 INFO:teuthology.orchestra.run.vm00.stdout:e7
2026-03-10T12:35:14.325 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-10T12:35:14.325 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1)
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:epoch 7
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:35:13.664035+0000
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:root 0
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {}
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307}
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:failed
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:damaged
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:stopped
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3]
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:balancer
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 2 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 2 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons:
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:35:14.326 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T12:35:14.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.326+0000 7f8285ffb700 1 -- 192.168.123.100:0/106985939 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f828806c680 msgr2=0x7f828806eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:14.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.326+0000 7f8285ffb700 1 --2- 192.168.123.100:0/106985939 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f828806c680 0x7f828806eb30 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f829000b5c0 tx=0x7f8290011040 comp rx=0 tx=0).stop
2026-03-10T12:35:14.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.327+0000 7f8285ffb700 1 -- 192.168.123.100:0/106985939 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f82a0071a90 msgr2=0x7f82a0116aa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:14.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.327+0000 7f8285ffb700 1 --2- 192.168.123.100:0/106985939 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f82a0071a90 0x7f82a0116aa0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f829800c390 tx=0x7f829800c750 comp rx=0 tx=0).stop
2026-03-10T12:35:14.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.327+0000 7f8285ffb700 1 -- 192.168.123.100:0/106985939 shutdown_connections
2026-03-10T12:35:14.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.327+0000 7f8285ffb700 1 --2- 192.168.123.100:0/106985939 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f828806c680 0x7f828806eb30 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:14.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.327+0000 7f8285ffb700 1 --2- 192.168.123.100:0/106985939 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f82a0071a90 0x7f82a0116aa0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:14.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.327+0000 7f8285ffb700 1 --2- 192.168.123.100:0/106985939 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f82a0116fe0 0x7f82a01b27d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:14.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.327+0000 7f8285ffb700 1 -- 192.168.123.100:0/106985939 >> 192.168.123.100:0/106985939 conn(0x7f82a006d1a0 msgr2=0x7f82a010b200 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:35:14.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.327+0000 7f8285ffb700 1 -- 192.168.123.100:0/106985939 shutdown_connections
2026-03-10T12:35:14.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.328+0000 7f8285ffb700 1 -- 192.168.123.100:0/106985939 wait complete.
2026-03-10T12:35:14.332 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 7
2026-03-10T12:35:14.392 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- bash -c 'ceph --format=json fs dump | jq -e ".filesystems | length == 1"'
2026-03-10T12:35:14.567 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:35:14.909 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.906+0000 7fcb50a3d700 1 -- 192.168.123.100:0/1337162916 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcb4c072440 msgr2=0x7fcb4c10be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:14.909 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.906+0000 7fcb50a3d700 1 --2- 192.168.123.100:0/1337162916 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcb4c072440 0x7fcb4c10be90 secure :-1 s=READY pgs=262 cs=0 l=1 rev1=1 crypto rx=0x7fcb4400b600 tx=0x7fcb4400b910 comp rx=0 tx=0).stop
2026-03-10T12:35:14.909 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.907+0000 7fcb50a3d700 1 -- 192.168.123.100:0/1337162916 shutdown_connections
2026-03-10T12:35:14.909 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.907+0000 7fcb50a3d700 1 --2- 192.168.123.100:0/1337162916 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcb4c072440 0x7fcb4c10be90 unknown :-1 s=CLOSED pgs=262 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:14.909 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.907+0000 7fcb50a3d700 1 --2- 192.168.123.100:0/1337162916 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb4c071a60 0x7fcb4c071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:14.909 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.907+0000 7fcb50a3d700 1 -- 192.168.123.100:0/1337162916 >> 192.168.123.100:0/1337162916 conn(0x7fcb4c06d1a0 msgr2=0x7fcb4c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:14.909 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.907+0000 7fcb50a3d700 1 -- 192.168.123.100:0/1337162916 shutdown_connections 2026-03-10T12:35:14.909 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.907+0000 7fcb50a3d700 1 -- 192.168.123.100:0/1337162916 wait complete. 2026-03-10T12:35:14.909 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.908+0000 7fcb50a3d700 1 Processor -- start 2026-03-10T12:35:14.909 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.908+0000 7fcb50a3d700 1 -- start start 2026-03-10T12:35:14.909 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.908+0000 7fcb50a3d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcb4c071a60 0x7fcb4c1a4a60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:14.909 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.908+0000 7fcb50a3d700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb4c072440 0x7fcb4c1a4fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:14.910 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.908+0000 7fcb50a3d700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb4c1a55c0 con 0x7fcb4c071a60 2026-03-10T12:35:14.910 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.908+0000 7fcb50a3d700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcb4c1a5700 con 0x7fcb4c072440 2026-03-10T12:35:14.910 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.908+0000 7fcb4ad9d700 1 --2- >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcb4c071a60 0x7fcb4c1a4a60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:14.910 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.908+0000 7fcb4ad9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcb4c071a60 0x7fcb4c1a4a60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:52144/0 (socket says 192.168.123.100:52144) 2026-03-10T12:35:14.910 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.908+0000 7fcb4ad9d700 1 -- 192.168.123.100:0/2991349473 learned_addr learned my addr 192.168.123.100:0/2991349473 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:14.910 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.908+0000 7fcb4a59c700 1 --2- 192.168.123.100:0/2991349473 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb4c072440 0x7fcb4c1a4fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:14.910 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.908+0000 7fcb4ad9d700 1 -- 192.168.123.100:0/2991349473 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb4c072440 msgr2=0x7fcb4c1a4fa0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:14.910 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.908+0000 7fcb4ad9d700 1 --2- 192.168.123.100:0/2991349473 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb4c072440 0x7fcb4c1a4fa0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:14.910 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.908+0000 7fcb4ad9d700 1 -- 
192.168.123.100:0/2991349473 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcb4400b050 con 0x7fcb4c071a60 2026-03-10T12:35:14.910 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.908+0000 7fcb4a59c700 1 --2- 192.168.123.100:0/2991349473 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb4c072440 0x7fcb4c1a4fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:35:14.910 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.909+0000 7fcb4ad9d700 1 --2- 192.168.123.100:0/2991349473 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcb4c071a60 0x7fcb4c1a4a60 secure :-1 s=READY pgs=263 cs=0 l=1 rev1=1 crypto rx=0x7fcb3c00b700 tx=0x7fcb3c00bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:14.910 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.909+0000 7fcb33fff700 1 -- 192.168.123.100:0/2991349473 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcb3c010820 con 0x7fcb4c071a60 2026-03-10T12:35:14.910 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.909+0000 7fcb33fff700 1 -- 192.168.123.100:0/2991349473 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcb3c010e60 con 0x7fcb4c071a60 2026-03-10T12:35:14.910 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.909+0000 7fcb33fff700 1 -- 192.168.123.100:0/2991349473 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcb3c017570 con 0x7fcb4c071a60 2026-03-10T12:35:14.911 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.909+0000 7fcb50a3d700 1 -- 192.168.123.100:0/2991349473 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcb4c077140 con 0x7fcb4c071a60 
2026-03-10T12:35:14.911 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.909+0000 7fcb50a3d700 1 -- 192.168.123.100:0/2991349473 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcb4c077610 con 0x7fcb4c071a60 2026-03-10T12:35:14.914 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.911+0000 7fcb33fff700 1 -- 192.168.123.100:0/2991349473 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fcb3c010980 con 0x7fcb4c071a60 2026-03-10T12:35:14.914 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.911+0000 7fcb50a3d700 1 -- 192.168.123.100:0/2991349473 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcb4c19ec20 con 0x7fcb4c071a60 2026-03-10T12:35:14.917 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.914+0000 7fcb33fff700 1 --2- 192.168.123.100:0/2991349473 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcb3406c4d0 0x7fcb3406e980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:14.917 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.914+0000 7fcb33fff700 1 -- 192.168.123.100:0/2991349473 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fcb3c08afa0 con 0x7fcb4c071a60 2026-03-10T12:35:14.917 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.917+0000 7fcb4a59c700 1 --2- 192.168.123.100:0/2991349473 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcb3406c4d0 0x7fcb3406e980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:14.917 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.917+0000 7fcb33fff700 1 -- 192.168.123.100:0/2991349473 <== mon.0 v2:192.168.123.100:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fcb3c058b60 con 0x7fcb4c071a60 2026-03-10T12:35:14.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:14.919+0000 7fcb4a59c700 1 --2- 192.168.123.100:0/2991349473 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcb3406c4d0 0x7fcb3406e980 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fcb44015040 tx=0x7fcb44007ba0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:15.069 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:14 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:35:15.069 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:14 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/106985939' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:35:15.069 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:14 vm00 ceph-mon[50686]: mds.? 
[v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] up:boot 2026-03-10T12:35:15.069 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:14 vm00 ceph-mon[50686]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:35:15.069 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:14 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.rhzwnr"}]: dispatch 2026-03-10T12:35:15.069 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:14 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:15.069 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:14 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:15.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.067+0000 7fcb50a3d700 1 -- 192.168.123.100:0/2991349473 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7fcb4c04ea50 con 0x7fcb4c071a60 2026-03-10T12:35:15.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.068+0000 7fcb33fff700 1 -- 192.168.123.100:0/2991349473 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 8 v8) v1 ==== 93+0+4768 (secure 0 0 0) 0x7fcb3c058d40 con 0x7fcb4c071a60 2026-03-10T12:35:15.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.071+0000 7fcb31ffb700 1 -- 192.168.123.100:0/2991349473 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcb3406c4d0 msgr2=0x7fcb3406e980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:15.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.071+0000 7fcb31ffb700 1 --2- 192.168.123.100:0/2991349473 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] 
conn(0x7fcb3406c4d0 0x7fcb3406e980 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fcb44015040 tx=0x7fcb44007ba0 comp rx=0 tx=0).stop 2026-03-10T12:35:15.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.071+0000 7fcb31ffb700 1 -- 192.168.123.100:0/2991349473 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcb4c071a60 msgr2=0x7fcb4c1a4a60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:15.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.071+0000 7fcb31ffb700 1 --2- 192.168.123.100:0/2991349473 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcb4c071a60 0x7fcb4c1a4a60 secure :-1 s=READY pgs=263 cs=0 l=1 rev1=1 crypto rx=0x7fcb3c00b700 tx=0x7fcb3c00bac0 comp rx=0 tx=0).stop 2026-03-10T12:35:15.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.073+0000 7fcb31ffb700 1 -- 192.168.123.100:0/2991349473 shutdown_connections 2026-03-10T12:35:15.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.073+0000 7fcb31ffb700 1 --2- 192.168.123.100:0/2991349473 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcb3406c4d0 0x7fcb3406e980 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:15.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.073+0000 7fcb31ffb700 1 --2- 192.168.123.100:0/2991349473 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcb4c071a60 0x7fcb4c1a4a60 unknown :-1 s=CLOSED pgs=263 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:15.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.073+0000 7fcb31ffb700 1 --2- 192.168.123.100:0/2991349473 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcb4c072440 0x7fcb4c1a4fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:15.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.073+0000 
7fcb31ffb700 1 -- 192.168.123.100:0/2991349473 >> 192.168.123.100:0/2991349473 conn(0x7fcb4c06d1a0 msgr2=0x7fcb4c10a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:15.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.075+0000 7fcb31ffb700 1 -- 192.168.123.100:0/2991349473 shutdown_connections 2026-03-10T12:35:15.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.075+0000 7fcb31ffb700 1 -- 192.168.123.100:0/2991349473 wait complete. 2026-03-10T12:35:15.076 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 8 2026-03-10T12:35:15.090 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:35:15.149 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- bash -c 'while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done' 2026-03-10T12:35:15.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:14 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:35:15.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:14 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/106985939' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:35:15.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:14 vm07 ceph-mon[58582]: mds.? 
[v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] up:boot 2026-03-10T12:35:15.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:14 vm07 ceph-mon[58582]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:35:15.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:14 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.rhzwnr"}]: dispatch 2026-03-10T12:35:15.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:14 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:15.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:14 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:15.338 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:15.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.629+0000 7f530b405700 1 -- 192.168.123.100:0/1539582050 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5304071a60 msgr2=0x7f5304071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:15.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.629+0000 7f530b405700 1 --2- 192.168.123.100:0/1539582050 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5304071a60 0x7f5304071e70 secure :-1 s=READY pgs=264 cs=0 l=1 rev1=1 crypto rx=0x7f5300009b50 tx=0x7f5300009e60 comp rx=0 tx=0).stop 2026-03-10T12:35:15.631 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.630+0000 7f530b405700 1 -- 192.168.123.100:0/1539582050 shutdown_connections 2026-03-10T12:35:15.631 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.630+0000 7f530b405700 1 --2- 192.168.123.100:0/1539582050 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f5304072440 0x7f530410be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:15.631 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.630+0000 7f530b405700 1 --2- 192.168.123.100:0/1539582050 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5304071a60 0x7f5304071e70 unknown :-1 s=CLOSED pgs=264 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:15.631 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.630+0000 7f530b405700 1 -- 192.168.123.100:0/1539582050 >> 192.168.123.100:0/1539582050 conn(0x7f530406d1a0 msgr2=0x7f530406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:15.631 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.630+0000 7f530b405700 1 -- 192.168.123.100:0/1539582050 shutdown_connections 2026-03-10T12:35:15.631 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.630+0000 7f530b405700 1 -- 192.168.123.100:0/1539582050 wait complete. 
2026-03-10T12:35:15.631 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.631+0000 7f530b405700 1 Processor -- start 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.631+0000 7f530b405700 1 -- start start 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.631+0000 7f530b405700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5304071a60 0x7f53041a48f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.631+0000 7f530b405700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5304072440 0x7f53041a4e30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.631+0000 7f530b405700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f53041a5450 con 0x7f5304072440 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.631+0000 7f530b405700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f53041a5590 con 0x7f5304071a60 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.631+0000 7f5309c02700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5304072440 0x7f53041a4e30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.631+0000 7f5309c02700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5304072440 0x7f53041a4e30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:52160/0 (socket says 192.168.123.100:52160) 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.631+0000 7f5309c02700 1 -- 192.168.123.100:0/912098828 learned_addr learned my addr 192.168.123.100:0/912098828 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.631+0000 7f5309c02700 1 -- 192.168.123.100:0/912098828 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5304071a60 msgr2=0x7f53041a48f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.631+0000 7f5309c02700 1 --2- 192.168.123.100:0/912098828 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5304071a60 0x7f53041a48f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.631+0000 7f5309c02700 1 -- 192.168.123.100:0/912098828 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f53000097e0 con 0x7f5304072440 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.631+0000 7f5309c02700 1 --2- 192.168.123.100:0/912098828 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5304072440 0x7f53041a4e30 secure :-1 s=READY pgs=265 cs=0 l=1 rev1=1 crypto rx=0x7f52f400d900 tx=0x7f52f400dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.632+0000 7f52fb7fe700 1 -- 192.168.123.100:0/912098828 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f52f40098e0 con 0x7f5304072440 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.632+0000 7f52fb7fe700 1 -- 
192.168.123.100:0/912098828 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f52f4010460 con 0x7f5304072440 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.632+0000 7f52fb7fe700 1 -- 192.168.123.100:0/912098828 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f52f400f5d0 con 0x7f5304072440 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.632+0000 7f530b405700 1 -- 192.168.123.100:0/912098828 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f530410f620 con 0x7f5304072440 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.632+0000 7f530b405700 1 -- 192.168.123.100:0/912098828 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f530410fb70 con 0x7f5304072440 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.633+0000 7f52fb7fe700 1 -- 192.168.123.100:0/912098828 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f52f40105d0 con 0x7f5304072440 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.634+0000 7f52fb7fe700 1 --2- 192.168.123.100:0/912098828 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f52f006c7a0 0x7f52f006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.634+0000 7f52fb7fe700 1 -- 192.168.123.100:0/912098828 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f52f408ad40 con 0x7f5304072440 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.634+0000 7f530b405700 1 -- 192.168.123.100:0/912098828 --> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f530419eba0 con 0x7f5304072440 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.637+0000 7f52fb7fe700 1 -- 192.168.123.100:0/912098828 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f52f4058fd0 con 0x7f5304072440 2026-03-10T12:35:15.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.637+0000 7f530a403700 1 --2- 192.168.123.100:0/912098828 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f52f006c7a0 0x7f52f006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:15.644 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.641+0000 7f530a403700 1 --2- 192.168.123.100:0/912098828 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f52f006c7a0 0x7f52f006ec50 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f5300009b20 tx=0x7f5300000bc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:15.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.822+0000 7f530b405700 1 -- 192.168.123.100:0/912098828 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "mds versions", "format": "json"} v 0) v1 -- 0x7f530410f7b0 con 0x7f5304072440 2026-03-10T12:35:15.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.824+0000 7f52fb7fe700 1 -- 192.168.123.100:0/912098828 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "mds versions", "format": "json"}]=0 v8) v1 ==== 78+0+83 (secure 0 0 0) 0x7f530410f7b0 con 0x7f5304072440 2026-03-10T12:35:15.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.830+0000 7f530b405700 1 -- 
192.168.123.100:0/912098828 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f52f006c7a0 msgr2=0x7f52f006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:15.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.830+0000 7f530b405700 1 --2- 192.168.123.100:0/912098828 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f52f006c7a0 0x7f52f006ec50 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f5300009b20 tx=0x7f5300000bc0 comp rx=0 tx=0).stop 2026-03-10T12:35:15.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.830+0000 7f530b405700 1 -- 192.168.123.100:0/912098828 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5304072440 msgr2=0x7f53041a4e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:15.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.830+0000 7f530b405700 1 --2- 192.168.123.100:0/912098828 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5304072440 0x7f53041a4e30 secure :-1 s=READY pgs=265 cs=0 l=1 rev1=1 crypto rx=0x7f52f400d900 tx=0x7f52f400dcc0 comp rx=0 tx=0).stop 2026-03-10T12:35:15.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.830+0000 7f530b405700 1 -- 192.168.123.100:0/912098828 shutdown_connections 2026-03-10T12:35:15.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.830+0000 7f530b405700 1 --2- 192.168.123.100:0/912098828 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f52f006c7a0 0x7f52f006ec50 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:15.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.830+0000 7f530b405700 1 --2- 192.168.123.100:0/912098828 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5304071a60 0x7f53041a48f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:15.831 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.830+0000 7f530b405700 1 --2- 192.168.123.100:0/912098828 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5304072440 0x7f53041a4e30 unknown :-1 s=CLOSED pgs=265 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:15.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.830+0000 7f530b405700 1 -- 192.168.123.100:0/912098828 >> 192.168.123.100:0/912098828 conn(0x7f530406d1a0 msgr2=0x7f530410a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:15.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.830+0000 7f530b405700 1 -- 192.168.123.100:0/912098828 shutdown_connections 2026-03-10T12:35:15.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:15.830+0000 7f530b405700 1 -- 192.168.123.100:0/912098828 wait complete. 2026-03-10T12:35:15.847 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:15 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/2991349473' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T12:35:15.847 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:15 vm00 ceph-mon[50686]: pgmap v83: 65 pgs: 8 creating+peering, 57 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.2 KiB/s wr, 12 op/s 2026-03-10T12:35:15.847 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:15 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:15.847 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:15 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:15.847 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:15 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:15.847 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:15 vm00 ceph-mon[50686]: 
from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:35:15.847 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:15 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:15.847 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:35:15.881 INFO:teuthology.run_tasks:Running task fs.pre_upgrade_save... 2026-03-10T12:35:15.885 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 2026-03-10T12:35:16.054 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:16.276 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:15 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/2991349473' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T12:35:16.276 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:15 vm07 ceph-mon[58582]: pgmap v83: 65 pgs: 8 creating+peering, 57 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.2 KiB/s wr, 12 op/s 2026-03-10T12:35:16.276 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:15 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:16.276 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:15 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:16.276 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:15 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:16.276 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:15 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 
cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:35:16.276 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:15 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:16.347 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.345+0000 7f15c8f9d700 1 -- 192.168.123.100:0/3977309660 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f15c4071b60 msgr2=0x7f15c4071fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:16.347 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.345+0000 7f15c8f9d700 1 --2- 192.168.123.100:0/3977309660 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f15c4071b60 0x7f15c4071fd0 secure :-1 s=READY pgs=266 cs=0 l=1 rev1=1 crypto rx=0x7f15b8009b50 tx=0x7f15b8009e60 comp rx=0 tx=0).stop 2026-03-10T12:35:16.347 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.346+0000 7f15c8f9d700 1 -- 192.168.123.100:0/3977309660 shutdown_connections 2026-03-10T12:35:16.347 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.346+0000 7f15c8f9d700 1 --2- 192.168.123.100:0/3977309660 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f15c4071b60 0x7f15c4071fd0 unknown :-1 s=CLOSED pgs=266 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:16.347 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.346+0000 7f15c8f9d700 1 --2- 192.168.123.100:0/3977309660 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15c410e9e0 0x7f15c410edb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:16.347 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.346+0000 7f15c8f9d700 1 -- 192.168.123.100:0/3977309660 >> 192.168.123.100:0/3977309660 conn(0x7f15c406c6c0 msgr2=0x7f15c406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:16.347 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.346+0000 7f15c8f9d700 1 -- 192.168.123.100:0/3977309660 shutdown_connections 2026-03-10T12:35:16.347 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.346+0000 7f15c8f9d700 1 -- 192.168.123.100:0/3977309660 wait complete. 2026-03-10T12:35:16.347 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.347+0000 7f15c8f9d700 1 Processor -- start 2026-03-10T12:35:16.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.347+0000 7f15c8f9d700 1 -- start start 2026-03-10T12:35:16.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.347+0000 7f15c8f9d700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15c4071b60 0x7f15c4115910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:16.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.347+0000 7f15c8f9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f15c410e9e0 0x7f15c4115e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:16.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.347+0000 7f15c8f9d700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f15c4119aa0 con 0x7f15c410e9e0 2026-03-10T12:35:16.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.347+0000 7f15c8f9d700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f15c4116390 con 0x7f15c4071b60 2026-03-10T12:35:16.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.347+0000 7f15c2ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f15c410e9e0 0x7f15c4115e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:16.349 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.348+0000 7f15c2ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f15c410e9e0 0x7f15c4115e50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:52180/0 (socket says 192.168.123.100:52180) 2026-03-10T12:35:16.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.348+0000 7f15c2ffd700 1 -- 192.168.123.100:0/1722409702 learned_addr learned my addr 192.168.123.100:0/1722409702 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:16.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.348+0000 7f15c37fe700 1 --2- 192.168.123.100:0/1722409702 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15c4071b60 0x7f15c4115910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:16.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.348+0000 7f15c2ffd700 1 -- 192.168.123.100:0/1722409702 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15c4071b60 msgr2=0x7f15c4115910 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:16.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.348+0000 7f15c2ffd700 1 --2- 192.168.123.100:0/1722409702 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15c4071b60 0x7f15c4115910 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:16.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.348+0000 7f15c2ffd700 1 -- 192.168.123.100:0/1722409702 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f15b80097e0 con 0x7f15c410e9e0 2026-03-10T12:35:16.349 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.348+0000 7f15c2ffd700 1 --2- 192.168.123.100:0/1722409702 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f15c410e9e0 0x7f15c4115e50 secure :-1 s=READY pgs=267 cs=0 l=1 rev1=1 crypto rx=0x7f15b8005950 tx=0x7f15b8004e80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:16.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.349+0000 7f15c0ff9700 1 -- 192.168.123.100:0/1722409702 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f15b801d070 con 0x7f15c410e9e0 2026-03-10T12:35:16.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.349+0000 7f15c8f9d700 1 -- 192.168.123.100:0/1722409702 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f15c4116610 con 0x7f15c410e9e0 2026-03-10T12:35:16.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.349+0000 7f15c8f9d700 1 -- 192.168.123.100:0/1722409702 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f15c41b7900 con 0x7f15c410e9e0 2026-03-10T12:35:16.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.349+0000 7f15c0ff9700 1 -- 192.168.123.100:0/1722409702 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f15b8022470 con 0x7f15c410e9e0 2026-03-10T12:35:16.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.349+0000 7f15c0ff9700 1 -- 192.168.123.100:0/1722409702 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f15b800f630 con 0x7f15c410e9e0 2026-03-10T12:35:16.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.350+0000 7f15c0ff9700 1 -- 192.168.123.100:0/1722409702 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f15b800baa0 con 0x7f15c410e9e0 
2026-03-10T12:35:16.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.351+0000 7f15c8f9d700 1 -- 192.168.123.100:0/1722409702 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f15b0005320 con 0x7f15c410e9e0 2026-03-10T12:35:16.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.351+0000 7f15c0ff9700 1 --2- 192.168.123.100:0/1722409702 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f15ac06c690 0x7f15ac06eb40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:16.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.351+0000 7f15c0ff9700 1 -- 192.168.123.100:0/1722409702 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f15b808cd30 con 0x7f15c410e9e0 2026-03-10T12:35:16.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.354+0000 7f15c37fe700 1 --2- 192.168.123.100:0/1722409702 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f15ac06c690 0x7f15ac06eb40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:16.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.354+0000 7f15c0ff9700 1 -- 192.168.123.100:0/1722409702 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f15b8057560 con 0x7f15c410e9e0 2026-03-10T12:35:16.363 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.357+0000 7f15c37fe700 1 --2- 192.168.123.100:0/1722409702 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f15ac06c690 0x7f15ac06eb40 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f15b4005950 tx=0x7f15b40058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T12:35:16.491 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.490+0000 7f15c8f9d700 1 -- 192.168.123.100:0/1722409702 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f15b0005f70 con 0x7f15c410e9e0 2026-03-10T12:35:16.492 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.492+0000 7f15c0ff9700 1 -- 192.168.123.100:0/1722409702 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 9 v9) v1 ==== 93+0+4767 (secure 0 0 0) 0x7f15b8027090 con 0x7f15c410e9e0 2026-03-10T12:35:16.492 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:35:16.492 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":9,"default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14490,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6829/2948081127","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":2948081127},{"type":"v1","addr":"192.168.123.100:6829","nonce":2948081127}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout 
v2","feature_10":"snaprealm v2"}},"epoch":5},{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":8}],"filesystems":[{"mdsmap":{"epoch":9,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:35:15.824106+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24313,"mds_1":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm00.lnokoe","rank":1,"incarnation":6,"state":"up:active","state_seq":2,"addr":"192.168.123.100:6827/2640363946","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2640363946},{"type":"v1","addr":"192.168.123.100:6827","nonce":2640363946}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}},"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1},"id":1}]} 2026-03-10T12:35:16.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.495+0000 7f15aa7fc700 1 -- 192.168.123.100:0/1722409702 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f15ac06c690 msgr2=0x7f15ac06eb40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:16.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.495+0000 7f15aa7fc700 1 --2- 192.168.123.100:0/1722409702 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f15ac06c690 0x7f15ac06eb40 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f15b4005950 tx=0x7f15b40058e0 comp rx=0 tx=0).stop 2026-03-10T12:35:16.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.495+0000 7f15aa7fc700 1 -- 192.168.123.100:0/1722409702 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f15c410e9e0 msgr2=0x7f15c4115e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:16.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.495+0000 7f15aa7fc700 1 --2- 192.168.123.100:0/1722409702 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f15c410e9e0 0x7f15c4115e50 secure :-1 s=READY pgs=267 cs=0 l=1 rev1=1 crypto rx=0x7f15b8005950 tx=0x7f15b8004e80 comp rx=0 tx=0).stop 2026-03-10T12:35:16.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.495+0000 7f15aa7fc700 1 -- 192.168.123.100:0/1722409702 shutdown_connections 2026-03-10T12:35:16.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.495+0000 7f15aa7fc700 1 --2- 192.168.123.100:0/1722409702 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f15ac06c690 0x7f15ac06eb40 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:16.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.495+0000 7f15aa7fc700 1 --2- 192.168.123.100:0/1722409702 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f15c4071b60 0x7f15c4115910 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:16.496 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.495+0000 7f15aa7fc700 1 --2- 192.168.123.100:0/1722409702 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f15c410e9e0 0x7f15c4115e50 unknown :-1 s=CLOSED pgs=267 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:16.496 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.495+0000 7f15aa7fc700 1 -- 192.168.123.100:0/1722409702 >> 192.168.123.100:0/1722409702 conn(0x7f15c406c6c0 msgr2=0x7f15c406cfa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:16.496 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.495+0000 7f15aa7fc700 1 -- 192.168.123.100:0/1722409702 shutdown_connections 2026-03-10T12:35:16.496 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:16.495+0000 7f15aa7fc700 1 -- 192.168.123.100:0/1722409702 wait complete. 2026-03-10T12:35:16.497 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 9 2026-03-10T12:35:16.579 DEBUG:tasks.fs:fs fscid=1,name=cephfs state = {'epoch': 9, 'max_mds': 2, 'flags': 18} 2026-03-10T12:35:16.579 INFO:teuthology.run_tasks:Running task ceph-fuse... 2026-03-10T12:35:16.590 INFO:tasks.ceph_fuse:Running ceph_fuse task... 
2026-03-10T12:35:16.590 INFO:tasks.ceph_fuse:config is {'client.0': {}, 'client.1': {}} 2026-03-10T12:35:16.590 INFO:tasks.ceph_fuse:client.0 config is {} 2026-03-10T12:35:16.590 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-10T12:35:16.590 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-10T12:35:16.590 INFO:tasks.ceph_fuse:client.1 config is {} 2026-03-10T12:35:16.590 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-10T12:35:16.590 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-10T12:35:16.590 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:16.590 DEBUG:teuthology.orchestra.run.vm07:> ip netns list 2026-03-10T12:35:16.611 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:16.612 DEBUG:teuthology.orchestra.run.vm07:> sudo ip link delete ceph-brx 2026-03-10T12:35:16.698 INFO:teuthology.orchestra.run.vm07.stderr:Cannot find device "ceph-brx" 2026-03-10T12:35:16.700 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T12:35:16.700 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:16.700 DEBUG:teuthology.orchestra.run.vm00:> ip netns list 2026-03-10T12:35:16.728 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:16.728 DEBUG:teuthology.orchestra.run.vm00:> sudo ip link delete ceph-brx 2026-03-10T12:35:16.815 INFO:teuthology.orchestra.run.vm00.stderr:Cannot find device "ceph-brx" 2026-03-10T12:35:16.817 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T12:35:16.817 INFO:tasks.ceph_fuse:Mounting ceph-fuse clients... 
2026-03-10T12:35:16.817 DEBUG:tasks.ceph_fuse:passing mntargs=[] 2026-03-10T12:35:16.817 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs ls 2026-03-10T12:35:16.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:16 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/912098828' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-10T12:35:16.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:16 vm00 ceph-mon[50686]: mds.? [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] up:active 2026-03-10T12:35:16.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:16 vm00 ceph-mon[50686]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:35:16.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:16 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:35:16.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:16 vm00 ceph-mon[50686]: from='client.? 
192.168.123.100:0/1722409702' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T12:35:16.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:16 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:16.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:16 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:16.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:16 vm00 ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:17.039 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:17.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:16 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/912098828' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-10T12:35:17.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:16 vm07 ceph-mon[58582]: mds.? [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] up:active 2026-03-10T12:35:17.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:16 vm07 ceph-mon[58582]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:35:17.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:16 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:35:17.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:16 vm07 ceph-mon[58582]: from='client.? 
192.168.123.100:0/1722409702' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T12:35:17.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:16 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:17.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:16 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:17.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:16 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:17.337 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.336+0000 7f0ce8e2b700 1 -- 192.168.123.100:0/2948064348 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ce4071e40 msgr2=0x7f0ce40722b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:17.338 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.336+0000 7f0ce8e2b700 1 --2- 192.168.123.100:0/2948064348 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ce4071e40 0x7f0ce40722b0 secure :-1 s=READY pgs=268 cs=0 l=1 rev1=1 crypto rx=0x7f0cdc00b600 tx=0x7f0cdc00b910 comp rx=0 tx=0).stop 2026-03-10T12:35:17.338 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.337+0000 7f0ce8e2b700 1 -- 192.168.123.100:0/2948064348 shutdown_connections 2026-03-10T12:35:17.338 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.337+0000 7f0ce8e2b700 1 --2- 192.168.123.100:0/2948064348 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ce4071e40 0x7f0ce40722b0 unknown :-1 s=CLOSED pgs=268 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:17.338 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.337+0000 7f0ce8e2b700 1 --2- 192.168.123.100:0/2948064348 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ce410c8f0 0x7f0ce410ccc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:17.338 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.337+0000 7f0ce8e2b700 1 -- 192.168.123.100:0/2948064348 >> 192.168.123.100:0/2948064348 conn(0x7f0ce406c6c0 msgr2=0x7f0ce406cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:17.339 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.338+0000 7f0ce8e2b700 1 -- 192.168.123.100:0/2948064348 shutdown_connections 2026-03-10T12:35:17.339 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.338+0000 7f0ce8e2b700 1 -- 192.168.123.100:0/2948064348 wait complete. 2026-03-10T12:35:17.339 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.338+0000 7f0ce8e2b700 1 Processor -- start 2026-03-10T12:35:17.339 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.338+0000 7f0ce8e2b700 1 -- start start 2026-03-10T12:35:17.339 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.338+0000 7f0ce8e2b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ce410c8f0 0x7f0ce407d160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:17.339 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.338+0000 7f0ce8e2b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ce407d6a0 0x7f0ce4081b10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:17.340 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.338+0000 7f0ce8e2b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ce407db10 con 0x7f0ce407d6a0 2026-03-10T12:35:17.340 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.338+0000 7f0ce8e2b700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ce407dc50 con 0x7f0ce410c8f0 2026-03-10T12:35:17.340 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.338+0000 7f0ce2ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ce407d6a0 0x7f0ce4081b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:17.340 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.338+0000 7f0ce2ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ce407d6a0 0x7f0ce4081b10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:52200/0 (socket says 192.168.123.100:52200) 2026-03-10T12:35:17.340 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.338+0000 7f0ce2ffd700 1 -- 192.168.123.100:0/2122999395 learned_addr learned my addr 192.168.123.100:0/2122999395 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:17.340 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.339+0000 7f0ce2ffd700 1 -- 192.168.123.100:0/2122999395 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ce410c8f0 msgr2=0x7f0ce407d160 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:35:17.340 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.339+0000 7f0ce2ffd700 1 --2- 192.168.123.100:0/2122999395 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ce410c8f0 0x7f0ce407d160 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:17.340 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.339+0000 7f0ce2ffd700 1 -- 192.168.123.100:0/2122999395 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0cdc00b050 con 0x7f0ce407d6a0 2026-03-10T12:35:17.340 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.339+0000 7f0ce2ffd700 1 --2- 
192.168.123.100:0/2122999395 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ce407d6a0 0x7f0ce4081b10 secure :-1 s=READY pgs=269 cs=0 l=1 rev1=1 crypto rx=0x7f0cdc003c40 tx=0x7f0cdc003c70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:17.341 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.340+0000 7f0ce0ff9700 1 -- 192.168.123.100:0/2122999395 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0cdc00e030 con 0x7f0ce407d6a0 2026-03-10T12:35:17.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.340+0000 7f0ce8e2b700 1 -- 192.168.123.100:0/2122999395 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0ce4082050 con 0x7f0ce407d6a0 2026-03-10T12:35:17.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.340+0000 7f0ce8e2b700 1 -- 192.168.123.100:0/2122999395 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0ce40825a0 con 0x7f0ce407d6a0 2026-03-10T12:35:17.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.342+0000 7f0ce0ff9700 1 -- 192.168.123.100:0/2122999395 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0cdc003ec0 con 0x7f0ce407d6a0 2026-03-10T12:35:17.343 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.342+0000 7f0ce0ff9700 1 -- 192.168.123.100:0/2122999395 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0cdc01cd80 con 0x7f0ce407d6a0 2026-03-10T12:35:17.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.344+0000 7f0ce0ff9700 1 -- 192.168.123.100:0/2122999395 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f0cdc012430 con 0x7f0ce407d6a0 2026-03-10T12:35:17.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.344+0000 
7f0ce0ff9700 1 --2- 192.168.123.100:0/2122999395 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f0ccc06ea90 0x7f0ccc070f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:17.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.344+0000 7f0ce0ff9700 1 -- 192.168.123.100:0/2122999395 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f0cdc08e0e0 con 0x7f0ce407d6a0 2026-03-10T12:35:17.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.344+0000 7f0ce8e2b700 1 -- 192.168.123.100:0/2122999395 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0cd0005320 con 0x7f0ce407d6a0 2026-03-10T12:35:17.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.346+0000 7f0ce37fe700 1 --2- 192.168.123.100:0/2122999395 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f0ccc06ea90 0x7f0ccc070f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:17.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.348+0000 7f0ce37fe700 1 --2- 192.168.123.100:0/2122999395 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f0ccc06ea90 0x7f0ccc070f40 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f0cd4009710 tx=0x7f0cd4006c60 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:17.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.348+0000 7f0ce0ff9700 1 -- 192.168.123.100:0/2122999395 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f0cdc0588e0 con 0x7f0ce407d6a0 2026-03-10T12:35:17.493 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.492+0000 7f0ce8e2b700 1 -- 192.168.123.100:0/2122999395 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7f0cd0006200 con 0x7f0ce407d6a0 2026-03-10T12:35:17.494 INFO:teuthology.orchestra.run.vm00.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-10T12:35:17.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.492+0000 7f0ce0ff9700 1 -- 192.168.123.100:0/2122999395 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v9) v1 ==== 53+0+83 (secure 0 0 0) 0x7f0cdc017070 con 0x7f0ce407d6a0 2026-03-10T12:35:17.497 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.496+0000 7f0ce8e2b700 1 -- 192.168.123.100:0/2122999395 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f0ccc06ea90 msgr2=0x7f0ccc070f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:17.497 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.496+0000 7f0ce8e2b700 1 --2- 192.168.123.100:0/2122999395 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f0ccc06ea90 0x7f0ccc070f40 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f0cd4009710 tx=0x7f0cd4006c60 comp rx=0 tx=0).stop 2026-03-10T12:35:17.497 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.496+0000 7f0ce8e2b700 1 -- 192.168.123.100:0/2122999395 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ce407d6a0 msgr2=0x7f0ce4081b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:17.497 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.496+0000 7f0ce8e2b700 1 --2- 192.168.123.100:0/2122999395 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ce407d6a0 0x7f0ce4081b10 secure :-1 s=READY pgs=269 cs=0 l=1 rev1=1 crypto rx=0x7f0cdc003c40 tx=0x7f0cdc003c70 comp rx=0 tx=0).stop 
2026-03-10T12:35:17.497 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.496+0000 7f0ce8e2b700 1 -- 192.168.123.100:0/2122999395 shutdown_connections 2026-03-10T12:35:17.497 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.496+0000 7f0ce8e2b700 1 --2- 192.168.123.100:0/2122999395 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f0ccc06ea90 0x7f0ccc070f40 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:17.497 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.496+0000 7f0ce8e2b700 1 --2- 192.168.123.100:0/2122999395 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ce410c8f0 0x7f0ce407d160 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:17.497 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.496+0000 7f0ce8e2b700 1 --2- 192.168.123.100:0/2122999395 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ce407d6a0 0x7f0ce4081b10 unknown :-1 s=CLOSED pgs=269 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:17.497 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.496+0000 7f0ce8e2b700 1 -- 192.168.123.100:0/2122999395 >> 192.168.123.100:0/2122999395 conn(0x7f0ce406c6c0 msgr2=0x7f0ce4070080 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:17.498 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.497+0000 7f0ce8e2b700 1 -- 192.168.123.100:0/2122999395 shutdown_connections 2026-03-10T12:35:17.498 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17.497+0000 7f0ce8e2b700 1 -- 192.168.123.100:0/2122999395 wait complete. 2026-03-10T12:35:17.558 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-10T12:35:17.558 INFO:tasks.cephfs.mount:Mounting Ceph FS. 
Following are details of mount; remember "None" represents Python type None - 2026-03-10T12:35:17.558 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm00.local 2026-03-10T12:35:17.558 INFO:tasks.cephfs.mount:self.client.name = client.0 2026-03-10T12:35:17.558 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-10T12:35:17.558 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-10T12:35:17.558 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-10T12:35:17.558 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-10T12:35:17.558 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.0' 2026-03-10T12:35:17.559 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:17.559 DEBUG:teuthology.orchestra.run.vm00:> ip addr 2026-03-10T12:35:17.581 INFO:teuthology.orchestra.run.vm00.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-10T12:35:17.581 INFO:teuthology.orchestra.run.vm00.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-10T12:35:17.581 INFO:teuthology.orchestra.run.vm00.stdout: inet 127.0.0.1/8 scope host lo 2026-03-10T12:35:17.581 INFO:teuthology.orchestra.run.vm00.stdout: valid_lft forever preferred_lft forever 2026-03-10T12:35:17.581 INFO:teuthology.orchestra.run.vm00.stdout: inet6 ::1/128 scope host 2026-03-10T12:35:17.581 INFO:teuthology.orchestra.run.vm00.stdout: valid_lft forever preferred_lft forever 2026-03-10T12:35:17.581 INFO:teuthology.orchestra.run.vm00.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-10T12:35:17.581 INFO:teuthology.orchestra.run.vm00.stdout: link/ether 52:55:00:00:00:00 brd ff:ff:ff:ff:ff:ff 2026-03-10T12:35:17.581 INFO:teuthology.orchestra.run.vm00.stdout: altname enp0s3 2026-03-10T12:35:17.581 INFO:teuthology.orchestra.run.vm00.stdout: altname ens3 2026-03-10T12:35:17.581 INFO:teuthology.orchestra.run.vm00.stdout: inet 192.168.123.100/24 brd 
192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-10T12:35:17.581 INFO:teuthology.orchestra.run.vm00.stdout: valid_lft 3172sec preferred_lft 3172sec 2026-03-10T12:35:17.581 INFO:teuthology.orchestra.run.vm00.stdout: inet6 fe80::5055:ff:fe00:0/64 scope link noprefixroute 2026-03-10T12:35:17.581 INFO:teuthology.orchestra.run.vm00.stdout: valid_lft forever preferred_lft forever 2026-03-10T12:35:17.581 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-10T12:35:17.581 DEBUG:teuthology.orchestra.run.vm00:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T12:35:17.581 DEBUG:teuthology.orchestra.run.vm00:> set -e 2026-03-10T12:35:17.581 DEBUG:teuthology.orchestra.run.vm00:> sudo ip link add name ceph-brx type bridge 2026-03-10T12:35:17.581 DEBUG:teuthology.orchestra.run.vm00:> sudo ip addr flush dev ceph-brx 2026-03-10T12:35:17.581 DEBUG:teuthology.orchestra.run.vm00:> sudo ip link set ceph-brx up 2026-03-10T12:35:17.582 DEBUG:teuthology.orchestra.run.vm00:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-10T12:35:17.582 DEBUG:teuthology.orchestra.run.vm00:> ') 2026-03-10T12:35:17.676 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T12:35:17.764 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T12:35:17.764 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:17.764 DEBUG:teuthology.orchestra.run.vm00:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward 2026-03-10T12:35:17.831 INFO:teuthology.orchestra.run.vm00.stdout:1 2026-03-10T12:35:17.831 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:17.831 DEBUG:teuthology.orchestra.run.vm00:> ip r 2026-03-10T12:35:17.857 INFO:teuthology.orchestra.run.vm00.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.100 metric 100 2026-03-10T12:35:17.857 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.100 metric 100 2026-03-10T12:35:17.857 INFO:teuthology.orchestra.run.vm00.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254 2026-03-10T12:35:17.857 DEBUG:teuthology.orchestra.run.vm00:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T12:35:17.857 DEBUG:teuthology.orchestra.run.vm00:> set -e 2026-03-10T12:35:17.857 DEBUG:teuthology.orchestra.run.vm00:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT 2026-03-10T12:35:17.857 DEBUG:teuthology.orchestra.run.vm00:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT 2026-03-10T12:35:17.857 DEBUG:teuthology.orchestra.run.vm00:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE 2026-03-10T12:35:17.857 DEBUG:teuthology.orchestra.run.vm00:> ') 2026-03-10T12:35:17.939 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:17 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T12:35:18.009 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:18 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T12:35:18.013 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:18.013 DEBUG:teuthology.orchestra.run.vm00:> ip netns list 2026-03-10T12:35:18.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:17 vm07 ceph-mon[58582]: pgmap v84: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.4 KiB/s wr, 10 op/s 2026-03-10T12:35:18.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:17 vm07 ceph-mon[58582]: from='client.? 192.168.123.100:0/2122999395' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T12:35:18.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:17 vm07 ceph-mon[58582]: mds.? [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] up:standby 2026-03-10T12:35:18.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:17 vm07 ceph-mon[58582]: mds.? [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] up:active 2026-03-10T12:35:18.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:17 vm07 ceph-mon[58582]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:35:18.070 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:18.070 DEBUG:teuthology.orchestra.run.vm00:> ip netns list-id 2026-03-10T12:35:18.127 DEBUG:teuthology.orchestra.run.vm00:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T12:35:18.127 DEBUG:teuthology.orchestra.run.vm00:> set -e 2026-03-10T12:35:18.127 DEBUG:teuthology.orchestra.run.vm00:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T12:35:18.127 DEBUG:teuthology.orchestra.run.vm00:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.0 0 2026-03-10T12:35:18.127 DEBUG:teuthology.orchestra.run.vm00:> ') 2026-03-10T12:35:18.203 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:18 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 
2026-03-10T12:35:18.212 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:17 vm00 ceph-mon[50686]: pgmap v84: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.4 KiB/s wr, 10 op/s 2026-03-10T12:35:18.212 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:17 vm00 ceph-mon[50686]: from='client.? 192.168.123.100:0/2122999395' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T12:35:18.212 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:17 vm00 ceph-mon[50686]: mds.? [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] up:standby 2026-03-10T12:35:18.212 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:17 vm00 ceph-mon[50686]: mds.? [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] up:active 2026-03-10T12:35:18.212 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:17 vm00 ceph-mon[50686]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:35:18.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:18 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T12:35:18.236 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.0' with 192.168.144.1/20 2026-03-10T12:35:18.237 DEBUG:teuthology.orchestra.run.vm00:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T12:35:18.237 DEBUG:teuthology.orchestra.run.vm00:> set -e 2026-03-10T12:35:18.237 DEBUG:teuthology.orchestra.run.vm00:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.0 type veth peer name brx.0 2026-03-10T12:35:18.237 DEBUG:teuthology.orchestra.run.vm00:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-10T12:35:18.237 DEBUG:teuthology.orchestra.run.vm00:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set veth0 up 2026-03-10T12:35:18.237 DEBUG:teuthology.orchestra.run.vm00:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set lo up 2026-03-10T12:35:18.237 DEBUG:teuthology.orchestra.run.vm00:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip route add default via 192.168.159.254 2026-03-10T12:35:18.237 DEBUG:teuthology.orchestra.run.vm00:> ') 2026-03-10T12:35:18.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:18 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T12:35:18.404 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:18 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T12:35:18.408 DEBUG:teuthology.orchestra.run.vm00:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T12:35:18.408 DEBUG:teuthology.orchestra.run.vm00:> set -e 2026-03-10T12:35:18.408 DEBUG:teuthology.orchestra.run.vm00:> sudo ip link set brx.0 up 2026-03-10T12:35:18.408 DEBUG:teuthology.orchestra.run.vm00:> sudo ip link set dev brx.0 master ceph-brx 2026-03-10T12:35:18.408 DEBUG:teuthology.orchestra.run.vm00:> ') 2026-03-10T12:35:18.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:18 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T12:35:18.470 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:18 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T12:35:18.473 INFO:tasks.cephfs.fuse_mount:Client client.0 config is {} 2026-03-10T12:35:18.473 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T12:35:18.473 DEBUG:teuthology.orchestra.run.vm00:> mkdir -p -v /home/ubuntu/cephtest/mnt.0 2026-03-10T12:35:18.532 INFO:teuthology.orchestra.run.vm00.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.0' 2026-03-10T12:35:18.532 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T12:35:18.532 DEBUG:teuthology.orchestra.run.vm00:> chmod 0000 /home/ubuntu/cephtest/mnt.0 2026-03-10T12:35:18.594 DEBUG:teuthology.orchestra.run.vm00:> sudo modprobe fuse 2026-03-10T12:35:18.664 DEBUG:teuthology.orchestra.run.vm00:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T12:35:18.723 INFO:teuthology.orchestra.run.vm00.stdout:/proc 2026-03-10T12:35:18.723 INFO:teuthology.orchestra.run.vm00.stdout:/sys 2026-03-10T12:35:18.723 INFO:teuthology.orchestra.run.vm00.stdout:/dev 2026-03-10T12:35:18.723 INFO:teuthology.orchestra.run.vm00.stdout:/sys/kernel/security 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/dev/shm 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/dev/pts 
2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/run 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/sys/fs/cgroup 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/sys/fs/pstore 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/sys/fs/bpf 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/sys/kernel/config 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/ 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/sys/fs/selinux 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/dev/hugepages 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/dev/mqueue 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/sys/kernel/debug 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/sys/kernel/tracing 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/sys/fs/fuse/connections 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/run/user/1000 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/42be3d5aa77ee82d52a4ee9a6de8414954b1c04767e8ef34ef974ce93710c6b3/merged 2026-03-10T12:35:18.724 
INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/d91ffb33c7d21522d291418334f6a51742f49f9b479967842f9b7a34727ba34e/merged 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/run/user/0 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/72f4b538f275c82269f6745d690508e43850f71aa69b5c8285f80d3d2c40c426/merged 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/b5f45d02d1480b6a88cbd68b0927be648aeceec3b98ee7054b1ee0c66a7415d9/merged 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/2fd0a9f1c362168325d9056c72fd283da2d5f80922db75a6550ec63c4a350ce8/merged 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/58de5e6c737063f6313f5558be7b94fbac36de8d45356beb551226a8a04f3d5a/merged 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/8c76a4f9a39c2bef50d23e75b717861416974c626b53ed88513a2e97a149f635/merged 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/2d4d53c65fd9a5c44299e58a55bb89f8b130b99a6d65b3da9d588334d3318be8/merged 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/2160c5cab2626d9091bf6c509fe7cc3e061379c8d347f45749dc11de5407ad14/merged 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/f74b556cf11f684aa22fa34d83a57438ddeaab4dc0963f5ce698298350e04d7f/merged 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/fcce0614f9e68e7d6716552162f564e4f2e8b05fdefef7b7cb74edb229a726b0/merged 2026-03-10T12:35:18.724 
INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/8a3dff0d1467e0d3cab84a00efa87dcf0055579d95880ca4cebc42e429b49f76/merged 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/9c93d3021cdec9e8084e203c11733a5493e6edc2965d9c5a07f4c432dfa2b2f0/merged 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/run/netns 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run.vm00.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T12:35:18.724 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:18.724 DEBUG:teuthology.orchestra.run.vm00:> ls /sys/fs/fuse/connections 2026-03-10T12:35:18.782 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: [] 2026-03-10T12:35:18.782 DEBUG:teuthology.orchestra.run.vm00:> (cd /home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket '/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.0 --id 0) 2026-03-10T12:35:18.824 DEBUG:teuthology.orchestra.run.vm00:> sudo modprobe fuse 2026-03-10T12:35:18.854 DEBUG:teuthology.orchestra.run.vm00:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T12:35:18.906 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm00.stderr:ceph-fuse[91644]: starting ceph client 2026-03-10T12:35:18.906 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm00.stderr:2026-03-10T12:35:18.906+0000 7ff4ed788480 -1 init, newargv = 0x55855175da60 newargc=15 2026-03-10T12:35:18.917 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm00.stderr:ceph-fuse[91644]: starting fuse 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/proc 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/sys 2026-03-10T12:35:18.925 
INFO:teuthology.orchestra.run.vm00.stdout:/dev 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/sys/kernel/security 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/dev/shm 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/dev/pts 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/run 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/sys/fs/cgroup 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/sys/fs/pstore 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/sys/fs/bpf 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/sys/kernel/config 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/ 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/sys/fs/selinux 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/dev/hugepages 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/dev/mqueue 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/sys/kernel/debug 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/sys/kernel/tracing 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/sys/fs/fuse/connections 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/run/user/1000 2026-03-10T12:35:18.925 
INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/42be3d5aa77ee82d52a4ee9a6de8414954b1c04767e8ef34ef974ce93710c6b3/merged 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/d91ffb33c7d21522d291418334f6a51742f49f9b479967842f9b7a34727ba34e/merged 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/run/user/0 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/72f4b538f275c82269f6745d690508e43850f71aa69b5c8285f80d3d2c40c426/merged 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/b5f45d02d1480b6a88cbd68b0927be648aeceec3b98ee7054b1ee0c66a7415d9/merged 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/2fd0a9f1c362168325d9056c72fd283da2d5f80922db75a6550ec63c4a350ce8/merged 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/58de5e6c737063f6313f5558be7b94fbac36de8d45356beb551226a8a04f3d5a/merged 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/8c76a4f9a39c2bef50d23e75b717861416974c626b53ed88513a2e97a149f635/merged 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/2d4d53c65fd9a5c44299e58a55bb89f8b130b99a6d65b3da9d588334d3318be8/merged 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/2160c5cab2626d9091bf6c509fe7cc3e061379c8d347f45749dc11de5407ad14/merged 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/f74b556cf11f684aa22fa34d83a57438ddeaab4dc0963f5ce698298350e04d7f/merged 
2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/fcce0614f9e68e7d6716552162f564e4f2e8b05fdefef7b7cb74edb229a726b0/merged 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/8a3dff0d1467e0d3cab84a00efa87dcf0055579d95880ca4cebc42e429b49f76/merged 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/var/lib/containers/storage/overlay/9c93d3021cdec9e8084e203c11733a5493e6edc2965d9c5a07f4c432dfa2b2f0/merged 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/run/netns 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T12:35:18.925 INFO:teuthology.orchestra.run.vm00.stdout:/home/ubuntu/cephtest/mnt.0 2026-03-10T12:35:18.926 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:18.926 DEBUG:teuthology.orchestra.run.vm00:> ls /sys/fs/fuse/connections 2026-03-10T12:35:18.983 INFO:teuthology.orchestra.run.vm00.stdout:96 2026-03-10T12:35:18.983 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [96] 2026-03-10T12:35:18.983 DEBUG:teuthology.orchestra.run.vm00:> sudo stdin-killer -- python3 -c ' 2026-03-10T12:35:18.983 DEBUG:teuthology.orchestra.run.vm00:> import glob 2026-03-10T12:35:18.983 DEBUG:teuthology.orchestra.run.vm00:> import re 2026-03-10T12:35:18.983 DEBUG:teuthology.orchestra.run.vm00:> import os 2026-03-10T12:35:18.983 DEBUG:teuthology.orchestra.run.vm00:> import subprocess 2026-03-10T12:35:18.983 DEBUG:teuthology.orchestra.run.vm00:> 2026-03-10T12:35:18.983 DEBUG:teuthology.orchestra.run.vm00:> def _find_admin_socket(client_name): 2026-03-10T12:35:18.983 DEBUG:teuthology.orchestra.run.vm00:> asok_path = "/var/run/ceph/ceph-client.0.*.asok" 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> files = 
glob.glob(asok_path) 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> mountpoint = "/home/ubuntu/cephtest/mnt.0" 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> # Given a non-glob path, it better be there 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> if "*" not in asok_path: 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> assert(len(files) == 1) 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> return files[0] 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> for f in files: 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> pid = re.match(".*\.(\d+)\.asok$", f).group(1) 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> if os.path.exists("/proc/{0}".format(pid)): 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f: 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> contents = proc_f.read() 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> if mountpoint in contents: 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> return f 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> raise RuntimeError("Client socket {0} not found".format(client_name)) 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> print(_find_admin_socket("client.0")) 2026-03-10T12:35:18.984 DEBUG:teuthology.orchestra.run.vm00:> ' 2026-03-10T12:35:19.083 INFO:teuthology.orchestra.run.vm00.stdout:/var/run/ceph/ceph-client.0.91644.asok 2026-03-10T12:35:19.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T12:35:19.092 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.0.91644.asok 2026-03-10T12:35:19.092 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:19.092 DEBUG:teuthology.orchestra.run.vm00:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.0.91644.asok status 2026-03-10T12:35:19.156 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:18 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:19.156 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:18 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:19.156 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:18 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:19.156 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:18 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:35:19.156 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:18 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:19.156 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:18 vm00.local ceph-mon[50686]: mds.? 
[v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] up:standby 2026-03-10T12:35:19.156 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:18 vm00.local ceph-mon[50686]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "metadata": { 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "ceph_sha1": "5dd24139a1eada541a3bc16b6941c5dde975e26d", 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)", 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "entity_id": "0", 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "hostname": "vm00.local", 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "mount_point": "/home/ubuntu/cephtest/mnt.0", 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "pid": "91644", 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "root": "/" 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "dentry_count": 0, 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "dentry_pinned_count": 0, 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "id": 14518, 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "inst": { 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "name": { 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "type": "client", 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "num": 14518 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "addr": { 2026-03-10T12:35:19.200 
INFO:teuthology.orchestra.run.vm00.stdout: "type": "v1", 2026-03-10T12:35:19.200 INFO:teuthology.orchestra.run.vm00.stdout: "addr": "192.168.123.100:0", 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: "nonce": 1707042861 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: "addr": { 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: "type": "v1", 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: "addr": "192.168.123.100:0", 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: "nonce": 1707042861 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: "inst_str": "client.14518 192.168.123.100:0/1707042861", 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: "addr_str": "192.168.123.100:0/1707042861", 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: "inode_count": 1, 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: "mds_epoch": 10, 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: "osd_epoch": 39, 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: "osd_epoch_barrier": 0, 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: "blocklisted": false, 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout: "fs_name": "cephfs" 2026-03-10T12:35:19.201 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:35:19.205 DEBUG:tasks.ceph_fuse:passing mntargs=[] 2026-03-10T12:35:19.205 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs ls 2026-03-10T12:35:19.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:18 vm07 ceph-mon[58582]: from='mgr.14223 
192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:19.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:18 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:19.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:18 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:19.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:18 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:35:19.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:18 vm07 ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:19.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:18 vm07 ceph-mon[58582]: mds.? [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] up:standby 2026-03-10T12:35:19.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:18 vm07 ceph-mon[58582]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:35:19.398 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:19.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.660+0000 7fe8396c6700 1 -- 192.168.123.100:0/3168927562 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe834073a00 msgr2=0x7fe83410c820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:19.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.660+0000 7fe8396c6700 1 --2- 192.168.123.100:0/3168927562 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe834073a00 0x7fe83410c820 secure :-1 s=READY pgs=272 cs=0 l=1 rev1=1 crypto rx=0x7fe824009b00 tx=0x7fe824009e10 comp 
rx=0 tx=0).stop 2026-03-10T12:35:19.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.661+0000 7fe8396c6700 1 -- 192.168.123.100:0/3168927562 shutdown_connections 2026-03-10T12:35:19.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.661+0000 7fe8396c6700 1 --2- 192.168.123.100:0/3168927562 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe834073a00 0x7fe83410c820 unknown :-1 s=CLOSED pgs=272 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:19.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.661+0000 7fe8396c6700 1 --2- 192.168.123.100:0/3168927562 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe8340730f0 0x7fe8340734c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:19.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.661+0000 7fe8396c6700 1 -- 192.168.123.100:0/3168927562 >> 192.168.123.100:0/3168927562 conn(0x7fe8340fc000 msgr2=0x7fe8340fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:19.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.662+0000 7fe8396c6700 1 -- 192.168.123.100:0/3168927562 shutdown_connections 2026-03-10T12:35:19.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.662+0000 7fe8396c6700 1 -- 192.168.123.100:0/3168927562 wait complete. 
2026-03-10T12:35:19.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.662+0000 7fe8396c6700 1 Processor -- start 2026-03-10T12:35:19.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.662+0000 7fe8396c6700 1 -- start start 2026-03-10T12:35:19.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.663+0000 7fe8396c6700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe8340730f0 0x7fe8341982a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:19.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.663+0000 7fe8396c6700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe834073a00 0x7fe834198910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:19.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.663+0000 7fe8327fc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe834073a00 0x7fe834198910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:19.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.663+0000 7fe8327fc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe834073a00 0x7fe834198910 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:57776/0 (socket says 192.168.123.100:57776) 2026-03-10T12:35:19.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.663+0000 7fe8327fc700 1 -- 192.168.123.100:0/2507657220 learned_addr learned my addr 192.168.123.100:0/2507657220 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:19.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.663+0000 7fe8396c6700 1 -- 192.168.123.100:0/2507657220 --> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe83419cc50 con 0x7fe834073a00 2026-03-10T12:35:19.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.663+0000 7fe8396c6700 1 -- 192.168.123.100:0/2507657220 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe83419cdc0 con 0x7fe8340730f0 2026-03-10T12:35:19.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.664+0000 7fe8327fc700 1 -- 192.168.123.100:0/2507657220 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe8340730f0 msgr2=0x7fe8341982a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:19.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.664+0000 7fe832ffd700 1 --2- 192.168.123.100:0/2507657220 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe8340730f0 0x7fe8341982a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:19.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.664+0000 7fe8327fc700 1 --2- 192.168.123.100:0/2507657220 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe8340730f0 0x7fe8341982a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:19.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.664+0000 7fe8327fc700 1 -- 192.168.123.100:0/2507657220 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe8240097e0 con 0x7fe834073a00 2026-03-10T12:35:19.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.664+0000 7fe832ffd700 1 --2- 192.168.123.100:0/2507657220 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe8340730f0 0x7fe8341982a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state 
changed! 2026-03-10T12:35:19.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.664+0000 7fe8327fc700 1 --2- 192.168.123.100:0/2507657220 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe834073a00 0x7fe834198910 secure :-1 s=READY pgs=273 cs=0 l=1 rev1=1 crypto rx=0x7fe8240048c0 tx=0x7fe8240048f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:19.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.664+0000 7fe82bfff700 1 -- 192.168.123.100:0/2507657220 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe82401d070 con 0x7fe834073a00 2026-03-10T12:35:19.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.664+0000 7fe82bfff700 1 -- 192.168.123.100:0/2507657220 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe824004b80 con 0x7fe834073a00 2026-03-10T12:35:19.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.664+0000 7fe82bfff700 1 -- 192.168.123.100:0/2507657220 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe82400f650 con 0x7fe834073a00 2026-03-10T12:35:19.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.664+0000 7fe8396c6700 1 -- 192.168.123.100:0/2507657220 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe83419d0a0 con 0x7fe834073a00 2026-03-10T12:35:19.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.664+0000 7fe8396c6700 1 -- 192.168.123.100:0/2507657220 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe83419d460 con 0x7fe834073a00 2026-03-10T12:35:19.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.666+0000 7fe8396c6700 1 -- 192.168.123.100:0/2507657220 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7fe834109f20 con 0x7fe834073a00 2026-03-10T12:35:19.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.667+0000 7fe82bfff700 1 -- 192.168.123.100:0/2507657220 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fe82400bc50 con 0x7fe834073a00 2026-03-10T12:35:19.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.667+0000 7fe82bfff700 1 --2- 192.168.123.100:0/2507657220 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe82006c7a0 0x7fe82006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:19.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.667+0000 7fe82bfff700 1 -- 192.168.123.100:0/2507657220 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fe82408cf00 con 0x7fe834073a00 2026-03-10T12:35:19.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.669+0000 7fe832ffd700 1 --2- 192.168.123.100:0/2507657220 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe82006c7a0 0x7fe82006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:19.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.669+0000 7fe832ffd700 1 --2- 192.168.123.100:0/2507657220 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe82006c7a0 0x7fe82006ec50 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fe81c005fd0 tx=0x7fe81c005ee0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:19.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.669+0000 7fe82bfff700 1 -- 192.168.123.100:0/2507657220 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 
0 0 0) 0x7fe824057700 con 0x7fe834073a00 2026-03-10T12:35:19.793 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.793+0000 7fe8396c6700 1 -- 192.168.123.100:0/2507657220 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7fe834199600 con 0x7fe834073a00 2026-03-10T12:35:19.794 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.793+0000 7fe82bfff700 1 -- 192.168.123.100:0/2507657220 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v11) v1 ==== 53+0+83 (secure 0 0 0) 0x7fe824027070 con 0x7fe834073a00 2026-03-10T12:35:19.794 INFO:teuthology.orchestra.run.vm00.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-10T12:35:19.796 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.796+0000 7fe8396c6700 1 -- 192.168.123.100:0/2507657220 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe82006c7a0 msgr2=0x7fe82006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:19.796 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.796+0000 7fe8396c6700 1 --2- 192.168.123.100:0/2507657220 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe82006c7a0 0x7fe82006ec50 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fe81c005fd0 tx=0x7fe81c005ee0 comp rx=0 tx=0).stop 2026-03-10T12:35:19.796 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.796+0000 7fe8396c6700 1 -- 192.168.123.100:0/2507657220 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe834073a00 msgr2=0x7fe834198910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:19.796 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.796+0000 7fe8396c6700 1 --2- 192.168.123.100:0/2507657220 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe834073a00 0x7fe834198910 secure :-1 s=READY pgs=273 cs=0 l=1 rev1=1 crypto 
rx=0x7fe8240048c0 tx=0x7fe8240048f0 comp rx=0 tx=0).stop 2026-03-10T12:35:19.797 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.797+0000 7fe8396c6700 1 -- 192.168.123.100:0/2507657220 shutdown_connections 2026-03-10T12:35:19.797 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.797+0000 7fe8396c6700 1 --2- 192.168.123.100:0/2507657220 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe82006c7a0 0x7fe82006ec50 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:19.797 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.797+0000 7fe8396c6700 1 --2- 192.168.123.100:0/2507657220 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe8340730f0 0x7fe8341982a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:19.797 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.797+0000 7fe8396c6700 1 --2- 192.168.123.100:0/2507657220 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe834073a00 0x7fe834198910 unknown :-1 s=CLOSED pgs=273 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:19.797 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.797+0000 7fe8396c6700 1 -- 192.168.123.100:0/2507657220 >> 192.168.123.100:0/2507657220 conn(0x7fe8340fc000 msgr2=0x7fe834107060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:19.797 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.797+0000 7fe8396c6700 1 -- 192.168.123.100:0/2507657220 shutdown_connections 2026-03-10T12:35:19.797 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:19.797+0000 7fe8396c6700 1 -- 192.168.123.100:0/2507657220 wait complete. 2026-03-10T12:35:19.864 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-10T12:35:19.864 INFO:tasks.cephfs.mount:Mounting Ceph FS. 
Following are details of mount; remember "None" represents Python type None - 2026-03-10T12:35:19.864 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm07.local 2026-03-10T12:35:19.864 INFO:tasks.cephfs.mount:self.client.name = client.1 2026-03-10T12:35:19.864 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-10T12:35:19.864 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-10T12:35:19.864 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-10T12:35:19.864 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-10T12:35:19.864 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.1' 2026-03-10T12:35:19.864 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:19.864 DEBUG:teuthology.orchestra.run.vm07:> ip addr 2026-03-10T12:35:19.880 INFO:teuthology.orchestra.run.vm07.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-10T12:35:19.880 INFO:teuthology.orchestra.run.vm07.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-10T12:35:19.880 INFO:teuthology.orchestra.run.vm07.stdout: inet 127.0.0.1/8 scope host lo 2026-03-10T12:35:19.880 INFO:teuthology.orchestra.run.vm07.stdout: valid_lft forever preferred_lft forever 2026-03-10T12:35:19.880 INFO:teuthology.orchestra.run.vm07.stdout: inet6 ::1/128 scope host 2026-03-10T12:35:19.880 INFO:teuthology.orchestra.run.vm07.stdout: valid_lft forever preferred_lft forever 2026-03-10T12:35:19.880 INFO:teuthology.orchestra.run.vm07.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-10T12:35:19.880 INFO:teuthology.orchestra.run.vm07.stdout: link/ether 52:55:00:00:00:07 brd ff:ff:ff:ff:ff:ff 2026-03-10T12:35:19.880 INFO:teuthology.orchestra.run.vm07.stdout: altname enp0s3 2026-03-10T12:35:19.880 INFO:teuthology.orchestra.run.vm07.stdout: altname ens3 2026-03-10T12:35:19.880 INFO:teuthology.orchestra.run.vm07.stdout: inet 192.168.123.107/24 brd 
192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-10T12:35:19.880 INFO:teuthology.orchestra.run.vm07.stdout: valid_lft 3145sec preferred_lft 3145sec 2026-03-10T12:35:19.880 INFO:teuthology.orchestra.run.vm07.stdout: inet6 fe80::5055:ff:fe00:7/64 scope link noprefixroute 2026-03-10T12:35:19.880 INFO:teuthology.orchestra.run.vm07.stdout: valid_lft forever preferred_lft forever 2026-03-10T12:35:19.881 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-10T12:35:19.881 DEBUG:teuthology.orchestra.run.vm07:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T12:35:19.881 DEBUG:teuthology.orchestra.run.vm07:> set -e 2026-03-10T12:35:19.881 DEBUG:teuthology.orchestra.run.vm07:> sudo ip link add name ceph-brx type bridge 2026-03-10T12:35:19.881 DEBUG:teuthology.orchestra.run.vm07:> sudo ip addr flush dev ceph-brx 2026-03-10T12:35:19.881 DEBUG:teuthology.orchestra.run.vm07:> sudo ip link set ceph-brx up 2026-03-10T12:35:19.881 DEBUG:teuthology.orchestra.run.vm07:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-10T12:35:19.881 DEBUG:teuthology.orchestra.run.vm07:> ') 2026-03-10T12:35:19.955 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:35:19 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T12:35:20.031 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:35:20 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T12:35:20.037 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:20.037 DEBUG:teuthology.orchestra.run.vm07:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward 2026-03-10T12:35:20.107 INFO:teuthology.orchestra.run.vm07.stdout:1 2026-03-10T12:35:20.108 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:20.108 DEBUG:teuthology.orchestra.run.vm07:> ip r 2026-03-10T12:35:20.166 INFO:teuthology.orchestra.run.vm07.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.107 metric 100 2026-03-10T12:35:20.166 INFO:teuthology.orchestra.run.vm07.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.107 metric 100 2026-03-10T12:35:20.166 INFO:teuthology.orchestra.run.vm07.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254 2026-03-10T12:35:20.166 DEBUG:teuthology.orchestra.run.vm07:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T12:35:20.166 DEBUG:teuthology.orchestra.run.vm07:> set -e 2026-03-10T12:35:20.166 DEBUG:teuthology.orchestra.run.vm07:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT 2026-03-10T12:35:20.166 DEBUG:teuthology.orchestra.run.vm07:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT 2026-03-10T12:35:20.166 DEBUG:teuthology.orchestra.run.vm07:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE 2026-03-10T12:35:20.166 DEBUG:teuthology.orchestra.run.vm07:> ') 2026-03-10T12:35:20.234 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:20 vm07.local ceph-mon[58582]: pgmap v85: 65 pgs: 65 active+clean; 453 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.1 KiB/s wr, 9 op/s 2026-03-10T12:35:20.234 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:20 vm07.local ceph-mon[58582]: from='client.? 
192.168.123.100:0/2507657220' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T12:35:20.244 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:35:20 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T12:35:20.305 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:35:20 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T12:35:20.310 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:20.310 DEBUG:teuthology.orchestra.run.vm07:> ip netns list 2026-03-10T12:35:20.365 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:20.365 DEBUG:teuthology.orchestra.run.vm07:> ip netns list-id 2026-03-10T12:35:20.420 DEBUG:teuthology.orchestra.run.vm07:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T12:35:20.420 DEBUG:teuthology.orchestra.run.vm07:> set -e 2026-03-10T12:35:20.420 DEBUG:teuthology.orchestra.run.vm07:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T12:35:20.420 DEBUG:teuthology.orchestra.run.vm07:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.1 0 2026-03-10T12:35:20.420 DEBUG:teuthology.orchestra.run.vm07:> ') 2026-03-10T12:35:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:20 vm00.local ceph-mon[50686]: pgmap v85: 65 pgs: 65 active+clean; 453 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.1 KiB/s wr, 9 op/s 2026-03-10T12:35:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:20 vm00.local ceph-mon[50686]: from='client.? 
192.168.123.100:0/2507657220' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T12:35:20.493 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:35:20 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T12:35:20.519 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:35:20 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T12:35:20.522 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.1' with 192.168.144.1/20 2026-03-10T12:35:20.522 DEBUG:teuthology.orchestra.run.vm07:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T12:35:20.522 DEBUG:teuthology.orchestra.run.vm07:> set -e 2026-03-10T12:35:20.522 DEBUG:teuthology.orchestra.run.vm07:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.1 type veth peer name brx.0 2026-03-10T12:35:20.523 DEBUG:teuthology.orchestra.run.vm07:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-10T12:35:20.523 DEBUG:teuthology.orchestra.run.vm07:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set veth0 up 2026-03-10T12:35:20.523 DEBUG:teuthology.orchestra.run.vm07:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set lo up 2026-03-10T12:35:20.523 DEBUG:teuthology.orchestra.run.vm07:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip route add default via 192.168.159.254 2026-03-10T12:35:20.523 DEBUG:teuthology.orchestra.run.vm07:> ') 2026-03-10T12:35:20.596 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:35:20 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T12:35:20.658 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:35:20 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
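The sequence just completed on vm07 (bridge creation, IP forwarding, iptables NAT, netns creation, veth pair) gives each FUSE mount its own network namespace bridged out through `eth0`. The dry-run sketch below consolidates those commands in order; the addresses and namespace name are the values from this log, and `emit` only prints the `sudo` commands rather than executing them, since the real sequence needs root on the test node.

```shell
#!/bin/sh
# Dry-run sketch of the per-mount network isolation set up above:
# a ceph-brx bridge on the host, one netns per FUSE mount, a veth
# pair joining them, and masquerading out through eth0.
BRX=ceph-brx
BRX_ADDR=192.168.159.254/20
NS=ceph-ns--home-ubuntu-cephtest-mnt.1
VETH_ADDR=192.168.144.1/20

# Print each command instead of running it; drop `echo` to apply for real.
emit() { echo "sudo $*"; }

emit ip link add name "$BRX" type bridge
emit ip link set "$BRX" up
emit ip addr add "$BRX_ADDR" brd 192.168.159.255 dev "$BRX"
emit sysctl -w net.ipv4.ip_forward=1
emit iptables -A FORWARD -o eth0 -i "$BRX" -j ACCEPT
emit iptables -A FORWARD -i eth0 -o "$BRX" -j ACCEPT
emit iptables -t nat -A POSTROUTING -s "$BRX_ADDR" -o eth0 -j MASQUERADE
emit ip netns add "$NS"
emit ip link add veth0 netns "$NS" type veth peer name brx.0
emit ip netns exec "$NS" ip addr add "$VETH_ADDR" brd 192.168.159.255 dev veth0
emit ip netns exec "$NS" ip link set veth0 up
emit ip netns exec "$NS" ip link set lo up
emit ip netns exec "$NS" ip route add default via 192.168.159.254
emit ip link set brx.0 up
emit ip link set dev brx.0 master "$BRX"
```

The later `nsenter --net=/var/run/netns/$NS ... ceph-fuse` invocation below then runs the client inside that namespace, so its traffic reaches the cluster via the bridge and the MASQUERADE rule.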
2026-03-10T12:35:20.665 DEBUG:teuthology.orchestra.run.vm07:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T12:35:20.665 DEBUG:teuthology.orchestra.run.vm07:> set -e 2026-03-10T12:35:20.665 DEBUG:teuthology.orchestra.run.vm07:> sudo ip link set brx.0 up 2026-03-10T12:35:20.665 DEBUG:teuthology.orchestra.run.vm07:> sudo ip link set dev brx.0 master ceph-brx 2026-03-10T12:35:20.665 DEBUG:teuthology.orchestra.run.vm07:> ') 2026-03-10T12:35:20.737 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:35:20 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T12:35:20.764 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:35:20 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T12:35:20.768 INFO:tasks.cephfs.fuse_mount:Client client.1 config is {} 2026-03-10T12:35:20.768 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T12:35:20.768 DEBUG:teuthology.orchestra.run.vm07:> mkdir -p -v /home/ubuntu/cephtest/mnt.1 2026-03-10T12:35:20.823 INFO:teuthology.orchestra.run.vm07.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.1' 2026-03-10T12:35:20.823 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T12:35:20.823 DEBUG:teuthology.orchestra.run.vm07:> chmod 0000 /home/ubuntu/cephtest/mnt.1 2026-03-10T12:35:20.878 DEBUG:teuthology.orchestra.run.vm07:> sudo modprobe fuse 2026-03-10T12:35:20.946 DEBUG:teuthology.orchestra.run.vm07:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T12:35:21.004 INFO:teuthology.orchestra.run.vm07.stdout:/proc 2026-03-10T12:35:21.004 INFO:teuthology.orchestra.run.vm07.stdout:/sys 2026-03-10T12:35:21.004 INFO:teuthology.orchestra.run.vm07.stdout:/dev 2026-03-10T12:35:21.004 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/security 2026-03-10T12:35:21.004 INFO:teuthology.orchestra.run.vm07.stdout:/dev/shm 2026-03-10T12:35:21.004 INFO:teuthology.orchestra.run.vm07.stdout:/dev/pts 
2026-03-10T12:35:21.004 INFO:teuthology.orchestra.run.vm07.stdout:/run 2026-03-10T12:35:21.004 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/cgroup 2026-03-10T12:35:21.004 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/pstore 2026-03-10T12:35:21.004 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/bpf 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/config 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/ 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/selinux 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/dev/hugepages 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/dev/mqueue 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/debug 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/tracing 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/fuse/connections 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/run/user/1000 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/run/user/0 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay 2026-03-10T12:35:21.005 
INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/27d07818bdb1191145f6f4bf503bcb1e25bf393187074a759a17976c082492a3/merged 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/4e470cbd9ba1f7eb2334c092c3523d15dc348343fb74e91dce771a5859387ee5/merged 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/865563d08c5cfb639ff9312e34c9c0e748e07f2393ed9c462a3f9a11e38a90cd/merged 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/5b75749e51b97079e1657eca570b1accfdbbc95eec8e352d582260dd5f628445/merged 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/bd25401adea8ab9cbe0b175b51e7d391cff4a4f5c8aa816174242796ddde48ca/merged 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/ad41b620f663f56acbe7d22d085abcd4d1f8754efa1f0ac73ad3bf779df3c9b6/merged 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/b829b2d591976f8210f669a89275319fe652185ac6947c93a279d9f98f34865d/merged 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/380975e5ee2045b4b9ad7e0b8f33224fc4c0b621a5b808354f824bc22cb72963/merged 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/3b04a3c3ac6cc38738ee2b76ecf7e10d7f6ce649f827bc517cf3cfe2b34c7671/merged 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/84351eea17438a3fbe25f7aa14876c2f38547d059127232cc8b851fc88a26c84/merged 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/run/netns 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run.vm07.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T12:35:21.005 
INFO:teuthology.orchestra.run.vm07.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T12:35:21.005 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:21.005 DEBUG:teuthology.orchestra.run.vm07:> ls /sys/fs/fuse/connections 2026-03-10T12:35:21.063 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: [] 2026-03-10T12:35:21.063 DEBUG:teuthology.orchestra.run.vm07:> (cd /home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket '/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.1 --id 1) 2026-03-10T12:35:21.105 DEBUG:teuthology.orchestra.run.vm07:> sudo modprobe fuse 2026-03-10T12:35:21.135 DEBUG:teuthology.orchestra.run.vm07:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T12:35:21.175 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm07.stderr:ceph-fuse[80379]: starting ceph client 2026-03-10T12:35:21.175 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm07.stderr:2026-03-10T12:35:21.174+0000 7fe030e28480 -1 init, newargv = 0x56426b783ed0 newargc=15 2026-03-10T12:35:21.185 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm07.stderr:ceph-fuse[80379]: starting fuse 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/proc 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/sys 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/dev 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/security 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/dev/shm 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/dev/pts 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/run 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/cgroup 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/pstore 2026-03-10T12:35:21.200 
INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/bpf 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/config 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/ 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/selinux 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/dev/hugepages 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/dev/mqueue 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/debug 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/sys/kernel/tracing 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/sys/fs/fuse/connections 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/run/user/1000 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/run/user/0 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/27d07818bdb1191145f6f4bf503bcb1e25bf393187074a759a17976c082492a3/merged 2026-03-10T12:35:21.200 
INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/4e470cbd9ba1f7eb2334c092c3523d15dc348343fb74e91dce771a5859387ee5/merged 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/865563d08c5cfb639ff9312e34c9c0e748e07f2393ed9c462a3f9a11e38a90cd/merged 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/5b75749e51b97079e1657eca570b1accfdbbc95eec8e352d582260dd5f628445/merged 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/bd25401adea8ab9cbe0b175b51e7d391cff4a4f5c8aa816174242796ddde48ca/merged 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/ad41b620f663f56acbe7d22d085abcd4d1f8754efa1f0ac73ad3bf779df3c9b6/merged 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/b829b2d591976f8210f669a89275319fe652185ac6947c93a279d9f98f34865d/merged 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/380975e5ee2045b4b9ad7e0b8f33224fc4c0b621a5b808354f824bc22cb72963/merged 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/3b04a3c3ac6cc38738ee2b76ecf7e10d7f6ce649f827bc517cf3cfe2b34c7671/merged 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/var/lib/containers/storage/overlay/84351eea17438a3fbe25f7aa14876c2f38547d059127232cc8b851fc88a26c84/merged 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/run/netns 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T12:35:21.200 INFO:teuthology.orchestra.run.vm07.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T12:35:21.201 INFO:teuthology.orchestra.run.vm07.stdout:/home/ubuntu/cephtest/mnt.1 2026-03-10T12:35:21.201 
INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-10T12:35:21.201 DEBUG:teuthology.orchestra.run.vm07:> ls /sys/fs/fuse/connections
2026-03-10T12:35:21.259 INFO:teuthology.orchestra.run.vm07.stdout:90
2026-03-10T12:35:21.259 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [90]
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:> sudo stdin-killer -- python3 -c '
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:> import glob
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:> import re
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:> import os
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:> import subprocess
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:> def _find_admin_socket(client_name):
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>     asok_path = "/var/run/ceph/ceph-client.1.*.asok"
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>     files = glob.glob(asok_path)
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>     mountpoint = "/home/ubuntu/cephtest/mnt.1"
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>     # Given a non-glob path, it better be there
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>     if "*" not in asok_path:
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>         assert(len(files) == 1)
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>         return files[0]
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>     for f in files:
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>         pid = re.match(".*\.(\d+)\.asok$", f).group(1)
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>         if os.path.exists("/proc/{0}".format(pid)):
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>             with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f:
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>                 contents = proc_f.read()
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>                 if mountpoint in contents:
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>                     return f
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>     raise RuntimeError("Client socket {0} not found".format(client_name))
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:>
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:> print(_find_admin_socket("client.1"))
2026-03-10T12:35:21.259 DEBUG:teuthology.orchestra.run.vm07:> '
2026-03-10T12:35:21.355 INFO:teuthology.orchestra.run.vm07.stdout:/var/run/ceph/ceph-client.1.80379.asok
2026-03-10T12:35:21.357 INFO:teuthology.orchestra.run.vm07.stderr:2026-03-10T12:35:21 stdin-killer INFO: command exited with status 0: exiting normally with same code!
2026-03-10T12:35:21.363 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.1.80379.asok
2026-03-10T12:35:21.363 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-10T12:35:21.363 DEBUG:teuthology.orchestra.run.vm07:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.1.80379.asok status
2026-03-10T12:35:21.469 INFO:teuthology.orchestra.run.vm07.stdout:{
2026-03-10T12:35:21.469 INFO:teuthology.orchestra.run.vm07.stdout:    "metadata": {
2026-03-10T12:35:21.469 INFO:teuthology.orchestra.run.vm07.stdout:        "ceph_sha1": "5dd24139a1eada541a3bc16b6941c5dde975e26d",
2026-03-10T12:35:21.469 INFO:teuthology.orchestra.run.vm07.stdout:        "ceph_version": "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)",
2026-03-10T12:35:21.469 INFO:teuthology.orchestra.run.vm07.stdout:        "entity_id": "1",
2026-03-10T12:35:21.469 INFO:teuthology.orchestra.run.vm07.stdout:        "hostname": "vm07.local",
2026-03-10T12:35:21.469
INFO:teuthology.orchestra.run.vm07.stdout: "mount_point": "/home/ubuntu/cephtest/mnt.1", 2026-03-10T12:35:21.469 INFO:teuthology.orchestra.run.vm07.stdout: "pid": "80379", 2026-03-10T12:35:21.469 INFO:teuthology.orchestra.run.vm07.stdout: "root": "/" 2026-03-10T12:35:21.469 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-10T12:35:21.469 INFO:teuthology.orchestra.run.vm07.stdout: "dentry_count": 0, 2026-03-10T12:35:21.469 INFO:teuthology.orchestra.run.vm07.stdout: "dentry_pinned_count": 0, 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "id": 14524, 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "inst": { 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "name": { 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "type": "client", 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "num": 14524 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "addr": { 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "type": "v1", 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "addr": "192.168.144.1:0", 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "nonce": 4070542078 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: } 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "addr": { 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "type": "v1", 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "addr": "192.168.144.1:0", 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "nonce": 4070542078 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: }, 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "inst_str": "client.14524 192.168.144.1:0/4070542078", 
2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "addr_str": "192.168.144.1:0/4070542078", 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "inode_count": 1, 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "mds_epoch": 10, 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "osd_epoch": 39, 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "osd_epoch_barrier": 0, 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "blocklisted": false, 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout: "fs_name": "cephfs" 2026-03-10T12:35:21.470 INFO:teuthology.orchestra.run.vm07.stdout:} 2026-03-10T12:35:21.475 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:21.475 DEBUG:teuthology.orchestra.run.vm00:> stat --file-system '--printf=%T 2026-03-10T12:35:21.475 DEBUG:teuthology.orchestra.run.vm00:> ' -- /home/ubuntu/cephtest/mnt.0 2026-03-10T12:35:21.490 INFO:teuthology.orchestra.run.vm00.stdout:fuseblk 2026-03-10T12:35:21.490 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.0 2026-03-10T12:35:21.490 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:21.490 DEBUG:teuthology.orchestra.run.vm00:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.0 2026-03-10T12:35:21.559 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:21.559 DEBUG:teuthology.orchestra.run.vm07:> stat --file-system '--printf=%T 2026-03-10T12:35:21.559 DEBUG:teuthology.orchestra.run.vm07:> ' -- /home/ubuntu/cephtest/mnt.1 2026-03-10T12:35:21.578 INFO:teuthology.orchestra.run.vm07.stdout:fuseblk 2026-03-10T12:35:21.578 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.1 2026-03-10T12:35:21.579 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:35:21.579 DEBUG:teuthology.orchestra.run.vm07:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.1 
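[Editor's note] The inline python3 probe run earlier under stdin-killer locates the ceph-fuse admin socket by matching a live pid against the mountpoint. A standalone sketch of the same matching logic, with the pattern and mountpoint turned into parameters (the function name, signature, and the raw-string regex are my additions; the real run hardcodes /var/run/ceph/ceph-client.1.*.asok):

```python
import glob
import os
import re


def find_admin_socket(asok_glob, mountpoint):
    """Pick the admin socket whose owning process is alive and serves `mountpoint`."""
    files = glob.glob(asok_glob)

    # A literal (non-glob) path must resolve to exactly one socket file.
    if "*" not in asok_glob:
        assert len(files) == 1
        return files[0]

    for f in files:
        # The pid is the dotted field just before ".asok"; a raw string
        # avoids the invalid "\d" escape warning the logged script would emit.
        m = re.match(r".*\.(\d+)\.asok$", f)
        if m is None:
            continue
        cmdline = "/proc/{0}/cmdline".format(m.group(1))
        # Only accept sockets whose process still exists and whose command
        # line mentions the mountpoint (stale sockets are skipped).
        if os.path.exists(cmdline):
            with open(cmdline, "r") as proc_f:
                if mountpoint in proc_f.read():
                    return f
    raise RuntimeError("No live admin socket found for {0}".format(mountpoint))
```

In the run above this resolves to /var/run/ceph/ceph-client.1.80379.asok, which is then queried with `ceph --admin-daemon <path> status`.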
2026-03-10T12:35:21.646 INFO:teuthology.run_tasks:Running task print... 2026-03-10T12:35:21.649 INFO:teuthology.task.print:**** done client 2026-03-10T12:35:21.649 INFO:teuthology.run_tasks:Running task parallel... 2026-03-10T12:35:21.652 INFO:teuthology.task.parallel:starting parallel... 2026-03-10T12:35:21.652 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-10T12:35:21.652 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-10T12:35:21.652 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm00.local 2026-03-10T12:35:21.652 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mgr mgr/orchestrator/fail_fs false || true' 2026-03-10T12:35:21.653 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-10T12:35:21.653 INFO:teuthology.task.sequential:In sequential, running task workunit... 2026-03-10T12:35:21.654 INFO:tasks.workunit:Pulling workunits from ref 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b 2026-03-10T12:35:21.654 INFO:tasks.workunit:Making a separate scratch dir for every client... 
2026-03-10T12:35:21.654 INFO:tasks.workunit:timeout=3h 2026-03-10T12:35:21.654 INFO:tasks.workunit:cleanup=True 2026-03-10T12:35:21.654 DEBUG:teuthology.orchestra.run.vm00:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-10T12:35:21.676 INFO:teuthology.orchestra.run.vm00.stdout: File: /home/ubuntu/cephtest/mnt.0 2026-03-10T12:35:21.676 INFO:teuthology.orchestra.run.vm00.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-10T12:35:21.676 INFO:teuthology.orchestra.run.vm00.stdout:Device: 60h/96d Inode: 1 Links: 2 2026-03-10T12:35:21.676 INFO:teuthology.orchestra.run.vm00.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-10T12:35:21.676 INFO:teuthology.orchestra.run.vm00.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-10T12:35:21.676 INFO:teuthology.orchestra.run.vm00.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-10T12:35:21.676 INFO:teuthology.orchestra.run.vm00.stdout:Modify: 2026-03-10 12:35:11.655829796 +0000 2026-03-10T12:35:21.676 INFO:teuthology.orchestra.run.vm00.stdout:Change: 2026-03-10 12:35:21.557813556 +0000 2026-03-10T12:35:21.676 INFO:teuthology.orchestra.run.vm00.stdout: Birth: - 2026-03-10T12:35:21.676 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.0 2026-03-10T12:35:21.676 DEBUG:teuthology.orchestra.run.vm00:> cd -- /home/ubuntu/cephtest/mnt.0 && sudo install -d -m 0755 --owner=ubuntu -- client.0 2026-03-10T12:35:21.751 DEBUG:teuthology.orchestra.run.vm07:> stat -- /home/ubuntu/cephtest/mnt.1 2026-03-10T12:35:21.769 INFO:teuthology.orchestra.run.vm07.stdout: File: /home/ubuntu/cephtest/mnt.1 2026-03-10T12:35:21.769 INFO:teuthology.orchestra.run.vm07.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-10T12:35:21.769 INFO:teuthology.orchestra.run.vm07.stdout:Device: 5ah/90d Inode: 1 Links: 3 2026-03-10T12:35:21.770 INFO:teuthology.orchestra.run.vm07.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-10T12:35:21.770 
INFO:teuthology.orchestra.run.vm07.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-10T12:35:21.770 INFO:teuthology.orchestra.run.vm07.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-10T12:35:21.770 INFO:teuthology.orchestra.run.vm07.stdout:Modify: 2026-03-10 12:35:21.746107302 +0000 2026-03-10T12:35:21.770 INFO:teuthology.orchestra.run.vm07.stdout:Change: 2026-03-10 12:35:21.746107302 +0000 2026-03-10T12:35:21.770 INFO:teuthology.orchestra.run.vm07.stdout: Birth: - 2026-03-10T12:35:21.770 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.1 2026-03-10T12:35:21.770 DEBUG:teuthology.orchestra.run.vm07:> cd -- /home/ubuntu/cephtest/mnt.1 && sudo install -d -m 0755 --owner=ubuntu -- client.1 2026-03-10T12:35:21.813 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:21.851 DEBUG:teuthology.orchestra.run.vm00:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b 2026-03-10T12:35:21.851 DEBUG:teuthology.orchestra.run.vm07:> rm -rf /home/ubuntu/cephtest/clone.client.1 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.1 && cd /home/ubuntu/cephtest/clone.client.1 && git checkout 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b 2026-03-10T12:35:21.877 INFO:tasks.workunit.client.0.vm00.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 2026-03-10T12:35:21.910 INFO:tasks.workunit.client.1.vm07.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.1'... 
2026-03-10T12:35:22.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.075+0000 7efecc495700 1 -- 192.168.123.100:0/1195683701 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7efec41038d0 msgr2=0x7efec4105cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:22.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.075+0000 7efecc495700 1 --2- 192.168.123.100:0/1195683701 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7efec41038d0 0x7efec4105cb0 secure :-1 s=READY pgs=275 cs=0 l=1 rev1=1 crypto rx=0x7efec0009b00 tx=0x7efec0009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:22.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.076+0000 7efecc495700 1 -- 192.168.123.100:0/1195683701 shutdown_connections 2026-03-10T12:35:22.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.076+0000 7efecc495700 1 --2- 192.168.123.100:0/1195683701 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7efec41038d0 0x7efec4105cb0 unknown :-1 s=CLOSED pgs=275 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:22.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.076+0000 7efecc495700 1 --2- 192.168.123.100:0/1195683701 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efec4100fb0 0x7efec4103390 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:22.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.076+0000 7efecc495700 1 -- 192.168.123.100:0/1195683701 >> 192.168.123.100:0/1195683701 conn(0x7efec40fa990 msgr2=0x7efec40fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:22.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.076+0000 7efecc495700 1 -- 192.168.123.100:0/1195683701 shutdown_connections 2026-03-10T12:35:22.077 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.076+0000 7efecc495700 1 -- 192.168.123.100:0/1195683701 
wait complete. 2026-03-10T12:35:22.077 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.077+0000 7efecc495700 1 Processor -- start 2026-03-10T12:35:22.077 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.077+0000 7efecc495700 1 -- start start 2026-03-10T12:35:22.077 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.077+0000 7efecc495700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7efec4100fb0 0x7efec4197fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:22.078 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.077+0000 7efecc495700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efec41038d0 0x7efec4198510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:22.078 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.077+0000 7efecc495700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efec4198b30 con 0x7efec4100fb0 2026-03-10T12:35:22.078 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.077+0000 7efecc495700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efec4198c70 con 0x7efec41038d0 2026-03-10T12:35:22.078 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.077+0000 7efeca231700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7efec4100fb0 0x7efec4197fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:22.078 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.077+0000 7efeca231700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7efec4100fb0 0x7efec4197fd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 
says I am v2:192.168.123.100:57786/0 (socket says 192.168.123.100:57786) 2026-03-10T12:35:22.078 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.077+0000 7efeca231700 1 -- 192.168.123.100:0/1248550334 learned_addr learned my addr 192.168.123.100:0/1248550334 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:22.078 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.077+0000 7efec9a30700 1 --2- 192.168.123.100:0/1248550334 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efec41038d0 0x7efec4198510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:22.078 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.077+0000 7efeca231700 1 -- 192.168.123.100:0/1248550334 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efec41038d0 msgr2=0x7efec4198510 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:22.078 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.077+0000 7efeca231700 1 --2- 192.168.123.100:0/1248550334 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efec41038d0 0x7efec4198510 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:22.079 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.077+0000 7efeca231700 1 -- 192.168.123.100:0/1248550334 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efec00097e0 con 0x7efec4100fb0 2026-03-10T12:35:22.079 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.078+0000 7efeca231700 1 --2- 192.168.123.100:0/1248550334 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7efec4100fb0 0x7efec4197fd0 secure :-1 s=READY pgs=276 cs=0 l=1 rev1=1 crypto rx=0x7efeb400b700 tx=0x7efeb400bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T12:35:22.079 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.079+0000 7efebb7fe700 1 -- 192.168.123.100:0/1248550334 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efeb4010840 con 0x7efec4100fb0 2026-03-10T12:35:22.079 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.079+0000 7efecc495700 1 -- 192.168.123.100:0/1248550334 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efec4190a70 con 0x7efec4100fb0 2026-03-10T12:35:22.080 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.079+0000 7efecc495700 1 -- 192.168.123.100:0/1248550334 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efec4190fc0 con 0x7efec4100fb0 2026-03-10T12:35:22.080 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.079+0000 7efebb7fe700 1 -- 192.168.123.100:0/1248550334 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7efeb4010e80 con 0x7efec4100fb0 2026-03-10T12:35:22.081 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.080+0000 7efebb7fe700 1 -- 192.168.123.100:0/1248550334 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efeb400d590 con 0x7efec4100fb0 2026-03-10T12:35:22.081 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.080+0000 7efecc495700 1 -- 192.168.123.100:0/1248550334 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efea8005320 con 0x7efec4100fb0 2026-03-10T12:35:22.082 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.082+0000 7efebb7fe700 1 -- 192.168.123.100:0/1248550334 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7efeb400d770 con 0x7efec4100fb0 2026-03-10T12:35:22.082 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.082+0000 7efebb7fe700 1 --2- 192.168.123.100:0/1248550334 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7efeb006c5d0 0x7efeb006ea80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:22.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.082+0000 7efebb7fe700 1 -- 192.168.123.100:0/1248550334 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7efeb408c410 con 0x7efec4100fb0 2026-03-10T12:35:22.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.083+0000 7efebb7fe700 1 -- 192.168.123.100:0/1248550334 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7efeb405a6a0 con 0x7efec4100fb0 2026-03-10T12:35:22.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.083+0000 7efec9a30700 1 --2- 192.168.123.100:0/1248550334 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7efeb006c5d0 0x7efeb006ea80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:22.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.084+0000 7efec9a30700 1 --2- 192.168.123.100:0/1248550334 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7efeb006c5d0 0x7efeb006ea80 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7efec0005340 tx=0x7efec000b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:22.191 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.190+0000 7efecc495700 1 -- 192.168.123.100:0/1248550334 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command([{prefix=config set, name=mgr/orchestrator/fail_fs}] v 0) v1 -- 0x7efea8005f70 con 0x7efec4100fb0 
2026-03-10T12:35:22.198 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.197+0000 7efebb7fe700 1 -- 192.168.123.100:0/1248550334 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/orchestrator/fail_fs}]=0 v16) v1 ==== 126+0+0 (secure 0 0 0) 0x7efeb405a230 con 0x7efec4100fb0 2026-03-10T12:35:22.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.203+0000 7efecc495700 1 -- 192.168.123.100:0/1248550334 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7efeb006c5d0 msgr2=0x7efeb006ea80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:22.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.203+0000 7efecc495700 1 --2- 192.168.123.100:0/1248550334 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7efeb006c5d0 0x7efeb006ea80 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7efec0005340 tx=0x7efec000b540 comp rx=0 tx=0).stop 2026-03-10T12:35:22.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.204+0000 7efecc495700 1 -- 192.168.123.100:0/1248550334 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7efec4100fb0 msgr2=0x7efec4197fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:22.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.204+0000 7efecc495700 1 --2- 192.168.123.100:0/1248550334 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7efec4100fb0 0x7efec4197fd0 secure :-1 s=READY pgs=276 cs=0 l=1 rev1=1 crypto rx=0x7efeb400b700 tx=0x7efeb400bac0 comp rx=0 tx=0).stop 2026-03-10T12:35:22.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.204+0000 7efecc495700 1 -- 192.168.123.100:0/1248550334 shutdown_connections 2026-03-10T12:35:22.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.204+0000 7efecc495700 1 --2- 192.168.123.100:0/1248550334 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7efeb006c5d0
0x7efeb006ea80 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:22.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.204+0000 7efecc495700 1 --2- 192.168.123.100:0/1248550334 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7efec4100fb0 0x7efec4197fd0 unknown :-1 s=CLOSED pgs=276 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:22.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.204+0000 7efecc495700 1 --2- 192.168.123.100:0/1248550334 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7efec41038d0 0x7efec4198510 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:22.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.204+0000 7efecc495700 1 -- 192.168.123.100:0/1248550334 >> 192.168.123.100:0/1248550334 conn(0x7efec40fa990 msgr2=0x7efec40ff630 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:22.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.204+0000 7efecc495700 1 -- 192.168.123.100:0/1248550334 shutdown_connections 2026-03-10T12:35:22.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.204+0000 7efecc495700 1 -- 192.168.123.100:0/1248550334 wait complete. 2026-03-10T12:35:22.241 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:22 vm07.local ceph-mon[58582]: pgmap v86: 65 pgs: 65 active+clean; 453 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.8 KiB/s wr, 8 op/s 2026-03-10T12:35:22.258 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 
2026-03-10T12:35:22.258 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm00.local 2026-03-10T12:35:22.258 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force' 2026-03-10T12:35:22.460 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:22.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:22 vm00.local ceph-mon[50686]: pgmap v86: 65 pgs: 65 active+clean; 453 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.8 KiB/s wr, 8 op/s 2026-03-10T12:35:22.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:22 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:22.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:22 vm00.local ceph-mon[50686]: from='client.? 
192.168.123.100:0/1248550334' entity='client.admin' 2026-03-10T12:35:22.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:22 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:35:22.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:22 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:22.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:22 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:35:22.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:22 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:22.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:22 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:22.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:22 vm07.local ceph-mon[58582]: from='client.? 
192.168.123.100:0/1248550334' entity='client.admin' 2026-03-10T12:35:22.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:22 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:35:22.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:22 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:22.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:22 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:35:22.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:22 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:22.704 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.704+0000 7f10ecbe0700 1 -- 192.168.123.100:0/2792148793 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10e8103960 msgr2=0x7f10e8103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:22.704 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.704+0000 7f10ecbe0700 1 --2- 192.168.123.100:0/2792148793 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10e8103960 0x7f10e8103db0 secure :-1 s=READY pgs=277 cs=0 l=1 rev1=1 crypto rx=0x7f10d8009b00 tx=0x7f10d8009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:22.704 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.704+0000 7f10ecbe0700 1 -- 192.168.123.100:0/2792148793 shutdown_connections 2026-03-10T12:35:22.704 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.704+0000 7f10ecbe0700 1 --2- 192.168.123.100:0/2792148793 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10e8103960 0x7f10e8103db0 unknown :-1 s=CLOSED pgs=277 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:22.704 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.704+0000 7f10ecbe0700 1 --2- 192.168.123.100:0/2792148793 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f10e8102760 0x7f10e8102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:22.705 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.704+0000 7f10ecbe0700 1 -- 192.168.123.100:0/2792148793 >> 192.168.123.100:0/2792148793 conn(0x7f10e80fdcf0 msgr2=0x7f10e8100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:22.705 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.704+0000 7f10ecbe0700 1 -- 192.168.123.100:0/2792148793 shutdown_connections 2026-03-10T12:35:22.705 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.705+0000 7f10ecbe0700 1 -- 192.168.123.100:0/2792148793 wait complete. 2026-03-10T12:35:22.705 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.705+0000 7f10ecbe0700 1 Processor -- start 2026-03-10T12:35:22.705 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.705+0000 7f10ecbe0700 1 -- start start 2026-03-10T12:35:22.705 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.705+0000 7f10ecbe0700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10e8102760 0x7f10e8198030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:22.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.705+0000 7f10ecbe0700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f10e8103960 0x7f10e8198570 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:22.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.705+0000 7f10ecbe0700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f10e8198b90 con 
0x7f10e8102760 2026-03-10T12:35:22.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.705+0000 7f10ecbe0700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f10e8198cd0 con 0x7f10e8103960 2026-03-10T12:35:22.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.706+0000 7f10e659c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10e8102760 0x7f10e8198030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:22.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.706+0000 7f10e5d9b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f10e8103960 0x7f10e8198570 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:22.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.706+0000 7f10e5d9b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f10e8103960 0x7f10e8198570 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:39702/0 (socket says 192.168.123.100:39702) 2026-03-10T12:35:22.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.706+0000 7f10e5d9b700 1 -- 192.168.123.100:0/36847177 learned_addr learned my addr 192.168.123.100:0/36847177 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:22.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.706+0000 7f10e659c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10e8102760 0x7f10e8198030 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:57812/0 (socket says 192.168.123.100:57812) 
2026-03-10T12:35:22.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.706+0000 7f10e659c700 1 -- 192.168.123.100:0/36847177 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f10e8103960 msgr2=0x7f10e8198570 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:22.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.706+0000 7f10e659c700 1 --2- 192.168.123.100:0/36847177 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f10e8103960 0x7f10e8198570 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:22.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.706+0000 7f10e659c700 1 -- 192.168.123.100:0/36847177 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f10d80097e0 con 0x7f10e8102760 2026-03-10T12:35:22.706 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.706+0000 7f10e659c700 1 --2- 192.168.123.100:0/36847177 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10e8102760 0x7f10e8198030 secure :-1 s=READY pgs=278 cs=0 l=1 rev1=1 crypto rx=0x7f10d000b810 tx=0x7f10d000bb20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:22.708 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.707+0000 7f10df7fe700 1 -- 192.168.123.100:0/36847177 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f10d000d610 con 0x7f10e8102760 2026-03-10T12:35:22.708 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.707+0000 7f10df7fe700 1 -- 192.168.123.100:0/36847177 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f10d000dc50 con 0x7f10e8102760 2026-03-10T12:35:22.708 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.707+0000 7f10df7fe700 1 -- 192.168.123.100:0/36847177 <== mon.0 
v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f10d0017400 con 0x7f10e8102760 2026-03-10T12:35:22.708 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.707+0000 7f10ecbe0700 1 -- 192.168.123.100:0/36847177 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f10e819d780 con 0x7f10e8102760 2026-03-10T12:35:22.708 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.707+0000 7f10ecbe0700 1 -- 192.168.123.100:0/36847177 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f10e819dcd0 con 0x7f10e8102760 2026-03-10T12:35:22.711 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.708+0000 7f10df7fe700 1 -- 192.168.123.100:0/36847177 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f10d000d770 con 0x7f10e8102760 2026-03-10T12:35:22.711 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.708+0000 7f10ecbe0700 1 -- 192.168.123.100:0/36847177 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f10e8066e40 con 0x7f10e8102760 2026-03-10T12:35:22.711 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.708+0000 7f10df7fe700 1 --2- 192.168.123.100:0/36847177 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f10d406c630 0x7f10d406eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:22.711 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.708+0000 7f10df7fe700 1 -- 192.168.123.100:0/36847177 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f10d008b1c0 con 0x7f10e8102760 2026-03-10T12:35:22.711 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.711+0000 7f10e5d9b700 1 --2- 192.168.123.100:0/36847177 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f10d406c630 0x7f10d406eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:22.711 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.711+0000 7f10e5d9b700 1 --2- 192.168.123.100:0/36847177 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f10d406c630 0x7f10d406eae0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f10d8006010 tx=0x7f10d800b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:22.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.711+0000 7f10df7fe700 1 -- 192.168.123.100:0/36847177 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f10d00593d0 con 0x7f10e8102760 2026-03-10T12:35:22.821 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.818+0000 7f10ecbe0700 1 -- 192.168.123.100:0/36847177 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}] v 0) v1 -- 0x7f10e819dfb0 con 0x7f10e8102760 2026-03-10T12:35:22.821 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.819+0000 7f10df7fe700 1 -- 192.168.123.100:0/36847177 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}]=0 v16)=0 v16) v1 ==== 155+0+0 (secure 0 0 0) 0x7f10d0058f60 con 0x7f10e8102760 2026-03-10T12:35:22.823 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.823+0000 7f10ecbe0700 1 -- 192.168.123.100:0/36847177 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f10d406c630 msgr2=0x7f10d406eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:22.823 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.823+0000 7f10ecbe0700 1 --2- 192.168.123.100:0/36847177 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f10d406c630 0x7f10d406eae0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f10d8006010 tx=0x7f10d800b540 comp rx=0 tx=0).stop 2026-03-10T12:35:22.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.823+0000 7f10ecbe0700 1 -- 192.168.123.100:0/36847177 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10e8102760 msgr2=0x7f10e8198030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:22.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.823+0000 7f10ecbe0700 1 --2- 192.168.123.100:0/36847177 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10e8102760 0x7f10e8198030 secure :-1 s=READY pgs=278 cs=0 l=1 rev1=1 crypto rx=0x7f10d000b810 tx=0x7f10d000bb20 comp rx=0 tx=0).stop 2026-03-10T12:35:22.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.824+0000 7f10ecbe0700 1 -- 192.168.123.100:0/36847177 shutdown_connections 2026-03-10T12:35:22.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.824+0000 7f10ecbe0700 1 --2- 192.168.123.100:0/36847177 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f10d406c630 0x7f10d406eae0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:22.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.824+0000 7f10ecbe0700 1 --2- 192.168.123.100:0/36847177 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f10e8102760 0x7f10e8198030 unknown :-1 s=CLOSED pgs=278 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:22.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.824+0000 7f10ecbe0700 1 --2- 192.168.123.100:0/36847177 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f10e8103960 0x7f10e8198570 unknown :-1 
s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:22.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.824+0000 7f10ecbe0700 1 -- 192.168.123.100:0/36847177 >> 192.168.123.100:0/36847177 conn(0x7f10e80fdcf0 msgr2=0x7f10e81000b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:22.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.824+0000 7f10ecbe0700 1 -- 192.168.123.100:0/36847177 shutdown_connections 2026-03-10T12:35:22.824 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:22.824+0000 7f10ecbe0700 1 -- 192.168.123.100:0/36847177 wait complete. 2026-03-10T12:35:22.872 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force' 2026-03-10T12:35:23.021 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:23.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.297+0000 7f4f92b0b700 1 -- 192.168.123.100:0/2094469792 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f8c101720 msgr2=0x7f4f8c103ba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:23.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.297+0000 7f4f92b0b700 1 --2- 192.168.123.100:0/2094469792 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f8c101720 0x7f4f8c103ba0 secure :-1 s=READY pgs=279 cs=0 l=1 rev1=1 crypto rx=0x7f4f78009b50 tx=0x7f4f78009e60 comp rx=0 tx=0).stop 2026-03-10T12:35:23.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.298+0000 7f4f92b0b700 1 -- 192.168.123.100:0/2094469792 shutdown_connections 2026-03-10T12:35:23.299 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.298+0000 7f4f92b0b700 1 --2- 192.168.123.100:0/2094469792 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f8c101720 0x7f4f8c103ba0 unknown :-1 s=CLOSED pgs=279 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:23.299 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.298+0000 7f4f92b0b700 1 --2- 192.168.123.100:0/2094469792 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4f8c0fedc0 0x7f4f8c1011e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:23.299 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.298+0000 7f4f92b0b700 1 -- 192.168.123.100:0/2094469792 >> 192.168.123.100:0/2094469792 conn(0x7f4f8c0fa9b0 msgr2=0x7f4f8c0fce20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:23.299 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.298+0000 7f4f92b0b700 1 -- 192.168.123.100:0/2094469792 shutdown_connections 2026-03-10T12:35:23.299 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.299+0000 7f4f92b0b700 1 -- 192.168.123.100:0/2094469792 wait complete. 
2026-03-10T12:35:23.299 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.299+0000 7f4f92b0b700 1 Processor -- start 2026-03-10T12:35:23.299 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.299+0000 7f4f92b0b700 1 -- start start 2026-03-10T12:35:23.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.299+0000 7f4f92b0b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4f8c0fedc0 0x7f4f8c19c400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:23.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.300+0000 7f4f92b0b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f8c101720 0x7f4f8c19c940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:23.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.300+0000 7f4f92b0b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4f8c19cf60 con 0x7f4f8c101720 2026-03-10T12:35:23.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.300+0000 7f4f92b0b700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4f8c19d0a0 con 0x7f4f8c0fedc0 2026-03-10T12:35:23.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.300+0000 7f4f8bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f8c101720 0x7f4f8c19c940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:23.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.300+0000 7f4f8bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f8c101720 0x7f4f8c19c940 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:57844/0 (socket says 192.168.123.100:57844) 2026-03-10T12:35:23.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.300+0000 7f4f8bfff700 1 -- 192.168.123.100:0/930826183 learned_addr learned my addr 192.168.123.100:0/930826183 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:23.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.300+0000 7f4f908a7700 1 --2- 192.168.123.100:0/930826183 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4f8c0fedc0 0x7f4f8c19c400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:23.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.300+0000 7f4f8bfff700 1 -- 192.168.123.100:0/930826183 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4f8c0fedc0 msgr2=0x7f4f8c19c400 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:23.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.300+0000 7f4f8bfff700 1 --2- 192.168.123.100:0/930826183 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4f8c0fedc0 0x7f4f8c19c400 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:23.301 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.300+0000 7f4f8bfff700 1 -- 192.168.123.100:0/930826183 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4f780097e0 con 0x7f4f8c101720 2026-03-10T12:35:23.301 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.300+0000 7f4f908a7700 1 --2- 192.168.123.100:0/930826183 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4f8c0fedc0 0x7f4f8c19c400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T12:35:23.301 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.300+0000 7f4f8bfff700 1 --2- 192.168.123.100:0/930826183 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f8c101720 0x7f4f8c19c940 secure :-1 s=READY pgs=280 cs=0 l=1 rev1=1 crypto rx=0x7f4f78005950 tx=0x7f4f780057d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:23.301 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.301+0000 7f4f89ffb700 1 -- 192.168.123.100:0/930826183 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4f7801d070 con 0x7f4f8c101720 2026-03-10T12:35:23.301 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.301+0000 7f4f89ffb700 1 -- 192.168.123.100:0/930826183 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4f7800bb50 con 0x7f4f8c101720 2026-03-10T12:35:23.301 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.301+0000 7f4f89ffb700 1 -- 192.168.123.100:0/930826183 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4f7800f7d0 con 0x7f4f8c101720 2026-03-10T12:35:23.303 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.301+0000 7f4f92b0b700 1 -- 192.168.123.100:0/930826183 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4f8c1a1af0 con 0x7f4f8c101720 2026-03-10T12:35:23.303 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.301+0000 7f4f92b0b700 1 -- 192.168.123.100:0/930826183 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4f8c1a1fe0 con 0x7f4f8c101720 2026-03-10T12:35:23.303 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.302+0000 7f4f92b0b700 1 -- 192.168.123.100:0/930826183 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7f4f8c196600 con 0x7f4f8c101720 2026-03-10T12:35:23.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.303+0000 7f4f89ffb700 1 -- 192.168.123.100:0/930826183 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f4f7800bcc0 con 0x7f4f8c101720 2026-03-10T12:35:23.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.304+0000 7f4f89ffb700 1 --2- 192.168.123.100:0/930826183 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4f7c06c7a0 0x7f4f7c06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:23.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.304+0000 7f4f89ffb700 1 -- 192.168.123.100:0/930826183 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f4f7808dad0 con 0x7f4f8c101720 2026-03-10T12:35:23.306 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.305+0000 7f4f908a7700 1 --2- 192.168.123.100:0/930826183 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4f7c06c7a0 0x7f4f7c06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:23.306 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.306+0000 7f4f908a7700 1 --2- 192.168.123.100:0/930826183 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4f7c06c7a0 0x7f4f7c06ec50 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f4f80005950 tx=0x7f4f800058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:23.306 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.306+0000 7f4f89ffb700 1 -- 192.168.123.100:0/930826183 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f4f7805bd60 con 
0x7f4f8c101720 2026-03-10T12:35:23.414 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:23 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:35:23.415 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.413+0000 7f4f92b0b700 1 -- 192.168.123.100:0/930826183 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}] v 0) v1 -- 0x7f4f8c066e40 con 0x7f4f8c101720 2026-03-10T12:35:23.415 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.415+0000 7f4f89ffb700 1 -- 192.168.123.100:0/930826183 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}]=0 v16)=0 v16) v1 ==== 163+0+0 (secure 0 0 0) 0x7f4f780270e0 con 0x7f4f8c101720 2026-03-10T12:35:23.417 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.417+0000 7f4f92b0b700 1 -- 192.168.123.100:0/930826183 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4f7c06c7a0 msgr2=0x7f4f7c06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:23.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.417+0000 7f4f92b0b700 1 --2- 192.168.123.100:0/930826183 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4f7c06c7a0 0x7f4f7c06ec50 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f4f80005950 tx=0x7f4f800058e0 comp rx=0 tx=0).stop 2026-03-10T12:35:23.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.417+0000 7f4f92b0b700 1 -- 192.168.123.100:0/930826183 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f8c101720 msgr2=0x7f4f8c19c940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:23.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.417+0000 7f4f92b0b700 1 --2- 
192.168.123.100:0/930826183 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f8c101720 0x7f4f8c19c940 secure :-1 s=READY pgs=280 cs=0 l=1 rev1=1 crypto rx=0x7f4f78005950 tx=0x7f4f780057d0 comp rx=0 tx=0).stop 2026-03-10T12:35:23.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.417+0000 7f4f92b0b700 1 -- 192.168.123.100:0/930826183 shutdown_connections 2026-03-10T12:35:23.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.417+0000 7f4f92b0b700 1 --2- 192.168.123.100:0/930826183 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f4f7c06c7a0 0x7f4f7c06ec50 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:23.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.418+0000 7f4f92b0b700 1 --2- 192.168.123.100:0/930826183 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4f8c0fedc0 0x7f4f8c19c400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:23.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.418+0000 7f4f92b0b700 1 --2- 192.168.123.100:0/930826183 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f8c101720 0x7f4f8c19c940 unknown :-1 s=CLOSED pgs=280 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:23.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.418+0000 7f4f92b0b700 1 -- 192.168.123.100:0/930826183 >> 192.168.123.100:0/930826183 conn(0x7f4f8c0fa9b0 msgr2=0x7f4f8c0fce20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:23.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.418+0000 7f4f92b0b700 1 -- 192.168.123.100:0/930826183 shutdown_connections 2026-03-10T12:35:23.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.418+0000 7f4f92b0b700 1 -- 192.168.123.100:0/930826183 wait complete. 
2026-03-10T12:35:23.459 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set global log_to_journald false --force' 2026-03-10T12:35:23.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:23 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:35:23.627 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:23.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.877+0000 7f6220a44700 1 -- 192.168.123.100:0/447663590 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f621c103960 msgr2=0x7f621c103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:23.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.877+0000 7f6220a44700 1 --2- 192.168.123.100:0/447663590 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f621c103960 0x7f621c103db0 secure :-1 s=READY pgs=281 cs=0 l=1 rev1=1 crypto rx=0x7f620c009b00 tx=0x7f620c009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:23.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.878+0000 7f6220a44700 1 -- 192.168.123.100:0/447663590 shutdown_connections 2026-03-10T12:35:23.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.878+0000 7f6220a44700 1 --2- 192.168.123.100:0/447663590 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f621c103960 0x7f621c103db0 unknown :-1 s=CLOSED pgs=281 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:23.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.878+0000 7f6220a44700 1 --2- 
192.168.123.100:0/447663590 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f621c102760 0x7f621c102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:23.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.878+0000 7f6220a44700 1 -- 192.168.123.100:0/447663590 >> 192.168.123.100:0/447663590 conn(0x7f621c0fdcf0 msgr2=0x7f621c100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:23.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.879+0000 7f6220a44700 1 -- 192.168.123.100:0/447663590 shutdown_connections 2026-03-10T12:35:23.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.879+0000 7f6220a44700 1 -- 192.168.123.100:0/447663590 wait complete. 2026-03-10T12:35:23.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.879+0000 7f6220a44700 1 Processor -- start 2026-03-10T12:35:23.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.879+0000 7f6220a44700 1 -- start start 2026-03-10T12:35:23.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.879+0000 7f6220a44700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f621c102760 0x7f621c078b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:23.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.879+0000 7f6220a44700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f621c103960 0x7f621c079040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:23.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.879+0000 7f6220a44700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f621c0755f0 con 0x7f621c102760 2026-03-10T12:35:23.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.879+0000 7f6220a44700 1 -- --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f621c075760 con 0x7f621c103960 2026-03-10T12:35:23.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.879+0000 7f621a59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f621c102760 0x7f621c078b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:23.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.879+0000 7f621a59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f621c102760 0x7f621c078b00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:57858/0 (socket says 192.168.123.100:57858) 2026-03-10T12:35:23.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.879+0000 7f621a59c700 1 -- 192.168.123.100:0/1386683056 learned_addr learned my addr 192.168.123.100:0/1386683056 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:23.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.880+0000 7f6219d9b700 1 --2- 192.168.123.100:0/1386683056 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f621c103960 0x7f621c079040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:23.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.880+0000 7f621a59c700 1 -- 192.168.123.100:0/1386683056 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f621c103960 msgr2=0x7f621c079040 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:23.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.880+0000 7f621a59c700 1 --2- 192.168.123.100:0/1386683056 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f621c103960 0x7f621c079040 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:23.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.880+0000 7f621a59c700 1 -- 192.168.123.100:0/1386683056 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f620c0097e0 con 0x7f621c102760 2026-03-10T12:35:23.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.880+0000 7f621a59c700 1 --2- 192.168.123.100:0/1386683056 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f621c102760 0x7f621c078b00 secure :-1 s=READY pgs=282 cs=0 l=1 rev1=1 crypto rx=0x7f620400ba70 tx=0x7f620400be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:23.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.880+0000 7f62137fe700 1 -- 192.168.123.100:0/1386683056 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f620400c760 con 0x7f621c102760 2026-03-10T12:35:23.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.880+0000 7f62137fe700 1 -- 192.168.123.100:0/1386683056 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f620400cda0 con 0x7f621c102760 2026-03-10T12:35:23.881 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.880+0000 7f6220a44700 1 -- 192.168.123.100:0/1386683056 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f621c075a40 con 0x7f621c102760 2026-03-10T12:35:23.881 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.880+0000 7f6220a44700 1 -- 192.168.123.100:0/1386683056 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f621c075f90 con 0x7f621c102760 2026-03-10T12:35:23.882 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.881+0000 7f62137fe700 1 -- 
192.168.123.100:0/1386683056 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6204012550 con 0x7f621c102760 2026-03-10T12:35:23.884 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.881+0000 7f6220a44700 1 -- 192.168.123.100:0/1386683056 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f61fc005320 con 0x7f621c102760 2026-03-10T12:35:23.884 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.882+0000 7f62137fe700 1 -- 192.168.123.100:0/1386683056 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f6204012770 con 0x7f621c102760 2026-03-10T12:35:23.884 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.882+0000 7f62137fe700 1 --2- 192.168.123.100:0/1386683056 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f620806c4c0 0x7f620806e970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:23.884 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.882+0000 7f62137fe700 1 -- 192.168.123.100:0/1386683056 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f620408aa50 con 0x7f621c102760 2026-03-10T12:35:23.885 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.884+0000 7f6219d9b700 1 --2- 192.168.123.100:0/1386683056 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f620806c4c0 0x7f620806e970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:23.885 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.884+0000 7f62137fe700 1 -- 192.168.123.100:0/1386683056 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f6204059dd0 con 
0x7f621c102760 2026-03-10T12:35:23.885 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.885+0000 7f6219d9b700 1 --2- 192.168.123.100:0/1386683056 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f620806c4c0 0x7f620806e970 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f620c00b5c0 tx=0x7f620c005ea0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:23.996 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.996+0000 7f6220a44700 1 -- 192.168.123.100:0/1386683056 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command([{prefix=config set, name=log_to_journald}] v 0) v1 -- 0x7f61fc005f70 con 0x7f621c102760 2026-03-10T12:35:23.997 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.996+0000 7f62137fe700 1 -- 192.168.123.100:0/1386683056 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{prefix=config set, name=log_to_journald}]=0 v16)=0 v16) v1 ==== 135+0+0 (secure 0 0 0) 0x7f6204014e60 con 0x7f621c102760 2026-03-10T12:35:23.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.999+0000 7f6220a44700 1 -- 192.168.123.100:0/1386683056 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f620806c4c0 msgr2=0x7f620806e970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:23.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.999+0000 7f6220a44700 1 --2- 192.168.123.100:0/1386683056 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f620806c4c0 0x7f620806e970 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f620c00b5c0 tx=0x7f620c005ea0 comp rx=0 tx=0).stop 2026-03-10T12:35:23.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.999+0000 7f6220a44700 1 -- 192.168.123.100:0/1386683056 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f621c102760 msgr2=0x7f621c078b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T12:35:23.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.999+0000 7f6220a44700 1 --2- 192.168.123.100:0/1386683056 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f621c102760 0x7f621c078b00 secure :-1 s=READY pgs=282 cs=0 l=1 rev1=1 crypto rx=0x7f620400ba70 tx=0x7f620400be30 comp rx=0 tx=0).stop 2026-03-10T12:35:23.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.999+0000 7f6220a44700 1 -- 192.168.123.100:0/1386683056 shutdown_connections 2026-03-10T12:35:23.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.999+0000 7f6220a44700 1 --2- 192.168.123.100:0/1386683056 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f620806c4c0 0x7f620806e970 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:23.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.999+0000 7f6220a44700 1 --2- 192.168.123.100:0/1386683056 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f621c102760 0x7f621c078b00 unknown :-1 s=CLOSED pgs=282 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:24.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.999+0000 7f6220a44700 1 --2- 192.168.123.100:0/1386683056 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f621c103960 0x7f621c079040 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:24.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.999+0000 7f6220a44700 1 -- 192.168.123.100:0/1386683056 >> 192.168.123.100:0/1386683056 conn(0x7f621c0fdcf0 msgr2=0x7f621c106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:24.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.999+0000 7f6220a44700 1 -- 192.168.123.100:0/1386683056 shutdown_connections 2026-03-10T12:35:24.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:23.999+0000 7f6220a44700 1 -- 
192.168.123.100:0/1386683056 wait complete. 2026-03-10T12:35:24.048 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1' 2026-03-10T12:35:24.199 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:24.250 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:24 vm00.local ceph-mon[50686]: pgmap v87: 65 pgs: 65 active+clean; 453 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.4 KiB/s wr, 7 op/s 2026-03-10T12:35:24.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.486+0000 7f89bad93700 1 -- 192.168.123.100:0/1755086983 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b4071950 msgr2=0x7f89b4071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:24.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.486+0000 7f89bad93700 1 --2- 192.168.123.100:0/1755086983 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b4071950 0x7f89b4071d60 secure :-1 s=READY pgs=283 cs=0 l=1 rev1=1 crypto rx=0x7f89b00099c0 tx=0x7f89b0009cd0 comp rx=0 tx=0).stop 2026-03-10T12:35:24.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.486+0000 7f89bad93700 1 -- 192.168.123.100:0/1755086983 shutdown_connections 2026-03-10T12:35:24.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.486+0000 7f89bad93700 1 --2- 192.168.123.100:0/1755086983 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89b4072330 0x7f89b40770b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:24.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.486+0000 
7f89bad93700 1 --2- 192.168.123.100:0/1755086983 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b4071950 0x7f89b4071d60 unknown :-1 s=CLOSED pgs=283 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:24.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.486+0000 7f89bad93700 1 -- 192.168.123.100:0/1755086983 >> 192.168.123.100:0/1755086983 conn(0x7f89b406d1a0 msgr2=0x7f89b406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.486+0000 7f89bad93700 1 -- 192.168.123.100:0/1755086983 shutdown_connections 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.486+0000 7f89bad93700 1 -- 192.168.123.100:0/1755086983 wait complete. 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.487+0000 7f89bad93700 1 Processor -- start 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.487+0000 7f89bad93700 1 -- start start 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.487+0000 7f89bad93700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89b4072330 0x7f89b4082470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.487+0000 7f89bad93700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b40829b0 0x7f89b4082e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.487+0000 7f89bad93700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f89b4083e20 con 0x7f89b40829b0 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.487+0000 7f89bad93700 1 -- --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f89b41b2a90 con 0x7f89b4072330 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.487+0000 7f89b9590700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b40829b0 0x7f89b4082e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.487+0000 7f89b9590700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b40829b0 0x7f89b4082e20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:57862/0 (socket says 192.168.123.100:57862) 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.487+0000 7f89b9590700 1 -- 192.168.123.100:0/3632574449 learned_addr learned my addr 192.168.123.100:0/3632574449 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.487+0000 7f89b9d91700 1 --2- 192.168.123.100:0/3632574449 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89b4072330 0x7f89b4082470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.487+0000 7f89b9d91700 1 -- 192.168.123.100:0/3632574449 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b40829b0 msgr2=0x7f89b4082e20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.487+0000 7f89b9d91700 1 --2- 192.168.123.100:0/3632574449 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7f89b40829b0 0x7f89b4082e20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.487+0000 7f89b9d91700 1 -- 192.168.123.100:0/3632574449 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f89b00096b0 con 0x7f89b4072330 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.487+0000 7f89b9d91700 1 --2- 192.168.123.100:0/3632574449 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89b4072330 0x7f89b4082470 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f89b0009e90 tx=0x7f89b00057e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.488+0000 7f89aaffd700 1 -- 192.168.123.100:0/3632574449 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f89b0011bc0 con 0x7f89b4072330 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.488+0000 7f89bad93700 1 -- 192.168.123.100:0/3632574449 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f89b41b2c30 con 0x7f89b4072330 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.488+0000 7f89bad93700 1 -- 192.168.123.100:0/3632574449 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f89b41b3090 con 0x7f89b4072330 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.488+0000 7f89aaffd700 1 -- 192.168.123.100:0/3632574449 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f89b0011d20 con 0x7f89b4072330 2026-03-10T12:35:24.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.488+0000 7f89aaffd700 1 -- 
192.168.123.100:0/3632574449 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f89b0019a90 con 0x7f89b4072330 2026-03-10T12:35:24.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.489+0000 7f89aaffd700 1 -- 192.168.123.100:0/3632574449 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f89b0019bf0 con 0x7f89b4072330 2026-03-10T12:35:24.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.490+0000 7f89aaffd700 1 --2- 192.168.123.100:0/3632574449 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f89a006c7a0 0x7f89a006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:24.491 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.491+0000 7f89b9590700 1 --2- 192.168.123.100:0/3632574449 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f89a006c7a0 0x7f89a006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:24.491 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.491+0000 7f89b9590700 1 --2- 192.168.123.100:0/3632574449 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f89a006c7a0 0x7f89a006ec50 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f89ac00bfd0 tx=0x7f89ac00b040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:24.491 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.491+0000 7f89aaffd700 1 -- 192.168.123.100:0/3632574449 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f89b0015070 con 0x7f89b4072330 2026-03-10T12:35:24.492 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.492+0000 7f89bad93700 1 -- 192.168.123.100:0/3632574449 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8998005320 con 0x7f89b4072330 2026-03-10T12:35:24.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.495+0000 7f89aaffd700 1 -- 192.168.123.100:0/3632574449 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f89b005eac0 con 0x7f89b4072330 2026-03-10T12:35:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:24 vm07.local ceph-mon[58582]: pgmap v87: 65 pgs: 65 active+clean; 453 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.4 KiB/s wr, 7 op/s 2026-03-10T12:35:24.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.652+0000 7f89bad93700 1 -- 192.168.123.100:0/3632574449 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}) v1 -- 0x7f8998000c90 con 0x7f89a006c7a0 2026-03-10T12:35:24.665 INFO:teuthology.orchestra.run.vm00.stdout:Initiating upgrade to quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T12:35:24.665 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.665+0000 7f89aaffd700 1 -- 192.168.123.100:0/3632574449 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+89 (secure 0 0 0) 0x7f8998000c90 con 0x7f89a006c7a0 2026-03-10T12:35:24.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.668+0000 7f89a8ff9700 1 -- 192.168.123.100:0/3632574449 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f89a006c7a0 msgr2=0x7f89a006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:24.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.668+0000 7f89a8ff9700 1 --2- 192.168.123.100:0/3632574449 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f89a006c7a0 0x7f89a006ec50 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f89ac00bfd0 tx=0x7f89ac00b040 comp rx=0 tx=0).stop 2026-03-10T12:35:24.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.668+0000 7f89a8ff9700 1 -- 192.168.123.100:0/3632574449 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89b4072330 msgr2=0x7f89b4082470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:24.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.668+0000 7f89a8ff9700 1 --2- 192.168.123.100:0/3632574449 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89b4072330 0x7f89b4082470 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f89b0009e90 tx=0x7f89b00057e0 comp rx=0 tx=0).stop 2026-03-10T12:35:24.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.668+0000 7f89a8ff9700 1 -- 192.168.123.100:0/3632574449 shutdown_connections 2026-03-10T12:35:24.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.668+0000 7f89a8ff9700 1 --2- 192.168.123.100:0/3632574449 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f89a006c7a0 0x7f89a006ec50 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:24.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.668+0000 7f89a8ff9700 1 --2- 192.168.123.100:0/3632574449 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89b4072330 0x7f89b4082470 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:24.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.668+0000 7f89a8ff9700 1 --2- 192.168.123.100:0/3632574449 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b40829b0 0x7f89b4082e20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:24.668 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.668+0000 7f89a8ff9700 1 -- 192.168.123.100:0/3632574449 >> 192.168.123.100:0/3632574449 conn(0x7f89b406d1a0 msgr2=0x7f89b4070590 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:24.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.668+0000 7f89a8ff9700 1 -- 192.168.123.100:0/3632574449 shutdown_connections 2026-03-10T12:35:24.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:24.668+0000 7f89a8ff9700 1 -- 192.168.123.100:0/3632574449 wait complete. 2026-03-10T12:35:24.740 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-10T12:35:24.740 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm00.local 2026-03-10T12:35:24.740 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'while ceph orch upgrade status | jq '"'"'.in_progress'"'"' | grep true && ! 
ceph orch upgrade status | jq '"'"'.message'"'"' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done' 2026-03-10T12:35:24.964 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:35:25.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.324+0000 7fb8348fa700 1 -- 192.168.123.100:0/3518775728 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb830071a60 msgr2=0x7fb830071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:25.326 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.324+0000 7fb8348fa700 1 --2- 192.168.123.100:0/3518775728 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb830071a60 0x7fb830071e70 secure :-1 s=READY pgs=284 cs=0 l=1 rev1=1 crypto rx=0x7fb820009b00 tx=0x7fb820009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:25.326 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.325+0000 7fb8348fa700 1 -- 192.168.123.100:0/3518775728 shutdown_connections 2026-03-10T12:35:25.326 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.325+0000 7fb8348fa700 1 --2- 192.168.123.100:0/3518775728 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb830072440 0x7fb83010cfa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:25.326 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.325+0000 7fb8348fa700 1 --2- 192.168.123.100:0/3518775728 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb830071a60 0x7fb830071e70 unknown :-1 s=CLOSED pgs=284 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:25.326 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.325+0000 7fb8348fa700 1 -- 192.168.123.100:0/3518775728 >> 192.168.123.100:0/3518775728 conn(0x7fb83006d1a0 msgr2=0x7fb83006f5f0 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T12:35:25.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.327+0000 7fb8348fa700 1 -- 192.168.123.100:0/3518775728 shutdown_connections 2026-03-10T12:35:25.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.327+0000 7fb8348fa700 1 -- 192.168.123.100:0/3518775728 wait complete. 2026-03-10T12:35:25.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.327+0000 7fb8348fa700 1 Processor -- start 2026-03-10T12:35:25.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.327+0000 7fb8348fa700 1 -- start start 2026-03-10T12:35:25.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.327+0000 7fb8348fa700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb830071a60 0x7fb83019c120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:25.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.327+0000 7fb8348fa700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb830072440 0x7fb83019c660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:25.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.327+0000 7fb8348fa700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb83019cc80 con 0x7fb830072440 2026-03-10T12:35:25.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.327+0000 7fb8348fa700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb83019cdc0 con 0x7fb830071a60 2026-03-10T12:35:25.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.327+0000 7fb82e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb830072440 0x7fb83019c660 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T12:35:25.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.328+0000 7fb82e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb830072440 0x7fb83019c660 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:57892/0 (socket says 192.168.123.100:57892)
2026-03-10T12:35:25.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.328+0000 7fb82e59c700 1 -- 192.168.123.100:0/1132744322 learned_addr learned my addr 192.168.123.100:0/1132744322 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:35:25.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.328+0000 7fb82ed9d700 1 --2- 192.168.123.100:0/1132744322 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb830071a60 0x7fb83019c120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:35:25.329 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.328+0000 7fb82ed9d700 1 -- 192.168.123.100:0/1132744322 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb830072440 msgr2=0x7fb83019c660 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:25.329 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.328+0000 7fb82ed9d700 1 --2- 192.168.123.100:0/1132744322 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb830072440 0x7fb83019c660 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:25.329 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.328+0000 7fb82ed9d700 1 -- 192.168.123.100:0/1132744322 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb8200097e0 con 0x7fb830071a60
2026-03-10T12:35:25.329 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.328+0000 7fb82ed9d700 1 --2- 192.168.123.100:0/1132744322 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb830071a60 0x7fb83019c120 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fb820009b00 tx=0x7fb820004990 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:35:25.330 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.328+0000 7fb817fff700 1 -- 192.168.123.100:0/1132744322 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb82001d070 con 0x7fb830071a60
2026-03-10T12:35:25.330 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.328+0000 7fb817fff700 1 -- 192.168.123.100:0/1132744322 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb82000bc50 con 0x7fb830071a60
2026-03-10T12:35:25.330 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.329+0000 7fb817fff700 1 -- 192.168.123.100:0/1132744322 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb82000f700 con 0x7fb830071a60
2026-03-10T12:35:25.330 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.329+0000 7fb8348fa700 1 -- 192.168.123.100:0/1132744322 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb8301a1810 con 0x7fb830071a60
2026-03-10T12:35:25.330 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.329+0000 7fb8348fa700 1 -- 192.168.123.100:0/1132744322 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb8301a1c80 con 0x7fb830071a60
2026-03-10T12:35:25.330 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.330+0000 7fb815ffb700 1 -- 192.168.123.100:0/1132744322 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb8100052f0 con 0x7fb830071a60
2026-03-10T12:35:25.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.330+0000 7fb817fff700 1 -- 192.168.123.100:0/1132744322 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb820022470 con 0x7fb830071a60
2026-03-10T12:35:25.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.331+0000 7fb817fff700 1 --2- 192.168.123.100:0/1132744322 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb81806e690 0x7fb818070b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:35:25.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.331+0000 7fb817fff700 1 -- 192.168.123.100:0/1132744322 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb82008d7e0 con 0x7fb830071a60
2026-03-10T12:35:25.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.331+0000 7fb82e59c700 1 --2- 192.168.123.100:0/1132744322 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb81806e690 0x7fb818070b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:35:25.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.331+0000 7fb82e59c700 1 --2- 192.168.123.100:0/1132744322 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb81806e690 0x7fb818070b40 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7fb82400bd60 tx=0x7fb82400b480 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:35:25.333 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.332+0000 7fb817fff700 1 -- 192.168.123.100:0/1132744322 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb82005ba70 con 0x7fb830071a60
2026-03-10T12:35:25.511 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.510+0000 7fb815ffb700 1 -- 192.168.123.100:0/1132744322 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fb810000bc0 con 0x7fb81806e690
2026-03-10T12:35:25.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.513+0000 7fb817fff700 1 -- 192.168.123.100:0/1132744322 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7fb810000bc0 con 0x7fb81806e690
2026-03-10T12:35:25.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.516+0000 7fb8348fa700 1 -- 192.168.123.100:0/1132744322 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb81806e690 msgr2=0x7fb818070b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:25.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.516+0000 7fb8348fa700 1 --2- 192.168.123.100:0/1132744322 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb81806e690 0x7fb818070b40 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7fb82400bd60 tx=0x7fb82400b480 comp rx=0 tx=0).stop
2026-03-10T12:35:25.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.516+0000 7fb8348fa700 1 -- 192.168.123.100:0/1132744322 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb830071a60 msgr2=0x7fb83019c120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:25.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.516+0000 7fb8348fa700 1 --2- 192.168.123.100:0/1132744322 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb830071a60 0x7fb83019c120 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fb820009b00 tx=0x7fb820004990 comp rx=0 tx=0).stop
2026-03-10T12:35:25.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.516+0000 7fb8348fa700 1 -- 192.168.123.100:0/1132744322 shutdown_connections
2026-03-10T12:35:25.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.516+0000 7fb8348fa700 1 --2- 192.168.123.100:0/1132744322 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb81806e690 0x7fb818070b40 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:25.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.516+0000 7fb8348fa700 1 --2- 192.168.123.100:0/1132744322 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb830071a60 0x7fb83019c120 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:25.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.516+0000 7fb8348fa700 1 --2- 192.168.123.100:0/1132744322 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb830072440 0x7fb83019c660 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:25.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.516+0000 7fb8348fa700 1 -- 192.168.123.100:0/1132744322 >> 192.168.123.100:0/1132744322 conn(0x7fb83006d1a0 msgr2=0x7fb83010b7e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:35:25.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.517+0000 7fb8348fa700 1 -- 192.168.123.100:0/1132744322 shutdown_connections
2026-03-10T12:35:25.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.517+0000 7fb8348fa700 1 -- 192.168.123.100:0/1132744322 wait complete.
2026-03-10T12:35:25.530 INFO:teuthology.orchestra.run.vm00.stdout:true
2026-03-10T12:35:25.610 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.609+0000 7f26b9d85700 1 -- 192.168.123.100:0/163665245 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f26b4071950 msgr2=0x7f26b4071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:25.610 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.609+0000 7f26b9d85700 1 --2- 192.168.123.100:0/163665245 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f26b4071950 0x7f26b4071d60 secure :-1 s=READY pgs=285 cs=0 l=1 rev1=1 crypto rx=0x7f26a4008790 tx=0x7f26a4008aa0 comp rx=0 tx=0).stop
2026-03-10T12:35:25.610 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.609+0000 7f26b9d85700 1 -- 192.168.123.100:0/163665245 shutdown_connections
2026-03-10T12:35:25.610 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.609+0000 7f26b9d85700 1 --2- 192.168.123.100:0/163665245 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26b4072330 0x7f26b40770b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:25.610 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.609+0000 7f26b9d85700 1 --2- 192.168.123.100:0/163665245 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f26b4071950 0x7f26b4071d60 unknown :-1 s=CLOSED pgs=285 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:25.610 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.609+0000 7f26b9d85700 1 -- 192.168.123.100:0/163665245 >> 192.168.123.100:0/163665245 conn(0x7f26b406d1a0 msgr2=0x7f26b406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:35:25.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.609+0000 7f26b9d85700 1 -- 192.168.123.100:0/163665245 shutdown_connections
2026-03-10T12:35:25.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.609+0000 7f26b9d85700 1 -- 192.168.123.100:0/163665245 wait complete.
2026-03-10T12:35:25.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.610+0000 7f26b9d85700 1 Processor -- start
2026-03-10T12:35:25.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.610+0000 7f26b9d85700 1 -- start start
2026-03-10T12:35:25.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.610+0000 7f26b9d85700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26b4072330 0x7f26b4080360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:35:25.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.610+0000 7f26b9d85700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f26b40808a0 0x7f26b4080d10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:35:25.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.610+0000 7f26b9d85700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f26b412dd80 con 0x7f26b40808a0
2026-03-10T12:35:25.613 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.610+0000 7f26b9d85700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f26b412def0 con 0x7f26b4072330
2026-03-10T12:35:25.613 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.611+0000 7f26b3fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f26b40808a0 0x7f26b4080d10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:35:25.613 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.611+0000 7f26b3fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f26b40808a0 0x7f26b4080d10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:57918/0 (socket says 192.168.123.100:57918)
2026-03-10T12:35:25.613 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.611+0000 7f26b3fff700 1 -- 192.168.123.100:0/361268346 learned_addr learned my addr 192.168.123.100:0/361268346 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:35:25.613 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.612+0000 7f26b8d83700 1 --2- 192.168.123.100:0/361268346 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26b4072330 0x7f26b4080360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:35:25.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.612+0000 7f26b3fff700 1 -- 192.168.123.100:0/361268346 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26b4072330 msgr2=0x7f26b4080360 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:25.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.612+0000 7f26b3fff700 1 --2- 192.168.123.100:0/361268346 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26b4072330 0x7f26b4080360 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:25.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.612+0000 7f26b3fff700 1 -- 192.168.123.100:0/361268346 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f26a4008440 con 0x7f26b40808a0
2026-03-10T12:35:25.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.613+0000 7f26b3fff700 1 --2- 192.168.123.100:0/361268346 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f26b40808a0 0x7f26b4080d10 secure :-1 s=READY pgs=286 cs=0 l=1 rev1=1 crypto rx=0x7f26ac00f520 tx=0x7f26ac00f830 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:35:25.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.613+0000 7f26b1ffb700 1 -- 192.168.123.100:0/361268346 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f26ac010040 con 0x7f26b40808a0
2026-03-10T12:35:25.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.613+0000 7f26b9d85700 1 -- 192.168.123.100:0/361268346 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f26b412e170 con 0x7f26b40808a0
2026-03-10T12:35:25.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.613+0000 7f26b9d85700 1 -- 192.168.123.100:0/361268346 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f26b412e6c0 con 0x7f26b40808a0
2026-03-10T12:35:25.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.614+0000 7f26b1ffb700 1 -- 192.168.123.100:0/361268346 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f26ac009bf0 con 0x7f26b40808a0
2026-03-10T12:35:25.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.615+0000 7f26b1ffb700 1 -- 192.168.123.100:0/361268346 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f26ac0158b0 con 0x7f26b40808a0
2026-03-10T12:35:25.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.615+0000 7f26b1ffb700 1 -- 192.168.123.100:0/361268346 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f26ac015b20 con 0x7f26b40808a0
2026-03-10T12:35:25.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.616+0000 7f26b1ffb700 1 --2- 192.168.123.100:0/361268346 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f269c06c7a0 0x7f269c06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:35:25.617 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.616+0000 7f26b8d83700 1 --2- 192.168.123.100:0/361268346 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f269c06c7a0 0x7f269c06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:35:25.617 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.617+0000 7f26b8d83700 1 --2- 192.168.123.100:0/361268346 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f269c06c7a0 0x7f269c06ec50 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f26a4008760 tx=0x7f26a4011040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:35:25.617 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.617+0000 7f26b1ffb700 1 -- 192.168.123.100:0/361268346 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f26ac08cef0 con 0x7f26b40808a0
2026-03-10T12:35:25.617 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.617+0000 7f26b9d85700 1 -- 192.168.123.100:0/361268346 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f26a0005320 con 0x7f26b40808a0
2026-03-10T12:35:25.620 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.620+0000 7f26b1ffb700 1 -- 192.168.123.100:0/361268346 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f26ac05b180 con 0x7f26b40808a0
2026-03-10T12:35:25.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.771+0000 7f26b9d85700 1 -- 192.168.123.100:0/361268346 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f26a0000bf0 con 0x7f269c06c7a0
2026-03-10T12:35:25.773 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:25 vm00.local ceph-mon[50686]: from='client.24357 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T12:35:25.773 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:25 vm00.local ceph-mon[50686]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T12:35:25.773 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:25 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:35:25.773 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:25 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:35:25.773 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:25 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:35:25.773 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:25 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:35:25.773 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:25 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:35:25.773 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:25 vm00.local ceph-mon[50686]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T12:35:25.774 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:25 vm00.local ceph-mon[50686]: pgmap v88: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.0 KiB/s wr, 7 op/s
2026-03-10T12:35:25.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.774+0000 7f26b1ffb700 1 -- 192.168.123.100:0/361268346 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f26a0000bf0 con 0x7f269c06c7a0
2026-03-10T12:35:25.778 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.777+0000 7f26b9d85700 1 -- 192.168.123.100:0/361268346 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f269c06c7a0 msgr2=0x7f269c06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:25.778 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.777+0000 7f26b9d85700 1 --2- 192.168.123.100:0/361268346 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f269c06c7a0 0x7f269c06ec50 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f26a4008760 tx=0x7f26a4011040 comp rx=0 tx=0).stop
2026-03-10T12:35:25.778 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.777+0000 7f26b9d85700 1 -- 192.168.123.100:0/361268346 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f26b40808a0 msgr2=0x7f26b4080d10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:25.778 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.777+0000 7f26b9d85700 1 --2- 192.168.123.100:0/361268346 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f26b40808a0 0x7f26b4080d10 secure :-1 s=READY pgs=286 cs=0 l=1 rev1=1 crypto rx=0x7f26ac00f520 tx=0x7f26ac00f830 comp rx=0 tx=0).stop
2026-03-10T12:35:25.778 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.778+0000 7f26b9d85700 1 -- 192.168.123.100:0/361268346 shutdown_connections
2026-03-10T12:35:25.778 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.778+0000 7f26b9d85700 1 --2- 192.168.123.100:0/361268346 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f269c06c7a0 0x7f269c06ec50 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:25.778 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.778+0000 7f26b9d85700 1 --2- 192.168.123.100:0/361268346 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f26b4072330 0x7f26b4080360 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:25.778 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.778+0000 7f26b9d85700 1 --2- 192.168.123.100:0/361268346 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f26b40808a0 0x7f26b4080d10 unknown :-1 s=CLOSED pgs=286 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:25.778 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.778+0000 7f26b9d85700 1 -- 192.168.123.100:0/361268346 >> 192.168.123.100:0/361268346 conn(0x7f26b406d1a0 msgr2=0x7f26b40703d0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:35:25.778 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.778+0000 7f26b9d85700 1 -- 192.168.123.100:0/361268346 shutdown_connections
2026-03-10T12:35:25.778 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.778+0000 7f26b9d85700 1 -- 192.168.123.100:0/361268346 wait complete.
2026-03-10T12:35:25.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.893+0000 7f6736da8700 1 -- 192.168.123.100:0/2183658453 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6730074d80 msgr2=0x7f67300731e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:25.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.893+0000 7f6736da8700 1 --2- 192.168.123.100:0/2183658453 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6730074d80 0x7f67300731e0 secure :-1 s=READY pgs=287 cs=0 l=1 rev1=1 crypto rx=0x7f672c009b80 tx=0x7f672c009e90 comp rx=0 tx=0).stop
2026-03-10T12:35:25.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.893+0000 7f6736da8700 1 -- 192.168.123.100:0/2183658453 shutdown_connections
2026-03-10T12:35:25.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.893+0000 7f6736da8700 1 --2- 192.168.123.100:0/2183658453 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f67300737b0 0x7f6730073c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:25.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.893+0000 7f6736da8700 1 --2- 192.168.123.100:0/2183658453 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6730074d80 0x7f67300731e0 unknown :-1 s=CLOSED pgs=287 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:25.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.893+0000 7f6736da8700 1 -- 192.168.123.100:0/2183658453 >> 192.168.123.100:0/2183658453 conn(0x7f67300fb860 msgr2=0x7f67300fdc90 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:35:25.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.894+0000 7f6736da8700 1 -- 192.168.123.100:0/2183658453 shutdown_connections
2026-03-10T12:35:25.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.894+0000 7f6736da8700 1 -- 192.168.123.100:0/2183658453 wait complete.
2026-03-10T12:35:25.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.894+0000 7f6736da8700 1 Processor -- start
2026-03-10T12:35:25.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.894+0000 7f6736da8700 1 -- start start
2026-03-10T12:35:25.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.895+0000 7f6736da8700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f67300737b0 0x7f6730197b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:35:25.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.895+0000 7f6735da6700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f67300737b0 0x7f6730197b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:35:25.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.895+0000 7f6735da6700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f67300737b0 0x7f6730197b40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:57944/0 (socket says 192.168.123.100:57944)
2026-03-10T12:35:25.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.895+0000 7f6735da6700 1 -- 192.168.123.100:0/3551370099 learned_addr learned my addr 192.168.123.100:0/3551370099 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:35:25.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.895+0000 7f6736da8700 1 --2- 192.168.123.100:0/3551370099 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6730074d80 0x7f6730198080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:35:25.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.895+0000 7f6736da8700 1 -- 192.168.123.100:0/3551370099 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f67301986a0 con 0x7f67300737b0
2026-03-10T12:35:25.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.895+0000 7f6736da8700 1 -- 192.168.123.100:0/3551370099 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f67301987e0 con 0x7f6730074d80
2026-03-10T12:35:25.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.895+0000 7f67355a5700 1 --2- 192.168.123.100:0/3551370099 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6730074d80 0x7f6730198080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:35:25.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.897+0000 7f6735da6700 1 -- 192.168.123.100:0/3551370099 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6730074d80 msgr2=0x7f6730198080 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:35:25.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.897+0000 7f6735da6700 1 --2- 192.168.123.100:0/3551370099 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6730074d80 0x7f6730198080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:35:25.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.897+0000 7f6735da6700 1 -- 192.168.123.100:0/3551370099 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f672c0097e0 con 0x7f67300737b0
2026-03-10T12:35:25.898 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.897+0000 7f6735da6700 1 --2- 192.168.123.100:0/3551370099 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f67300737b0 0x7f6730197b40 secure :-1 s=READY pgs=288 cs=0 l=1 rev1=1 crypto rx=0x7f672c009b50 tx=0x7f672c00bae0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:35:25.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.899+0000 7f6726ffd700 1 -- 192.168.123.100:0/3551370099 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f672c01d070 con 0x7f67300737b0
2026-03-10T12:35:25.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.899+0000 7f6726ffd700 1 -- 192.168.123.100:0/3551370099 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f672c00be30 con 0x7f67300737b0
2026-03-10T12:35:25.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.899+0000 7f6726ffd700 1 -- 192.168.123.100:0/3551370099 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f672c00f890 con 0x7f67300737b0
2026-03-10T12:35:25.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.899+0000 7f6736da8700 1 -- 192.168.123.100:0/3551370099 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f673019d230 con 0x7f67300737b0
2026-03-10T12:35:25.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.899+0000 7f6736da8700 1 -- 192.168.123.100:0/3551370099 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f673019d720 con 0x7f67300737b0
2026-03-10T12:35:25.907 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.904+0000 7f6726ffd700 1 -- 192.168.123.100:0/3551370099 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f672c022b70 con 0x7f67300737b0
2026-03-10T12:35:25.907 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.904+0000 7f6736da8700 1 -- 192.168.123.100:0/3551370099 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f673004ea50 con 0x7f67300737b0
2026-03-10T12:35:25.912 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.904+0000 7f6726ffd700 1 --2- 192.168.123.100:0/3551370099 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f671c06c690 0x7f671c06eb40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:35:25.912 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.904+0000 7f6726ffd700 1 -- 192.168.123.100:0/3551370099 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f672c08d230 con 0x7f67300737b0
2026-03-10T12:35:25.912 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.907+0000 7f6726ffd700 1 -- 192.168.123.100:0/3551370099 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f672c05b440 con 0x7f67300737b0
2026-03-10T12:35:25.912 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.912+0000 7f67355a5700 1 --2- 192.168.123.100:0/3551370099 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f671c06c690 0x7f671c06eb40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:35:25.915 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:25.914+0000 7f67355a5700 1 --2- 192.168.123.100:0/3551370099 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f671c06c690 0x7f671c06eb40 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f6720005950 tx=0x7f672000b410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:35:26.056 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.055+0000 7f6736da8700 1 -- 192.168.123.100:0/3551370099 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f67301073f0 con 0x7f671c06c690
2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.062+0000 7f6726ffd700 1 -- 192.168.123.100:0/3551370099 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7f67301073f0 con 0x7f671c06c690
2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (99s) 8s ago 2m 22.8M - 0.25.0 c8568f914cd2 1897443e8fdf
2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (2m) 8s ago 2m 8074k - 18.2.0 dc2bc1663786 d9c35bbdf4cd
2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (117s) 9s ago 117s 8208k - 18.2.0 dc2bc1663786 2a98961ae9ca
2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (2m) 8s ago 2m 7407k - 18.2.0 dc2bc1663786 4726e39e7eb0
2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (116s) 9s ago 116s 7402k - 18.2.0 dc2bc1663786 f917dac1f418
2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (98s) 8s ago 2m 82.0M - 9.4.7 954c08fa6188 16c12a6ce1fc
2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (15s) 8s ago 15s 17.2M - 18.2.0 dc2bc1663786 13dfd2469732
2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (13s) 8s ago 13s 14.2M - 18.2.0 dc2bc1663786 d65368ac6dfa
2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (12s) 9s ago 12s 13.7M - 18.2.0 dc2bc1663786 1b9425223bd2
2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (14s) 9s ago 14s
18.6M - 18.2.0 dc2bc1663786 9c2b36fc1ac1 2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:9283,8765,8443 running (3m) 8s ago 3m 498M - 18.2.0 dc2bc1663786 8dc0a869be20 2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (112s) 9s ago 112s 448M - 18.2.0 dc2bc1663786 1662ba2e507c 2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (3m) 8s ago 3m 50.6M 2048M 18.2.0 dc2bc1663786 c8d836b38502 2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (110s) 9s ago 110s 44.3M 2048M 18.2.0 dc2bc1663786 7712955135fc 2026-03-10T12:35:26.063 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (2m) 8s ago 2m 14.4M - 1.5.0 0da6a335fe13 7a5c14d6ba46 2026-03-10T12:35:26.064 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (113s) 9s ago 113s 12.2M - 1.5.0 0da6a335fe13 2fac2415b763 2026-03-10T12:35:26.064 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (91s) 8s ago 91s 45.5M 4096M 18.2.0 dc2bc1663786 d5b05007694d 2026-03-10T12:35:26.064 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (81s) 8s ago 81s 45.9M 4096M 18.2.0 dc2bc1663786 5bc971fe4d49 2026-03-10T12:35:26.064 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (71s) 8s ago 71s 46.7M 4096M 18.2.0 dc2bc1663786 a5a89ccf847e 2026-03-10T12:35:26.064 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (61s) 9s ago 61s 44.8M 4096M 18.2.0 dc2bc1663786 0c6249fe3951 2026-03-10T12:35:26.064 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (51s) 9s ago 51s 43.9M 4096M 18.2.0 dc2bc1663786 5c66bfb63a83 2026-03-10T12:35:26.064 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (41s) 9s ago 41s 42.7M 4096M 18.2.0 dc2bc1663786 277bdfe4ec55 2026-03-10T12:35:26.064 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (93s) 8s ago 
2m 39.1M - 2.43.0 a07b618ecd1d 5d567c813f4b 2026-03-10T12:35:26.065 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.065+0000 7f6736da8700 1 -- 192.168.123.100:0/3551370099 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f671c06c690 msgr2=0x7f671c06eb40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.065 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.065+0000 7f6736da8700 1 --2- 192.168.123.100:0/3551370099 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f671c06c690 0x7f671c06eb40 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f6720005950 tx=0x7f672000b410 comp rx=0 tx=0).stop 2026-03-10T12:35:26.066 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.065+0000 7f6736da8700 1 -- 192.168.123.100:0/3551370099 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f67300737b0 msgr2=0x7f6730197b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.066 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.065+0000 7f6736da8700 1 --2- 192.168.123.100:0/3551370099 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f67300737b0 0x7f6730197b40 secure :-1 s=READY pgs=288 cs=0 l=1 rev1=1 crypto rx=0x7f672c009b50 tx=0x7f672c00bae0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.066 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.065+0000 7f6736da8700 1 -- 192.168.123.100:0/3551370099 shutdown_connections 2026-03-10T12:35:26.066 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.065+0000 7f6736da8700 1 --2- 192.168.123.100:0/3551370099 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f671c06c690 0x7f671c06eb40 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.066 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.065+0000 7f6736da8700 1 --2- 192.168.123.100:0/3551370099 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f67300737b0 0x7f6730197b40 unknown :-1 s=CLOSED pgs=288 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.066 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.065+0000 7f6736da8700 1 --2- 192.168.123.100:0/3551370099 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6730074d80 0x7f6730198080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.066 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.065+0000 7f6736da8700 1 -- 192.168.123.100:0/3551370099 >> 192.168.123.100:0/3551370099 conn(0x7f67300fb860 msgr2=0x7f67300fdac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:26.066 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.065+0000 7f6736da8700 1 -- 192.168.123.100:0/3551370099 shutdown_connections 2026-03-10T12:35:26.066 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.065+0000 7f6736da8700 1 -- 192.168.123.100:0/3551370099 wait complete. 
2026-03-10T12:35:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:25 vm07.local ceph-mon[58582]: from='client.24357 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:25 vm07.local ceph-mon[58582]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T12:35:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:25 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:25 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:35:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:25 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:35:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:25 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:35:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:25 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:35:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:25 vm07.local ceph-mon[58582]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T12:35:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:25 vm07.local ceph-mon[58582]: pgmap v88: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.0 KiB/s wr, 7 op/s 
2026-03-10T12:35:26.143 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.143+0000 7f5ef268a700 1 -- 192.168.123.100:0/2204163441 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5eec071e80 msgr2=0x7f5eec109d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.144 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.143+0000 7f5ef268a700 1 --2- 192.168.123.100:0/2204163441 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5eec071e80 0x7f5eec109d60 secure :-1 s=READY pgs=289 cs=0 l=1 rev1=1 crypto rx=0x7f5ee8009b00 tx=0x7f5ee8009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:26.144 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.143+0000 7f5ef268a700 1 -- 192.168.123.100:0/2204163441 shutdown_connections 2026-03-10T12:35:26.144 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.143+0000 7f5ef268a700 1 --2- 192.168.123.100:0/2204163441 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5eec072320 0x7f5eec10c2f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.144 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.143+0000 7f5ef268a700 1 --2- 192.168.123.100:0/2204163441 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5eec071e80 0x7f5eec109d60 unknown :-1 s=CLOSED pgs=289 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.144 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.143+0000 7f5ef268a700 1 -- 192.168.123.100:0/2204163441 >> 192.168.123.100:0/2204163441 conn(0x7f5eec06d1a0 msgr2=0x7f5eec06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:26.144 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.143+0000 7f5ef268a700 1 -- 192.168.123.100:0/2204163441 shutdown_connections 2026-03-10T12:35:26.144 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.143+0000 7f5ef268a700 1 -- 192.168.123.100:0/2204163441 
wait complete. 2026-03-10T12:35:26.144 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.144+0000 7f5ef268a700 1 Processor -- start 2026-03-10T12:35:26.145 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.144+0000 7f5ef268a700 1 -- start start 2026-03-10T12:35:26.145 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.144+0000 7f5ef268a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5eec071e80 0x7f5eec199680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:26.145 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.144+0000 7f5ef268a700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5eec199bc0 0x7f5eec1960e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:26.145 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.144+0000 7f5ef268a700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5eec196620 con 0x7f5eec071e80 2026-03-10T12:35:26.145 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.144+0000 7f5ef268a700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5eec196790 con 0x7f5eec199bc0 2026-03-10T12:35:26.145 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.145+0000 7f5ef1688700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5eec071e80 0x7f5eec199680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:26.145 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.145+0000 7f5ef0e87700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5eec199bc0 0x7f5eec1960e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T12:35:26.146 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.145+0000 7f5ef0e87700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5eec199bc0 0x7f5eec1960e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:39830/0 (socket says 192.168.123.100:39830) 2026-03-10T12:35:26.146 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.145+0000 7f5ef0e87700 1 -- 192.168.123.100:0/3641401202 learned_addr learned my addr 192.168.123.100:0/3641401202 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:26.146 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.145+0000 7f5ef1688700 1 -- 192.168.123.100:0/3641401202 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5eec199bc0 msgr2=0x7f5eec1960e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.146 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.145+0000 7f5ef1688700 1 --2- 192.168.123.100:0/3641401202 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5eec199bc0 0x7f5eec1960e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.146 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.145+0000 7f5ef1688700 1 -- 192.168.123.100:0/3641401202 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5ee80097e0 con 0x7f5eec071e80 2026-03-10T12:35:26.146 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.145+0000 7f5ef1688700 1 --2- 192.168.123.100:0/3641401202 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5eec071e80 0x7f5eec199680 secure :-1 s=READY pgs=290 cs=0 l=1 rev1=1 crypto rx=0x7f5ee8009ad0 tx=0x7f5ee800bb20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:26.146 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.146+0000 7f5ee27fc700 1 -- 192.168.123.100:0/3641401202 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ee801d070 con 0x7f5eec071e80 2026-03-10T12:35:26.147 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.146+0000 7f5ee27fc700 1 -- 192.168.123.100:0/3641401202 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5ee8022470 con 0x7f5eec071e80 2026-03-10T12:35:26.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.146+0000 7f5ee27fc700 1 -- 192.168.123.100:0/3641401202 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ee800f670 con 0x7f5eec071e80 2026-03-10T12:35:26.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.146+0000 7f5ef268a700 1 -- 192.168.123.100:0/3641401202 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5eec196990 con 0x7f5eec071e80 2026-03-10T12:35:26.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.146+0000 7f5ef268a700 1 -- 192.168.123.100:0/3641401202 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5eec1a01d0 con 0x7f5eec071e80 2026-03-10T12:35:26.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.147+0000 7f5ef268a700 1 -- 192.168.123.100:0/3641401202 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5eec04ea50 con 0x7f5eec071e80 2026-03-10T12:35:26.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.149+0000 7f5ee27fc700 1 -- 192.168.123.100:0/3641401202 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f5ee80225e0 con 0x7f5eec071e80 2026-03-10T12:35:26.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.149+0000 7f5ee27fc700 1 --2- 
192.168.123.100:0/3641401202 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5ed806c520 0x7f5ed806e9d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:26.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.149+0000 7f5ee27fc700 1 -- 192.168.123.100:0/3641401202 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f5ee808d6c0 con 0x7f5eec071e80 2026-03-10T12:35:26.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.150+0000 7f5ef0e87700 1 --2- 192.168.123.100:0/3641401202 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5ed806c520 0x7f5ed806e9d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:26.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.150+0000 7f5ee27fc700 1 -- 192.168.123.100:0/3641401202 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f5ee805b950 con 0x7f5eec071e80 2026-03-10T12:35:26.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.151+0000 7f5ef0e87700 1 --2- 192.168.123.100:0/3641401202 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5ed806c520 0x7f5ed806e9d0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f5ee4003f90 tx=0x7f5ee400b040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:26.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.310+0000 7f5ef268a700 1 -- 192.168.123.100:0/3641401202 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f5eec072320 con 0x7f5eec071e80 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.312+0000 7f5ee27fc700 1 -- 
192.168.123.100:0/3641401202 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f5ee8022890 con 0x7f5eec071e80 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:35:26.313 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:35:26.316 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.315+0000 7f5ef268a700 1 -- 192.168.123.100:0/3641401202 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] 
conn(0x7f5ed806c520 msgr2=0x7f5ed806e9d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.316 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.315+0000 7f5ef268a700 1 --2- 192.168.123.100:0/3641401202 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5ed806c520 0x7f5ed806e9d0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f5ee4003f90 tx=0x7f5ee400b040 comp rx=0 tx=0).stop 2026-03-10T12:35:26.316 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.315+0000 7f5ef268a700 1 -- 192.168.123.100:0/3641401202 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5eec071e80 msgr2=0x7f5eec199680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.316 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.315+0000 7f5ef268a700 1 --2- 192.168.123.100:0/3641401202 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5eec071e80 0x7f5eec199680 secure :-1 s=READY pgs=290 cs=0 l=1 rev1=1 crypto rx=0x7f5ee8009ad0 tx=0x7f5ee800bb20 comp rx=0 tx=0).stop 2026-03-10T12:35:26.316 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.316+0000 7f5ef268a700 1 -- 192.168.123.100:0/3641401202 shutdown_connections 2026-03-10T12:35:26.316 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.316+0000 7f5ef268a700 1 --2- 192.168.123.100:0/3641401202 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5ed806c520 0x7f5ed806e9d0 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.316 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.316+0000 7f5ef268a700 1 --2- 192.168.123.100:0/3641401202 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5eec071e80 0x7f5eec199680 unknown :-1 s=CLOSED pgs=290 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.316 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.316+0000 7f5ef268a700 1 --2- 
192.168.123.100:0/3641401202 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5eec199bc0 0x7f5eec1960e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.316 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.316+0000 7f5ef268a700 1 -- 192.168.123.100:0/3641401202 >> 192.168.123.100:0/3641401202 conn(0x7f5eec06d1a0 msgr2=0x7f5eec10dc60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:26.316 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.316+0000 7f5ef268a700 1 -- 192.168.123.100:0/3641401202 shutdown_connections 2026-03-10T12:35:26.316 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.316+0000 7f5ef268a700 1 -- 192.168.123.100:0/3641401202 wait complete. 2026-03-10T12:35:26.389 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.388+0000 7f9fcdbcc700 1 -- 192.168.123.100:0/4071155469 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fc00a5cb0 msgr2=0x7f9fc00a6100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.389 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.388+0000 7f9fcdbcc700 1 --2- 192.168.123.100:0/4071155469 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fc00a5cb0 0x7f9fc00a6100 secure :-1 s=READY pgs=291 cs=0 l=1 rev1=1 crypto rx=0x7f9fc80669f0 tx=0x7f9fc80699f0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.389 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.388+0000 7f9fcdbcc700 1 -- 192.168.123.100:0/4071155469 shutdown_connections 2026-03-10T12:35:26.389 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.388+0000 7f9fcdbcc700 1 --2- 192.168.123.100:0/4071155469 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fc00a5cb0 0x7f9fc00a6100 unknown :-1 s=CLOSED pgs=291 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.389 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.388+0000 
7f9fcdbcc700 1 --2- 192.168.123.100:0/4071155469 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fc00a4bc0 0x7f9fc00a4fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.389 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.388+0000 7f9fcdbcc700 1 -- 192.168.123.100:0/4071155469 >> 192.168.123.100:0/4071155469 conn(0x7f9fc009f7b0 msgr2=0x7f9fc00a1c00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:26.389 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.389+0000 7f9fcdbcc700 1 -- 192.168.123.100:0/4071155469 shutdown_connections 2026-03-10T12:35:26.389 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.389+0000 7f9fcdbcc700 1 -- 192.168.123.100:0/4071155469 wait complete. 2026-03-10T12:35:26.389 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.389+0000 7f9fcdbcc700 1 Processor -- start 2026-03-10T12:35:26.389 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.389+0000 7f9fcdbcc700 1 -- start start 2026-03-10T12:35:26.390 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.389+0000 7f9fcdbcc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fc00a4bc0 0x7f9fc01423d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:26.390 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.389+0000 7f9fcdbcc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fc00a5cb0 0x7f9fc0142910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:26.390 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.389+0000 7f9fcdbcc700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9fc0142f30 con 0x7f9fc00a4bc0 2026-03-10T12:35:26.390 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.389+0000 7f9fcdbcc700 1 -- --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9fc0143070 con 0x7f9fc00a5cb0 2026-03-10T12:35:26.390 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.389+0000 7f9fccbca700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fc00a4bc0 0x7f9fc01423d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:26.390 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.390+0000 7f9fc7fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fc00a5cb0 0x7f9fc0142910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:26.390 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.390+0000 7f9fc7fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fc00a5cb0 0x7f9fc0142910 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:39846/0 (socket says 192.168.123.100:39846) 2026-03-10T12:35:26.390 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.390+0000 7f9fc7fff700 1 -- 192.168.123.100:0/1177160479 learned_addr learned my addr 192.168.123.100:0/1177160479 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:26.394 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.390+0000 7f9fccbca700 1 -- 192.168.123.100:0/1177160479 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fc00a5cb0 msgr2=0x7f9fc0142910 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.394 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.390+0000 7f9fccbca700 1 --2- 192.168.123.100:0/1177160479 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fc00a5cb0 0x7f9fc0142910 
unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.394 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.390+0000 7f9fccbca700 1 -- 192.168.123.100:0/1177160479 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9fc8067050 con 0x7f9fc00a4bc0 2026-03-10T12:35:26.394 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.390+0000 7f9fc7fff700 1 --2- 192.168.123.100:0/1177160479 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fc00a5cb0 0x7f9fc0142910 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T12:35:26.394 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.390+0000 7f9fccbca700 1 --2- 192.168.123.100:0/1177160479 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fc00a4bc0 0x7f9fc01423d0 secure :-1 s=READY pgs=292 cs=0 l=1 rev1=1 crypto rx=0x7f9fb400ca20 tx=0x7f9fb400cd30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:26.394 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.390+0000 7f9fc5ffb700 1 -- 192.168.123.100:0/1177160479 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9fb40041d0 con 0x7f9fc00a4bc0 2026-03-10T12:35:26.394 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.390+0000 7f9fc5ffb700 1 -- 192.168.123.100:0/1177160479 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9fb4004d10 con 0x7f9fc00a4bc0 2026-03-10T12:35:26.394 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.390+0000 7f9fc5ffb700 1 -- 192.168.123.100:0/1177160479 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9fb40076f0 con 0x7f9fc00a4bc0 2026-03-10T12:35:26.394 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.390+0000 7f9fcdbcc700 1 -- 192.168.123.100:0/1177160479 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9fc0147b20 con 0x7f9fc00a4bc0 2026-03-10T12:35:26.394 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.390+0000 7f9fcdbcc700 1 -- 192.168.123.100:0/1177160479 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9fc0148070 con 0x7f9fc00a4bc0 2026-03-10T12:35:26.394 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.391+0000 7f9fcdbcc700 1 -- 192.168.123.100:0/1177160479 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9fc0004f40 con 0x7f9fc00a4bc0 2026-03-10T12:35:26.395 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.395+0000 7f9fc5ffb700 1 -- 192.168.123.100:0/1177160479 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f9fb4004750 con 0x7f9fc00a4bc0 2026-03-10T12:35:26.396 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.395+0000 7f9fc5ffb700 1 --2- 192.168.123.100:0/1177160479 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9fb806c7a0 0x7f9fb806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:26.396 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.395+0000 7f9fc5ffb700 1 -- 192.168.123.100:0/1177160479 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f9fb408ba80 con 0x7f9fc00a4bc0 2026-03-10T12:35:26.396 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.395+0000 7f9fc7fff700 1 --2- 192.168.123.100:0/1177160479 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9fb806c7a0 0x7f9fb806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:26.396 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.395+0000 7f9fc5ffb700 1 -- 192.168.123.100:0/1177160479 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9fb408be60 con 0x7f9fc00a4bc0 2026-03-10T12:35:26.396 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.396+0000 7f9fc7fff700 1 --2- 192.168.123.100:0/1177160479 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9fb806c7a0 0x7f9fb806ec50 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f9fc8066860 tx=0x7f9fc8064010 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:26.538 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.537+0000 7f9fcdbcc700 1 -- 192.168.123.100:0/1177160479 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f9fc0148500 con 0x7f9fc00a4bc0 2026-03-10T12:35:26.538 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.538+0000 7f9fc5ffb700 1 -- 192.168.123.100:0/1177160479 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1836 (secure 0 0 0) 0x7f9fb4059d10 con 0x7f9fc00a4bc0 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:e11 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 
2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:epoch 10 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:35:17.532287+0000 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307} 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:failed 
2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons: 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:35:26.539 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:35:26.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.541+0000 
7f9fcdbcc700 1 -- 192.168.123.100:0/1177160479 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9fb806c7a0 msgr2=0x7f9fb806ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.541+0000 7f9fcdbcc700 1 --2- 192.168.123.100:0/1177160479 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9fb806c7a0 0x7f9fb806ec50 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f9fc8066860 tx=0x7f9fc8064010 comp rx=0 tx=0).stop 2026-03-10T12:35:26.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.541+0000 7f9fcdbcc700 1 -- 192.168.123.100:0/1177160479 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fc00a4bc0 msgr2=0x7f9fc01423d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.541+0000 7f9fcdbcc700 1 --2- 192.168.123.100:0/1177160479 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fc00a4bc0 0x7f9fc01423d0 secure :-1 s=READY pgs=292 cs=0 l=1 rev1=1 crypto rx=0x7f9fb400ca20 tx=0x7f9fb400cd30 comp rx=0 tx=0).stop 2026-03-10T12:35:26.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.541+0000 7f9fcdbcc700 1 -- 192.168.123.100:0/1177160479 shutdown_connections 2026-03-10T12:35:26.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.541+0000 7f9fcdbcc700 1 --2- 192.168.123.100:0/1177160479 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f9fb806c7a0 0x7f9fb806ec50 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.541+0000 7f9fcdbcc700 1 --2- 192.168.123.100:0/1177160479 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9fc00a4bc0 0x7f9fc01423d0 unknown :-1 s=CLOSED pgs=292 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:35:26.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.541+0000 7f9fcdbcc700 1 --2- 192.168.123.100:0/1177160479 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9fc00a5cb0 0x7f9fc0142910 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.541+0000 7f9fcdbcc700 1 -- 192.168.123.100:0/1177160479 >> 192.168.123.100:0/1177160479 conn(0x7f9fc009f7b0 msgr2=0x7f9fc00a8ee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:26.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.542+0000 7f9fcdbcc700 1 -- 192.168.123.100:0/1177160479 shutdown_connections 2026-03-10T12:35:26.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.542+0000 7f9fcdbcc700 1 -- 192.168.123.100:0/1177160479 wait complete. 2026-03-10T12:35:26.543 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 11 2026-03-10T12:35:26.620 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.618+0000 7f8e7bfd1700 1 -- 192.168.123.100:0/1868023835 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8e74103960 msgr2=0x7f8e74103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.620 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.618+0000 7f8e7bfd1700 1 --2- 192.168.123.100:0/1868023835 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8e74103960 0x7f8e74103db0 secure :-1 s=READY pgs=293 cs=0 l=1 rev1=1 crypto rx=0x7f8e70009b50 tx=0x7f8e70009e60 comp rx=0 tx=0).stop 2026-03-10T12:35:26.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.624+0000 7f8e7bfd1700 1 -- 192.168.123.100:0/1868023835 shutdown_connections 2026-03-10T12:35:26.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.624+0000 7f8e7bfd1700 1 --2- 192.168.123.100:0/1868023835 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7f8e74103960 0x7f8e74103db0 unknown :-1 s=CLOSED pgs=293 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.624+0000 7f8e7bfd1700 1 --2- 192.168.123.100:0/1868023835 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e74102760 0x7f8e74102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.624+0000 7f8e7bfd1700 1 -- 192.168.123.100:0/1868023835 >> 192.168.123.100:0/1868023835 conn(0x7f8e740fdcf0 msgr2=0x7f8e74100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:26.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.624+0000 7f8e7bfd1700 1 -- 192.168.123.100:0/1868023835 shutdown_connections 2026-03-10T12:35:26.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.624+0000 7f8e7bfd1700 1 -- 192.168.123.100:0/1868023835 wait complete. 
2026-03-10T12:35:26.625 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.624+0000 7f8e7bfd1700 1 Processor -- start 2026-03-10T12:35:26.625 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.625+0000 7f8e7bfd1700 1 -- start start 2026-03-10T12:35:26.625 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.625+0000 7f8e7bfd1700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8e74102760 0x7f8e7419c470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:26.625 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.625+0000 7f8e7bfd1700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e74103960 0x7f8e7419c9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:26.625 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.625+0000 7f8e7bfd1700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e7419cfd0 con 0x7f8e74102760 2026-03-10T12:35:26.625 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.625+0000 7f8e7bfd1700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e7419d110 con 0x7f8e74103960 2026-03-10T12:35:26.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.625+0000 7f8e7956c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e74103960 0x7f8e7419c9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:26.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.626+0000 7f8e7956c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e74103960 0x7f8e7419c9b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:39848/0 (socket says 192.168.123.100:39848) 2026-03-10T12:35:26.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.626+0000 7f8e7956c700 1 -- 192.168.123.100:0/2291497645 learned_addr learned my addr 192.168.123.100:0/2291497645 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:26.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.626+0000 7f8e79d6d700 1 --2- 192.168.123.100:0/2291497645 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8e74102760 0x7f8e7419c470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:26.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.626+0000 7f8e7956c700 1 -- 192.168.123.100:0/2291497645 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8e74102760 msgr2=0x7f8e7419c470 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.626+0000 7f8e7956c700 1 --2- 192.168.123.100:0/2291497645 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8e74102760 0x7f8e7419c470 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.626+0000 7f8e7956c700 1 -- 192.168.123.100:0/2291497645 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8e700097e0 con 0x7f8e74103960 2026-03-10T12:35:26.627 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.626+0000 7f8e79d6d700 1 --2- 192.168.123.100:0/2291497645 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8e74102760 0x7f8e7419c470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T12:35:26.627 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.627+0000 7f8e7956c700 1 --2- 192.168.123.100:0/2291497645 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e74103960 0x7f8e7419c9b0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f8e70005270 tx=0x7f8e70005710 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:26.629 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.627+0000 7f8e6affd700 1 -- 192.168.123.100:0/2291497645 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8e7001d070 con 0x7f8e74103960 2026-03-10T12:35:26.629 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.627+0000 7f8e6affd700 1 -- 192.168.123.100:0/2291497645 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8e7000bc30 con 0x7f8e74103960 2026-03-10T12:35:26.629 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.627+0000 7f8e6affd700 1 -- 192.168.123.100:0/2291497645 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8e700217c0 con 0x7f8e74103960 2026-03-10T12:35:26.629 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.627+0000 7f8e7bfd1700 1 -- 192.168.123.100:0/2291497645 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8e741a1b60 con 0x7f8e74103960 2026-03-10T12:35:26.629 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.627+0000 7f8e7bfd1700 1 -- 192.168.123.100:0/2291497645 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8e741a2000 con 0x7f8e74103960 2026-03-10T12:35:26.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.629+0000 7f8e6affd700 1 -- 192.168.123.100:0/2291497645 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f8e7000f460 con 
0x7f8e74103960 2026-03-10T12:35:26.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.629+0000 7f8e7bfd1700 1 -- 192.168.123.100:0/2291497645 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8e74066e40 con 0x7f8e74103960 2026-03-10T12:35:26.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.629+0000 7f8e6affd700 1 --2- 192.168.123.100:0/2291497645 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8e6006c6f0 0x7f8e6006eba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:26.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.629+0000 7f8e6affd700 1 -- 192.168.123.100:0/2291497645 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f8e7008cc00 con 0x7f8e74103960 2026-03-10T12:35:26.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.630+0000 7f8e79d6d700 1 --2- 192.168.123.100:0/2291497645 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8e6006c6f0 0x7f8e6006eba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:26.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.630+0000 7f8e79d6d700 1 --2- 192.168.123.100:0/2291497645 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8e6006c6f0 0x7f8e6006eba0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f8e64006fd0 tx=0x7f8e64009380 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:26.632 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.632+0000 7f8e6affd700 1 -- 192.168.123.100:0/2291497645 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8e7005ae40 
con 0x7f8e74103960 2026-03-10T12:35:26.752 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.750+0000 7f8e7bfd1700 1 -- 192.168.123.100:0/2291497645 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8e741082b0 con 0x7f8e6006c6f0 2026-03-10T12:35:26.752 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.751+0000 7f8e6affd700 1 -- 192.168.123.100:0/2291497645 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f8e741082b0 con 0x7f8e6006c6f0 2026-03-10T12:35:26.752 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:35:26.752 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-10T12:35:26.752 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true, 2026-03-10T12:35:26.752 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T12:35:26.752 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [], 2026-03-10T12:35:26.752 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "", 2026-03-10T12:35:26.753 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-10T12:35:26.753 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false 2026-03-10T12:35:26.753 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:35:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.754+0000 7f8e7bfd1700 1 -- 192.168.123.100:0/2291497645 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8e6006c6f0 msgr2=0x7f8e6006eba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.754+0000 7f8e7bfd1700 1 --2- 192.168.123.100:0/2291497645 
>> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8e6006c6f0 0x7f8e6006eba0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f8e64006fd0 tx=0x7f8e64009380 comp rx=0 tx=0).stop 2026-03-10T12:35:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.754+0000 7f8e7bfd1700 1 -- 192.168.123.100:0/2291497645 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e74103960 msgr2=0x7f8e7419c9b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.755 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.754+0000 7f8e7bfd1700 1 --2- 192.168.123.100:0/2291497645 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e74103960 0x7f8e7419c9b0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f8e70005270 tx=0x7f8e70005710 comp rx=0 tx=0).stop 2026-03-10T12:35:26.755 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.754+0000 7f8e7bfd1700 1 -- 192.168.123.100:0/2291497645 shutdown_connections 2026-03-10T12:35:26.755 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.754+0000 7f8e7bfd1700 1 --2- 192.168.123.100:0/2291497645 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8e6006c6f0 0x7f8e6006eba0 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.755 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.754+0000 7f8e7bfd1700 1 --2- 192.168.123.100:0/2291497645 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8e74102760 0x7f8e7419c470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.755 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.754+0000 7f8e7bfd1700 1 --2- 192.168.123.100:0/2291497645 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8e74103960 0x7f8e7419c9b0 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.755 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.754+0000 7f8e7bfd1700 1 -- 192.168.123.100:0/2291497645 >> 192.168.123.100:0/2291497645 conn(0x7f8e740fdcf0 msgr2=0x7f8e74106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:26.755 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.754+0000 7f8e7bfd1700 1 -- 192.168.123.100:0/2291497645 shutdown_connections 2026-03-10T12:35:26.755 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.754+0000 7f8e7bfd1700 1 -- 192.168.123.100:0/2291497645 wait complete. 2026-03-10T12:35:26.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.824+0000 7fa06a590700 1 -- 192.168.123.100:0/2322875451 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa0641036f0 msgr2=0x7fa064105ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.824+0000 7fa06a590700 1 --2- 192.168.123.100:0/2322875451 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa0641036f0 0x7fa064105ad0 secure :-1 s=READY pgs=294 cs=0 l=1 rev1=1 crypto rx=0x7fa060009b00 tx=0x7fa060009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:26.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.825+0000 7fa06a590700 1 -- 192.168.123.100:0/2322875451 shutdown_connections 2026-03-10T12:35:26.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.825+0000 7fa06a590700 1 --2- 192.168.123.100:0/2322875451 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa0641036f0 0x7fa064105ad0 unknown :-1 s=CLOSED pgs=294 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.825+0000 7fa06a590700 1 --2- 192.168.123.100:0/2322875451 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa064100dd0 0x7fa0641031b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:35:26.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.825+0000 7fa06a590700 1 -- 192.168.123.100:0/2322875451 >> 192.168.123.100:0/2322875451 conn(0x7fa0640fa7b0 msgr2=0x7fa0640fcc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:26.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.825+0000 7fa06a590700 1 -- 192.168.123.100:0/2322875451 shutdown_connections 2026-03-10T12:35:26.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.825+0000 7fa06a590700 1 -- 192.168.123.100:0/2322875451 wait complete. 2026-03-10T12:35:26.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.825+0000 7fa06a590700 1 Processor -- start 2026-03-10T12:35:26.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.826+0000 7fa06a590700 1 -- start start 2026-03-10T12:35:26.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.826+0000 7fa06a590700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa064100dd0 0x7fa064197dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:26.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.826+0000 7fa06a590700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa0641036f0 0x7fa064198300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:26.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.826+0000 7fa068d8d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa0641036f0 0x7fa064198300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:26.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.826+0000 7fa068d8d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa0641036f0 0x7fa064198300 unknown :-1 s=HELLO_CONNECTING pgs=0 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:58004/0 (socket says 192.168.123.100:58004) 2026-03-10T12:35:26.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.826+0000 7fa068d8d700 1 -- 192.168.123.100:0/33790436 learned_addr learned my addr 192.168.123.100:0/33790436 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:26.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.826+0000 7fa06a590700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa064198890 con 0x7fa0641036f0 2026-03-10T12:35:26.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.826+0000 7fa06a590700 1 -- 192.168.123.100:0/33790436 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa0641989d0 con 0x7fa064100dd0 2026-03-10T12:35:26.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.827+0000 7fa068d8d700 1 -- 192.168.123.100:0/33790436 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa064100dd0 msgr2=0x7fa064197dc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.827+0000 7fa068d8d700 1 --2- 192.168.123.100:0/33790436 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa064100dd0 0x7fa064197dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.827+0000 7fa068d8d700 1 -- 192.168.123.100:0/33790436 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa0600097e0 con 0x7fa0641036f0 2026-03-10T12:35:26.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.827+0000 7fa068d8d700 1 --2- 192.168.123.100:0/33790436 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7fa0641036f0 0x7fa064198300 secure :-1 s=READY pgs=295 cs=0 l=1 rev1=1 crypto rx=0x7fa060004a00 tx=0x7fa060004ae0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:26.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.827+0000 7fa05a7fc700 1 -- 192.168.123.100:0/33790436 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa06001d070 con 0x7fa0641036f0 2026-03-10T12:35:26.828 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.827+0000 7fa05a7fc700 1 -- 192.168.123.100:0/33790436 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa06000bcd0 con 0x7fa0641036f0 2026-03-10T12:35:26.828 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.827+0000 7fa06a590700 1 -- 192.168.123.100:0/33790436 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa06419d430 con 0x7fa0641036f0 2026-03-10T12:35:26.828 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.827+0000 7fa06a590700 1 -- 192.168.123.100:0/33790436 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa06419d8f0 con 0x7fa0641036f0 2026-03-10T12:35:26.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.828+0000 7fa05a7fc700 1 -- 192.168.123.100:0/33790436 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa06000f980 con 0x7fa0641036f0 2026-03-10T12:35:26.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.830+0000 7fa06a590700 1 -- 192.168.123.100:0/33790436 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa0640fc3b0 con 0x7fa0641036f0 2026-03-10T12:35:26.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.830+0000 7fa05a7fc700 1 -- 192.168.123.100:0/33790436 <== mon.0 
v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fa060022ca0 con 0x7fa0641036f0 2026-03-10T12:35:26.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.830+0000 7fa05a7fc700 1 --2- 192.168.123.100:0/33790436 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa05006c680 0x7fa05006eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:26.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.830+0000 7fa05a7fc700 1 -- 192.168.123.100:0/33790436 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fa06008d000 con 0x7fa0641036f0 2026-03-10T12:35:26.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.834+0000 7fa05a7fc700 1 -- 192.168.123.100:0/33790436 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fa06005b240 con 0x7fa0641036f0 2026-03-10T12:35:26.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.834+0000 7fa06958e700 1 --2- 192.168.123.100:0/33790436 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa05006c680 0x7fa05006eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:26.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.835+0000 7fa06958e700 1 --2- 192.168.123.100:0/33790436 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa05006c680 0x7fa05006eb30 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fa054005fd0 tx=0x7fa054005d00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:26.980 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:26 vm00.local ceph-mon[50686]: from='client.24361 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", 
"target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:26.981 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:26 vm00.local ceph-mon[50686]: from='client.14552 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:26.981 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:26 vm00.local ceph-mon[50686]: from='client.14556 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:26.981 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:26 vm00.local ceph-mon[50686]: from='client.? 192.168.123.100:0/3641401202' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:35:26.981 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:26 vm00.local ceph-mon[50686]: from='client.? 192.168.123.100:0/1177160479' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:35:26.983 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.980+0000 7fa06a590700 1 -- 192.168.123.100:0/33790436 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fa06404ea50 con 0x7fa0641036f0 2026-03-10T12:35:26.983 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.983+0000 7fa05a7fc700 1 -- 192.168.123.100:0/33790436 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7fa060027030 con 0x7fa0641036f0 2026-03-10T12:35:26.983 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_OK 2026-03-10T12:35:26.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.985+0000 7fa06a590700 1 -- 192.168.123.100:0/33790436 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa05006c680 msgr2=0x7fa05006eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.985+0000 
7fa06a590700 1 --2- 192.168.123.100:0/33790436 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa05006c680 0x7fa05006eb30 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fa054005fd0 tx=0x7fa054005d00 comp rx=0 tx=0).stop 2026-03-10T12:35:26.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.985+0000 7fa06a590700 1 -- 192.168.123.100:0/33790436 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa0641036f0 msgr2=0x7fa064198300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:26.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.985+0000 7fa06a590700 1 --2- 192.168.123.100:0/33790436 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa0641036f0 0x7fa064198300 secure :-1 s=READY pgs=295 cs=0 l=1 rev1=1 crypto rx=0x7fa060004a00 tx=0x7fa060004ae0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.986+0000 7fa06a590700 1 -- 192.168.123.100:0/33790436 shutdown_connections 2026-03-10T12:35:26.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.986+0000 7fa06a590700 1 --2- 192.168.123.100:0/33790436 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fa05006c680 0x7fa05006eb30 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.986+0000 7fa06a590700 1 --2- 192.168.123.100:0/33790436 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa064100dd0 0x7fa064197dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:26.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.986+0000 7fa06a590700 1 --2- 192.168.123.100:0/33790436 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa0641036f0 0x7fa064198300 unknown :-1 s=CLOSED pgs=295 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:35:26.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.986+0000 7fa06a590700 1 -- 192.168.123.100:0/33790436 >> 192.168.123.100:0/33790436 conn(0x7fa0640fa7b0 msgr2=0x7fa0640ff450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:26.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.986+0000 7fa06a590700 1 -- 192.168.123.100:0/33790436 shutdown_connections 2026-03-10T12:35:26.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:26.986+0000 7fa06a590700 1 -- 192.168.123.100:0/33790436 wait complete. 2026-03-10T12:35:27.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:26 vm07.local ceph-mon[58582]: from='client.24361 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:27.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:26 vm07.local ceph-mon[58582]: from='client.14552 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:27.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:26 vm07.local ceph-mon[58582]: from='client.14556 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:27.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:26 vm07.local ceph-mon[58582]: from='client.? 192.168.123.100:0/3641401202' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:35:27.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:26 vm07.local ceph-mon[58582]: from='client.? 
192.168.123.100:0/1177160479' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:35:28.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:27 vm07.local ceph-mon[58582]: from='client.24377 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:28.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:27 vm07.local ceph-mon[58582]: from='client.? 192.168.123.100:0/33790436' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:35:28.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:27 vm07.local ceph-mon[58582]: pgmap v89: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.5 KiB/s wr, 2 op/s 2026-03-10T12:35:28.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:27 vm00.local ceph-mon[50686]: from='client.24377 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:28.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:27 vm00.local ceph-mon[50686]: from='client.? 
192.168.123.100:0/33790436' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:35:28.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:27 vm00.local ceph-mon[50686]: pgmap v89: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.5 KiB/s wr, 2 op/s 2026-03-10T12:35:30.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:30 vm00.local ceph-mon[50686]: pgmap v90: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s wr, 0 op/s 2026-03-10T12:35:30.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:30 vm07.local ceph-mon[58582]: pgmap v90: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s wr, 0 op/s 2026-03-10T12:35:32.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:32 vm07.local ceph-mon[58582]: pgmap v91: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s wr, 0 op/s 2026-03-10T12:35:32.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:32 vm00.local ceph-mon[50686]: pgmap v91: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 853 B/s wr, 0 op/s 2026-03-10T12:35:34.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:34 vm00.local ceph-mon[50686]: pgmap v92: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 767 B/s wr, 0 op/s 2026-03-10T12:35:34.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:34 vm07.local ceph-mon[58582]: pgmap v92: 65 pgs: 65 active+clean; 459 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 767 B/s wr, 0 op/s 2026-03-10T12:35:36.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:36 vm00.local ceph-mon[50686]: pgmap v93: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s wr, 1 op/s 2026-03-10T12:35:37.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:36 vm07.local ceph-mon[58582]: pgmap v93: 65 pgs: 65 
active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s wr, 1 op/s 2026-03-10T12:35:38.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:37 vm00.local ceph-mon[50686]: pgmap v94: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s 2026-03-10T12:35:38.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:37 vm07.local ceph-mon[58582]: pgmap v94: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s 2026-03-10T12:35:39.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:38 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:35:39.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:38 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:35:40.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:40 vm00.local ceph-mon[50686]: pgmap v95: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s 2026-03-10T12:35:40.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:40 vm07.local ceph-mon[58582]: pgmap v95: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s 2026-03-10T12:35:42.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:42 vm07.local ceph-mon[58582]: pgmap v96: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s 2026-03-10T12:35:42.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:42 vm00.local ceph-mon[50686]: pgmap v96: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s 2026-03-10T12:35:44.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:44 vm07.local ceph-mon[58582]: pgmap v97: 65 pgs: 
65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-10T12:35:44.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:44 vm00.local ceph-mon[50686]: pgmap v97: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-10T12:35:46.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:46 vm07.local ceph-mon[58582]: pgmap v98: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-10T12:35:46.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:46 vm00.local ceph-mon[50686]: pgmap v98: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-10T12:35:48.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:48 vm07.local ceph-mon[58582]: pgmap v99: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:48.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:48 vm00.local ceph-mon[50686]: pgmap v99: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:50.460 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:50 vm00.local ceph-mon[50686]: pgmap v100: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:50.492 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:50 vm07.local ceph-mon[58582]: pgmap v100: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:52.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:52 vm00.local ceph-mon[50686]: pgmap v101: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:52.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:52 vm07.local ceph-mon[58582]: pgmap v101: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:53.470 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:53 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:35:53.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:53 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:35:54.472 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:54 vm07.local ceph-mon[58582]: pgmap v102: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:54 vm00.local ceph-mon[50686]: pgmap v102: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:56.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:56 vm00.local ceph-mon[50686]: pgmap v103: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:56.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:56 vm07.local ceph-mon[58582]: pgmap v103: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:57.067 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.067+0000 7f9466cbc700 1 -- 192.168.123.100:0/2901142502 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f94600737b0 msgr2=0x7f9460073c20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.067 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.067+0000 7f9466cbc700 1 --2- 192.168.123.100:0/2901142502 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f94600737b0 0x7f9460073c20 secure :-1 s=READY pgs=296 cs=0 l=1 rev1=1 crypto rx=0x7f9454009b00 tx=0x7f9454009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:57.067 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.067+0000 7f9466cbc700 1 -- 
192.168.123.100:0/2901142502 shutdown_connections 2026-03-10T12:35:57.067 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.067+0000 7f9466cbc700 1 --2- 192.168.123.100:0/2901142502 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f94600737b0 0x7f9460073c20 unknown :-1 s=CLOSED pgs=296 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.067 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.067+0000 7f9466cbc700 1 --2- 192.168.123.100:0/2901142502 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9460074d80 0x7f94600731e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.067 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.067+0000 7f9466cbc700 1 -- 192.168.123.100:0/2901142502 >> 192.168.123.100:0/2901142502 conn(0x7f94600fbaa0 msgr2=0x7f94600fdf10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:57.067 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.067+0000 7f9466cbc700 1 -- 192.168.123.100:0/2901142502 shutdown_connections 2026-03-10T12:35:57.067 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.067+0000 7f9466cbc700 1 -- 192.168.123.100:0/2901142502 wait complete. 
2026-03-10T12:35:57.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.067+0000 7f9466cbc700 1 Processor -- start 2026-03-10T12:35:57.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.067+0000 7f9466cbc700 1 -- start start 2026-03-10T12:35:57.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.068+0000 7f9466cbc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f94600737b0 0x7f946019c430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:57.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.068+0000 7f9466cbc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9460074d80 0x7f946019c970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:57.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.068+0000 7f9464a58700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f94600737b0 0x7f946019c430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:57.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.068+0000 7f9464a58700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f94600737b0 0x7f946019c430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33210/0 (socket says 192.168.123.100:33210) 2026-03-10T12:35:57.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.068+0000 7f9464a58700 1 -- 192.168.123.100:0/3630533421 learned_addr learned my addr 192.168.123.100:0/3630533421 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:57.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.068+0000 7f9466cbc700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
-- mon_getmap magic: 0 v1 -- 0x7f946019cf90 con 0x7f94600737b0 2026-03-10T12:35:57.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.068+0000 7f9466cbc700 1 -- 192.168.123.100:0/3630533421 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f946019d0d0 con 0x7f9460074d80 2026-03-10T12:35:57.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.068+0000 7f945ffff700 1 --2- 192.168.123.100:0/3630533421 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9460074d80 0x7f946019c970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:57.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.068+0000 7f9464a58700 1 -- 192.168.123.100:0/3630533421 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9460074d80 msgr2=0x7f946019c970 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.068+0000 7f9464a58700 1 --2- 192.168.123.100:0/3630533421 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9460074d80 0x7f946019c970 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.068+0000 7f9464a58700 1 -- 192.168.123.100:0/3630533421 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f94540097e0 con 0x7f94600737b0 2026-03-10T12:35:57.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.070+0000 7f9464a58700 1 --2- 192.168.123.100:0/3630533421 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f94600737b0 0x7f946019c430 secure :-1 s=READY pgs=297 cs=0 l=1 rev1=1 crypto rx=0x7f945000b700 tx=0x7f945000bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-10T12:35:57.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.070+0000 7f945dffb700 1 -- 192.168.123.100:0/3630533421 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9450010840 con 0x7f94600737b0 2026-03-10T12:35:57.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.070+0000 7f9466cbc700 1 -- 192.168.123.100:0/3630533421 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f94601a1b80 con 0x7f94600737b0 2026-03-10T12:35:57.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.070+0000 7f9466cbc700 1 -- 192.168.123.100:0/3630533421 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f94601a20d0 con 0x7f94600737b0 2026-03-10T12:35:57.070 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.070+0000 7f945dffb700 1 -- 192.168.123.100:0/3630533421 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9450010e80 con 0x7f94600737b0 2026-03-10T12:35:57.070 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.070+0000 7f945dffb700 1 -- 192.168.123.100:0/3630533421 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f945000d590 con 0x7f94600737b0 2026-03-10T12:35:57.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.070+0000 7f945dffb700 1 -- 192.168.123.100:0/3630533421 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f945000d6f0 con 0x7f94600737b0 2026-03-10T12:35:57.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.071+0000 7f945dffb700 1 --2- 192.168.123.100:0/3630533421 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f944806c7a0 0x7f944806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:57.071 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.071+0000 7f945dffb700 1 -- 192.168.123.100:0/3630533421 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f945008b6f0 con 0x7f94600737b0 2026-03-10T12:35:57.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.071+0000 7f945ffff700 1 --2- 192.168.123.100:0/3630533421 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f944806c7a0 0x7f944806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:57.072 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.071+0000 7f9466cbc700 1 -- 192.168.123.100:0/3630533421 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f944c005320 con 0x7f94600737b0 2026-03-10T12:35:57.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.076+0000 7f945dffb700 1 -- 192.168.123.100:0/3630533421 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f9450014080 con 0x7f94600737b0 2026-03-10T12:35:57.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.076+0000 7f945ffff700 1 --2- 192.168.123.100:0/3630533421 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f944806c7a0 0x7f944806ec50 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f9454009fd0 tx=0x7f9454005c00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:57.199 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.198+0000 7f9466cbc700 1 -- 192.168.123.100:0/3630533421 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f944c000bf0 con 0x7f944806c7a0 
2026-03-10T12:35:57.199 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.199+0000 7f945dffb700 1 -- 192.168.123.100:0/3630533421 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f944c000bf0 con 0x7f944806c7a0 2026-03-10T12:35:57.202 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.202+0000 7f9466cbc700 1 -- 192.168.123.100:0/3630533421 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f944806c7a0 msgr2=0x7f944806ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.202 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.202+0000 7f9466cbc700 1 --2- 192.168.123.100:0/3630533421 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f944806c7a0 0x7f944806ec50 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f9454009fd0 tx=0x7f9454005c00 comp rx=0 tx=0).stop 2026-03-10T12:35:57.203 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.204+0000 7f9466cbc700 1 -- 192.168.123.100:0/3630533421 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f94600737b0 msgr2=0x7f946019c430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.204+0000 7f9466cbc700 1 --2- 192.168.123.100:0/3630533421 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f94600737b0 0x7f946019c430 secure :-1 s=READY pgs=297 cs=0 l=1 rev1=1 crypto rx=0x7f945000b700 tx=0x7f945000bac0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.204+0000 7f9466cbc700 1 -- 192.168.123.100:0/3630533421 shutdown_connections 2026-03-10T12:35:57.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.204+0000 7f9466cbc700 1 --2- 192.168.123.100:0/3630533421 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f944806c7a0 0x7f944806ec50 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.204+0000 7f9466cbc700 1 --2- 192.168.123.100:0/3630533421 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f94600737b0 0x7f946019c430 unknown :-1 s=CLOSED pgs=297 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.204+0000 7f9466cbc700 1 --2- 192.168.123.100:0/3630533421 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9460074d80 0x7f946019c970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.204+0000 7f9466cbc700 1 -- 192.168.123.100:0/3630533421 >> 192.168.123.100:0/3630533421 conn(0x7f94600fbaa0 msgr2=0x7f9460101ec0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:57.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.204+0000 7f9466cbc700 1 -- 192.168.123.100:0/3630533421 shutdown_connections 2026-03-10T12:35:57.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.204+0000 7f9466cbc700 1 -- 192.168.123.100:0/3630533421 wait complete. 
2026-03-10T12:35:57.214 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:35:57.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.277+0000 7f3c41ff9700 1 -- 192.168.123.100:0/2370107106 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3c3c106f10 msgr2=0x7f3c3c107320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.277+0000 7f3c41ff9700 1 --2- 192.168.123.100:0/2370107106 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3c3c106f10 0x7f3c3c107320 secure :-1 s=READY pgs=298 cs=0 l=1 rev1=1 crypto rx=0x7f3c2c009b00 tx=0x7f3c2c009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:57.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.277+0000 7f3c41ff9700 1 -- 192.168.123.100:0/2370107106 shutdown_connections 2026-03-10T12:35:57.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.277+0000 7f3c41ff9700 1 --2- 192.168.123.100:0/2370107106 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3c3c107d70 0x7f3c3c1081e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.277+0000 7f3c41ff9700 1 --2- 192.168.123.100:0/2370107106 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3c3c106f10 0x7f3c3c107320 unknown :-1 s=CLOSED pgs=298 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.277+0000 7f3c41ff9700 1 -- 192.168.123.100:0/2370107106 >> 192.168.123.100:0/2370107106 conn(0x7f3c3c069ed0 msgr2=0x7f3c3c0783e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:57.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.277+0000 7f3c41ff9700 1 -- 192.168.123.100:0/2370107106 shutdown_connections 2026-03-10T12:35:57.277 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.277+0000 7f3c41ff9700 1 -- 192.168.123.100:0/2370107106 wait complete. 2026-03-10T12:35:57.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.277+0000 7f3c41ff9700 1 Processor -- start 2026-03-10T12:35:57.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.277+0000 7f3c41ff9700 1 -- start start 2026-03-10T12:35:57.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.277+0000 7f3c41ff9700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3c3c106f10 0x7f3c3c19c4a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:57.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.278+0000 7f3c41ff9700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3c3c107d70 0x7f3c3c19c9e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:57.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.278+0000 7f3c41ff9700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3c3c19d000 con 0x7f3c3c107d70 2026-03-10T12:35:57.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.278+0000 7f3c41ff9700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3c3c19d140 con 0x7f3c3c106f10 2026-03-10T12:35:57.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.278+0000 7f3c3b7fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3c3c106f10 0x7f3c3c19c4a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:57.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.278+0000 7f3c3b7fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3c3c106f10 0x7f3c3c19c4a0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:49590/0 (socket says 192.168.123.100:49590) 2026-03-10T12:35:57.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.278+0000 7f3c3b7fe700 1 -- 192.168.123.100:0/2001961220 learned_addr learned my addr 192.168.123.100:0/2001961220 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:57.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.278+0000 7f3c32dff700 1 --2- 192.168.123.100:0/2001961220 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3c3c107d70 0x7f3c3c19c9e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:57.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.278+0000 7f3c3b7fe700 1 -- 192.168.123.100:0/2001961220 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3c3c107d70 msgr2=0x7f3c3c19c9e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.278+0000 7f3c3b7fe700 1 --2- 192.168.123.100:0/2001961220 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3c3c107d70 0x7f3c3c19c9e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.278+0000 7f3c3b7fe700 1 -- 192.168.123.100:0/2001961220 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3c2c0097e0 con 0x7f3c3c106f10 2026-03-10T12:35:57.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.278+0000 7f3c32dff700 1 --2- 192.168.123.100:0/2001961220 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3c3c107d70 0x7f3c3c19c9e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T12:35:57.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.278+0000 7f3c3b7fe700 1 --2- 192.168.123.100:0/2001961220 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3c3c106f10 0x7f3c3c19c4a0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f3c2c0052d0 tx=0x7f3c2c0049d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:57.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.279+0000 7f3c397fa700 1 -- 192.168.123.100:0/2001961220 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3c2c01d070 con 0x7f3c3c106f10 2026-03-10T12:35:57.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.279+0000 7f3c41ff9700 1 -- 192.168.123.100:0/2001961220 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3c3c1a1b90 con 0x7f3c3c106f10 2026-03-10T12:35:57.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.279+0000 7f3c41ff9700 1 -- 192.168.123.100:0/2001961220 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3c3c1a2020 con 0x7f3c3c106f10 2026-03-10T12:35:57.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.279+0000 7f3c41ff9700 1 -- 192.168.123.100:0/2001961220 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3c3c066e40 con 0x7f3c3c106f10 2026-03-10T12:35:57.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.280+0000 7f3c397fa700 1 -- 192.168.123.100:0/2001961220 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3c2c004500 con 0x7f3c3c106f10 2026-03-10T12:35:57.281 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.280+0000 7f3c397fa700 1 -- 192.168.123.100:0/2001961220 <== mon.1 
v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3c2c01d070 con 0x7f3c3c106f10 2026-03-10T12:35:57.284 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.283+0000 7f3c397fa700 1 -- 192.168.123.100:0/2001961220 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f3c2c00bc50 con 0x7f3c3c106f10 2026-03-10T12:35:57.284 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.283+0000 7f3c397fa700 1 --2- 192.168.123.100:0/2001961220 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3c2806c710 0x7f3c2806ebc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:57.284 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.283+0000 7f3c32dff700 1 --2- 192.168.123.100:0/2001961220 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3c2806c710 0x7f3c2806ebc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:57.284 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.284+0000 7f3c32dff700 1 --2- 192.168.123.100:0/2001961220 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3c2806c710 0x7f3c2806ebc0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f3c24009ba0 tx=0x7f3c24008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:57.284 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.284+0000 7f3c397fa700 1 -- 192.168.123.100:0/2001961220 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f3c2c030080 con 0x7f3c3c106f10 2026-03-10T12:35:57.285 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.285+0000 7f3c397fa700 1 -- 192.168.123.100:0/2001961220 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3c2c057b70 con 0x7f3c3c106f10 2026-03-10T12:35:57.412 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.410+0000 7f3c41ff9700 1 -- 192.168.123.100:0/2001961220 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3c3c10c6c0 con 0x7f3c2806c710 2026-03-10T12:35:57.412 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.411+0000 7f3c397fa700 1 -- 192.168.123.100:0/2001961220 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f3c3c10c6c0 con 0x7f3c2806c710 2026-03-10T12:35:57.415 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.416+0000 7f3c41ff9700 1 -- 192.168.123.100:0/2001961220 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3c2806c710 msgr2=0x7f3c2806ebc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.415 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.416+0000 7f3c41ff9700 1 --2- 192.168.123.100:0/2001961220 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3c2806c710 0x7f3c2806ebc0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f3c24009ba0 tx=0x7f3c24008040 comp rx=0 tx=0).stop 2026-03-10T12:35:57.415 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.416+0000 7f3c41ff9700 1 -- 192.168.123.100:0/2001961220 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3c3c106f10 msgr2=0x7f3c3c19c4a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.416 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.416+0000 7f3c41ff9700 1 --2- 192.168.123.100:0/2001961220 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3c3c106f10 0x7f3c3c19c4a0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f3c2c0052d0 tx=0x7f3c2c0049d0 comp rx=0 tx=0).stop 
2026-03-10T12:35:57.416 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.416+0000 7f3c41ff9700 1 -- 192.168.123.100:0/2001961220 shutdown_connections 2026-03-10T12:35:57.416 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.416+0000 7f3c41ff9700 1 --2- 192.168.123.100:0/2001961220 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3c2806c710 0x7f3c2806ebc0 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.416 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.416+0000 7f3c41ff9700 1 --2- 192.168.123.100:0/2001961220 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3c3c106f10 0x7f3c3c19c4a0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.416 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.416+0000 7f3c41ff9700 1 --2- 192.168.123.100:0/2001961220 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3c3c107d70 0x7f3c3c19c9e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.416 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.416+0000 7f3c41ff9700 1 -- 192.168.123.100:0/2001961220 >> 192.168.123.100:0/2001961220 conn(0x7f3c3c069ed0 msgr2=0x7f3c3c10afa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:57.416 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.416+0000 7f3c41ff9700 1 -- 192.168.123.100:0/2001961220 shutdown_connections 2026-03-10T12:35:57.416 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.416+0000 7f3c41ff9700 1 -- 192.168.123.100:0/2001961220 wait complete. 
2026-03-10T12:35:57.492 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.491+0000 7fd65e147700 1 -- 192.168.123.100:0/3394719126 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd658103720 msgr2=0x7fd658105b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.492 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.491+0000 7fd65e147700 1 --2- 192.168.123.100:0/3394719126 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd658103720 0x7fd658105b00 secure :-1 s=READY pgs=299 cs=0 l=1 rev1=1 crypto rx=0x7fd654009b00 tx=0x7fd654009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:57.492 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.491+0000 7fd65e147700 1 -- 192.168.123.100:0/3394719126 shutdown_connections 2026-03-10T12:35:57.492 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.491+0000 7fd65e147700 1 --2- 192.168.123.100:0/3394719126 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd658103720 0x7fd658105b00 unknown :-1 s=CLOSED pgs=299 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.492 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.491+0000 7fd65e147700 1 --2- 192.168.123.100:0/3394719126 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd658100e00 0x7fd6581031e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.492 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.491+0000 7fd65e147700 1 -- 192.168.123.100:0/3394719126 >> 192.168.123.100:0/3394719126 conn(0x7fd6580fa740 msgr2=0x7fd6580fcb90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:57.492 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.491+0000 7fd65e147700 1 -- 192.168.123.100:0/3394719126 shutdown_connections 2026-03-10T12:35:57.492 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.492+0000 7fd65e147700 1 -- 192.168.123.100:0/3394719126 
wait complete. 2026-03-10T12:35:57.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.493+0000 7fd65e147700 1 Processor -- start 2026-03-10T12:35:57.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.493+0000 7fd65e147700 1 -- start start 2026-03-10T12:35:57.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.493+0000 7fd65e147700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd658100e00 0x7fd658197d00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:57.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.493+0000 7fd65e147700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd658103720 0x7fd658198240 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:57.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.494+0000 7fd65e147700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd6581987d0 con 0x7fd658100e00 2026-03-10T12:35:57.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.494+0000 7fd65e147700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd658198940 con 0x7fd658103720 2026-03-10T12:35:57.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.494+0000 7fd65d145700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd658100e00 0x7fd658197d00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:57.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.494+0000 7fd65d145700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd658100e00 0x7fd658197d00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 
says I am v2:192.168.123.100:33266/0 (socket says 192.168.123.100:33266) 2026-03-10T12:35:57.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.494+0000 7fd65d145700 1 -- 192.168.123.100:0/3626743381 learned_addr learned my addr 192.168.123.100:0/3626743381 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:57.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.494+0000 7fd65c944700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd658103720 0x7fd658198240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:57.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.495+0000 7fd65d145700 1 -- 192.168.123.100:0/3626743381 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd658103720 msgr2=0x7fd658198240 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.495+0000 7fd65d145700 1 --2- 192.168.123.100:0/3626743381 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd658103720 0x7fd658198240 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.495+0000 7fd65d145700 1 -- 192.168.123.100:0/3626743381 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd6540097e0 con 0x7fd658100e00 2026-03-10T12:35:57.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.495+0000 7fd65d145700 1 --2- 192.168.123.100:0/3626743381 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd658100e00 0x7fd658197d00 secure :-1 s=READY pgs=300 cs=0 l=1 rev1=1 crypto rx=0x7fd64800c960 tx=0x7fd64800cc70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:57.499 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.496+0000 7fd64e7fc700 1 -- 192.168.123.100:0/3626743381 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd648007a10 con 0x7fd658100e00 2026-03-10T12:35:57.499 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.496+0000 7fd64e7fc700 1 -- 192.168.123.100:0/3626743381 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd648007b70 con 0x7fd658100e00 2026-03-10T12:35:57.499 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.496+0000 7fd64e7fc700 1 -- 192.168.123.100:0/3626743381 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd648018610 con 0x7fd658100e00 2026-03-10T12:35:57.499 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.496+0000 7fd65e147700 1 -- 192.168.123.100:0/3626743381 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd65819d390 con 0x7fd658100e00 2026-03-10T12:35:57.499 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.496+0000 7fd65e147700 1 -- 192.168.123.100:0/3626743381 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd65818e7a0 con 0x7fd658100e00 2026-03-10T12:35:57.499 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.497+0000 7fd64e7fc700 1 -- 192.168.123.100:0/3626743381 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fd648018840 con 0x7fd658100e00 2026-03-10T12:35:57.499 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.497+0000 7fd65e147700 1 -- 192.168.123.100:0/3626743381 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd6580fc320 con 0x7fd658100e00 2026-03-10T12:35:57.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.501+0000 7fd64e7fc700 1 --2- 
192.168.123.100:0/3626743381 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd64406c680 0x7fd64406eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:57.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.501+0000 7fd64e7fc700 1 -- 192.168.123.100:0/3626743381 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fd64808cb50 con 0x7fd658100e00 2026-03-10T12:35:57.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.501+0000 7fd64e7fc700 1 -- 192.168.123.100:0/3626743381 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fd64800f9c0 con 0x7fd658100e00 2026-03-10T12:35:57.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.502+0000 7fd65c944700 1 --2- 192.168.123.100:0/3626743381 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd64406c680 0x7fd64406eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:57.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.502+0000 7fd65c944700 1 --2- 192.168.123.100:0/3626743381 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd64406c680 0x7fd64406eb30 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fd654005f50 tx=0x7fd654005ec0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:57.613 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.612+0000 7fd65e147700 1 -- 192.168.123.100:0/3626743381 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fd658061190 con 0x7fd64406c680 2026-03-10T12:35:57.619 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.619+0000 
7fd64e7fc700 1 -- 192.168.123.100:0/3626743381 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7fd658061190 con 0x7fd64406c680 2026-03-10T12:35:57.619 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:35:57.619 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (2m) 39s ago 2m 22.8M - 0.25.0 c8568f914cd2 1897443e8fdf 2026-03-10T12:35:57.619 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (3m) 39s ago 3m 8074k - 18.2.0 dc2bc1663786 d9c35bbdf4cd 2026-03-10T12:35:57.619 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (2m) 40s ago 2m 8208k - 18.2.0 dc2bc1663786 2a98961ae9ca 2026-03-10T12:35:57.619 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (3m) 39s ago 3m 7407k - 18.2.0 dc2bc1663786 4726e39e7eb0 2026-03-10T12:35:57.619 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (2m) 40s ago 2m 7402k - 18.2.0 dc2bc1663786 f917dac1f418 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (2m) 39s ago 2m 82.0M - 9.4.7 954c08fa6188 16c12a6ce1fc 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (47s) 39s ago 46s 17.2M - 18.2.0 dc2bc1663786 13dfd2469732 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (45s) 39s ago 45s 14.2M - 18.2.0 dc2bc1663786 d65368ac6dfa 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (44s) 40s ago 43s 13.7M - 18.2.0 dc2bc1663786 1b9425223bd2 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (46s) 40s ago 46s 18.6M - 18.2.0 dc2bc1663786 9c2b36fc1ac1 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 
*:9283,8765,8443 running (3m) 39s ago 3m 498M - 18.2.0 dc2bc1663786 8dc0a869be20 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (2m) 40s ago 2m 448M - 18.2.0 dc2bc1663786 1662ba2e507c 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (3m) 39s ago 3m 50.6M 2048M 18.2.0 dc2bc1663786 c8d836b38502 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (2m) 40s ago 2m 44.3M 2048M 18.2.0 dc2bc1663786 7712955135fc 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (3m) 39s ago 3m 14.4M - 1.5.0 0da6a335fe13 7a5c14d6ba46 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (2m) 40s ago 2m 12.2M - 1.5.0 0da6a335fe13 2fac2415b763 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (2m) 39s ago 2m 45.5M 4096M 18.2.0 dc2bc1663786 d5b05007694d 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (112s) 39s ago 112s 45.9M 4096M 18.2.0 dc2bc1663786 5bc971fe4d49 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (102s) 39s ago 102s 46.7M 4096M 18.2.0 dc2bc1663786 a5a89ccf847e 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (93s) 40s ago 93s 44.8M 4096M 18.2.0 dc2bc1663786 0c6249fe3951 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (82s) 40s ago 82s 43.9M 4096M 18.2.0 dc2bc1663786 5c66bfb63a83 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (73s) 40s ago 73s 42.7M 4096M 18.2.0 dc2bc1663786 277bdfe4ec55 2026-03-10T12:35:57.620 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (2m) 39s ago 2m 39.1M - 2.43.0 a07b618ecd1d 5d567c813f4b 2026-03-10T12:35:57.621 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.622+0000 7fd65e147700 1 -- 192.168.123.100:0/3626743381 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd64406c680 msgr2=0x7fd64406eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.621 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.622+0000 7fd65e147700 1 --2- 192.168.123.100:0/3626743381 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd64406c680 0x7fd64406eb30 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fd654005f50 tx=0x7fd654005ec0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.621 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.622+0000 7fd65e147700 1 -- 192.168.123.100:0/3626743381 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd658100e00 msgr2=0x7fd658197d00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.621 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.622+0000 7fd65e147700 1 --2- 192.168.123.100:0/3626743381 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd658100e00 0x7fd658197d00 secure :-1 s=READY pgs=300 cs=0 l=1 rev1=1 crypto rx=0x7fd64800c960 tx=0x7fd64800cc70 comp rx=0 tx=0).stop 2026-03-10T12:35:57.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.622+0000 7fd65e147700 1 -- 192.168.123.100:0/3626743381 shutdown_connections 2026-03-10T12:35:57.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.622+0000 7fd65e147700 1 --2- 192.168.123.100:0/3626743381 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fd64406c680 0x7fd64406eb30 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.622+0000 7fd65e147700 1 --2- 192.168.123.100:0/3626743381 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd658100e00 0x7fd658197d00 unknown :-1 s=CLOSED 
pgs=300 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.622+0000 7fd65e147700 1 --2- 192.168.123.100:0/3626743381 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd658103720 0x7fd658198240 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.622+0000 7fd65e147700 1 -- 192.168.123.100:0/3626743381 >> 192.168.123.100:0/3626743381 conn(0x7fd6580fa740 msgr2=0x7fd6580ff480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:57.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.622+0000 7fd65e147700 1 -- 192.168.123.100:0/3626743381 shutdown_connections 2026-03-10T12:35:57.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.622+0000 7fd65e147700 1 -- 192.168.123.100:0/3626743381 wait complete. 2026-03-10T12:35:57.692 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.691+0000 7f967284f700 1 -- 192.168.123.100:0/3262772209 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f966c101760 msgr2=0x7f966c101bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.692 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.691+0000 7f967284f700 1 --2- 192.168.123.100:0/3262772209 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f966c101760 0x7f966c101bb0 secure :-1 s=READY pgs=301 cs=0 l=1 rev1=1 crypto rx=0x7f965c009b50 tx=0x7f965c009e60 comp rx=0 tx=0).stop 2026-03-10T12:35:57.694 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.694+0000 7f967284f700 1 -- 192.168.123.100:0/3262772209 shutdown_connections 2026-03-10T12:35:57.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.694+0000 7f967284f700 1 --2- 192.168.123.100:0/3262772209 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f966c101760 0x7f966c101bb0 
unknown :-1 s=CLOSED pgs=301 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.694+0000 7f967284f700 1 --2- 192.168.123.100:0/3262772209 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f966c100560 0x7f966c100970 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.694+0000 7f967284f700 1 -- 192.168.123.100:0/3262772209 >> 192.168.123.100:0/3262772209 conn(0x7f966c0fbb10 msgr2=0x7f966c0fdf40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:57.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.694+0000 7f967284f700 1 -- 192.168.123.100:0/3262772209 shutdown_connections 2026-03-10T12:35:57.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.694+0000 7f967284f700 1 -- 192.168.123.100:0/3262772209 wait complete. 2026-03-10T12:35:57.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.695+0000 7f967284f700 1 Processor -- start 2026-03-10T12:35:57.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.695+0000 7f967284f700 1 -- start start 2026-03-10T12:35:57.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.695+0000 7f967284f700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f966c100560 0x7f966c193c20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:57.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.695+0000 7f967284f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f966c101760 0x7f966c194160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:57.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.695+0000 7f967284f700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 
0 v1 -- 0x7f966c194780 con 0x7f966c101760 2026-03-10T12:35:57.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.695+0000 7f967284f700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f966c199190 con 0x7f966c100560 2026-03-10T12:35:57.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.697+0000 7f966b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f966c101760 0x7f966c194160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:57.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.697+0000 7f966b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f966c101760 0x7f966c194160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:37104/0 (socket says 192.168.123.100:37104) 2026-03-10T12:35:57.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.697+0000 7f966b7fe700 1 -- 192.168.123.100:0/350429443 learned_addr learned my addr 192.168.123.100:0/350429443 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:57.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.697+0000 7f966bfff700 1 --2- 192.168.123.100:0/350429443 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f966c100560 0x7f966c193c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:57.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.697+0000 7f966b7fe700 1 -- 192.168.123.100:0/350429443 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f966c100560 msgr2=0x7f966c193c20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.697 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.697+0000 7f966b7fe700 1 --2- 192.168.123.100:0/350429443 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f966c100560 0x7f966c193c20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.697+0000 7f966b7fe700 1 -- 192.168.123.100:0/350429443 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9654009710 con 0x7f966c101760 2026-03-10T12:35:57.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.697+0000 7f966b7fe700 1 --2- 192.168.123.100:0/350429443 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f966c101760 0x7f966c194160 secure :-1 s=READY pgs=302 cs=0 l=1 rev1=1 crypto rx=0x7f965c006010 tx=0x7f965c004ca0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:57.698 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.697+0000 7f96697fa700 1 -- 192.168.123.100:0/350429443 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f965c01d070 con 0x7f966c101760 2026-03-10T12:35:57.698 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.697+0000 7f96697fa700 1 -- 192.168.123.100:0/350429443 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f965c00bc50 con 0x7f966c101760 2026-03-10T12:35:57.698 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.697+0000 7f96697fa700 1 -- 192.168.123.100:0/350429443 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f965c00f7d0 con 0x7f966c101760 2026-03-10T12:35:57.698 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.697+0000 7f967284f700 1 -- 192.168.123.100:0/350429443 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_subscribe({mgrmap=0+}) v3 -- 0x7f965c0097e0 con 0x7f966c101760 2026-03-10T12:35:57.698 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.697+0000 7f967284f700 1 -- 192.168.123.100:0/350429443 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f966c199660 con 0x7f966c101760 2026-03-10T12:35:57.699 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.699+0000 7f967284f700 1 -- 192.168.123.100:0/350429443 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f966c066e40 con 0x7f966c101760 2026-03-10T12:35:57.702 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.699+0000 7f96697fa700 1 -- 192.168.123.100:0/350429443 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f965c022ac0 con 0x7f966c101760 2026-03-10T12:35:57.702 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.699+0000 7f96697fa700 1 --2- 192.168.123.100:0/350429443 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f965806c6a0 0x7f965806eb50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:57.702 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.699+0000 7f96697fa700 1 -- 192.168.123.100:0/350429443 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f965c08ce20 con 0x7f966c101760 2026-03-10T12:35:57.702 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.703+0000 7f96697fa700 1 -- 192.168.123.100:0/350429443 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f965c057670 con 0x7f966c101760 2026-03-10T12:35:57.702 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.703+0000 7f966bfff700 1 --2- 192.168.123.100:0/350429443 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f965806c6a0 0x7f965806eb50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:57.705 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.703+0000 7f966bfff700 1 --2- 192.168.123.100:0/350429443 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f965806c6a0 0x7f965806eb50 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f966c195100 tx=0x7f9654009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:57.847 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.846+0000 7f967284f700 1 -- 192.168.123.100:0/350429443 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f966c199910 con 0x7f966c101760 2026-03-10T12:35:57.847 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.847+0000 7f96697fa700 1 -- 192.168.123.100:0/350429443 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f965c05ac90 con 0x7f966c101760 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 2026-03-10T12:35:57.848 
INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:35:57.848 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:35:57.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.850+0000 7f967284f700 1 -- 192.168.123.100:0/350429443 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f965806c6a0 msgr2=0x7f965806eb50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.850+0000 7f967284f700 1 --2- 192.168.123.100:0/350429443 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f965806c6a0 0x7f965806eb50 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f966c195100 tx=0x7f9654009450 comp rx=0 tx=0).stop 2026-03-10T12:35:57.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.850+0000 7f967284f700 1 -- 192.168.123.100:0/350429443 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f966c101760 msgr2=0x7f966c194160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.850+0000 7f967284f700 1 --2- 192.168.123.100:0/350429443 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7f966c101760 0x7f966c194160 secure :-1 s=READY pgs=302 cs=0 l=1 rev1=1 crypto rx=0x7f965c006010 tx=0x7f965c004ca0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.851+0000 7f967284f700 1 -- 192.168.123.100:0/350429443 shutdown_connections 2026-03-10T12:35:57.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.851+0000 7f967284f700 1 --2- 192.168.123.100:0/350429443 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f965806c6a0 0x7f965806eb50 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.851+0000 7f967284f700 1 --2- 192.168.123.100:0/350429443 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f966c100560 0x7f966c193c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.851+0000 7f967284f700 1 --2- 192.168.123.100:0/350429443 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f966c101760 0x7f966c194160 unknown :-1 s=CLOSED pgs=302 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.851+0000 7f967284f700 1 -- 192.168.123.100:0/350429443 >> 192.168.123.100:0/350429443 conn(0x7f966c0fbb10 msgr2=0x7f966c104990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:57.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.851+0000 7f967284f700 1 -- 192.168.123.100:0/350429443 shutdown_connections 2026-03-10T12:35:57.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.851+0000 7f967284f700 1 -- 192.168.123.100:0/350429443 wait complete. 
2026-03-10T12:35:57.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.920+0000 7f38eaea6700 1 -- 192.168.123.100:0/2312584720 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38e4101760 msgr2=0x7f38e4101bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.920+0000 7f38eaea6700 1 --2- 192.168.123.100:0/2312584720 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38e4101760 0x7f38e4101bb0 secure :-1 s=READY pgs=303 cs=0 l=1 rev1=1 crypto rx=0x7f38d0009b50 tx=0x7f38d0009e60 comp rx=0 tx=0).stop 2026-03-10T12:35:57.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.921+0000 7f38eaea6700 1 -- 192.168.123.100:0/2312584720 shutdown_connections 2026-03-10T12:35:57.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.921+0000 7f38eaea6700 1 --2- 192.168.123.100:0/2312584720 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38e4101760 0x7f38e4101bb0 unknown :-1 s=CLOSED pgs=303 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.921+0000 7f38eaea6700 1 --2- 192.168.123.100:0/2312584720 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f38e4100560 0x7f38e4100970 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.921+0000 7f38eaea6700 1 -- 192.168.123.100:0/2312584720 >> 192.168.123.100:0/2312584720 conn(0x7f38e40fbb10 msgr2=0x7f38e40fdf40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:57.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.921+0000 7f38eaea6700 1 -- 192.168.123.100:0/2312584720 shutdown_connections 2026-03-10T12:35:57.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.921+0000 7f38eaea6700 1 -- 192.168.123.100:0/2312584720 
wait complete. 2026-03-10T12:35:57.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.921+0000 7f38eaea6700 1 Processor -- start 2026-03-10T12:35:57.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.921+0000 7f38eaea6700 1 -- start start 2026-03-10T12:35:57.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38eaea6700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f38e4100560 0x7f38e4193bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:57.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38eaea6700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38e4101760 0x7f38e4194100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:57.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38eaea6700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38e4194720 con 0x7f38e4101760 2026-03-10T12:35:57.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38eaea6700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38e4194860 con 0x7f38e4100560 2026-03-10T12:35:57.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38e3fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38e4101760 0x7f38e4194100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:57.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38e3fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38e4101760 0x7f38e4194100 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 
says I am v2:192.168.123.100:37124/0 (socket says 192.168.123.100:37124) 2026-03-10T12:35:57.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38e3fff700 1 -- 192.168.123.100:0/2415555897 learned_addr learned my addr 192.168.123.100:0/2415555897 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:57.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38e8c42700 1 --2- 192.168.123.100:0/2415555897 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f38e4100560 0x7f38e4193bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:57.923 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38e3fff700 1 -- 192.168.123.100:0/2415555897 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f38e4100560 msgr2=0x7f38e4193bc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:57.923 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38e3fff700 1 --2- 192.168.123.100:0/2415555897 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f38e4100560 0x7f38e4193bc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:57.923 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38e3fff700 1 -- 192.168.123.100:0/2415555897 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f38d00097e0 con 0x7f38e4101760 2026-03-10T12:35:57.923 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38e3fff700 1 --2- 192.168.123.100:0/2415555897 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38e4101760 0x7f38e4194100 secure :-1 s=READY pgs=304 cs=0 l=1 rev1=1 crypto rx=0x7f38d0006010 tx=0x7f38d0005250 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T12:35:57.924 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38e1ffb700 1 -- 192.168.123.100:0/2415555897 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38d001d070 con 0x7f38e4101760 2026-03-10T12:35:57.924 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38e1ffb700 1 -- 192.168.123.100:0/2415555897 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f38d0022470 con 0x7f38e4101760 2026-03-10T12:35:57.924 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38e1ffb700 1 -- 192.168.123.100:0/2415555897 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38d000f670 con 0x7f38e4101760 2026-03-10T12:35:57.924 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38eaea6700 1 -- 192.168.123.100:0/2415555897 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f38e41992b0 con 0x7f38e4101760 2026-03-10T12:35:57.924 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.923+0000 7f38eaea6700 1 -- 192.168.123.100:0/2415555897 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f38e41997a0 con 0x7f38e4101760 2026-03-10T12:35:57.925 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.924+0000 7f38eaea6700 1 -- 192.168.123.100:0/2415555897 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f38e4066e40 con 0x7f38e4101760 2026-03-10T12:35:57.928 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.925+0000 7f38e1ffb700 1 -- 192.168.123.100:0/2415555897 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f38d0022a50 con 0x7f38e4101760 2026-03-10T12:35:57.928 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.926+0000 7f38e1ffb700 1 --2- 192.168.123.100:0/2415555897 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f38d4070ae0 0x7f38d4072f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:57.928 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.926+0000 7f38e1ffb700 1 -- 192.168.123.100:0/2415555897 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f38d008ce80 con 0x7f38e4101760 2026-03-10T12:35:57.928 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.926+0000 7f38e8c42700 1 --2- 192.168.123.100:0/2415555897 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f38d4070ae0 0x7f38d4072f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:57.928 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.926+0000 7f38e8c42700 1 --2- 192.168.123.100:0/2415555897 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f38d4070ae0 0x7f38d4072f90 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f38e41015c0 tx=0x7f38d8009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:57.928 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:57.928+0000 7f38e1ffb700 1 -- 192.168.123.100:0/2415555897 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f38d005b1d0 con 0x7f38e4101760 2026-03-10T12:35:58.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.064+0000 7f38eaea6700 1 -- 192.168.123.100:0/2415555897 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f38e4199b70 con 0x7f38e4101760 2026-03-10T12:35:58.065 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.064+0000 7f38e1ffb700 1 -- 192.168.123.100:0/2415555897 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1836 (secure 0 0 0) 0x7f38d0027740 con 0x7f38e4101760 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:e11 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:epoch 10 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:35:17.532287+0000 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 2026-03-10T12:35:58.065 
INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307} 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:failed 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat 
{c=[1],r=[1],i=[7ff]}] 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:35:58.065 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:35:58.066 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons: 2026-03-10T12:35:58.066 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:35:58.066 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:35:58.066 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:35:58.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.068+0000 7f38eaea6700 1 -- 192.168.123.100:0/2415555897 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f38d4070ae0 msgr2=0x7f38d4072f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:58.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.068+0000 7f38eaea6700 1 --2- 192.168.123.100:0/2415555897 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f38d4070ae0 0x7f38d4072f90 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f38e41015c0 tx=0x7f38d8009450 comp rx=0 tx=0).stop 2026-03-10T12:35:58.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.068+0000 7f38eaea6700 1 -- 192.168.123.100:0/2415555897 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38e4101760 msgr2=0x7f38e4194100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:58.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.068+0000 7f38eaea6700 1 --2- 192.168.123.100:0/2415555897 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38e4101760 0x7f38e4194100 secure :-1 s=READY pgs=304 cs=0 
l=1 rev1=1 crypto rx=0x7f38d0006010 tx=0x7f38d0005250 comp rx=0 tx=0).stop 2026-03-10T12:35:58.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.069+0000 7f38eaea6700 1 -- 192.168.123.100:0/2415555897 shutdown_connections 2026-03-10T12:35:58.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.069+0000 7f38eaea6700 1 --2- 192.168.123.100:0/2415555897 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f38d4070ae0 0x7f38d4072f90 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.069+0000 7f38eaea6700 1 --2- 192.168.123.100:0/2415555897 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f38e4100560 0x7f38e4193bc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.069+0000 7f38eaea6700 1 --2- 192.168.123.100:0/2415555897 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38e4101760 0x7f38e4194100 unknown :-1 s=CLOSED pgs=304 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.069+0000 7f38eaea6700 1 -- 192.168.123.100:0/2415555897 >> 192.168.123.100:0/2415555897 conn(0x7f38e40fbb10 msgr2=0x7f38e4104990 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:58.068 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.069+0000 7f38eaea6700 1 -- 192.168.123.100:0/2415555897 shutdown_connections 2026-03-10T12:35:58.069 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.069+0000 7f38eaea6700 1 -- 192.168.123.100:0/2415555897 wait complete. 
2026-03-10T12:35:58.070 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 11 2026-03-10T12:35:58.156 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.155+0000 7f5a753d6700 1 -- 192.168.123.100:0/483279118 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a701033c0 msgr2=0x7f5a70105800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:58.156 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.155+0000 7f5a753d6700 1 --2- 192.168.123.100:0/483279118 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a701033c0 0x7f5a70105800 secure :-1 s=READY pgs=305 cs=0 l=1 rev1=1 crypto rx=0x7f5a58009b00 tx=0x7f5a58009e10 comp rx=0 tx=0).stop 2026-03-10T12:35:58.156 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.155+0000 7f5a753d6700 1 -- 192.168.123.100:0/483279118 shutdown_connections 2026-03-10T12:35:58.156 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.155+0000 7f5a753d6700 1 --2- 192.168.123.100:0/483279118 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a701033c0 0x7f5a70105800 unknown :-1 s=CLOSED pgs=305 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.156 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.155+0000 7f5a753d6700 1 --2- 192.168.123.100:0/483279118 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5a70069180 0x7f5a70102e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.156 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.155+0000 7f5a753d6700 1 -- 192.168.123.100:0/483279118 >> 192.168.123.100:0/483279118 conn(0x7f5a700fa7b0 msgr2=0x7f5a700fcc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:58.156 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.155+0000 7f5a753d6700 1 -- 192.168.123.100:0/483279118 shutdown_connections 2026-03-10T12:35:58.156 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.155+0000 7f5a753d6700 1 -- 192.168.123.100:0/483279118 wait complete. 2026-03-10T12:35:58.156 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.156+0000 7f5a753d6700 1 Processor -- start 2026-03-10T12:35:58.156 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.156+0000 7f5a753d6700 1 -- start start 2026-03-10T12:35:58.157 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.156+0000 7f5a753d6700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5a70069180 0x7f5a70197da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:58.157 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.156+0000 7f5a753d6700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a701033c0 0x7f5a701982e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:58.157 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.156+0000 7f5a753d6700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a70198900 con 0x7f5a701033c0 2026-03-10T12:35:58.157 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.156+0000 7f5a753d6700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a70198a40 con 0x7f5a70069180 2026-03-10T12:35:58.157 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.157+0000 7f5a6f7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a701033c0 0x7f5a701982e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:58.158 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.157+0000 7f5a6f7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a701033c0 0x7f5a701982e0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:37148/0 (socket says 192.168.123.100:37148) 2026-03-10T12:35:58.158 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.157+0000 7f5a6f7fe700 1 -- 192.168.123.100:0/1438547029 learned_addr learned my addr 192.168.123.100:0/1438547029 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:58.158 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.157+0000 7f5a6f7fe700 1 -- 192.168.123.100:0/1438547029 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5a70069180 msgr2=0x7f5a70197da0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:35:58.158 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.157+0000 7f5a6ffff700 1 --2- 192.168.123.100:0/1438547029 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5a70069180 0x7f5a70197da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:58.158 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.157+0000 7f5a6f7fe700 1 --2- 192.168.123.100:0/1438547029 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5a70069180 0x7f5a70197da0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.158 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.157+0000 7f5a6f7fe700 1 -- 192.168.123.100:0/1438547029 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5a580097e0 con 0x7f5a701033c0 2026-03-10T12:35:58.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.159+0000 7f5a6ffff700 1 --2- 192.168.123.100:0/1438547029 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5a70069180 0x7f5a70197da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:35:58.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.159+0000 7f5a6f7fe700 1 --2- 192.168.123.100:0/1438547029 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a701033c0 0x7f5a701982e0 secure :-1 s=READY pgs=306 cs=0 l=1 rev1=1 crypto rx=0x7f5a580052d0 tx=0x7f5a58004b10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:58.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.159+0000 7f5a6d7fa700 1 -- 192.168.123.100:0/1438547029 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5a5801d070 con 0x7f5a701033c0 2026-03-10T12:35:58.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.159+0000 7f5a753d6700 1 -- 192.168.123.100:0/1438547029 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5a7019d490 con 0x7f5a701033c0 2026-03-10T12:35:58.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.159+0000 7f5a753d6700 1 -- 192.168.123.100:0/1438547029 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5a7019d980 con 0x7f5a701033c0 2026-03-10T12:35:58.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.159+0000 7f5a6d7fa700 1 -- 192.168.123.100:0/1438547029 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5a5800bd10 con 0x7f5a701033c0 2026-03-10T12:35:58.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.159+0000 7f5a6d7fa700 1 -- 192.168.123.100:0/1438547029 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5a5800f740 con 0x7f5a701033c0 2026-03-10T12:35:58.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.159+0000 7f5a6d7fa700 1 -- 192.168.123.100:0/1438547029 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 
89910+0+0 (secure 0 0 0) 0x7f5a5800f8a0 con 0x7f5a701033c0 2026-03-10T12:35:58.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.160+0000 7f5a6d7fa700 1 --2- 192.168.123.100:0/1438547029 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5a5c06c6d0 0x7f5a5c06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:58.161 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.161+0000 7f5a6ffff700 1 --2- 192.168.123.100:0/1438547029 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5a5c06c6d0 0x7f5a5c06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:58.161 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.161+0000 7f5a6ffff700 1 --2- 192.168.123.100:0/1438547029 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5a5c06c6d0 0x7f5a5c06eb80 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f5a60005fd0 tx=0x7f5a60005dc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:58.161 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.161+0000 7f5a6d7fa700 1 -- 192.168.123.100:0/1438547029 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f5a5808d0d0 con 0x7f5a701033c0 2026-03-10T12:35:58.162 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.161+0000 7f5a753d6700 1 -- 192.168.123.100:0/1438547029 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5a50005320 con 0x7f5a701033c0 2026-03-10T12:35:58.165 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.165+0000 7f5a6d7fa700 1 -- 192.168.123.100:0/1438547029 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+177933 (secure 0 0 0) 0x7f5a5805b390 con 0x7f5a701033c0 2026-03-10T12:35:58.276 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.275+0000 7f5a753d6700 1 -- 192.168.123.100:0/1438547029 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f5a50000bf0 con 0x7f5a5c06c6d0 2026-03-10T12:35:58.276 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.276+0000 7f5a6d7fa700 1 -- 192.168.123.100:0/1438547029 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f5a50000bf0 con 0x7f5a5c06c6d0 2026-03-10T12:35:58.276 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:35:58.276 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-10T12:35:58.276 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true, 2026-03-10T12:35:58.276 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T12:35:58.276 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [], 2026-03-10T12:35:58.276 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "", 2026-03-10T12:35:58.276 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-10T12:35:58.276 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false 2026-03-10T12:35:58.276 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:35:58.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.278+0000 7f5a753d6700 1 -- 192.168.123.100:0/1438547029 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5a5c06c6d0 msgr2=0x7f5a5c06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:58.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.278+0000 
7f5a753d6700 1 --2- 192.168.123.100:0/1438547029 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5a5c06c6d0 0x7f5a5c06eb80 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f5a60005fd0 tx=0x7f5a60005dc0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.278+0000 7f5a753d6700 1 -- 192.168.123.100:0/1438547029 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a701033c0 msgr2=0x7f5a701982e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:58.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.278+0000 7f5a753d6700 1 --2- 192.168.123.100:0/1438547029 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a701033c0 0x7f5a701982e0 secure :-1 s=READY pgs=306 cs=0 l=1 rev1=1 crypto rx=0x7f5a580052d0 tx=0x7f5a58004b10 comp rx=0 tx=0).stop 2026-03-10T12:35:58.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.279+0000 7f5a753d6700 1 -- 192.168.123.100:0/1438547029 shutdown_connections 2026-03-10T12:35:58.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.279+0000 7f5a753d6700 1 --2- 192.168.123.100:0/1438547029 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f5a5c06c6d0 0x7f5a5c06eb80 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.279+0000 7f5a753d6700 1 --2- 192.168.123.100:0/1438547029 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5a70069180 0x7f5a70197da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.279+0000 7f5a753d6700 1 --2- 192.168.123.100:0/1438547029 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a701033c0 0x7f5a701982e0 unknown :-1 s=CLOSED pgs=306 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T12:35:58.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.279+0000 7f5a753d6700 1 -- 192.168.123.100:0/1438547029 >> 192.168.123.100:0/1438547029 conn(0x7f5a700fa7b0 msgr2=0x7f5a701006b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:58.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.279+0000 7f5a753d6700 1 -- 192.168.123.100:0/1438547029 shutdown_connections 2026-03-10T12:35:58.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.279+0000 7f5a753d6700 1 -- 192.168.123.100:0/1438547029 wait complete. 2026-03-10T12:35:58.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.352+0000 7f5065761700 1 -- 192.168.123.100:0/3565079698 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5060101740 msgr2=0x7f5060101b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:58.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.352+0000 7f5065761700 1 --2- 192.168.123.100:0/3565079698 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5060101740 0x7f5060101b90 secure :-1 s=READY pgs=307 cs=0 l=1 rev1=1 crypto rx=0x7f5050009b50 tx=0x7f5050009e60 comp rx=0 tx=0).stop 2026-03-10T12:35:58.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.356+0000 7f5065761700 1 -- 192.168.123.100:0/3565079698 shutdown_connections 2026-03-10T12:35:58.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.356+0000 7f5065761700 1 --2- 192.168.123.100:0/3565079698 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5060101740 0x7f5060101b90 unknown :-1 s=CLOSED pgs=307 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.356+0000 7f5065761700 1 --2- 192.168.123.100:0/3565079698 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5060100540 0x7f5060100950 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.356+0000 7f5065761700 1 -- 192.168.123.100:0/3565079698 >> 192.168.123.100:0/3565079698 conn(0x7f50600fbad0 msgr2=0x7f50600fdf20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:35:58.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.356+0000 7f5065761700 1 -- 192.168.123.100:0/3565079698 shutdown_connections 2026-03-10T12:35:58.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.356+0000 7f5065761700 1 -- 192.168.123.100:0/3565079698 wait complete. 2026-03-10T12:35:58.357 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.357+0000 7f5065761700 1 Processor -- start 2026-03-10T12:35:58.357 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.357+0000 7f5065761700 1 -- start start 2026-03-10T12:35:58.357 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.357+0000 7f5065761700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5060100540 0x7f506019c430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:58.358 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.357+0000 7f5065761700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5060101740 0x7f506019c970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:58.358 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.357+0000 7f5065761700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f506019cf90 con 0x7f5060101740 2026-03-10T12:35:58.358 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.357+0000 7f5065761700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f506019d0d0 con 0x7f5060100540 2026-03-10T12:35:58.358 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.358+0000 
7f505e7fc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5060101740 0x7f506019c970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:58.358 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.358+0000 7f505e7fc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5060101740 0x7f506019c970 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:37172/0 (socket says 192.168.123.100:37172) 2026-03-10T12:35:58.358 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.358+0000 7f505e7fc700 1 -- 192.168.123.100:0/3144911850 learned_addr learned my addr 192.168.123.100:0/3144911850 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:35:58.358 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.358+0000 7f505effd700 1 --2- 192.168.123.100:0/3144911850 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5060100540 0x7f506019c430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:58.358 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.358+0000 7f505e7fc700 1 -- 192.168.123.100:0/3144911850 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5060100540 msgr2=0x7f506019c430 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:58.359 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.358+0000 7f505e7fc700 1 --2- 192.168.123.100:0/3144911850 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5060100540 0x7f506019c430 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.359 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.358+0000 
7f505e7fc700 1 -- 192.168.123.100:0/3144911850 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f50500097e0 con 0x7f5060101740 2026-03-10T12:35:58.359 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.358+0000 7f505effd700 1 --2- 192.168.123.100:0/3144911850 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5060100540 0x7f506019c430 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T12:35:58.359 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.359+0000 7f505e7fc700 1 --2- 192.168.123.100:0/3144911850 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5060101740 0x7f506019c970 secure :-1 s=READY pgs=308 cs=0 l=1 rev1=1 crypto rx=0x7f505000b5c0 tx=0x7f5050005250 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:58.359 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.359+0000 7f5057fff700 1 -- 192.168.123.100:0/3144911850 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f505001d070 con 0x7f5060101740 2026-03-10T12:35:58.359 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.359+0000 7f5057fff700 1 -- 192.168.123.100:0/3144911850 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f505000bc30 con 0x7f5060101740 2026-03-10T12:35:58.359 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.359+0000 7f5057fff700 1 -- 192.168.123.100:0/3144911850 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f505000f910 con 0x7f5060101740 2026-03-10T12:35:58.359 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.359+0000 7f5065761700 1 -- 192.168.123.100:0/3144911850 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f50601a1b20 con 
0x7f5060101740 2026-03-10T12:35:58.360 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.359+0000 7f5065761700 1 -- 192.168.123.100:0/3144911850 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f50601a2010 con 0x7f5060101740 2026-03-10T12:35:58.361 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.360+0000 7f5057fff700 1 -- 192.168.123.100:0/3144911850 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f505000fa70 con 0x7f5060101740 2026-03-10T12:35:58.361 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.361+0000 7f5065761700 1 -- 192.168.123.100:0/3144911850 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5060066e40 con 0x7f5060101740 2026-03-10T12:35:58.364 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.364+0000 7f5057fff700 1 --2- 192.168.123.100:0/3144911850 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f504c06c6e0 0x7f504c06eb90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:35:58.365 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.364+0000 7f505effd700 1 --2- 192.168.123.100:0/3144911850 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f504c06c6e0 0x7f504c06eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:35:58.365 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.365+0000 7f5057fff700 1 -- 192.168.123.100:0/3144911850 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f505008e030 con 0x7f5060101740 2026-03-10T12:35:58.366 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.365+0000 7f505effd700 1 --2- 192.168.123.100:0/3144911850 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f504c06c6e0 0x7f504c06eb90 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f5048005950 tx=0x7f50480058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:35:58.366 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.365+0000 7f5057fff700 1 -- 192.168.123.100:0/3144911850 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f505005c2f0 con 0x7f5060101740 2026-03-10T12:35:58.520 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.520+0000 7f5065761700 1 -- 192.168.123.100:0/3144911850 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f50601a23b0 con 0x7f5060101740 2026-03-10T12:35:58.521 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.521+0000 7f5057fff700 1 -- 192.168.123.100:0/3144911850 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f50500208e0 con 0x7f5060101740 2026-03-10T12:35:58.522 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_OK 2026-03-10T12:35:58.525 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.525+0000 7f5065761700 1 -- 192.168.123.100:0/3144911850 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f504c06c6e0 msgr2=0x7f504c06eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:58.525 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.525+0000 7f5065761700 1 --2- 192.168.123.100:0/3144911850 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f504c06c6e0 0x7f504c06eb90 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f5048005950 tx=0x7f50480058e0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.525 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.525+0000 7f5065761700 1 -- 192.168.123.100:0/3144911850 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5060101740 msgr2=0x7f506019c970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:35:58.525 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.525+0000 7f5065761700 1 --2- 192.168.123.100:0/3144911850 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5060101740 0x7f506019c970 secure :-1 s=READY pgs=308 cs=0 l=1 rev1=1 crypto rx=0x7f505000b5c0 tx=0x7f5050005250 comp rx=0 tx=0).stop 2026-03-10T12:35:58.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.525+0000 7f5065761700 1 -- 192.168.123.100:0/3144911850 shutdown_connections 2026-03-10T12:35:58.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.525+0000 7f5065761700 1 --2- 192.168.123.100:0/3144911850 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f504c06c6e0 0x7f504c06eb90 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.525+0000 7f5065761700 1 --2- 192.168.123.100:0/3144911850 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5060100540 0x7f506019c430 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.525+0000 7f5065761700 1 --2- 192.168.123.100:0/3144911850 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5060101740 0x7f506019c970 unknown :-1 s=CLOSED pgs=308 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:35:58.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.525+0000 7f5065761700 1 -- 192.168.123.100:0/3144911850 >> 192.168.123.100:0/3144911850 conn(0x7f50600fbad0 msgr2=0x7f5060104970 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T12:35:58.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.525+0000 7f5065761700 1 -- 192.168.123.100:0/3144911850 shutdown_connections 2026-03-10T12:35:58.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:35:58.525+0000 7f5065761700 1 -- 192.168.123.100:0/3144911850 wait complete. 2026-03-10T12:35:58.736 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:58 vm00.local ceph-mon[50686]: pgmap v104: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:58.736 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:58 vm00.local ceph-mon[50686]: from='client.14576 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:58.736 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:58 vm00.local ceph-mon[50686]: from='client.24381 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:58.736 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:58 vm00.local ceph-mon[50686]: from='client.? 192.168.123.100:0/350429443' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:35:58.736 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:58 vm00.local ceph-mon[50686]: from='client.? 
192.168.123.100:0/2415555897' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:35:58.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:58 vm07.local ceph-mon[58582]: pgmap v104: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:35:58.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:58 vm07.local ceph-mon[58582]: from='client.14576 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:58.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:58 vm07.local ceph-mon[58582]: from='client.24381 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:58.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:58 vm07.local ceph-mon[58582]: from='client.? 192.168.123.100:0/350429443' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:35:58.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:58 vm07.local ceph-mon[58582]: from='client.? 192.168.123.100:0/2415555897' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:35:59.491 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:59 vm07.local ceph-mon[58582]: from='client.14584 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:59.491 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:59 vm07.local ceph-mon[58582]: from='client.14596 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:59.491 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:35:59 vm07.local ceph-mon[58582]: from='client.? 
192.168.123.100:0/3144911850' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:35:59.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:59 vm00.local ceph-mon[50686]: from='client.14584 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:59.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:59 vm00.local ceph-mon[50686]: from='client.14596 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:35:59.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:35:59 vm00.local ceph-mon[50686]: from='client.? 192.168.123.100:0/3144911850' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:36:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:00 vm00.local ceph-mon[50686]: pgmap v105: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:00 vm07.local ceph-mon[58582]: pgmap v105: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:01.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:01 vm00.local ceph-mon[50686]: pgmap v106: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:01 vm07.local ceph-mon[58582]: pgmap v106: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:04.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:04 vm07.local ceph-mon[58582]: pgmap v107: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:04.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:04 vm00.local ceph-mon[50686]: pgmap v107: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 
2026-03-10T12:36:06.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:06 vm07.local ceph-mon[58582]: pgmap v108: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:06.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:06 vm00.local ceph-mon[50686]: pgmap v108: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:08.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:08 vm07.local ceph-mon[58582]: pgmap v109: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:08.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:08 vm00.local ceph-mon[50686]: pgmap v109: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:09.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:09 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:36:09.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:09 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:36:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:10 vm00.local ceph-mon[50686]: pgmap v110: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:10.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:10 vm07.local ceph-mon[58582]: pgmap v110: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:12.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:12 vm07.local ceph-mon[58582]: pgmap v111: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:12.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:12 vm00.local ceph-mon[50686]: pgmap v111: 65 pgs: 65 
active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:14.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:14 vm07.local ceph-mon[58582]: pgmap v112: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:14.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:14 vm00.local ceph-mon[50686]: pgmap v112: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:16.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:16 vm07.local ceph-mon[58582]: pgmap v113: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:16.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:16 vm00.local ceph-mon[50686]: pgmap v113: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:18.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:18 vm00.local ceph-mon[50686]: pgmap v114: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:18.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:18 vm07.local ceph-mon[58582]: pgmap v114: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:20.421 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:20 vm00.local ceph-mon[50686]: pgmap v115: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:20.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:20 vm07.local ceph-mon[58582]: pgmap v115: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:22 vm00.local ceph-mon[50686]: pgmap v116: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:23.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:22 vm07.local ceph-mon[58582]: pgmap v116: 65 pgs: 65 
active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:24 vm00.local ceph-mon[50686]: pgmap v117: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:24 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:36:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:24 vm07.local ceph-mon[58582]: pgmap v117: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:24 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:36:26.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:26 vm00.local ceph-mon[50686]: pgmap v118: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:26 vm07.local ceph-mon[58582]: pgmap v118: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:28.634 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.633+0000 7f17f599b700 1 -- 192.168.123.100:0/789732405 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17f0107fb0 msgr2=0x7f17f01083c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:28.634 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.633+0000 7f17f599b700 1 --2- 192.168.123.100:0/789732405 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17f0107fb0 0x7f17f01083c0 secure :-1 s=READY pgs=309 cs=0 l=1 rev1=1 crypto rx=0x7f17e0009b00 tx=0x7f17e0009e10 comp rx=0 tx=0).stop 2026-03-10T12:36:28.634 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:28 vm00.local ceph-mon[50686]: pgmap v119: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:28.634 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.634+0000 7f17f599b700 1 -- 192.168.123.100:0/789732405 shutdown_connections 2026-03-10T12:36:28.634 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.634+0000 7f17f599b700 1 --2- 192.168.123.100:0/789732405 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17f0108990 0x7f17f0071fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:28.634 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.634+0000 7f17f599b700 1 --2- 192.168.123.100:0/789732405 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17f0107fb0 0x7f17f01083c0 unknown :-1 s=CLOSED pgs=309 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:28.634 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.634+0000 7f17f599b700 1 -- 192.168.123.100:0/789732405 >> 192.168.123.100:0/789732405 conn(0x7f17f006d3e0 msgr2=0x7f17f006f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:36:28.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.636+0000 7f17f599b700 1 -- 192.168.123.100:0/789732405 shutdown_connections 2026-03-10T12:36:28.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.636+0000 7f17f599b700 1 -- 192.168.123.100:0/789732405 wait complete. 
2026-03-10T12:36:28.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.636+0000 7f17f599b700 1 Processor -- start 2026-03-10T12:36:28.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.636+0000 7f17f599b700 1 -- start start 2026-03-10T12:36:28.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.636+0000 7f17f599b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17f0107fb0 0x7f17f019c480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:28.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.636+0000 7f17f599b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17f0108990 0x7f17f019c9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:28.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.636+0000 7f17f599b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f17f019cfe0 con 0x7f17f0107fb0 2026-03-10T12:36:28.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.636+0000 7f17f599b700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f17f019d120 con 0x7f17f0108990 2026-03-10T12:36:28.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.637+0000 7f17ee7fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17f0108990 0x7f17f019c9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:28.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.637+0000 7f17ee7fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17f0108990 0x7f17f019c9c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:40618/0 (socket says 192.168.123.100:40618) 2026-03-10T12:36:28.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.637+0000 7f17ee7fc700 1 -- 192.168.123.100:0/1753791548 learned_addr learned my addr 192.168.123.100:0/1753791548 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:36:28.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.637+0000 7f17eeffd700 1 --2- 192.168.123.100:0/1753791548 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17f0107fb0 0x7f17f019c480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:28.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.637+0000 7f17eeffd700 1 -- 192.168.123.100:0/1753791548 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17f0108990 msgr2=0x7f17f019c9c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:28.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.637+0000 7f17eeffd700 1 --2- 192.168.123.100:0/1753791548 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17f0108990 0x7f17f019c9c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:28.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.637+0000 7f17eeffd700 1 -- 192.168.123.100:0/1753791548 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f17e00097e0 con 0x7f17f0107fb0 2026-03-10T12:36:28.638 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.637+0000 7f17eeffd700 1 --2- 192.168.123.100:0/1753791548 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17f0107fb0 0x7f17f019c480 secure :-1 s=READY pgs=310 cs=0 l=1 rev1=1 crypto rx=0x7f17e0000c00 tx=0x7f17e0004c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:36:28.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.637+0000 7f17f4999700 1 -- 192.168.123.100:0/1753791548 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f17e001d070 con 0x7f17f0107fb0 2026-03-10T12:36:28.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.637+0000 7f17f4999700 1 -- 192.168.123.100:0/1753791548 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f17e000bc50 con 0x7f17f0107fb0 2026-03-10T12:36:28.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.638+0000 7f17f599b700 1 -- 192.168.123.100:0/1753791548 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f17f01a1b70 con 0x7f17f0107fb0 2026-03-10T12:36:28.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.638+0000 7f17f599b700 1 -- 192.168.123.100:0/1753791548 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f17f01a2060 con 0x7f17f0107fb0 2026-03-10T12:36:28.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.638+0000 7f17f4999700 1 -- 192.168.123.100:0/1753791548 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f17e000f700 con 0x7f17f0107fb0 2026-03-10T12:36:28.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.638+0000 7f17d67fc700 1 -- 192.168.123.100:0/1753791548 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f17d00052f0 con 0x7f17f0107fb0 2026-03-10T12:36:28.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.641+0000 7f17f4999700 1 -- 192.168.123.100:0/1753791548 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f17e000f960 con 0x7f17f0107fb0 2026-03-10T12:36:28.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.642+0000 
7f17f4999700 1 --2- 192.168.123.100:0/1753791548 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f17d8070930 0x7f17d8072de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:28.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.642+0000 7f17f4999700 1 -- 192.168.123.100:0/1753791548 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f17e008d2a0 con 0x7f17f0107fb0 2026-03-10T12:36:28.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.642+0000 7f17ee7fc700 1 --2- 192.168.123.100:0/1753791548 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f17d8070930 0x7f17d8072de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:28.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.642+0000 7f17ee7fc700 1 --2- 192.168.123.100:0/1753791548 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f17d8070930 0x7f17d8072de0 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f17e4009cc0 tx=0x7f17e4009480 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:36:28.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.643+0000 7f17f4999700 1 -- 192.168.123.100:0/1753791548 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f17e005b5e0 con 0x7f17f0107fb0 2026-03-10T12:36:28.813 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.813+0000 7f17d67fc700 1 -- 192.168.123.100:0/1753791548 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f17d0000bc0 con 0x7f17d8070930 2026-03-10T12:36:28.814 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.814+0000 7f17f4999700 1 -- 192.168.123.100:0/1753791548 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f17d0000bc0 con 0x7f17d8070930 2026-03-10T12:36:28.815 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:28 vm07.local ceph-mon[58582]: pgmap v119: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:28.817 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.817+0000 7f17f599b700 1 -- 192.168.123.100:0/1753791548 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f17d8070930 msgr2=0x7f17d8072de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:28.817 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.817+0000 7f17f599b700 1 --2- 192.168.123.100:0/1753791548 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f17d8070930 0x7f17d8072de0 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f17e4009cc0 tx=0x7f17e4009480 comp rx=0 tx=0).stop 2026-03-10T12:36:28.817 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.817+0000 7f17f599b700 1 -- 192.168.123.100:0/1753791548 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17f0107fb0 msgr2=0x7f17f019c480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:28.817 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.817+0000 7f17f599b700 1 --2- 192.168.123.100:0/1753791548 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17f0107fb0 0x7f17f019c480 secure :-1 s=READY pgs=310 cs=0 l=1 rev1=1 crypto rx=0x7f17e0000c00 tx=0x7f17e0004c30 comp rx=0 tx=0).stop 2026-03-10T12:36:28.817 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.817+0000 7f17f599b700 1 -- 192.168.123.100:0/1753791548 shutdown_connections 2026-03-10T12:36:28.817 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.817+0000 
7f17f599b700 1 --2- 192.168.123.100:0/1753791548 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f17d8070930 0x7f17d8072de0 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:28.817 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.817+0000 7f17f599b700 1 --2- 192.168.123.100:0/1753791548 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17f0107fb0 0x7f17f019c480 unknown :-1 s=CLOSED pgs=310 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:28.817 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.817+0000 7f17f599b700 1 --2- 192.168.123.100:0/1753791548 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17f0108990 0x7f17f019c9c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:28.818 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.817+0000 7f17f599b700 1 -- 192.168.123.100:0/1753791548 >> 192.168.123.100:0/1753791548 conn(0x7f17f006d3e0 msgr2=0x7f17f010d290 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:36:28.818 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.817+0000 7f17f599b700 1 -- 192.168.123.100:0/1753791548 shutdown_connections 2026-03-10T12:36:28.818 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.817+0000 7f17f599b700 1 -- 192.168.123.100:0/1753791548 wait complete. 
2026-03-10T12:36:28.827 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.896+0000 7f7259ced700 1 -- 192.168.123.100:0/2090153098 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7254072360 msgr2=0x7f72540770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.896+0000 7f7259ced700 1 --2- 192.168.123.100:0/2090153098 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7254072360 0x7f72540770e0 secure :-1 s=READY pgs=311 cs=0 l=1 rev1=1 crypto rx=0x7f724c00d3f0 tx=0x7f724c00d700 comp rx=0 tx=0).stop 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.897+0000 7f7259ced700 1 -- 192.168.123.100:0/2090153098 shutdown_connections 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.897+0000 7f7259ced700 1 --2- 192.168.123.100:0/2090153098 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7254072360 0x7f72540770e0 unknown :-1 s=CLOSED pgs=311 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.897+0000 7f7259ced700 1 --2- 192.168.123.100:0/2090153098 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7254071980 0x7f7254071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.897+0000 7f7259ced700 1 -- 192.168.123.100:0/2090153098 >> 192.168.123.100:0/2090153098 conn(0x7f725406d1a0 msgr2=0x7f725406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.897+0000 7f7259ced700 1 -- 192.168.123.100:0/2090153098 shutdown_connections 2026-03-10T12:36:28.900 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.897+0000 7f7259ced700 1 -- 192.168.123.100:0/2090153098 wait complete. 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.897+0000 7f7259ced700 1 Processor -- start 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.897+0000 7f7259ced700 1 -- start start 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.897+0000 7f7259ced700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7254071980 0x7f7254131420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.897+0000 7f7259ced700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7254131960 0x7f725407f5f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.897+0000 7f7259ced700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7254131e60 con 0x7f7254131960 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.897+0000 7f7259ced700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7254131fd0 con 0x7f7254071980 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.898+0000 7f72537fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7254071980 0x7f7254131420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.898+0000 7f72537fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7254071980 0x7f7254131420 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:40636/0 (socket says 192.168.123.100:40636) 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.898+0000 7f72537fe700 1 -- 192.168.123.100:0/1588232350 learned_addr learned my addr 192.168.123.100:0/1588232350 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.898+0000 7f72537fe700 1 -- 192.168.123.100:0/1588232350 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7254131960 msgr2=0x7f725407f5f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.898+0000 7f72537fe700 1 --2- 192.168.123.100:0/1588232350 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7254131960 0x7f725407f5f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.898+0000 7f72537fe700 1 -- 192.168.123.100:0/1588232350 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f724c007ed0 con 0x7f7254071980 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.899+0000 7f72537fe700 1 --2- 192.168.123.100:0/1588232350 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7254071980 0x7f7254131420 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f724400d8d0 tx=0x7f724400dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:36:28.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.899+0000 7f7250ff9700 1 -- 192.168.123.100:0/1588232350 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7244009940 con 
0x7f7254071980 2026-03-10T12:36:28.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.899+0000 7f7259ced700 1 -- 192.168.123.100:0/1588232350 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f725407fb90 con 0x7f7254071980 2026-03-10T12:36:28.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.899+0000 7f7259ced700 1 -- 192.168.123.100:0/1588232350 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f72540800b0 con 0x7f7254071980 2026-03-10T12:36:28.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.900+0000 7f7250ff9700 1 -- 192.168.123.100:0/1588232350 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7244010460 con 0x7f7254071980 2026-03-10T12:36:28.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.900+0000 7f7250ff9700 1 -- 192.168.123.100:0/1588232350 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f724400f5d0 con 0x7f7254071980 2026-03-10T12:36:28.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.901+0000 7f7250ff9700 1 -- 192.168.123.100:0/1588232350 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f724400f7e0 con 0x7f7254071980 2026-03-10T12:36:28.902 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.902+0000 7f7250ff9700 1 --2- 192.168.123.100:0/1588232350 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f723c06c6d0 0x7f723c06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:28.903 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.902+0000 7f7252ffd700 1 --2- 192.168.123.100:0/1588232350 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f723c06c6d0 0x7f723c06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:28.903 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.902+0000 7f7252ffd700 1 --2- 192.168.123.100:0/1588232350 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f723c06c6d0 0x7f723c06eb80 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f724c00db80 tx=0x7f724c0061f0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:36:28.903 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.903+0000 7f7250ff9700 1 -- 192.168.123.100:0/1588232350 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f724408c7b0 con 0x7f7254071980 2026-03-10T12:36:28.903 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.903+0000 7f7259ced700 1 -- 192.168.123.100:0/1588232350 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7240005320 con 0x7f7254071980 2026-03-10T12:36:28.907 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:28.906+0000 7f7250ff9700 1 -- 192.168.123.100:0/1588232350 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f724405a730 con 0x7f7254071980 2026-03-10T12:36:29.018 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.016+0000 7f7259ced700 1 -- 192.168.123.100:0/1588232350 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7240000bf0 con 0x7f723c06c6d0 2026-03-10T12:36:29.018 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.017+0000 7f7250ff9700 1 -- 192.168.123.100:0/1588232350 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f7240000bf0 con 0x7f723c06c6d0 2026-03-10T12:36:29.020 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.020+0000 7f723a7fc700 1 -- 192.168.123.100:0/1588232350 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f723c06c6d0 msgr2=0x7f723c06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.020 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.020+0000 7f723a7fc700 1 --2- 192.168.123.100:0/1588232350 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f723c06c6d0 0x7f723c06eb80 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f724c00db80 tx=0x7f724c0061f0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.020 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.020+0000 7f723a7fc700 1 -- 192.168.123.100:0/1588232350 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7254071980 msgr2=0x7f7254131420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.020 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.020+0000 7f723a7fc700 1 --2- 192.168.123.100:0/1588232350 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7254071980 0x7f7254131420 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f724400d8d0 tx=0x7f724400dc90 comp rx=0 tx=0).stop 2026-03-10T12:36:29.021 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.020+0000 7f723a7fc700 1 -- 192.168.123.100:0/1588232350 shutdown_connections 2026-03-10T12:36:29.021 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.020+0000 7f723a7fc700 1 --2- 192.168.123.100:0/1588232350 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f723c06c6d0 0x7f723c06eb80 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.021 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.020+0000 7f723a7fc700 1 --2- 192.168.123.100:0/1588232350 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7254071980 0x7f7254131420 unknown :-1 s=CLOSED 
pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.021 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.020+0000 7f723a7fc700 1 --2- 192.168.123.100:0/1588232350 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7254131960 0x7f725407f5f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.021 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.020+0000 7f723a7fc700 1 -- 192.168.123.100:0/1588232350 >> 192.168.123.100:0/1588232350 conn(0x7f725406d1a0 msgr2=0x7f7254076540 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:36:29.021 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.021+0000 7f723a7fc700 1 -- 192.168.123.100:0/1588232350 shutdown_connections 2026-03-10T12:36:29.021 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.021+0000 7f723a7fc700 1 -- 192.168.123.100:0/1588232350 wait complete. 2026-03-10T12:36:29.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.088+0000 7fdd1ada7700 1 -- 192.168.123.100:0/3356802902 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdd14071980 msgr2=0x7fdd14071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.088+0000 7fdd1ada7700 1 --2- 192.168.123.100:0/3356802902 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdd14071980 0x7fdd14071d90 secure :-1 s=READY pgs=312 cs=0 l=1 rev1=1 crypto rx=0x7fdd04007780 tx=0x7fdd0400c050 comp rx=0 tx=0).stop 2026-03-10T12:36:29.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.088+0000 7fdd1ada7700 1 -- 192.168.123.100:0/3356802902 shutdown_connections 2026-03-10T12:36:29.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.088+0000 7fdd1ada7700 1 --2- 192.168.123.100:0/3356802902 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdd14072360 0x7fdd140770e0 unknown 
:-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.088+0000 7fdd1ada7700 1 --2- 192.168.123.100:0/3356802902 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdd14071980 0x7fdd14071d90 unknown :-1 s=CLOSED pgs=312 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.088+0000 7fdd1ada7700 1 -- 192.168.123.100:0/3356802902 >> 192.168.123.100:0/3356802902 conn(0x7fdd1406d1a0 msgr2=0x7fdd1406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:36:29.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.088+0000 7fdd1ada7700 1 -- 192.168.123.100:0/3356802902 shutdown_connections 2026-03-10T12:36:29.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.088+0000 7fdd1ada7700 1 -- 192.168.123.100:0/3356802902 wait complete. 2026-03-10T12:36:29.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.089+0000 7fdd1ada7700 1 Processor -- start 2026-03-10T12:36:29.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.089+0000 7fdd1ada7700 1 -- start start 2026-03-10T12:36:29.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.089+0000 7fdd1ada7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdd14072360 0x7fdd14131380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:29.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.089+0000 7fdd1ada7700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdd141318c0 0x7fdd1407f550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:29.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.089+0000 7fdd1ada7700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7fdd14131dc0 con 0x7fdd14072360 2026-03-10T12:36:29.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.089+0000 7fdd1ada7700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdd14131f30 con 0x7fdd141318c0 2026-03-10T12:36:29.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.089+0000 7fdd18b43700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdd14072360 0x7fdd14131380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:29.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.089+0000 7fdd18b43700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdd14072360 0x7fdd14131380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:52658/0 (socket says 192.168.123.100:52658) 2026-03-10T12:36:29.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.089+0000 7fdd18b43700 1 -- 192.168.123.100:0/2767056864 learned_addr learned my addr 192.168.123.100:0/2767056864 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:36:29.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.089+0000 7fdd13fff700 1 --2- 192.168.123.100:0/2767056864 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdd141318c0 0x7fdd1407f550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:29.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.089+0000 7fdd13fff700 1 -- 192.168.123.100:0/2767056864 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdd14072360 msgr2=0x7fdd14131380 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.090 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.089+0000 7fdd13fff700 1 --2- 192.168.123.100:0/2767056864 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdd14072360 0x7fdd14131380 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.089+0000 7fdd13fff700 1 -- 192.168.123.100:0/2767056864 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdd04007430 con 0x7fdd141318c0 2026-03-10T12:36:29.091 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.090+0000 7fdd13fff700 1 --2- 192.168.123.100:0/2767056864 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdd141318c0 0x7fdd1407f550 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fdd0c009fd0 tx=0x7fdd0c00d3b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:36:29.091 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.090+0000 7fdd11ffb700 1 -- 192.168.123.100:0/2767056864 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdd0c00de40 con 0x7fdd141318c0 2026-03-10T12:36:29.091 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.090+0000 7fdd1ada7700 1 -- 192.168.123.100:0/2767056864 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdd1407faf0 con 0x7fdd141318c0 2026-03-10T12:36:29.091 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.090+0000 7fdd1ada7700 1 -- 192.168.123.100:0/2767056864 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdd14080010 con 0x7fdd141318c0 2026-03-10T12:36:29.091 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.091+0000 7fdd11ffb700 1 -- 192.168.123.100:0/2767056864 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 
==== 1139+0+0 (secure 0 0 0) 0x7fdd0c00f040 con 0x7fdd141318c0 2026-03-10T12:36:29.091 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.091+0000 7fdd11ffb700 1 -- 192.168.123.100:0/2767056864 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdd0c015610 con 0x7fdd141318c0 2026-03-10T12:36:29.092 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.092+0000 7fdd11ffb700 1 -- 192.168.123.100:0/2767056864 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fdd0c015770 con 0x7fdd141318c0 2026-03-10T12:36:29.092 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.092+0000 7fdd11ffb700 1 --2- 192.168.123.100:0/2767056864 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fdcfc06c6d0 0x7fdcfc06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:29.093 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.092+0000 7fdd18b43700 1 --2- 192.168.123.100:0/2767056864 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fdcfc06c6d0 0x7fdcfc06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:29.093 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.093+0000 7fdd18b43700 1 --2- 192.168.123.100:0/2767056864 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fdcfc06c6d0 0x7fdcfc06eb80 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7fdd04005b40 tx=0x7fdd04005a90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:36:29.093 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.093+0000 7fdd11ffb700 1 -- 192.168.123.100:0/2767056864 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fdd0c08c890 con 0x7fdd141318c0 
2026-03-10T12:36:29.093 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.093+0000 7fdd1ada7700 1 -- 192.168.123.100:0/2767056864 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdd00005320 con 0x7fdd141318c0 2026-03-10T12:36:29.096 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.096+0000 7fdd11ffb700 1 -- 192.168.123.100:0/2767056864 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fdd0c05aaa0 con 0x7fdd141318c0 2026-03-10T12:36:29.215 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.212+0000 7fdd1ada7700 1 -- 192.168.123.100:0/2767056864 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fdd00000bf0 con 0x7fdcfc06c6d0 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.218+0000 7fdd11ffb700 1 -- 192.168.123.100:0/2767056864 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7fdd00000bf0 con 0x7fdcfc06c6d0 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (2m) 71s ago 3m 22.8M - 0.25.0 c8568f914cd2 1897443e8fdf 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (3m) 71s ago 3m 8074k - 18.2.0 dc2bc1663786 d9c35bbdf4cd 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (3m) 72s ago 3m 8208k - 18.2.0 dc2bc1663786 2a98961ae9ca 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (3m) 71s ago 3m 7407k - 18.2.0 dc2bc1663786 4726e39e7eb0 
2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (2m) 72s ago 2m 7402k - 18.2.0 dc2bc1663786 f917dac1f418 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (2m) 71s ago 3m 82.0M - 9.4.7 954c08fa6188 16c12a6ce1fc 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (78s) 71s ago 78s 17.2M - 18.2.0 dc2bc1663786 13dfd2469732 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (76s) 71s ago 76s 14.2M - 18.2.0 dc2bc1663786 d65368ac6dfa 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (75s) 72s ago 75s 13.7M - 18.2.0 dc2bc1663786 1b9425223bd2 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (77s) 72s ago 77s 18.6M - 18.2.0 dc2bc1663786 9c2b36fc1ac1 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:9283,8765,8443 running (4m) 71s ago 4m 498M - 18.2.0 dc2bc1663786 8dc0a869be20 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (2m) 72s ago 2m 448M - 18.2.0 dc2bc1663786 1662ba2e507c 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (4m) 71s ago 4m 50.6M 2048M 18.2.0 dc2bc1663786 c8d836b38502 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (2m) 72s ago 2m 44.3M 2048M 18.2.0 dc2bc1663786 7712955135fc 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (3m) 71s ago 3m 14.4M - 1.5.0 0da6a335fe13 7a5c14d6ba46 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (2m) 72s ago 2m 12.2M - 1.5.0 0da6a335fe13 2fac2415b763 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (2m) 71s ago 
2m 45.5M 4096M 18.2.0 dc2bc1663786 d5b05007694d 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (2m) 71s ago 2m 45.9M 4096M 18.2.0 dc2bc1663786 5bc971fe4d49 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (2m) 71s ago 2m 46.7M 4096M 18.2.0 dc2bc1663786 a5a89ccf847e 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (2m) 72s ago 2m 44.8M 4096M 18.2.0 dc2bc1663786 0c6249fe3951 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (114s) 72s ago 114s 43.9M 4096M 18.2.0 dc2bc1663786 5c66bfb63a83 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (105s) 72s ago 105s 42.7M 4096M 18.2.0 dc2bc1663786 277bdfe4ec55 2026-03-10T12:36:29.219 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (2m) 71s ago 3m 39.1M - 2.43.0 a07b618ecd1d 5d567c813f4b 2026-03-10T12:36:29.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.225+0000 7fdcfb7fe700 1 -- 192.168.123.100:0/2767056864 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fdcfc06c6d0 msgr2=0x7fdcfc06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.225+0000 7fdcfb7fe700 1 --2- 192.168.123.100:0/2767056864 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fdcfc06c6d0 0x7fdcfc06eb80 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7fdd04005b40 tx=0x7fdd04005a90 comp rx=0 tx=0).stop 2026-03-10T12:36:29.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.225+0000 7fdcfb7fe700 1 -- 192.168.123.100:0/2767056864 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdd141318c0 msgr2=0x7fdd1407f550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.225+0000 7fdcfb7fe700 1 --2- 
192.168.123.100:0/2767056864 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdd141318c0 0x7fdd1407f550 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fdd0c009fd0 tx=0x7fdd0c00d3b0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.226+0000 7fdcfb7fe700 1 -- 192.168.123.100:0/2767056864 shutdown_connections 2026-03-10T12:36:29.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.226+0000 7fdcfb7fe700 1 --2- 192.168.123.100:0/2767056864 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fdcfc06c6d0 0x7fdcfc06eb80 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.226+0000 7fdcfb7fe700 1 --2- 192.168.123.100:0/2767056864 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdd14072360 0x7fdd14131380 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.226+0000 7fdcfb7fe700 1 --2- 192.168.123.100:0/2767056864 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdd141318c0 0x7fdd1407f550 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.226+0000 7fdcfb7fe700 1 -- 192.168.123.100:0/2767056864 >> 192.168.123.100:0/2767056864 conn(0x7fdd1406d1a0 msgr2=0x7fdd14070600 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:36:29.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.226+0000 7fdcfb7fe700 1 -- 192.168.123.100:0/2767056864 shutdown_connections 2026-03-10T12:36:29.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.226+0000 7fdcfb7fe700 1 -- 192.168.123.100:0/2767056864 wait complete. 
2026-03-10T12:36:29.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.318+0000 7fe3c1c46700 1 -- 192.168.123.100:0/1345116930 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe3bc108990 msgr2=0x7fe3bc071fe0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.318+0000 7fe3c1c46700 1 --2- 192.168.123.100:0/1345116930 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe3bc108990 0x7fe3bc071fe0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fe3b400b210 tx=0x7fe3b400b520 comp rx=0 tx=0).stop 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.319+0000 7fe3c1c46700 1 -- 192.168.123.100:0/1345116930 shutdown_connections 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.319+0000 7fe3c1c46700 1 --2- 192.168.123.100:0/1345116930 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe3bc108990 0x7fe3bc071fe0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.319+0000 7fe3c1c46700 1 --2- 192.168.123.100:0/1345116930 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe3bc107fb0 0x7fe3bc1083c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.319+0000 7fe3c1c46700 1 -- 192.168.123.100:0/1345116930 >> 192.168.123.100:0/1345116930 conn(0x7fe3bc06d3e0 msgr2=0x7fe3bc06f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.319+0000 7fe3c1c46700 1 -- 192.168.123.100:0/1345116930 shutdown_connections 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.319+0000 7fe3c1c46700 1 -- 192.168.123.100:0/1345116930 
wait complete. 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.319+0000 7fe3c1c46700 1 Processor -- start 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.319+0000 7fe3c1c46700 1 -- start start 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.319+0000 7fe3c1c46700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe3bc107fb0 0x7fe3bc1a0820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.319+0000 7fe3c1c46700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe3bc108990 0x7fe3bc1a0d60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.319+0000 7fe3c1c46700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe3bc1a1380 con 0x7fe3bc108990 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.319+0000 7fe3c1c46700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe3bc1a14c0 con 0x7fe3bc107fb0 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.320+0000 7fe3baffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe3bc108990 0x7fe3bc1a0d60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.320+0000 7fe3baffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe3bc108990 0x7fe3bc1a0d60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 
says I am v2:192.168.123.100:52686/0 (socket says 192.168.123.100:52686) 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.320+0000 7fe3baffd700 1 -- 192.168.123.100:0/3427029102 learned_addr learned my addr 192.168.123.100:0/3427029102 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.320+0000 7fe3bb7fe700 1 --2- 192.168.123.100:0/3427029102 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe3bc107fb0 0x7fe3bc1a0820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.320+0000 7fe3baffd700 1 -- 192.168.123.100:0/3427029102 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe3bc107fb0 msgr2=0x7fe3bc1a0820 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.320+0000 7fe3baffd700 1 --2- 192.168.123.100:0/3427029102 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe3bc107fb0 0x7fe3bc1a0820 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.320+0000 7fe3baffd700 1 -- 192.168.123.100:0/3427029102 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe3b4009e30 con 0x7fe3bc108990 2026-03-10T12:36:29.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.320+0000 7fe3baffd700 1 --2- 192.168.123.100:0/3427029102 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe3bc108990 0x7fe3bc1a0d60 secure :-1 s=READY pgs=313 cs=0 l=1 rev1=1 crypto rx=0x7fe3b4000f80 tx=0x7fe3b4003ce0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T12:36:29.326 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.321+0000 7fe3b8ff9700 1 -- 192.168.123.100:0/3427029102 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe3b400e070 con 0x7fe3bc108990 2026-03-10T12:36:29.326 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.321+0000 7fe3b8ff9700 1 -- 192.168.123.100:0/3427029102 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe3b4009400 con 0x7fe3bc108990 2026-03-10T12:36:29.326 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.321+0000 7fe3b8ff9700 1 -- 192.168.123.100:0/3427029102 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe3b4012d00 con 0x7fe3bc108990 2026-03-10T12:36:29.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.321+0000 7fe3c1c46700 1 -- 192.168.123.100:0/3427029102 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe3bc1a5f10 con 0x7fe3bc108990 2026-03-10T12:36:29.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.321+0000 7fe3c1c46700 1 -- 192.168.123.100:0/3427029102 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe3bc1a6430 con 0x7fe3bc108990 2026-03-10T12:36:29.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.322+0000 7fe3a27fc700 1 -- 192.168.123.100:0/3427029102 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe39c0052f0 con 0x7fe3bc108990 2026-03-10T12:36:29.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.326+0000 7fe3b8ff9700 1 -- 192.168.123.100:0/3427029102 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fe3b4019040 con 0x7fe3bc108990 2026-03-10T12:36:29.327 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.326+0000 7fe3b8ff9700 1 --2- 192.168.123.100:0/3427029102 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe3a406c630 0x7fe3a406eae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:29.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.327+0000 7fe3b8ff9700 1 -- 192.168.123.100:0/3427029102 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fe3b408d360 con 0x7fe3bc108990 2026-03-10T12:36:29.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.327+0000 7fe3b8ff9700 1 -- 192.168.123.100:0/3427029102 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fe3b408d7f0 con 0x7fe3bc108990 2026-03-10T12:36:29.333 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.333+0000 7fe3bb7fe700 1 --2- 192.168.123.100:0/3427029102 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe3a406c630 0x7fe3a406eae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:29.340 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.340+0000 7fe3bb7fe700 1 --2- 192.168.123.100:0/3427029102 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe3a406c630 0x7fe3a406eae0 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7fe3bc071d10 tx=0x7fe3ac009500 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:36:29.483 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.479+0000 7fe3a27fc700 1 -- 192.168.123.100:0/3427029102 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fe39c005c90 con 0x7fe3bc108990 2026-03-10T12:36:29.484 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.484+0000 7fe3b8ff9700 1 -- 192.168.123.100:0/3427029102 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7fe3b405b5f0 con 0x7fe3bc108990 2026-03-10T12:36:29.484 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:36:29.484 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:36:29.485 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T12:36:29.485 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:36:29.485 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:36:29.485 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T12:36:29.485 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:36:29.485 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 2026-03-10T12:36:29.485 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T12:36:29.485 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:36:29.485 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 2026-03-10T12:36:29.485 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T12:36:29.485 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:36:29.485 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:36:29.485 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14 2026-03-10T12:36:29.485 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:36:29.485 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:36:29.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.488+0000 7fe3c1c46700 1 
-- 192.168.123.100:0/3427029102 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe3a406c630 msgr2=0x7fe3a406eae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.488+0000 7fe3c1c46700 1 --2- 192.168.123.100:0/3427029102 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe3a406c630 0x7fe3a406eae0 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7fe3bc071d10 tx=0x7fe3ac009500 comp rx=0 tx=0).stop 2026-03-10T12:36:29.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.488+0000 7fe3c1c46700 1 -- 192.168.123.100:0/3427029102 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe3bc108990 msgr2=0x7fe3bc1a0d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.488+0000 7fe3c1c46700 1 --2- 192.168.123.100:0/3427029102 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe3bc108990 0x7fe3bc1a0d60 secure :-1 s=READY pgs=313 cs=0 l=1 rev1=1 crypto rx=0x7fe3b4000f80 tx=0x7fe3b4003ce0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.490+0000 7fe3c1c46700 1 -- 192.168.123.100:0/3427029102 shutdown_connections 2026-03-10T12:36:29.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.490+0000 7fe3c1c46700 1 --2- 192.168.123.100:0/3427029102 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fe3a406c630 0x7fe3a406eae0 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.490+0000 7fe3c1c46700 1 --2- 192.168.123.100:0/3427029102 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe3bc107fb0 0x7fe3bc1a0820 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.491 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.490+0000 7fe3c1c46700 1 --2- 192.168.123.100:0/3427029102 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe3bc108990 0x7fe3bc1a0d60 unknown :-1 s=CLOSED pgs=313 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.491 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.490+0000 7fe3c1c46700 1 -- 192.168.123.100:0/3427029102 >> 192.168.123.100:0/3427029102 conn(0x7fe3bc06d3e0 msgr2=0x7fe3bc10b280 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:36:29.491 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.490+0000 7fe3c1c46700 1 -- 192.168.123.100:0/3427029102 shutdown_connections 2026-03-10T12:36:29.491 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.490+0000 7fe3c1c46700 1 -- 192.168.123.100:0/3427029102 wait complete. 2026-03-10T12:36:29.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.578+0000 7f30d9014700 1 -- 192.168.123.100:0/2769829160 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f30d4072360 msgr2=0x7f30d40770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.578+0000 7f30d9014700 1 --2- 192.168.123.100:0/2769829160 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f30d4072360 0x7f30d40770e0 secure :-1 s=READY pgs=314 cs=0 l=1 rev1=1 crypto rx=0x7f30cc00b600 tx=0x7f30cc00b910 comp rx=0 tx=0).stop 2026-03-10T12:36:29.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.579+0000 7f30d9014700 1 -- 192.168.123.100:0/2769829160 shutdown_connections 2026-03-10T12:36:29.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.579+0000 7f30d9014700 1 --2- 192.168.123.100:0/2769829160 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f30d4072360 0x7f30d40770e0 unknown :-1 s=CLOSED pgs=314 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:36:29.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.579+0000 7f30d9014700 1 --2- 192.168.123.100:0/2769829160 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f30d4071980 0x7f30d4071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.579+0000 7f30d9014700 1 -- 192.168.123.100:0/2769829160 >> 192.168.123.100:0/2769829160 conn(0x7f30d406d1a0 msgr2=0x7f30d406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:36:29.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.579+0000 7f30d9014700 1 -- 192.168.123.100:0/2769829160 shutdown_connections 2026-03-10T12:36:29.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.579+0000 7f30d9014700 1 -- 192.168.123.100:0/2769829160 wait complete. 2026-03-10T12:36:29.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.579+0000 7f30d9014700 1 Processor -- start 2026-03-10T12:36:29.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.579+0000 7f30d9014700 1 -- start start 2026-03-10T12:36:29.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.580+0000 7f30d9014700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f30d4071980 0x7f30d4082620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:29.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.580+0000 7f30d9014700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f30d4082b60 0x7f30d4082fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:29.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.580+0000 7f30d9014700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f30d41b2a90 con 0x7f30d4071980 2026-03-10T12:36:29.581 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.580+0000 7f30d9014700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f30d41b2bd0 con 0x7f30d4082b60 2026-03-10T12:36:29.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.580+0000 7f30d2d9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f30d4071980 0x7f30d4082620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:29.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.580+0000 7f30d2d9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f30d4071980 0x7f30d4082620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:52704/0 (socket says 192.168.123.100:52704) 2026-03-10T12:36:29.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.580+0000 7f30d2d9d700 1 -- 192.168.123.100:0/667749347 learned_addr learned my addr 192.168.123.100:0/667749347 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:36:29.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.580+0000 7f30d259c700 1 --2- 192.168.123.100:0/667749347 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f30d4082b60 0x7f30d4082fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:29.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.580+0000 7f30d2d9d700 1 -- 192.168.123.100:0/667749347 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f30d4082b60 msgr2=0x7f30d4082fd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.580+0000 7f30d2d9d700 1 --2- 
192.168.123.100:0/667749347 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f30d4082b60 0x7f30d4082fd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.580+0000 7f30d2d9d700 1 -- 192.168.123.100:0/667749347 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f30cc00b050 con 0x7f30d4071980 2026-03-10T12:36:29.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.580+0000 7f30d2d9d700 1 --2- 192.168.123.100:0/667749347 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f30d4071980 0x7f30d4082620 secure :-1 s=READY pgs=315 cs=0 l=1 rev1=1 crypto rx=0x7f30c400baa0 tx=0x7f30c400be60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:36:29.582 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.581+0000 7f30bbfff700 1 -- 192.168.123.100:0/667749347 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f30c400c760 con 0x7f30d4071980 2026-03-10T12:36:29.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.581+0000 7f30d9014700 1 -- 192.168.123.100:0/667749347 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f30d41b2e30 con 0x7f30d4071980 2026-03-10T12:36:29.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.581+0000 7f30d9014700 1 -- 192.168.123.100:0/667749347 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f30d41b32f0 con 0x7f30d4071980 2026-03-10T12:36:29.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.583+0000 7f30bbfff700 1 -- 192.168.123.100:0/667749347 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f30c400cda0 con 0x7f30d4071980 2026-03-10T12:36:29.584 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.583+0000 7f30bbfff700 1 -- 192.168.123.100:0/667749347 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f30c4012550 con 0x7f30d4071980 2026-03-10T12:36:29.584 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.584+0000 7f30bbfff700 1 -- 192.168.123.100:0/667749347 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f30c4012770 con 0x7f30d4071980 2026-03-10T12:36:29.584 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.584+0000 7f30bbfff700 1 --2- 192.168.123.100:0/667749347 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f30bc06ea90 0x7f30bc070f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:29.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.584+0000 7f30bbfff700 1 -- 192.168.123.100:0/667749347 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f30c408bce0 con 0x7f30d4071980 2026-03-10T12:36:29.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.584+0000 7f30d259c700 1 --2- 192.168.123.100:0/667749347 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f30bc06ea90 0x7f30bc070f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:29.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.585+0000 7f30d259c700 1 --2- 192.168.123.100:0/667749347 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f30bc06ea90 0x7f30bc070f40 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f30cc00bd90 tx=0x7f30cc007c00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:36:29.588 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.585+0000 7f30d9014700 1 -- 
192.168.123.100:0/667749347 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f30c0005320 con 0x7f30d4071980 2026-03-10T12:36:29.588 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.588+0000 7f30bbfff700 1 -- 192.168.123.100:0/667749347 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f30c40564e0 con 0x7f30d4071980 2026-03-10T12:36:29.720 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.719+0000 7f30d9014700 1 -- 192.168.123.100:0/667749347 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f30c0006200 con 0x7f30d4071980 2026-03-10T12:36:29.721 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.720+0000 7f30bbfff700 1 -- 192.168.123.100:0/667749347 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1836 (secure 0 0 0) 0x7f30c4059b00 con 0x7f30d4071980 2026-03-10T12:36:29.721 INFO:teuthology.orchestra.run.vm00.stdout:e11 2026-03-10T12:36:29.721 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:36:29.721 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:36:29.721 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 2026-03-10T12:36:29.721 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 2026-03-10T12:36:29.722 
INFO:teuthology.orchestra.run.vm00.stdout:epoch 10 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:35:17.532287+0000 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307} 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:failed 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:36:29.722 
INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons: 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:36:29.722 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:36:29.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.723+0000 7f30b9ffb700 1 -- 192.168.123.100:0/667749347 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f30bc06ea90 msgr2=0x7f30bc070f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.724 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.723+0000 7f30b9ffb700 1 --2- 192.168.123.100:0/667749347 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f30bc06ea90 0x7f30bc070f40 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f30cc00bd90 tx=0x7f30cc007c00 comp rx=0 tx=0).stop 2026-03-10T12:36:29.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.723+0000 7f30b9ffb700 1 -- 192.168.123.100:0/667749347 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f30d4071980 msgr2=0x7f30d4082620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.723+0000 7f30b9ffb700 1 --2- 192.168.123.100:0/667749347 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f30d4071980 0x7f30d4082620 secure :-1 s=READY pgs=315 cs=0 l=1 rev1=1 crypto rx=0x7f30c400baa0 tx=0x7f30c400be60 comp rx=0 tx=0).stop 2026-03-10T12:36:29.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.724+0000 7f30b9ffb700 1 -- 192.168.123.100:0/667749347 shutdown_connections 2026-03-10T12:36:29.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.724+0000 7f30b9ffb700 1 --2- 192.168.123.100:0/667749347 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f30bc06ea90 0x7f30bc070f40 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.724+0000 7f30b9ffb700 1 --2- 192.168.123.100:0/667749347 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f30d4071980 0x7f30d4082620 unknown :-1 s=CLOSED pgs=315 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.724+0000 7f30b9ffb700 1 --2- 192.168.123.100:0/667749347 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f30d4082b60 0x7f30d4082fd0 unknown 
:-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.724+0000 7f30b9ffb700 1 -- 192.168.123.100:0/667749347 >> 192.168.123.100:0/667749347 conn(0x7f30d406d1a0 msgr2=0x7f30d40764e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:36:29.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.724+0000 7f30b9ffb700 1 -- 192.168.123.100:0/667749347 shutdown_connections 2026-03-10T12:36:29.725 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.724+0000 7f30b9ffb700 1 -- 192.168.123.100:0/667749347 wait complete. 2026-03-10T12:36:29.725 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 11 2026-03-10T12:36:29.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.799+0000 7f552d7e8700 1 -- 192.168.123.100:0/180989486 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5528071980 msgr2=0x7f5528071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.799+0000 7f552d7e8700 1 --2- 192.168.123.100:0/180989486 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5528071980 0x7f5528071d90 secure :-1 s=READY pgs=316 cs=0 l=1 rev1=1 crypto rx=0x7f5518009b50 tx=0x7f5518009e60 comp rx=0 tx=0).stop 2026-03-10T12:36:29.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.799+0000 7f552d7e8700 1 -- 192.168.123.100:0/180989486 shutdown_connections 2026-03-10T12:36:29.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.799+0000 7f552d7e8700 1 --2- 192.168.123.100:0/180989486 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5528072360 0x7f55280770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.799+0000 7f552d7e8700 1 --2- 192.168.123.100:0/180989486 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5528071980 0x7f5528071d90 unknown :-1 s=CLOSED pgs=316 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.799+0000 7f552d7e8700 1 -- 192.168.123.100:0/180989486 >> 192.168.123.100:0/180989486 conn(0x7f552806d1a0 msgr2=0x7f552806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:36:29.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.800+0000 7f552d7e8700 1 -- 192.168.123.100:0/180989486 shutdown_connections 2026-03-10T12:36:29.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.800+0000 7f552d7e8700 1 -- 192.168.123.100:0/180989486 wait complete. 2026-03-10T12:36:29.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.800+0000 7f552d7e8700 1 Processor -- start 2026-03-10T12:36:29.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.800+0000 7f552d7e8700 1 -- start start 2026-03-10T12:36:29.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.800+0000 7f552d7e8700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5528072360 0x7f5528082530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:29.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.800+0000 7f552d7e8700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5528082a70 0x7f5528082ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:29.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.800+0000 7f552d7e8700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f55281b2a90 con 0x7f5528082a70 2026-03-10T12:36:29.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.800+0000 7f552d7e8700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7f55281b2bd0 con 0x7f5528072360 2026-03-10T12:36:29.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.801+0000 7f55267fc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5528082a70 0x7f5528082ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:29.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.801+0000 7f55267fc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5528082a70 0x7f5528082ee0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:52732/0 (socket says 192.168.123.100:52732) 2026-03-10T12:36:29.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.801+0000 7f55267fc700 1 -- 192.168.123.100:0/798290142 learned_addr learned my addr 192.168.123.100:0/798290142 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:36:29.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.801+0000 7f55267fc700 1 -- 192.168.123.100:0/798290142 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5528072360 msgr2=0x7f5528082530 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:36:29.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.801+0000 7f55267fc700 1 --2- 192.168.123.100:0/798290142 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5528072360 0x7f5528082530 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.801+0000 7f55267fc700 1 -- 192.168.123.100:0/798290142 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f55180097e0 con 0x7f5528082a70 2026-03-10T12:36:29.802 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.801+0000 7f55267fc700 1 --2- 192.168.123.100:0/798290142 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5528082a70 0x7f5528082ee0 secure :-1 s=READY pgs=317 cs=0 l=1 rev1=1 crypto rx=0x7f5520009f20 tx=0x7f552000bef0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:36:29.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.802+0000 7f550ffff700 1 -- 192.168.123.100:0/798290142 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f552000cb40 con 0x7f5528082a70 2026-03-10T12:36:29.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.802+0000 7f552d7e8700 1 -- 192.168.123.100:0/798290142 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f55281b2d70 con 0x7f5528082a70 2026-03-10T12:36:29.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.802+0000 7f552d7e8700 1 -- 192.168.123.100:0/798290142 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f55281b3290 con 0x7f5528082a70 2026-03-10T12:36:29.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.804+0000 7f550ffff700 1 -- 192.168.123.100:0/798290142 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f552000cca0 con 0x7f5528082a70 2026-03-10T12:36:29.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.804+0000 7f550ffff700 1 -- 192.168.123.100:0/798290142 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f55200127b0 con 0x7f5528082a70 2026-03-10T12:36:29.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.805+0000 7f550ffff700 1 -- 192.168.123.100:0/798290142 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f55200129d0 con 0x7f5528082a70 
2026-03-10T12:36:29.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.807+0000 7f550ffff700 1 --2- 192.168.123.100:0/798290142 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f551006ea90 0x7f5510070f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:29.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.808+0000 7f5526ffd700 1 --2- 192.168.123.100:0/798290142 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f551006ea90 0x7f5510070f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:29.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.808+0000 7f550ffff700 1 -- 192.168.123.100:0/798290142 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f552008c140 con 0x7f5528082a70 2026-03-10T12:36:29.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.808+0000 7f5526ffd700 1 --2- 192.168.123.100:0/798290142 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f551006ea90 0x7f5510070f40 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f5518009b20 tx=0x7f55180058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:36:29.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.808+0000 7f552d7e8700 1 -- 192.168.123.100:0/798290142 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5514005320 con 0x7f5528082a70 2026-03-10T12:36:29.811 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.811+0000 7f550ffff700 1 -- 192.168.123.100:0/798290142 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f552004e720 con 0x7f5528082a70 
2026-03-10T12:36:29.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.926+0000 7f552d7e8700 1 -- 192.168.123.100:0/798290142 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f5514000bf0 con 0x7f551006ea90 2026-03-10T12:36:29.935 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:36:29.935 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-10T12:36:29.935 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true, 2026-03-10T12:36:29.935 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T12:36:29.935 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [], 2026-03-10T12:36:29.935 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "", 2026-03-10T12:36:29.935 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-10T12:36:29.935 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false 2026-03-10T12:36:29.935 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:36:29.935 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.933+0000 7f550ffff700 1 -- 192.168.123.100:0/798290142 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f5514000bf0 con 0x7f551006ea90 2026-03-10T12:36:29.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.938+0000 7f550dffb700 1 -- 192.168.123.100:0/798290142 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f551006ea90 msgr2=0x7f5510070f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.938+0000 7f550dffb700 1 --2- 192.168.123.100:0/798290142 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f551006ea90 0x7f5510070f40 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f5518009b20 tx=0x7f55180058e0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.938+0000 7f550dffb700 1 -- 192.168.123.100:0/798290142 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5528082a70 msgr2=0x7f5528082ee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:29.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.938+0000 7f550dffb700 1 --2- 192.168.123.100:0/798290142 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5528082a70 0x7f5528082ee0 secure :-1 s=READY pgs=317 cs=0 l=1 rev1=1 crypto rx=0x7f5520009f20 tx=0x7f552000bef0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.939 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.938+0000 7f550dffb700 1 -- 192.168.123.100:0/798290142 shutdown_connections 2026-03-10T12:36:29.939 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.938+0000 7f550dffb700 1 --2- 192.168.123.100:0/798290142 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f551006ea90 0x7f5510070f40 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.939 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.938+0000 7f550dffb700 1 --2- 192.168.123.100:0/798290142 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5528072360 0x7f5528082530 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.939 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.938+0000 7f550dffb700 1 --2- 192.168.123.100:0/798290142 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5528082a70 0x7f5528082ee0 unknown :-1 s=CLOSED pgs=317 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:29.939 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.938+0000 7f550dffb700 1 -- 192.168.123.100:0/798290142 >> 192.168.123.100:0/798290142 conn(0x7f552806d1a0 msgr2=0x7f5528076480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:36:29.939 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.939+0000 7f550dffb700 1 -- 192.168.123.100:0/798290142 shutdown_connections 2026-03-10T12:36:29.939 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:29.939+0000 7f550dffb700 1 -- 192.168.123.100:0/798290142 wait complete. 2026-03-10T12:36:30.026 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.025+0000 7fb9f82f1700 1 -- 192.168.123.100:0/669582245 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f0071980 msgr2=0x7fb9f0071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:30.026 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.025+0000 7fb9f82f1700 1 --2- 192.168.123.100:0/669582245 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f0071980 0x7fb9f0071d90 secure :-1 s=READY pgs=318 cs=0 l=1 rev1=1 crypto rx=0x7fb9ec009b50 tx=0x7fb9ec009e60 comp rx=0 tx=0).stop 2026-03-10T12:36:30.026 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.025+0000 7fb9f82f1700 1 -- 192.168.123.100:0/669582245 shutdown_connections 2026-03-10T12:36:30.026 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.025+0000 7fb9f82f1700 1 --2- 192.168.123.100:0/669582245 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9f0072360 0x7fb9f00770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:30.026 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.025+0000 7fb9f82f1700 1 --2- 192.168.123.100:0/669582245 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f0071980 0x7fb9f0071d90 unknown :-1 s=CLOSED pgs=318 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:36:30.026 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.025+0000 7fb9f82f1700 1 -- 192.168.123.100:0/669582245 >> 192.168.123.100:0/669582245 conn(0x7fb9f006d1a0 msgr2=0x7fb9f006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:36:30.027 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.026+0000 7fb9f82f1700 1 -- 192.168.123.100:0/669582245 shutdown_connections 2026-03-10T12:36:30.027 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.026+0000 7fb9f82f1700 1 -- 192.168.123.100:0/669582245 wait complete. 2026-03-10T12:36:30.027 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.026+0000 7fb9f82f1700 1 Processor -- start 2026-03-10T12:36:30.027 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.026+0000 7fb9f82f1700 1 -- start start 2026-03-10T12:36:30.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.026+0000 7fb9f82f1700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9f0072360 0x7fb9f0082500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:30.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.026+0000 7fb9f82f1700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f0082a40 0x7fb9f0082eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:30.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.026+0000 7fb9f82f1700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb9f01b2a90 con 0x7fb9f0082a40 2026-03-10T12:36:30.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.026+0000 7fb9f82f1700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb9f01b2bd0 con 0x7fb9f0072360 2026-03-10T12:36:30.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.026+0000 7fb9f588c700 1 --2- >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f0082a40 0x7fb9f0082eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:30.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.026+0000 7fb9f588c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f0082a40 0x7fb9f0082eb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:52738/0 (socket says 192.168.123.100:52738) 2026-03-10T12:36:30.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.026+0000 7fb9f588c700 1 -- 192.168.123.100:0/458902993 learned_addr learned my addr 192.168.123.100:0/458902993 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:36:30.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.027+0000 7fb9f588c700 1 -- 192.168.123.100:0/458902993 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9f0072360 msgr2=0x7fb9f0082500 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:36:30.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.027+0000 7fb9f588c700 1 --2- 192.168.123.100:0/458902993 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9f0072360 0x7fb9f0082500 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:30.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.027+0000 7fb9f588c700 1 -- 192.168.123.100:0/458902993 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb9ec0097e0 con 0x7fb9f0082a40 2026-03-10T12:36:30.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.027+0000 7fb9f588c700 1 --2- 192.168.123.100:0/458902993 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f0082a40 0x7fb9f0082eb0 
secure :-1 s=READY pgs=319 cs=0 l=1 rev1=1 crypto rx=0x7fb9e800c3e0 tx=0x7fb9e800c7a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:36:30.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.028+0000 7fb9e77fe700 1 -- 192.168.123.100:0/458902993 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb9e800e050 con 0x7fb9f0082a40 2026-03-10T12:36:30.030 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.028+0000 7fb9f82f1700 1 -- 192.168.123.100:0/458902993 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb9f01b2d10 con 0x7fb9f0082a40 2026-03-10T12:36:30.030 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.028+0000 7fb9f82f1700 1 -- 192.168.123.100:0/458902993 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb9f01b3200 con 0x7fb9f0082a40 2026-03-10T12:36:30.030 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.030+0000 7fb9e77fe700 1 -- 192.168.123.100:0/458902993 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb9e800f040 con 0x7fb9f0082a40 2026-03-10T12:36:30.030 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.030+0000 7fb9e77fe700 1 -- 192.168.123.100:0/458902993 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb9e8013610 con 0x7fb9f0082a40 2026-03-10T12:36:30.031 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.030+0000 7fb9e77fe700 1 -- 192.168.123.100:0/458902993 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb9e80137b0 con 0x7fb9f0082a40 2026-03-10T12:36:30.031 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.031+0000 7fb9e77fe700 1 --2- 192.168.123.100:0/458902993 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb9dc06ea90 
0x7fb9dc070f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:36:30.032 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.031+0000 7fb9f608d700 1 --2- 192.168.123.100:0/458902993 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb9dc06ea90 0x7fb9dc070f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:36:30.032 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.031+0000 7fb9e77fe700 1 -- 192.168.123.100:0/458902993 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb9e808c800 con 0x7fb9f0082a40 2026-03-10T12:36:30.032 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.032+0000 7fb9f608d700 1 --2- 192.168.123.100:0/458902993 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb9dc06ea90 0x7fb9dc070f40 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7fb9ec005cb0 tx=0x7fb9ec005be0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:36:30.032 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.032+0000 7fb9f82f1700 1 -- 192.168.123.100:0/458902993 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb9d4005320 con 0x7fb9f0082a40 2026-03-10T12:36:30.036 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.035+0000 7fb9e77fe700 1 -- 192.168.123.100:0/458902993 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb9e805aa90 con 0x7fb9f0082a40 2026-03-10T12:36:30.187 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.186+0000 7fb9f82f1700 1 -- 192.168.123.100:0/458902993 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": 
"health", "detail": "detail"} v 0) v1 -- 0x7fb9d4005190 con 0x7fb9f0082a40 2026-03-10T12:36:30.188 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_OK 2026-03-10T12:36:30.188 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.186+0000 7fb9e77fe700 1 -- 192.168.123.100:0/458902993 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7fb9e805a620 con 0x7fb9f0082a40 2026-03-10T12:36:30.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.188+0000 7fb9e57ba700 1 -- 192.168.123.100:0/458902993 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb9dc06ea90 msgr2=0x7fb9dc070f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:30.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.189+0000 7fb9e57ba700 1 --2- 192.168.123.100:0/458902993 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb9dc06ea90 0x7fb9dc070f40 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7fb9ec005cb0 tx=0x7fb9ec005be0 comp rx=0 tx=0).stop 2026-03-10T12:36:30.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.189+0000 7fb9e57ba700 1 -- 192.168.123.100:0/458902993 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f0082a40 msgr2=0x7fb9f0082eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:36:30.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.189+0000 7fb9e57ba700 1 --2- 192.168.123.100:0/458902993 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f0082a40 0x7fb9f0082eb0 secure :-1 s=READY pgs=319 cs=0 l=1 rev1=1 crypto rx=0x7fb9e800c3e0 tx=0x7fb9e800c7a0 comp rx=0 tx=0).stop 2026-03-10T12:36:30.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.189+0000 7fb9e57ba700 1 -- 192.168.123.100:0/458902993 shutdown_connections 2026-03-10T12:36:30.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.189+0000 
7fb9e57ba700 1 --2- 192.168.123.100:0/458902993 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb9dc06ea90 0x7fb9dc070f40 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:30.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.189+0000 7fb9e57ba700 1 --2- 192.168.123.100:0/458902993 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb9f0072360 0x7fb9f0082500 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:30.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.189+0000 7fb9e57ba700 1 --2- 192.168.123.100:0/458902993 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb9f0082a40 0x7fb9f0082eb0 unknown :-1 s=CLOSED pgs=319 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:36:30.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.189+0000 7fb9e57ba700 1 -- 192.168.123.100:0/458902993 >> 192.168.123.100:0/458902993 conn(0x7fb9f006d1a0 msgr2=0x7fb9f00705c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:36:30.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.189+0000 7fb9e57ba700 1 -- 192.168.123.100:0/458902993 shutdown_connections 2026-03-10T12:36:30.189 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:36:30.189+0000 7fb9e57ba700 1 -- 192.168.123.100:0/458902993 wait complete. 
2026-03-10T12:36:30.431 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:30 vm00.local ceph-mon[50686]: from='client.14604 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:36:30.431 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:30 vm00.local ceph-mon[50686]: from='client.24401 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:36:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:30 vm00.local ceph-mon[50686]: pgmap v120: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:30 vm00.local ceph-mon[50686]: from='client.24405 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:36:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:30 vm00.local ceph-mon[50686]: from='client.? 192.168.123.100:0/3427029102' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:36:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:30 vm00.local ceph-mon[50686]: from='client.? 192.168.123.100:0/667749347' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:36:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:30 vm00.local ceph-mon[50686]: from='client.? 
192.168.123.100:0/458902993' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:36:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:30 vm07.local ceph-mon[58582]: from='client.14604 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:36:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:30 vm07.local ceph-mon[58582]: from='client.24401 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:36:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:30 vm07.local ceph-mon[58582]: pgmap v120: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:30 vm07.local ceph-mon[58582]: from='client.24405 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:36:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:30 vm07.local ceph-mon[58582]: from='client.? 192.168.123.100:0/3427029102' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:36:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:30 vm07.local ceph-mon[58582]: from='client.? 192.168.123.100:0/667749347' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:36:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:30 vm07.local ceph-mon[58582]: from='client.? 
192.168.123.100:0/458902993' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:36:31.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:31 vm00.local ceph-mon[50686]: from='client.14622 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:36:31.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:31 vm07.local ceph-mon[58582]: from='client.14622 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:36:32.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:32 vm00.local ceph-mon[50686]: pgmap v121: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:32.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:32 vm07.local ceph-mon[58582]: pgmap v121: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:34.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:34 vm00.local ceph-mon[50686]: pgmap v122: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:34.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:34 vm07.local ceph-mon[58582]: pgmap v122: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:36.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:36 vm00.local ceph-mon[50686]: pgmap v123: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:36 vm07.local ceph-mon[58582]: pgmap v123: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:38 vm00.local ceph-mon[50686]: pgmap v124: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:38.734 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:38 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:36:39.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:38 vm07.local ceph-mon[58582]: pgmap v124: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:39.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:38 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:36:40.423 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:40 vm00.local ceph-mon[50686]: pgmap v125: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:40.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:40 vm07.local ceph-mon[58582]: pgmap v125: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:42.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:42 vm00.local ceph-mon[50686]: pgmap v126: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:42.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:42 vm07.local ceph-mon[58582]: pgmap v126: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr:Note: switching to '75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b'. 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr: 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr:You are in 'detached HEAD' state. 
You can look around, make experimental 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr:state without impacting any branches by switching back to a branch. 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr: 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr:do so (now or later) by using -c with the switch command. Example: 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr: 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr: git switch -c 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr: 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr:Or undo this operation with: 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr: 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr: git switch - 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr: 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr: 2026-03-10T12:36:43.541 INFO:tasks.workunit.client.1.vm07.stderr:HEAD is now at 75a68fd8ca3 qa/suites/orch/cephadm/osds: drop nvme_loop task 2026-03-10T12:36:43.546 DEBUG:teuthology.orchestra.run.vm07:> cd -- /home/ubuntu/cephtest/clone.client.1/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.1 2026-03-10T12:36:43.605 INFO:tasks.workunit.client.1.vm07.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-10T12:36:43.607 INFO:tasks.workunit.client.1.vm07.stdout:make[1]: Entering directory 
'/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-10T12:36:43.607 INFO:tasks.workunit.client.1.vm07.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-10T12:36:43.652 INFO:tasks.workunit.client.1.vm07.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-10T12:36:43.686 INFO:tasks.workunit.client.1.vm07.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-10T12:36:43.716 INFO:tasks.workunit.client.1.vm07.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-10T12:36:43.717 INFO:tasks.workunit.client.1.vm07.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-10T12:36:43.717 INFO:tasks.workunit.client.1.vm07.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-10T12:36:43.745 INFO:tasks.workunit.client.1.vm07.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-10T12:36:43.748 DEBUG:teuthology.orchestra.run.vm07:> set -ex 2026-03-10T12:36:43.748 DEBUG:teuthology.orchestra.run.vm07:> dd if=/home/ubuntu/cephtest/workunits.list.client.1 of=/dev/stdout 2026-03-10T12:36:43.804 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.1... 2026-03-10T12:36:43.805 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 
2026-03-10T12:36:43.805 DEBUG:teuthology.orchestra.run.vm07:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && cd -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="1" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.1 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.1 CEPH_MNT=/home/ubuntu/cephtest/mnt.1 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.1/qa/workunits/suites/fsstress.sh 2026-03-10T12:36:43.874 INFO:tasks.workunit.client.1.vm07.stderr:+ mkdir -p fsstress 2026-03-10T12:36:43.876 INFO:tasks.workunit.client.1.vm07.stderr:+ pushd fsstress 2026-03-10T12:36:43.878 INFO:tasks.workunit.client.1.vm07.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-10T12:36:43.878 INFO:tasks.workunit.client.1.vm07.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz 2026-03-10T12:36:44.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:44 vm00.local ceph-mon[50686]: pgmap v127: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:44 vm07.local ceph-mon[58582]: pgmap v127: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:45.340 INFO:tasks.workunit.client.0.vm00.stderr:Note: switching to '75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b'. 2026-03-10T12:36:45.340 INFO:tasks.workunit.client.0.vm00.stderr: 2026-03-10T12:36:45.340 INFO:tasks.workunit.client.0.vm00.stderr:You are in 'detached HEAD' state. 
You can look around, make experimental 2026-03-10T12:36:45.340 INFO:tasks.workunit.client.0.vm00.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-10T12:36:45.340 INFO:tasks.workunit.client.0.vm00.stderr:state without impacting any branches by switching back to a branch. 2026-03-10T12:36:45.340 INFO:tasks.workunit.client.0.vm00.stderr: 2026-03-10T12:36:45.340 INFO:tasks.workunit.client.0.vm00.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-10T12:36:45.340 INFO:tasks.workunit.client.0.vm00.stderr:do so (now or later) by using -c with the switch command. Example: 2026-03-10T12:36:45.340 INFO:tasks.workunit.client.0.vm00.stderr: 2026-03-10T12:36:45.341 INFO:tasks.workunit.client.0.vm00.stderr: git switch -c 2026-03-10T12:36:45.341 INFO:tasks.workunit.client.0.vm00.stderr: 2026-03-10T12:36:45.341 INFO:tasks.workunit.client.0.vm00.stderr:Or undo this operation with: 2026-03-10T12:36:45.341 INFO:tasks.workunit.client.0.vm00.stderr: 2026-03-10T12:36:45.341 INFO:tasks.workunit.client.0.vm00.stderr: git switch - 2026-03-10T12:36:45.341 INFO:tasks.workunit.client.0.vm00.stderr: 2026-03-10T12:36:45.341 INFO:tasks.workunit.client.0.vm00.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-10T12:36:45.341 INFO:tasks.workunit.client.0.vm00.stderr: 2026-03-10T12:36:45.341 INFO:tasks.workunit.client.0.vm00.stderr:HEAD is now at 75a68fd8ca3 qa/suites/orch/cephadm/osds: drop nvme_loop task 2026-03-10T12:36:45.345 DEBUG:teuthology.orchestra.run.vm00:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-10T12:36:45.373 INFO:tasks.workunit.client.1.vm07.stderr:+ tar xzf ltp-full.tgz 2026-03-10T12:36:45.412 INFO:tasks.workunit.client.0.vm00.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-10T12:36:45.436 
INFO:tasks.workunit.client.0.vm00.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-10T12:36:45.436 INFO:tasks.workunit.client.0.vm00.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-10T12:36:45.487 INFO:tasks.workunit.client.0.vm00.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-10T12:36:45.535 INFO:tasks.workunit.client.0.vm00.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-10T12:36:45.567 INFO:tasks.workunit.client.0.vm00.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-10T12:36:45.569 INFO:tasks.workunit.client.0.vm00.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-10T12:36:45.569 INFO:tasks.workunit.client.0.vm00.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-10T12:36:45.598 INFO:tasks.workunit.client.0.vm00.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-10T12:36:45.602 DEBUG:teuthology.orchestra.run.vm00:> set -ex 2026-03-10T12:36:45.602 DEBUG:teuthology.orchestra.run.vm00:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-10T12:36:45.670 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.0... 2026-03-10T12:36:45.674 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 
2026-03-10T12:36:45.675 DEBUG:teuthology.orchestra.run.vm00:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/fsstress.sh 2026-03-10T12:36:45.744 INFO:tasks.workunit.client.0.vm00.stderr:+ mkdir -p fsstress 2026-03-10T12:36:45.751 INFO:tasks.workunit.client.0.vm00.stderr:+ pushd fsstress 2026-03-10T12:36:45.757 INFO:tasks.workunit.client.0.vm00.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-10T12:36:45.757 INFO:tasks.workunit.client.0.vm00.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz 2026-03-10T12:36:46.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:46 vm07.local ceph-mon[58582]: pgmap v128: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:46.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:46 vm00.local ceph-mon[50686]: pgmap v128: 65 pgs: 65 active+clean; 462 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T12:36:47.523 INFO:tasks.workunit.client.0.vm00.stderr:+ tar xzf ltp-full.tgz 2026-03-10T12:36:47.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:47 vm00.local ceph-mon[50686]: pgmap v129: 65 pgs: 65 active+clean; 893 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 38 KiB/s wr, 4 op/s 2026-03-10T12:36:48.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:47 vm07.local ceph-mon[58582]: pgmap v129: 65 pgs: 65 active+clean; 893 KiB data, 165 MiB used, 120 GiB / 120 GiB avail; 38 
KiB/s wr, 4 op/s 2026-03-10T12:36:51.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:50 vm07.local ceph-mon[58582]: pgmap v130: 65 pgs: 65 active+clean; 8.5 MiB data, 184 MiB used, 120 GiB / 120 GiB avail; 712 KiB/s wr, 37 op/s 2026-03-10T12:36:51.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:50 vm00.local ceph-mon[50686]: pgmap v130: 65 pgs: 65 active+clean; 8.5 MiB data, 184 MiB used, 120 GiB / 120 GiB avail; 712 KiB/s wr, 37 op/s 2026-03-10T12:36:52.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:52 vm07.local ceph-mon[58582]: pgmap v131: 65 pgs: 65 active+clean; 33 MiB data, 261 MiB used, 120 GiB / 120 GiB avail; 2.8 MiB/s wr, 194 op/s 2026-03-10T12:36:52.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:52 vm00.local ceph-mon[50686]: pgmap v131: 65 pgs: 65 active+clean; 33 MiB data, 261 MiB used, 120 GiB / 120 GiB avail; 2.8 MiB/s wr, 194 op/s 2026-03-10T12:36:53.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:53 vm07.local ceph-mon[58582]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:36:53.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:53 vm00.local ceph-mon[50686]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:36:54.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:54 vm07.local ceph-mon[58582]: pgmap v132: 65 pgs: 65 active+clean; 34 MiB data, 278 MiB used, 120 GiB / 120 GiB avail; 2.9 MiB/s wr, 216 op/s 2026-03-10T12:36:54.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:54 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:36:55.236 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:54 vm00.local ceph-mon[50686]: pgmap v132: 65 pgs: 65 active+clean; 34 MiB data, 278 MiB used, 120 GiB / 120 GiB avail; 2.9 MiB/s wr, 216 op/s 
2026-03-10T12:36:55.236 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:54 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:36:56.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:56 vm00.local ceph-mon[50686]: pgmap v133: 65 pgs: 65 active+clean; 50 MiB data, 353 MiB used, 120 GiB / 120 GiB avail; 4.3 MiB/s wr, 359 op/s 2026-03-10T12:36:56.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:56 vm07.local ceph-mon[58582]: pgmap v133: 65 pgs: 65 active+clean; 50 MiB data, 353 MiB used, 120 GiB / 120 GiB avail; 4.3 MiB/s wr, 359 op/s 2026-03-10T12:36:58.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:58 vm00.local ceph-mon[50686]: pgmap v134: 65 pgs: 65 active+clean; 54 MiB data, 427 MiB used, 120 GiB / 120 GiB avail; 4.7 MiB/s wr, 425 op/s 2026-03-10T12:36:58.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:58 vm07.local ceph-mon[58582]: pgmap v134: 65 pgs: 65 active+clean; 54 MiB data, 427 MiB used, 120 GiB / 120 GiB avail; 4.7 MiB/s wr, 425 op/s 2026-03-10T12:37:00.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:36:59 vm07.local ceph-mon[58582]: pgmap v135: 65 pgs: 65 active+clean; 61 MiB data, 463 MiB used, 120 GiB / 120 GiB avail; 5.2 MiB/s wr, 458 op/s 2026-03-10T12:37:00.360 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:36:59 vm00.local ceph-mon[50686]: pgmap v135: 65 pgs: 65 active+clean; 61 MiB data, 463 MiB used, 120 GiB / 120 GiB avail; 5.2 MiB/s wr, 458 op/s 2026-03-10T12:37:00.365 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.359+0000 7f3e9b59e700 1 -- 192.168.123.100:0/136005616 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3e9c072440 msgr2=0x7f3e9c10be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:00.365 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.359+0000 7f3e9b59e700 1 --2- 
192.168.123.100:0/136005616 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3e9c072440 0x7f3e9c10be90 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f3e90009b00 tx=0x7f3e90009e10 comp rx=0 tx=0).stop 2026-03-10T12:37:00.365 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.364+0000 7f3e9b59e700 1 -- 192.168.123.100:0/136005616 shutdown_connections 2026-03-10T12:37:00.365 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.364+0000 7f3e9b59e700 1 --2- 192.168.123.100:0/136005616 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3e9c072440 0x7f3e9c10be90 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:00.365 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.364+0000 7f3e9b59e700 1 --2- 192.168.123.100:0/136005616 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e9c071a60 0x7f3e9c071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:00.365 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.364+0000 7f3e9b59e700 1 -- 192.168.123.100:0/136005616 >> 192.168.123.100:0/136005616 conn(0x7f3e9c06d1a0 msgr2=0x7f3e9c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:00.367 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.366+0000 7f3e9b59e700 1 -- 192.168.123.100:0/136005616 shutdown_connections 2026-03-10T12:37:00.368 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.366+0000 7f3e9b59e700 1 -- 192.168.123.100:0/136005616 wait complete. 
2026-03-10T12:37:00.368 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.367+0000 7f3e9b59e700 1 Processor -- start 2026-03-10T12:37:00.368 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.367+0000 7f3e9b59e700 1 -- start start 2026-03-10T12:37:00.368 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.367+0000 7f3e9b59e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e9c071a60 0x7f3e9c116a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:00.368 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.367+0000 7f3e9b59e700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3e9c116f60 0x7f3e9c1b2830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:00.368 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.367+0000 7f3e9b59e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3e9c1173d0 con 0x7f3e9c071a60 2026-03-10T12:37:00.368 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.367+0000 7f3e9b59e700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3e9c117540 con 0x7f3e9c116f60 2026-03-10T12:37:00.371 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.367+0000 7f3e99d9b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3e9c116f60 0x7f3e9c1b2830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:00.371 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.367+0000 7f3e99d9b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3e9c116f60 0x7f3e9c1b2830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:51962/0 (socket says 192.168.123.100:51962) 2026-03-10T12:37:00.371 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.367+0000 7f3e99d9b700 1 -- 192.168.123.100:0/241069863 learned_addr learned my addr 192.168.123.100:0/241069863 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:37:00.371 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.368+0000 7f3e99d9b700 1 -- 192.168.123.100:0/241069863 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e9c071a60 msgr2=0x7f3e9c116a20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:00.371 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.368+0000 7f3e99d9b700 1 --2- 192.168.123.100:0/241069863 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e9c071a60 0x7f3e9c116a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:00.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.368+0000 7f3e99d9b700 1 -- 192.168.123.100:0/241069863 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3e900097e0 con 0x7f3e9c116f60 2026-03-10T12:37:00.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.368+0000 7f3e99d9b700 1 --2- 192.168.123.100:0/241069863 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3e9c116f60 0x7f3e9c1b2830 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f3e90000c00 tx=0x7f3e9000f830 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:00.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.368+0000 7f3e8b7fe700 1 -- 192.168.123.100:0/241069863 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3e9001c070 con 0x7f3e9c116f60 2026-03-10T12:37:00.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.368+0000 7f3e8b7fe700 1 -- 
192.168.123.100:0/241069863 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3e9000fbb0 con 0x7f3e9c116f60 2026-03-10T12:37:00.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.368+0000 7f3e8b7fe700 1 -- 192.168.123.100:0/241069863 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3e900178b0 con 0x7f3e9c116f60 2026-03-10T12:37:00.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.370+0000 7f3e9b59e700 1 -- 192.168.123.100:0/241069863 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3e9c1b2d70 con 0x7f3e9c116f60 2026-03-10T12:37:00.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.370+0000 7f3e9b59e700 1 -- 192.168.123.100:0/241069863 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3e9c1b3180 con 0x7f3e9c116f60 2026-03-10T12:37:00.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.371+0000 7f3e9b59e700 1 -- 192.168.123.100:0/241069863 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3e9c117e70 con 0x7f3e9c116f60 2026-03-10T12:37:00.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.372+0000 7f3e8b7fe700 1 -- 192.168.123.100:0/241069863 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f3e9000fd20 con 0x7f3e9c116f60 2026-03-10T12:37:00.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.372+0000 7f3e8b7fe700 1 --2- 192.168.123.100:0/241069863 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3e8406c6d0 0x7f3e8406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:00.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.372+0000 7f3e8b7fe700 1 -- 192.168.123.100:0/241069863 <== mon.1 
v2:192.168.123.107:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f3e9008d0a0 con 0x7f3e9c116f60 2026-03-10T12:37:00.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.375+0000 7f3e8b7fe700 1 -- 192.168.123.100:0/241069863 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3e900578a0 con 0x7f3e9c116f60 2026-03-10T12:37:00.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.378+0000 7f3e9a59c700 1 --2- 192.168.123.100:0/241069863 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3e8406c6d0 0x7f3e8406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:00.399 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.394+0000 7f3e9a59c700 1 --2- 192.168.123.100:0/241069863 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3e8406c6d0 0x7f3e8406eb80 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f3e9c1ae7a0 tx=0x7f3e94009250 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:00.701 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.699+0000 7f3e9b59e700 1 -- 192.168.123.100:0/241069863 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3e9c1b38a0 con 0x7f3e8406c6d0 2026-03-10T12:37:00.705 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.705+0000 7f3e8b7fe700 1 -- 192.168.123.100:0/241069863 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f3e9c1b38a0 con 0x7f3e8406c6d0 2026-03-10T12:37:00.710 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.708+0000 7f3e897fa700 1 -- 192.168.123.100:0/241069863 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3e8406c6d0 msgr2=0x7f3e8406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:00.710 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.708+0000 7f3e897fa700 1 --2- 192.168.123.100:0/241069863 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3e8406c6d0 0x7f3e8406eb80 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f3e9c1ae7a0 tx=0x7f3e94009250 comp rx=0 tx=0).stop 2026-03-10T12:37:00.710 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.708+0000 7f3e897fa700 1 -- 192.168.123.100:0/241069863 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3e9c116f60 msgr2=0x7f3e9c1b2830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:00.710 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.708+0000 7f3e897fa700 1 --2- 192.168.123.100:0/241069863 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3e9c116f60 0x7f3e9c1b2830 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f3e90000c00 tx=0x7f3e9000f830 comp rx=0 tx=0).stop 2026-03-10T12:37:00.710 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.708+0000 7f3e897fa700 1 -- 192.168.123.100:0/241069863 shutdown_connections 2026-03-10T12:37:00.710 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.708+0000 7f3e897fa700 1 --2- 192.168.123.100:0/241069863 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f3e8406c6d0 0x7f3e8406eb80 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:00.710 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.708+0000 7f3e897fa700 1 --2- 192.168.123.100:0/241069863 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3e9c071a60 0x7f3e9c116a20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:00.710 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.708+0000 7f3e897fa700 1 --2- 192.168.123.100:0/241069863 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3e9c116f60 0x7f3e9c1b2830 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:00.710 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.708+0000 7f3e897fa700 1 -- 192.168.123.100:0/241069863 >> 192.168.123.100:0/241069863 conn(0x7f3e9c06d1a0 msgr2=0x7f3e9c0705a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:00.710 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.708+0000 7f3e897fa700 1 -- 192.168.123.100:0/241069863 shutdown_connections 2026-03-10T12:37:00.710 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.708+0000 7f3e897fa700 1 -- 192.168.123.100:0/241069863 wait complete. 2026-03-10T12:37:00.723 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:37:00.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.832+0000 7f93beb0c700 1 -- 192.168.123.100:0/1880834211 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93b8072360 msgr2=0x7f93b80770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:00.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.832+0000 7f93beb0c700 1 --2- 192.168.123.100:0/1880834211 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93b8072360 0x7f93b80770e0 secure :-1 s=READY pgs=320 cs=0 l=1 rev1=1 crypto rx=0x7f93b000d3f0 tx=0x7f93b000d700 comp rx=0 tx=0).stop 2026-03-10T12:37:00.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.832+0000 7f93beb0c700 1 -- 192.168.123.100:0/1880834211 shutdown_connections 2026-03-10T12:37:00.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.832+0000 7f93beb0c700 1 --2- 192.168.123.100:0/1880834211 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93b8072360 0x7f93b80770e0 unknown :-1 s=CLOSED 
pgs=320 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:00.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.832+0000 7f93beb0c700 1 --2- 192.168.123.100:0/1880834211 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b8071980 0x7f93b8071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:00.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.832+0000 7f93beb0c700 1 -- 192.168.123.100:0/1880834211 >> 192.168.123.100:0/1880834211 conn(0x7f93b806d1a0 msgr2=0x7f93b806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:00.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.832+0000 7f93beb0c700 1 -- 192.168.123.100:0/1880834211 shutdown_connections 2026-03-10T12:37:00.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.832+0000 7f93beb0c700 1 -- 192.168.123.100:0/1880834211 wait complete. 2026-03-10T12:37:00.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.832+0000 7f93beb0c700 1 Processor -- start 2026-03-10T12:37:00.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.832+0000 7f93beb0c700 1 -- start start 2026-03-10T12:37:00.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.833+0000 7f93beb0c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93b8071980 0x7f93b8131350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:00.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.833+0000 7f93beb0c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b8131890 0x7f93b807f520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:00.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.833+0000 7f93beb0c700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7f93b8131d90 con 0x7f93b8071980 2026-03-10T12:37:00.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.833+0000 7f93beb0c700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f93b8131ed0 con 0x7f93b8131890 2026-03-10T12:37:00.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.834+0000 7f93b7fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b8131890 0x7f93b807f520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:00.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.834+0000 7f93b7fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b8131890 0x7f93b807f520 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:51980/0 (socket says 192.168.123.100:51980) 2026-03-10T12:37:00.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.834+0000 7f93b7fff700 1 -- 192.168.123.100:0/3573447754 learned_addr learned my addr 192.168.123.100:0/3573447754 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:37:00.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.834+0000 7f93bc8a8700 1 --2- 192.168.123.100:0/3573447754 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93b8071980 0x7f93b8131350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:00.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.834+0000 7f93b7fff700 1 -- 192.168.123.100:0/3573447754 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93b8071980 msgr2=0x7f93b8131350 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:00.836 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.834+0000 7f93b7fff700 1 --2- 192.168.123.100:0/3573447754 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93b8071980 0x7f93b8131350 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:00.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.834+0000 7f93b7fff700 1 -- 192.168.123.100:0/3573447754 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f93b0007ed0 con 0x7f93b8131890 2026-03-10T12:37:00.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.834+0000 7f93b7fff700 1 --2- 192.168.123.100:0/3573447754 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b8131890 0x7f93b807f520 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f93b0003c30 tx=0x7f93b0003c60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:00.837 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.835+0000 7f93b5ffb700 1 -- 192.168.123.100:0/3573447754 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f93b001c070 con 0x7f93b8131890 2026-03-10T12:37:00.837 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.835+0000 7f93beb0c700 1 -- 192.168.123.100:0/3573447754 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f93b807fa60 con 0x7f93b8131890 2026-03-10T12:37:00.837 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.835+0000 7f93beb0c700 1 -- 192.168.123.100:0/3573447754 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f93b807ff00 con 0x7f93b8131890 2026-03-10T12:37:00.837 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.835+0000 7f93b5ffb700 1 -- 192.168.123.100:0/3573447754 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 
==== 1139+0+0 (secure 0 0 0) 0x7f93b0004370 con 0x7f93b8131890 2026-03-10T12:37:00.837 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.835+0000 7f93b5ffb700 1 -- 192.168.123.100:0/3573447754 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f93b0017860 con 0x7f93b8131890 2026-03-10T12:37:00.837 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.836+0000 7f93beb0c700 1 -- 192.168.123.100:0/3573447754 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f93a4005320 con 0x7f93b8131890 2026-03-10T12:37:00.840 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.838+0000 7f93b5ffb700 1 -- 192.168.123.100:0/3573447754 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f93b00179c0 con 0x7f93b8131890 2026-03-10T12:37:00.840 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.838+0000 7f93b5ffb700 1 --2- 192.168.123.100:0/3573447754 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f93a006c680 0x7f93a006eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:00.840 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.838+0000 7f93b5ffb700 1 -- 192.168.123.100:0/3573447754 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f93b0013070 con 0x7f93b8131890 2026-03-10T12:37:00.840 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.838+0000 7f93bc8a8700 1 --2- 192.168.123.100:0/3573447754 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f93a006c680 0x7f93a006eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:00.841 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.839+0000 7f93bc8a8700 1 --2- 
192.168.123.100:0/3573447754 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f93a006c680 0x7f93a006eb30 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f93a8009c80 tx=0x7f93a8009400 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:00.852 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:00.844+0000 7f93b5ffb700 1 -- 192.168.123.100:0/3573447754 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f93b005bbd0 con 0x7f93b8131890 2026-03-10T12:37:01.060 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.059+0000 7f93beb0c700 1 -- 192.168.123.100:0/3573447754 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f93a4000bf0 con 0x7f93a006c680 2026-03-10T12:37:01.063 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.060+0000 7f93b5ffb700 1 -- 192.168.123.100:0/3573447754 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f93a4000bf0 con 0x7f93a006c680 2026-03-10T12:37:01.063 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.063+0000 7f93beb0c700 1 -- 192.168.123.100:0/3573447754 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f93a006c680 msgr2=0x7f93a006eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:01.063 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.063+0000 7f93beb0c700 1 --2- 192.168.123.100:0/3573447754 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f93a006c680 0x7f93a006eb30 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f93a8009c80 tx=0x7f93a8009400 comp rx=0 tx=0).stop 2026-03-10T12:37:01.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.063+0000 7f93beb0c700 1 -- 
192.168.123.100:0/3573447754 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b8131890 msgr2=0x7f93b807f520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:01.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.063+0000 7f93beb0c700 1 --2- 192.168.123.100:0/3573447754 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b8131890 0x7f93b807f520 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f93b0003c30 tx=0x7f93b0003c60 comp rx=0 tx=0).stop 2026-03-10T12:37:01.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.063+0000 7f93beb0c700 1 -- 192.168.123.100:0/3573447754 shutdown_connections 2026-03-10T12:37:01.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.063+0000 7f93beb0c700 1 --2- 192.168.123.100:0/3573447754 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f93a006c680 0x7f93a006eb30 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:01.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.063+0000 7f93beb0c700 1 --2- 192.168.123.100:0/3573447754 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f93b8071980 0x7f93b8131350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:01.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.063+0000 7f93beb0c700 1 --2- 192.168.123.100:0/3573447754 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f93b8131890 0x7f93b807f520 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:01.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.063+0000 7f93beb0c700 1 -- 192.168.123.100:0/3573447754 >> 192.168.123.100:0/3573447754 conn(0x7f93b806d1a0 msgr2=0x7f93b8076480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:01.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.063+0000 
7f93beb0c700 1 -- 192.168.123.100:0/3573447754 shutdown_connections 2026-03-10T12:37:01.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.064+0000 7f93beb0c700 1 -- 192.168.123.100:0/3573447754 wait complete. 2026-03-10T12:37:01.191 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.191+0000 7f7b5a93c700 1 -- 192.168.123.100:0/3794301253 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b54071a60 msgr2=0x7f7b54071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:01.192 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.191+0000 7f7b5a93c700 1 --2- 192.168.123.100:0/3794301253 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b54071a60 0x7f7b54071e70 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f7b44009a60 tx=0x7f7b44009d70 comp rx=0 tx=0).stop 2026-03-10T12:37:01.192 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.191+0000 7f7b5a93c700 1 -- 192.168.123.100:0/3794301253 shutdown_connections 2026-03-10T12:37:01.192 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.191+0000 7f7b5a93c700 1 --2- 192.168.123.100:0/3794301253 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7b54072440 0x7f7b5410be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:01.192 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.191+0000 7f7b5a93c700 1 --2- 192.168.123.100:0/3794301253 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b54071a60 0x7f7b54071e70 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:01.192 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.191+0000 7f7b5a93c700 1 -- 192.168.123.100:0/3794301253 >> 192.168.123.100:0/3794301253 conn(0x7f7b5406d1a0 msgr2=0x7f7b5406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:01.192 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.191+0000 7f7b5a93c700 1 -- 192.168.123.100:0/3794301253 shutdown_connections 2026-03-10T12:37:01.192 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.191+0000 7f7b5a93c700 1 -- 192.168.123.100:0/3794301253 wait complete. 2026-03-10T12:37:01.192 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.192+0000 7f7b5a93c700 1 Processor -- start 2026-03-10T12:37:01.193 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.192+0000 7f7b5a93c700 1 -- start start 2026-03-10T12:37:01.193 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.192+0000 7f7b5a93c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7b54071a60 0x7f7b541a4ca0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:01.193 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.192+0000 7f7b5a93c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b54072440 0x7f7b541a51e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:01.193 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.192+0000 7f7b5a93c700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7b541a57e0 con 0x7f7b54071a60 2026-03-10T12:37:01.193 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.192+0000 7f7b5a93c700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7b541a5950 con 0x7f7b54072440 2026-03-10T12:37:01.193 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.193+0000 7f7b5993a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7b54071a60 0x7f7b541a4ca0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:01.193 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.193+0000 7f7b5993a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7b54071a60 0x7f7b541a4ca0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:35350/0 (socket says 192.168.123.100:35350) 2026-03-10T12:37:01.193 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.193+0000 7f7b5993a700 1 -- 192.168.123.100:0/953490166 learned_addr learned my addr 192.168.123.100:0/953490166 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:37:01.194 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.193+0000 7f7b59139700 1 --2- 192.168.123.100:0/953490166 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b54072440 0x7f7b541a51e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:01.194 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.194+0000 7f7b5993a700 1 -- 192.168.123.100:0/953490166 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b54072440 msgr2=0x7f7b541a51e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:01.194 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.194+0000 7f7b5993a700 1 --2- 192.168.123.100:0/953490166 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b54072440 0x7f7b541a51e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:01.194 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.194+0000 7f7b5993a700 1 -- 192.168.123.100:0/953490166 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7b44009710 con 0x7f7b54071a60 2026-03-10T12:37:01.195 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.195+0000 
7f7b5993a700 1 --2- 192.168.123.100:0/953490166 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7b54071a60 0x7f7b541a4ca0 secure :-1 s=READY pgs=321 cs=0 l=1 rev1=1 crypto rx=0x7f7b440038e0 tx=0x7f7b44003a40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:01.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.445+0000 7f7b4affd700 1 -- 192.168.123.100:0/953490166 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7b4401d070 con 0x7f7b54071a60 2026-03-10T12:37:01.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.445+0000 7f7b5a93c700 1 -- 192.168.123.100:0/953490166 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7b541aa380 con 0x7f7b54071a60 2026-03-10T12:37:01.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.446+0000 7f7b5a93c700 1 -- 192.168.123.100:0/953490166 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7b541aa870 con 0x7f7b54071a60 2026-03-10T12:37:01.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.446+0000 7f7b4affd700 1 -- 192.168.123.100:0/953490166 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7b440043c0 con 0x7f7b54071a60 2026-03-10T12:37:01.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.447+0000 7f7b4affd700 1 -- 192.168.123.100:0/953490166 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7b4400f670 con 0x7f7b54071a60 2026-03-10T12:37:01.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.447+0000 7f7b5a93c700 1 -- 192.168.123.100:0/953490166 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7b38005320 con 0x7f7b54071a60 2026-03-10T12:37:01.452 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.451+0000 7f7b4affd700 1 -- 192.168.123.100:0/953490166 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f7b44004530 con 0x7f7b54071a60 2026-03-10T12:37:01.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.451+0000 7f7b4affd700 1 --2- 192.168.123.100:0/953490166 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7b4006c4d0 0x7f7b4006e980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:01.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.451+0000 7f7b59139700 1 --2- 192.168.123.100:0/953490166 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7b4006c4d0 0x7f7b4006e980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:01.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.452+0000 7f7b59139700 1 --2- 192.168.123.100:0/953490166 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7b4006c4d0 0x7f7b4006e980 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f7b541a61b0 tx=0x7f7b50009500 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:01.453 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.452+0000 7f7b4affd700 1 -- 192.168.123.100:0/953490166 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f7b4405b450 con 0x7f7b54071a60 2026-03-10T12:37:01.468 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.466+0000 7f7b4affd700 1 -- 192.168.123.100:0/953490166 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7b44057790 con 0x7f7b54071a60 2026-03-10T12:37:01.681 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.680+0000 7f7b5a93c700 1 -- 192.168.123.100:0/953490166 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f7b38000bf0 con 0x7f7b4006c4d0 2026-03-10T12:37:01.697 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:01 vm00.local ceph-mon[50686]: from='client.24417 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:37:01.698 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:01 vm00.local ceph-mon[50686]: from='client.24419 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:37:01.698 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:01 vm00.local ceph-mon[50686]: pgmap v136: 65 pgs: 65 active+clean; 75 MiB data, 654 MiB used, 119 GiB / 120 GiB avail; 5.9 MiB/s wr, 535 op/s 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (3m) 103s ago 4m 22.8M - 0.25.0 c8568f914cd2 1897443e8fdf 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (4m) 103s ago 4m 8074k - 18.2.0 dc2bc1663786 d9c35bbdf4cd 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (3m) 104s ago 3m 8208k - 18.2.0 dc2bc1663786 2a98961ae9ca 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (4m) 103s ago 4m 7407k - 18.2.0 dc2bc1663786 4726e39e7eb0 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (3m) 104s ago 3m 7402k - 18.2.0 dc2bc1663786 f917dac1f418 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (3m) 
103s ago 3m 82.0M - 9.4.7 954c08fa6188 16c12a6ce1fc 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (111s) 103s ago 111s 17.2M - 18.2.0 dc2bc1663786 13dfd2469732 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (109s) 103s ago 109s 14.2M - 18.2.0 dc2bc1663786 d65368ac6dfa 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (108s) 104s ago 108s 13.7M - 18.2.0 dc2bc1663786 1b9425223bd2 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (110s) 104s ago 110s 18.6M - 18.2.0 dc2bc1663786 9c2b36fc1ac1 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:9283,8765,8443 running (4m) 103s ago 4m 498M - 18.2.0 dc2bc1663786 8dc0a869be20 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (3m) 104s ago 3m 448M - 18.2.0 dc2bc1663786 1662ba2e507c 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (4m) 103s ago 4m 50.6M 2048M 18.2.0 dc2bc1663786 c8d836b38502 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (3m) 104s ago 3m 44.3M 2048M 18.2.0 dc2bc1663786 7712955135fc 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (4m) 103s ago 4m 14.4M - 1.5.0 0da6a335fe13 7a5c14d6ba46 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (3m) 104s ago 3m 12.2M - 1.5.0 0da6a335fe13 2fac2415b763 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (3m) 103s ago 3m 45.5M 4096M 18.2.0 dc2bc1663786 d5b05007694d 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (2m) 103s ago 2m 45.9M 4096M 18.2.0 dc2bc1663786 5bc971fe4d49 2026-03-10T12:37:01.698 
INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (2m) 103s ago 2m 46.7M 4096M 18.2.0 dc2bc1663786 a5a89ccf847e 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (2m) 104s ago 2m 44.8M 4096M 18.2.0 dc2bc1663786 0c6249fe3951 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (2m) 104s ago 2m 43.9M 4096M 18.2.0 dc2bc1663786 5c66bfb63a83 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (2m) 104s ago 2m 42.7M 4096M 18.2.0 dc2bc1663786 277bdfe4ec55 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (3m) 103s ago 3m 39.1M - 2.43.0 a07b618ecd1d 5d567c813f4b 2026-03-10T12:37:01.698 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.695+0000 7f7b4affd700 1 -- 192.168.123.100:0/953490166 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7f7b38000bf0 con 0x7f7b4006c4d0 2026-03-10T12:37:01.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.698+0000 7f7b5a93c700 1 -- 192.168.123.100:0/953490166 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7b4006c4d0 msgr2=0x7f7b4006e980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:01.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.698+0000 7f7b5a93c700 1 --2- 192.168.123.100:0/953490166 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7b4006c4d0 0x7f7b4006e980 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f7b541a61b0 tx=0x7f7b50009500 comp rx=0 tx=0).stop 2026-03-10T12:37:01.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.698+0000 7f7b5a93c700 1 -- 192.168.123.100:0/953490166 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7b54071a60 msgr2=0x7f7b541a4ca0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:01.700 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.698+0000 7f7b5a93c700 1 --2- 192.168.123.100:0/953490166 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7b54071a60 0x7f7b541a4ca0 secure :-1 s=READY pgs=321 cs=0 l=1 rev1=1 crypto rx=0x7f7b440038e0 tx=0x7f7b44003a40 comp rx=0 tx=0).stop 2026-03-10T12:37:01.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.698+0000 7f7b5a93c700 1 -- 192.168.123.100:0/953490166 shutdown_connections 2026-03-10T12:37:01.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.698+0000 7f7b5a93c700 1 --2- 192.168.123.100:0/953490166 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7b4006c4d0 0x7f7b4006e980 secure :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f7b541a61b0 tx=0x7f7b50009500 comp rx=0 tx=0).stop 2026-03-10T12:37:01.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.698+0000 7f7b5a93c700 1 --2- 192.168.123.100:0/953490166 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7b54071a60 0x7f7b541a4ca0 secure :-1 s=CLOSED pgs=321 cs=0 l=1 rev1=1 crypto rx=0x7f7b440038e0 tx=0x7f7b44003a40 comp rx=0 tx=0).stop 2026-03-10T12:37:01.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.698+0000 7f7b5a93c700 1 --2- 192.168.123.100:0/953490166 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7b54072440 0x7f7b541a51e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:01.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.698+0000 7f7b5a93c700 1 -- 192.168.123.100:0/953490166 >> 192.168.123.100:0/953490166 conn(0x7f7b5406d1a0 msgr2=0x7f7b5410a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:01.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.699+0000 7f7b5a93c700 1 -- 192.168.123.100:0/953490166 shutdown_connections 2026-03-10T12:37:01.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.699+0000 
7f7b5a93c700 1 -- 192.168.123.100:0/953490166 wait complete. 2026-03-10T12:37:01.848 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.846+0000 7f01394a1700 1 -- 192.168.123.100:0/1603868440 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0134072360 msgr2=0x7f01340770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:01.848 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.846+0000 7f01394a1700 1 --2- 192.168.123.100:0/1603868440 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0134072360 0x7f01340770e0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f012c00a390 tx=0x7f012c00a6a0 comp rx=0 tx=0).stop 2026-03-10T12:37:01.848 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.846+0000 7f01394a1700 1 -- 192.168.123.100:0/1603868440 shutdown_connections 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.846+0000 7f01394a1700 1 --2- 192.168.123.100:0/1603868440 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0134072360 0x7f01340770e0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.846+0000 7f01394a1700 1 --2- 192.168.123.100:0/1603868440 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0134071980 0x7f0134071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.846+0000 7f01394a1700 1 -- 192.168.123.100:0/1603868440 >> 192.168.123.100:0/1603868440 conn(0x7f013406d1a0 msgr2=0x7f013406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.847+0000 7f01394a1700 1 -- 192.168.123.100:0/1603868440 shutdown_connections 2026-03-10T12:37:01.849 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.847+0000 7f01394a1700 1 -- 192.168.123.100:0/1603868440 wait complete. 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.847+0000 7f01394a1700 1 Processor -- start 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.847+0000 7f01394a1700 1 -- start start 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.847+0000 7f01394a1700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0134071980 0x7f0134082530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.847+0000 7f01394a1700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0134082a70 0x7f0134082ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.847+0000 7f01394a1700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f013412dd80 con 0x7f0134071980 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.847+0000 7f01394a1700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f013412def0 con 0x7f0134082a70 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.847+0000 7f01327fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0134082a70 0x7f0134082ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.847+0000 7f01327fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0134082a70 0x7f0134082ee0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:52014/0 (socket says 192.168.123.100:52014) 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.847+0000 7f01327fc700 1 -- 192.168.123.100:0/2905682327 learned_addr learned my addr 192.168.123.100:0/2905682327 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.848+0000 7f01327fc700 1 -- 192.168.123.100:0/2905682327 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0134071980 msgr2=0x7f0134082530 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.848+0000 7f01327fc700 1 --2- 192.168.123.100:0/2905682327 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0134071980 0x7f0134082530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.848+0000 7f01327fc700 1 -- 192.168.123.100:0/2905682327 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f012c00a040 con 0x7f0134082a70 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.848+0000 7f01327fc700 1 --2- 192.168.123.100:0/2905682327 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0134082a70 0x7f0134082ee0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f012c0062a0 tx=0x7f012c00b330 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.848+0000 7f011bfff700 1 -- 192.168.123.100:0/2905682327 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f012c00a710 con 
0x7f0134082a70 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.848+0000 7f01394a1700 1 -- 192.168.123.100:0/2905682327 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f013412e110 con 0x7f0134082a70 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.848+0000 7f01394a1700 1 -- 192.168.123.100:0/2905682327 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f013412e660 con 0x7f0134082a70 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.849+0000 7f011bfff700 1 -- 192.168.123.100:0/2905682327 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f012c018070 con 0x7f0134082a70 2026-03-10T12:37:01.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.849+0000 7f011bfff700 1 -- 192.168.123.100:0/2905682327 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f012c014730 con 0x7f0134082a70 2026-03-10T12:37:01.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.850+0000 7f01394a1700 1 -- 192.168.123.100:0/2905682327 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0120005320 con 0x7f0134082a70 2026-03-10T12:37:01.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.850+0000 7f011bfff700 1 -- 192.168.123.100:0/2905682327 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f012c01a030 con 0x7f0134082a70 2026-03-10T12:37:01.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.851+0000 7f011bfff700 1 --2- 192.168.123.100:0/2905682327 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f011c06c6d0 0x7f011c06eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:01.851 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.851+0000 7f011bfff700 1 -- 192.168.123.100:0/2905682327 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f012c08bca0 con 0x7f0134082a70 2026-03-10T12:37:01.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.851+0000 7f0132ffd700 1 --2- 192.168.123.100:0/2905682327 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f011c06c6d0 0x7f011c06eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:01.854 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.852+0000 7f0132ffd700 1 --2- 192.168.123.100:0/2905682327 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f011c06c6d0 0x7f011c06eb80 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f012400b3c0 tx=0x7f012400d040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:01.854 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:01.854+0000 7f011bfff700 1 -- 192.168.123.100:0/2905682327 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f012c059eb0 con 0x7f0134082a70 2026-03-10T12:37:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:01 vm07.local ceph-mon[58582]: from='client.24417 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:37:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:01 vm07.local ceph-mon[58582]: from='client.24419 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:37:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:01 vm07.local ceph-mon[58582]: pgmap v136: 65 pgs: 65 active+clean; 75 MiB data, 654 MiB used, 119 
GiB / 120 GiB avail; 5.9 MiB/s wr, 535 op/s 2026-03-10T12:37:02.113 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.112+0000 7f01394a1700 1 -- 192.168.123.100:0/2905682327 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f0120005cc0 con 0x7f0134082a70 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.113+0000 7f011bfff700 1 -- 192.168.123.100:0/2905682327 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f012c059a40 con 0x7f0134082a70 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:37:02.114 
INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:37:02.114 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:37:02.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.117+0000 7f0119ffb700 1 -- 192.168.123.100:0/2905682327 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f011c06c6d0 msgr2=0x7f011c06eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:02.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.117+0000 7f0119ffb700 1 --2- 192.168.123.100:0/2905682327 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f011c06c6d0 0x7f011c06eb80 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f012400b3c0 tx=0x7f012400d040 comp rx=0 tx=0).stop 2026-03-10T12:37:02.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.117+0000 7f0119ffb700 1 -- 192.168.123.100:0/2905682327 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0134082a70 msgr2=0x7f0134082ee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:02.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.117+0000 7f0119ffb700 1 --2- 192.168.123.100:0/2905682327 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0134082a70 0x7f0134082ee0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f012c0062a0 tx=0x7f012c00b330 comp rx=0 tx=0).stop 2026-03-10T12:37:02.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.117+0000 7f0119ffb700 1 -- 192.168.123.100:0/2905682327 shutdown_connections 2026-03-10T12:37:02.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.117+0000 7f0119ffb700 1 --2- 192.168.123.100:0/2905682327 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f011c06c6d0 0x7f011c06eb80 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.117+0000 7f0119ffb700 1 --2- 192.168.123.100:0/2905682327 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0134071980 0x7f0134082530 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.118+0000 7f0119ffb700 1 --2- 192.168.123.100:0/2905682327 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0134082a70 0x7f0134082ee0 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.118+0000 7f0119ffb700 1 -- 192.168.123.100:0/2905682327 >> 192.168.123.100:0/2905682327 conn(0x7f013406d1a0 msgr2=0x7f01340764f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:02.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.118+0000 7f0119ffb700 1 -- 192.168.123.100:0/2905682327 shutdown_connections 2026-03-10T12:37:02.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.118+0000 7f0119ffb700 1 -- 192.168.123.100:0/2905682327 wait complete. 
2026-03-10T12:37:02.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.247+0000 7f0ea9eaa700 1 -- 192.168.123.100:0/520994529 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ea4071950 msgr2=0x7f0ea4071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:02.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.247+0000 7f0ea9eaa700 1 --2- 192.168.123.100:0/520994529 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ea4071950 0x7f0ea4071d60 secure :-1 s=READY pgs=322 cs=0 l=1 rev1=1 crypto rx=0x7f0e94007780 tx=0x7f0e9400c050 comp rx=0 tx=0).stop 2026-03-10T12:37:02.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.247+0000 7f0ea9eaa700 1 -- 192.168.123.100:0/520994529 shutdown_connections 2026-03-10T12:37:02.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.247+0000 7f0ea9eaa700 1 --2- 192.168.123.100:0/520994529 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ea4072330 0x7f0ea40770b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.247+0000 7f0ea9eaa700 1 --2- 192.168.123.100:0/520994529 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ea4071950 0x7f0ea4071d60 unknown :-1 s=CLOSED pgs=322 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.247+0000 7f0ea9eaa700 1 -- 192.168.123.100:0/520994529 >> 192.168.123.100:0/520994529 conn(0x7f0ea406d1a0 msgr2=0x7f0ea406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:02.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.247+0000 7f0ea9eaa700 1 -- 192.168.123.100:0/520994529 shutdown_connections 2026-03-10T12:37:02.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.247+0000 7f0ea9eaa700 1 -- 192.168.123.100:0/520994529 wait 
complete. 2026-03-10T12:37:02.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.247+0000 7f0ea9eaa700 1 Processor -- start 2026-03-10T12:37:02.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.247+0000 7f0ea9eaa700 1 -- start start 2026-03-10T12:37:02.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.248+0000 7f0ea9eaa700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ea4072330 0x7f0ea4131360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:02.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.248+0000 7f0ea9eaa700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ea41318a0 0x7f0ea407f520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:02.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.248+0000 7f0ea9eaa700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ea4131da0 con 0x7f0ea41318a0 2026-03-10T12:37:02.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.248+0000 7f0ea9eaa700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ea4131f10 con 0x7f0ea4072330 2026-03-10T12:37:02.250 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.249+0000 7f0ea3fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ea41318a0 0x7f0ea407f520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:02.250 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.249+0000 7f0ea3fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ea41318a0 0x7f0ea407f520 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I 
am v2:192.168.123.100:35384/0 (socket says 192.168.123.100:35384) 2026-03-10T12:37:02.250 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.249+0000 7f0ea3fff700 1 -- 192.168.123.100:0/995236185 learned_addr learned my addr 192.168.123.100:0/995236185 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:37:02.250 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.249+0000 7f0ea8ea8700 1 --2- 192.168.123.100:0/995236185 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ea4072330 0x7f0ea4131360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:02.252 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.250+0000 7f0ea3fff700 1 -- 192.168.123.100:0/995236185 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ea4072330 msgr2=0x7f0ea4131360 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:02.252 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.250+0000 7f0ea3fff700 1 --2- 192.168.123.100:0/995236185 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ea4072330 0x7f0ea4131360 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.252 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.250+0000 7f0ea3fff700 1 -- 192.168.123.100:0/995236185 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0e94007430 con 0x7f0ea41318a0 2026-03-10T12:37:02.252 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.251+0000 7f0ea3fff700 1 --2- 192.168.123.100:0/995236185 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ea41318a0 0x7f0ea407f520 secure :-1 s=READY pgs=323 cs=0 l=1 rev1=1 crypto rx=0x7f0e9c00bf40 tx=0x7f0e9c00bf70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:37:02.252 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.251+0000 7f0ea1ffb700 1 -- 192.168.123.100:0/995236185 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0e9c00cb40 con 0x7f0ea41318a0 2026-03-10T12:37:02.252 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.251+0000 7f0ea9eaa700 1 -- 192.168.123.100:0/995236185 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0ea407fac0 con 0x7f0ea41318a0 2026-03-10T12:37:02.252 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.251+0000 7f0ea9eaa700 1 -- 192.168.123.100:0/995236185 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0ea4080060 con 0x7f0ea41318a0 2026-03-10T12:37:02.252 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.252+0000 7f0ea9eaa700 1 -- 192.168.123.100:0/995236185 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0ea412b500 con 0x7f0ea41318a0 2026-03-10T12:37:02.255 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.254+0000 7f0ea1ffb700 1 -- 192.168.123.100:0/995236185 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0e9c00cca0 con 0x7f0ea41318a0 2026-03-10T12:37:02.255 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.254+0000 7f0ea1ffb700 1 -- 192.168.123.100:0/995236185 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0e9c01c6d0 con 0x7f0ea41318a0 2026-03-10T12:37:02.255 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.254+0000 7f0ea1ffb700 1 -- 192.168.123.100:0/995236185 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f0e9c01c8f0 con 0x7f0ea41318a0 2026-03-10T12:37:02.255 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.255+0000 
7f0ea1ffb700 1 --2- 192.168.123.100:0/995236185 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f0e8c06c7a0 0x7f0e8c06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:02.255 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.255+0000 7f0ea1ffb700 1 -- 192.168.123.100:0/995236185 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f0e9c08c7a0 con 0x7f0ea41318a0 2026-03-10T12:37:02.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.255+0000 7f0ea8ea8700 1 --2- 192.168.123.100:0/995236185 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f0e8c06c7a0 0x7f0e8c06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:02.259 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.257+0000 7f0ea8ea8700 1 --2- 192.168.123.100:0/995236185 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f0e8c06c7a0 0x7f0e8c06ec50 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f0e940073a0 tx=0x7f0e940072b0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:02.266 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.262+0000 7f0ea1ffb700 1 -- 192.168.123.100:0/995236185 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f0e9c05aae0 con 0x7f0ea41318a0 2026-03-10T12:37:02.478 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.477+0000 7f0ea9eaa700 1 -- 192.168.123.100:0/995236185 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f0ea404ea50 con 0x7f0ea41318a0 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:e12 2026-03-10T12:37:02.484 
INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:epoch 12 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:36:51.752695+0000 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on 
dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307} 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:failed 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:37:02.484 INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:37:02.485 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:37:02.485 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:37:02.485 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:37:02.485 INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:37:02.485 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:37:02.485 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1 2026-03-10T12:37:02.485 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:37:02.485 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:37:02.485 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:37:02.485 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:37:02.485 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons: 2026-03-10T12:37:02.485 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:37:02.485 
INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:37:02.485 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:37:02.485 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.482+0000 7f0ea1ffb700 1 -- 192.168.123.100:0/995236185 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1853 (secure 0 0 0) 0x7f0e9c05a670 con 0x7f0ea41318a0 2026-03-10T12:37:02.486 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.486+0000 7f0e8b7fe700 1 -- 192.168.123.100:0/995236185 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f0e8c06c7a0 msgr2=0x7f0e8c06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:02.486 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.486+0000 7f0e8b7fe700 1 --2- 192.168.123.100:0/995236185 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f0e8c06c7a0 0x7f0e8c06ec50 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f0e940073a0 tx=0x7f0e940072b0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.486 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.486+0000 7f0e8b7fe700 1 -- 192.168.123.100:0/995236185 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ea41318a0 msgr2=0x7f0ea407f520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:02.486 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.486+0000 7f0e8b7fe700 1 --2- 192.168.123.100:0/995236185 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ea41318a0 0x7f0ea407f520 secure :-1 s=READY pgs=323 cs=0 l=1 rev1=1 
crypto rx=0x7f0e9c00bf40 tx=0x7f0e9c00bf70 comp rx=0 tx=0).stop 2026-03-10T12:37:02.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.486+0000 7f0e8b7fe700 1 -- 192.168.123.100:0/995236185 shutdown_connections 2026-03-10T12:37:02.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.486+0000 7f0e8b7fe700 1 --2- 192.168.123.100:0/995236185 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f0e8c06c7a0 0x7f0e8c06ec50 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.486+0000 7f0e8b7fe700 1 --2- 192.168.123.100:0/995236185 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ea4072330 0x7f0ea4131360 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.486+0000 7f0e8b7fe700 1 --2- 192.168.123.100:0/995236185 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ea41318a0 0x7f0ea407f520 unknown :-1 s=CLOSED pgs=323 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.486+0000 7f0e8b7fe700 1 -- 192.168.123.100:0/995236185 >> 192.168.123.100:0/995236185 conn(0x7f0ea406d1a0 msgr2=0x7f0ea40764d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:02.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.486+0000 7f0e8b7fe700 1 -- 192.168.123.100:0/995236185 shutdown_connections 2026-03-10T12:37:02.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.486+0000 7f0e8b7fe700 1 -- 192.168.123.100:0/995236185 wait complete. 
2026-03-10T12:37:02.488 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 12 2026-03-10T12:37:02.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.615+0000 7f7dfa726700 1 -- 192.168.123.100:0/541882421 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7df4072330 msgr2=0x7f7df40770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:02.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.615+0000 7f7dfa726700 1 --2- 192.168.123.100:0/541882421 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7df4072330 0x7f7df40770b0 secure :-1 s=READY pgs=324 cs=0 l=1 rev1=1 crypto rx=0x7f7df000bc70 tx=0x7f7df000bf80 comp rx=0 tx=0).stop 2026-03-10T12:37:02.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.615+0000 7f7dfa726700 1 -- 192.168.123.100:0/541882421 shutdown_connections 2026-03-10T12:37:02.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.615+0000 7f7dfa726700 1 --2- 192.168.123.100:0/541882421 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7df4072330 0x7f7df40770b0 unknown :-1 s=CLOSED pgs=324 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.615+0000 7f7dfa726700 1 --2- 192.168.123.100:0/541882421 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7df4071950 0x7f7df4071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.615+0000 7f7dfa726700 1 -- 192.168.123.100:0/541882421 >> 192.168.123.100:0/541882421 conn(0x7f7df406d1a0 msgr2=0x7f7df406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:02.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.615+0000 7f7dfa726700 1 -- 192.168.123.100:0/541882421 shutdown_connections 2026-03-10T12:37:02.616 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.615+0000 7f7dfa726700 1 -- 192.168.123.100:0/541882421 wait complete. 2026-03-10T12:37:02.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.616+0000 7f7dfa726700 1 Processor -- start 2026-03-10T12:37:02.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.616+0000 7f7dfa726700 1 -- start start 2026-03-10T12:37:02.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.616+0000 7f7dfa726700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7df4071950 0x7f7df40824c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:02.617 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.616+0000 7f7dfa726700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7df4082a00 0x7f7df4082e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:02.617 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.616+0000 7f7dfa726700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7df412dd80 con 0x7f7df4071950 2026-03-10T12:37:02.617 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.616+0000 7f7dfa726700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7df412def0 con 0x7f7df4082a00 2026-03-10T12:37:02.617 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.617+0000 7f7df8f23700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7df4082a00 0x7f7df4082e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:02.617 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.617+0000 7f7df8f23700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7df4082a00 0x7f7df4082e70 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:52050/0 (socket says 192.168.123.100:52050) 2026-03-10T12:37:02.617 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.617+0000 7f7df8f23700 1 -- 192.168.123.100:0/1718671957 learned_addr learned my addr 192.168.123.100:0/1718671957 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:37:02.617 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.617+0000 7f7df8f23700 1 -- 192.168.123.100:0/1718671957 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7df4071950 msgr2=0x7f7df40824c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:02.617 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.617+0000 7f7df8f23700 1 --2- 192.168.123.100:0/1718671957 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7df4071950 0x7f7df40824c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.617 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.617+0000 7f7df8f23700 1 -- 192.168.123.100:0/1718671957 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7df000b920 con 0x7f7df4082a00 2026-03-10T12:37:02.618 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.617+0000 7f7df8f23700 1 --2- 192.168.123.100:0/1718671957 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7df4082a00 0x7f7df4082e70 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f7df0005950 tx=0x7f7df0004030 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:02.692 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.689+0000 7f7dea7fc700 1 -- 192.168.123.100:0/1718671957 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7df0010030 con 
0x7f7df4082a00 2026-03-10T12:37:02.692 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.689+0000 7f7dea7fc700 1 -- 192.168.123.100:0/1718671957 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7df001d940 con 0x7f7df4082a00 2026-03-10T12:37:02.692 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.689+0000 7f7dea7fc700 1 -- 192.168.123.100:0/1718671957 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7df0014a30 con 0x7f7df4082a00 2026-03-10T12:37:02.692 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.689+0000 7f7dfa726700 1 -- 192.168.123.100:0/1718671957 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7df412e170 con 0x7f7df4082a00 2026-03-10T12:37:02.692 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.689+0000 7f7dfa726700 1 -- 192.168.123.100:0/1718671957 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7df412e660 con 0x7f7df4082a00 2026-03-10T12:37:02.693 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.690+0000 7f7ddffff700 1 -- 192.168.123.100:0/1718671957 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7dd40052f0 con 0x7f7df4082a00 2026-03-10T12:37:02.693 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.692+0000 7f7dea7fc700 1 -- 192.168.123.100:0/1718671957 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f7df0014b90 con 0x7f7df4082a00 2026-03-10T12:37:02.693 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.692+0000 7f7dea7fc700 1 --2- 192.168.123.100:0/1718671957 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7de006c680 0x7f7de006eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:02.693 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.692+0000 7f7df9724700 1 --2- 192.168.123.100:0/1718671957 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7de006c680 0x7f7de006eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:02.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.695+0000 7f7dea7fc700 1 -- 192.168.123.100:0/1718671957 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f7df0014e40 con 0x7f7df4082a00 2026-03-10T12:37:02.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.695+0000 7f7df9724700 1 --2- 192.168.123.100:0/1718671957 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7de006c680 0x7f7de006eb30 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f7dec00bfd0 tx=0x7f7dec006210 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:02.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.709+0000 7f7dea7fc700 1 -- 192.168.123.100:0/1718671957 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f7df005b780 con 0x7f7df4082a00 2026-03-10T12:37:02.744 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:02 vm00.local ceph-mon[50686]: from='client.14634 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:37:02.744 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:02 vm00.local ceph-mon[50686]: from='client.? 192.168.123.100:0/2905682327' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:37:02.744 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:02 vm00.local ceph-mon[50686]: from='client.? 
192.168.123.100:0/995236185' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:37:02.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.946+0000 7f7ddffff700 1 -- 192.168.123.100:0/1718671957 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7dd4000bc0 con 0x7f7de006c680 2026-03-10T12:37:02.955 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:37:02.955 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-10T12:37:02.955 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true, 2026-03-10T12:37:02.955 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T12:37:02.955 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [], 2026-03-10T12:37:02.955 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "", 2026-03-10T12:37:02.955 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-10T12:37:02.955 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false 2026-03-10T12:37:02.955 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:37:02.955 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.951+0000 7f7dea7fc700 1 -- 192.168.123.100:0/1718671957 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f7dd4000bc0 con 0x7f7de006c680 2026-03-10T12:37:02.955 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.955+0000 7f7dfa726700 1 -- 192.168.123.100:0/1718671957 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7de006c680 msgr2=0x7f7de006eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:02.955 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.955+0000 7f7dfa726700 1 --2- 192.168.123.100:0/1718671957 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7de006c680 0x7f7de006eb30 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f7dec00bfd0 tx=0x7f7dec006210 comp rx=0 tx=0).stop 2026-03-10T12:37:02.956 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.955+0000 7f7dfa726700 1 -- 192.168.123.100:0/1718671957 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7df4082a00 msgr2=0x7f7df4082e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:02.956 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.955+0000 7f7dfa726700 1 --2- 192.168.123.100:0/1718671957 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7df4082a00 0x7f7df4082e70 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f7df0005950 tx=0x7f7df0004030 comp rx=0 tx=0).stop 2026-03-10T12:37:02.956 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.956+0000 7f7dfa726700 1 -- 192.168.123.100:0/1718671957 shutdown_connections 2026-03-10T12:37:02.956 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.956+0000 7f7dfa726700 1 --2- 192.168.123.100:0/1718671957 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f7de006c680 0x7f7de006eb30 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.956 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.956+0000 7f7dfa726700 1 --2- 192.168.123.100:0/1718671957 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7df4071950 0x7f7df40824c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.956 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.956+0000 7f7dfa726700 1 --2- 192.168.123.100:0/1718671957 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7df4082a00 0x7f7df4082e70 
unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:02.956 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.956+0000 7f7dfa726700 1 -- 192.168.123.100:0/1718671957 >> 192.168.123.100:0/1718671957 conn(0x7f7df406d1a0 msgr2=0x7f7df40764d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:02.957 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.956+0000 7f7dfa726700 1 -- 192.168.123.100:0/1718671957 shutdown_connections 2026-03-10T12:37:02.957 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:02.956+0000 7f7dfa726700 1 -- 192.168.123.100:0/1718671957 wait complete. 2026-03-10T12:37:03.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:02 vm07.local ceph-mon[58582]: from='client.14634 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:37:03.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:02 vm07.local ceph-mon[58582]: from='client.? 192.168.123.100:0/2905682327' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:37:03.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:02 vm07.local ceph-mon[58582]: from='client.? 
192.168.123.100:0/995236185' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.097+0000 7fb21f5f7700 1 -- 192.168.123.100:0/197610988 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb218072360 msgr2=0x7fb2180770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.097+0000 7fb21f5f7700 1 --2- 192.168.123.100:0/197610988 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb218072360 0x7fb2180770e0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fb210009230 tx=0x7fb210009260 comp rx=0 tx=0).stop 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.097+0000 7fb21f5f7700 1 -- 192.168.123.100:0/197610988 shutdown_connections 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.097+0000 7fb21f5f7700 1 --2- 192.168.123.100:0/197610988 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb218072360 0x7fb2180770e0 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.097+0000 7fb21f5f7700 1 --2- 192.168.123.100:0/197610988 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb218071980 0x7fb218071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.097+0000 7fb21f5f7700 1 -- 192.168.123.100:0/197610988 >> 192.168.123.100:0/197610988 conn(0x7fb21806d1a0 msgr2=0x7fb21806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.097+0000 7fb21f5f7700 1 -- 192.168.123.100:0/197610988 shutdown_connections 2026-03-10T12:37:03.098 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.097+0000 7fb21f5f7700 1 -- 192.168.123.100:0/197610988 wait complete. 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.097+0000 7fb21f5f7700 1 Processor -- start 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.097+0000 7fb21f5f7700 1 -- start start 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.097+0000 7fb21f5f7700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb218071980 0x7fb2180824d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.097+0000 7fb21f5f7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb218082a10 0x7fb218082e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.097+0000 7fb21f5f7700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb2181b2a90 con 0x7fb218082a10 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.097+0000 7fb21f5f7700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb2181b2bd0 con 0x7fb218071980 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.098+0000 7fb21d393700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb218071980 0x7fb2180824d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.098+0000 7fb21d393700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb218071980 0x7fb2180824d0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:52060/0 (socket says 192.168.123.100:52060) 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.098+0000 7fb21d393700 1 -- 192.168.123.100:0/1553461573 learned_addr learned my addr 192.168.123.100:0/1553461573 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:37:03.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.098+0000 7fb21cb92700 1 --2- 192.168.123.100:0/1553461573 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb218082a10 0x7fb218082e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:03.099 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.099+0000 7fb21d393700 1 -- 192.168.123.100:0/1553461573 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb218082a10 msgr2=0x7fb218082e80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:03.099 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.099+0000 7fb21d393700 1 --2- 192.168.123.100:0/1553461573 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb218082a10 0x7fb218082e80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:03.099 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.099+0000 7fb21d393700 1 -- 192.168.123.100:0/1553461573 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb210008ee0 con 0x7fb218071980 2026-03-10T12:37:03.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.100+0000 7fb21d393700 1 --2- 192.168.123.100:0/1553461573 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb218071980 0x7fb2180824d0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto 
rx=0x7fb214009e50 tx=0x7fb21400f3b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:03.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.100+0000 7fb20e7fc700 1 -- 192.168.123.100:0/1553461573 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb21400b260 con 0x7fb218071980 2026-03-10T12:37:03.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.100+0000 7fb21f5f7700 1 -- 192.168.123.100:0/1553461573 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb2181b2d10 con 0x7fb218071980 2026-03-10T12:37:03.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.100+0000 7fb21f5f7700 1 -- 192.168.123.100:0/1553461573 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb2181b31c0 con 0x7fb218071980 2026-03-10T12:37:03.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.101+0000 7fb20e7fc700 1 -- 192.168.123.100:0/1553461573 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb214011040 con 0x7fb218071980 2026-03-10T12:37:03.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.101+0000 7fb20e7fc700 1 -- 192.168.123.100:0/1553461573 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb2140153f0 con 0x7fb218071980 2026-03-10T12:37:03.102 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.101+0000 7fb21f5f7700 1 -- 192.168.123.100:0/1553461573 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb1fc005320 con 0x7fb218071980 2026-03-10T12:37:03.104 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.103+0000 7fb20e7fc700 1 -- 192.168.123.100:0/1553461573 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 
0x7fb21400fa80 con 0x7fb218071980 2026-03-10T12:37:03.104 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.103+0000 7fb20e7fc700 1 --2- 192.168.123.100:0/1553461573 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb20406c6d0 0x7fb20406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:03.104 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.103+0000 7fb21cb92700 1 --2- 192.168.123.100:0/1553461573 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb20406c6d0 0x7fb20406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:03.104 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.103+0000 7fb20e7fc700 1 -- 192.168.123.100:0/1553461573 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb21408d640 con 0x7fb218071980 2026-03-10T12:37:03.105 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.105+0000 7fb21cb92700 1 --2- 192.168.123.100:0/1553461573 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb20406c6d0 0x7fb20406eb80 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7fb21000d010 tx=0x7fb21000c9d0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:03.111 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.106+0000 7fb20e7fc700 1 -- 192.168.123.100:0/1553461573 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb21405b8d0 con 0x7fb218071980 2026-03-10T12:37:03.377 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.376+0000 7fb21f5f7700 1 -- 192.168.123.100:0/1553461573 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 
-- 0x7fb1fc005190 con 0x7fb218071980 2026-03-10T12:37:03.380 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_OK 2026-03-10T12:37:03.380 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.377+0000 7fb20e7fc700 1 -- 192.168.123.100:0/1553461573 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7fb21405b460 con 0x7fb218071980 2026-03-10T12:37:03.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.380+0000 7fb203fff700 1 -- 192.168.123.100:0/1553461573 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb20406c6d0 msgr2=0x7fb20406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:03.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.380+0000 7fb203fff700 1 --2- 192.168.123.100:0/1553461573 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb20406c6d0 0x7fb20406eb80 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7fb21000d010 tx=0x7fb21000c9d0 comp rx=0 tx=0).stop 2026-03-10T12:37:03.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.380+0000 7fb203fff700 1 -- 192.168.123.100:0/1553461573 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb218071980 msgr2=0x7fb2180824d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:03.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.380+0000 7fb203fff700 1 --2- 192.168.123.100:0/1553461573 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb218071980 0x7fb2180824d0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fb214009e50 tx=0x7fb21400f3b0 comp rx=0 tx=0).stop 2026-03-10T12:37:03.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.380+0000 7fb203fff700 1 -- 192.168.123.100:0/1553461573 shutdown_connections 2026-03-10T12:37:03.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.380+0000 7fb203fff700 1 --2- 
192.168.123.100:0/1553461573 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb20406c6d0 0x7fb20406eb80 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:03.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.380+0000 7fb203fff700 1 --2- 192.168.123.100:0/1553461573 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb218071980 0x7fb2180824d0 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:03.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.380+0000 7fb203fff700 1 --2- 192.168.123.100:0/1553461573 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb218082a10 0x7fb218082e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:03.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.380+0000 7fb203fff700 1 -- 192.168.123.100:0/1553461573 >> 192.168.123.100:0/1553461573 conn(0x7fb21806d1a0 msgr2=0x7fb218076450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:03.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.380+0000 7fb203fff700 1 -- 192.168.123.100:0/1553461573 shutdown_connections 2026-03-10T12:37:03.381 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:03.380+0000 7fb203fff700 1 -- 192.168.123.100:0/1553461573 wait complete. 2026-03-10T12:37:04.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:04 vm00.local ceph-mon[50686]: from='client.24433 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:37:04.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:04 vm00.local ceph-mon[50686]: pgmap v137: 65 pgs: 65 active+clean; 75 MiB data, 668 MiB used, 119 GiB / 120 GiB avail; 3.7 MiB/s wr, 394 op/s 2026-03-10T12:37:04.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:04 vm00.local ceph-mon[50686]: from='client.? 
192.168.123.100:0/1553461573' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:37:05.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:04 vm07.local ceph-mon[58582]: from='client.24433 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:37:05.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:04 vm07.local ceph-mon[58582]: pgmap v137: 65 pgs: 65 active+clean; 75 MiB data, 668 MiB used, 119 GiB / 120 GiB avail; 3.7 MiB/s wr, 394 op/s 2026-03-10T12:37:05.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:04 vm07.local ceph-mon[58582]: from='client.? 192.168.123.100:0/1553461573' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:37:06.415 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:06 vm00.local ceph-mon[50686]: pgmap v138: 65 pgs: 65 active+clean; 88 MiB data, 727 MiB used, 119 GiB / 120 GiB avail; 4.8 MiB/s wr, 499 op/s 2026-03-10T12:37:06.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:06 vm07.local ceph-mon[58582]: pgmap v138: 65 pgs: 65 active+clean; 88 MiB data, 727 MiB used, 119 GiB / 120 GiB avail; 4.8 MiB/s wr, 499 op/s 2026-03-10T12:37:08.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:08 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:37:08.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:08 vm00.local ceph-mon[50686]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown) 2026-03-10T12:37:08.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:08 vm00.local ceph-mon[50686]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-10T12:37:08.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 
10 12:37:08 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:37:08.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:08 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:37:08.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:08 vm00.local ceph-mon[50686]: Upgrade: Need to upgrade myself (mgr.vm00.nescmq)
2026-03-10T12:37:08.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:08 vm00.local ceph-mon[50686]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm07
2026-03-10T12:37:08.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:08 vm00.local ceph-mon[50686]: pgmap v139: 65 pgs: 65 active+clean; 92 MiB data, 790 MiB used, 119 GiB / 120 GiB avail; 3.7 MiB/s wr, 426 op/s
2026-03-10T12:37:08.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:08 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:37:08.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:08 vm07.local ceph-mon[58582]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown)
2026-03-10T12:37:08.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:08 vm07.local ceph-mon[58582]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc']
2026-03-10T12:37:08.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:08 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:37:08.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:08 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:37:08.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:08 vm07.local ceph-mon[58582]: Upgrade: Need to upgrade myself (mgr.vm00.nescmq)
2026-03-10T12:37:08.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:08 vm07.local ceph-mon[58582]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm07
2026-03-10T12:37:08.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:08 vm07.local ceph-mon[58582]: pgmap v139: 65 pgs: 65 active+clean; 92 MiB data, 790 MiB used, 119 GiB / 120 GiB avail; 3.7 MiB/s wr, 426 op/s
2026-03-10T12:37:09.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:09 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:37:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:09 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:37:10.736 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:10 vm00.local ceph-mon[50686]: pgmap v140: 65 pgs: 65 active+clean; 98 MiB data, 847 MiB used, 119 GiB / 120 GiB avail; 3.9 MiB/s wr, 410 op/s
2026-03-10T12:37:10.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:10 vm07.local ceph-mon[58582]: pgmap v140: 65 pgs: 65 active+clean; 98 MiB data, 847 MiB used, 119 GiB / 120 GiB avail; 3.9 MiB/s wr, 410 op/s
2026-03-10T12:37:11.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:11 vm00.local ceph-mon[50686]: pgmap v141: 65 pgs: 65 active+clean; 124 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 5.6 MiB/s wr, 586 op/s
2026-03-10T12:37:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:11 vm07.local ceph-mon[58582]: pgmap v141: 65 pgs: 65 active+clean; 124 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 5.6 MiB/s wr, 586 op/s
2026-03-10T12:37:14.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:14 vm07.local ceph-mon[58582]: pgmap v142: 65 pgs: 65 active+clean; 125 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 4.5 MiB/s wr, 498 op/s
2026-03-10T12:37:14.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:14 vm00.local ceph-mon[50686]: pgmap v142: 65 pgs: 65 active+clean; 125 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 4.5 MiB/s wr, 498 op/s
2026-03-10T12:37:16.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:16 vm00.local ceph-mon[50686]: pgmap v143: 65 pgs: 65 active+clean; 142 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 5.9 MiB/s wr, 617 op/s
2026-03-10T12:37:16.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:16 vm07.local ceph-mon[58582]: pgmap v143: 65 pgs: 65 active+clean; 142 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 5.9 MiB/s wr, 617 op/s
2026-03-10T12:37:19.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:18 vm00.local ceph-mon[50686]: pgmap v144: 65 pgs: 65 active+clean; 148 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 5.3 MiB/s wr, 561 op/s
2026-03-10T12:37:19.318 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:18 vm07.local ceph-mon[58582]: pgmap v144: 65 pgs: 65 active+clean; 148 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 5.3 MiB/s wr, 561 op/s
2026-03-10T12:37:20.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:19 vm07.local ceph-mon[58582]: pgmap v145: 65 pgs: 65 active+clean; 148 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 5.0 MiB/s wr, 490 op/s
2026-03-10T12:37:20.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:19 vm00.local ceph-mon[50686]: pgmap v145: 65 pgs: 65 active+clean; 148 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 5.0 MiB/s wr, 490 op/s
2026-03-10T12:37:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:22 vm00.local ceph-mon[50686]: pgmap v146: 65 pgs: 65 active+clean; 171 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 6.5 MiB/s wr, 637 op/s
2026-03-10T12:37:23.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:22 vm07.local ceph-mon[58582]: pgmap v146: 65 pgs: 65 active+clean; 171 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 6.5 MiB/s wr, 637 op/s
2026-03-10T12:37:23.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:23 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:37:23.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:23 vm00.local ceph-mon[50686]: pgmap v147: 65 pgs: 65 active+clean; 175 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 4.5 MiB/s wr, 444 op/s
2026-03-10T12:37:24.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:23 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:37:24.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:23 vm07.local ceph-mon[58582]: pgmap v147: 65 pgs: 65 active+clean; 175 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 4.5 MiB/s wr, 444 op/s
2026-03-10T12:37:24.931 INFO:tasks.workunit.client.1.vm07.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress
2026-03-10T12:37:24.935 INFO:tasks.workunit.client.1.vm07.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp
2026-03-10T12:37:24.947 INFO:tasks.workunit.client.1.vm07.stderr:+ make
2026-03-10T12:37:25.431 INFO:tasks.workunit.client.1.vm07.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress
2026-03-10T12:37:25.962 INFO:tasks.workunit.client.1.vm07.stderr:++ readlink -f fsstress
2026-03-10T12:37:25.964 INFO:tasks.workunit.client.1.vm07.stderr:+ BIN=/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress
2026-03-10T12:37:25.964 INFO:tasks.workunit.client.1.vm07.stderr:+ popd
2026-03-10T12:37:25.965 INFO:tasks.workunit.client.1.vm07.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp
2026-03-10T12:37:25.965 INFO:tasks.workunit.client.1.vm07.stderr:+ popd
2026-03-10T12:37:25.967 INFO:tasks.workunit.client.1.vm07.stdout:~/cephtest/mnt.1/client.1/tmp
2026-03-10T12:37:25.967 INFO:tasks.workunit.client.1.vm07.stderr:++ mktemp -d -p .
2026-03-10T12:37:25.970 INFO:tasks.workunit.client.1.vm07.stderr:+ T=./tmp.40gfFcEho1
2026-03-10T12:37:25.970 INFO:tasks.workunit.client.1.vm07.stderr:+ /home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.40gfFcEho1 -l 1 -n 1000 -p 10 -v
2026-03-10T12:37:25.973 INFO:tasks.workunit.client.1.vm07.stdout:seed = 1774111928
2026-03-10T12:37:25.976 INFO:tasks.workunit.client.1.vm07.stdout:0/0: fsync - no filename
2026-03-10T12:37:25.977 INFO:tasks.workunit.client.1.vm07.stdout:0/1: getdents . 0
2026-03-10T12:37:25.978 INFO:tasks.workunit.client.1.vm07.stdout:0/2: mkdir d0 0
2026-03-10T12:37:25.979 INFO:tasks.workunit.client.1.vm07.stdout:0/3: dwrite - no filename
2026-03-10T12:37:25.979 INFO:tasks.workunit.client.1.vm07.stdout:0/4: rename d0 to d0/d1 22
2026-03-10T12:37:25.979 INFO:tasks.workunit.client.1.vm07.stdout:0/5: dread - no filename
2026-03-10T12:37:25.980 INFO:tasks.workunit.client.1.vm07.stdout:0/6: getdents d0 0
2026-03-10T12:37:25.980 INFO:tasks.workunit.client.1.vm07.stdout:0/7: write - no filename
2026-03-10T12:37:25.980 INFO:tasks.workunit.client.1.vm07.stdout:0/8: write - no filename
2026-03-10T12:37:25.980 INFO:tasks.workunit.client.1.vm07.stdout:0/9: write - no filename
2026-03-10T12:37:25.981 INFO:tasks.workunit.client.1.vm07.stdout:6/0: write - no filename
2026-03-10T12:37:25.981 INFO:tasks.workunit.client.1.vm07.stdout:6/1: dread - no filename
2026-03-10T12:37:25.982 INFO:tasks.workunit.client.1.vm07.stdout:0/10: getdents d0 0
2026-03-10T12:37:25.983 INFO:tasks.workunit.client.1.vm07.stdout:0/11: chown d0 202934760 1
2026-03-10T12:37:25.983 INFO:tasks.workunit.client.1.vm07.stdout:0/12: dwrite - no filename
2026-03-10T12:37:25.983 INFO:tasks.workunit.client.1.vm07.stdout:0/13: link - no file
2026-03-10T12:37:25.983 INFO:tasks.workunit.client.1.vm07.stdout:5/0: mkdir d0 0
2026-03-10T12:37:25.983 INFO:tasks.workunit.client.1.vm07.stdout:0/14: stat d0 0
2026-03-10T12:37:25.983 INFO:tasks.workunit.client.1.vm07.stdout:0/15: write - no filename
2026-03-10T12:37:25.984 INFO:tasks.workunit.client.1.vm07.stdout:6/2: creat f0 x:0 0 0
2026-03-10T12:37:25.984 INFO:tasks.workunit.client.1.vm07.stdout:5/1: fdatasync - no filename
2026-03-10T12:37:25.984 INFO:tasks.workunit.client.1.vm07.stdout:5/2: dread - no filename
2026-03-10T12:37:25.984 INFO:tasks.workunit.client.1.vm07.stdout:5/3: write - no filename
2026-03-10T12:37:25.988 INFO:tasks.workunit.client.1.vm07.stdout:0/16: mknod d0/c2 0
2026-03-10T12:37:25.990 INFO:tasks.workunit.client.1.vm07.stdout:1/0: write - no filename
2026-03-10T12:37:25.990 INFO:tasks.workunit.client.1.vm07.stdout:1/1: write - no filename
2026-03-10T12:37:25.990 INFO:tasks.workunit.client.1.vm07.stdout:1/2: rename - no filename
2026-03-10T12:37:25.991 INFO:tasks.workunit.client.1.vm07.stdout:0/17: mknod d0/c3 0
2026-03-10T12:37:25.991 INFO:tasks.workunit.client.1.vm07.stdout:0/18: stat d0/c3 0
2026-03-10T12:37:25.992 INFO:tasks.workunit.client.1.vm07.stdout:5/4: creat d0/f1 x:0 0 0
2026-03-10T12:37:26.010 INFO:tasks.workunit.client.1.vm07.stdout:1/3: symlink l0 0
2026-03-10T12:37:26.010 INFO:tasks.workunit.client.1.vm07.stdout:1/4: fdatasync - no filename
2026-03-10T12:37:26.010 INFO:tasks.workunit.client.1.vm07.stdout:1/5: read - no filename
2026-03-10T12:37:26.010 INFO:tasks.workunit.client.1.vm07.stdout:1/6: read - no filename
2026-03-10T12:37:26.010 INFO:tasks.workunit.client.1.vm07.stdout:1/7: dwrite - no filename
2026-03-10T12:37:26.010 INFO:tasks.workunit.client.1.vm07.stdout:1/8: chown l0 27915 1
2026-03-10T12:37:26.010 INFO:tasks.workunit.client.1.vm07.stdout:1/9: truncate - no filename
2026-03-10T12:37:26.010 INFO:tasks.workunit.client.1.vm07.stdout:1/10: write - no filename
2026-03-10T12:37:26.018 INFO:tasks.workunit.client.1.vm07.stdout:3/0: creat f0 x:0 0 0
2026-03-10T12:37:26.018 INFO:tasks.workunit.client.1.vm07.stdout:3/1: fdatasync f0 0
2026-03-10T12:37:26.020 INFO:tasks.workunit.client.1.vm07.stdout:1/11: rename l0 to l1 0
2026-03-10T12:37:26.020 INFO:tasks.workunit.client.1.vm07.stdout:1/12: dwrite - no filename
2026-03-10T12:37:26.022 INFO:tasks.workunit.client.1.vm07.stdout:2/0: mkdir d0 0
2026-03-10T12:37:26.022 INFO:tasks.workunit.client.1.vm07.stdout:2/1: link - no file
2026-03-10T12:37:26.028 INFO:tasks.workunit.client.1.vm07.stdout:3/2: creat f1 x:0 0 0
2026-03-10T12:37:26.030 INFO:tasks.workunit.client.1.vm07.stdout:1/13: symlink l2 0
2026-03-10T12:37:26.030 INFO:tasks.workunit.client.1.vm07.stdout:1/14: stat l2 0
2026-03-10T12:37:26.038 INFO:tasks.workunit.client.1.vm07.stdout:4/0: dwrite - no filename
2026-03-10T12:37:26.042 INFO:tasks.workunit.client.1.vm07.stdout:7/0: mkdir d0 0
2026-03-10T12:37:26.042 INFO:tasks.workunit.client.1.vm07.stdout:7/1: fdatasync - no filename
2026-03-10T12:37:26.046 INFO:tasks.workunit.client.1.vm07.stdout:7/2: chown d0 75 1
2026-03-10T12:37:26.046 INFO:tasks.workunit.client.1.vm07.stdout:7/3: dread - no filename
2026-03-10T12:37:26.046 INFO:tasks.workunit.client.1.vm07.stdout:7/4: fsync - no filename
2026-03-10T12:37:26.046 INFO:tasks.workunit.client.1.vm07.stdout:7/5: write - no filename
2026-03-10T12:37:26.052 INFO:tasks.workunit.client.1.vm07.stdout:3/3: creat f2 x:0 0 0
2026-03-10T12:37:26.060 INFO:tasks.workunit.client.1.vm07.stdout:1/15: mknod c3 0
2026-03-10T12:37:26.060 INFO:tasks.workunit.client.1.vm07.stdout:2/2: creat d0/f1 x:0 0 0
2026-03-10T12:37:26.062 INFO:tasks.workunit.client.1.vm07.stdout:9/0: read - no filename
2026-03-10T12:37:26.063 INFO:tasks.workunit.client.1.vm07.stdout:8/0: mknod c0 0
2026-03-10T12:37:26.065 INFO:tasks.workunit.client.1.vm07.stdout:3/4: mknod c3 0
2026-03-10T12:37:26.065 INFO:tasks.workunit.client.1.vm07.stdout:3/5: write f2 [30840,120174] 0
2026-03-10T12:37:26.066 INFO:tasks.workunit.client.1.vm07.stdout:3/6: dread - f0 zero size
2026-03-10T12:37:26.066 INFO:tasks.workunit.client.1.vm07.stdout:3/7: chown f2 1 1
2026-03-10T12:37:26.067 INFO:tasks.workunit.client.1.vm07.stdout:3/8: write f2 [1014641,37137] 0
2026-03-10T12:37:26.070 INFO:tasks.workunit.client.1.vm07.stdout:1/16: mknod c4 0
2026-03-10T12:37:26.070 INFO:tasks.workunit.client.1.vm07.stdout:1/17: dread - no filename
2026-03-10T12:37:26.071 INFO:tasks.workunit.client.1.vm07.stdout:2/3: mknod d0/c2 0
2026-03-10T12:37:26.071 INFO:tasks.workunit.client.1.vm07.stdout:4/1: getdents . 0
2026-03-10T12:37:26.073 INFO:tasks.workunit.client.1.vm07.stdout:8/1: mkdir d1 0
2026-03-10T12:37:26.075 INFO:tasks.workunit.client.1.vm07.stdout:3/9: unlink c3 0
2026-03-10T12:37:26.079 INFO:tasks.workunit.client.1.vm07.stdout:9/1: symlink l0 0
2026-03-10T12:37:26.079 INFO:tasks.workunit.client.1.vm07.stdout:9/2: write - no filename
2026-03-10T12:37:26.079 INFO:tasks.workunit.client.1.vm07.stdout:9/3: dwrite - no filename
2026-03-10T12:37:26.079 INFO:tasks.workunit.client.1.vm07.stdout:9/4: chown l0 3 1
2026-03-10T12:37:26.079 INFO:tasks.workunit.client.1.vm07.stdout:9/5: truncate - no filename
2026-03-10T12:37:26.079 INFO:tasks.workunit.client.1.vm07.stdout:9/6: dwrite - no filename
2026-03-10T12:37:26.079 INFO:tasks.workunit.client.1.vm07.stdout:9/7: readlink l0 0
2026-03-10T12:37:26.079 INFO:tasks.workunit.client.1.vm07.stdout:9/8: chown l0 3814493 1
2026-03-10T12:37:26.079 INFO:tasks.workunit.client.1.vm07.stdout:9/9: fsync - no filename
2026-03-10T12:37:26.079 INFO:tasks.workunit.client.1.vm07.stdout:9/10: dwrite - no filename
2026-03-10T12:37:26.079 INFO:tasks.workunit.client.1.vm07.stdout:9/11: write - no filename
2026-03-10T12:37:26.081 INFO:tasks.workunit.client.1.vm07.stdout:3/10: creat f4 x:0 0 0
2026-03-10T12:37:26.084 INFO:tasks.workunit.client.1.vm07.stdout:4/2: mkdir d0 0
2026-03-10T12:37:26.084 INFO:tasks.workunit.client.1.vm07.stdout:4/3: dwrite - no filename
2026-03-10T12:37:26.089 INFO:tasks.workunit.client.1.vm07.stdout:2/4: link d0/f1 d0/f3 0
2026-03-10T12:37:26.089 INFO:tasks.workunit.client.1.vm07.stdout:2/5: readlink - no filename
2026-03-10T12:37:26.091 INFO:tasks.workunit.client.1.vm07.stdout:8/2: creat d1/f2 x:0 0 0
2026-03-10T12:37:26.091 INFO:tasks.workunit.client.1.vm07.stdout:8/3: dread - d1/f2 zero size
2026-03-10T12:37:26.092 INFO:tasks.workunit.client.1.vm07.stdout:8/4: chown d1/f2 181 1
2026-03-10T12:37:26.092 INFO:tasks.workunit.client.1.vm07.stdout:3/11: symlink l5 0
2026-03-10T12:37:26.093 INFO:tasks.workunit.client.1.vm07.stdout:3/12: truncate f1 11304 0
2026-03-10T12:37:26.094 INFO:tasks.workunit.client.1.vm07.stdout:2/6: rename d0/f3 to d0/f4 0
2026-03-10T12:37:26.097 INFO:tasks.workunit.client.1.vm07.stdout:8/5: mkdir d1/d3 0
2026-03-10T12:37:26.097 INFO:tasks.workunit.client.1.vm07.stdout:8/6: read - d1/f2 zero size
2026-03-10T12:37:26.099 INFO:tasks.workunit.client.1.vm07.stdout:2/7: symlink d0/l5 0
2026-03-10T12:37:26.099 INFO:tasks.workunit.client.1.vm07.stdout:2/8: dread - d0/f4 zero size
2026-03-10T12:37:26.100 INFO:tasks.workunit.client.1.vm07.stdout:2/9: readlink d0/l5 0
2026-03-10T12:37:26.100 INFO:tasks.workunit.client.1.vm07.stdout:2/10: read - d0/f4 zero size
2026-03-10T12:37:26.102 INFO:tasks.workunit.client.1.vm07.stdout:9/12: link l0 l1 0
2026-03-10T12:37:26.102 INFO:tasks.workunit.client.1.vm07.stdout:9/13: dwrite - no filename
2026-03-10T12:37:26.102 INFO:tasks.workunit.client.1.vm07.stdout:9/14: dread - no filename
2026-03-10T12:37:26.105 INFO:tasks.workunit.client.1.vm07.stdout:8/7: mkdir d1/d3/d4 0
2026-03-10T12:37:26.105 INFO:tasks.workunit.client.1.vm07.stdout:8/8: write d1/f2 [500247,31786] 0
2026-03-10T12:37:26.110 INFO:tasks.workunit.client.1.vm07.stdout:8/9: dwrite d1/f2 [0,4194304] 0
2026-03-10T12:37:26.113 INFO:tasks.workunit.client.1.vm07.stdout:9/15: chown l1 31451130 1
2026-03-10T12:37:26.113 INFO:tasks.workunit.client.1.vm07.stdout:9/16: dread - no filename
2026-03-10T12:37:26.135 INFO:tasks.workunit.client.1.vm07.stdout:6/3: fsync f0 0
2026-03-10T12:37:26.184 INFO:tasks.workunit.client.1.vm07.stdout:9/17: rename l0 to l2 0
2026-03-10T12:37:26.184 INFO:tasks.workunit.client.1.vm07.stdout:9/18: dwrite - no filename
2026-03-10T12:37:26.184 INFO:tasks.workunit.client.1.vm07.stdout:6/4: dwrite f0 [0,4194304] 0
2026-03-10T12:37:26.184 INFO:tasks.workunit.client.1.vm07.stdout:8/10: rmdir d1/d3/d4 0
2026-03-10T12:37:26.184 INFO:tasks.workunit.client.1.vm07.stdout:6/5: mkdir d1 0
2026-03-10T12:37:26.184 INFO:tasks.workunit.client.1.vm07.stdout:8/11: dread d1/f2 [0,4194304] 0
2026-03-10T12:37:26.184 INFO:tasks.workunit.client.1.vm07.stdout:6/6: creat d1/f2 x:0 0 0
2026-03-10T12:37:26.184 INFO:tasks.workunit.client.1.vm07.stdout:6/7: truncate d1/f2 239210 0
2026-03-10T12:37:26.230 INFO:tasks.workunit.client.1.vm07.stdout:0/19: getdents d0 0
2026-03-10T12:37:26.230 INFO:tasks.workunit.client.1.vm07.stdout:0/20: write - no filename
2026-03-10T12:37:26.232 INFO:tasks.workunit.client.1.vm07.stdout:0/21: creat d0/f4 x:0 0 0
2026-03-10T12:37:26.232 INFO:tasks.workunit.client.1.vm07.stdout:5/5: getdents d0 0
2026-03-10T12:37:26.235 INFO:tasks.workunit.client.1.vm07.stdout:5/6: creat d0/f2 x:0 0 0
2026-03-10T12:37:26.235 INFO:tasks.workunit.client.1.vm07.stdout:0/22: unlink d0/c3 0
2026-03-10T12:37:26.235 INFO:tasks.workunit.client.1.vm07.stdout:0/23: stat d0/c2 0
2026-03-10T12:37:26.236 INFO:tasks.workunit.client.1.vm07.stdout:1/18: rename l1 to l5 0
2026-03-10T12:37:26.236 INFO:tasks.workunit.client.1.vm07.stdout:1/19: dread - no filename
2026-03-10T12:37:26.237 INFO:tasks.workunit.client.1.vm07.stdout:0/24: unlink d0/f4 0
2026-03-10T12:37:26.237 INFO:tasks.workunit.client.1.vm07.stdout:0/25: dread - no filename
2026-03-10T12:37:26.237 INFO:tasks.workunit.client.1.vm07.stdout:0/26: write - no filename
2026-03-10T12:37:26.238 INFO:tasks.workunit.client.1.vm07.stdout:0/27: truncate - no filename
2026-03-10T12:37:26.238 INFO:tasks.workunit.client.1.vm07.stdout:1/20: creat f6 x:0 0 0
2026-03-10T12:37:26.239 INFO:tasks.workunit.client.1.vm07.stdout:5/7: symlink d0/l3 0
2026-03-10T12:37:26.239 INFO:tasks.workunit.client.1.vm07.stdout:1/21: write f6 [485009,28315] 0
2026-03-10T12:37:26.239 INFO:tasks.workunit.client.1.vm07.stdout:0/28: mknod d0/c5 0
2026-03-10T12:37:26.240 INFO:tasks.workunit.client.1.vm07.stdout:5/8: dread - d0/f1 zero size
2026-03-10T12:37:26.241 INFO:tasks.workunit.client.1.vm07.stdout:1/22: symlink l7 0
2026-03-10T12:37:26.242 INFO:tasks.workunit.client.1.vm07.stdout:0/29: rename d0/c5 to d0/c6 0
2026-03-10T12:37:26.242 INFO:tasks.workunit.client.1.vm07.stdout:0/30: dwrite - no filename
2026-03-10T12:37:26.242 INFO:tasks.workunit.client.1.vm07.stdout:0/31: write - no filename
2026-03-10T12:37:26.251 INFO:tasks.workunit.client.1.vm07.stdout:1/23: creat f8 x:0 0 0
2026-03-10T12:37:26.257 INFO:tasks.workunit.client.1.vm07.stdout:0/32: symlink d0/l7 0
2026-03-10T12:37:26.257 INFO:tasks.workunit.client.1.vm07.stdout:1/24: write f6 [719540,46539] 0
2026-03-10T12:37:26.257 INFO:tasks.workunit.client.1.vm07.stdout:0/33: chown d0 4294 1
2026-03-10T12:37:26.257 INFO:tasks.workunit.client.1.vm07.stdout:0/34: write - no filename
2026-03-10T12:37:26.257 INFO:tasks.workunit.client.1.vm07.stdout:0/35: fsync - no filename
2026-03-10T12:37:26.257 INFO:tasks.workunit.client.1.vm07.stdout:1/25: chown c3 55 1
2026-03-10T12:37:26.257 INFO:tasks.workunit.client.1.vm07.stdout:1/26: chown l7 1 1
2026-03-10T12:37:26.257 INFO:tasks.workunit.client.1.vm07.stdout:1/27: write f6 [716002,122459] 0
2026-03-10T12:37:26.257 INFO:tasks.workunit.client.1.vm07.stdout:1/28: dread - f8 zero size
2026-03-10T12:37:26.257 INFO:tasks.workunit.client.1.vm07.stdout:1/29: mkdir d9 0
2026-03-10T12:37:26.257 INFO:tasks.workunit.client.1.vm07.stdout:1/30: rename d9 to d9/da 22
2026-03-10T12:37:26.257 INFO:tasks.workunit.client.1.vm07.stdout:1/31: creat d9/fb x:0 0 0
2026-03-10T12:37:26.258 INFO:tasks.workunit.client.1.vm07.stdout:1/32: creat d9/fc x:0 0 0
2026-03-10T12:37:26.259 INFO:tasks.workunit.client.1.vm07.stdout:1/33: creat d9/fd x:0 0 0
2026-03-10T12:37:26.260 INFO:tasks.workunit.client.1.vm07.stdout:1/34: chown f8 0 1
2026-03-10T12:37:26.311 INFO:tasks.workunit.client.1.vm07.stdout:2/11: getdents d0 0
2026-03-10T12:37:26.311 INFO:tasks.workunit.client.1.vm07.stdout:6/8: fdatasync f0 0
2026-03-10T12:37:26.320 INFO:tasks.workunit.client.1.vm07.stdout:1/35: dread f6 [0,4194304] 0
2026-03-10T12:37:26.324 INFO:tasks.workunit.client.1.vm07.stdout:2/12: creat d0/f6 x:0 0 0
2026-03-10T12:37:26.331 INFO:tasks.workunit.client.1.vm07.stdout:6/9: rename f0 to d1/f3 0
2026-03-10T12:37:26.331 INFO:tasks.workunit.client.1.vm07.stdout:6/10: chown d1 162 1
2026-03-10T12:37:26.332 INFO:tasks.workunit.client.1.vm07.stdout:1/36: creat d9/fe x:0 0 0
2026-03-10T12:37:26.333 INFO:tasks.workunit.client.1.vm07.stdout:1/37: write d9/fe [764588,94515] 0
2026-03-10T12:37:26.336 INFO:tasks.workunit.client.1.vm07.stdout:6/11: dread d1/f3 [0,4194304] 0
2026-03-10T12:37:26.337 INFO:tasks.workunit.client.1.vm07.stdout:6/12: readlink - no filename
2026-03-10T12:37:26.339 INFO:tasks.workunit.client.1.vm07.stdout:6/13: dread d1/f2 [0,4194304] 0
2026-03-10T12:37:26.339 INFO:tasks.workunit.client.1.vm07.stdout:1/38: mkdir d9/df 0
2026-03-10T12:37:26.340 INFO:tasks.workunit.client.1.vm07.stdout:6/14: read d1/f2 [163320,39814] 0
2026-03-10T12:37:26.342 INFO:tasks.workunit.client.1.vm07.stdout:6/15: dread d1/f3 [0,4194304] 0
2026-03-10T12:37:26.343 INFO:tasks.workunit.client.1.vm07.stdout:6/16: chown d1 43030219 1
2026-03-10T12:37:26.343 INFO:tasks.workunit.client.1.vm07.stdout:6/17: write d1/f2 [1236000,17020] 0
2026-03-10T12:37:26.345 INFO:tasks.workunit.client.1.vm07.stdout:2/13: mknod d0/c7 0
2026-03-10T12:37:26.347 INFO:tasks.workunit.client.1.vm07.stdout:8/12: write d1/f2 [4342516,16664] 0
2026-03-10T12:37:26.347 INFO:tasks.workunit.client.1.vm07.stdout:9/19: rename l2 to l3 0
2026-03-10T12:37:26.351 INFO:tasks.workunit.client.1.vm07.stdout:5/9: fsync d0/f2 0
2026-03-10T12:37:26.362 INFO:tasks.workunit.client.1.vm07.stdout:5/10: dread - d0/f2 zero size
2026-03-10T12:37:26.362 INFO:tasks.workunit.client.1.vm07.stdout:5/11: write d0/f2 [413829,66841] 0
2026-03-10T12:37:26.362 INFO:tasks.workunit.client.1.vm07.stdout:6/18: write d1/f3 [2125619,13954] 0
2026-03-10T12:37:26.362 INFO:tasks.workunit.client.1.vm07.stdout:1/39: creat d9/df/f10 x:0 0 0
2026-03-10T12:37:26.362 INFO:tasks.workunit.client.1.vm07.stdout:8/13: unlink c0 0
2026-03-10T12:37:26.362 INFO:tasks.workunit.client.1.vm07.stdout:2/14: mknod d0/c8 0
2026-03-10T12:37:26.362 INFO:tasks.workunit.client.1.vm07.stdout:0/36: chown d0/c6 78257 1
2026-03-10T12:37:26.362 INFO:tasks.workunit.client.1.vm07.stdout:1/40: creat d9/df/f11 x:0 0 0
2026-03-10T12:37:26.369 INFO:tasks.workunit.client.1.vm07.stdout:9/20: link l1 l4 0
2026-03-10T12:37:26.372 INFO:tasks.workunit.client.1.vm07.stdout:0/37: symlink d0/l8 0
2026-03-10T12:37:26.372 INFO:tasks.workunit.client.1.vm07.stdout:2/15: symlink d0/l9 0
2026-03-10T12:37:26.372 INFO:tasks.workunit.client.1.vm07.stdout:5/12: rename d0/f2 to d0/f4 0
2026-03-10T12:37:26.372 INFO:tasks.workunit.client.1.vm07.stdout:9/21: mkdir d5 0
2026-03-10T12:37:26.375 INFO:tasks.workunit.client.1.vm07.stdout:5/13: dread - d0/f1 zero size
2026-03-10T12:37:26.376 INFO:tasks.workunit.client.1.vm07.stdout:0/38: rename d0/c6 to d0/c9 0
2026-03-10T12:37:26.377 INFO:tasks.workunit.client.1.vm07.stdout:1/41: dwrite d9/df/f10 [0,4194304] 0
2026-03-10T12:37:26.377 INFO:tasks.workunit.client.1.vm07.stdout:9/22: mkdir d5/d6 0
2026-03-10T12:37:26.384 INFO:tasks.workunit.client.1.vm07.stdout:6/19: dwrite d1/f2 [0,4194304] 0
2026-03-10T12:37:26.403 INFO:tasks.workunit.client.1.vm07.stdout:6/20: mkdir d1/d4 0
2026-03-10T12:37:26.403 INFO:tasks.workunit.client.1.vm07.stdout:6/21: dwrite d1/f2 [0,4194304] 0
2026-03-10T12:37:26.403 INFO:tasks.workunit.client.1.vm07.stdout:6/22: write d1/f2 [1865187,101831] 0
2026-03-10T12:37:26.403 INFO:tasks.workunit.client.1.vm07.stdout:6/23: chown d1/f3 9361 1
2026-03-10T12:37:26.403 INFO:tasks.workunit.client.1.vm07.stdout:6/24: readlink - no filename
2026-03-10T12:37:26.403 INFO:tasks.workunit.client.1.vm07.stdout:5/14: dwrite d0/f1 [0,4194304] 0
2026-03-10T12:37:26.404 INFO:tasks.workunit.client.1.vm07.stdout:5/15: readlink d0/l3 0
2026-03-10T12:37:26.407 INFO:tasks.workunit.client.1.vm07.stdout:6/25: dwrite d1/f3 [0,4194304] 0
2026-03-10T12:37:26.410 INFO:tasks.workunit.client.1.vm07.stdout:0/39: mknod d0/ca 0
2026-03-10T12:37:26.410 INFO:tasks.workunit.client.1.vm07.stdout:0/40: truncate - no filename
2026-03-10T12:37:26.418 INFO:tasks.workunit.client.1.vm07.stdout:6/26: dwrite d1/f2 [0,4194304] 0
2026-03-10T12:37:26.418 INFO:tasks.workunit.client.1.vm07.stdout:9/23: rmdir d5/d6 0
2026-03-10T12:37:26.418 INFO:tasks.workunit.client.1.vm07.stdout:9/24: dwrite - no filename
2026-03-10T12:37:26.418 INFO:tasks.workunit.client.1.vm07.stdout:5/16: dwrite d0/f1 [0,4194304] 0
2026-03-10T12:37:26.436 INFO:tasks.workunit.client.1.vm07.stdout:5/17: dread d0/f1 [0,4194304] 0
2026-03-10T12:37:26.450 INFO:tasks.workunit.client.1.vm07.stdout:9/25: link l4 d5/l7 0
2026-03-10T12:37:26.450 INFO:tasks.workunit.client.1.vm07.stdout:9/26: read - no filename
2026-03-10T12:37:26.450 INFO:tasks.workunit.client.1.vm07.stdout:5/18: creat d0/f5 x:0 0 0
2026-03-10T12:37:26.450 INFO:tasks.workunit.client.1.vm07.stdout:5/19: dread - d0/f5 zero size
2026-03-10T12:37:26.450 INFO:tasks.workunit.client.1.vm07.stdout:9/27: creat d5/f8 x:0 0 0
2026-03-10T12:37:26.450 INFO:tasks.workunit.client.1.vm07.stdout:5/20: mknod d0/c6 0
2026-03-10T12:37:26.451 INFO:tasks.workunit.client.1.vm07.stdout:9/28: rename l1 to d5/l9 0
2026-03-10T12:37:26.451 INFO:tasks.workunit.client.1.vm07.stdout:9/29: write d5/f8 [706122,119016] 0
2026-03-10T12:37:26.455 INFO:tasks.workunit.client.1.vm07.stdout:9/30: dwrite d5/f8 [0,4194304] 0
2026-03-10T12:37:26.624 INFO:tasks.workunit.client.1.vm07.stdout:1/42: fsync d9/fb 0
2026-03-10T12:37:26.637 INFO:tasks.workunit.client.1.vm07.stdout:8/14: truncate d1/f2 287060 0
2026-03-10T12:37:26.637 INFO:tasks.workunit.client.1.vm07.stdout:2/16: rmdir d0 39
2026-03-10T12:37:26.637 INFO:tasks.workunit.client.1.vm07.stdout:5/21: fsync d0/f4 0
2026-03-10T12:37:26.637 INFO:tasks.workunit.client.1.vm07.stdout:1/43: rmdir d9/df 39
2026-03-10T12:37:26.646 INFO:tasks.workunit.client.1.vm07.stdout:8/15: creat d1/d3/f5 x:0 0 0
2026-03-10T12:37:26.646 INFO:tasks.workunit.client.1.vm07.stdout:1/44: chown d9/df/f11 0 1
2026-03-10T12:37:26.647 INFO:tasks.workunit.client.1.vm07.stdout:1/45: write f8 [180495,37878] 0
2026-03-10T12:37:26.648 INFO:tasks.workunit.client.1.vm07.stdout:5/22: symlink d0/l7 0
2026-03-10T12:37:26.650 INFO:tasks.workunit.client.1.vm07.stdout:8/16: mkdir d1/d3/d6 0
2026-03-10T12:37:26.650 INFO:tasks.workunit.client.1.vm07.stdout:8/17: dread - d1/d3/f5 zero size
2026-03-10T12:37:26.652 INFO:tasks.workunit.client.1.vm07.stdout:8/18: rename d1/d3/f5 to d1/f7 0
2026-03-10T12:37:26.653 INFO:tasks.workunit.client.1.vm07.stdout:8/19: dread - d1/f7 zero size
2026-03-10T12:37:26.655 INFO:tasks.workunit.client.1.vm07.stdout:1/46: creat d9/f12 x:0 0 0
2026-03-10T12:37:26.655 INFO:tasks.workunit.client.1.vm07.stdout:1/47: chown l7 203 1
2026-03-10T12:37:26.656 INFO:tasks.workunit.client.1.vm07.stdout:5/23: rename d0/c6 to d0/c8 0
2026-03-10T12:37:26.657 INFO:tasks.workunit.client.1.vm07.stdout:2/17: getdents d0 0
2026-03-10T12:37:26.661 INFO:tasks.workunit.client.1.vm07.stdout:2/18: dwrite d0/f6 [0,4194304] 0
2026-03-10T12:37:26.664 INFO:tasks.workunit.client.1.vm07.stdout:8/20: link d1/f7 d1/d3/f8 0
2026-03-10T12:37:26.665 INFO:tasks.workunit.client.1.vm07.stdout:1/48: creat d9/df/f13 x:0 0 0
2026-03-10T12:37:26.665 INFO:tasks.workunit.client.1.vm07.stdout:1/49: write d9/fd [504357,100998] 0
2026-03-10T12:37:26.666 INFO:tasks.workunit.client.1.vm07.stdout:1/50: fdatasync d9/df/f11 0
2026-03-10T12:37:26.668 INFO:tasks.workunit.client.1.vm07.stdout:2/19: rename d0/c8 to d0/ca 0
2026-03-10T12:37:26.668 INFO:tasks.workunit.client.1.vm07.stdout:2/20: chown d0 1441388 1
2026-03-10T12:37:26.669 INFO:tasks.workunit.client.1.vm07.stdout:2/21: dread - d0/f1 zero size
2026-03-10T12:37:26.670 INFO:tasks.workunit.client.1.vm07.stdout:8/21: creat d1/d3/d6/f9 x:0 0 0
2026-03-10T12:37:26.670 INFO:tasks.workunit.client.1.vm07.stdout:8/22: readlink - no filename
2026-03-10T12:37:26.670 INFO:tasks.workunit.client.1.vm07.stdout:1/51: mknod d9/c14 0
2026-03-10T12:37:26.672 INFO:tasks.workunit.client.1.vm07.stdout:5/24: rename d0/f4 to d0/f9 0
2026-03-10T12:37:26.675 INFO:tasks.workunit.client.1.vm07.stdout:8/23: symlink d1/d3/la 0
2026-03-10T12:37:26.678 INFO:tasks.workunit.client.1.vm07.stdout:5/25: creat d0/fa x:0 0 0
2026-03-10T12:37:27.015 INFO:tasks.workunit.client.1.vm07.stdout:6/27: rmdir d1 39
2026-03-10T12:37:27.015 INFO:tasks.workunit.client.1.vm07.stdout:0/41: rmdir d0 39
2026-03-10T12:37:27.018 INFO:tasks.workunit.client.1.vm07.stdout:6/28: dread d1/f2 [0,4194304] 0
2026-03-10T12:37:27.019 INFO:tasks.workunit.client.1.vm07.stdout:6/29: dread d1/f3 [0,4194304] 0
2026-03-10T12:37:27.021 INFO:tasks.workunit.client.1.vm07.stdout:6/30: stat d1/f3 0
2026-03-10T12:37:27.021 INFO:tasks.workunit.client.1.vm07.stdout:6/31: fdatasync d1/f3 0
2026-03-10T12:37:27.023 INFO:tasks.workunit.client.1.vm07.stdout:0/42: mknod d0/cb 0
2026-03-10T12:37:27.028 INFO:tasks.workunit.client.1.vm07.stdout:0/43: unlink d0/ca 0
2026-03-10T12:37:27.051 INFO:tasks.workunit.client.1.vm07.stdout:0/44: rename d0/l7 to d0/lc 0
2026-03-10T12:37:27.051 INFO:tasks.workunit.client.1.vm07.stdout:0/45: fsync - no filename
2026-03-10T12:37:27.051 INFO:tasks.workunit.client.1.vm07.stdout:0/46: dread - no filename
2026-03-10T12:37:27.051 INFO:tasks.workunit.client.1.vm07.stdout:0/47: dread - no filename
2026-03-10T12:37:27.051 INFO:tasks.workunit.client.1.vm07.stdout:0/48: dread - no filename
2026-03-10T12:37:27.051 INFO:tasks.workunit.client.1.vm07.stdout:0/49: creat d0/fd x:0 0 0
2026-03-10T12:37:27.051 INFO:tasks.workunit.client.1.vm07.stdout:0/50: link d0/c2 d0/ce 0
2026-03-10T12:37:27.051 INFO:tasks.workunit.client.1.vm07.stdout:0/51: chown d0/c9 3880992 1
2026-03-10T12:37:27.051 INFO:tasks.workunit.client.1.vm07.stdout:0/52: mknod d0/cf 0
2026-03-10T12:37:27.051 INFO:tasks.workunit.client.1.vm07.stdout:0/53: dwrite d0/fd [0,4194304] 0
2026-03-10T12:37:27.051 INFO:tasks.workunit.client.1.vm07.stdout:0/54: dwrite d0/fd [0,4194304] 0
2026-03-10T12:37:27.051 INFO:tasks.workunit.client.1.vm07.stdout:0/55: stat d0/fd 0
2026-03-10T12:37:27.060 INFO:tasks.workunit.client.1.vm07.stdout:4/4: sync
2026-03-10T12:37:27.060 INFO:tasks.workunit.client.1.vm07.stdout:3/13: sync
2026-03-10T12:37:27.060 INFO:tasks.workunit.client.1.vm07.stdout:7/6: sync
2026-03-10T12:37:27.061 INFO:tasks.workunit.client.1.vm07.stdout:3/14: chown f1 27629135 1
2026-03-10T12:37:27.061 INFO:tasks.workunit.client.1.vm07.stdout:3/15: truncate f4 407752 0
2026-03-10T12:37:27.061 INFO:tasks.workunit.client.1.vm07.stdout:3/16: truncate f0 2951 0
2026-03-10T12:37:27.068 INFO:tasks.workunit.client.1.vm07.stdout:3/17: creat f6 x:0 0 0
2026-03-10T12:37:27.068 INFO:tasks.workunit.client.1.vm07.stdout:3/18: write f4 [1262362,809] 0
2026-03-10T12:37:27.071 INFO:tasks.workunit.client.1.vm07.stdout:4/5: mknod d0/c1 0
2026-03-10T12:37:27.073 INFO:tasks.workunit.client.1.vm07.stdout:4/6: dread - no filename
2026-03-10T12:37:27.074 INFO:tasks.workunit.client.1.vm07.stdout:3/19: dread f1 [0,4194304] 0
2026-03-10T12:37:27.075 INFO:tasks.workunit.client.1.vm07.stdout:7/7: sync
2026-03-10T12:37:27.075 INFO:tasks.workunit.client.1.vm07.stdout:7/8: chown d0 18616307 1
2026-03-10T12:37:27.075 INFO:tasks.workunit.client.1.vm07.stdout:7/9: stat d0 0
2026-03-10T12:37:27.076 INFO:tasks.workunit.client.1.vm07.stdout:4/7: mknod d0/c2 0
2026-03-10T12:37:27.076 INFO:tasks.workunit.client.1.vm07.stdout:4/8: truncate - no filename
2026-03-10T12:37:27.077 INFO:tasks.workunit.client.1.vm07.stdout:3/20: symlink l7 0
2026-03-10T12:37:27.088 INFO:tasks.workunit.client.1.vm07.stdout:4/9: unlink d0/c1 0
2026-03-10T12:37:27.089 INFO:tasks.workunit.client.1.vm07.stdout:3/21: unlink f4 0
2026-03-10T12:37:27.089 INFO:tasks.workunit.client.1.vm07.stdout:3/22: write f0 [441786,113520] 0
2026-03-10T12:37:27.091 INFO:tasks.workunit.client.1.vm07.stdout:7/10: creat d0/f1 x:0 0 0
2026-03-10T12:37:27.098 INFO:tasks.workunit.client.1.vm07.stdout:7/11: dwrite d0/f1 [0,4194304] 0
2026-03-10T12:37:27.099 INFO:tasks.workunit.client.1.vm07.stdout:4/10: mkdir d0/d3 0
2026-03-10T12:37:27.102 INFO:tasks.workunit.client.1.vm07.stdout:4/11: mkdir d0/d4 0
2026-03-10T12:37:27.102 INFO:tasks.workunit.client.1.vm07.stdout:4/12: truncate - no filename
2026-03-10T12:37:27.102 INFO:tasks.workunit.client.1.vm07.stdout:4/13: write - no filename
2026-03-10T12:37:27.103 INFO:tasks.workunit.client.1.vm07.stdout:3/23: link f0 f8 0
2026-03-10T12:37:27.104 INFO:tasks.workunit.client.1.vm07.stdout:3/24: mknod c9 0
2026-03-10T12:37:27.104 INFO:tasks.workunit.client.1.vm07.stdout:3/25: readlink l5 0
2026-03-10T12:37:27.104 INFO:tasks.workunit.client.1.vm07.stdout:3/26: rmdir - no directory
2026-03-10T12:37:27.105 INFO:tasks.workunit.client.1.vm07.stdout:4/14: mkdir d0/d4/d5 0
2026-03-10T12:37:27.106 INFO:tasks.workunit.client.1.vm07.stdout:3/27: symlink la 0
2026-03-10T12:37:27.107 INFO:tasks.workunit.client.1.vm07.stdout:3/28: mknod cb 0
2026-03-10T12:37:27.107 INFO:tasks.workunit.client.1.vm07.stdout:3/29: stat l5 0
2026-03-10T12:37:27.109 INFO:tasks.workunit.client.1.vm07.stdout:4/15: creat d0/d4/d5/f6 x:0 0 0
2026-03-10T12:37:27.112 INFO:tasks.workunit.client.1.vm07.stdout:4/16: unlink d0/d4/d5/f6 0
2026-03-10T12:37:27.112 INFO:tasks.workunit.client.1.vm07.stdout:4/17: fdatasync - no filename
2026-03-10T12:37:27.119 INFO:tasks.workunit.client.1.vm07.stdout:4/18: sync
2026-03-10T12:37:27.121 INFO:tasks.workunit.client.1.vm07.stdout:4/19: chown d0/c2 404 1
2026-03-10T12:37:27.122 INFO:tasks.workunit.client.1.vm07.stdout:4/20: creat d0/f7 x:0 0 0
2026-03-10T12:37:27.123 INFO:tasks.workunit.client.1.vm07.stdout:4/21: dread - d0/f7 zero size
2026-03-10T12:37:27.123 INFO:tasks.workunit.client.1.vm07.stdout:4/22: write d0/f7 [11803,63133] 0
2026-03-10T12:37:27.124 INFO:tasks.workunit.client.1.vm07.stdout:4/23: chown d0/f7 87324143 1
2026-03-10T12:37:27.125 INFO:tasks.workunit.client.1.vm07.stdout:4/24: rename d0/c2 to d0/d3/c8 0
2026-03-10T12:37:27.154 INFO:tasks.workunit.client.1.vm07.stdout:9/31: truncate d5/f8 238695 0
2026-03-10T12:37:27.219 INFO:tasks.workunit.client.1.vm07.stdout:1/52: rmdir d9/df 39
2026-03-10T12:37:27.220 INFO:tasks.workunit.client.1.vm07.stdout:1/53: rmdir d9/df 39
2026-03-10T12:37:27.224 INFO:tasks.workunit.client.1.vm07.stdout:1/54: dwrite d9/fb [0,4194304] 0
2026-03-10T12:37:27.227 INFO:tasks.workunit.client.1.vm07.stdout:1/55: creat d9/df/f15 x:0 0 0
2026-03-10T12:37:27.233 INFO:tasks.workunit.client.1.vm07.stdout:1/56: dwrite f8 [0,4194304] 0
2026-03-10T12:37:27.240 INFO:tasks.workunit.client.1.vm07.stdout:8/24: rmdir d1/d3 39
2026-03-10T12:37:27.246 INFO:tasks.workunit.client.1.vm07.stdout:5/26: rmdir d0 39
2026-03-10T12:37:27.248 INFO:tasks.workunit.client.1.vm07.stdout:2/22: dwrite d0/f1 [0,4194304] 0
2026-03-10T12:37:27.252 INFO:tasks.workunit.client.1.vm07.stdout:8/25: write d1/d3/d6/f9 [375196,114516] 0
2026-03-10T12:37:27.261 INFO:tasks.workunit.client.1.vm07.stdout:2/23: rmdir d0 39
2026-03-10T12:37:27.272 INFO:tasks.workunit.client.1.vm07.stdout:5/27: mknod d0/cb 0
2026-03-10T12:37:27.274 INFO:tasks.workunit.client.1.vm07.stdout:2/24: chown d0/c7 45396 1
2026-03-10T12:37:27.275 INFO:tasks.workunit.client.1.vm07.stdout:8/26: creat d1/fb x:0 0 0
2026-03-10T12:37:27.280 INFO:tasks.workunit.client.1.vm07.stdout:8/27: dwrite d1/fb [0,4194304] 0
2026-03-10T12:37:27.294 INFO:tasks.workunit.client.1.vm07.stdout:5/28: creat d0/fc x:0 0 0
2026-03-10T12:37:27.295 INFO:tasks.workunit.client.1.vm07.stdout:5/29: chown d0/l7 2 1
2026-03-10T12:37:27.295 INFO:tasks.workunit.client.1.vm07.stdout:5/30: fdatasync d0/fa 0
2026-03-10T12:37:27.298 INFO:tasks.workunit.client.1.vm07.stdout:8/28: creat d1/fc x:0 0 0
2026-03-10T12:37:27.309 INFO:tasks.workunit.client.1.vm07.stdout:8/29: dwrite d1/f2 [0,4194304] 0
2026-03-10T12:37:27.310 INFO:tasks.workunit.client.1.vm07.stdout:5/31: rename d0/f1 to d0/fd
0 2026-03-10T12:37:27.320 INFO:tasks.workunit.client.1.vm07.stdout:5/32: dwrite d0/fc [0,4194304] 0 2026-03-10T12:37:27.322 INFO:tasks.workunit.client.1.vm07.stdout:8/30: mknod d1/d3/d6/cd 0 2026-03-10T12:37:27.322 INFO:tasks.workunit.client.1.vm07.stdout:5/33: chown d0/l3 14754 1 2026-03-10T12:37:27.322 INFO:tasks.workunit.client.1.vm07.stdout:8/31: stat d1/d3/d6 0 2026-03-10T12:37:27.326 INFO:tasks.workunit.client.1.vm07.stdout:8/32: dread d1/fb [0,4194304] 0 2026-03-10T12:37:27.328 INFO:tasks.workunit.client.1.vm07.stdout:5/34: mknod d0/ce 0 2026-03-10T12:37:27.336 INFO:tasks.workunit.client.1.vm07.stdout:8/33: symlink d1/le 0 2026-03-10T12:37:27.338 INFO:tasks.workunit.client.1.vm07.stdout:5/35: creat d0/ff x:0 0 0 2026-03-10T12:37:27.344 INFO:tasks.workunit.client.1.vm07.stdout:5/36: write d0/fa [212751,98233] 0 2026-03-10T12:37:27.344 INFO:tasks.workunit.client.1.vm07.stdout:5/37: link d0/f5 d0/f10 0 2026-03-10T12:37:27.344 INFO:tasks.workunit.client.1.vm07.stdout:5/38: write d0/f10 [949574,36770] 0 2026-03-10T12:37:27.353 INFO:tasks.workunit.client.1.vm07.stdout:5/39: creat d0/f11 x:0 0 0 2026-03-10T12:37:27.358 INFO:tasks.workunit.client.1.vm07.stdout:5/40: dwrite d0/ff [0,4194304] 0 2026-03-10T12:37:27.360 INFO:tasks.workunit.client.1.vm07.stdout:6/32: truncate d1/f3 3950861 0 2026-03-10T12:37:27.361 INFO:tasks.workunit.client.1.vm07.stdout:0/56: getdents d0 0 2026-03-10T12:37:27.361 INFO:tasks.workunit.client.1.vm07.stdout:6/33: chown d1/f2 3838866 1 2026-03-10T12:37:27.362 INFO:tasks.workunit.client.1.vm07.stdout:0/57: write d0/fd [3199276,70832] 0 2026-03-10T12:37:27.363 INFO:tasks.workunit.client.1.vm07.stdout:5/41: write d0/f9 [1299100,68485] 0 2026-03-10T12:37:27.377 INFO:tasks.workunit.client.1.vm07.stdout:0/58: unlink d0/ce 0 2026-03-10T12:37:27.384 INFO:tasks.workunit.client.1.vm07.stdout:5/42: creat d0/f12 x:0 0 0 2026-03-10T12:37:27.386 INFO:tasks.workunit.client.1.vm07.stdout:6/34: getdents d1 0 2026-03-10T12:37:27.388 
INFO:tasks.workunit.client.1.vm07.stdout:5/43: dwrite d0/f9 [0,4194304] 0 2026-03-10T12:37:27.389 INFO:tasks.workunit.client.1.vm07.stdout:5/44: dread - d0/f12 zero size 2026-03-10T12:37:27.394 INFO:tasks.workunit.client.1.vm07.stdout:7/12: write d0/f1 [5104137,19402] 0 2026-03-10T12:37:27.403 INFO:tasks.workunit.client.1.vm07.stdout:7/13: mknod d0/c2 0 2026-03-10T12:37:27.408 INFO:tasks.workunit.client.1.vm07.stdout:7/14: dwrite d0/f1 [0,4194304] 0 2026-03-10T12:37:27.411 INFO:tasks.workunit.client.1.vm07.stdout:6/35: getdents d1 0 2026-03-10T12:37:27.412 INFO:tasks.workunit.client.1.vm07.stdout:7/15: creat d0/f3 x:0 0 0 2026-03-10T12:37:27.417 INFO:tasks.workunit.client.1.vm07.stdout:3/30: truncate f2 47663 0 2026-03-10T12:37:27.422 INFO:tasks.workunit.client.1.vm07.stdout:7/16: rename d0/f1 to d0/f4 0 2026-03-10T12:37:27.422 INFO:tasks.workunit.client.1.vm07.stdout:7/17: dread - d0/f3 zero size 2026-03-10T12:37:27.422 INFO:tasks.workunit.client.1.vm07.stdout:3/31: write f0 [1155905,53933] 0 2026-03-10T12:37:27.427 INFO:tasks.workunit.client.1.vm07.stdout:7/18: symlink d0/l5 0 2026-03-10T12:37:27.427 INFO:tasks.workunit.client.1.vm07.stdout:7/19: dread - d0/f3 zero size 2026-03-10T12:37:27.437 INFO:tasks.workunit.client.1.vm07.stdout:4/25: rename d0/d3/c8 to d0/d4/c9 0 2026-03-10T12:37:27.438 INFO:tasks.workunit.client.1.vm07.stdout:4/26: mkdir d0/d4/d5/da 0 2026-03-10T12:37:27.440 INFO:tasks.workunit.client.1.vm07.stdout:4/27: symlink d0/d4/d5/lb 0 2026-03-10T12:37:27.442 INFO:tasks.workunit.client.1.vm07.stdout:3/32: getdents . 
0 2026-03-10T12:37:27.448 INFO:tasks.workunit.client.1.vm07.stdout:4/28: mknod d0/d4/cc 0 2026-03-10T12:37:27.448 INFO:tasks.workunit.client.1.vm07.stdout:4/29: symlink d0/d3/ld 0 2026-03-10T12:37:27.449 INFO:tasks.workunit.client.1.vm07.stdout:4/30: dwrite d0/f7 [0,4194304] 0 2026-03-10T12:37:27.453 INFO:tasks.workunit.client.1.vm07.stdout:4/31: read d0/f7 [540077,101585] 0 2026-03-10T12:37:27.459 INFO:tasks.workunit.client.1.vm07.stdout:4/32: readlink d0/d4/d5/lb 0 2026-03-10T12:37:27.460 INFO:tasks.workunit.client.1.vm07.stdout:4/33: write d0/f7 [1004311,40892] 0 2026-03-10T12:37:27.460 INFO:tasks.workunit.client.1.vm07.stdout:8/34: fdatasync d1/d3/d6/f9 0 2026-03-10T12:37:27.460 INFO:tasks.workunit.client.1.vm07.stdout:1/57: getdents d9/df 0 2026-03-10T12:37:27.464 INFO:tasks.workunit.client.1.vm07.stdout:1/58: dwrite f8 [0,4194304] 0 2026-03-10T12:37:27.470 INFO:tasks.workunit.client.1.vm07.stdout:1/59: dread d9/fd [0,4194304] 0 2026-03-10T12:37:27.472 INFO:tasks.workunit.client.1.vm07.stdout:1/60: dwrite d9/f12 [0,4194304] 0 2026-03-10T12:37:27.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:27 vm00.local ceph-mon[50686]: pgmap v148: 65 pgs: 65 active+clean; 187 MiB data, 1.5 GiB used, 118 GiB / 120 GiB avail; 5.5 MiB/s wr, 538 op/s 2026-03-10T12:37:27.484 INFO:tasks.workunit.client.1.vm07.stdout:8/35: unlink d1/d3/d6/cd 0 2026-03-10T12:37:27.484 INFO:tasks.workunit.client.1.vm07.stdout:2/25: truncate d0/f1 3291717 0 2026-03-10T12:37:27.489 INFO:tasks.workunit.client.1.vm07.stdout:7/20: sync 2026-03-10T12:37:27.489 INFO:tasks.workunit.client.1.vm07.stdout:4/34: mknod d0/d4/d5/da/ce 0 2026-03-10T12:37:27.490 INFO:tasks.workunit.client.1.vm07.stdout:7/21: write d0/f3 [1013321,130388] 0 2026-03-10T12:37:27.503 INFO:tasks.workunit.client.1.vm07.stdout:7/22: creat d0/f6 x:0 0 0 2026-03-10T12:37:27.506 INFO:tasks.workunit.client.1.vm07.stdout:4/35: write d0/f7 [42793,102745] 0 2026-03-10T12:37:27.508 INFO:tasks.workunit.client.1.vm07.stdout:4/36: symlink 
d0/d4/d5/da/lf 0 2026-03-10T12:37:27.508 INFO:tasks.workunit.client.1.vm07.stdout:1/61: creat d9/f16 x:0 0 0 2026-03-10T12:37:27.510 INFO:tasks.workunit.client.1.vm07.stdout:1/62: dread d9/fd [0,4194304] 0 2026-03-10T12:37:27.512 INFO:tasks.workunit.client.1.vm07.stdout:1/63: symlink d9/df/l17 0 2026-03-10T12:37:27.515 INFO:tasks.workunit.client.1.vm07.stdout:1/64: dread d9/df/f10 [0,4194304] 0 2026-03-10T12:37:27.520 INFO:tasks.workunit.client.1.vm07.stdout:1/65: creat d9/df/f18 x:0 0 0 2026-03-10T12:37:27.521 INFO:tasks.workunit.client.1.vm07.stdout:7/23: sync 2026-03-10T12:37:27.521 INFO:tasks.workunit.client.1.vm07.stdout:7/24: write d0/f6 [397959,70202] 0 2026-03-10T12:37:27.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:27 vm07.local ceph-mon[58582]: pgmap v148: 65 pgs: 65 active+clean; 187 MiB data, 1.5 GiB used, 118 GiB / 120 GiB avail; 5.5 MiB/s wr, 538 op/s 2026-03-10T12:37:27.580 INFO:tasks.workunit.client.1.vm07.stdout:6/36: dread d1/f3 [0,4194304] 0 2026-03-10T12:37:27.581 INFO:tasks.workunit.client.1.vm07.stdout:6/37: dread d1/f2 [0,4194304] 0 2026-03-10T12:37:27.586 INFO:tasks.workunit.client.1.vm07.stdout:5/45: truncate d0/f10 2741 0 2026-03-10T12:37:27.586 INFO:tasks.workunit.client.1.vm07.stdout:3/33: write f2 [717482,33111] 0 2026-03-10T12:37:27.586 INFO:tasks.workunit.client.1.vm07.stdout:7/25: fsync d0/f3 0 2026-03-10T12:37:27.591 INFO:tasks.workunit.client.1.vm07.stdout:5/46: dwrite d0/ff [4194304,4194304] 0 2026-03-10T12:37:27.599 INFO:tasks.workunit.client.1.vm07.stdout:6/38: rmdir d1 39 2026-03-10T12:37:27.601 INFO:tasks.workunit.client.1.vm07.stdout:3/34: mkdir dc 0 2026-03-10T12:37:27.602 INFO:tasks.workunit.client.1.vm07.stdout:3/35: truncate f1 337579 0 2026-03-10T12:37:27.604 INFO:tasks.workunit.client.1.vm07.stdout:7/26: rename d0/f6 to d0/f7 0 2026-03-10T12:37:27.613 INFO:tasks.workunit.client.1.vm07.stdout:2/26: truncate d0/f4 3780467 0 2026-03-10T12:37:27.614 INFO:tasks.workunit.client.1.vm07.stdout:2/27: chown d0 0 1 
2026-03-10T12:37:27.615 INFO:tasks.workunit.client.1.vm07.stdout:3/36: mkdir dc/dd 0 2026-03-10T12:37:27.615 INFO:tasks.workunit.client.1.vm07.stdout:5/47: creat d0/f13 x:0 0 0 2026-03-10T12:37:27.616 INFO:tasks.workunit.client.1.vm07.stdout:3/37: rename dc/dd to dc/dd/de 22 2026-03-10T12:37:27.618 INFO:tasks.workunit.client.1.vm07.stdout:3/38: dread f1 [0,4194304] 0 2026-03-10T12:37:27.619 INFO:tasks.workunit.client.1.vm07.stdout:7/27: dwrite d0/f4 [4194304,4194304] 0 2026-03-10T12:37:27.635 INFO:tasks.workunit.client.1.vm07.stdout:6/39: mkdir d1/d4/d5 0 2026-03-10T12:37:27.636 INFO:tasks.workunit.client.1.vm07.stdout:3/39: dwrite f8 [0,4194304] 0 2026-03-10T12:37:27.644 INFO:tasks.workunit.client.1.vm07.stdout:8/36: dwrite d1/d3/f8 [0,4194304] 0 2026-03-10T12:37:27.648 INFO:tasks.workunit.client.1.vm07.stdout:6/40: mkdir d1/d4/d6 0 2026-03-10T12:37:27.648 INFO:tasks.workunit.client.1.vm07.stdout:8/37: write d1/d3/d6/f9 [169359,27926] 0 2026-03-10T12:37:27.651 INFO:tasks.workunit.client.1.vm07.stdout:6/41: rename d1 to d1/d4/d6/d7 22 2026-03-10T12:37:27.652 INFO:tasks.workunit.client.1.vm07.stdout:2/28: symlink d0/lb 0 2026-03-10T12:37:27.659 INFO:tasks.workunit.client.1.vm07.stdout:6/42: dread d1/f3 [0,4194304] 0 2026-03-10T12:37:27.660 INFO:tasks.workunit.client.1.vm07.stdout:8/38: dwrite d1/fc [0,4194304] 0 2026-03-10T12:37:27.662 INFO:tasks.workunit.client.1.vm07.stdout:5/48: mkdir d0/d14 0 2026-03-10T12:37:27.662 INFO:tasks.workunit.client.1.vm07.stdout:5/49: readlink d0/l7 0 2026-03-10T12:37:27.666 INFO:tasks.workunit.client.1.vm07.stdout:7/28: link d0/f3 d0/f8 0 2026-03-10T12:37:27.673 INFO:tasks.workunit.client.1.vm07.stdout:4/37: truncate d0/f7 3312966 0 2026-03-10T12:37:27.677 INFO:tasks.workunit.client.1.vm07.stdout:8/39: creat d1/d3/ff x:0 0 0 2026-03-10T12:37:27.677 INFO:tasks.workunit.client.1.vm07.stdout:8/40: stat d1/f2 0 2026-03-10T12:37:27.681 INFO:tasks.workunit.client.1.vm07.stdout:5/50: sync 2026-03-10T12:37:27.681 
INFO:tasks.workunit.client.1.vm07.stdout:0/59: fsync d0/fd 0 2026-03-10T12:37:27.681 INFO:tasks.workunit.client.1.vm07.stdout:0/60: fsync d0/fd 0 2026-03-10T12:37:27.688 INFO:tasks.workunit.client.1.vm07.stdout:2/29: fdatasync d0/f1 0 2026-03-10T12:37:27.689 INFO:tasks.workunit.client.1.vm07.stdout:1/66: rmdir d9/df 39 2026-03-10T12:37:27.693 INFO:tasks.workunit.client.1.vm07.stdout:4/38: chown d0/d4/c9 6634 1 2026-03-10T12:37:27.700 INFO:tasks.workunit.client.1.vm07.stdout:8/41: creat d1/d3/f10 x:0 0 0 2026-03-10T12:37:27.706 INFO:tasks.workunit.client.1.vm07.stdout:8/42: dwrite d1/d3/ff [0,4194304] 0 2026-03-10T12:37:27.716 INFO:tasks.workunit.client.1.vm07.stdout:4/39: mkdir d0/d4/d10 0 2026-03-10T12:37:27.732 INFO:tasks.workunit.client.1.vm07.stdout:5/51: mknod d0/c15 0 2026-03-10T12:37:27.738 INFO:tasks.workunit.client.1.vm07.stdout:8/43: mkdir d1/d3/d11 0 2026-03-10T12:37:27.745 INFO:tasks.workunit.client.1.vm07.stdout:8/44: dwrite d1/f7 [0,4194304] 0 2026-03-10T12:37:27.755 INFO:tasks.workunit.client.1.vm07.stdout:6/43: dwrite d1/f3 [0,4194304] 0 2026-03-10T12:37:27.778 INFO:tasks.workunit.client.1.vm07.stdout:0/61: truncate d0/fd 2886764 0 2026-03-10T12:37:27.784 INFO:tasks.workunit.client.1.vm07.stdout:8/45: symlink d1/d3/l12 0 2026-03-10T12:37:27.789 INFO:tasks.workunit.client.1.vm07.stdout:6/44: rmdir d1/d4 39 2026-03-10T12:37:27.814 INFO:tasks.workunit.client.1.vm07.stdout:3/40: write f8 [4163777,125115] 0 2026-03-10T12:37:27.836 INFO:tasks.workunit.client.1.vm07.stdout:7/29: fsync d0/f3 0 2026-03-10T12:37:27.843 INFO:tasks.workunit.client.1.vm07.stdout:2/30: dread d0/f4 [0,4194304] 0 2026-03-10T12:37:27.844 INFO:tasks.workunit.client.1.vm07.stdout:9/32: truncate d5/f8 118439 0 2026-03-10T12:37:27.894 INFO:tasks.workunit.client.1.vm07.stdout:1/67: write d9/df/f10 [3230295,85315] 0 2026-03-10T12:37:27.897 INFO:tasks.workunit.client.1.vm07.stdout:5/52: unlink d0/f10 0 2026-03-10T12:37:27.902 INFO:tasks.workunit.client.1.vm07.stdout:3/41: creat dc/ff x:0 0 
0 2026-03-10T12:37:27.902 INFO:tasks.workunit.client.1.vm07.stdout:7/30: rename d0/f7 to d0/f9 0 2026-03-10T12:37:27.903 INFO:tasks.workunit.client.1.vm07.stdout:3/42: read f0 [2202879,41848] 0 2026-03-10T12:37:27.903 INFO:tasks.workunit.client.1.vm07.stdout:2/31: fsync d0/f6 0 2026-03-10T12:37:27.906 INFO:tasks.workunit.client.1.vm07.stdout:3/43: dwrite f0 [0,4194304] 0 2026-03-10T12:37:27.916 INFO:tasks.workunit.client.1.vm07.stdout:4/40: getdents d0/d4/d5 0 2026-03-10T12:37:27.916 INFO:tasks.workunit.client.1.vm07.stdout:4/41: truncate d0/f7 3725185 0 2026-03-10T12:37:27.931 INFO:tasks.workunit.client.1.vm07.stdout:0/62: link d0/fd d0/f10 0 2026-03-10T12:37:27.932 INFO:tasks.workunit.client.1.vm07.stdout:4/42: mknod d0/d3/c11 0 2026-03-10T12:37:27.932 INFO:tasks.workunit.client.1.vm07.stdout:4/43: readlink d0/d4/d5/da/lf 0 2026-03-10T12:37:27.938 INFO:tasks.workunit.client.1.vm07.stdout:8/46: getdents d1 0 2026-03-10T12:37:27.940 INFO:tasks.workunit.client.1.vm07.stdout:4/44: unlink d0/d4/d5/da/lf 0 2026-03-10T12:37:27.942 INFO:tasks.workunit.client.1.vm07.stdout:8/47: dwrite d1/fc [0,4194304] 0 2026-03-10T12:37:27.946 INFO:tasks.workunit.client.1.vm07.stdout:8/48: dread d1/d3/f8 [0,4194304] 0 2026-03-10T12:37:27.950 INFO:tasks.workunit.client.1.vm07.stdout:6/45: rmdir d1/d4/d5 0 2026-03-10T12:37:27.962 INFO:tasks.workunit.client.1.vm07.stdout:0/63: read d0/f10 [1216555,64047] 0 2026-03-10T12:37:27.962 INFO:tasks.workunit.client.1.vm07.stdout:0/64: readlink d0/lc 0 2026-03-10T12:37:27.970 INFO:tasks.workunit.client.1.vm07.stdout:4/45: dwrite d0/f7 [0,4194304] 0 2026-03-10T12:37:27.986 INFO:tasks.workunit.client.1.vm07.stdout:8/49: mknod d1/d3/c13 0 2026-03-10T12:37:27.993 INFO:tasks.workunit.client.1.vm07.stdout:8/50: write d1/d3/f10 [1033102,65751] 0 2026-03-10T12:37:27.993 INFO:tasks.workunit.client.1.vm07.stdout:8/51: write d1/f2 [4890131,108226] 0 2026-03-10T12:37:27.993 INFO:tasks.workunit.client.1.vm07.stdout:8/52: dread d1/f2 [0,4194304] 0 
2026-03-10T12:37:28.002 INFO:tasks.workunit.client.1.vm07.stdout:2/32: rename d0/ca to d0/cc 0 2026-03-10T12:37:28.002 INFO:tasks.workunit.client.1.vm07.stdout:2/33: stat d0/f6 0 2026-03-10T12:37:28.007 INFO:tasks.workunit.client.1.vm07.stdout:6/46: sync 2026-03-10T12:37:28.013 INFO:tasks.workunit.client.1.vm07.stdout:8/53: symlink d1/d3/l14 0 2026-03-10T12:37:28.018 INFO:tasks.workunit.client.1.vm07.stdout:6/47: unlink d1/f3 0 2026-03-10T12:37:28.021 INFO:tasks.workunit.client.1.vm07.stdout:4/46: sync 2026-03-10T12:37:28.023 INFO:tasks.workunit.client.1.vm07.stdout:2/34: mknod d0/cd 0 2026-03-10T12:37:28.023 INFO:tasks.workunit.client.1.vm07.stdout:2/35: readlink d0/lb 0 2026-03-10T12:37:28.027 INFO:tasks.workunit.client.1.vm07.stdout:2/36: dread d0/f4 [0,4194304] 0 2026-03-10T12:37:28.028 INFO:tasks.workunit.client.1.vm07.stdout:2/37: write d0/f4 [1683106,95991] 0 2026-03-10T12:37:28.031 INFO:tasks.workunit.client.1.vm07.stdout:2/38: dwrite d0/f1 [0,4194304] 0 2026-03-10T12:37:28.035 INFO:tasks.workunit.client.1.vm07.stdout:6/48: creat d1/f8 x:0 0 0 2026-03-10T12:37:28.036 INFO:tasks.workunit.client.1.vm07.stdout:6/49: write d1/f8 [218515,26101] 0 2026-03-10T12:37:28.039 INFO:tasks.workunit.client.1.vm07.stdout:6/50: dwrite d1/f8 [0,4194304] 0 2026-03-10T12:37:28.044 INFO:tasks.workunit.client.1.vm07.stdout:8/54: getdents d1/d3/d6 0 2026-03-10T12:37:28.045 INFO:tasks.workunit.client.1.vm07.stdout:6/51: write d1/f2 [925998,57746] 0 2026-03-10T12:37:28.046 INFO:tasks.workunit.client.1.vm07.stdout:8/55: dread d1/f2 [0,4194304] 0 2026-03-10T12:37:28.055 INFO:tasks.workunit.client.1.vm07.stdout:6/52: unlink d1/f2 0 2026-03-10T12:37:28.056 INFO:tasks.workunit.client.1.vm07.stdout:8/56: read d1/f7 [1308426,104779] 0 2026-03-10T12:37:28.058 INFO:tasks.workunit.client.1.vm07.stdout:8/57: dread d1/d3/f8 [0,4194304] 0 2026-03-10T12:37:28.060 INFO:tasks.workunit.client.1.vm07.stdout:6/53: unlink d1/f8 0 2026-03-10T12:37:28.067 INFO:tasks.workunit.client.1.vm07.stdout:2/39: 
rename d0/l9 to d0/le 0 2026-03-10T12:37:28.068 INFO:tasks.workunit.client.1.vm07.stdout:2/40: write d0/f1 [1023922,65937] 0 2026-03-10T12:37:28.082 INFO:tasks.workunit.client.1.vm07.stdout:8/58: creat d1/d3/d11/f15 x:0 0 0 2026-03-10T12:37:28.091 INFO:tasks.workunit.client.1.vm07.stdout:6/54: mkdir d1/d9 0 2026-03-10T12:37:28.093 INFO:tasks.workunit.client.1.vm07.stdout:8/59: creat d1/d3/f16 x:0 0 0 2026-03-10T12:37:28.097 INFO:tasks.workunit.client.1.vm07.stdout:8/60: dwrite d1/fc [0,4194304] 0 2026-03-10T12:37:28.111 INFO:tasks.workunit.client.1.vm07.stdout:8/61: write d1/f7 [686720,91667] 0 2026-03-10T12:37:28.123 INFO:tasks.workunit.client.1.vm07.stdout:6/55: mknod d1/ca 0 2026-03-10T12:37:28.129 INFO:tasks.workunit.client.1.vm07.stdout:6/56: creat d1/d9/fb x:0 0 0 2026-03-10T12:37:28.133 INFO:tasks.workunit.client.1.vm07.stdout:6/57: dwrite d1/d9/fb [0,4194304] 0 2026-03-10T12:37:28.147 INFO:tasks.workunit.client.1.vm07.stdout:8/62: sync 2026-03-10T12:37:28.147 INFO:tasks.workunit.client.1.vm07.stdout:8/63: chown d1/d3/f16 0 1 2026-03-10T12:37:28.148 INFO:tasks.workunit.client.1.vm07.stdout:8/64: fsync d1/d3/f10 0 2026-03-10T12:37:28.148 INFO:tasks.workunit.client.1.vm07.stdout:8/65: write d1/fc [4319974,11593] 0 2026-03-10T12:37:28.159 INFO:tasks.workunit.client.1.vm07.stdout:9/33: truncate d5/f8 1130000 0 2026-03-10T12:37:28.165 INFO:tasks.workunit.client.1.vm07.stdout:9/34: chown d5/l9 1104694161 1 2026-03-10T12:37:28.166 INFO:tasks.workunit.client.1.vm07.stdout:9/35: creat d5/fa x:0 0 0 2026-03-10T12:37:28.167 INFO:tasks.workunit.client.1.vm07.stdout:9/36: creat d5/fb x:0 0 0 2026-03-10T12:37:28.168 INFO:tasks.workunit.client.1.vm07.stdout:9/37: chown d5/l7 787756044 1 2026-03-10T12:37:28.171 INFO:tasks.workunit.client.1.vm07.stdout:9/38: dwrite d5/fa [0,4194304] 0 2026-03-10T12:37:28.173 INFO:tasks.workunit.client.1.vm07.stdout:9/39: write d5/fb [355949,42136] 0 2026-03-10T12:37:28.178 INFO:tasks.workunit.client.1.vm07.stdout:9/40: dwrite d5/fa 
[0,4194304] 0 2026-03-10T12:37:28.206 INFO:tasks.workunit.client.1.vm07.stdout:1/68: write d9/fd [1384084,82828] 0 2026-03-10T12:37:28.211 INFO:tasks.workunit.client.1.vm07.stdout:7/31: getdents d0 0 2026-03-10T12:37:28.212 INFO:tasks.workunit.client.1.vm07.stdout:3/44: dwrite f1 [0,4194304] 0 2026-03-10T12:37:28.212 INFO:tasks.workunit.client.1.vm07.stdout:5/53: rename d0/f5 to d0/d14/f16 0 2026-03-10T12:37:28.228 INFO:tasks.workunit.client.1.vm07.stdout:5/54: rmdir d0/d14 39 2026-03-10T12:37:28.229 INFO:tasks.workunit.client.1.vm07.stdout:5/55: fsync d0/ff 0 2026-03-10T12:37:28.230 INFO:tasks.workunit.client.1.vm07.stdout:4/47: rename d0/d4/cc to d0/d4/d5/c12 0 2026-03-10T12:37:28.231 INFO:tasks.workunit.client.1.vm07.stdout:4/48: fdatasync d0/f7 0 2026-03-10T12:37:28.235 INFO:tasks.workunit.client.1.vm07.stdout:3/45: dwrite f0 [0,4194304] 0 2026-03-10T12:37:28.245 INFO:tasks.workunit.client.1.vm07.stdout:3/46: dread f0 [0,4194304] 0 2026-03-10T12:37:28.246 INFO:tasks.workunit.client.1.vm07.stdout:8/66: rename d1/d3/l12 to d1/d3/d6/l17 0 2026-03-10T12:37:28.246 INFO:tasks.workunit.client.1.vm07.stdout:1/69: creat d9/f19 x:0 0 0 2026-03-10T12:37:28.246 INFO:tasks.workunit.client.1.vm07.stdout:4/49: mknod d0/d4/d5/c13 0 2026-03-10T12:37:28.246 INFO:tasks.workunit.client.1.vm07.stdout:4/50: write d0/f7 [1755741,38959] 0 2026-03-10T12:37:28.256 INFO:tasks.workunit.client.1.vm07.stdout:3/47: unlink l5 0 2026-03-10T12:37:28.258 INFO:tasks.workunit.client.1.vm07.stdout:0/65: dwrite d0/f10 [0,4194304] 0 2026-03-10T12:37:28.260 INFO:tasks.workunit.client.1.vm07.stdout:8/67: mkdir d1/d3/d18 0 2026-03-10T12:37:28.260 INFO:tasks.workunit.client.1.vm07.stdout:0/66: write d0/f10 [2159383,12718] 0 2026-03-10T12:37:28.263 INFO:tasks.workunit.client.1.vm07.stdout:5/56: creat d0/f17 x:0 0 0 2026-03-10T12:37:28.263 INFO:tasks.workunit.client.1.vm07.stdout:2/41: chown d0/cc 214 1 2026-03-10T12:37:28.264 INFO:tasks.workunit.client.1.vm07.stdout:5/57: write d0/fa [835194,74477] 0 
2026-03-10T12:37:28.268 INFO:tasks.workunit.client.1.vm07.stdout:6/58: truncate d1/d9/fb 1593073 0 2026-03-10T12:37:28.272 INFO:tasks.workunit.client.1.vm07.stdout:3/48: creat dc/f10 x:0 0 0 2026-03-10T12:37:28.273 INFO:tasks.workunit.client.1.vm07.stdout:3/49: dread f8 [0,4194304] 0 2026-03-10T12:37:28.277 INFO:tasks.workunit.client.1.vm07.stdout:8/68: creat d1/f19 x:0 0 0 2026-03-10T12:37:28.277 INFO:tasks.workunit.client.1.vm07.stdout:0/67: creat d0/f11 x:0 0 0 2026-03-10T12:37:28.278 INFO:tasks.workunit.client.1.vm07.stdout:0/68: chown d0/lc 132333533 1 2026-03-10T12:37:28.279 INFO:tasks.workunit.client.1.vm07.stdout:9/41: dwrite d5/f8 [0,4194304] 0 2026-03-10T12:37:28.289 INFO:tasks.workunit.client.1.vm07.stdout:1/70: creat d9/f1a x:0 0 0 2026-03-10T12:37:28.289 INFO:tasks.workunit.client.1.vm07.stdout:2/42: symlink d0/lf 0 2026-03-10T12:37:28.289 INFO:tasks.workunit.client.1.vm07.stdout:3/50: write f8 [4917747,52830] 0 2026-03-10T12:37:28.289 INFO:tasks.workunit.client.1.vm07.stdout:3/51: chown f8 39 1 2026-03-10T12:37:28.290 INFO:tasks.workunit.client.1.vm07.stdout:1/71: write d9/df/f10 [1900340,69370] 0 2026-03-10T12:37:28.291 INFO:tasks.workunit.client.1.vm07.stdout:1/72: read d9/f12 [540718,53417] 0 2026-03-10T12:37:28.292 INFO:tasks.workunit.client.1.vm07.stdout:1/73: chown c4 25681 1 2026-03-10T12:37:28.294 INFO:tasks.workunit.client.1.vm07.stdout:1/74: truncate d9/f19 438738 0 2026-03-10T12:37:28.295 INFO:tasks.workunit.client.1.vm07.stdout:0/69: mkdir d0/d12 0 2026-03-10T12:37:28.296 INFO:tasks.workunit.client.1.vm07.stdout:0/70: chown d0/f10 15772 1 2026-03-10T12:37:28.297 INFO:tasks.workunit.client.1.vm07.stdout:0/71: write d0/f11 [990063,33555] 0 2026-03-10T12:37:28.299 INFO:tasks.workunit.client.1.vm07.stdout:5/58: mkdir d0/d14/d18 0 2026-03-10T12:37:28.307 INFO:tasks.workunit.client.1.vm07.stdout:5/59: dwrite d0/ff [4194304,4194304] 0 2026-03-10T12:37:28.308 INFO:tasks.workunit.client.1.vm07.stdout:4/51: getdents d0/d4/d5/da 0 
2026-03-10T12:37:28.309 INFO:tasks.workunit.client.1.vm07.stdout:5/60: write d0/f12 [59234,15084] 0 2026-03-10T12:37:28.309 INFO:tasks.workunit.client.1.vm07.stdout:5/61: fsync d0/f9 0 2026-03-10T12:37:28.310 INFO:tasks.workunit.client.1.vm07.stdout:5/62: stat d0/fd 0 2026-03-10T12:37:28.314 INFO:tasks.workunit.client.1.vm07.stdout:3/52: symlink dc/dd/l11 0 2026-03-10T12:37:28.325 INFO:tasks.workunit.client.1.vm07.stdout:0/72: symlink d0/d12/l13 0 2026-03-10T12:37:28.326 INFO:tasks.workunit.client.1.vm07.stdout:3/53: truncate f6 57175 0 2026-03-10T12:37:28.326 INFO:tasks.workunit.client.1.vm07.stdout:2/43: symlink d0/l10 0 2026-03-10T12:37:28.326 INFO:tasks.workunit.client.1.vm07.stdout:2/44: chown d0/f4 64207255 1 2026-03-10T12:37:28.326 INFO:tasks.workunit.client.1.vm07.stdout:4/52: dwrite d0/f7 [0,4194304] 0 2026-03-10T12:37:28.326 INFO:tasks.workunit.client.1.vm07.stdout:9/42: dread d5/fb [0,4194304] 0 2026-03-10T12:37:28.326 INFO:tasks.workunit.client.1.vm07.stdout:4/53: write d0/f7 [2535645,44627] 0 2026-03-10T12:37:28.328 INFO:tasks.workunit.client.1.vm07.stdout:0/73: dwrite d0/f11 [0,4194304] 0 2026-03-10T12:37:28.329 INFO:tasks.workunit.client.1.vm07.stdout:1/75: sync 2026-03-10T12:37:28.333 INFO:tasks.workunit.client.1.vm07.stdout:9/43: mknod d5/cc 0 2026-03-10T12:37:28.335 INFO:tasks.workunit.client.1.vm07.stdout:4/54: dread d0/f7 [0,4194304] 0 2026-03-10T12:37:28.346 INFO:tasks.workunit.client.1.vm07.stdout:7/32: write d0/f8 [668626,88203] 0 2026-03-10T12:37:28.348 INFO:tasks.workunit.client.1.vm07.stdout:1/76: creat d9/f1b x:0 0 0 2026-03-10T12:37:28.348 INFO:tasks.workunit.client.1.vm07.stdout:1/77: fdatasync d9/df/f15 0 2026-03-10T12:37:28.350 INFO:tasks.workunit.client.1.vm07.stdout:1/78: rmdir d9/df 39 2026-03-10T12:37:28.353 INFO:tasks.workunit.client.1.vm07.stdout:1/79: read - d9/df/f18 zero size 2026-03-10T12:37:28.353 INFO:tasks.workunit.client.1.vm07.stdout:1/80: chown d9/f1b 234 1 2026-03-10T12:37:28.354 
INFO:tasks.workunit.client.1.vm07.stdout:5/63: sync 2026-03-10T12:37:28.367 INFO:tasks.workunit.client.1.vm07.stdout:5/64: dwrite d0/f9 [0,4194304] 0 2026-03-10T12:37:28.371 INFO:tasks.workunit.client.1.vm07.stdout:1/81: mknod d9/df/c1c 0 2026-03-10T12:37:28.372 INFO:tasks.workunit.client.1.vm07.stdout:5/65: mkdir d0/d14/d18/d19 0 2026-03-10T12:37:28.372 INFO:tasks.workunit.client.1.vm07.stdout:5/66: chown d0/f17 15082248 1 2026-03-10T12:37:28.378 INFO:tasks.workunit.client.1.vm07.stdout:7/33: dread d0/f9 [0,4194304] 0 2026-03-10T12:37:28.383 INFO:tasks.workunit.client.1.vm07.stdout:7/34: dwrite d0/f3 [0,4194304] 0 2026-03-10T12:37:28.386 INFO:tasks.workunit.client.1.vm07.stdout:7/35: fsync d0/f4 0 2026-03-10T12:37:28.390 INFO:tasks.workunit.client.1.vm07.stdout:7/36: dread d0/f8 [0,4194304] 0 2026-03-10T12:37:28.396 INFO:tasks.workunit.client.1.vm07.stdout:7/37: creat d0/fa x:0 0 0 2026-03-10T12:37:28.399 INFO:tasks.workunit.client.1.vm07.stdout:7/38: dread d0/f8 [0,4194304] 0 2026-03-10T12:37:28.399 INFO:tasks.workunit.client.1.vm07.stdout:7/39: stat d0/f8 0 2026-03-10T12:37:28.401 INFO:tasks.workunit.client.1.vm07.stdout:7/40: read d0/f9 [208744,66269] 0 2026-03-10T12:37:28.406 INFO:tasks.workunit.client.1.vm07.stdout:6/59: dwrite d1/d9/fb [0,4194304] 0 2026-03-10T12:37:28.421 INFO:tasks.workunit.client.1.vm07.stdout:8/69: symlink d1/d3/d11/l1a 0 2026-03-10T12:37:28.422 INFO:tasks.workunit.client.1.vm07.stdout:6/60: rename d1/ca to d1/cc 0 2026-03-10T12:37:28.424 INFO:tasks.workunit.client.1.vm07.stdout:8/70: rename d1/d3/f10 to d1/d3/d18/f1b 0 2026-03-10T12:37:28.425 INFO:tasks.workunit.client.1.vm07.stdout:8/71: read d1/d3/ff [3251108,65802] 0 2026-03-10T12:37:28.425 INFO:tasks.workunit.client.1.vm07.stdout:6/61: mknod d1/d4/d6/cd 0 2026-03-10T12:37:28.426 INFO:tasks.workunit.client.1.vm07.stdout:8/72: write d1/d3/d6/f9 [255645,27200] 0 2026-03-10T12:37:28.428 INFO:tasks.workunit.client.1.vm07.stdout:3/54: read f0 [4584775,56848] 0 2026-03-10T12:37:28.429 
INFO:tasks.workunit.client.1.vm07.stdout:3/55: rename dc to dc/dd/d12 22 2026-03-10T12:37:28.431 INFO:tasks.workunit.client.1.vm07.stdout:8/73: creat d1/d3/f1c x:0 0 0 2026-03-10T12:37:28.433 INFO:tasks.workunit.client.1.vm07.stdout:3/56: symlink dc/l13 0 2026-03-10T12:37:28.437 INFO:tasks.workunit.client.1.vm07.stdout:6/62: dwrite d1/d9/fb [4194304,4194304] 0 2026-03-10T12:37:28.451 INFO:tasks.workunit.client.1.vm07.stdout:8/74: write d1/d3/d18/f1b [2002404,44290] 0 2026-03-10T12:37:28.451 INFO:tasks.workunit.client.1.vm07.stdout:3/57: dread f1 [0,4194304] 0 2026-03-10T12:37:28.451 INFO:tasks.workunit.client.1.vm07.stdout:3/58: stat dc 0 2026-03-10T12:37:28.451 INFO:tasks.workunit.client.1.vm07.stdout:3/59: fsync f1 0 2026-03-10T12:37:28.451 INFO:tasks.workunit.client.1.vm07.stdout:3/60: symlink dc/dd/l14 0 2026-03-10T12:37:28.451 INFO:tasks.workunit.client.1.vm07.stdout:3/61: link c9 dc/c15 0 2026-03-10T12:37:28.452 INFO:tasks.workunit.client.1.vm07.stdout:4/55: fdatasync d0/f7 0 2026-03-10T12:37:28.455 INFO:tasks.workunit.client.1.vm07.stdout:8/75: dwrite d1/fb [0,4194304] 0 2026-03-10T12:37:28.456 INFO:tasks.workunit.client.1.vm07.stdout:3/62: dwrite f0 [0,4194304] 0 2026-03-10T12:37:28.458 INFO:tasks.workunit.client.1.vm07.stdout:8/76: fsync d1/d3/d11/f15 0 2026-03-10T12:37:28.461 INFO:tasks.workunit.client.1.vm07.stdout:5/67: fsync d0/f9 0 2026-03-10T12:37:28.469 INFO:tasks.workunit.client.1.vm07.stdout:4/56: dwrite d0/f7 [4194304,4194304] 0 2026-03-10T12:37:28.474 INFO:tasks.workunit.client.1.vm07.stdout:4/57: write d0/f7 [3458340,69249] 0 2026-03-10T12:37:28.474 INFO:tasks.workunit.client.1.vm07.stdout:8/77: rename d1/d3/f1c to d1/d3/f1d 0 2026-03-10T12:37:28.474 INFO:tasks.workunit.client.1.vm07.stdout:5/68: symlink d0/d14/d18/l1a 0 2026-03-10T12:37:28.474 INFO:tasks.workunit.client.1.vm07.stdout:3/63: creat dc/dd/f16 x:0 0 0 2026-03-10T12:37:28.476 INFO:tasks.workunit.client.1.vm07.stdout:5/69: chown d0/cb 57418 1 2026-03-10T12:37:28.477 
INFO:tasks.workunit.client.1.vm07.stdout:3/64: write f8 [962300,115872] 0 2026-03-10T12:37:28.477 INFO:tasks.workunit.client.1.vm07.stdout:5/70: chown d0/d14/d18 61831 1 2026-03-10T12:37:28.477 INFO:tasks.workunit.client.1.vm07.stdout:3/65: fsync f1 0 2026-03-10T12:37:28.483 INFO:tasks.workunit.client.1.vm07.stdout:8/78: dwrite d1/d3/f16 [0,4194304] 0 2026-03-10T12:37:28.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:28 vm00.local ceph-mon[50686]: pgmap v149: 65 pgs: 65 active+clean; 194 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 4.5 MiB/s wr, 461 op/s 2026-03-10T12:37:28.488 INFO:tasks.workunit.client.1.vm07.stdout:1/82: truncate d9/f12 1874203 0 2026-03-10T12:37:28.488 INFO:tasks.workunit.client.1.vm07.stdout:0/74: rename d0/d12 to d0/d14 0 2026-03-10T12:37:28.489 INFO:tasks.workunit.client.1.vm07.stdout:2/45: getdents d0 0 2026-03-10T12:37:28.489 INFO:tasks.workunit.client.1.vm07.stdout:5/71: dread d0/fd [0,4194304] 0 2026-03-10T12:37:28.490 INFO:tasks.workunit.client.1.vm07.stdout:9/44: truncate d5/f8 3994070 0 2026-03-10T12:37:28.491 INFO:tasks.workunit.client.1.vm07.stdout:5/72: write d0/f12 [515269,108190] 0 2026-03-10T12:37:28.499 INFO:tasks.workunit.client.1.vm07.stdout:3/66: creat dc/f17 x:0 0 0 2026-03-10T12:37:28.500 INFO:tasks.workunit.client.1.vm07.stdout:1/83: unlink d9/df/c1c 0 2026-03-10T12:37:28.507 INFO:tasks.workunit.client.1.vm07.stdout:5/73: creat d0/d14/f1b x:0 0 0 2026-03-10T12:37:28.512 INFO:tasks.workunit.client.1.vm07.stdout:5/74: fsync d0/fc 0 2026-03-10T12:37:28.512 INFO:tasks.workunit.client.1.vm07.stdout:1/84: dwrite d9/df/f13 [0,4194304] 0 2026-03-10T12:37:28.513 INFO:tasks.workunit.client.1.vm07.stdout:3/67: dread f0 [0,4194304] 0 2026-03-10T12:37:28.519 INFO:tasks.workunit.client.1.vm07.stdout:4/58: sync 2026-03-10T12:37:28.519 INFO:tasks.workunit.client.1.vm07.stdout:8/79: mknod d1/c1e 0 2026-03-10T12:37:28.520 INFO:tasks.workunit.client.1.vm07.stdout:4/59: read d0/f7 [1885473,35643] 0 2026-03-10T12:37:28.524 
INFO:tasks.workunit.client.1.vm07.stdout:5/75: write d0/fd [3637597,37165] 0 2026-03-10T12:37:28.524 INFO:tasks.workunit.client.1.vm07.stdout:9/45: dwrite d5/fb [0,4194304] 0 2026-03-10T12:37:28.528 INFO:tasks.workunit.client.1.vm07.stdout:1/85: mknod d9/df/c1d 0 2026-03-10T12:37:28.539 INFO:tasks.workunit.client.1.vm07.stdout:4/60: sync 2026-03-10T12:37:28.542 INFO:tasks.workunit.client.1.vm07.stdout:3/68: unlink f0 0 2026-03-10T12:37:28.546 INFO:tasks.workunit.client.1.vm07.stdout:1/86: creat d9/df/f1e x:0 0 0 2026-03-10T12:37:28.546 INFO:tasks.workunit.client.1.vm07.stdout:8/80: creat d1/d3/f1f x:0 0 0 2026-03-10T12:37:28.546 INFO:tasks.workunit.client.1.vm07.stdout:0/75: getdents d0/d14 0 2026-03-10T12:37:28.549 INFO:tasks.workunit.client.1.vm07.stdout:3/69: mkdir dc/d18 0 2026-03-10T12:37:28.549 INFO:tasks.workunit.client.1.vm07.stdout:3/70: write dc/f10 [51126,1458] 0 2026-03-10T12:37:28.551 INFO:tasks.workunit.client.1.vm07.stdout:5/76: creat d0/d14/d18/d19/f1c x:0 0 0 2026-03-10T12:37:28.552 INFO:tasks.workunit.client.1.vm07.stdout:5/77: write d0/d14/d18/d19/f1c [13339,118551] 0 2026-03-10T12:37:28.556 INFO:tasks.workunit.client.1.vm07.stdout:3/71: unlink cb 0 2026-03-10T12:37:28.556 INFO:tasks.workunit.client.1.vm07.stdout:0/76: creat d0/f15 x:0 0 0 2026-03-10T12:37:28.560 INFO:tasks.workunit.client.1.vm07.stdout:8/81: dwrite d1/d3/ff [0,4194304] 0 2026-03-10T12:37:28.564 INFO:tasks.workunit.client.1.vm07.stdout:5/78: mknod d0/d14/d18/c1d 0 2026-03-10T12:37:28.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:28 vm07.local ceph-mon[58582]: pgmap v149: 65 pgs: 65 active+clean; 194 MiB data, 1.6 GiB used, 118 GiB / 120 GiB avail; 4.5 MiB/s wr, 461 op/s 2026-03-10T12:37:28.569 INFO:tasks.workunit.client.1.vm07.stdout:0/77: creat d0/f16 x:0 0 0 2026-03-10T12:37:28.572 INFO:tasks.workunit.client.1.vm07.stdout:8/82: dwrite d1/d3/f1f [0,4194304] 0 2026-03-10T12:37:28.576 INFO:tasks.workunit.client.1.vm07.stdout:5/79: unlink d0/f11 0 
2026-03-10T12:37:28.577 INFO:tasks.workunit.client.1.vm07.stdout:5/80: fdatasync d0/fa 0 2026-03-10T12:37:28.578 INFO:tasks.workunit.client.1.vm07.stdout:0/78: symlink d0/d14/l17 0 2026-03-10T12:37:28.578 INFO:tasks.workunit.client.1.vm07.stdout:5/81: dread - d0/f13 zero size 2026-03-10T12:37:28.580 INFO:tasks.workunit.client.1.vm07.stdout:5/82: chown d0/f13 1071395564 1 2026-03-10T12:37:28.580 INFO:tasks.workunit.client.1.vm07.stdout:0/79: write d0/f11 [729126,59734] 0 2026-03-10T12:37:28.580 INFO:tasks.workunit.client.1.vm07.stdout:5/83: readlink d0/l3 0 2026-03-10T12:37:28.580 INFO:tasks.workunit.client.1.vm07.stdout:5/84: stat d0/fa 0 2026-03-10T12:37:28.587 INFO:tasks.workunit.client.1.vm07.stdout:0/80: creat d0/d14/f18 x:0 0 0 2026-03-10T12:37:28.587 INFO:tasks.workunit.client.1.vm07.stdout:8/83: dwrite d1/f19 [0,4194304] 0 2026-03-10T12:37:28.588 INFO:tasks.workunit.client.1.vm07.stdout:0/81: dread - d0/f16 zero size 2026-03-10T12:37:28.590 INFO:tasks.workunit.client.1.vm07.stdout:8/84: read d1/f19 [1178391,82231] 0 2026-03-10T12:37:28.607 INFO:tasks.workunit.client.1.vm07.stdout:0/82: creat d0/d14/f19 x:0 0 0 2026-03-10T12:37:28.607 INFO:tasks.workunit.client.1.vm07.stdout:0/83: stat d0 0 2026-03-10T12:37:28.607 INFO:tasks.workunit.client.1.vm07.stdout:0/84: truncate d0/d14/f19 707016 0 2026-03-10T12:37:28.612 INFO:tasks.workunit.client.1.vm07.stdout:5/85: rename d0/d14/d18/d19/f1c to d0/f1e 0 2026-03-10T12:37:28.613 INFO:tasks.workunit.client.1.vm07.stdout:5/86: readlink d0/d14/d18/l1a 0 2026-03-10T12:37:28.615 INFO:tasks.workunit.client.1.vm07.stdout:5/87: chown d0/ce 74 1 2026-03-10T12:37:28.617 INFO:tasks.workunit.client.1.vm07.stdout:0/85: dwrite d0/f16 [0,4194304] 0 2026-03-10T12:37:28.617 INFO:tasks.workunit.client.1.vm07.stdout:5/88: read - d0/f13 zero size 2026-03-10T12:37:28.620 INFO:tasks.workunit.client.1.vm07.stdout:0/86: mkdir d0/d14/d1a 0 2026-03-10T12:37:28.622 INFO:tasks.workunit.client.1.vm07.stdout:0/87: mkdir d0/d14/d1a/d1b 0 
2026-03-10T12:37:28.626 INFO:tasks.workunit.client.1.vm07.stdout:0/88: dread d0/fd [0,4194304] 0 2026-03-10T12:37:28.630 INFO:tasks.workunit.client.1.vm07.stdout:0/89: dread d0/f11 [0,4194304] 0 2026-03-10T12:37:28.632 INFO:tasks.workunit.client.1.vm07.stdout:8/85: sync 2026-03-10T12:37:28.639 INFO:tasks.workunit.client.1.vm07.stdout:0/90: rmdir d0/d14/d1a 39 2026-03-10T12:37:28.639 INFO:tasks.workunit.client.1.vm07.stdout:0/91: chown d0/f11 36434 1 2026-03-10T12:37:28.644 INFO:tasks.workunit.client.1.vm07.stdout:0/92: creat d0/f1c x:0 0 0 2026-03-10T12:37:28.645 INFO:tasks.workunit.client.1.vm07.stdout:8/86: dread d1/f7 [0,4194304] 0 2026-03-10T12:37:28.645 INFO:tasks.workunit.client.1.vm07.stdout:0/93: write d0/f11 [825686,868] 0 2026-03-10T12:37:28.645 INFO:tasks.workunit.client.1.vm07.stdout:8/87: rename d1 to d1/d3/d11/d20 22 2026-03-10T12:37:28.646 INFO:tasks.workunit.client.1.vm07.stdout:0/94: truncate d0/d14/f18 342213 0 2026-03-10T12:37:28.647 INFO:tasks.workunit.client.1.vm07.stdout:8/88: creat d1/d3/f21 x:0 0 0 2026-03-10T12:37:28.648 INFO:tasks.workunit.client.1.vm07.stdout:0/95: creat d0/f1d x:0 0 0 2026-03-10T12:37:28.648 INFO:tasks.workunit.client.1.vm07.stdout:0/96: stat d0/d14 0 2026-03-10T12:37:28.649 INFO:tasks.workunit.client.1.vm07.stdout:0/97: read - d0/f1c zero size 2026-03-10T12:37:28.701 INFO:tasks.workunit.client.1.vm07.stdout:0/98: sync 2026-03-10T12:37:28.702 INFO:tasks.workunit.client.1.vm07.stdout:0/99: truncate d0/fd 4726131 0 2026-03-10T12:37:28.703 INFO:tasks.workunit.client.1.vm07.stdout:0/100: chown d0/l8 16136 1 2026-03-10T12:37:28.704 INFO:tasks.workunit.client.1.vm07.stdout:7/41: getdents d0 0 2026-03-10T12:37:28.704 INFO:tasks.workunit.client.1.vm07.stdout:0/101: truncate d0/f1d 711433 0 2026-03-10T12:37:28.704 INFO:tasks.workunit.client.1.vm07.stdout:7/42: truncate d0/f9 1092040 0 2026-03-10T12:37:28.715 INFO:tasks.workunit.client.1.vm07.stdout:7/43: dread d0/f8 [0,4194304] 0 2026-03-10T12:37:28.715 
INFO:tasks.workunit.client.1.vm07.stdout:0/102: dwrite d0/fd [0,4194304] 0 2026-03-10T12:37:28.716 INFO:tasks.workunit.client.1.vm07.stdout:0/103: dread - d0/f1c zero size 2026-03-10T12:37:28.718 INFO:tasks.workunit.client.1.vm07.stdout:7/44: mknod d0/cb 0 2026-03-10T12:37:28.727 INFO:tasks.workunit.client.1.vm07.stdout:7/45: dwrite d0/fa [0,4194304] 0 2026-03-10T12:37:28.739 INFO:tasks.workunit.client.1.vm07.stdout:7/46: dread d0/f4 [4194304,4194304] 0 2026-03-10T12:37:28.745 INFO:tasks.workunit.client.1.vm07.stdout:6/63: truncate d1/d9/fb 2768902 0 2026-03-10T12:37:28.759 INFO:tasks.workunit.client.1.vm07.stdout:1/87: fsync d9/f12 0 2026-03-10T12:37:28.761 INFO:tasks.workunit.client.1.vm07.stdout:1/88: write d9/df/f15 [865521,52663] 0 2026-03-10T12:37:28.761 INFO:tasks.workunit.client.1.vm07.stdout:1/89: write d9/fb [3473191,108191] 0 2026-03-10T12:37:28.762 INFO:tasks.workunit.client.1.vm07.stdout:1/90: read - d9/f1b zero size 2026-03-10T12:37:28.812 INFO:tasks.workunit.client.1.vm07.stdout:2/46: truncate d0/f4 1267713 0 2026-03-10T12:37:28.826 INFO:tasks.workunit.client.1.vm07.stdout:3/72: dread f2 [0,4194304] 0 2026-03-10T12:37:28.828 INFO:tasks.workunit.client.1.vm07.stdout:4/61: truncate d0/f7 1114590 0 2026-03-10T12:37:28.848 INFO:tasks.workunit.client.1.vm07.stdout:9/46: truncate d5/fa 2333799 0 2026-03-10T12:37:28.848 INFO:tasks.workunit.client.1.vm07.stdout:4/62: sync 2026-03-10T12:37:28.894 INFO:tasks.workunit.client.1.vm07.stdout:5/89: rmdir d0/d14 39 2026-03-10T12:37:28.908 INFO:tasks.workunit.client.1.vm07.stdout:8/89: truncate d1/fc 3651023 0 2026-03-10T12:37:28.909 INFO:tasks.workunit.client.1.vm07.stdout:8/90: chown d1/d3/d11 296 1 2026-03-10T12:37:28.911 INFO:tasks.workunit.client.1.vm07.stdout:7/47: truncate d0/f9 216554 0 2026-03-10T12:37:28.917 INFO:tasks.workunit.client.1.vm07.stdout:1/91: truncate d9/fd 548246 0 2026-03-10T12:37:28.964 INFO:tasks.workunit.client.1.vm07.stdout:2/47: rename d0/l10 to d0/l11 0 2026-03-10T12:37:28.971 
INFO:tasks.workunit.client.1.vm07.stdout:3/73: read f8 [1125322,24421] 0 2026-03-10T12:37:28.975 INFO:tasks.workunit.client.1.vm07.stdout:9/47: unlink d5/l9 0 2026-03-10T12:37:28.979 INFO:tasks.workunit.client.1.vm07.stdout:4/63: fdatasync d0/f7 0 2026-03-10T12:37:28.979 INFO:tasks.workunit.client.1.vm07.stdout:4/64: readlink d0/d3/ld 0 2026-03-10T12:37:28.981 INFO:tasks.workunit.client.1.vm07.stdout:5/90: fsync d0/f1e 0 2026-03-10T12:37:28.982 INFO:tasks.workunit.client.1.vm07.stdout:5/91: readlink d0/l3 0 2026-03-10T12:37:28.985 INFO:tasks.workunit.client.1.vm07.stdout:5/92: dwrite d0/f1e [0,4194304] 0 2026-03-10T12:37:28.993 INFO:tasks.workunit.client.1.vm07.stdout:8/91: dwrite d1/fb [0,4194304] 0 2026-03-10T12:37:28.995 INFO:tasks.workunit.client.1.vm07.stdout:7/48: unlink d0/cb 0 2026-03-10T12:37:28.999 INFO:tasks.workunit.client.1.vm07.stdout:7/49: dread d0/f4 [4194304,4194304] 0 2026-03-10T12:37:29.000 INFO:tasks.workunit.client.1.vm07.stdout:7/50: truncate d0/fa 4240741 0 2026-03-10T12:37:29.000 INFO:tasks.workunit.client.1.vm07.stdout:7/51: chown d0 1 1 2026-03-10T12:37:29.001 INFO:tasks.workunit.client.1.vm07.stdout:7/52: write d0/fa [4417978,57997] 0 2026-03-10T12:37:29.009 INFO:tasks.workunit.client.1.vm07.stdout:6/64: truncate d1/d9/fb 408672 0 2026-03-10T12:37:29.031 INFO:tasks.workunit.client.1.vm07.stdout:0/104: symlink d0/d14/d1a/d1b/l1e 0 2026-03-10T12:37:29.031 INFO:tasks.workunit.client.1.vm07.stdout:0/105: write d0/f16 [5055898,45881] 0 2026-03-10T12:37:29.035 INFO:tasks.workunit.client.1.vm07.stdout:0/106: dwrite d0/f1c [0,4194304] 0 2026-03-10T12:37:29.046 INFO:tasks.workunit.client.1.vm07.stdout:7/53: creat d0/fc x:0 0 0 2026-03-10T12:37:29.049 INFO:tasks.workunit.client.1.vm07.stdout:7/54: dwrite d0/fc [0,4194304] 0 2026-03-10T12:37:29.054 INFO:tasks.workunit.client.1.vm07.stdout:7/55: dwrite d0/fa [0,4194304] 0 2026-03-10T12:37:29.064 INFO:tasks.workunit.client.1.vm07.stdout:6/65: symlink d1/d4/le 0 2026-03-10T12:37:29.092 
INFO:tasks.workunit.client.1.vm07.stdout:0/107: symlink d0/l1f 0 2026-03-10T12:37:29.093 INFO:tasks.workunit.client.1.vm07.stdout:0/108: truncate d0/f15 512 0 2026-03-10T12:37:29.104 INFO:tasks.workunit.client.1.vm07.stdout:8/92: rename d1/le to d1/d3/d18/l22 0 2026-03-10T12:37:29.107 INFO:tasks.workunit.client.1.vm07.stdout:7/56: symlink d0/ld 0 2026-03-10T12:37:29.107 INFO:tasks.workunit.client.1.vm07.stdout:7/57: chown d0/l5 14964850 1 2026-03-10T12:37:29.110 INFO:tasks.workunit.client.1.vm07.stdout:7/58: dwrite d0/fc [0,4194304] 0 2026-03-10T12:37:29.125 INFO:tasks.workunit.client.1.vm07.stdout:6/66: symlink d1/d4/d6/lf 0 2026-03-10T12:37:29.132 INFO:tasks.workunit.client.1.vm07.stdout:1/92: creat d9/f1f x:0 0 0 2026-03-10T12:37:29.133 INFO:tasks.workunit.client.1.vm07.stdout:9/48: link l3 d5/ld 0 2026-03-10T12:37:29.134 INFO:tasks.workunit.client.1.vm07.stdout:4/65: link d0/d4/d5/da/ce d0/d4/d5/c14 0 2026-03-10T12:37:29.135 INFO:tasks.workunit.client.1.vm07.stdout:4/66: truncate d0/f7 1314735 0 2026-03-10T12:37:29.137 INFO:tasks.workunit.client.1.vm07.stdout:5/93: creat d0/f1f x:0 0 0 2026-03-10T12:37:29.137 INFO:tasks.workunit.client.1.vm07.stdout:5/94: write d0/f1f [138471,114503] 0 2026-03-10T12:37:29.140 INFO:tasks.workunit.client.1.vm07.stdout:0/109: rmdir d0/d14/d1a/d1b 39 2026-03-10T12:37:29.143 INFO:tasks.workunit.client.1.vm07.stdout:8/93: mknod d1/d3/c23 0 2026-03-10T12:37:29.146 INFO:tasks.workunit.client.1.vm07.stdout:7/59: creat d0/fe x:0 0 0 2026-03-10T12:37:29.147 INFO:tasks.workunit.client.1.vm07.stdout:7/60: truncate d0/fe 535060 0 2026-03-10T12:37:29.147 INFO:tasks.workunit.client.1.vm07.stdout:7/61: chown d0/l5 426400 1 2026-03-10T12:37:29.148 INFO:tasks.workunit.client.1.vm07.stdout:7/62: read d0/fc [1865408,120367] 0 2026-03-10T12:37:29.150 INFO:tasks.workunit.client.1.vm07.stdout:6/67: creat d1/d9/f10 x:0 0 0 2026-03-10T12:37:29.155 INFO:tasks.workunit.client.1.vm07.stdout:1/93: dwrite f6 [0,4194304] 0 2026-03-10T12:37:29.160 
INFO:tasks.workunit.client.1.vm07.stdout:1/94: dwrite d9/df/f11 [0,4194304] 0 2026-03-10T12:37:29.160 INFO:tasks.workunit.client.1.vm07.stdout:1/95: read f6 [3203532,124789] 0 2026-03-10T12:37:29.172 INFO:tasks.workunit.client.1.vm07.stdout:4/67: creat d0/d4/d5/da/f15 x:0 0 0 2026-03-10T12:37:29.188 INFO:tasks.workunit.client.1.vm07.stdout:8/94: creat d1/d3/d6/f24 x:0 0 0 2026-03-10T12:37:29.221 INFO:tasks.workunit.client.1.vm07.stdout:8/95: stat d1/d3/f1d 0 2026-03-10T12:37:29.221 INFO:tasks.workunit.client.1.vm07.stdout:7/63: mknod d0/cf 0 2026-03-10T12:37:29.221 INFO:tasks.workunit.client.1.vm07.stdout:7/64: write d0/f4 [5055829,39420] 0 2026-03-10T12:37:29.221 INFO:tasks.workunit.client.1.vm07.stdout:1/96: mknod d9/df/c20 0 2026-03-10T12:37:29.221 INFO:tasks.workunit.client.1.vm07.stdout:5/95: rmdir d0/d14 39 2026-03-10T12:37:29.240 INFO:tasks.workunit.client.1.vm07.stdout:5/96: rename d0/f17 to d0/d14/d18/f20 0 2026-03-10T12:37:29.262 INFO:tasks.workunit.client.1.vm07.stdout:5/97: write d0/f13 [420383,5629] 0 2026-03-10T12:37:29.262 INFO:tasks.workunit.client.1.vm07.stdout:5/98: dwrite d0/f1f [0,4194304] 0 2026-03-10T12:37:29.262 INFO:tasks.workunit.client.1.vm07.stdout:0/110: link d0/c9 d0/d14/d1a/c20 0 2026-03-10T12:37:29.262 INFO:tasks.workunit.client.1.vm07.stdout:0/111: fsync d0/d14/f18 0 2026-03-10T12:37:29.262 INFO:tasks.workunit.client.1.vm07.stdout:0/112: chown d0/l1f 0 1 2026-03-10T12:37:29.262 INFO:tasks.workunit.client.1.vm07.stdout:0/113: chown d0/cb 2 1 2026-03-10T12:37:29.262 INFO:tasks.workunit.client.1.vm07.stdout:5/99: dwrite d0/fc [4194304,4194304] 0 2026-03-10T12:37:29.263 INFO:tasks.workunit.client.1.vm07.stdout:0/114: unlink d0/d14/l13 0 2026-03-10T12:37:29.264 INFO:tasks.workunit.client.1.vm07.stdout:5/100: dread d0/fa [0,4194304] 0 2026-03-10T12:37:29.269 INFO:tasks.workunit.client.1.vm07.stdout:0/115: creat d0/f21 x:0 0 0 2026-03-10T12:37:29.277 INFO:tasks.workunit.client.1.vm07.stdout:0/116: write d0/fd [4328660,1784] 0 
2026-03-10T12:37:29.277 INFO:tasks.workunit.client.1.vm07.stdout:0/117: rmdir d0/d14/d1a 39 2026-03-10T12:37:29.278 INFO:tasks.workunit.client.1.vm07.stdout:0/118: unlink d0/d14/f18 0 2026-03-10T12:37:29.282 INFO:tasks.workunit.client.1.vm07.stdout:0/119: dwrite d0/f11 [4194304,4194304] 0 2026-03-10T12:37:29.284 INFO:tasks.workunit.client.1.vm07.stdout:0/120: write d0/f10 [1935649,26224] 0 2026-03-10T12:37:29.305 INFO:tasks.workunit.client.1.vm07.stdout:2/48: chown d0/f4 1940089 1 2026-03-10T12:37:29.311 INFO:tasks.workunit.client.1.vm07.stdout:3/74: unlink f8 0 2026-03-10T12:37:29.314 INFO:tasks.workunit.client.1.vm07.stdout:3/75: write dc/f17 [410956,56802] 0 2026-03-10T12:37:29.314 INFO:tasks.workunit.client.1.vm07.stdout:2/49: creat d0/f12 x:0 0 0 2026-03-10T12:37:29.317 INFO:tasks.workunit.client.1.vm07.stdout:2/50: readlink d0/l5 0 2026-03-10T12:37:29.318 INFO:tasks.workunit.client.1.vm07.stdout:3/76: dwrite f1 [0,4194304] 0 2026-03-10T12:37:29.321 INFO:tasks.workunit.client.1.vm07.stdout:2/51: creat d0/f13 x:0 0 0 2026-03-10T12:37:29.324 INFO:tasks.workunit.client.1.vm07.stdout:3/77: dread dc/f10 [0,4194304] 0 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:2/52: dread d0/f6 [0,4194304] 0 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:3/78: chown dc/c15 94 1 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:3/79: chown dc/ff 36 1 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:3/80: stat dc/dd/l14 0 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:2/53: creat d0/f14 x:0 0 0 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:3/81: creat dc/dd/f19 x:0 0 0 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:3/82: readlink dc/l13 0 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:3/83: rename dc/c15 to dc/dd/c1a 0 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:2/54: creat d0/f15 x:0 0 0 2026-03-10T12:37:29.361 
INFO:tasks.workunit.client.1.vm07.stdout:2/55: mknod d0/c16 0 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:3/84: dwrite dc/f17 [0,4194304] 0 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:2/56: truncate d0/f6 1897871 0 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:2/57: write d0/f14 [426557,32436] 0 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:2/58: dread - d0/f15 zero size 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:3/85: dwrite dc/dd/f19 [0,4194304] 0 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:4/68: dread d0/f7 [0,4194304] 0 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:3/86: unlink f6 0 2026-03-10T12:37:29.361 INFO:tasks.workunit.client.1.vm07.stdout:4/69: dread d0/f7 [0,4194304] 0 2026-03-10T12:37:29.362 INFO:tasks.workunit.client.1.vm07.stdout:4/70: creat d0/d4/d10/f16 x:0 0 0 2026-03-10T12:37:29.365 INFO:tasks.workunit.client.1.vm07.stdout:4/71: symlink d0/d4/d5/da/l17 0 2026-03-10T12:37:29.366 INFO:tasks.workunit.client.1.vm07.stdout:4/72: truncate d0/d4/d10/f16 71422 0 2026-03-10T12:37:29.368 INFO:tasks.workunit.client.1.vm07.stdout:4/73: mkdir d0/d4/d10/d18 0 2026-03-10T12:37:29.375 INFO:tasks.workunit.client.1.vm07.stdout:4/74: dwrite d0/d4/d5/da/f15 [0,4194304] 0 2026-03-10T12:37:29.376 INFO:tasks.workunit.client.1.vm07.stdout:4/75: chown d0 220800076 1 2026-03-10T12:37:29.398 INFO:tasks.workunit.client.1.vm07.stdout:7/65: sync 2026-03-10T12:37:29.405 INFO:tasks.workunit.client.1.vm07.stdout:0/121: sync 2026-03-10T12:37:29.405 INFO:tasks.workunit.client.1.vm07.stdout:7/66: unlink d0/l5 0 2026-03-10T12:37:29.409 INFO:tasks.workunit.client.1.vm07.stdout:7/67: dread d0/fc [0,4194304] 0 2026-03-10T12:37:29.423 INFO:tasks.workunit.client.1.vm07.stdout:7/68: unlink d0/f8 0 2026-03-10T12:37:29.423 INFO:tasks.workunit.client.1.vm07.stdout:7/69: stat d0/f4 0 2026-03-10T12:37:29.424 
INFO:tasks.workunit.client.1.vm07.stdout:7/70: unlink d0/cf 0 2026-03-10T12:37:29.426 INFO:tasks.workunit.client.1.vm07.stdout:7/71: dread d0/fe [0,4194304] 0 2026-03-10T12:37:29.426 INFO:tasks.workunit.client.1.vm07.stdout:0/122: symlink d0/d14/d1a/d1b/l22 0 2026-03-10T12:37:29.427 INFO:tasks.workunit.client.1.vm07.stdout:7/72: creat d0/f10 x:0 0 0 2026-03-10T12:37:29.428 INFO:tasks.workunit.client.1.vm07.stdout:0/123: rename d0/cb to d0/d14/d1a/d1b/c23 0 2026-03-10T12:37:29.430 INFO:tasks.workunit.client.1.vm07.stdout:7/73: mknod d0/c11 0 2026-03-10T12:37:29.437 INFO:tasks.workunit.client.1.vm07.stdout:0/124: dwrite d0/d14/f19 [0,4194304] 0 2026-03-10T12:37:29.441 INFO:tasks.workunit.client.1.vm07.stdout:7/74: dwrite d0/fe [0,4194304] 0 2026-03-10T12:37:29.465 INFO:tasks.workunit.client.1.vm07.stdout:0/125: creat d0/d14/d1a/f24 x:0 0 0 2026-03-10T12:37:29.469 INFO:tasks.workunit.client.1.vm07.stdout:0/126: symlink d0/d14/l25 0 2026-03-10T12:37:29.543 INFO:tasks.workunit.client.1.vm07.stdout:3/87: fdatasync f1 0 2026-03-10T12:37:29.549 INFO:tasks.workunit.client.1.vm07.stdout:0/127: dread d0/f15 [0,4194304] 0 2026-03-10T12:37:29.553 INFO:tasks.workunit.client.1.vm07.stdout:3/88: dread f1 [0,4194304] 0 2026-03-10T12:37:29.554 INFO:tasks.workunit.client.1.vm07.stdout:3/89: truncate dc/f10 609919 0 2026-03-10T12:37:29.555 INFO:tasks.workunit.client.1.vm07.stdout:3/90: chown dc/dd/f16 3008449 1 2026-03-10T12:37:29.557 INFO:tasks.workunit.client.1.vm07.stdout:3/91: mknod dc/c1b 0 2026-03-10T12:37:29.559 INFO:tasks.workunit.client.1.vm07.stdout:3/92: truncate f2 1643211 0 2026-03-10T12:37:29.563 INFO:tasks.workunit.client.1.vm07.stdout:3/93: unlink dc/ff 0 2026-03-10T12:37:29.569 INFO:tasks.workunit.client.1.vm07.stdout:3/94: mknod dc/dd/c1c 0 2026-03-10T12:37:29.574 INFO:tasks.workunit.client.1.vm07.stdout:3/95: dread f1 [0,4194304] 0 2026-03-10T12:37:29.582 INFO:tasks.workunit.client.1.vm07.stdout:3/96: dwrite dc/dd/f16 [0,4194304] 0 2026-03-10T12:37:29.586 
INFO:tasks.workunit.client.1.vm07.stdout:3/97: getdents dc/d18 0 2026-03-10T12:37:29.589 INFO:tasks.workunit.client.1.vm07.stdout:3/98: write f1 [3594956,54457] 0 2026-03-10T12:37:29.593 INFO:tasks.workunit.client.1.vm07.stdout:3/99: creat dc/dd/f1d x:0 0 0 2026-03-10T12:37:29.627 INFO:tasks.workunit.client.1.vm07.stdout:8/96: getdents d1/d3 0 2026-03-10T12:37:29.629 INFO:tasks.workunit.client.1.vm07.stdout:8/97: creat d1/d3/f25 x:0 0 0 2026-03-10T12:37:29.632 INFO:tasks.workunit.client.1.vm07.stdout:8/98: symlink d1/d3/d6/l26 0 2026-03-10T12:37:29.635 INFO:tasks.workunit.client.1.vm07.stdout:8/99: write d1/d3/d6/f9 [1331786,129553] 0 2026-03-10T12:37:29.635 INFO:tasks.workunit.client.1.vm07.stdout:8/100: chown d1/d3/la 0 1 2026-03-10T12:37:29.657 INFO:tasks.workunit.client.1.vm07.stdout:8/101: dwrite d1/f2 [0,4194304] 0 2026-03-10T12:37:29.670 INFO:tasks.workunit.client.1.vm07.stdout:8/102: unlink d1/d3/la 0 2026-03-10T12:37:29.679 INFO:tasks.workunit.client.1.vm07.stdout:8/103: dwrite d1/d3/f21 [0,4194304] 0 2026-03-10T12:37:29.686 INFO:tasks.workunit.client.1.vm07.stdout:5/101: getdents d0/d14/d18 0 2026-03-10T12:37:29.687 INFO:tasks.workunit.client.1.vm07.stdout:5/102: write d0/d14/f1b [342700,5493] 0 2026-03-10T12:37:29.690 INFO:tasks.workunit.client.1.vm07.stdout:1/97: dwrite d9/f12 [0,4194304] 0 2026-03-10T12:37:29.694 INFO:tasks.workunit.client.1.vm07.stdout:1/98: write d9/df/f15 [1094214,96962] 0 2026-03-10T12:37:29.695 INFO:tasks.workunit.client.1.vm07.stdout:1/99: chown c3 1785339 1 2026-03-10T12:37:29.695 INFO:tasks.workunit.client.1.vm07.stdout:1/100: chown f8 336386 1 2026-03-10T12:37:29.703 INFO:tasks.workunit.client.1.vm07.stdout:1/101: dwrite d9/df/f10 [0,4194304] 0 2026-03-10T12:37:29.705 INFO:tasks.workunit.client.1.vm07.stdout:1/102: chown d9/df/c20 7 1 2026-03-10T12:37:29.705 INFO:tasks.workunit.client.1.vm07.stdout:1/103: chown f6 120 1 2026-03-10T12:37:29.710 INFO:tasks.workunit.client.1.vm07.stdout:5/103: mkdir d0/d14/d18/d19/d21 0 
2026-03-10T12:37:29.711 INFO:tasks.workunit.client.1.vm07.stdout:5/104: fsync d0/ff 0 2026-03-10T12:37:29.711 INFO:tasks.workunit.client.1.vm07.stdout:5/105: chown d0/f9 13161 1 2026-03-10T12:37:29.723 INFO:tasks.workunit.client.1.vm07.stdout:1/104: write f6 [3479219,112436] 0 2026-03-10T12:37:29.735 INFO:tasks.workunit.client.1.vm07.stdout:6/68: link d1/d9/fb d1/d4/f11 0 2026-03-10T12:37:29.736 INFO:tasks.workunit.client.1.vm07.stdout:3/100: fsync dc/dd/f19 0 2026-03-10T12:37:29.741 INFO:tasks.workunit.client.1.vm07.stdout:5/106: write d0/d14/d18/f20 [624081,18066] 0 2026-03-10T12:37:29.749 INFO:tasks.workunit.client.1.vm07.stdout:8/104: truncate d1/fc 454364 0 2026-03-10T12:37:29.749 INFO:tasks.workunit.client.1.vm07.stdout:1/105: creat d9/df/f21 x:0 0 0 2026-03-10T12:37:29.758 INFO:tasks.workunit.client.1.vm07.stdout:5/107: rmdir d0/d14 39 2026-03-10T12:37:29.762 INFO:tasks.workunit.client.1.vm07.stdout:2/59: dwrite d0/f4 [0,4194304] 0 2026-03-10T12:37:29.771 INFO:tasks.workunit.client.1.vm07.stdout:4/76: write d0/d4/d5/da/f15 [5041485,103706] 0 2026-03-10T12:37:29.779 INFO:tasks.workunit.client.1.vm07.stdout:7/75: write d0/fe [4356214,90273] 0 2026-03-10T12:37:29.789 INFO:tasks.workunit.client.1.vm07.stdout:8/105: symlink d1/d3/d11/l27 0 2026-03-10T12:37:29.792 INFO:tasks.workunit.client.1.vm07.stdout:1/106: readlink l5 0 2026-03-10T12:37:29.795 INFO:tasks.workunit.client.1.vm07.stdout:6/69: rename d1/cc to d1/d9/c12 0 2026-03-10T12:37:29.795 INFO:tasks.workunit.client.1.vm07.stdout:6/70: chown d1/d9 183 1 2026-03-10T12:37:29.796 INFO:tasks.workunit.client.1.vm07.stdout:6/71: readlink d1/d4/d6/lf 0 2026-03-10T12:37:29.798 INFO:tasks.workunit.client.1.vm07.stdout:8/106: dread d1/d3/d18/f1b [0,4194304] 0 2026-03-10T12:37:29.800 INFO:tasks.workunit.client.1.vm07.stdout:8/107: truncate d1/d3/d6/f9 1704992 0 2026-03-10T12:37:29.811 INFO:tasks.workunit.client.1.vm07.stdout:4/77: mkdir d0/d19 0 2026-03-10T12:37:29.817 INFO:tasks.workunit.client.1.vm07.stdout:0/128: 
truncate d0/f11 7194569 0 2026-03-10T12:37:29.829 INFO:tasks.workunit.client.1.vm07.stdout:1/107: dread d9/df/f15 [0,4194304] 0 2026-03-10T12:37:29.829 INFO:tasks.workunit.client.1.vm07.stdout:1/108: chown d9/df/f13 92 1 2026-03-10T12:37:29.847 INFO:tasks.workunit.client.1.vm07.stdout:9/49: dwrite d5/fa [0,4194304] 0 2026-03-10T12:37:29.996 INFO:tasks.workunit.client.1.vm07.stdout:7/76: mkdir d0/d12 0 2026-03-10T12:37:30.005 INFO:tasks.workunit.client.1.vm07.stdout:8/108: rename d1/d3/d11/l27 to d1/d3/d6/l28 0 2026-03-10T12:37:30.006 INFO:tasks.workunit.client.1.vm07.stdout:8/109: chown d1 217607156 1 2026-03-10T12:37:30.008 INFO:tasks.workunit.client.1.vm07.stdout:0/129: symlink d0/l26 0 2026-03-10T12:37:30.010 INFO:tasks.workunit.client.1.vm07.stdout:9/50: creat d5/fe x:0 0 0 2026-03-10T12:37:30.016 INFO:tasks.workunit.client.1.vm07.stdout:6/72: truncate d1/d4/f11 5170 0 2026-03-10T12:37:30.016 INFO:tasks.workunit.client.1.vm07.stdout:6/73: chown d1/d4/d6/lf 1063109560 1 2026-03-10T12:37:30.017 INFO:tasks.workunit.client.1.vm07.stdout:3/101: getdents dc/dd 0 2026-03-10T12:37:30.018 INFO:tasks.workunit.client.1.vm07.stdout:8/110: creat d1/d3/f29 x:0 0 0 2026-03-10T12:37:30.018 INFO:tasks.workunit.client.1.vm07.stdout:8/111: fsync d1/d3/d11/f15 0 2026-03-10T12:37:30.020 INFO:tasks.workunit.client.1.vm07.stdout:2/60: creat d0/f17 x:0 0 0 2026-03-10T12:37:30.020 INFO:tasks.workunit.client.1.vm07.stdout:4/78: creat d0/d4/d10/d18/f1a x:0 0 0 2026-03-10T12:37:30.021 INFO:tasks.workunit.client.1.vm07.stdout:9/51: symlink d5/lf 0 2026-03-10T12:37:30.027 INFO:tasks.workunit.client.1.vm07.stdout:6/74: creat d1/d4/d6/f13 x:0 0 0 2026-03-10T12:37:30.030 INFO:tasks.workunit.client.1.vm07.stdout:5/108: rename d0/d14 to d0/d22 0 2026-03-10T12:37:30.035 INFO:tasks.workunit.client.1.vm07.stdout:4/79: symlink d0/d4/d5/da/l1b 0 2026-03-10T12:37:30.037 INFO:tasks.workunit.client.1.vm07.stdout:9/52: mknod d5/c10 0 2026-03-10T12:37:30.038 INFO:tasks.workunit.client.1.vm07.stdout:7/77: 
rmdir d0/d12 0 2026-03-10T12:37:30.039 INFO:tasks.workunit.client.1.vm07.stdout:3/102: symlink dc/d18/l1e 0 2026-03-10T12:37:30.040 INFO:tasks.workunit.client.1.vm07.stdout:3/103: write dc/dd/f19 [4514365,14685] 0 2026-03-10T12:37:30.045 INFO:tasks.workunit.client.1.vm07.stdout:1/109: rename d9/df/f18 to d9/f22 0 2026-03-10T12:37:30.045 INFO:tasks.workunit.client.1.vm07.stdout:8/112: rename d1/d3 to d1/d3/d6/d2a 22 2026-03-10T12:37:30.046 INFO:tasks.workunit.client.1.vm07.stdout:8/113: truncate d1/d3/d6/f9 2005338 0 2026-03-10T12:37:30.051 INFO:tasks.workunit.client.1.vm07.stdout:5/109: creat d0/d22/d18/d19/f23 x:0 0 0 2026-03-10T12:37:30.067 INFO:tasks.workunit.client.1.vm07.stdout:0/130: creat d0/d14/d1a/f27 x:0 0 0 2026-03-10T12:37:30.067 INFO:tasks.workunit.client.1.vm07.stdout:0/131: fdatasync d0/f16 0 2026-03-10T12:37:30.067 INFO:tasks.workunit.client.1.vm07.stdout:0/132: write d0/d14/d1a/f24 [550146,69714] 0 2026-03-10T12:37:30.075 INFO:tasks.workunit.client.1.vm07.stdout:9/53: mknod d5/c11 0 2026-03-10T12:37:30.077 INFO:tasks.workunit.client.1.vm07.stdout:9/54: dread d5/f8 [0,4194304] 0 2026-03-10T12:37:30.081 INFO:tasks.workunit.client.1.vm07.stdout:6/75: symlink d1/l14 0 2026-03-10T12:37:30.082 INFO:tasks.workunit.client.1.vm07.stdout:3/104: write f2 [773507,30856] 0 2026-03-10T12:37:30.082 INFO:tasks.workunit.client.1.vm07.stdout:3/105: write f2 [512362,88680] 0 2026-03-10T12:37:30.083 INFO:tasks.workunit.client.1.vm07.stdout:4/80: rename d0/d4/d5/lb to d0/l1c 0 2026-03-10T12:37:30.089 INFO:tasks.workunit.client.1.vm07.stdout:1/110: mknod d9/df/c23 0 2026-03-10T12:37:30.091 INFO:tasks.workunit.client.1.vm07.stdout:8/114: symlink d1/d3/d6/l2b 0 2026-03-10T12:37:30.095 INFO:tasks.workunit.client.1.vm07.stdout:0/133: mknod d0/d14/d1a/d1b/c28 0 2026-03-10T12:37:30.097 INFO:tasks.workunit.client.1.vm07.stdout:2/61: creat d0/f18 x:0 0 0 2026-03-10T12:37:30.097 INFO:tasks.workunit.client.1.vm07.stdout:7/78: fsync d0/f3 0 2026-03-10T12:37:30.098 
INFO:tasks.workunit.client.1.vm07.stdout:7/79: truncate d0/f10 388625 0 2026-03-10T12:37:30.098 INFO:tasks.workunit.client.1.vm07.stdout:7/80: readlink d0/ld 0 2026-03-10T12:37:30.099 INFO:tasks.workunit.client.1.vm07.stdout:6/76: creat d1/d4/d6/f15 x:0 0 0 2026-03-10T12:37:30.103 INFO:tasks.workunit.client.1.vm07.stdout:3/106: mkdir dc/dd/d1f 0 2026-03-10T12:37:30.108 INFO:tasks.workunit.client.1.vm07.stdout:4/81: symlink d0/d4/l1d 0 2026-03-10T12:37:30.113 INFO:tasks.workunit.client.1.vm07.stdout:4/82: dwrite d0/d4/d5/da/f15 [4194304,4194304] 0 2026-03-10T12:37:30.120 INFO:tasks.workunit.client.1.vm07.stdout:1/111: creat d9/df/f24 x:0 0 0 2026-03-10T12:37:30.121 INFO:tasks.workunit.client.1.vm07.stdout:4/83: dwrite d0/d4/d5/da/f15 [4194304,4194304] 0 2026-03-10T12:37:30.126 INFO:tasks.workunit.client.1.vm07.stdout:4/84: chown d0/d4/d5/da/l1b 737379 1 2026-03-10T12:37:30.129 INFO:tasks.workunit.client.1.vm07.stdout:1/112: dread d9/fb [0,4194304] 0 2026-03-10T12:37:30.129 INFO:tasks.workunit.client.1.vm07.stdout:1/113: dread - d9/fc zero size 2026-03-10T12:37:30.131 INFO:tasks.workunit.client.1.vm07.stdout:4/85: dread d0/d4/d5/da/f15 [0,4194304] 0 2026-03-10T12:37:30.133 INFO:tasks.workunit.client.1.vm07.stdout:0/134: rename d0/d14/d1a/d1b/l22 to d0/d14/d1a/d1b/l29 0 2026-03-10T12:37:30.137 INFO:tasks.workunit.client.1.vm07.stdout:6/77: mkdir d1/d4/d6/d16 0 2026-03-10T12:37:30.140 INFO:tasks.workunit.client.1.vm07.stdout:3/107: creat dc/dd/f20 x:0 0 0 2026-03-10T12:37:30.142 INFO:tasks.workunit.client.1.vm07.stdout:3/108: dread dc/dd/f19 [0,4194304] 0 2026-03-10T12:37:30.143 INFO:tasks.workunit.client.1.vm07.stdout:9/55: rename l4 to d5/l12 0 2026-03-10T12:37:30.143 INFO:tasks.workunit.client.1.vm07.stdout:3/109: write dc/f17 [2539555,73195] 0 2026-03-10T12:37:30.156 INFO:tasks.workunit.client.1.vm07.stdout:4/86: symlink d0/d4/d5/l1e 0 2026-03-10T12:37:30.159 INFO:tasks.workunit.client.1.vm07.stdout:0/135: mknod d0/d14/d1a/d1b/c2a 0 2026-03-10T12:37:30.164 
INFO:tasks.workunit.client.1.vm07.stdout:2/62: mkdir d0/d19 0 2026-03-10T12:37:30.170 INFO:tasks.workunit.client.1.vm07.stdout:7/81: rename d0/f9 to d0/f13 0 2026-03-10T12:37:30.171 INFO:tasks.workunit.client.1.vm07.stdout:7/82: write d0/f10 [112687,37218] 0 2026-03-10T12:37:30.179 INFO:tasks.workunit.client.1.vm07.stdout:9/56: mkdir d5/d13 0 2026-03-10T12:37:30.179 INFO:tasks.workunit.client.1.vm07.stdout:9/57: dread - d5/fe zero size 2026-03-10T12:37:30.189 INFO:tasks.workunit.client.1.vm07.stdout:2/63: chown d0/f6 461248 1 2026-03-10T12:37:30.190 INFO:tasks.workunit.client.1.vm07.stdout:2/64: truncate d0/f18 591841 0 2026-03-10T12:37:30.205 INFO:tasks.workunit.client.1.vm07.stdout:7/83: dwrite d0/fc [4194304,4194304] 0 2026-03-10T12:37:30.206 INFO:tasks.workunit.client.1.vm07.stdout:7/84: fdatasync d0/f3 0 2026-03-10T12:37:30.206 INFO:tasks.workunit.client.1.vm07.stdout:7/85: stat d0/ld 0 2026-03-10T12:37:30.207 INFO:tasks.workunit.client.1.vm07.stdout:7/86: fdatasync d0/f4 0 2026-03-10T12:37:30.219 INFO:tasks.workunit.client.1.vm07.stdout:9/58: readlink d5/ld 0 2026-03-10T12:37:30.222 INFO:tasks.workunit.client.1.vm07.stdout:9/59: dwrite d5/fb [4194304,4194304] 0 2026-03-10T12:37:30.225 INFO:tasks.workunit.client.1.vm07.stdout:3/110: link dc/f10 dc/dd/f21 0 2026-03-10T12:37:30.239 INFO:tasks.workunit.client.1.vm07.stdout:4/87: mkdir d0/d19/d1f 0 2026-03-10T12:37:30.244 INFO:tasks.workunit.client.1.vm07.stdout:5/110: getdents d0 0 2026-03-10T12:37:30.245 INFO:tasks.workunit.client.1.vm07.stdout:5/111: readlink d0/d22/d18/l1a 0 2026-03-10T12:37:30.246 INFO:tasks.workunit.client.1.vm07.stdout:5/112: write d0/ff [8827754,123306] 0 2026-03-10T12:37:30.252 INFO:tasks.workunit.client.1.vm07.stdout:5/113: dread d0/d22/d18/f20 [0,4194304] 0 2026-03-10T12:37:30.253 INFO:tasks.workunit.client.1.vm07.stdout:5/114: rename d0/d22/d18 to d0/d22/d18/d19/d21/d24 22 2026-03-10T12:37:30.262 INFO:tasks.workunit.client.1.vm07.stdout:8/115: getdents d1/d3/d6 0 
2026-03-10T12:37:30.267 INFO:tasks.workunit.client.1.vm07.stdout:0/136: symlink d0/d14/d1a/l2b 0 2026-03-10T12:37:30.268 INFO:tasks.workunit.client.1.vm07.stdout:0/137: chown d0/d14/d1a/f27 0 1 2026-03-10T12:37:30.281 INFO:tasks.workunit.client.1.vm07.stdout:7/87: creat d0/f14 x:0 0 0 2026-03-10T12:37:30.286 INFO:tasks.workunit.client.1.vm07.stdout:6/78: creat d1/f17 x:0 0 0 2026-03-10T12:37:30.287 INFO:tasks.workunit.client.1.vm07.stdout:3/111: creat dc/dd/f22 x:0 0 0 2026-03-10T12:37:30.287 INFO:tasks.workunit.client.1.vm07.stdout:1/114: link l2 d9/l25 0 2026-03-10T12:37:30.288 INFO:tasks.workunit.client.1.vm07.stdout:4/88: symlink d0/d4/d5/l20 0 2026-03-10T12:37:30.288 INFO:tasks.workunit.client.1.vm07.stdout:5/115: symlink d0/l25 0 2026-03-10T12:37:30.300 INFO:tasks.workunit.client.1.vm07.stdout:8/116: readlink d1/d3/d18/l22 0 2026-03-10T12:37:30.301 INFO:tasks.workunit.client.1.vm07.stdout:8/117: chown d1/d3/d6/l26 0 1 2026-03-10T12:37:30.308 INFO:tasks.workunit.client.1.vm07.stdout:8/118: dread d1/f19 [0,4194304] 0 2026-03-10T12:37:30.311 INFO:tasks.workunit.client.1.vm07.stdout:2/65: creat d0/d19/f1a x:0 0 0 2026-03-10T12:37:30.311 INFO:tasks.workunit.client.1.vm07.stdout:9/60: creat d5/d13/f14 x:0 0 0 2026-03-10T12:37:30.312 INFO:tasks.workunit.client.1.vm07.stdout:9/61: chown d5/cc 17 1 2026-03-10T12:37:30.313 INFO:tasks.workunit.client.1.vm07.stdout:1/115: creat d9/df/f26 x:0 0 0 2026-03-10T12:37:30.316 INFO:tasks.workunit.client.1.vm07.stdout:5/116: write d0/d22/f1b [1087318,63083] 0 2026-03-10T12:37:30.321 INFO:tasks.workunit.client.1.vm07.stdout:8/119: fsync d1/f7 0 2026-03-10T12:37:30.338 INFO:tasks.workunit.client.1.vm07.stdout:2/66: creat d0/d19/f1b x:0 0 0 2026-03-10T12:37:30.339 INFO:tasks.workunit.client.1.vm07.stdout:9/62: symlink d5/d13/l15 0 2026-03-10T12:37:30.339 INFO:tasks.workunit.client.1.vm07.stdout:9/63: write d5/fa [5102141,82006] 0 2026-03-10T12:37:30.339 INFO:tasks.workunit.client.1.vm07.stdout:3/112: symlink dc/dd/d1f/l23 0 
2026-03-10T12:37:30.339 INFO:tasks.workunit.client.1.vm07.stdout:3/113: read dc/dd/f16 [1646258,127447] 0
2026-03-10T12:37:30.339 INFO:tasks.workunit.client.1.vm07.stdout:9/64: dwrite d5/fe [0,4194304] 0
2026-03-10T12:37:30.339 INFO:tasks.workunit.client.1.vm07.stdout:5/117: mknod d0/d22/c26 0
2026-03-10T12:37:30.339 INFO:tasks.workunit.client.1.vm07.stdout:0/138: creat d0/d14/d1a/f2c x:0 0 0
2026-03-10T12:37:30.339 INFO:tasks.workunit.client.1.vm07.stdout:8/120: creat d1/d3/d6/f2c x:0 0 0
2026-03-10T12:37:30.346 INFO:tasks.workunit.client.1.vm07.stdout:3/114: mkdir dc/d18/d24 0
2026-03-10T12:37:30.388 INFO:tasks.workunit.client.1.vm07.stdout:3/115: dread - dc/dd/f20 zero size
2026-03-10T12:37:30.388 INFO:tasks.workunit.client.1.vm07.stdout:3/116: dread dc/f10 [0,4194304] 0
2026-03-10T12:37:30.388 INFO:tasks.workunit.client.1.vm07.stdout:5/118: dread d0/fd [0,4194304] 0
2026-03-10T12:37:30.388 INFO:tasks.workunit.client.1.vm07.stdout:8/121: rename d1/d3/f21 to d1/d3/f2d 0
2026-03-10T12:37:30.388 INFO:tasks.workunit.client.1.vm07.stdout:6/79: link d1/d4/d6/cd d1/c18 0
2026-03-10T12:37:30.388 INFO:tasks.workunit.client.1.vm07.stdout:4/89: getdents d0/d4 0
2026-03-10T12:37:30.388 INFO:tasks.workunit.client.1.vm07.stdout:5/119: creat d0/d22/f27 x:0 0 0
2026-03-10T12:37:30.388 INFO:tasks.workunit.client.1.vm07.stdout:8/122: unlink d1/d3/c13 0
2026-03-10T12:37:30.389 INFO:tasks.workunit.client.1.vm07.stdout:6/80: creat d1/d4/f19 x:0 0 0
2026-03-10T12:37:30.392 INFO:tasks.workunit.client.1.vm07.stdout:4/90: mknod d0/d4/d10/d18/c21 0
2026-03-10T12:37:30.392 INFO:tasks.workunit.client.1.vm07.stdout:4/91: read - d0/d4/d10/d18/f1a zero size
2026-03-10T12:37:30.397 INFO:tasks.workunit.client.1.vm07.stdout:5/120: symlink d0/d22/d18/d19/l28 0
2026-03-10T12:37:30.399 INFO:tasks.workunit.client.1.vm07.stdout:8/123: write d1/d3/f2d [4177028,65890] 0
2026-03-10T12:37:30.399 INFO:tasks.workunit.client.1.vm07.stdout:8/124: fdatasync d1/f2 0
2026-03-10T12:37:30.404 INFO:tasks.workunit.client.1.vm07.stdout:5/121: read d0/fc [5560339,57225] 0
2026-03-10T12:37:30.406 INFO:tasks.workunit.client.1.vm07.stdout:7/88: sync
2026-03-10T12:37:30.407 INFO:tasks.workunit.client.1.vm07.stdout:1/116: sync
2026-03-10T12:37:30.407 INFO:tasks.workunit.client.1.vm07.stdout:7/89: readlink d0/ld 0
2026-03-10T12:37:30.409 INFO:tasks.workunit.client.1.vm07.stdout:8/125: write d1/f19 [4103683,76908] 0
2026-03-10T12:37:30.412 INFO:tasks.workunit.client.1.vm07.stdout:6/81: mkdir d1/d4/d6/d16/d1a 0
2026-03-10T12:37:30.412 INFO:tasks.workunit.client.1.vm07.stdout:6/82: write d1/d9/f10 [441863,63499] 0
2026-03-10T12:37:30.418 INFO:tasks.workunit.client.1.vm07.stdout:6/83: dwrite d1/d4/d6/f13 [0,4194304] 0
2026-03-10T12:37:30.424 INFO:tasks.workunit.client.1.vm07.stdout:4/92: creat d0/d19/d1f/f22 x:0 0 0
2026-03-10T12:37:30.425 INFO:tasks.workunit.client.1.vm07.stdout:7/90: unlink d0/f4 0
2026-03-10T12:37:30.426 INFO:tasks.workunit.client.1.vm07.stdout:7/91: stat d0/fc 0
2026-03-10T12:37:30.426 INFO:tasks.workunit.client.1.vm07.stdout:7/92: stat d0/c2 0
2026-03-10T12:37:30.427 INFO:tasks.workunit.client.1.vm07.stdout:7/93: readlink d0/ld 0
2026-03-10T12:37:30.433 INFO:tasks.workunit.client.1.vm07.stdout:7/94: chown d0/c2 596 1
2026-03-10T12:37:30.434 INFO:tasks.workunit.client.1.vm07.stdout:6/84: dread d1/d4/d6/f13 [0,4194304] 0
2026-03-10T12:37:30.434 INFO:tasks.workunit.client.1.vm07.stdout:7/95: readlink d0/ld 0
2026-03-10T12:37:30.435 INFO:tasks.workunit.client.1.vm07.stdout:4/93: dwrite d0/d4/d10/d18/f1a [0,4194304] 0
2026-03-10T12:37:30.440 INFO:tasks.workunit.client.1.vm07.stdout:5/122: mknod d0/d22/d18/d19/d21/c29 0
2026-03-10T12:37:30.441 INFO:tasks.workunit.client.1.vm07.stdout:6/85: write d1/d4/d6/f15 [753010,85551] 0
2026-03-10T12:37:30.444 INFO:tasks.workunit.client.1.vm07.stdout:7/96: unlink d0/c2 0
2026-03-10T12:37:30.450 INFO:tasks.workunit.client.1.vm07.stdout:4/94: dread d0/d4/d10/f16 [0,4194304] 0
2026-03-10T12:37:30.451 INFO:tasks.workunit.client.1.vm07.stdout:4/95: readlink d0/d4/d5/da/l17 0
2026-03-10T12:37:30.454 INFO:tasks.workunit.client.1.vm07.stdout:1/117: rename d9/f12 to d9/f27 0
2026-03-10T12:37:30.454 INFO:tasks.workunit.client.1.vm07.stdout:1/118: stat d9/df/l17 0
2026-03-10T12:37:30.458 INFO:tasks.workunit.client.1.vm07.stdout:5/123: dread d0/d22/f16 [0,4194304] 0
2026-03-10T12:37:30.475 INFO:tasks.workunit.client.1.vm07.stdout:6/86: symlink d1/d4/d6/l1b 0
2026-03-10T12:37:30.475 INFO:tasks.workunit.client.1.vm07.stdout:6/87: stat d1/d9/f10 0
2026-03-10T12:37:30.475 INFO:tasks.workunit.client.1.vm07.stdout:7/97: mknod d0/c15 0
2026-03-10T12:37:30.475 INFO:tasks.workunit.client.1.vm07.stdout:4/96: readlink d0/l1c 0
2026-03-10T12:37:30.475 INFO:tasks.workunit.client.1.vm07.stdout:5/124: chown d0/f1f 33327355 1
2026-03-10T12:37:30.475 INFO:tasks.workunit.client.1.vm07.stdout:6/88: mknod d1/d4/c1c 0
2026-03-10T12:37:30.475 INFO:tasks.workunit.client.1.vm07.stdout:8/126: getdents d1 0
2026-03-10T12:37:30.475 INFO:tasks.workunit.client.1.vm07.stdout:1/119: symlink d9/l28 0
2026-03-10T12:37:30.475 INFO:tasks.workunit.client.1.vm07.stdout:4/97: fdatasync d0/d4/d10/f16 0
2026-03-10T12:37:30.475 INFO:tasks.workunit.client.1.vm07.stdout:1/120: dwrite d9/f19 [0,4194304] 0
2026-03-10T12:37:30.476 INFO:tasks.workunit.client.1.vm07.stdout:8/127: creat d1/d3/d18/f2e x:0 0 0
2026-03-10T12:37:30.478 INFO:tasks.workunit.client.1.vm07.stdout:5/125: symlink d0/l2a 0
2026-03-10T12:37:30.481 INFO:tasks.workunit.client.1.vm07.stdout:4/98: rename d0/d3 to d0/d4/d10/d23 0
2026-03-10T12:37:30.485 INFO:tasks.workunit.client.1.vm07.stdout:1/121: mkdir d9/df/d29 0
2026-03-10T12:37:30.500 INFO:tasks.workunit.client.1.vm07.stdout:1/122: dread - d9/f16 zero size
2026-03-10T12:37:30.500 INFO:tasks.workunit.client.1.vm07.stdout:8/128: unlink d1/d3/d6/f9 0
2026-03-10T12:37:30.500 INFO:tasks.workunit.client.1.vm07.stdout:5/126: chown d0/ff 395688 1
2026-03-10T12:37:30.500 INFO:tasks.workunit.client.1.vm07.stdout:8/129: dread d1/fc [0,4194304] 0
2026-03-10T12:37:30.500 INFO:tasks.workunit.client.1.vm07.stdout:4/99: symlink d0/d4/l24 0
2026-03-10T12:37:30.500 INFO:tasks.workunit.client.1.vm07.stdout:8/130: fdatasync d1/d3/f1d 0
2026-03-10T12:37:30.500 INFO:tasks.workunit.client.1.vm07.stdout:1/123: dwrite d9/fe [0,4194304] 0
2026-03-10T12:37:30.510 INFO:tasks.workunit.client.1.vm07.stdout:4/100: creat d0/d19/f25 x:0 0 0
2026-03-10T12:37:30.518 INFO:tasks.workunit.client.1.vm07.stdout:5/127: creat d0/f2b x:0 0 0
2026-03-10T12:37:30.525 INFO:tasks.workunit.client.1.vm07.stdout:1/124: mknod d9/c2a 0
2026-03-10T12:37:30.531 INFO:tasks.workunit.client.1.vm07.stdout:1/125: unlink d9/df/f1e 0
2026-03-10T12:37:30.533 INFO:tasks.workunit.client.1.vm07.stdout:4/101: link d0/l1c d0/d4/d5/da/l26 0
2026-03-10T12:37:30.533 INFO:tasks.workunit.client.1.vm07.stdout:4/102: chown d0/d4/d5 161100392 1
2026-03-10T12:37:30.533 INFO:tasks.workunit.client.1.vm07.stdout:4/103: fdatasync d0/d19/d1f/f22 0
2026-03-10T12:37:30.536 INFO:tasks.workunit.client.1.vm07.stdout:4/104: unlink d0/f7 0
2026-03-10T12:37:30.537 INFO:tasks.workunit.client.1.vm07.stdout:1/126: mkdir d9/df/d29/d2b 0
2026-03-10T12:37:30.538 INFO:tasks.workunit.client.1.vm07.stdout:1/127: chown d9/c2a 0 1
2026-03-10T12:37:30.540 INFO:tasks.workunit.client.1.vm07.stdout:1/128: truncate d9/fb 4506498 0
2026-03-10T12:37:30.540 INFO:tasks.workunit.client.1.vm07.stdout:1/129: chown d9/df/l17 526832 1
2026-03-10T12:37:30.543 INFO:tasks.workunit.client.1.vm07.stdout:9/65: rmdir d5/d13 39
2026-03-10T12:37:30.543 INFO:tasks.workunit.client.1.vm07.stdout:9/66: fdatasync d5/fa 0
2026-03-10T12:37:30.545 INFO:tasks.workunit.client.1.vm07.stdout:1/130: mkdir d9/df/d29/d2c 0
2026-03-10T12:37:30.549 INFO:tasks.workunit.client.1.vm07.stdout:1/131: dwrite d9/df/f13 [4194304,4194304] 0
2026-03-10T12:37:30.550 INFO:tasks.workunit.client.1.vm07.stdout:1/132: chown f6 638397702 1
2026-03-10T12:37:30.554 INFO:tasks.workunit.client.1.vm07.stdout:0/139: rmdir d0/d14/d1a 39
2026-03-10T12:37:30.557 INFO:tasks.workunit.client.1.vm07.stdout:0/140: dwrite d0/f10 [0,4194304] 0
2026-03-10T12:37:30.558 INFO:tasks.workunit.client.1.vm07.stdout:0/141: stat d0/l26 0
2026-03-10T12:37:30.563 INFO:tasks.workunit.client.1.vm07.stdout:9/67: chown d5/ld 5545848 1
2026-03-10T12:37:30.564 INFO:tasks.workunit.client.1.vm07.stdout:4/105: getdents d0/d4/d10/d23 0
2026-03-10T12:37:30.564 INFO:tasks.workunit.client.1.vm07.stdout:1/133: write d9/f22 [492079,13099] 0
2026-03-10T12:37:30.565 INFO:tasks.workunit.client.1.vm07.stdout:0/142: unlink d0/f11 0
2026-03-10T12:37:30.565 INFO:tasks.workunit.client.1.vm07.stdout:9/68: write d5/fa [2964573,112392] 0
2026-03-10T12:37:30.566 INFO:tasks.workunit.client.1.vm07.stdout:4/106: readlink d0/d4/l1d 0
2026-03-10T12:37:30.577 INFO:tasks.workunit.client.1.vm07.stdout:9/69: read d5/fa [3292536,51560] 0
2026-03-10T12:37:30.577 INFO:tasks.workunit.client.1.vm07.stdout:1/134: chown d9/df/f13 710611 1
2026-03-10T12:37:30.578 INFO:tasks.workunit.client.1.vm07.stdout:1/135: chown d9/f1f 1174 1
2026-03-10T12:37:30.578 INFO:tasks.workunit.client.1.vm07.stdout:0/143: chown d0/d14/d1a/d1b/l1e 3529 1
2026-03-10T12:37:30.578 INFO:tasks.workunit.client.1.vm07.stdout:0/144: fdatasync d0/f1d 0
2026-03-10T12:37:30.578 INFO:tasks.workunit.client.1.vm07.stdout:4/107: creat d0/d4/d10/d23/f27 x:0 0 0
2026-03-10T12:37:30.578 INFO:tasks.workunit.client.1.vm07.stdout:0/145: dwrite d0/f1c [0,4194304] 0
2026-03-10T12:37:30.579 INFO:tasks.workunit.client.1.vm07.stdout:0/146: truncate d0/d14/d1a/f2c 266207 0
2026-03-10T12:37:30.580 INFO:tasks.workunit.client.1.vm07.stdout:0/147: write d0/d14/d1a/f27 [798368,104220] 0
2026-03-10T12:37:30.582 INFO:tasks.workunit.client.1.vm07.stdout:0/148: read d0/f1c [3701724,94350] 0
2026-03-10T12:37:30.586 INFO:tasks.workunit.client.1.vm07.stdout:1/136: mkdir d9/d2d 0
2026-03-10T12:37:30.586 INFO:tasks.workunit.client.1.vm07.stdout:1/137: stat d9/f27 0
2026-03-10T12:37:30.587 INFO:tasks.workunit.client.1.vm07.stdout:1/138: read d9/df/f13 [301189,19496] 0
2026-03-10T12:37:30.589 INFO:tasks.workunit.client.1.vm07.stdout:0/149: creat d0/f2d x:0 0 0
2026-03-10T12:37:30.589 INFO:tasks.workunit.client.1.vm07.stdout:0/150: dread - d0/f21 zero size
2026-03-10T12:37:30.592 INFO:tasks.workunit.client.1.vm07.stdout:0/151: rename d0/f10 to d0/f2e 0
2026-03-10T12:37:30.594 INFO:tasks.workunit.client.1.vm07.stdout:1/139: link d9/df/c1d d9/df/d29/c2e 0
2026-03-10T12:37:30.594 INFO:tasks.workunit.client.1.vm07.stdout:1/140: read - d9/df/f24 zero size
2026-03-10T12:37:30.594 INFO:tasks.workunit.client.1.vm07.stdout:1/141: stat d9/fc 0
2026-03-10T12:37:30.595 INFO:tasks.workunit.client.1.vm07.stdout:0/152: mkdir d0/d14/d1a/d2f 0
2026-03-10T12:37:30.596 INFO:tasks.workunit.client.1.vm07.stdout:0/153: write d0/f1d [412702,52369] 0
2026-03-10T12:37:30.598 INFO:tasks.workunit.client.1.vm07.stdout:0/154: rename d0/f2d to d0/d14/d1a/f30 0
2026-03-10T12:37:30.599 INFO:tasks.workunit.client.1.vm07.stdout:0/155: mkdir d0/d14/d1a/d2f/d31 0
2026-03-10T12:37:30.678 INFO:tasks.workunit.client.1.vm07.stdout:5/128: sync
2026-03-10T12:37:30.682 INFO:tasks.workunit.client.1.vm07.stdout:5/129: rename d0/d22/f1b to d0/d22/d18/d19/f2c 0
2026-03-10T12:37:30.682 INFO:tasks.workunit.client.1.vm07.stdout:5/130: readlink d0/l25 0
2026-03-10T12:37:30.771 INFO:tasks.workunit.client.1.vm07.stdout:3/117: write dc/dd/f21 [180374,42361] 0
2026-03-10T12:37:30.772 INFO:tasks.workunit.client.1.vm07.stdout:3/118: stat l7 0
2026-03-10T12:37:30.774 INFO:tasks.workunit.client.1.vm07.stdout:3/119: symlink dc/l25 0
2026-03-10T12:37:30.777 INFO:tasks.workunit.client.1.vm07.stdout:3/120: link c9 dc/c26 0
2026-03-10T12:37:30.795 INFO:tasks.workunit.client.1.vm07.stdout:3/121: creat dc/dd/d1f/f27 x:0 0 0
2026-03-10T12:37:30.797 INFO:tasks.workunit.client.1.vm07.stdout:3/122: stat f2 0
2026-03-10T12:37:30.797 INFO:tasks.workunit.client.1.vm07.stdout:3/123: chown dc/c26 8949 1
2026-03-10T12:37:30.797 INFO:tasks.workunit.client.1.vm07.stdout:3/124: fsync dc/f17 0
2026-03-10T12:37:30.797 INFO:tasks.workunit.client.1.vm07.stdout:3/125: fdatasync dc/f17 0
2026-03-10T12:37:30.801 INFO:tasks.workunit.client.1.vm07.stdout:3/126: mkdir dc/dd/d28 0
2026-03-10T12:37:30.801 INFO:tasks.workunit.client.1.vm07.stdout:3/127: chown dc 122624 1
2026-03-10T12:37:30.807 INFO:tasks.workunit.client.1.vm07.stdout:2/67: truncate d0/f1 2806662 0
2026-03-10T12:37:30.811 INFO:tasks.workunit.client.1.vm07.stdout:4/108: fsync d0/d4/d10/d18/f1a 0
2026-03-10T12:37:30.811 INFO:tasks.workunit.client.1.vm07.stdout:4/109: readlink d0/d4/l1d 0
2026-03-10T12:37:30.811 INFO:tasks.workunit.client.1.vm07.stdout:4/110: stat d0/d4/d10/f16 0
2026-03-10T12:37:30.823 INFO:tasks.workunit.client.1.vm07.stdout:2/68: creat d0/f1c x:0 0 0
2026-03-10T12:37:30.827 INFO:tasks.workunit.client.1.vm07.stdout:2/69: dwrite d0/f1c [0,4194304] 0
2026-03-10T12:37:30.866 INFO:tasks.workunit.client.1.vm07.stdout:6/89: getdents d1/d4/d6 0
2026-03-10T12:37:30.869 INFO:tasks.workunit.client.1.vm07.stdout:6/90: dwrite d1/d4/f19 [0,4194304] 0
2026-03-10T12:37:30.871 INFO:tasks.workunit.client.1.vm07.stdout:6/91: chown d1/d4/le 8067371 1
2026-03-10T12:37:30.874 INFO:tasks.workunit.client.1.vm07.stdout:6/92: dread d1/d4/d6/f13 [0,4194304] 0
2026-03-10T12:37:30.876 INFO:tasks.workunit.client.1.vm07.stdout:7/98: write d0/f13 [271354,97824] 0
2026-03-10T12:37:30.881 INFO:tasks.workunit.client.1.vm07.stdout:7/99: rename d0/fe to d0/f16 0
2026-03-10T12:37:30.882 INFO:tasks.workunit.client.1.vm07.stdout:6/93: unlink d1/d9/c12 0
2026-03-10T12:37:30.891 INFO:tasks.workunit.client.1.vm07.stdout:8/131: dwrite d1/fc [0,4194304] 0
2026-03-10T12:37:30.893 INFO:tasks.workunit.client.1.vm07.stdout:6/94: mknod d1/d9/c1d 0
2026-03-10T12:37:30.894 INFO:tasks.workunit.client.1.vm07.stdout:6/95: truncate d1/f17 963087 0
2026-03-10T12:37:30.894 INFO:tasks.workunit.client.1.vm07.stdout:6/96: readlink d1/d4/d6/lf 0
2026-03-10T12:37:30.898 INFO:tasks.workunit.client.1.vm07.stdout:8/132: rename d1/fb to d1/d3/f2f 0
2026-03-10T12:37:30.900 INFO:tasks.workunit.client.1.vm07.stdout:6/97: rmdir d1/d4/d6 39
2026-03-10T12:37:30.906 INFO:tasks.workunit.client.1.vm07.stdout:6/98: creat d1/f1e x:0 0 0
2026-03-10T12:37:30.918 INFO:tasks.workunit.client.1.vm07.stdout:6/99: dread d1/d9/f10 [0,4194304] 0
2026-03-10T12:37:30.920 INFO:tasks.workunit.client.1.vm07.stdout:6/100: rename d1/d9/f10 to d1/d9/f1f 0
2026-03-10T12:37:30.920 INFO:tasks.workunit.client.1.vm07.stdout:6/101: rename d1 to d1/d9/d20 22
2026-03-10T12:37:30.923 INFO:tasks.workunit.client.1.vm07.stdout:6/102: dread d1/d4/d6/f13 [0,4194304] 0
2026-03-10T12:37:30.923 INFO:tasks.workunit.client.1.vm07.stdout:6/103: chown d1/d9/f1f 153 1
2026-03-10T12:37:30.925 INFO:tasks.workunit.client.1.vm07.stdout:6/104: symlink d1/l21 0
2026-03-10T12:37:30.926 INFO:tasks.workunit.client.1.vm07.stdout:6/105: creat d1/d9/f22 x:0 0 0
2026-03-10T12:37:30.928 INFO:tasks.workunit.client.1.vm07.stdout:6/106: symlink d1/d4/d6/l23 0
2026-03-10T12:37:30.928 INFO:tasks.workunit.client.1.vm07.stdout:6/107: truncate d1/f1e 544916 0
2026-03-10T12:37:30.929 INFO:tasks.workunit.client.1.vm07.stdout:6/108: symlink d1/d4/d6/d16/l24 0
2026-03-10T12:37:30.955 INFO:tasks.workunit.client.1.vm07.stdout:9/70: truncate d5/fe 1233243 0
2026-03-10T12:37:30.957 INFO:tasks.workunit.client.1.vm07.stdout:4/111: truncate d0/d4/d10/d18/f1a 1377413 0
2026-03-10T12:37:30.958 INFO:tasks.workunit.client.1.vm07.stdout:9/71: dwrite d5/fb [4194304,4194304] 0
2026-03-10T12:37:30.962 INFO:tasks.workunit.client.1.vm07.stdout:6/109: fdatasync d1/d4/f11 0
2026-03-10T12:37:30.969 INFO:tasks.workunit.client.1.vm07.stdout:1/142: rmdir d9/df 39
2026-03-10T12:37:30.970 INFO:tasks.workunit.client.1.vm07.stdout:1/143: fdatasync d9/f16 0
2026-03-10T12:37:30.974 INFO:tasks.workunit.client.1.vm07.stdout:0/156: truncate d0/fd 725160 0
2026-03-10T12:37:30.974 INFO:tasks.workunit.client.1.vm07.stdout:0/157: write d0/d14/d1a/f27 [1216249,98222] 0
2026-03-10T12:37:30.980 INFO:tasks.workunit.client.1.vm07.stdout:0/158: dread d0/d14/d1a/f24 [0,4194304] 0
2026-03-10T12:37:30.983 INFO:tasks.workunit.client.1.vm07.stdout:5/131: dwrite d0/d22/d18/f20 [0,4194304] 0
2026-03-10T12:37:30.986 INFO:tasks.workunit.client.1.vm07.stdout:5/132: dwrite d0/f1f [0,4194304] 0
2026-03-10T12:37:30.990 INFO:tasks.workunit.client.1.vm07.stdout:9/72: chown d5/d13/f14 65 1
2026-03-10T12:37:30.995 INFO:tasks.workunit.client.1.vm07.stdout:6/110: fdatasync d1/d9/f1f 0
2026-03-10T12:37:30.997 INFO:tasks.workunit.client.1.vm07.stdout:0/159: rename d0 to d0/d14/d32 22
2026-03-10T12:37:31.000 INFO:tasks.workunit.client.1.vm07.stdout:0/160: dread d0/f16 [0,4194304] 0
2026-03-10T12:37:31.014 INFO:tasks.workunit.client.1.vm07.stdout:0/161: fdatasync d0/d14/f19 0
2026-03-10T12:37:31.014 INFO:tasks.workunit.client.1.vm07.stdout:5/133: stat d0/l7 0
2026-03-10T12:37:31.014 INFO:tasks.workunit.client.1.vm07.stdout:5/134: chown d0/d22 362 1
2026-03-10T12:37:31.014 INFO:tasks.workunit.client.1.vm07.stdout:0/162: symlink d0/l33 0
2026-03-10T12:37:31.014 INFO:tasks.workunit.client.1.vm07.stdout:0/163: chown d0/d14/d1a/l2b 340438943 1
2026-03-10T12:37:31.015 INFO:tasks.workunit.client.1.vm07.stdout:3/128: rmdir dc 39
2026-03-10T12:37:31.016 INFO:tasks.workunit.client.1.vm07.stdout:9/73: mkdir d5/d16 0
2026-03-10T12:37:31.017 INFO:tasks.workunit.client.1.vm07.stdout:9/74: truncate d5/d13/f14 90869 0
2026-03-10T12:37:31.019 INFO:tasks.workunit.client.1.vm07.stdout:5/135: creat d0/d22/d18/d19/d21/f2d x:0 0 0
2026-03-10T12:37:31.019 INFO:tasks.workunit.client.1.vm07.stdout:2/70: dwrite d0/f1 [0,4194304] 0
2026-03-10T12:37:31.021 INFO:tasks.workunit.client.1.vm07.stdout:2/71: fsync d0/f17 0
2026-03-10T12:37:31.022 INFO:tasks.workunit.client.1.vm07.stdout:0/164: dread d0/f15 [0,4194304] 0
2026-03-10T12:37:31.023 INFO:tasks.workunit.client.1.vm07.stdout:3/129: truncate dc/dd/f21 804905 0
2026-03-10T12:37:31.028 INFO:tasks.workunit.client.1.vm07.stdout:9/75: dwrite d5/f8 [0,4194304] 0
2026-03-10T12:37:31.032 INFO:tasks.workunit.client.1.vm07.stdout:5/136: mkdir d0/d22/d18/d19/d2e 0
2026-03-10T12:37:31.038 INFO:tasks.workunit.client.1.vm07.stdout:2/72: chown d0/c7 844132 1
2026-03-10T12:37:31.038 INFO:tasks.workunit.client.1.vm07.stdout:0/165: symlink d0/l34 0
2026-03-10T12:37:31.038 INFO:tasks.workunit.client.1.vm07.stdout:2/73: dread d0/f18 [0,4194304] 0
2026-03-10T12:37:31.038 INFO:tasks.workunit.client.1.vm07.stdout:5/137: dwrite d0/f2b [0,4194304] 0
2026-03-10T12:37:31.038 INFO:tasks.workunit.client.1.vm07.stdout:1/144: sync
2026-03-10T12:37:31.040 INFO:tasks.workunit.client.1.vm07.stdout:2/74: chown d0/f15 493843 1
2026-03-10T12:37:31.044 INFO:tasks.workunit.client.1.vm07.stdout:3/130: creat dc/dd/f29 x:0 0 0
2026-03-10T12:37:31.055 INFO:tasks.workunit.client.1.vm07.stdout:2/75: chown d0/c16 6056 1
2026-03-10T12:37:31.055 INFO:tasks.workunit.client.1.vm07.stdout:5/138: creat d0/d22/d18/d19/d21/f2f x:0 0 0
2026-03-10T12:37:31.057 INFO:tasks.workunit.client.1.vm07.stdout:7/100: truncate d0/f13 347577 0
2026-03-10T12:37:31.058 INFO:tasks.workunit.client.1.vm07.stdout:3/131: symlink dc/dd/d1f/l2a 0
2026-03-10T12:37:31.058 INFO:tasks.workunit.client.1.vm07.stdout:7/101: stat d0/ld 0
2026-03-10T12:37:31.060 INFO:tasks.workunit.client.1.vm07.stdout:9/76: mknod d5/c17 0
2026-03-10T12:37:31.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:30 vm07.local ceph-mon[58582]: pgmap v150: 65 pgs: 65 active+clean; 234 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 9.2 MiB/s wr, 418 op/s
2026-03-10T12:37:31.074 INFO:tasks.workunit.client.1.vm07.stdout:7/102: dwrite d0/fc [4194304,4194304] 0
2026-03-10T12:37:31.080 INFO:tasks.workunit.client.1.vm07.stdout:6/111: rename d1/d9/f1f to d1/d4/d6/d16/d1a/f25 0
2026-03-10T12:37:31.086 INFO:tasks.workunit.client.1.vm07.stdout:5/139: mkdir d0/d22/d18/d30 0
2026-03-10T12:37:31.087 INFO:tasks.workunit.client.1.vm07.stdout:7/103: dwrite d0/f3 [0,4194304] 0
2026-03-10T12:37:31.089 INFO:tasks.workunit.client.1.vm07.stdout:5/140: dread - d0/d22/d18/d19/d21/f2f zero size
2026-03-10T12:37:31.097 INFO:tasks.workunit.client.1.vm07.stdout:1/145: creat d9/f2f x:0 0 0
2026-03-10T12:37:31.100 INFO:tasks.workunit.client.1.vm07.stdout:3/132: symlink dc/dd/d28/l2b 0
2026-03-10T12:37:31.100 INFO:tasks.workunit.client.1.vm07.stdout:8/133: dwrite d1/d3/f1d [0,4194304] 0
2026-03-10T12:37:31.109 INFO:tasks.workunit.client.1.vm07.stdout:5/141: unlink d0/d22/c26 0
2026-03-10T12:37:31.112 INFO:tasks.workunit.client.1.vm07.stdout:2/76: creat d0/f1d x:0 0 0
2026-03-10T12:37:31.116 INFO:tasks.workunit.client.1.vm07.stdout:4/112: dwrite d0/d4/d10/d18/f1a [0,4194304] 0
2026-03-10T12:37:31.118 INFO:tasks.workunit.client.1.vm07.stdout:9/77: mkdir d5/d16/d18 0
2026-03-10T12:37:31.119 INFO:tasks.workunit.client.1.vm07.stdout:9/78: truncate d5/d13/f14 832186 0
2026-03-10T12:37:31.121 INFO:tasks.workunit.client.1.vm07.stdout:4/113: write d0/d4/d10/d18/f1a [3403860,54608] 0
2026-03-10T12:37:31.122 INFO:tasks.workunit.client.1.vm07.stdout:2/77: dwrite d0/f1d [0,4194304] 0
2026-03-10T12:37:31.126 INFO:tasks.workunit.client.1.vm07.stdout:1/146: read d9/fb [2949826,56550] 0
2026-03-10T12:37:31.131 INFO:tasks.workunit.client.1.vm07.stdout:8/134: write d1/fc [1976937,102606] 0
2026-03-10T12:37:31.138 INFO:tasks.workunit.client.1.vm07.stdout:9/79: creat d5/d16/f19 x:0 0 0
2026-03-10T12:37:31.139 INFO:tasks.workunit.client.1.vm07.stdout:0/166: link d0/d14/d1a/d1b/l29 d0/d14/l35 0
2026-03-10T12:37:31.143 INFO:tasks.workunit.client.1.vm07.stdout:4/114: dwrite d0/d4/d5/da/f15 [4194304,4194304] 0
2026-03-10T12:37:31.152 INFO:tasks.workunit.client.1.vm07.stdout:2/78: rename d0/f6 to d0/d19/f1e 0
2026-03-10T12:37:31.152 INFO:tasks.workunit.client.1.vm07.stdout:2/79: dread d0/f1 [0,4194304] 0
2026-03-10T12:37:31.153 INFO:tasks.workunit.client.1.vm07.stdout:7/104: link d0/c15 d0/c17 0
2026-03-10T12:37:31.153 INFO:tasks.workunit.client.1.vm07.stdout:2/80: read - d0/d19/f1a zero size
2026-03-10T12:37:31.154 INFO:tasks.workunit.client.1.vm07.stdout:7/105: chown d0/c11 23648154 1
2026-03-10T12:37:31.154 INFO:tasks.workunit.client.1.vm07.stdout:7/106: readlink d0/ld 0
2026-03-10T12:37:31.155 INFO:tasks.workunit.client.1.vm07.stdout:3/133: link dc/dd/f29 dc/d18/d24/f2c 0
2026-03-10T12:37:31.155 INFO:tasks.workunit.client.1.vm07.stdout:8/135: mknod d1/d3/d6/c30 0
2026-03-10T12:37:31.156 INFO:tasks.workunit.client.1.vm07.stdout:0/167: write d0/d14/d1a/f24 [1633433,58710] 0
2026-03-10T12:37:31.156 INFO:tasks.workunit.client.1.vm07.stdout:7/107: chown d0/f16 1985323042 1
2026-03-10T12:37:31.158 INFO:tasks.workunit.client.1.vm07.stdout:4/115: mknod d0/d4/d10/d18/c28 0
2026-03-10T12:37:31.158 INFO:tasks.workunit.client.1.vm07.stdout:4/116: stat d0/d4/d5/c13 0
2026-03-10T12:37:31.159 INFO:tasks.workunit.client.1.vm07.stdout:4/117: write d0/d4/d10/d18/f1a [998237,83991] 0
2026-03-10T12:37:31.160 INFO:tasks.workunit.client.1.vm07.stdout:4/118: stat d0/d4/d5/da 0
2026-03-10T12:37:31.161 INFO:tasks.workunit.client.1.vm07.stdout:2/81: mkdir d0/d19/d1f 0
2026-03-10T12:37:31.162 INFO:tasks.workunit.client.1.vm07.stdout:2/82: dread - d0/f17 zero size
2026-03-10T12:37:31.167 INFO:tasks.workunit.client.1.vm07.stdout:3/134: mkdir dc/d18/d2d 0
2026-03-10T12:37:31.177 INFO:tasks.workunit.client.1.vm07.stdout:7/108: mknod d0/c18 0
2026-03-10T12:37:31.177 INFO:tasks.workunit.client.1.vm07.stdout:8/136: rename d1/d3/d6/l26 to d1/d3/d18/l31 0
2026-03-10T12:37:31.177 INFO:tasks.workunit.client.1.vm07.stdout:3/135: dwrite dc/dd/f20 [0,4194304] 0
2026-03-10T12:37:31.177 INFO:tasks.workunit.client.1.vm07.stdout:8/137: stat d1/d3/f25 0
2026-03-10T12:37:31.180 INFO:tasks.workunit.client.1.vm07.stdout:3/136: dwrite dc/f17 [0,4194304] 0
2026-03-10T12:37:31.180 INFO:tasks.workunit.client.1.vm07.stdout:9/80: sync
2026-03-10T12:37:31.181 INFO:tasks.workunit.client.1.vm07.stdout:9/81: fdatasync d5/d13/f14 0
2026-03-10T12:37:31.185 INFO:tasks.workunit.client.1.vm07.stdout:5/142: sync
2026-03-10T12:37:31.202 INFO:tasks.workunit.client.1.vm07.stdout:1/147: mkdir d9/df/d29/d2b/d30 0
2026-03-10T12:37:31.205 INFO:tasks.workunit.client.1.vm07.stdout:1/148: chown d9/df/f13 33095940 1
2026-03-10T12:37:31.205 INFO:tasks.workunit.client.1.vm07.stdout:1/149: truncate d9/f1b 551013 0
2026-03-10T12:37:31.205 INFO:tasks.workunit.client.1.vm07.stdout:1/150: fdatasync d9/fc 0
2026-03-10T12:37:31.205 INFO:tasks.workunit.client.1.vm07.stdout:1/151: chown d9/df/f24 15928 1
2026-03-10T12:37:31.206 INFO:tasks.workunit.client.1.vm07.stdout:8/138: creat d1/d3/d18/f32 x:0 0 0
2026-03-10T12:37:31.206 INFO:tasks.workunit.client.1.vm07.stdout:3/137: mknod dc/d18/c2e 0
2026-03-10T12:37:31.208 INFO:tasks.workunit.client.1.vm07.stdout:7/109: dread d0/f10 [0,4194304] 0
2026-03-10T12:37:31.211 INFO:tasks.workunit.client.1.vm07.stdout:0/168: creat d0/d14/f36 x:0 0 0
2026-03-10T12:37:31.213 INFO:tasks.workunit.client.1.vm07.stdout:2/83: mkdir d0/d19/d1f/d20 0
2026-03-10T12:37:31.214 INFO:tasks.workunit.client.1.vm07.stdout:1/152: mkdir d9/df/d29/d2b/d31 0
2026-03-10T12:37:31.232 INFO:tasks.workunit.client.1.vm07.stdout:7/110: dread d0/f16 [4194304,4194304] 0
2026-03-10T12:37:31.232 INFO:tasks.workunit.client.1.vm07.stdout:1/153: write d9/f1a [564474,26942] 0
2026-03-10T12:37:31.232 INFO:tasks.workunit.client.1.vm07.stdout:5/143: mknod d0/d22/d18/d30/c31 0
2026-03-10T12:37:31.232 INFO:tasks.workunit.client.1.vm07.stdout:1/154: dread d9/f19 [0,4194304] 0
2026-03-10T12:37:31.232 INFO:tasks.workunit.client.1.vm07.stdout:5/144: dwrite d0/ff [0,4194304] 0
2026-03-10T12:37:31.232 INFO:tasks.workunit.client.1.vm07.stdout:1/155: fsync d9/df/f10 0
2026-03-10T12:37:31.232 INFO:tasks.workunit.client.1.vm07.stdout:8/139: symlink d1/d3/d18/l33 0
2026-03-10T12:37:31.232 INFO:tasks.workunit.client.1.vm07.stdout:3/138: creat dc/dd/d1f/f2f x:0 0 0
2026-03-10T12:37:31.232 INFO:tasks.workunit.client.1.vm07.stdout:8/140: dread d1/d3/f1d [0,4194304] 0
2026-03-10T12:37:31.232 INFO:tasks.workunit.client.1.vm07.stdout:8/141: fsync d1/f19 0
2026-03-10T12:37:31.232 INFO:tasks.workunit.client.1.vm07.stdout:5/145: dread - d0/d22/f27 zero size
2026-03-10T12:37:31.232 INFO:tasks.workunit.client.1.vm07.stdout:3/139: dwrite dc/f10 [0,4194304] 0
2026-03-10T12:37:31.232 INFO:tasks.workunit.client.1.vm07.stdout:7/111: rename d0/ld to d0/l19 0
2026-03-10T12:37:31.232 INFO:tasks.workunit.client.1.vm07.stdout:4/119: link d0/d4/d5/c12 d0/c29 0
2026-03-10T12:37:31.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:30 vm00.local ceph-mon[50686]: pgmap v150: 65 pgs: 65 active+clean; 234 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 9.2 MiB/s wr, 418 op/s
2026-03-10T12:37:31.235 INFO:tasks.workunit.client.1.vm07.stdout:4/120: dread - d0/d19/d1f/f22 zero size
2026-03-10T12:37:31.238 INFO:tasks.workunit.client.1.vm07.stdout:4/121: stat d0/d4/d10/d23/f27 0
2026-03-10T12:37:31.238 INFO:tasks.workunit.client.1.vm07.stdout:1/156: creat d9/df/d29/d2b/f32 x:0 0 0
2026-03-10T12:37:31.240 INFO:tasks.workunit.client.1.vm07.stdout:3/140: creat dc/dd/d1f/f30 x:0 0 0
2026-03-10T12:37:31.251 INFO:tasks.workunit.client.1.vm07.stdout:3/141: write dc/dd/f21 [746258,129420] 0
2026-03-10T12:37:31.251 INFO:tasks.workunit.client.1.vm07.stdout:1/157: unlink f8 0
2026-03-10T12:37:31.251 INFO:tasks.workunit.client.1.vm07.stdout:5/146: symlink d0/l32 0
2026-03-10T12:37:31.251 INFO:tasks.workunit.client.1.vm07.stdout:7/112: dwrite d0/f16 [4194304,4194304] 0
2026-03-10T12:37:31.253 INFO:tasks.workunit.client.1.vm07.stdout:5/147: truncate d0/d22/d18/d19/d21/f2f 706329 0
2026-03-10T12:37:31.256 INFO:tasks.workunit.client.1.vm07.stdout:4/122: mknod d0/d4/d5/c2a 0
2026-03-10T12:37:31.259 INFO:tasks.workunit.client.1.vm07.stdout:3/142: mknod dc/dd/c31 0
2026-03-10T12:37:31.268 INFO:tasks.workunit.client.1.vm07.stdout:3/143: dread - dc/dd/f1d zero size
2026-03-10T12:37:31.268 INFO:tasks.workunit.client.1.vm07.stdout:0/169: rename d0/f16 to d0/d14/f37 0
2026-03-10T12:37:31.269 INFO:tasks.workunit.client.1.vm07.stdout:8/142: rename d1/d3/d18 to d1/d3/d18/d34 22
2026-03-10T12:37:31.269 INFO:tasks.workunit.client.1.vm07.stdout:0/170: dread d0/f2e [0,4194304] 0
2026-03-10T12:37:31.269 INFO:tasks.workunit.client.1.vm07.stdout:7/113: unlink d0/fa 0
2026-03-10T12:37:31.279 INFO:tasks.workunit.client.1.vm07.stdout:3/144: unlink dc/f10 0
2026-03-10T12:37:31.280 INFO:tasks.workunit.client.1.vm07.stdout:8/143: unlink d1/d3/d6/l28 0
2026-03-10T12:37:31.281 INFO:tasks.workunit.client.1.vm07.stdout:8/144: write d1/d3/f25 [23866,58525] 0
2026-03-10T12:37:31.289 INFO:tasks.workunit.client.1.vm07.stdout:5/148: link d0/d22/d18/d19/f2c d0/d22/d18/d30/f33 0
2026-03-10T12:37:31.294 INFO:tasks.workunit.client.1.vm07.stdout:7/114: mknod d0/c1a 0
2026-03-10T12:37:31.299 INFO:tasks.workunit.client.1.vm07.stdout:7/115: dwrite d0/fc [0,4194304] 0
2026-03-10T12:37:31.302 INFO:tasks.workunit.client.1.vm07.stdout:7/116: write d0/fc [7552321,89925] 0
2026-03-10T12:37:31.304 INFO:tasks.workunit.client.1.vm07.stdout:4/123: sync
2026-03-10T12:37:31.310 INFO:tasks.workunit.client.1.vm07.stdout:0/171: mkdir d0/d14/d1a/d38 0
2026-03-10T12:37:31.322 INFO:tasks.workunit.client.1.vm07.stdout:4/124: write d0/d4/d10/d18/f1a [4839335,5726] 0
2026-03-10T12:37:31.322 INFO:tasks.workunit.client.1.vm07.stdout:4/125: chown d0/d4/d10/d18 514378 1
2026-03-10T12:37:31.322 INFO:tasks.workunit.client.1.vm07.stdout:4/126: write d0/d4/d5/da/f15 [2716166,74760] 0
2026-03-10T12:37:31.322 INFO:tasks.workunit.client.1.vm07.stdout:0/172: dread d0/d14/f19 [0,4194304] 0
2026-03-10T12:37:31.322 INFO:tasks.workunit.client.1.vm07.stdout:5/149: stat d0/f12 0
2026-03-10T12:37:31.322 INFO:tasks.workunit.client.1.vm07.stdout:5/150: chown d0/f2b 912454 1
2026-03-10T12:37:31.322 INFO:tasks.workunit.client.1.vm07.stdout:7/117: fdatasync d0/f13 0
2026-03-10T12:37:31.322 INFO:tasks.workunit.client.1.vm07.stdout:8/145: link d1/d3/f8 d1/d3/d11/f35 0
2026-03-10T12:37:31.325 INFO:tasks.workunit.client.1.vm07.stdout:5/151: dwrite d0/f1e [0,4194304] 0
2026-03-10T12:37:31.326 INFO:tasks.workunit.client.1.vm07.stdout:5/152: dread - d0/d22/d18/d19/f23 zero size
2026-03-10T12:37:31.330 INFO:tasks.workunit.client.1.vm07.stdout:7/118: symlink d0/l1b 0
2026-03-10T12:37:31.330 INFO:tasks.workunit.client.1.vm07.stdout:0/173: mknod d0/d14/d1a/d2f/d31/c39 0
2026-03-10T12:37:31.331 INFO:tasks.workunit.client.1.vm07.stdout:7/119: write d0/fc [2552833,68241] 0
2026-03-10T12:37:31.331 INFO:tasks.workunit.client.1.vm07.stdout:0/174: chown d0/l26 388 1
2026-03-10T12:37:31.331 INFO:tasks.workunit.client.1.vm07.stdout:5/153: creat d0/d22/d18/d30/f34 x:0 0 0
2026-03-10T12:37:31.339 INFO:tasks.workunit.client.1.vm07.stdout:5/154: dwrite d0/f13 [0,4194304] 0
2026-03-10T12:37:31.339 INFO:tasks.workunit.client.1.vm07.stdout:5/155: readlink d0/l25 0
2026-03-10T12:37:31.340 INFO:tasks.workunit.client.1.vm07.stdout:0/175: symlink d0/d14/d1a/d2f/d31/l3a 0
2026-03-10T12:37:31.347 INFO:tasks.workunit.client.1.vm07.stdout:9/82: dread d5/fa [4194304,4194304] 0
2026-03-10T12:37:31.348 INFO:tasks.workunit.client.1.vm07.stdout:9/83: chown d5/lf 14 1
2026-03-10T12:37:31.349 INFO:tasks.workunit.client.1.vm07.stdout:9/84: dread d5/fb [4194304,4194304] 0
2026-03-10T12:37:31.357 INFO:tasks.workunit.client.1.vm07.stdout:8/146: link d1/d3/f29 d1/f36 0
2026-03-10T12:37:31.366 INFO:tasks.workunit.client.1.vm07.stdout:9/85: creat d5/f1a x:0 0 0
2026-03-10T12:37:31.368 INFO:tasks.workunit.client.1.vm07.stdout:9/86: creat d5/d13/f1b x:0 0 0
2026-03-10T12:37:31.373 INFO:tasks.workunit.client.1.vm07.stdout:9/87: readlink d5/lf 0
2026-03-10T12:37:31.373 INFO:tasks.workunit.client.1.vm07.stdout:9/88: write d5/d13/f1b [416285,23992] 0
2026-03-10T12:37:31.373 INFO:tasks.workunit.client.1.vm07.stdout:9/89: fsync d5/f8 0
2026-03-10T12:37:31.373 INFO:tasks.workunit.client.1.vm07.stdout:9/90: chown d5/d13/f1b 95 1
2026-03-10T12:37:31.373 INFO:tasks.workunit.client.1.vm07.stdout:8/147: dwrite d1/d3/f8 [0,4194304] 0
2026-03-10T12:37:31.380 INFO:tasks.workunit.client.1.vm07.stdout:9/91: dread d5/f8 [0,4194304] 0
2026-03-10T12:37:31.382 INFO:tasks.workunit.client.1.vm07.stdout:8/148: dwrite d1/d3/d11/f15 [0,4194304] 0
2026-03-10T12:37:31.391 INFO:tasks.workunit.client.1.vm07.stdout:8/149: dwrite d1/d3/d11/f35 [0,4194304] 0
2026-03-10T12:37:31.404 INFO:tasks.workunit.client.1.vm07.stdout:8/150: link d1/d3/l14 d1/l37 0
2026-03-10T12:37:31.407 INFO:tasks.workunit.client.1.vm07.stdout:8/151: creat d1/d3/d18/f38 x:0 0 0
2026-03-10T12:37:31.417 INFO:tasks.workunit.client.1.vm07.stdout:8/152: chown d1/d3/ff 2009401110 1
2026-03-10T12:37:31.455 INFO:tasks.workunit.client.1.vm07.stdout:9/92: sync
2026-03-10T12:37:31.455 INFO:tasks.workunit.client.1.vm07.stdout:8/153: sync
2026-03-10T12:37:31.458 INFO:tasks.workunit.client.1.vm07.stdout:8/154: unlink d1/d3/f2f 0
2026-03-10T12:37:31.458 INFO:tasks.workunit.client.1.vm07.stdout:9/93: write d5/fa [4276763,130388] 0
2026-03-10T12:37:31.462 INFO:tasks.workunit.client.1.vm07.stdout:8/155: symlink d1/d3/d11/l39 0
2026-03-10T12:37:31.463 INFO:tasks.workunit.client.1.vm07.stdout:5/156: getdents d0/d22/d18/d19/d21 0
2026-03-10T12:37:31.463 INFO:tasks.workunit.client.1.vm07.stdout:8/156: truncate d1/d3/d18/f38 218126 0
2026-03-10T12:37:31.464 INFO:tasks.workunit.client.1.vm07.stdout:8/157: read - d1/d3/d6/f2c zero size
2026-03-10T12:37:31.465 INFO:tasks.workunit.client.1.vm07.stdout:8/158: readlink d1/d3/d11/l39 0
2026-03-10T12:37:31.467 INFO:tasks.workunit.client.1.vm07.stdout:6/112: unlink d1/d4/d6/d16/d1a/f25 0
2026-03-10T12:37:31.470 INFO:tasks.workunit.client.1.vm07.stdout:9/94: getdents d5/d16/d18 0
2026-03-10T12:37:31.473 INFO:tasks.workunit.client.1.vm07.stdout:5/157: rename d0/f12 to d0/d22/d18/d30/f35 0
2026-03-10T12:37:31.477 INFO:tasks.workunit.client.1.vm07.stdout:8/159: unlink d1/d3/d6/f2c 0
2026-03-10T12:37:31.478 INFO:tasks.workunit.client.1.vm07.stdout:8/160: read d1/f19 [502150,49327] 0
2026-03-10T12:37:31.478 INFO:tasks.workunit.client.1.vm07.stdout:8/161: truncate d1/d3/d6/f24 348350 0
2026-03-10T12:37:31.479 INFO:tasks.workunit.client.1.vm07.stdout:8/162: chown d1/d3/f25 127216 1
2026-03-10T12:37:31.484 INFO:tasks.workunit.client.1.vm07.stdout:5/158: mkdir d0/d22/d18/d19/d36 0
2026-03-10T12:37:31.488 INFO:tasks.workunit.client.1.vm07.stdout:4/127: dread d0/d4/d10/d18/f1a [4194304,4194304] 0
2026-03-10T12:37:31.488 INFO:tasks.workunit.client.1.vm07.stdout:8/163: mkdir d1/d3/d18/d3a 0
2026-03-10T12:37:31.492 INFO:tasks.workunit.client.1.vm07.stdout:4/128: dwrite d0/d4/d10/d23/f27 [0,4194304] 0
2026-03-10T12:37:31.498 INFO:tasks.workunit.client.1.vm07.stdout:5/159: creat d0/d22/d18/d19/d21/f37 x:0 0 0
2026-03-10T12:37:31.504 INFO:tasks.workunit.client.1.vm07.stdout:4/129: dwrite d0/d4/d5/da/f15 [0,4194304] 0
2026-03-10T12:37:31.515 INFO:tasks.workunit.client.1.vm07.stdout:4/130: write d0/d4/d10/d18/f1a [4475651,107707] 0
2026-03-10T12:37:31.516 INFO:tasks.workunit.client.1.vm07.stdout:4/131: dread d0/d4/d10/f16 [0,4194304] 0
2026-03-10T12:37:31.521 INFO:tasks.workunit.client.1.vm07.stdout:5/160: link d0/f2b d0/d22/d18/d19/d21/f38 0
2026-03-10T12:37:31.524 INFO:tasks.workunit.client.1.vm07.stdout:2/84: dread d0/d19/f1e [0,4194304] 0
2026-03-10T12:37:31.525 INFO:tasks.workunit.client.1.vm07.stdout:2/85: stat d0/l5 0
2026-03-10T12:37:31.531 INFO:tasks.workunit.client.1.vm07.stdout:4/132: mkdir d0/d19/d1f/d2b 0
2026-03-10T12:37:31.532 INFO:tasks.workunit.client.1.vm07.stdout:4/133: dread d0/d4/d10/f16 [0,4194304] 0
2026-03-10T12:37:31.534 INFO:tasks.workunit.client.1.vm07.stdout:8/164: fsync d1/d3/d18/f32 0
2026-03-10T12:37:31.535 INFO:tasks.workunit.client.1.vm07.stdout:8/165: chown d1/d3/f29 3 1
2026-03-10T12:37:31.545 INFO:tasks.workunit.client.1.vm07.stdout:3/145: rmdir dc/dd/d1f 39
2026-03-10T12:37:31.548 INFO:tasks.workunit.client.1.vm07.stdout:4/134: mknod d0/d19/d1f/c2c 0
2026-03-10T12:37:31.549 INFO:tasks.workunit.client.1.vm07.stdout:8/166: stat d1/fc 0
2026-03-10T12:37:31.554 INFO:tasks.workunit.client.1.vm07.stdout:7/120: rename d0/l19 to d0/l1c 0
2026-03-10T12:37:31.557 INFO:tasks.workunit.client.1.vm07.stdout:5/161: mknod d0/d22/d18/d19/d2e/c39 0
2026-03-10T12:37:31.560 INFO:tasks.workunit.client.1.vm07.stdout:5/162: dwrite d0/d22/d18/d19/f23 [0,4194304] 0
2026-03-10T12:37:31.564 INFO:tasks.workunit.client.1.vm07.stdout:1/158: truncate f6 3261348 0
2026-03-10T12:37:31.568 INFO:tasks.workunit.client.1.vm07.stdout:8/167: readlink d1/d3/l14 0
2026-03-10T12:37:31.584 INFO:tasks.workunit.client.1.vm07.stdout:5/163: unlink d0/d22/d18/d30/f34 0
2026-03-10T12:37:31.586 INFO:tasks.workunit.client.1.vm07.stdout:9/95: dread d5/d13/f1b [0,4194304] 0
2026-03-10T12:37:31.586 INFO:tasks.workunit.client.1.vm07.stdout:9/96: chown d5/f1a 11 1
2026-03-10T12:37:31.597 INFO:tasks.workunit.client.1.vm07.stdout:3/146: mknod dc/dd/d1f/c32 0
2026-03-10T12:37:31.597 INFO:tasks.workunit.client.1.vm07.stdout:3/147: dread - dc/d18/d24/f2c zero size
2026-03-10T12:37:31.613 INFO:tasks.workunit.client.1.vm07.stdout:4/135: mkdir d0/d19/d1f/d2b/d2d 0
2026-03-10T12:37:31.629 INFO:tasks.workunit.client.1.vm07.stdout:0/176: getdents d0/d14/d1a/d2f/d31 0
2026-03-10T12:37:31.645 INFO:tasks.workunit.client.1.vm07.stdout:1/159: mknod d9/df/d29/d2c/c33 0
2026-03-10T12:37:31.645 INFO:tasks.workunit.client.1.vm07.stdout:1/160: chown c4 134 1
2026-03-10T12:37:31.649 INFO:tasks.workunit.client.1.vm07.stdout:4/136: rmdir d0/d4/d10/d23 39
2026-03-10T12:37:31.650 INFO:tasks.workunit.client.1.vm07.stdout:4/137: dread d0/d4/d10/d18/f1a [4194304,4194304] 0
2026-03-10T12:37:31.668 INFO:tasks.workunit.client.1.vm07.stdout:0/177: mkdir d0/d14/d1a/d1b/d3b 0
2026-03-10T12:37:31.668 INFO:tasks.workunit.client.1.vm07.stdout:0/178: dread - d0/d14/d1a/f30 zero size
2026-03-10T12:37:31.674 INFO:tasks.workunit.client.1.vm07.stdout:3/148: mknod dc/d18/d24/c33 0
2026-03-10T12:37:31.699 INFO:tasks.workunit.client.1.vm07.stdout:9/97: rename d5/fa to d5/f1c 0
2026-03-10T12:37:31.699 INFO:tasks.workunit.client.1.vm07.stdout:9/98: fsync d5/f1a 0
2026-03-10T12:37:31.701 INFO:tasks.workunit.client.1.vm07.stdout:6/113: truncate d1/f17 9429 0
2026-03-10T12:37:31.701 INFO:tasks.workunit.client.1.vm07.stdout:6/114: fdatasync d1/f1e 0
2026-03-10T12:37:31.702 INFO:tasks.workunit.client.1.vm07.stdout:6/115: truncate d1/d9/f22 1038480 0
2026-03-10T12:37:31.702 INFO:tasks.workunit.client.1.vm07.stdout:6/116: chown d1/d4/f19 850121525 1
2026-03-10T12:37:31.703 INFO:tasks.workunit.client.1.vm07.stdout:1/161: symlink d9/df/d29/d2b/d30/l34 0
2026-03-10T12:37:31.703 INFO:tasks.workunit.client.1.vm07.stdout:1/162: write d9/f1b [1482034,23550] 0
2026-03-10T12:37:31.707 INFO:tasks.workunit.client.1.vm07.stdout:8/168: getdents d1/d3 0
2026-03-10T12:37:31.707 INFO:tasks.workunit.client.1.vm07.stdout:8/169: truncate d1/d3/f25 1015547 0
2026-03-10T12:37:31.708 INFO:tasks.workunit.client.1.vm07.stdout:8/170: write d1/fc [2253651,66692] 0
2026-03-10T12:37:31.713 INFO:tasks.workunit.client.1.vm07.stdout:4/138: creat d0/d4/d10/d23/f2e x:0 0 0
2026-03-10T12:37:31.717 INFO:tasks.workunit.client.1.vm07.stdout:4/139: dwrite d0/d4/d5/da/f15 [8388608,4194304] 0
2026-03-10T12:37:31.723 INFO:tasks.workunit.client.1.vm07.stdout:4/140: dwrite d0/d19/f25 [0,4194304] 0
2026-03-10T12:37:31.741 INFO:tasks.workunit.client.1.vm07.stdout:2/86: write d0/f18 [986667,25142] 0
2026-03-10T12:37:31.744 INFO:tasks.workunit.client.1.vm07.stdout:2/87: dwrite d0/f1c [4194304,4194304] 0
2026-03-10T12:37:31.745 INFO:tasks.workunit.client.1.vm07.stdout:2/88: write d0/d19/f1b [97712,37663] 0
2026-03-10T12:37:31.758 INFO:tasks.workunit.client.1.vm07.stdout:7/121: getdents d0 0
2026-03-10T12:37:31.759 INFO:tasks.workunit.client.1.vm07.stdout:8/171: fdatasync d1/f7 0
2026-03-10T12:37:31.771
INFO:tasks.workunit.client.1.vm07.stdout:4/141: fsync d0/d4/d10/d18/f1a 0 2026-03-10T12:37:31.772 INFO:tasks.workunit.client.1.vm07.stdout:4/142: write d0/d19/d1f/f22 [388443,128212] 0 2026-03-10T12:37:31.773 INFO:tasks.workunit.client.1.vm07.stdout:6/117: chown d1/f17 15196 1 2026-03-10T12:37:31.781 INFO:tasks.workunit.client.1.vm07.stdout:7/122: readlink d0/l1c 0 2026-03-10T12:37:31.781 INFO:tasks.workunit.client.1.vm07.stdout:7/123: stat d0/c11 0 2026-03-10T12:37:31.784 INFO:tasks.workunit.client.1.vm07.stdout:7/124: dwrite d0/f16 [0,4194304] 0 2026-03-10T12:37:31.785 INFO:tasks.workunit.client.1.vm07.stdout:7/125: fsync d0/f3 0 2026-03-10T12:37:31.787 INFO:tasks.workunit.client.1.vm07.stdout:7/126: dread d0/f16 [0,4194304] 0 2026-03-10T12:37:31.798 INFO:tasks.workunit.client.1.vm07.stdout:8/172: mknod d1/d3/d11/c3b 0 2026-03-10T12:37:31.802 INFO:tasks.workunit.client.1.vm07.stdout:0/179: link d0/d14/d1a/d1b/l29 d0/d14/d1a/d2f/d31/l3c 0 2026-03-10T12:37:31.803 INFO:tasks.workunit.client.1.vm07.stdout:0/180: write d0/d14/f36 [176530,61101] 0 2026-03-10T12:37:31.805 INFO:tasks.workunit.client.1.vm07.stdout:6/118: chown d1/d4/d6/cd 9 1 2026-03-10T12:37:31.807 INFO:tasks.workunit.client.1.vm07.stdout:2/89: unlink d0/le 0 2026-03-10T12:37:31.809 INFO:tasks.workunit.client.1.vm07.stdout:2/90: write d0/f12 [84528,103954] 0 2026-03-10T12:37:31.817 INFO:tasks.workunit.client.1.vm07.stdout:7/127: creat d0/f1d x:0 0 0 2026-03-10T12:37:31.817 INFO:tasks.workunit.client.1.vm07.stdout:7/128: write d0/f14 [424696,9442] 0 2026-03-10T12:37:31.818 INFO:tasks.workunit.client.1.vm07.stdout:7/129: write d0/fc [5143565,66732] 0 2026-03-10T12:37:31.820 INFO:tasks.workunit.client.1.vm07.stdout:7/130: dread d0/fc [0,4194304] 0 2026-03-10T12:37:31.838 INFO:tasks.workunit.client.1.vm07.stdout:2/91: symlink d0/d19/l21 0 2026-03-10T12:37:31.854 INFO:tasks.workunit.client.1.vm07.stdout:8/173: unlink d1/d3/l14 0 2026-03-10T12:37:31.855 INFO:tasks.workunit.client.1.vm07.stdout:8/174: stat 
d1/d3/ff 0 2026-03-10T12:37:31.857 INFO:tasks.workunit.client.1.vm07.stdout:4/143: link d0/d4/d10/d18/f1a d0/d4/d10/f2f 0 2026-03-10T12:37:31.859 INFO:tasks.workunit.client.1.vm07.stdout:4/144: write d0/d19/f25 [3540758,130706] 0 2026-03-10T12:37:31.862 INFO:tasks.workunit.client.1.vm07.stdout:7/131: dwrite d0/f10 [0,4194304] 0 2026-03-10T12:37:31.866 INFO:tasks.workunit.client.1.vm07.stdout:7/132: dread d0/f13 [0,4194304] 0 2026-03-10T12:37:31.866 INFO:tasks.workunit.client.1.vm07.stdout:7/133: chown d0/c11 166 1 2026-03-10T12:37:31.868 INFO:tasks.workunit.client.1.vm07.stdout:8/175: creat d1/d3/d11/f3c x:0 0 0 2026-03-10T12:37:31.869 INFO:tasks.workunit.client.1.vm07.stdout:0/181: creat d0/d14/d1a/f3d x:0 0 0 2026-03-10T12:37:31.869 INFO:tasks.workunit.client.1.vm07.stdout:0/182: write d0/f1d [561219,66237] 0 2026-03-10T12:37:31.874 INFO:tasks.workunit.client.1.vm07.stdout:4/145: rename d0/d19/d1f/c2c to d0/d19/d1f/d2b/d2d/c30 0 2026-03-10T12:37:31.875 INFO:tasks.workunit.client.1.vm07.stdout:4/146: read d0/d4/d5/da/f15 [1136329,86514] 0 2026-03-10T12:37:31.877 INFO:tasks.workunit.client.1.vm07.stdout:0/183: dwrite d0/d14/f19 [0,4194304] 0 2026-03-10T12:37:31.878 INFO:tasks.workunit.client.1.vm07.stdout:0/184: dread - d0/d14/d1a/f30 zero size 2026-03-10T12:37:31.878 INFO:tasks.workunit.client.1.vm07.stdout:0/185: chown d0/f1d 7 1 2026-03-10T12:37:31.881 INFO:tasks.workunit.client.1.vm07.stdout:4/147: dread d0/d4/d10/d23/f27 [0,4194304] 0 2026-03-10T12:37:31.881 INFO:tasks.workunit.client.1.vm07.stdout:4/148: fsync d0/d4/d10/d23/f27 0 2026-03-10T12:37:31.885 INFO:tasks.workunit.client.1.vm07.stdout:4/149: dwrite d0/d4/d10/f16 [0,4194304] 0 2026-03-10T12:37:31.900 INFO:tasks.workunit.client.1.vm07.stdout:0/186: unlink d0/l8 0 2026-03-10T12:37:31.901 INFO:tasks.workunit.client.1.vm07.stdout:4/150: write d0/d4/d10/d18/f1a [4974120,61237] 0 2026-03-10T12:37:31.902 INFO:tasks.workunit.client.1.vm07.stdout:4/151: readlink d0/d4/d10/d23/ld 0 2026-03-10T12:37:31.902 
INFO:tasks.workunit.client.1.vm07.stdout:4/152: stat d0/d4/d5/da/l1b 0 2026-03-10T12:37:31.911 INFO:tasks.workunit.client.1.vm07.stdout:8/176: truncate d1/f2 1826930 0 2026-03-10T12:37:31.911 INFO:tasks.workunit.client.1.vm07.stdout:8/177: fsync d1/d3/f8 0 2026-03-10T12:37:31.912 INFO:tasks.workunit.client.1.vm07.stdout:4/153: symlink d0/d4/d10/l31 0 2026-03-10T12:37:31.921 INFO:tasks.workunit.client.1.vm07.stdout:5/164: truncate d0/f1e 2489997 0 2026-03-10T12:37:31.926 INFO:tasks.workunit.client.1.vm07.stdout:5/165: dwrite d0/d22/d18/d19/f23 [0,4194304] 0 2026-03-10T12:37:31.936 INFO:tasks.workunit.client.0.vm00.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress 2026-03-10T12:37:31.942 INFO:tasks.workunit.client.1.vm07.stdout:4/154: link d0/d4/d5/l20 d0/d4/d5/da/l32 0 2026-03-10T12:37:31.945 INFO:tasks.workunit.client.1.vm07.stdout:4/155: dwrite d0/d19/f25 [0,4194304] 0 2026-03-10T12:37:31.963 INFO:tasks.workunit.client.1.vm07.stdout:1/163: rmdir d9/df/d29/d2b 39 2026-03-10T12:37:31.966 INFO:tasks.workunit.client.1.vm07.stdout:4/156: creat d0/f33 x:0 0 0 2026-03-10T12:37:31.980 INFO:tasks.workunit.client.0.vm00.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-10T12:37:31.980 INFO:tasks.workunit.client.0.vm00.stderr:+ make 2026-03-10T12:37:32.043 INFO:tasks.workunit.client.1.vm07.stdout:4/157: sync 2026-03-10T12:37:32.055 INFO:tasks.workunit.client.1.vm07.stdout:7/134: fsync d0/f1d 0 2026-03-10T12:37:32.056 INFO:tasks.workunit.client.1.vm07.stdout:2/92: getdents d0/d19 0 2026-03-10T12:37:32.057 INFO:tasks.workunit.client.1.vm07.stdout:7/135: creat d0/f1e x:0 0 0 2026-03-10T12:37:32.057 INFO:tasks.workunit.client.1.vm07.stdout:4/158: sync 2026-03-10T12:37:32.059 INFO:tasks.workunit.client.1.vm07.stdout:2/93: rename d0/d19/f1a to d0/d19/f22 0 2026-03-10T12:37:32.062 INFO:tasks.workunit.client.1.vm07.stdout:2/94: dwrite d0/f15 [0,4194304] 
0 2026-03-10T12:37:32.063 INFO:tasks.workunit.client.1.vm07.stdout:2/95: dread - d0/d19/f22 zero size 2026-03-10T12:37:32.063 INFO:tasks.workunit.client.1.vm07.stdout:2/96: fsync d0/f12 0 2026-03-10T12:37:32.068 INFO:tasks.workunit.client.1.vm07.stdout:7/136: dwrite d0/f16 [4194304,4194304] 0 2026-03-10T12:37:32.072 INFO:tasks.workunit.client.1.vm07.stdout:9/99: rmdir d5 39 2026-03-10T12:37:32.081 INFO:tasks.workunit.client.1.vm07.stdout:4/159: unlink d0/d4/d5/c2a 0 2026-03-10T12:37:32.084 INFO:tasks.workunit.client.1.vm07.stdout:4/160: dread d0/d19/f25 [0,4194304] 0 2026-03-10T12:37:32.085 INFO:tasks.workunit.client.1.vm07.stdout:2/97: chown d0/lf 1898 1 2026-03-10T12:37:32.088 INFO:tasks.workunit.client.1.vm07.stdout:2/98: dwrite d0/f13 [0,4194304] 0 2026-03-10T12:37:32.091 INFO:tasks.workunit.client.1.vm07.stdout:7/137: creat d0/f1f x:0 0 0 2026-03-10T12:37:32.094 INFO:tasks.workunit.client.1.vm07.stdout:3/149: dwrite dc/dd/f29 [0,4194304] 0 2026-03-10T12:37:32.099 INFO:tasks.workunit.client.1.vm07.stdout:8/178: link d1/f19 d1/f3d 0 2026-03-10T12:37:32.107 INFO:tasks.workunit.client.1.vm07.stdout:4/161: mkdir d0/d4/d5/d34 0 2026-03-10T12:37:32.107 INFO:tasks.workunit.client.1.vm07.stdout:4/162: dread - d0/f33 zero size 2026-03-10T12:37:32.111 INFO:tasks.workunit.client.1.vm07.stdout:2/99: write d0/f4 [1780268,17083] 0 2026-03-10T12:37:32.111 INFO:tasks.workunit.client.1.vm07.stdout:2/100: truncate d0/f14 763257 0 2026-03-10T12:37:32.114 INFO:tasks.workunit.client.1.vm07.stdout:7/138: creat d0/f20 x:0 0 0 2026-03-10T12:37:32.117 INFO:tasks.workunit.client.0.vm00.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress 2026-03-10T12:37:32.129 INFO:tasks.workunit.client.1.vm07.stdout:3/150: truncate dc/dd/d1f/f2f 1024424 0 2026-03-10T12:37:32.129 
INFO:tasks.workunit.client.1.vm07.stdout:3/151: dread dc/dd/d1f/f2f [0,4194304] 0 2026-03-10T12:37:32.129 INFO:tasks.workunit.client.1.vm07.stdout:0/187: dwrite d0/f15 [0,4194304] 0 2026-03-10T12:37:32.129 INFO:tasks.workunit.client.1.vm07.stdout:0/188: dwrite d0/d14/f36 [0,4194304] 0 2026-03-10T12:37:32.129 INFO:tasks.workunit.client.1.vm07.stdout:0/189: readlink d0/d14/d1a/d1b/l1e 0 2026-03-10T12:37:32.138 INFO:tasks.workunit.client.1.vm07.stdout:0/190: dwrite d0/d14/d1a/f3d [0,4194304] 0 2026-03-10T12:37:32.147 INFO:tasks.workunit.client.1.vm07.stdout:2/101: sync 2026-03-10T12:37:32.148 INFO:tasks.workunit.client.1.vm07.stdout:2/102: write d0/f14 [320131,38026] 0 2026-03-10T12:37:32.152 INFO:tasks.workunit.client.1.vm07.stdout:7/139: creat d0/f21 x:0 0 0 2026-03-10T12:37:32.159 INFO:tasks.workunit.client.1.vm07.stdout:5/166: dwrite d0/fd [4194304,4194304] 0 2026-03-10T12:37:32.164 INFO:tasks.workunit.client.1.vm07.stdout:6/119: link d1/f17 d1/f26 0 2026-03-10T12:37:32.167 INFO:tasks.workunit.client.1.vm07.stdout:0/191: fsync d0/f1c 0 2026-03-10T12:37:32.167 INFO:tasks.workunit.client.1.vm07.stdout:0/192: fsync d0/d14/f19 0 2026-03-10T12:37:32.169 INFO:tasks.workunit.client.1.vm07.stdout:1/164: dwrite d9/f19 [0,4194304] 0 2026-03-10T12:37:32.174 INFO:tasks.workunit.client.1.vm07.stdout:7/140: rename d0/c11 to d0/c22 0 2026-03-10T12:37:32.182 INFO:tasks.workunit.client.1.vm07.stdout:3/152: dwrite dc/dd/f21 [4194304,4194304] 0 2026-03-10T12:37:32.191 INFO:tasks.workunit.client.1.vm07.stdout:5/167: truncate d0/d22/f16 992518 0 2026-03-10T12:37:32.192 INFO:tasks.workunit.client.1.vm07.stdout:6/120: symlink d1/d4/d6/l27 0 2026-03-10T12:37:32.192 INFO:tasks.workunit.client.1.vm07.stdout:6/121: chown d1/f1e 12829711 1 2026-03-10T12:37:32.193 INFO:tasks.workunit.client.1.vm07.stdout:6/122: readlink d1/d4/d6/l27 0 2026-03-10T12:37:32.198 INFO:tasks.workunit.client.1.vm07.stdout:7/141: creat d0/f23 x:0 0 0 2026-03-10T12:37:32.198 
INFO:tasks.workunit.client.1.vm07.stdout:7/142: write d0/f3 [4288950,28600] 0 2026-03-10T12:37:32.199 INFO:tasks.workunit.client.1.vm07.stdout:3/153: creat dc/d18/f34 x:0 0 0 2026-03-10T12:37:32.200 INFO:tasks.workunit.client.1.vm07.stdout:4/163: rename d0/d4/d5/c14 to d0/d4/d10/c35 0 2026-03-10T12:37:32.201 INFO:tasks.workunit.client.1.vm07.stdout:7/143: symlink d0/l24 0 2026-03-10T12:37:32.201 INFO:tasks.workunit.client.1.vm07.stdout:7/144: chown d0/f20 108685877 1 2026-03-10T12:37:32.202 INFO:tasks.workunit.client.1.vm07.stdout:3/154: mknod dc/c35 0 2026-03-10T12:37:32.206 INFO:tasks.workunit.client.1.vm07.stdout:1/165: rename d9/f2f to d9/df/d29/d2b/d31/f35 0 2026-03-10T12:37:32.207 INFO:tasks.workunit.client.1.vm07.stdout:7/145: rename d0/f16 to d0/f25 0 2026-03-10T12:37:32.207 INFO:tasks.workunit.client.1.vm07.stdout:7/146: stat d0/f13 0 2026-03-10T12:37:32.207 INFO:tasks.workunit.client.1.vm07.stdout:3/155: creat dc/d18/f36 x:0 0 0 2026-03-10T12:37:32.207 INFO:tasks.workunit.client.1.vm07.stdout:6/123: link d1/d4/d6/lf d1/d4/d6/l28 0 2026-03-10T12:37:32.207 INFO:tasks.workunit.client.1.vm07.stdout:6/124: chown d1/d9 3 1 2026-03-10T12:37:32.207 INFO:tasks.workunit.client.1.vm07.stdout:6/125: read d1/d4/d6/f13 [2065157,27173] 0 2026-03-10T12:37:32.217 INFO:tasks.workunit.client.1.vm07.stdout:3/156: write dc/dd/f20 [578891,75367] 0 2026-03-10T12:37:32.226 INFO:tasks.workunit.client.1.vm07.stdout:4/164: creat d0/d4/d10/f36 x:0 0 0 2026-03-10T12:37:32.226 INFO:tasks.workunit.client.1.vm07.stdout:4/165: dread - d0/d4/d10/d23/f2e zero size 2026-03-10T12:37:32.230 INFO:tasks.workunit.client.1.vm07.stdout:3/157: creat dc/d18/d24/f37 x:0 0 0 2026-03-10T12:37:32.231 INFO:tasks.workunit.client.1.vm07.stdout:3/158: dread - dc/d18/f34 zero size 2026-03-10T12:37:32.232 INFO:tasks.workunit.client.1.vm07.stdout:3/159: symlink dc/l38 0 2026-03-10T12:37:32.233 INFO:tasks.workunit.client.1.vm07.stdout:3/160: symlink dc/d18/l39 0 2026-03-10T12:37:32.233 
INFO:tasks.workunit.client.1.vm07.stdout:3/161: write dc/d18/d24/f2c [3746326,51384] 0 2026-03-10T12:37:32.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:31 vm00.local ceph-mon[50686]: pgmap v151: 65 pgs: 65 active+clean; 398 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 274 KiB/s rd, 29 MiB/s wr, 535 op/s 2026-03-10T12:37:32.234 INFO:tasks.workunit.client.1.vm07.stdout:3/162: write dc/dd/f29 [1255460,71572] 0 2026-03-10T12:37:32.236 INFO:tasks.workunit.client.1.vm07.stdout:3/163: creat dc/d18/d24/f3a x:0 0 0 2026-03-10T12:37:32.237 INFO:tasks.workunit.client.1.vm07.stdout:3/164: mkdir dc/dd/d28/d3b 0 2026-03-10T12:37:32.241 INFO:tasks.workunit.client.1.vm07.stdout:3/165: link dc/c35 dc/dd/d1f/c3c 0 2026-03-10T12:37:32.244 INFO:tasks.workunit.client.1.vm07.stdout:3/166: dread - dc/dd/f22 zero size 2026-03-10T12:37:32.244 INFO:tasks.workunit.client.1.vm07.stdout:3/167: dread - dc/d18/f36 zero size 2026-03-10T12:37:32.244 INFO:tasks.workunit.client.1.vm07.stdout:3/168: read dc/dd/f21 [7627788,28625] 0 2026-03-10T12:37:32.246 INFO:tasks.workunit.client.1.vm07.stdout:3/169: mkdir dc/d18/d2d/d3d 0 2026-03-10T12:37:32.247 INFO:tasks.workunit.client.1.vm07.stdout:3/170: creat dc/d18/d24/f3e x:0 0 0 2026-03-10T12:37:32.248 INFO:tasks.workunit.client.1.vm07.stdout:3/171: truncate dc/dd/f22 748553 0 2026-03-10T12:37:32.253 INFO:tasks.workunit.client.1.vm07.stdout:7/147: dread d0/f14 [0,4194304] 0 2026-03-10T12:37:32.255 INFO:tasks.workunit.client.1.vm07.stdout:0/193: dread d0/d14/d1a/f2c [0,4194304] 0 2026-03-10T12:37:32.255 INFO:tasks.workunit.client.1.vm07.stdout:0/194: stat d0/cf 0 2026-03-10T12:37:32.257 INFO:tasks.workunit.client.1.vm07.stdout:7/148: creat d0/f26 x:0 0 0 2026-03-10T12:37:32.258 INFO:tasks.workunit.client.1.vm07.stdout:0/195: mknod d0/d14/d1a/d2f/c3e 0 2026-03-10T12:37:32.258 INFO:tasks.workunit.client.1.vm07.stdout:0/196: write d0/f21 [362355,98819] 0 2026-03-10T12:37:32.259 INFO:tasks.workunit.client.1.vm07.stdout:0/197: stat d0/l1f 0 
2026-03-10T12:37:32.262 INFO:tasks.workunit.client.1.vm07.stdout:0/198: dwrite d0/f21 [0,4194304] 0 2026-03-10T12:37:32.264 INFO:tasks.workunit.client.1.vm07.stdout:0/199: write d0/f2e [962945,119353] 0 2026-03-10T12:37:32.269 INFO:tasks.workunit.client.1.vm07.stdout:0/200: chown d0/f1c 2215303 1 2026-03-10T12:37:32.269 INFO:tasks.workunit.client.1.vm07.stdout:0/201: unlink d0/d14/d1a/d1b/c28 0 2026-03-10T12:37:32.269 INFO:tasks.workunit.client.1.vm07.stdout:0/202: symlink d0/d14/d1a/d2f/d31/l3f 0 2026-03-10T12:37:32.271 INFO:tasks.workunit.client.1.vm07.stdout:0/203: dread d0/d14/f37 [0,4194304] 0 2026-03-10T12:37:32.287 INFO:tasks.workunit.client.1.vm07.stdout:0/204: fsync d0/d14/d1a/f3d 0 2026-03-10T12:37:32.288 INFO:tasks.workunit.client.1.vm07.stdout:0/205: write d0/d14/d1a/f30 [666683,6860] 0 2026-03-10T12:37:32.298 INFO:tasks.workunit.client.1.vm07.stdout:9/100: write d5/fb [6976567,90743] 0 2026-03-10T12:37:32.302 INFO:tasks.workunit.client.1.vm07.stdout:9/101: fdatasync d5/fe 0 2026-03-10T12:37:32.302 INFO:tasks.workunit.client.1.vm07.stdout:9/102: write d5/fb [2110629,94796] 0 2026-03-10T12:37:32.304 INFO:tasks.workunit.client.1.vm07.stdout:9/103: dread d5/f8 [0,4194304] 0 2026-03-10T12:37:32.306 INFO:tasks.workunit.client.1.vm07.stdout:9/104: symlink d5/d16/l1d 0 2026-03-10T12:37:32.307 INFO:tasks.workunit.client.1.vm07.stdout:9/105: link d5/f8 d5/d16/d18/f1e 0 2026-03-10T12:37:32.310 INFO:tasks.workunit.client.1.vm07.stdout:0/206: dread d0/fd [0,4194304] 0 2026-03-10T12:37:32.311 INFO:tasks.workunit.client.1.vm07.stdout:0/207: fsync d0/f1d 0 2026-03-10T12:37:32.311 INFO:tasks.workunit.client.1.vm07.stdout:0/208: chown d0/d14/d1a/d38 0 1 2026-03-10T12:37:32.313 INFO:tasks.workunit.client.1.vm07.stdout:9/106: mkdir d5/d1f 0 2026-03-10T12:37:32.315 INFO:tasks.workunit.client.1.vm07.stdout:0/209: unlink d0/d14/l35 0 2026-03-10T12:37:32.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:31 vm07.local ceph-mon[58582]: pgmap v151: 65 pgs: 65 
active+clean; 398 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail; 274 KiB/s rd, 29 MiB/s wr, 535 op/s 2026-03-10T12:37:32.316 INFO:tasks.workunit.client.1.vm07.stdout:9/107: creat d5/d16/d18/f20 x:0 0 0 2026-03-10T12:37:32.318 INFO:tasks.workunit.client.1.vm07.stdout:0/210: mknod d0/d14/d1a/d2f/c40 0 2026-03-10T12:37:32.319 INFO:tasks.workunit.client.1.vm07.stdout:9/108: dread d5/fb [4194304,4194304] 0 2026-03-10T12:37:32.320 INFO:tasks.workunit.client.1.vm07.stdout:0/211: write d0/d14/f37 [2310874,68399] 0 2026-03-10T12:37:32.321 INFO:tasks.workunit.client.1.vm07.stdout:2/103: write d0/d19/f1e [1500556,16281] 0 2026-03-10T12:37:32.322 INFO:tasks.workunit.client.1.vm07.stdout:2/104: stat d0/d19/f22 0 2026-03-10T12:37:32.322 INFO:tasks.workunit.client.1.vm07.stdout:2/105: read d0/f13 [2206473,118744] 0 2026-03-10T12:37:32.327 INFO:tasks.workunit.client.1.vm07.stdout:2/106: fdatasync d0/d19/f1e 0 2026-03-10T12:37:32.327 INFO:tasks.workunit.client.1.vm07.stdout:2/107: fdatasync d0/f12 0 2026-03-10T12:37:32.328 INFO:tasks.workunit.client.1.vm07.stdout:4/166: getdents d0/d4/d5 0 2026-03-10T12:37:32.328 INFO:tasks.workunit.client.1.vm07.stdout:2/108: rename d0/d19/d1f/d20 to d0/d19/d1f/d20/d23 22 2026-03-10T12:37:32.330 INFO:tasks.workunit.client.1.vm07.stdout:9/109: write d5/d16/d18/f1e [488250,49482] 0 2026-03-10T12:37:32.334 INFO:tasks.workunit.client.1.vm07.stdout:5/168: truncate d0/f1f 539262 0 2026-03-10T12:37:32.344 INFO:tasks.workunit.client.1.vm07.stdout:6/126: write d1/d4/f11 [725776,56311] 0 2026-03-10T12:37:32.345 INFO:tasks.workunit.client.1.vm07.stdout:6/127: fdatasync d1/d4/d6/f15 0 2026-03-10T12:37:32.347 INFO:tasks.workunit.client.1.vm07.stdout:1/166: dwrite f6 [0,4194304] 0 2026-03-10T12:37:32.360 INFO:tasks.workunit.client.1.vm07.stdout:3/172: rmdir dc/d18/d24 39 2026-03-10T12:37:32.360 INFO:tasks.workunit.client.1.vm07.stdout:7/149: getdents d0 0 2026-03-10T12:37:32.362 INFO:tasks.workunit.client.1.vm07.stdout:5/169: mkdir d0/d22/d18/d19/d21/d3a 0 
2026-03-10T12:37:32.365 INFO:tasks.workunit.client.1.vm07.stdout:5/170: dwrite d0/d22/d18/d19/d21/f2d [0,4194304] 0 2026-03-10T12:37:32.367 INFO:tasks.workunit.client.1.vm07.stdout:0/212: unlink d0/d14/d1a/c20 0 2026-03-10T12:37:32.367 INFO:tasks.workunit.client.1.vm07.stdout:0/213: chown d0/f2e 0 1 2026-03-10T12:37:32.369 INFO:tasks.workunit.client.1.vm07.stdout:5/171: dread d0/d22/d18/f20 [0,4194304] 0 2026-03-10T12:37:32.375 INFO:tasks.workunit.client.1.vm07.stdout:6/128: creat d1/d4/d6/d16/d1a/f29 x:0 0 0 2026-03-10T12:37:32.377 INFO:tasks.workunit.client.1.vm07.stdout:7/150: creat d0/f27 x:0 0 0 2026-03-10T12:37:32.379 INFO:tasks.workunit.client.1.vm07.stdout:2/109: rename d0/c2 to d0/d19/c24 0 2026-03-10T12:37:32.379 INFO:tasks.workunit.client.1.vm07.stdout:2/110: read d0/f4 [3893437,18204] 0 2026-03-10T12:37:32.384 INFO:tasks.workunit.client.1.vm07.stdout:0/214: fsync d0/d14/d1a/f2c 0 2026-03-10T12:37:32.384 INFO:tasks.workunit.client.1.vm07.stdout:5/172: unlink d0/fc 0 2026-03-10T12:37:32.385 INFO:tasks.workunit.client.1.vm07.stdout:3/173: creat dc/d18/d24/f3f x:0 0 0 2026-03-10T12:37:32.386 INFO:tasks.workunit.client.1.vm07.stdout:7/151: chown d0/f20 13 1 2026-03-10T12:37:32.386 INFO:tasks.workunit.client.1.vm07.stdout:6/129: creat d1/d4/d6/f2a x:0 0 0 2026-03-10T12:37:32.387 INFO:tasks.workunit.client.1.vm07.stdout:6/130: chown d1/d4/d6 11 1 2026-03-10T12:37:32.391 INFO:tasks.workunit.client.1.vm07.stdout:3/174: dwrite dc/dd/f20 [0,4194304] 0 2026-03-10T12:37:32.392 INFO:tasks.workunit.client.1.vm07.stdout:1/167: rename d9/f27 to d9/f36 0 2026-03-10T12:37:32.395 INFO:tasks.workunit.client.1.vm07.stdout:8/179: rmdir d1/d3 39 2026-03-10T12:37:32.403 INFO:tasks.workunit.client.1.vm07.stdout:7/152: truncate d0/f1e 577514 0 2026-03-10T12:37:32.405 INFO:tasks.workunit.client.1.vm07.stdout:0/215: rename d0/d14/d1a/d38 to d0/d14/d1a/d1b/d41 0 2026-03-10T12:37:32.409 INFO:tasks.workunit.client.1.vm07.stdout:7/153: write d0/f13 [345796,6805] 0 
2026-03-10T12:37:32.412 INFO:tasks.workunit.client.1.vm07.stdout:0/216: unlink d0/d14/d1a/d1b/c2a 0 2026-03-10T12:37:32.412 INFO:tasks.workunit.client.1.vm07.stdout:0/217: stat d0/d14/d1a/f24 0 2026-03-10T12:37:32.413 INFO:tasks.workunit.client.1.vm07.stdout:8/180: write d1/d3/d11/f35 [382495,14356] 0 2026-03-10T12:37:32.415 INFO:tasks.workunit.client.1.vm07.stdout:7/154: dread d0/f3 [0,4194304] 0 2026-03-10T12:37:32.416 INFO:tasks.workunit.client.1.vm07.stdout:7/155: dread - d0/f27 zero size 2026-03-10T12:37:32.419 INFO:tasks.workunit.client.1.vm07.stdout:0/218: symlink d0/d14/d1a/d1b/d41/l42 0 2026-03-10T12:37:32.422 INFO:tasks.workunit.client.1.vm07.stdout:6/131: rename d1/d4/d6/f15 to d1/d4/f2b 0 2026-03-10T12:37:32.423 INFO:tasks.workunit.client.1.vm07.stdout:6/132: chown d1/d9/fb 4347 1 2026-03-10T12:37:32.425 INFO:tasks.workunit.client.1.vm07.stdout:8/181: creat d1/f3e x:0 0 0 2026-03-10T12:37:32.427 INFO:tasks.workunit.client.1.vm07.stdout:8/182: dread d1/f19 [0,4194304] 0 2026-03-10T12:37:32.427 INFO:tasks.workunit.client.1.vm07.stdout:8/183: chown d1/f19 971526 1 2026-03-10T12:37:32.428 INFO:tasks.workunit.client.1.vm07.stdout:0/219: symlink d0/d14/d1a/l43 0 2026-03-10T12:37:32.430 INFO:tasks.workunit.client.1.vm07.stdout:6/133: mkdir d1/d4/d6/d16/d1a/d2c 0 2026-03-10T12:37:32.431 INFO:tasks.workunit.client.1.vm07.stdout:6/134: chown d1/d4/d6/l1b 4782 1 2026-03-10T12:37:32.431 INFO:tasks.workunit.client.1.vm07.stdout:1/168: dread d9/fe [0,4194304] 0 2026-03-10T12:37:32.431 INFO:tasks.workunit.client.1.vm07.stdout:6/135: rename d1/d4 to d1/d4/d6/d16/d2d 22 2026-03-10T12:37:32.432 INFO:tasks.workunit.client.1.vm07.stdout:0/220: dread d0/d14/f36 [0,4194304] 0 2026-03-10T12:37:32.436 INFO:tasks.workunit.client.1.vm07.stdout:7/156: creat d0/f28 x:0 0 0 2026-03-10T12:37:32.438 INFO:tasks.workunit.client.1.vm07.stdout:7/157: write d0/f27 [186916,112527] 0 2026-03-10T12:37:32.439 INFO:tasks.workunit.client.1.vm07.stdout:2/111: sync 2026-03-10T12:37:32.439 
INFO:tasks.workunit.client.1.vm07.stdout:8/184: dwrite d1/d3/d11/f15 [0,4194304] 0 2026-03-10T12:37:32.449 INFO:tasks.workunit.client.1.vm07.stdout:6/136: read d1/f17 [8304,126989] 0 2026-03-10T12:37:32.450 INFO:tasks.workunit.client.1.vm07.stdout:5/173: dread d0/f1f [0,4194304] 0 2026-03-10T12:37:32.458 INFO:tasks.workunit.client.1.vm07.stdout:4/167: write d0/d19/f25 [921349,78228] 0 2026-03-10T12:37:32.462 INFO:tasks.workunit.client.1.vm07.stdout:1/169: rename d9/df/d29/c2e to d9/df/d29/d2b/c37 0 2026-03-10T12:37:32.462 INFO:tasks.workunit.client.1.vm07.stdout:5/174: dread d0/ff [4194304,4194304] 0 2026-03-10T12:37:32.465 INFO:tasks.workunit.client.1.vm07.stdout:5/175: dread d0/f2b [0,4194304] 0 2026-03-10T12:37:32.471 INFO:tasks.workunit.client.1.vm07.stdout:9/110: truncate d5/d16/d18/f1e 4164445 0 2026-03-10T12:37:32.473 INFO:tasks.workunit.client.1.vm07.stdout:1/170: rename d9/f16 to d9/df/d29/d2b/d30/f38 0 2026-03-10T12:37:32.474 INFO:tasks.workunit.client.1.vm07.stdout:1/171: chown d9/df/f21 1 1 2026-03-10T12:37:32.474 INFO:tasks.workunit.client.1.vm07.stdout:1/172: stat d9/df/f26 0 2026-03-10T12:37:32.476 INFO:tasks.workunit.client.1.vm07.stdout:1/173: dread d9/fe [0,4194304] 0 2026-03-10T12:37:32.477 INFO:tasks.workunit.client.1.vm07.stdout:1/174: write d9/fc [81172,93728] 0 2026-03-10T12:37:32.477 INFO:tasks.workunit.client.1.vm07.stdout:1/175: stat d9/f19 0 2026-03-10T12:37:32.483 INFO:tasks.workunit.client.1.vm07.stdout:1/176: dwrite d9/df/f24 [0,4194304] 0 2026-03-10T12:37:32.490 INFO:tasks.workunit.client.1.vm07.stdout:7/158: creat d0/f29 x:0 0 0 2026-03-10T12:37:32.492 INFO:tasks.workunit.client.1.vm07.stdout:3/175: write dc/dd/d1f/f2f [1055271,119886] 0 2026-03-10T12:37:32.498 INFO:tasks.workunit.client.1.vm07.stdout:5/176: symlink d0/d22/d18/d30/l3b 0 2026-03-10T12:37:32.503 INFO:tasks.workunit.client.1.vm07.stdout:8/185: creat d1/f3f x:0 0 0 2026-03-10T12:37:32.509 INFO:tasks.workunit.client.1.vm07.stdout:8/186: dwrite d1/f3f [0,4194304] 0 
2026-03-10T12:37:32.512 INFO:tasks.workunit.client.1.vm07.stdout:4/168: creat d0/d4/d5/d34/f37 x:0 0 0 2026-03-10T12:37:32.515 INFO:tasks.workunit.client.1.vm07.stdout:5/177: mknod d0/d22/d18/c3c 0 2026-03-10T12:37:32.515 INFO:tasks.workunit.client.1.vm07.stdout:0/221: link d0/d14/d1a/d2f/d31/l3f d0/d14/l44 0 2026-03-10T12:37:32.520 INFO:tasks.workunit.client.1.vm07.stdout:7/159: dwrite d0/f10 [0,4194304] 0 2026-03-10T12:37:32.524 INFO:tasks.workunit.client.1.vm07.stdout:2/112: getdents d0/d19 0 2026-03-10T12:37:32.525 INFO:tasks.workunit.client.1.vm07.stdout:0/222: dwrite d0/f1c [0,4194304] 0 2026-03-10T12:37:32.530 INFO:tasks.workunit.client.1.vm07.stdout:9/111: link d5/d16/f19 d5/d1f/f21 0 2026-03-10T12:37:32.531 INFO:tasks.workunit.client.1.vm07.stdout:8/187: chown d1/f2 60062414 1 2026-03-10T12:37:32.532 INFO:tasks.workunit.client.1.vm07.stdout:7/160: dwrite d0/f29 [0,4194304] 0 2026-03-10T12:37:32.537 INFO:tasks.workunit.client.1.vm07.stdout:3/176: symlink dc/d18/l40 0 2026-03-10T12:37:32.537 INFO:tasks.workunit.client.1.vm07.stdout:0/223: mknod d0/c45 0 2026-03-10T12:37:32.544 INFO:tasks.workunit.client.1.vm07.stdout:6/137: link d1/d4/d6/l28 d1/d4/l2e 0 2026-03-10T12:37:32.546 INFO:tasks.workunit.client.1.vm07.stdout:4/169: rename d0/d19/d1f/d2b/d2d/c30 to d0/d4/d10/d18/c38 0 2026-03-10T12:37:32.547 INFO:tasks.workunit.client.1.vm07.stdout:9/112: mkdir d5/d13/d22 0 2026-03-10T12:37:32.547 INFO:tasks.workunit.client.1.vm07.stdout:8/188: mkdir d1/d3/d40 0 2026-03-10T12:37:32.547 INFO:tasks.workunit.client.1.vm07.stdout:0/224: dwrite d0/d14/d1a/f30 [0,4194304] 0 2026-03-10T12:37:32.547 INFO:tasks.workunit.client.1.vm07.stdout:8/189: write d1/d3/d11/f35 [2462350,17056] 0 2026-03-10T12:37:32.547 INFO:tasks.workunit.client.1.vm07.stdout:9/113: truncate d5/d13/f1b 575547 0 2026-03-10T12:37:32.548 INFO:tasks.workunit.client.1.vm07.stdout:5/178: creat d0/d22/d18/d19/d36/f3d x:0 0 0 2026-03-10T12:37:32.552 INFO:tasks.workunit.client.1.vm07.stdout:7/161: read d0/f3 
[450008,78405] 0 2026-03-10T12:37:32.556 INFO:tasks.workunit.client.1.vm07.stdout:7/162: truncate d0/f13 1244836 0 2026-03-10T12:37:32.560 INFO:tasks.workunit.client.1.vm07.stdout:6/138: write d1/d4/d6/f13 [4196891,97765] 0 2026-03-10T12:37:32.564 INFO:tasks.workunit.client.1.vm07.stdout:2/113: rename d0/f17 to d0/d19/f25 0 2026-03-10T12:37:32.570 INFO:tasks.workunit.client.1.vm07.stdout:2/114: dwrite d0/f12 [0,4194304] 0 2026-03-10T12:37:32.570 INFO:tasks.workunit.client.1.vm07.stdout:1/177: getdents d9/df/d29/d2b 0 2026-03-10T12:37:32.576 INFO:tasks.workunit.client.1.vm07.stdout:8/190: fdatasync d1/f3d 0 2026-03-10T12:37:32.585 INFO:tasks.workunit.client.1.vm07.stdout:9/114: dwrite d5/d16/f19 [0,4194304] 0 2026-03-10T12:37:32.587 INFO:tasks.workunit.client.1.vm07.stdout:5/179: truncate d0/d22/d18/f20 4249548 0 2026-03-10T12:37:32.590 INFO:tasks.workunit.client.1.vm07.stdout:7/163: readlink d0/l1c 0 2026-03-10T12:37:32.595 INFO:tasks.workunit.client.1.vm07.stdout:1/178: mknod d9/df/c39 0 2026-03-10T12:37:32.595 INFO:tasks.workunit.client.1.vm07.stdout:2/115: mkdir d0/d19/d26 0 2026-03-10T12:37:32.595 INFO:tasks.workunit.client.1.vm07.stdout:9/115: dwrite d5/d13/f14 [0,4194304] 0 2026-03-10T12:37:32.597 INFO:tasks.workunit.client.1.vm07.stdout:8/191: write d1/d3/f1d [1582812,29194] 0 2026-03-10T12:37:32.597 INFO:tasks.workunit.client.1.vm07.stdout:2/116: chown d0/d19/f1b 21 1 2026-03-10T12:37:32.598 INFO:tasks.workunit.client.1.vm07.stdout:9/116: fdatasync d5/d16/d18/f20 0 2026-03-10T12:37:32.600 INFO:tasks.workunit.client.1.vm07.stdout:3/177: creat dc/dd/f41 x:0 0 0 2026-03-10T12:37:32.603 INFO:tasks.workunit.client.1.vm07.stdout:5/180: mkdir d0/d22/d18/d3e 0 2026-03-10T12:37:32.620 INFO:tasks.workunit.client.1.vm07.stdout:0/225: creat d0/d14/d1a/d1b/d3b/f46 x:0 0 0 2026-03-10T12:37:32.628 INFO:tasks.workunit.client.1.vm07.stdout:9/117: truncate d5/fe 2081473 0 2026-03-10T12:37:32.628 INFO:tasks.workunit.client.1.vm07.stdout:9/118: stat d5/d13/f14 0 
2026-03-10T12:37:32.628 INFO:tasks.workunit.client.1.vm07.stdout:5/181: mkdir d0/d22/d18/d19/d2e/d3f 0 2026-03-10T12:37:32.628 INFO:tasks.workunit.client.1.vm07.stdout:6/139: link d1/d4/d6/lf d1/d4/d6/d16/l2f 0 2026-03-10T12:37:32.628 INFO:tasks.workunit.client.1.vm07.stdout:6/140: stat d1/d9/f22 0 2026-03-10T12:37:32.628 INFO:tasks.workunit.client.1.vm07.stdout:6/141: dread - d1/d4/d6/f2a zero size 2026-03-10T12:37:32.628 INFO:tasks.workunit.client.1.vm07.stdout:6/142: dread d1/f1e [0,4194304] 0 2026-03-10T12:37:32.628 INFO:tasks.workunit.client.1.vm07.stdout:8/192: creat d1/d3/d40/f41 x:0 0 0 2026-03-10T12:37:32.628 INFO:tasks.workunit.client.1.vm07.stdout:2/117: creat d0/d19/d26/f27 x:0 0 0 2026-03-10T12:37:32.629 INFO:tasks.workunit.client.1.vm07.stdout:2/118: dwrite d0/f1d [0,4194304] 0 2026-03-10T12:37:32.629 INFO:tasks.workunit.client.1.vm07.stdout:9/119: mkdir d5/d16/d23 0 2026-03-10T12:37:32.634 INFO:tasks.workunit.client.1.vm07.stdout:6/143: creat d1/d4/d6/f30 x:0 0 0 2026-03-10T12:37:32.634 INFO:tasks.workunit.client.1.vm07.stdout:8/193: creat d1/d3/d6/f42 x:0 0 0 2026-03-10T12:37:32.634 INFO:tasks.workunit.client.1.vm07.stdout:6/144: chown d1/d9 203 1 2026-03-10T12:37:32.637 INFO:tasks.workunit.client.1.vm07.stdout:8/194: write d1/d3/d18/f2e [581035,79547] 0 2026-03-10T12:37:32.637 INFO:tasks.workunit.client.0.vm00.stderr:++ readlink -f fsstress 2026-03-10T12:37:32.638 INFO:tasks.workunit.client.1.vm07.stdout:2/119: stat d0/lb 0 2026-03-10T12:37:32.639 INFO:tasks.workunit.client.1.vm07.stdout:9/120: mknod d5/d13/c24 0 2026-03-10T12:37:32.639 INFO:tasks.workunit.client.0.vm00.stderr:+ BIN=/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress 2026-03-10T12:37:32.639 INFO:tasks.workunit.client.0.vm00.stderr:+ popd 2026-03-10T12:37:32.640 INFO:tasks.workunit.client.0.vm00.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-10T12:37:32.640 
INFO:tasks.workunit.client.0.vm00.stderr:+ popd 2026-03-10T12:37:32.641 INFO:tasks.workunit.client.0.vm00.stdout:~/cephtest/mnt.0/client.0/tmp 2026-03-10T12:37:32.641 INFO:tasks.workunit.client.0.vm00.stderr:++ mktemp -d -p . 2026-03-10T12:37:32.651 INFO:tasks.workunit.client.1.vm07.stdout:5/182: symlink d0/d22/d18/d3e/l40 0 2026-03-10T12:37:32.665 INFO:tasks.workunit.client.0.vm00.stderr:+ T=./tmp.328fxcUIfq 2026-03-10T12:37:32.666 INFO:tasks.workunit.client.0.vm00.stderr:+ /home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.328fxcUIfq -l 1 -n 1000 -p 10 -v 2026-03-10T12:37:32.666 INFO:tasks.workunit.client.0.vm00.stdout:seed = 1773804569 2026-03-10T12:37:32.666 INFO:tasks.workunit.client.0.vm00.stdout:2/0: dwrite - no filename 2026-03-10T12:37:32.667 INFO:tasks.workunit.client.1.vm07.stdout:1/179: dread d9/df/f13 [4194304,4194304] 0 2026-03-10T12:37:32.667 INFO:tasks.workunit.client.1.vm07.stdout:3/178: link dc/dd/l11 dc/l42 0 2026-03-10T12:37:32.667 INFO:tasks.workunit.client.1.vm07.stdout:2/120: chown d0/l11 22861555 1 2026-03-10T12:37:32.667 INFO:tasks.workunit.client.1.vm07.stdout:5/183: chown d0/l3 876091371 1 2026-03-10T12:37:32.667 INFO:tasks.workunit.client.1.vm07.stdout:1/180: write d9/df/f13 [545470,49252] 0 2026-03-10T12:37:32.667 INFO:tasks.workunit.client.1.vm07.stdout:1/181: dread d9/fc [0,4194304] 0 2026-03-10T12:37:32.667 INFO:tasks.workunit.client.1.vm07.stdout:1/182: truncate d9/df/d29/d2b/d30/f38 465690 0 2026-03-10T12:37:32.667 INFO:tasks.workunit.client.1.vm07.stdout:9/121: link d5/d13/f1b d5/d13/d22/f25 0 2026-03-10T12:37:32.667 INFO:tasks.workunit.client.1.vm07.stdout:5/184: dwrite d0/f9 [0,4194304] 0 2026-03-10T12:37:32.674 INFO:tasks.workunit.client.0.vm00.stdout:0/0: creat f0 x:0 0 0 2026-03-10T12:37:32.675 INFO:tasks.workunit.client.0.vm00.stdout:3/0: link - no file 2026-03-10T12:37:32.676 INFO:tasks.workunit.client.0.vm00.stdout:3/1: rename - no filename 
2026-03-10T12:37:32.676 INFO:tasks.workunit.client.0.vm00.stdout:3/2: dwrite - no filename 2026-03-10T12:37:32.676 INFO:tasks.workunit.client.1.vm07.stdout:0/226: sync 2026-03-10T12:37:32.680 INFO:tasks.workunit.client.1.vm07.stdout:6/145: creat d1/d4/f31 x:0 0 0 2026-03-10T12:37:32.681 INFO:tasks.workunit.client.1.vm07.stdout:6/146: write d1/d4/d6/f13 [2885078,129796] 0 2026-03-10T12:37:32.683 INFO:tasks.workunit.client.1.vm07.stdout:7/164: fsync d0/f10 0 2026-03-10T12:37:32.685 INFO:tasks.workunit.client.0.vm00.stdout:1/0: rmdir - no directory 2026-03-10T12:37:32.685 INFO:tasks.workunit.client.0.vm00.stdout:1/1: rename - no filename 2026-03-10T12:37:32.685 INFO:tasks.workunit.client.1.vm07.stdout:3/179: sync 2026-03-10T12:37:32.685 INFO:tasks.workunit.client.0.vm00.stdout:0/1: dwrite f0 [0,4194304] 0 2026-03-10T12:37:32.687 INFO:tasks.workunit.client.1.vm07.stdout:2/121: mknod d0/c28 0 2026-03-10T12:37:32.690 INFO:tasks.workunit.client.1.vm07.stdout:1/183: mknod d9/df/d29/d2b/c3a 0 2026-03-10T12:37:32.691 INFO:tasks.workunit.client.0.vm00.stdout:3/3: mknod c0 0 2026-03-10T12:37:32.691 INFO:tasks.workunit.client.0.vm00.stdout:3/4: truncate - no filename 2026-03-10T12:37:32.691 INFO:tasks.workunit.client.0.vm00.stdout:4/0: chown . 
2574 1 2026-03-10T12:37:32.691 INFO:tasks.workunit.client.0.vm00.stdout:4/1: truncate - no filename 2026-03-10T12:37:32.691 INFO:tasks.workunit.client.0.vm00.stdout:4/2: write - no filename 2026-03-10T12:37:32.691 INFO:tasks.workunit.client.0.vm00.stdout:4/3: dwrite - no filename 2026-03-10T12:37:32.691 INFO:tasks.workunit.client.1.vm07.stdout:0/227: dwrite d0/f1c [4194304,4194304] 0 2026-03-10T12:37:32.691 INFO:tasks.workunit.client.1.vm07.stdout:5/185: mknod d0/d22/d18/d19/d2e/c41 0 2026-03-10T12:37:32.691 INFO:tasks.workunit.client.1.vm07.stdout:0/228: truncate d0/f1d 842341 0 2026-03-10T12:37:32.695 INFO:tasks.workunit.client.0.vm00.stdout:0/2: mknod c1 0 2026-03-10T12:37:32.695 INFO:tasks.workunit.client.0.vm00.stdout:0/3: stat c1 0 2026-03-10T12:37:32.696 INFO:tasks.workunit.client.0.vm00.stdout:5/0: dwrite - no filename 2026-03-10T12:37:32.697 INFO:tasks.workunit.client.0.vm00.stdout:1/2: mknod c0 0 2026-03-10T12:37:32.697 INFO:tasks.workunit.client.0.vm00.stdout:1/3: read - no filename 2026-03-10T12:37:32.697 INFO:tasks.workunit.client.0.vm00.stdout:1/4: dwrite - no filename 2026-03-10T12:37:32.698 INFO:tasks.workunit.client.0.vm00.stdout:1/5: chown c0 9849439 1 2026-03-10T12:37:32.698 INFO:tasks.workunit.client.0.vm00.stdout:1/6: rmdir - no directory 2026-03-10T12:37:32.700 INFO:tasks.workunit.client.0.vm00.stdout:4/4: creat f0 x:0 0 0 2026-03-10T12:37:32.706 INFO:tasks.workunit.client.0.vm00.stdout:3/5: link c0 c1 0 2026-03-10T12:37:32.706 INFO:tasks.workunit.client.0.vm00.stdout:3/6: rmdir - no directory 2026-03-10T12:37:32.706 INFO:tasks.workunit.client.0.vm00.stdout:0/4: dwrite f0 [0,4194304] 0 2026-03-10T12:37:32.706 INFO:tasks.workunit.client.0.vm00.stdout:1/7: unlink c0 0 2026-03-10T12:37:32.706 INFO:tasks.workunit.client.0.vm00.stdout:1/8: rmdir - no directory 2026-03-10T12:37:32.707 INFO:tasks.workunit.client.0.vm00.stdout:6/0: fsync - no filename 2026-03-10T12:37:32.707 INFO:tasks.workunit.client.0.vm00.stdout:6/1: rename - no filename 
2026-03-10T12:37:32.707 INFO:tasks.workunit.client.1.vm07.stdout:5/186: creat d0/d22/d18/d19/d21/f42 x:0 0 0 2026-03-10T12:37:32.707 INFO:tasks.workunit.client.1.vm07.stdout:5/187: write d0/d22/d18/d19/d36/f3d [254144,82636] 0 2026-03-10T12:37:32.708 INFO:tasks.workunit.client.1.vm07.stdout:5/188: write d0/d22/d18/d19/d21/f2f [1558832,94497] 0 2026-03-10T12:37:32.710 INFO:tasks.workunit.client.1.vm07.stdout:6/147: rename d1/d4/l2e to d1/d4/d6/l32 0 2026-03-10T12:37:32.711 INFO:tasks.workunit.client.0.vm00.stdout:5/1: mknod c0 0 2026-03-10T12:37:32.711 INFO:tasks.workunit.client.0.vm00.stdout:5/2: dread - no filename 2026-03-10T12:37:32.715 INFO:tasks.workunit.client.0.vm00.stdout:3/7: creat f2 x:0 0 0 2026-03-10T12:37:32.718 INFO:tasks.workunit.client.0.vm00.stdout:4/5: dwrite f0 [0,4194304] 0 2026-03-10T12:37:32.719 INFO:tasks.workunit.client.1.vm07.stdout:7/165: write d0/fc [261280,27098] 0 2026-03-10T12:37:32.720 INFO:tasks.workunit.client.0.vm00.stdout:1/9: creat f1 x:0 0 0 2026-03-10T12:37:32.720 INFO:tasks.workunit.client.1.vm07.stdout:3/180: mkdir dc/dd/d43 0 2026-03-10T12:37:32.723 INFO:tasks.workunit.client.1.vm07.stdout:2/122: mkdir d0/d29 0 2026-03-10T12:37:32.725 INFO:tasks.workunit.client.1.vm07.stdout:2/123: readlink d0/l11 0 2026-03-10T12:37:32.725 INFO:tasks.workunit.client.0.vm00.stdout:0/5: dread f0 [0,4194304] 0 2026-03-10T12:37:32.725 INFO:tasks.workunit.client.0.vm00.stdout:0/6: read f0 [317882,48063] 0 2026-03-10T12:37:32.728 INFO:tasks.workunit.client.0.vm00.stdout:7/0: rename - no filename 2026-03-10T12:37:32.728 INFO:tasks.workunit.client.0.vm00.stdout:7/1: dwrite - no filename 2026-03-10T12:37:32.728 INFO:tasks.workunit.client.0.vm00.stdout:7/2: read - no filename 2026-03-10T12:37:32.728 INFO:tasks.workunit.client.0.vm00.stdout:7/3: rename - no filename 2026-03-10T12:37:32.728 INFO:tasks.workunit.client.0.vm00.stdout:7/4: readlink - no filename 2026-03-10T12:37:32.728 INFO:tasks.workunit.client.0.vm00.stdout:7/5: dwrite - no filename 
2026-03-10T12:37:32.729 INFO:tasks.workunit.client.0.vm00.stdout:5/3: creat f1 x:0 0 0 2026-03-10T12:37:32.729 INFO:tasks.workunit.client.1.vm07.stdout:0/229: rename d0/d14/d1a/l2b to d0/d14/d1a/d1b/d41/l47 0 2026-03-10T12:37:32.730 INFO:tasks.workunit.client.0.vm00.stdout:5/4: truncate f1 122213 0 2026-03-10T12:37:32.733 INFO:tasks.workunit.client.1.vm07.stdout:6/148: mkdir d1/d4/d6/d16/d1a/d33 0 2026-03-10T12:37:32.737 INFO:tasks.workunit.client.1.vm07.stdout:6/149: write d1/d9/fb [1255826,55940] 0 2026-03-10T12:37:32.744 INFO:tasks.workunit.client.1.vm07.stdout:1/184: link d9/df/l17 d9/df/d29/d2b/d31/l3b 0 2026-03-10T12:37:32.745 INFO:tasks.workunit.client.1.vm07.stdout:2/124: truncate d0/f1 2852072 0 2026-03-10T12:37:32.747 INFO:tasks.workunit.client.1.vm07.stdout:1/185: dread d9/f19 [0,4194304] 0 2026-03-10T12:37:32.758 INFO:tasks.workunit.client.1.vm07.stdout:5/189: symlink d0/d22/d18/d19/d21/d3a/l43 0 2026-03-10T12:37:32.758 INFO:tasks.workunit.client.1.vm07.stdout:0/230: truncate d0/d14/f36 1946452 0 2026-03-10T12:37:32.758 INFO:tasks.workunit.client.1.vm07.stdout:7/166: symlink d0/l2a 0 2026-03-10T12:37:32.758 INFO:tasks.workunit.client.1.vm07.stdout:4/170: write d0/d4/d10/d23/f27 [2398518,1349] 0 2026-03-10T12:37:32.758 INFO:tasks.workunit.client.0.vm00.stdout:3/8: rename f2 to f3 0 2026-03-10T12:37:32.758 INFO:tasks.workunit.client.0.vm00.stdout:1/10: mknod c2 0 2026-03-10T12:37:32.758 INFO:tasks.workunit.client.0.vm00.stdout:6/2: symlink l0 0 2026-03-10T12:37:32.758 INFO:tasks.workunit.client.0.vm00.stdout:5/5: creat f2 x:0 0 0 2026-03-10T12:37:32.758 INFO:tasks.workunit.client.0.vm00.stdout:5/6: fdatasync f2 0 2026-03-10T12:37:32.760 INFO:tasks.workunit.client.0.vm00.stdout:4/6: link f0 f1 0 2026-03-10T12:37:32.767 INFO:tasks.workunit.client.0.vm00.stdout:0/7: dwrite f0 [0,4194304] 0 2026-03-10T12:37:32.768 INFO:tasks.workunit.client.0.vm00.stdout:5/7: symlink l3 0 2026-03-10T12:37:32.768 INFO:tasks.workunit.client.0.vm00.stdout:7/6: creat f0 x:0 0 0 
2026-03-10T12:37:32.769 INFO:tasks.workunit.client.0.vm00.stdout:5/8: dread - f2 zero size 2026-03-10T12:37:32.769 INFO:tasks.workunit.client.0.vm00.stdout:4/7: mknod c2 0 2026-03-10T12:37:32.769 INFO:tasks.workunit.client.0.vm00.stdout:9/0: dread - no filename 2026-03-10T12:37:32.769 INFO:tasks.workunit.client.1.vm07.stdout:2/125: rename d0/f1c to d0/d29/f2a 0 2026-03-10T12:37:32.771 INFO:tasks.workunit.client.1.vm07.stdout:2/126: chown d0/f14 320 1 2026-03-10T12:37:32.783 INFO:tasks.workunit.client.1.vm07.stdout:5/190: write d0/d22/d18/d30/f35 [1433745,103170] 0 2026-03-10T12:37:32.784 INFO:tasks.workunit.client.1.vm07.stdout:1/186: creat d9/df/d29/d2b/d31/f3c x:0 0 0 2026-03-10T12:37:32.784 INFO:tasks.workunit.client.1.vm07.stdout:1/187: dread d9/df/f13 [4194304,4194304] 0 2026-03-10T12:37:32.784 INFO:tasks.workunit.client.1.vm07.stdout:0/231: mknod d0/d14/d1a/c48 0 2026-03-10T12:37:32.784 INFO:tasks.workunit.client.0.vm00.stdout:0/8: chown c1 7 1 2026-03-10T12:37:32.784 INFO:tasks.workunit.client.0.vm00.stdout:0/9: dread f0 [0,4194304] 0 2026-03-10T12:37:32.784 INFO:tasks.workunit.client.0.vm00.stdout:5/9: dwrite f2 [0,4194304] 0 2026-03-10T12:37:32.784 INFO:tasks.workunit.client.0.vm00.stdout:3/9: chown c0 220379 1 2026-03-10T12:37:32.791 INFO:tasks.workunit.client.0.vm00.stdout:0/10: dwrite f0 [0,4194304] 0 2026-03-10T12:37:32.793 INFO:tasks.workunit.client.0.vm00.stdout:8/0: getdents . 
0 2026-03-10T12:37:32.793 INFO:tasks.workunit.client.0.vm00.stdout:8/1: rename - no filename 2026-03-10T12:37:32.793 INFO:tasks.workunit.client.0.vm00.stdout:8/2: truncate - no filename 2026-03-10T12:37:32.794 INFO:tasks.workunit.client.1.vm07.stdout:4/171: creat d0/d4/d10/f39 x:0 0 0 2026-03-10T12:37:32.803 INFO:tasks.workunit.client.1.vm07.stdout:0/232: rename d0/d14/d1a/c48 to d0/d14/c49 0 2026-03-10T12:37:32.806 INFO:tasks.workunit.client.1.vm07.stdout:0/233: dread d0/d14/f19 [0,4194304] 0 2026-03-10T12:37:32.808 INFO:tasks.workunit.client.0.vm00.stdout:5/10: creat f4 x:0 0 0 2026-03-10T12:37:32.815 INFO:tasks.workunit.client.1.vm07.stdout:0/234: rename d0/l33 to d0/d14/d1a/d2f/l4a 0 2026-03-10T12:37:32.815 INFO:tasks.workunit.client.0.vm00.stdout:3/10: write f3 [259455,32115] 0 2026-03-10T12:37:32.815 INFO:tasks.workunit.client.0.vm00.stdout:6/3: link l0 l1 0 2026-03-10T12:37:32.816 INFO:tasks.workunit.client.0.vm00.stdout:6/4: truncate - no filename 2026-03-10T12:37:32.816 INFO:tasks.workunit.client.0.vm00.stdout:0/11: creat f2 x:0 0 0 2026-03-10T12:37:32.816 INFO:tasks.workunit.client.0.vm00.stdout:7/7: link f0 f1 0 2026-03-10T12:37:32.818 INFO:tasks.workunit.client.0.vm00.stdout:9/1: mkdir d0 0 2026-03-10T12:37:32.818 INFO:tasks.workunit.client.0.vm00.stdout:9/2: chown d0 9639 1 2026-03-10T12:37:32.821 INFO:tasks.workunit.client.0.vm00.stdout:5/11: rename f1 to f5 0 2026-03-10T12:37:32.821 INFO:tasks.workunit.client.1.vm07.stdout:8/195: write d1/f2 [1420333,87889] 0 2026-03-10T12:37:32.821 INFO:tasks.workunit.client.0.vm00.stdout:5/12: write f4 [244925,65125] 0 2026-03-10T12:37:32.823 INFO:tasks.workunit.client.0.vm00.stdout:3/11: symlink l4 0 2026-03-10T12:37:32.823 INFO:tasks.workunit.client.1.vm07.stdout:5/191: sync 2026-03-10T12:37:32.825 INFO:tasks.workunit.client.1.vm07.stdout:8/196: dwrite d1/d3/f1f [4194304,4194304] 0 2026-03-10T12:37:32.825 INFO:tasks.workunit.client.0.vm00.stdout:3/12: truncate f3 1000390 0 2026-03-10T12:37:32.826 
INFO:tasks.workunit.client.0.vm00.stdout:7/8: write f0 [503836,79276] 0 2026-03-10T12:37:32.826 INFO:tasks.workunit.client.0.vm00.stdout:7/9: chown f0 65 1 2026-03-10T12:37:32.832 INFO:tasks.workunit.client.0.vm00.stdout:0/12: dwrite f0 [0,4194304] 0 2026-03-10T12:37:32.837 INFO:tasks.workunit.client.1.vm07.stdout:0/235: link d0/f2e d0/d14/d1a/d1b/d3b/f4b 0 2026-03-10T12:37:32.838 INFO:tasks.workunit.client.0.vm00.stdout:5/13: unlink l3 0 2026-03-10T12:37:32.838 INFO:tasks.workunit.client.0.vm00.stdout:5/14: rmdir - no directory 2026-03-10T12:37:32.838 INFO:tasks.workunit.client.0.vm00.stdout:5/15: readlink - no filename 2026-03-10T12:37:32.839 INFO:tasks.workunit.client.0.vm00.stdout:5/16: chown f5 42680324 1 2026-03-10T12:37:32.839 INFO:tasks.workunit.client.0.vm00.stdout:9/3: mkdir d0/d1 0 2026-03-10T12:37:32.839 INFO:tasks.workunit.client.0.vm00.stdout:9/4: write - no filename 2026-03-10T12:37:32.840 INFO:tasks.workunit.client.1.vm07.stdout:8/197: creat d1/d3/d11/f43 x:0 0 0 2026-03-10T12:37:32.843 INFO:tasks.workunit.client.0.vm00.stdout:6/5: chown l1 53140 1 2026-03-10T12:37:32.843 INFO:tasks.workunit.client.0.vm00.stdout:6/6: fdatasync - no filename 2026-03-10T12:37:32.843 INFO:tasks.workunit.client.0.vm00.stdout:6/7: dwrite - no filename 2026-03-10T12:37:32.843 INFO:tasks.workunit.client.0.vm00.stdout:6/8: fdatasync - no filename 2026-03-10T12:37:32.843 INFO:tasks.workunit.client.1.vm07.stdout:8/198: dwrite d1/d3/d18/f32 [0,4194304] 0 2026-03-10T12:37:32.844 INFO:tasks.workunit.client.0.vm00.stdout:7/10: creat f2 x:0 0 0 2026-03-10T12:37:32.844 INFO:tasks.workunit.client.1.vm07.stdout:5/192: rename d0/ce to d0/d22/c44 0 2026-03-10T12:37:32.845 INFO:tasks.workunit.client.0.vm00.stdout:5/17: rename c0 to c6 0 2026-03-10T12:37:32.847 INFO:tasks.workunit.client.1.vm07.stdout:0/236: symlink d0/d14/d1a/l4c 0 2026-03-10T12:37:32.855 INFO:tasks.workunit.client.1.vm07.stdout:5/193: symlink d0/d22/l45 0 2026-03-10T12:37:32.860 
INFO:tasks.workunit.client.0.vm00.stdout:9/5: mknod d0/c2 0 2026-03-10T12:37:32.861 INFO:tasks.workunit.client.0.vm00.stdout:9/6: dwrite - no filename 2026-03-10T12:37:32.861 INFO:tasks.workunit.client.0.vm00.stdout:6/9: mkdir d2 0 2026-03-10T12:37:32.861 INFO:tasks.workunit.client.0.vm00.stdout:6/10: dread - no filename 2026-03-10T12:37:32.861 INFO:tasks.workunit.client.1.vm07.stdout:5/194: symlink d0/d22/d18/d19/d2e/d3f/l46 0 2026-03-10T12:37:32.863 INFO:tasks.workunit.client.1.vm07.stdout:5/195: truncate d0/f2b 2224743 0 2026-03-10T12:37:32.867 INFO:tasks.workunit.client.1.vm07.stdout:5/196: dread d0/d22/d18/d19/d21/f2d [0,4194304] 0 2026-03-10T12:37:32.871 INFO:tasks.workunit.client.1.vm07.stdout:5/197: creat d0/f47 x:0 0 0 2026-03-10T12:37:32.872 INFO:tasks.workunit.client.0.vm00.stdout:9/7: creat d0/d1/f3 x:0 0 0 2026-03-10T12:37:32.872 INFO:tasks.workunit.client.0.vm00.stdout:9/8: truncate d0/d1/f3 924551 0 2026-03-10T12:37:32.873 INFO:tasks.workunit.client.0.vm00.stdout:9/9: write d0/d1/f3 [1693835,96577] 0 2026-03-10T12:37:32.873 INFO:tasks.workunit.client.0.vm00.stdout:9/10: chown d0/d1 12485044 1 2026-03-10T12:37:32.877 INFO:tasks.workunit.client.0.vm00.stdout:9/11: dwrite d0/d1/f3 [0,4194304] 0 2026-03-10T12:37:32.880 INFO:tasks.workunit.client.0.vm00.stdout:6/11: rename l0 to d2/l3 0 2026-03-10T12:37:32.881 INFO:tasks.workunit.client.0.vm00.stdout:9/12: unlink d0/c2 0 2026-03-10T12:37:32.882 INFO:tasks.workunit.client.0.vm00.stdout:9/13: write d0/d1/f3 [2560195,81562] 0 2026-03-10T12:37:32.884 INFO:tasks.workunit.client.0.vm00.stdout:6/12: mknod d2/c4 0 2026-03-10T12:37:32.908 INFO:tasks.workunit.client.0.vm00.stdout:6/13: write - no filename 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/14: write - no filename 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/15: chown l1 60815362 1 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/16: dread - no filename 2026-03-10T12:37:32.909 
INFO:tasks.workunit.client.0.vm00.stdout:6/17: chown d2/c4 37856 1 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/18: dwrite - no filename 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/19: write - no filename 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/20: truncate - no filename 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/21: fdatasync - no filename 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/22: link d2/c4 d2/c5 0 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/23: fdatasync - no filename 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/24: link d2/c5 d2/c6 0 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/25: symlink d2/l7 0 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/26: link d2/c4 d2/c8 0 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/27: truncate - no filename 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/28: dwrite - no filename 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/29: creat d2/f9 x:0 0 0 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/30: mkdir d2/da 0 2026-03-10T12:37:32.909 INFO:tasks.workunit.client.0.vm00.stdout:6/31: write d2/f9 [233996,115707] 0 2026-03-10T12:37:32.919 INFO:tasks.workunit.client.1.vm07.stdout:5/198: fdatasync d0/d22/d18/d19/d21/f2f 0 2026-03-10T12:37:32.925 INFO:tasks.workunit.client.1.vm07.stdout:5/199: rename d0/d22/d18/d30/c31 to d0/d22/d18/d19/c48 0 2026-03-10T12:37:32.925 INFO:tasks.workunit.client.1.vm07.stdout:5/200: chown d0/d22/d18/d19/f2c 4164730 1 2026-03-10T12:37:32.925 INFO:tasks.workunit.client.1.vm07.stdout:5/201: dwrite d0/d22/d18/d19/d21/f42 [0,4194304] 0 2026-03-10T12:37:32.933 INFO:tasks.workunit.client.1.vm07.stdout:5/202: symlink d0/d22/d18/d19/d2e/l49 0 2026-03-10T12:37:32.935 INFO:tasks.workunit.client.1.vm07.stdout:5/203: mkdir d0/d22/d4a 
0 2026-03-10T12:37:32.936 INFO:tasks.workunit.client.1.vm07.stdout:5/204: write d0/fa [1012345,123409] 0 2026-03-10T12:37:32.940 INFO:tasks.workunit.client.1.vm07.stdout:5/205: dwrite d0/fd [0,4194304] 0 2026-03-10T12:37:32.943 INFO:tasks.workunit.client.1.vm07.stdout:5/206: symlink d0/d22/d4a/l4b 0 2026-03-10T12:37:32.943 INFO:tasks.workunit.client.1.vm07.stdout:5/207: fdatasync d0/f47 0 2026-03-10T12:37:33.043 INFO:tasks.workunit.client.0.vm00.stdout:3/13: unlink f3 0 2026-03-10T12:37:33.043 INFO:tasks.workunit.client.0.vm00.stdout:3/14: fsync - no filename 2026-03-10T12:37:33.043 INFO:tasks.workunit.client.0.vm00.stdout:3/15: creat f5 x:0 0 0 2026-03-10T12:37:33.044 INFO:tasks.workunit.client.0.vm00.stdout:3/16: symlink l6 0 2026-03-10T12:37:33.044 INFO:tasks.workunit.client.0.vm00.stdout:3/17: chown f5 5144 1 2026-03-10T12:37:33.045 INFO:tasks.workunit.client.0.vm00.stdout:3/18: rename f5 to f7 0 2026-03-10T12:37:33.046 INFO:tasks.workunit.client.0.vm00.stdout:3/19: mknod c8 0 2026-03-10T12:37:33.047 INFO:tasks.workunit.client.0.vm00.stdout:3/20: creat f9 x:0 0 0 2026-03-10T12:37:33.047 INFO:tasks.workunit.client.0.vm00.stdout:3/21: mknod ca 0 2026-03-10T12:37:33.048 INFO:tasks.workunit.client.0.vm00.stdout:3/22: creat fb x:0 0 0 2026-03-10T12:37:33.048 INFO:tasks.workunit.client.0.vm00.stdout:3/23: creat fc x:0 0 0 2026-03-10T12:37:33.274 INFO:tasks.workunit.client.0.vm00.stdout:5/18: unlink f5 0 2026-03-10T12:37:33.286 INFO:tasks.workunit.client.0.vm00.stdout:5/19: mknod c7 0 2026-03-10T12:37:33.286 INFO:tasks.workunit.client.0.vm00.stdout:5/20: chown c7 5557 1 2026-03-10T12:37:33.307 INFO:tasks.workunit.client.0.vm00.stdout:5/21: dread f4 [0,4194304] 0 2026-03-10T12:37:33.328 INFO:tasks.workunit.client.1.vm07.stdout:3/181: truncate f1 1526509 0 2026-03-10T12:37:33.328 INFO:tasks.workunit.client.1.vm07.stdout:3/182: dread - dc/d18/d24/f3a zero size 2026-03-10T12:37:33.328 INFO:tasks.workunit.client.1.vm07.stdout:3/183: dwrite dc/d18/d24/f3a [0,4194304] 0 
2026-03-10T12:37:33.328 INFO:tasks.workunit.client.1.vm07.stdout:3/184: dread - dc/d18/d24/f3f zero size 2026-03-10T12:37:33.328 INFO:tasks.workunit.client.0.vm00.stdout:5/22: dwrite f2 [0,4194304] 0 2026-03-10T12:37:33.328 INFO:tasks.workunit.client.0.vm00.stdout:7/11: getdents . 0 2026-03-10T12:37:33.328 INFO:tasks.workunit.client.0.vm00.stdout:5/23: link c6 c8 0 2026-03-10T12:37:33.328 INFO:tasks.workunit.client.0.vm00.stdout:5/24: write f2 [1133555,104352] 0 2026-03-10T12:37:33.328 INFO:tasks.workunit.client.0.vm00.stdout:5/25: dwrite f4 [0,4194304] 0 2026-03-10T12:37:33.330 INFO:tasks.workunit.client.1.vm07.stdout:3/185: rename dc/dd/l11 to dc/d18/d2d/d3d/l44 0 2026-03-10T12:37:33.333 INFO:tasks.workunit.client.1.vm07.stdout:3/186: unlink dc/l42 0 2026-03-10T12:37:33.333 INFO:tasks.workunit.client.0.vm00.stdout:5/26: unlink c8 0 2026-03-10T12:37:33.333 INFO:tasks.workunit.client.0.vm00.stdout:5/27: read f4 [788975,58820] 0 2026-03-10T12:37:33.342 INFO:tasks.workunit.client.1.vm07.stdout:3/187: chown dc/d18/d24/f3e 7923 1 2026-03-10T12:37:33.342 INFO:tasks.workunit.client.0.vm00.stdout:5/28: symlink l9 0 2026-03-10T12:37:33.342 INFO:tasks.workunit.client.1.vm07.stdout:3/188: mkdir dc/dd/d1f/d45 0 2026-03-10T12:37:33.342 INFO:tasks.workunit.client.1.vm07.stdout:3/189: link dc/dd/f20 dc/dd/d28/f46 0 2026-03-10T12:37:33.342 INFO:tasks.workunit.client.1.vm07.stdout:3/190: dread - dc/dd/f41 zero size 2026-03-10T12:37:33.342 INFO:tasks.workunit.client.1.vm07.stdout:3/191: symlink dc/d18/d2d/l47 0 2026-03-10T12:37:33.342 INFO:tasks.workunit.client.1.vm07.stdout:3/192: symlink dc/dd/l48 0 2026-03-10T12:37:33.342 INFO:tasks.workunit.client.1.vm07.stdout:3/193: write dc/dd/d1f/f27 [698194,130741] 0 2026-03-10T12:37:33.344 INFO:tasks.workunit.client.1.vm07.stdout:3/194: dwrite dc/d18/d24/f3e [0,4194304] 0 2026-03-10T12:37:33.346 INFO:tasks.workunit.client.1.vm07.stdout:3/195: creat dc/d18/d24/f49 x:0 0 0 2026-03-10T12:37:33.354 
INFO:tasks.workunit.client.1.vm07.stdout:3/196: read dc/d18/d24/f3e [3132149,48885] 0 2026-03-10T12:37:33.375 INFO:tasks.workunit.client.0.vm00.stdout:6/32: fdatasync d2/f9 0 2026-03-10T12:37:33.397 INFO:tasks.workunit.client.1.vm07.stdout:2/127: fsync d0/d29/f2a 0 2026-03-10T12:37:33.397 INFO:tasks.workunit.client.1.vm07.stdout:9/122: write d5/d16/d18/f1e [1002642,44408] 0 2026-03-10T12:37:33.401 INFO:tasks.workunit.client.1.vm07.stdout:7/167: truncate d0/f29 860683 0 2026-03-10T12:37:33.401 INFO:tasks.workunit.client.1.vm07.stdout:9/123: dread d5/fb [4194304,4194304] 0 2026-03-10T12:37:33.402 INFO:tasks.workunit.client.0.vm00.stdout:4/8: fsync f0 0 2026-03-10T12:37:33.404 INFO:tasks.workunit.client.1.vm07.stdout:2/128: rename d0/d19/f25 to d0/d19/d1f/d20/f2b 0 2026-03-10T12:37:33.405 INFO:tasks.workunit.client.1.vm07.stdout:1/188: write d9/fd [392526,85692] 0 2026-03-10T12:37:33.405 INFO:tasks.workunit.client.1.vm07.stdout:2/129: truncate d0/d19/d26/f27 743470 0 2026-03-10T12:37:33.408 INFO:tasks.workunit.client.1.vm07.stdout:4/172: truncate d0/d4/d10/f16 892949 0 2026-03-10T12:37:33.421 INFO:tasks.workunit.client.1.vm07.stdout:7/168: chown d0/l1b 2802523 1 2026-03-10T12:37:33.421 INFO:tasks.workunit.client.1.vm07.stdout:4/173: stat d0/d4/d10/f36 0 2026-03-10T12:37:33.421 INFO:tasks.workunit.client.1.vm07.stdout:2/130: creat d0/d19/f2c x:0 0 0 2026-03-10T12:37:33.421 INFO:tasks.workunit.client.1.vm07.stdout:1/189: dwrite d9/f36 [0,4194304] 0 2026-03-10T12:37:33.421 INFO:tasks.workunit.client.1.vm07.stdout:4/174: rmdir d0/d19/d1f 39 2026-03-10T12:37:33.421 INFO:tasks.workunit.client.1.vm07.stdout:7/169: read - d0/f23 zero size 2026-03-10T12:37:33.421 INFO:tasks.workunit.client.1.vm07.stdout:4/175: dread - d0/d4/d10/f36 zero size 2026-03-10T12:37:33.426 INFO:tasks.workunit.client.1.vm07.stdout:7/170: dwrite d0/f28 [0,4194304] 0 2026-03-10T12:37:33.428 INFO:tasks.workunit.client.1.vm07.stdout:5/208: fsync d0/d22/d18/d19/d21/f42 0 2026-03-10T12:37:33.429 
INFO:tasks.workunit.client.1.vm07.stdout:1/190: mkdir d9/df/d29/d2b/d3d 0 2026-03-10T12:37:33.431 INFO:tasks.workunit.client.1.vm07.stdout:4/176: chown d0/d19/d1f/d2b/d2d 95179295 1 2026-03-10T12:37:33.432 INFO:tasks.workunit.client.1.vm07.stdout:4/177: fsync d0/d4/d5/d34/f37 0 2026-03-10T12:37:33.435 INFO:tasks.workunit.client.1.vm07.stdout:2/131: creat d0/f2d x:0 0 0 2026-03-10T12:37:33.436 INFO:tasks.workunit.client.1.vm07.stdout:1/191: symlink d9/df/d29/d2b/d31/l3e 0 2026-03-10T12:37:33.440 INFO:tasks.workunit.client.0.vm00.stdout:7/12: dread f0 [0,4194304] 0 2026-03-10T12:37:33.440 INFO:tasks.workunit.client.0.vm00.stdout:7/13: read - f2 zero size 2026-03-10T12:37:33.441 INFO:tasks.workunit.client.0.vm00.stdout:7/14: chown f2 389165720 1 2026-03-10T12:37:33.447 INFO:tasks.workunit.client.1.vm07.stdout:4/178: mknod d0/d4/d10/d18/c3a 0 2026-03-10T12:37:33.447 INFO:tasks.workunit.client.1.vm07.stdout:4/179: write d0/d4/d10/f39 [476952,94968] 0 2026-03-10T12:37:33.450 INFO:tasks.workunit.client.1.vm07.stdout:6/150: dread d1/d4/f11 [0,4194304] 0 2026-03-10T12:37:33.451 INFO:tasks.workunit.client.1.vm07.stdout:1/192: symlink d9/df/d29/d2c/l3f 0 2026-03-10T12:37:33.452 INFO:tasks.workunit.client.1.vm07.stdout:1/193: fdatasync d9/df/d29/d2b/f32 0 2026-03-10T12:37:33.457 INFO:tasks.workunit.client.1.vm07.stdout:2/132: link d0/f14 d0/d19/d26/f2e 0 2026-03-10T12:37:33.460 INFO:tasks.workunit.client.1.vm07.stdout:4/180: rename d0/d4/d10/f2f to d0/d19/d1f/d2b/f3b 0 2026-03-10T12:37:33.464 INFO:tasks.workunit.client.1.vm07.stdout:1/194: truncate d9/df/f15 170713 0 2026-03-10T12:37:33.464 INFO:tasks.workunit.client.0.vm00.stdout:9/14: dwrite d0/d1/f3 [4194304,4194304] 0 2026-03-10T12:37:33.465 INFO:tasks.workunit.client.1.vm07.stdout:8/199: truncate d1/f3f 4047513 0 2026-03-10T12:37:33.465 INFO:tasks.workunit.client.1.vm07.stdout:8/200: readlink d1/d3/d11/l1a 0 2026-03-10T12:37:33.467 INFO:tasks.workunit.client.1.vm07.stdout:0/237: truncate d0/f21 627062 0 
2026-03-10T12:37:33.476 INFO:tasks.workunit.client.1.vm07.stdout:6/151: getdents d1/d4/d6/d16/d1a/d33 0 2026-03-10T12:37:33.477 INFO:tasks.workunit.client.1.vm07.stdout:6/152: dread - d1/d4/d6/d16/d1a/f29 zero size 2026-03-10T12:37:33.479 INFO:tasks.workunit.client.1.vm07.stdout:1/195: chown d9/df/d29/d2b/c37 15909 1 2026-03-10T12:37:33.483 INFO:tasks.workunit.client.1.vm07.stdout:1/196: dwrite d9/df/f24 [0,4194304] 0 2026-03-10T12:37:33.494 INFO:tasks.workunit.client.1.vm07.stdout:0/238: unlink d0/fd 0 2026-03-10T12:37:33.498 INFO:tasks.workunit.client.1.vm07.stdout:1/197: fdatasync d9/df/f13 0 2026-03-10T12:37:33.502 INFO:tasks.workunit.client.1.vm07.stdout:8/201: mknod d1/c44 0 2026-03-10T12:37:33.504 INFO:tasks.workunit.client.1.vm07.stdout:0/239: rmdir d0/d14/d1a/d1b 39 2026-03-10T12:37:33.512 INFO:tasks.workunit.client.1.vm07.stdout:1/198: symlink d9/df/d29/d2c/l40 0 2026-03-10T12:37:33.513 INFO:tasks.workunit.client.1.vm07.stdout:1/199: fdatasync d9/df/f26 0 2026-03-10T12:37:33.525 INFO:tasks.workunit.client.1.vm07.stdout:8/202: dread d1/fc [0,4194304] 0 2026-03-10T12:37:33.525 INFO:tasks.workunit.client.1.vm07.stdout:8/203: stat d1/d3/c23 0 2026-03-10T12:37:33.526 INFO:tasks.workunit.client.1.vm07.stdout:8/204: dread d1/f19 [0,4194304] 0 2026-03-10T12:37:33.529 INFO:tasks.workunit.client.1.vm07.stdout:2/133: rename d0/f1 to d0/d19/d1f/f2f 0 2026-03-10T12:37:33.530 INFO:tasks.workunit.client.1.vm07.stdout:6/153: link d1/d9/f22 d1/f34 0 2026-03-10T12:37:33.531 INFO:tasks.workunit.client.1.vm07.stdout:1/200: rmdir d9/df/d29 39 2026-03-10T12:37:33.538 INFO:tasks.workunit.client.0.vm00.stdout:5/29: truncate f4 4013593 0 2026-03-10T12:37:33.549 INFO:tasks.workunit.client.1.vm07.stdout:3/197: rmdir dc 39 2026-03-10T12:37:33.553 INFO:tasks.workunit.client.1.vm07.stdout:9/124: dwrite d5/d13/f1b [0,4194304] 0 2026-03-10T12:37:33.555 INFO:tasks.workunit.client.1.vm07.stdout:2/134: mknod d0/d19/c30 0 2026-03-10T12:37:33.556 
INFO:tasks.workunit.client.0.vm00.stdout:6/33: rmdir d2 39 2026-03-10T12:37:33.563 INFO:tasks.workunit.client.0.vm00.stdout:7/15: creat f3 x:0 0 0 2026-03-10T12:37:33.563 INFO:tasks.workunit.client.0.vm00.stdout:7/16: chown f3 2028049 1 2026-03-10T12:37:33.564 INFO:tasks.workunit.client.1.vm07.stdout:3/198: readlink la 0 2026-03-10T12:37:33.565 INFO:tasks.workunit.client.0.vm00.stdout:9/15: creat d0/f4 x:0 0 0 2026-03-10T12:37:33.568 INFO:tasks.workunit.client.0.vm00.stdout:5/30: mknod ca 0 2026-03-10T12:37:33.598 INFO:tasks.workunit.client.0.vm00.stdout:6/34: dwrite d2/f9 [0,4194304] 0 2026-03-10T12:37:33.598 INFO:tasks.workunit.client.0.vm00.stdout:7/17: rename f3 to f4 0 2026-03-10T12:37:33.598 INFO:tasks.workunit.client.0.vm00.stdout:7/18: rmdir - no directory 2026-03-10T12:37:33.598 INFO:tasks.workunit.client.0.vm00.stdout:9/16: rename d0/d1 to d0/d5 0 2026-03-10T12:37:33.598 INFO:tasks.workunit.client.0.vm00.stdout:9/17: dwrite d0/f4 [0,4194304] 0 2026-03-10T12:37:33.599 INFO:tasks.workunit.client.0.vm00.stdout:9/18: dread d0/f4 [0,4194304] 0 2026-03-10T12:37:33.599 INFO:tasks.workunit.client.0.vm00.stdout:6/35: mknod d2/cb 0 2026-03-10T12:37:33.599 INFO:tasks.workunit.client.0.vm00.stdout:7/19: rename f2 to f5 0 2026-03-10T12:37:33.599 INFO:tasks.workunit.client.0.vm00.stdout:4/9: getdents . 
0
2026-03-10T12:37:33.599 INFO:tasks.workunit.client.0.vm00.stdout:6/36: write d2/f9 [3609988,103899] 0
2026-03-10T12:37:33.599 INFO:tasks.workunit.client.1.vm07.stdout:2/135: mknod d0/d19/d1f/d20/c31 0
2026-03-10T12:37:33.599 INFO:tasks.workunit.client.1.vm07.stdout:6/154: mknod d1/d4/d6/d16/d1a/d2c/c35 0
2026-03-10T12:37:33.599 INFO:tasks.workunit.client.1.vm07.stdout:6/155: fsync d1/d4/f19 0
2026-03-10T12:37:33.599 INFO:tasks.workunit.client.1.vm07.stdout:1/201: mknod d9/df/d29/d2b/d31/c41 0
2026-03-10T12:37:33.599 INFO:tasks.workunit.client.1.vm07.stdout:2/136: creat d0/d29/f32 x:0 0 0
2026-03-10T12:37:33.599 INFO:tasks.workunit.client.1.vm07.stdout:6/156: link d1/d4/f31 d1/d4/d6/f36 0
2026-03-10T12:37:33.599 INFO:tasks.workunit.client.1.vm07.stdout:6/157: read - d1/d4/d6/f30 zero size
2026-03-10T12:37:33.599 INFO:tasks.workunit.client.1.vm07.stdout:1/202: symlink d9/l42 0
2026-03-10T12:37:33.599 INFO:tasks.workunit.client.1.vm07.stdout:3/199: symlink dc/dd/l4a 0
2026-03-10T12:37:33.599 INFO:tasks.workunit.client.1.vm07.stdout:1/203: creat d9/df/d29/d2b/d3d/f43 x:0 0 0
2026-03-10T12:37:33.599 INFO:tasks.workunit.client.1.vm07.stdout:1/204: symlink d9/df/d29/d2b/d30/l44 0
2026-03-10T12:37:33.599 INFO:tasks.workunit.client.1.vm07.stdout:6/158: getdents d1/d9 0
2026-03-10T12:37:33.599 INFO:tasks.workunit.client.1.vm07.stdout:6/159: creat d1/d4/d6/d16/d1a/d33/f37 x:0 0 0
2026-03-10T12:37:33.599 INFO:tasks.workunit.client.0.vm00.stdout:9/19: mknod d0/c6 0
2026-03-10T12:37:33.601 INFO:tasks.workunit.client.0.vm00.stdout:4/10: rename f1 to f3 0
2026-03-10T12:37:33.602 INFO:tasks.workunit.client.0.vm00.stdout:4/11: stat f3 0
2026-03-10T12:37:33.603 INFO:tasks.workunit.client.0.vm00.stdout:9/20: symlink d0/d5/l7 0
2026-03-10T12:37:33.605 INFO:tasks.workunit.client.0.vm00.stdout:4/12: mknod c4 0
2026-03-10T12:37:33.609 INFO:tasks.workunit.client.0.vm00.stdout:4/13: dwrite f0 [0,4194304] 0
2026-03-10T12:37:33.611 INFO:tasks.workunit.client.0.vm00.stdout:4/14: symlink l5 0
2026-03-10T12:37:33.612 INFO:tasks.workunit.client.0.vm00.stdout:4/15: write f0 [3721550,103893] 0
2026-03-10T12:37:33.612 INFO:tasks.workunit.client.0.vm00.stdout:4/16: rmdir - no directory
2026-03-10T12:37:33.613 INFO:tasks.workunit.client.0.vm00.stdout:4/17: chown l5 2 1
2026-03-10T12:37:33.616 INFO:tasks.workunit.client.0.vm00.stdout:4/18: dwrite f3 [4194304,4194304] 0
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/19: mknod c6 0
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/20: dread f0 [0,4194304] 0
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/21: mknod c7 0
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/22: rmdir - no directory
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/23: creat f8 x:0 0 0
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/24: rename f0 to f9 0
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/25: dread - f8 zero size
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/26: link f3 fa 0
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/27: truncate f8 356154 0
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/28: creat fb x:0 0 0
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/29: symlink lc 0
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/30: dread f9 [0,4194304] 0
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/31: write fb [441641,36740] 0
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/32: dread fa [4194304,4194304] 0
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/33: mkdir dd 0
2026-03-10T12:37:33.728 INFO:tasks.workunit.client.0.vm00.stdout:4/34: rmdir dd 0
2026-03-10T12:37:33.955 INFO:tasks.workunit.client.0.vm00.stdout:2/1: sync
2026-03-10T12:37:33.955 INFO:tasks.workunit.client.0.vm00.stdout:8/3: sync
2026-03-10T12:37:33.955 INFO:tasks.workunit.client.0.vm00.stdout:8/4: dwrite - no filename
2026-03-10T12:37:33.955 INFO:tasks.workunit.client.0.vm00.stdout:3/24: sync
2026-03-10T12:37:33.955 INFO:tasks.workunit.client.0.vm00.stdout:1/11: sync
2026-03-10T12:37:33.955 INFO:tasks.workunit.client.0.vm00.stdout:8/5: chown . 486 1
2026-03-10T12:37:33.955 INFO:tasks.workunit.client.0.vm00.stdout:3/25: chown c8 3302490 1
2026-03-10T12:37:33.963 INFO:tasks.workunit.client.0.vm00.stdout:1/12: creat f3 x:0 0 0
2026-03-10T12:37:33.964 INFO:tasks.workunit.client.0.vm00.stdout:3/26: mkdir dd 0
2026-03-10T12:37:33.966 INFO:tasks.workunit.client.0.vm00.stdout:2/2: symlink l0 0
2026-03-10T12:37:33.968 INFO:tasks.workunit.client.0.vm00.stdout:8/6: mkdir d0 0
2026-03-10T12:37:33.968 INFO:tasks.workunit.client.1.vm07.stdout:7/171: truncate d0/fc 1135819 0
2026-03-10T12:37:33.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.968+0000 7fcefa600700 1 -- 192.168.123.100:0/4239285865 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcef410c080 msgr2=0x7fcef410c4f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:37:33.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.968+0000 7fcefa600700 1 --2- 192.168.123.100:0/4239285865 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcef410c080 0x7fcef410c4f0 secure :-1 s=READY pgs=326 cs=0 l=1 rev1=1 crypto rx=0x7fcee8009b00 tx=0x7fcee8009e10 comp rx=0 tx=0).stop
2026-03-10T12:37:33.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.968+0000 7fcefa600700 1 -- 192.168.123.100:0/4239285865 shutdown_connections
2026-03-10T12:37:33.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.968+0000 7fcefa600700 1 --2- 192.168.123.100:0/4239285865 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcef410c080 0x7fcef410c4f0 unknown :-1 s=CLOSED pgs=326 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:37:33.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.968+0000 7fcefa600700 1 --2- 192.168.123.100:0/4239285865 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcef410b150 0x7fcef410b560 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:37:33.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.968+0000 7fcefa600700 1 -- 192.168.123.100:0/4239285865 >> 192.168.123.100:0/4239285865 conn(0x7fcef406ac40 msgr2=0x7fcef406b0a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:37:33.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.968+0000 7fcefa600700 1 -- 192.168.123.100:0/4239285865 shutdown_connections
2026-03-10T12:37:33.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.969+0000 7fcefa600700 1 -- 192.168.123.100:0/4239285865 wait complete.
2026-03-10T12:37:33.970 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.969+0000 7fcefa600700 1 Processor -- start
2026-03-10T12:37:33.970 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.969+0000 7fcefa600700 1 -- start start
2026-03-10T12:37:33.970 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.970+0000 7fcefa600700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcef410b150 0x7fcef41a0a60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:37:33.970 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.970+0000 7fcefa600700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcef410c080 0x7fcef41a0fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:37:33.970 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.970+0000 7fcefa600700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcef41a15c0 con 0x7fcef410c080
2026-03-10T12:37:33.970 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.970+0000 7fcefa600700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcef41a1700 con 0x7fcef410b150
2026-03-10T12:37:33.971 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.970+0000 7fcef37fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcef410c080 0x7fcef41a0fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:37:33.971 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.970+0000 7fcef37fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcef410c080 0x7fcef41a0fa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:59216/0 (socket says 192.168.123.100:59216)
2026-03-10T12:37:33.971 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.970+0000 7fcef37fe700 1 -- 192.168.123.100:0/1360128456 learned_addr learned my addr 192.168.123.100:0/1360128456 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:37:33.971 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.970+0000 7fcef37fe700 1 -- 192.168.123.100:0/1360128456 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcef410b150 msgr2=0x7fcef41a0a60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:37:33.971 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.970+0000 7fcef3fff700 1 --2- 192.168.123.100:0/1360128456 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcef410b150 0x7fcef41a0a60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:37:33.971 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.970+0000 7fcef37fe700 1 --2- 192.168.123.100:0/1360128456 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcef410b150 0x7fcef41a0a60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:37:33.971 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.970+0000 7fcef37fe700 1 -- 192.168.123.100:0/1360128456 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcee80097e0 con 0x7fcef410c080
2026-03-10T12:37:33.971 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.971+0000 7fcef3fff700 1 --2- 192.168.123.100:0/1360128456 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcef410b150 0x7fcef41a0a60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-10T12:37:33.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.971+0000 7fcef37fe700 1 --2- 192.168.123.100:0/1360128456 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcef410c080 0x7fcef41a0fa0 secure :-1 s=READY pgs=327 cs=0 l=1 rev1=1 crypto rx=0x7fcee80052f0 tx=0x7fcee8003730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:37:33.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.971+0000 7fcef17fa700 1 -- 192.168.123.100:0/1360128456 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcee801d070 con 0x7fcef410c080
2026-03-10T12:37:33.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.971+0000 7fcefa600700 1 -- 192.168.123.100:0/1360128456 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcef41a6150 con 0x7fcef410c080
2026-03-10T12:37:33.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.971+0000 7fcefa600700 1 -- 192.168.123.100:0/1360128456 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcef41a6640 con 0x7fcef410c080
2026-03-10T12:37:33.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.971+0000 7fcef17fa700 1 -- 192.168.123.100:0/1360128456 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcee8005470 con 0x7fcef410c080
2026-03-10T12:37:33.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.972+0000 7fcef17fa700 1 -- 192.168.123.100:0/1360128456 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcee800e3e0 con 0x7fcef410c080
2026-03-10T12:37:33.973 INFO:tasks.workunit.client.1.vm07.stdout:7/172: dwrite d0/f14 [0,4194304] 0
2026-03-10T12:37:33.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.972+0000 7fcefa600700 1 -- 192.168.123.100:0/1360128456 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcee0005320 con 0x7fcef410c080
2026-03-10T12:37:33.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.973+0000 7fcef17fa700 1 -- 192.168.123.100:0/1360128456 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fcee8003d40 con 0x7fcef410c080
2026-03-10T12:37:33.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.973+0000 7fcef17fa700 1 --2- 192.168.123.100:0/1360128456 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcedc06c4d0 0x7fcedc06e980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:37:33.976 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.973+0000 7fcef17fa700 1 -- 192.168.123.100:0/1360128456 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fcee808c9b0 con 0x7fcef410c080
2026-03-10T12:37:33.976 INFO:tasks.workunit.client.1.vm07.stdout:7/173: readlink d0/l1b 0
2026-03-10T12:37:33.976 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.976+0000 7fcef3fff700 1 --2- 192.168.123.100:0/1360128456 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcedc06c4d0 0x7fcedc06e980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:37:33.977 INFO:tasks.workunit.client.1.vm07.stdout:7/174: unlink d0/f25 0
2026-03-10T12:37:33.978 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.977+0000 7fcef3fff700 1 --2- 192.168.123.100:0/1360128456 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcedc06c4d0 0x7fcedc06e980 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7fcee4005fd0 tx=0x7fcee4005ee0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:37:33.978 INFO:tasks.workunit.client.0.vm00.stdout:1/13: rename f1 to f4 0
2026-03-10T12:37:33.978 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:33.978+0000 7fcef17fa700 1 -- 192.168.123.100:0/1360128456 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fcee805acf0 con 0x7fcef410c080
2026-03-10T12:37:33.980 INFO:tasks.workunit.client.1.vm07.stdout:7/175: link d0/f26 d0/f2b 0
2026-03-10T12:37:33.980 INFO:tasks.workunit.client.1.vm07.stdout:7/176: chown d0/l1b 848 1
2026-03-10T12:37:33.981 INFO:tasks.workunit.client.0.vm00.stdout:2/3: creat f1 x:0 0 0
2026-03-10T12:37:33.981 INFO:tasks.workunit.client.0.vm00.stdout:2/4: dread - f1 zero size
2026-03-10T12:37:33.982 INFO:tasks.workunit.client.1.vm07.stdout:7/177: symlink d0/l2c 0
2026-03-10T12:37:33.984 INFO:tasks.workunit.client.1.vm07.stdout:7/178: link d0/l1b d0/l2d 0
2026-03-10T12:37:33.989 INFO:tasks.workunit.client.1.vm07.stdout:5/209: write d0/f1e [3043791,29763] 0
2026-03-10T12:37:33.989 INFO:tasks.workunit.client.0.vm00.stdout:2/5: dwrite f1 [0,4194304] 0
2026-03-10T12:37:33.989 INFO:tasks.workunit.client.0.vm00.stdout:2/6: readlink l0 0
2026-03-10T12:37:33.995 INFO:tasks.workunit.client.1.vm07.stdout:7/179: symlink d0/l2e 0
2026-03-10T12:37:33.996 INFO:tasks.workunit.client.1.vm07.stdout:7/180: chown d0/f26 1760 1
2026-03-10T12:37:34.004 INFO:tasks.workunit.client.1.vm07.stdout:4/181: rename d0/d19/d1f to d0/d4/d10/d3c 0
2026-03-10T12:37:34.004 INFO:tasks.workunit.client.1.vm07.stdout:2/137: rename d0 to d0/d19/d1f/d20/d33 22
2026-03-10T12:37:34.005 INFO:tasks.workunit.client.1.vm07.stdout:2/138: fdatasync d0/d29/f32 0
2026-03-10T12:37:34.013 INFO:tasks.workunit.client.1.vm07.stdout:0/240: dread d0/f21 [0,4194304] 0
2026-03-10T12:37:34.022 INFO:tasks.workunit.client.1.vm07.stdout:4/182: symlink d0/d4/d10/d3c/d2b/d2d/l3d 0
2026-03-10T12:37:34.036 INFO:tasks.workunit.client.1.vm07.stdout:4/183: chown d0/d4 230245 1
2026-03-10T12:37:34.036 INFO:tasks.workunit.client.1.vm07.stdout:0/241: creat d0/d14/d1a/d2f/d31/f4d x:0 0 0
2026-03-10T12:37:34.036 INFO:tasks.workunit.client.1.vm07.stdout:2/139: truncate d0/d19/f1e 231466 0
2026-03-10T12:37:34.036 INFO:tasks.workunit.client.1.vm07.stdout:2/140: chown d0/lb 2 1
2026-03-10T12:37:34.036 INFO:tasks.workunit.client.0.vm00.stdout:1/14: dwrite f4 [0,4194304] 0
2026-03-10T12:37:34.036 INFO:tasks.workunit.client.0.vm00.stdout:3/27: mknod dd/ce 0
2026-03-10T12:37:34.036 INFO:tasks.workunit.client.0.vm00.stdout:2/7: creat f2 x:0 0 0
2026-03-10T12:37:34.036 INFO:tasks.workunit.client.0.vm00.stdout:2/8: write f2 [228953,115227] 0
2026-03-10T12:37:34.038 INFO:tasks.workunit.client.0.vm00.stdout:8/7: getdents d0 0
2026-03-10T12:37:34.038 INFO:tasks.workunit.client.0.vm00.stdout:8/8: dread - no filename
2026-03-10T12:37:34.038 INFO:tasks.workunit.client.0.vm00.stdout:8/9: truncate - no filename
2026-03-10T12:37:34.038 INFO:tasks.workunit.client.0.vm00.stdout:8/10: chown d0 749281 1
2026-03-10T12:37:34.038 INFO:tasks.workunit.client.0.vm00.stdout:8/11: dread - no filename
2026-03-10T12:37:34.038 INFO:tasks.workunit.client.0.vm00.stdout:8/12: unlink - no file
2026-03-10T12:37:34.038 INFO:tasks.workunit.client.0.vm00.stdout:8/13: write - no filename
2026-03-10T12:37:34.038 INFO:tasks.workunit.client.0.vm00.stdout:8/14: truncate - no filename
2026-03-10T12:37:34.038 INFO:tasks.workunit.client.0.vm00.stdout:8/15: dread - no filename
2026-03-10T12:37:34.038 INFO:tasks.workunit.client.0.vm00.stdout:8/16: fdatasync - no filename
2026-03-10T12:37:34.038 INFO:tasks.workunit.client.0.vm00.stdout:8/17: dwrite - no filename
2026-03-10T12:37:34.038 INFO:tasks.workunit.client.0.vm00.stdout:8/18: dread - no filename
2026-03-10T12:37:34.045 INFO:tasks.workunit.client.0.vm00.stdout:3/28: rename c8 to dd/cf 0
2026-03-10T12:37:34.045 INFO:tasks.workunit.client.0.vm00.stdout:3/29: dread - fb zero size
2026-03-10T12:37:34.045 INFO:tasks.workunit.client.0.vm00.stdout:3/30: write f9 [1011054,80295] 0
2026-03-10T12:37:34.045 INFO:tasks.workunit.client.0.vm00.stdout:3/31: stat l6 0
2026-03-10T12:37:34.045 INFO:tasks.workunit.client.0.vm00.stdout:3/32: chown dd 185941 1
2026-03-10T12:37:34.046 INFO:tasks.workunit.client.0.vm00.stdout:3/33: write fc [1016813,105412] 0
2026-03-10T12:37:34.046 INFO:tasks.workunit.client.0.vm00.stdout:3/34: stat l4 0
2026-03-10T12:37:34.047 INFO:tasks.workunit.client.0.vm00.stdout:3/35: write f9 [2079306,48497] 0
2026-03-10T12:37:34.051 INFO:tasks.workunit.client.0.vm00.stdout:1/15: link f4 f5 0
2026-03-10T12:37:34.054 INFO:tasks.workunit.client.0.vm00.stdout:3/36: rmdir dd 39
2026-03-10T12:37:34.054 INFO:tasks.workunit.client.0.vm00.stdout:3/37: fdatasync fb 0
2026-03-10T12:37:34.055 INFO:tasks.workunit.client.0.vm00.stdout:2/9: link f2 f3 0
2026-03-10T12:37:34.057 INFO:tasks.workunit.client.0.vm00.stdout:8/19: symlink d0/l1 0
2026-03-10T12:37:34.057 INFO:tasks.workunit.client.0.vm00.stdout:8/20: dread - no filename
2026-03-10T12:37:34.057 INFO:tasks.workunit.client.0.vm00.stdout:8/21: fdatasync - no filename
2026-03-10T12:37:34.057 INFO:tasks.workunit.client.0.vm00.stdout:8/22: dread - no filename
2026-03-10T12:37:34.058 INFO:tasks.workunit.client.0.vm00.stdout:1/16: mkdir d6 0
2026-03-10T12:37:34.060 INFO:tasks.workunit.client.0.vm00.stdout:2/10: mkdir d4 0
2026-03-10T12:37:34.062 INFO:tasks.workunit.client.0.vm00.stdout:8/23: rmdir d0 39
2026-03-10T12:37:34.067 INFO:tasks.workunit.client.0.vm00.stdout:1/17: rmdir d6 0
2026-03-10T12:37:34.069 INFO:tasks.workunit.client.0.vm00.stdout:1/18: mknod c7 0
2026-03-10T12:37:34.069 INFO:tasks.workunit.client.0.vm00.stdout:1/19: read f4 [2491645,127973] 0
2026-03-10T12:37:34.074 INFO:tasks.workunit.client.1.vm07.stdout:4/184: dread d0/d4/d5/da/f15 [8388608,4194304] 0
2026-03-10T12:37:34.075 INFO:tasks.workunit.client.1.vm07.stdout:4/185: fdatasync d0/d4/d10/d18/f1a 0
2026-03-10T12:37:34.076 INFO:tasks.workunit.client.1.vm07.stdout:4/186: read d0/d4/d10/f39 [178879,118808] 0
2026-03-10T12:37:34.077 INFO:tasks.workunit.client.1.vm07.stdout:4/187: dread - d0/d4/d10/f36 zero size
2026-03-10T12:37:34.082 INFO:tasks.workunit.client.1.vm07.stdout:4/188: dwrite d0/d4/d10/d3c/d2b/f3b [0,4194304] 0
2026-03-10T12:37:34.088 INFO:tasks.workunit.client.0.vm00.stdout:0/13: fdatasync f0 0
2026-03-10T12:37:34.095 INFO:tasks.workunit.client.1.vm07.stdout:4/189: link d0/d4/d10/d3c/f22 d0/d4/d10/d18/f3e 0
2026-03-10T12:37:34.097 INFO:tasks.workunit.client.0.vm00.stdout:0/14: mkdir d3 0
2026-03-10T12:37:34.097 INFO:tasks.workunit.client.1.vm07.stdout:5/210: sync
2026-03-10T12:37:34.097 INFO:tasks.workunit.client.1.vm07.stdout:8/205: dwrite d1/f36 [0,4194304] 0
2026-03-10T12:37:34.097 INFO:tasks.workunit.client.1.vm07.stdout:7/181: read d0/f29 [860423,128885] 0
2026-03-10T12:37:34.098 INFO:tasks.workunit.client.1.vm07.stdout:7/182: dread - d0/f23 zero size
2026-03-10T12:37:34.110 INFO:tasks.workunit.client.0.vm00.stdout:0/15: creat d3/f4 x:0 0 0
2026-03-10T12:37:34.111 INFO:tasks.workunit.client.1.vm07.stdout:7/183: chown d0/f20 28 1
2026-03-10T12:37:34.111 INFO:tasks.workunit.client.1.vm07.stdout:9/125: write d5/fe [657009,118949] 0
2026-03-10T12:37:34.111 INFO:tasks.workunit.client.1.vm07.stdout:9/126: stat l3 0
2026-03-10T12:37:34.111 INFO:tasks.workunit.client.1.vm07.stdout:7/184: chown d0/f10 0 1
2026-03-10T12:37:34.111 INFO:tasks.workunit.client.1.vm07.stdout:9/127: dread d5/d13/d22/f25 [0,4194304] 0
2026-03-10T12:37:34.111 INFO:tasks.workunit.client.1.vm07.stdout:3/200: write dc/dd/f20 [341046,101118] 0
2026-03-10T12:37:34.113 INFO:tasks.workunit.client.1.vm07.stdout:3/201: dread dc/dd/f29 [0,4194304] 0
2026-03-10T12:37:34.114 INFO:tasks.workunit.client.1.vm07.stdout:1/205: rmdir d9/df/d29/d2b/d3d 39
2026-03-10T12:37:34.121 INFO:tasks.workunit.client.1.vm07.stdout:6/160: dwrite d1/f17 [0,4194304] 0
2026-03-10T12:37:34.130 INFO:tasks.workunit.client.1.vm07.stdout:7/185: write d0/f1f [928277,106861] 0
2026-03-10T12:37:34.130 INFO:tasks.workunit.client.1.vm07.stdout:3/202: symlink dc/d18/d2d/l4b 0
2026-03-10T12:37:34.130 INFO:tasks.workunit.client.1.vm07.stdout:1/206: symlink d9/df/d29/d2b/d31/l45 0
2026-03-10T12:37:34.131 INFO:tasks.workunit.client.1.vm07.stdout:4/190: rename d0/l1c to d0/d4/d10/d3c/d2b/d2d/l3f 0
2026-03-10T12:37:34.132 INFO:tasks.workunit.client.1.vm07.stdout:8/206: symlink d1/l45 0
2026-03-10T12:37:34.133 INFO:tasks.workunit.client.1.vm07.stdout:8/207: readlink d1/d3/d11/l39 0
2026-03-10T12:37:34.134 INFO:tasks.workunit.client.1.vm07.stdout:9/128: sync
2026-03-10T12:37:34.136 INFO:tasks.workunit.client.0.vm00.stdout:6/37: fsync d2/f9 0
2026-03-10T12:37:34.141 INFO:tasks.workunit.client.1.vm07.stdout:4/191: write d0/d4/d5/da/f15 [13471635,39426] 0
2026-03-10T12:37:34.141 INFO:tasks.workunit.client.0.vm00.stdout:6/38: chown d2/c6 560026517 1
2026-03-10T12:37:34.142 INFO:tasks.workunit.client.1.vm07.stdout:4/192: read - d0/d4/d10/d23/f2e zero size
2026-03-10T12:37:34.142 INFO:tasks.workunit.client.1.vm07.stdout:8/208: truncate d1/f3d 1511910 0
2026-03-10T12:37:34.143 INFO:tasks.workunit.client.1.vm07.stdout:8/209: chown d1/d3/d11/f35 103 1
2026-03-10T12:37:34.144 INFO:tasks.workunit.client.1.vm07.stdout:8/210: write d1/d3/d11/f43 [872205,24935] 0
2026-03-10T12:37:34.145 INFO:tasks.workunit.client.1.vm07.stdout:8/211: readlink d1/d3/d6/l17 0
2026-03-10T12:37:34.145 INFO:tasks.workunit.client.1.vm07.stdout:8/212: chown d1/d3/d6/f24 33108611 1
2026-03-10T12:37:34.147 INFO:tasks.workunit.client.1.vm07.stdout:8/213: write d1/d3/f1d [469766,27217] 0
2026-03-10T12:37:34.148 INFO:tasks.workunit.client.1.vm07.stdout:8/214: chown d1 230 1
2026-03-10T12:37:34.149 INFO:tasks.workunit.client.1.vm07.stdout:8/215: read d1/d3/f2d [2924470,23306] 0
2026-03-10T12:37:34.158 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.155+0000 7fcefa600700 1 -- 192.168.123.100:0/1360128456 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fcee0000bf0 con 0x7fcedc06c4d0
2026-03-10T12:37:34.161 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.161+0000 7fcef17fa700 1 -- 192.168.123.100:0/1360128456 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7fcee0000bf0 con 0x7fcedc06c4d0
2026-03-10T12:37:34.164 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.164+0000 7fcefa600700 1 -- 192.168.123.100:0/1360128456 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcedc06c4d0 msgr2=0x7fcedc06e980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:37:34.164 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.164+0000 7fcefa600700 1 --2- 192.168.123.100:0/1360128456 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcedc06c4d0 0x7fcedc06e980 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7fcee4005fd0 tx=0x7fcee4005ee0 comp rx=0 tx=0).stop
2026-03-10T12:37:34.164 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.164+0000 7fcefa600700 1 -- 192.168.123.100:0/1360128456 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcef410c080 msgr2=0x7fcef41a0fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:37:34.164 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.164+0000 7fcefa600700 1 --2- 192.168.123.100:0/1360128456 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcef410c080 0x7fcef41a0fa0 secure :-1 s=READY pgs=327 cs=0 l=1 rev1=1 crypto rx=0x7fcee80052f0 tx=0x7fcee8003730 comp rx=0 tx=0).stop
2026-03-10T12:37:34.165 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.164+0000 7fcefa600700 1 -- 192.168.123.100:0/1360128456 shutdown_connections
2026-03-10T12:37:34.165 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.164+0000 7fcefa600700 1 --2- 192.168.123.100:0/1360128456 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcedc06c4d0 0x7fcedc06e980 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:37:34.165 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.164+0000 7fcefa600700 1 --2- 192.168.123.100:0/1360128456 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcef410b150 0x7fcef41a0a60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:37:34.165 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.164+0000 7fcefa600700 1 --2- 192.168.123.100:0/1360128456 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcef410c080 0x7fcef41a0fa0 unknown :-1 s=CLOSED pgs=327 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:37:34.165 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.164+0000 7fcefa600700 1 -- 192.168.123.100:0/1360128456 >> 192.168.123.100:0/1360128456 conn(0x7fcef406ac40 msgr2=0x7fcef410f340 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:37:34.165 INFO:tasks.workunit.client.1.vm07.stdout:7/186: chown d0/fc 2350 1
2026-03-10T12:37:34.165 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.164+0000 7fcefa600700 1 -- 192.168.123.100:0/1360128456 shutdown_connections
2026-03-10T12:37:34.165 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.165+0000 7fcefa600700 1 -- 192.168.123.100:0/1360128456 wait complete.
2026-03-10T12:37:34.165 INFO:tasks.workunit.client.1.vm07.stdout:3/203: creat dc/dd/d28/d3b/f4c x:0 0 0
2026-03-10T12:37:34.169 INFO:tasks.workunit.client.1.vm07.stdout:1/207: rename d9/df/d29/d2b/c3a to d9/df/d29/d2b/d3d/c46 0
2026-03-10T12:37:34.169 INFO:tasks.workunit.client.1.vm07.stdout:9/129: mkdir d5/d16/d23/d26 0
2026-03-10T12:37:34.174 INFO:tasks.workunit.client.1.vm07.stdout:8/216: creat d1/d3/d11/f46 x:0 0 0
2026-03-10T12:37:34.174 INFO:tasks.workunit.client.1.vm07.stdout:5/211: getdents d0/d22/d18/d19/d21 0
2026-03-10T12:37:34.176 INFO:teuthology.orchestra.run.vm00.stdout:true
2026-03-10T12:37:34.181 INFO:tasks.workunit.client.1.vm07.stdout:4/193: link d0/d4/d10/f36 d0/d4/d10/d23/f40 0
2026-03-10T12:37:34.185 INFO:tasks.workunit.client.1.vm07.stdout:9/130: creat d5/d13/f27 x:0 0 0
2026-03-10T12:37:34.186 INFO:tasks.workunit.client.1.vm07.stdout:9/131: dread - d5/d13/f27 zero size
2026-03-10T12:37:34.187 INFO:tasks.workunit.client.1.vm07.stdout:9/132: write d5/d13/f14 [4479928,113430] 0
2026-03-10T12:37:34.188 INFO:tasks.workunit.client.1.vm07.stdout:6/161: getdents d1/d4/d6/d16 0
2026-03-10T12:37:34.193 INFO:tasks.workunit.client.1.vm07.stdout:1/208: read d9/fd [161482,15721] 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.1.vm07.stdout:3/204: link dc/d18/d24/f3f dc/dd/d28/d3b/f4d 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.1.vm07.stdout:8/217: readlink d1/d3/d18/l31 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.1.vm07.stdout:3/205: fsync dc/dd/d1f/f2f 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.1.vm07.stdout:8/218: write d1/d3/d6/f24 [181122,29782] 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.1.vm07.stdout:5/212: creat d0/d22/d18/f4c x:0 0 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.1.vm07.stdout:4/194: rename d0/d4/d10/d18/c21 to d0/c41 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.1.vm07.stdout:4/195: dwrite d0/d4/d5/d34/f37 [0,4194304] 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.0.vm00.stdout:2/11: read f3 [82261,103825] 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.0.vm00.stdout:2/12: mknod d4/c5 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.0.vm00.stdout:2/13: mkdir d4/d6 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.0.vm00.stdout:2/14: write f1 [3011284,35578] 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.0.vm00.stdout:2/15: dwrite f3 [0,4194304] 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.0.vm00.stdout:2/16: symlink d4/d6/l7 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.0.vm00.stdout:2/17: chown d4/d6 122 1
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.0.vm00.stdout:2/18: dwrite f2 [0,4194304] 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.0.vm00.stdout:2/19: mknod d4/c8 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.0.vm00.stdout:2/20: rename l0 to d4/l9 0
2026-03-10T12:37:34.224 INFO:tasks.workunit.client.0.vm00.stdout:2/21: dread f3 [0,4194304] 0
2026-03-10T12:37:34.229 INFO:tasks.workunit.client.0.vm00.stdout:2/22: mknod d4/d6/ca 0
2026-03-10T12:37:34.231 INFO:tasks.workunit.client.1.vm07.stdout:6/162: dwrite d1/d4/f2b [0,4194304] 0
2026-03-10T12:37:34.232 INFO:tasks.workunit.client.1.vm07.stdout:6/163: write d1/d4/d6/d16/d1a/d33/f37 [220441,11980] 0
2026-03-10T12:37:34.232 INFO:tasks.workunit.client.1.vm07.stdout:6/164: dread - d1/d4/d6/f2a zero size
2026-03-10T12:37:34.233 INFO:tasks.workunit.client.0.vm00.stdout:9/21: fsync d0/f4 0
2026-03-10T12:37:34.233 INFO:tasks.workunit.client.0.vm00.stdout:9/22: fdatasync d0/f4 0
2026-03-10T12:37:34.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:34 vm00.local ceph-mon[50686]: pgmap v152: 65 pgs: 65 active+clean; 426 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 388 KiB/s rd, 35 MiB/s wr, 355 op/s
2026-03-10T12:37:34.237 INFO:tasks.workunit.client.0.vm00.stdout:9/23: dwrite d0/d5/f3 [8388608,4194304] 0
2026-03-10T12:37:34.239 INFO:tasks.workunit.client.0.vm00.stdout:7/20: rename f5 to f6 0
2026-03-10T12:37:34.240 INFO:tasks.workunit.client.0.vm00.stdout:9/24: symlink d0/l8 0
2026-03-10T12:37:34.241 INFO:tasks.workunit.client.0.vm00.stdout:9/25: rename d0 to d0/d9 22
2026-03-10T12:37:34.242 INFO:tasks.workunit.client.0.vm00.stdout:9/26: symlink d0/la 0
2026-03-10T12:37:34.243 INFO:tasks.workunit.client.0.vm00.stdout:9/27: read d0/d5/f3 [6670687,37525] 0
2026-03-10T12:37:34.243 INFO:tasks.workunit.client.0.vm00.stdout:9/28: write d0/d5/f3 [6384611,12762] 0
2026-03-10T12:37:34.245 INFO:tasks.workunit.client.0.vm00.stdout:9/29: dread d0/d5/f3 [8388608,4194304] 0
2026-03-10T12:37:34.246 INFO:tasks.workunit.client.0.vm00.stdout:7/21: link f0 f7 0
2026-03-10T12:37:34.246 INFO:tasks.workunit.client.0.vm00.stdout:7/22: rmdir - no directory
2026-03-10T12:37:34.247 INFO:tasks.workunit.client.0.vm00.stdout:7/23: chown f1 4945 1
2026-03-10T12:37:34.247 INFO:tasks.workunit.client.0.vm00.stdout:4/35: fsync f3 0
2026-03-10T12:37:34.252 INFO:tasks.workunit.client.0.vm00.stdout:7/24: creat f8 x:0 0 0
2026-03-10T12:37:34.270 INFO:tasks.workunit.client.1.vm07.stdout:9/133: rename d5/d13/f27 to d5/d16/d23/f28 0
2026-03-10T12:37:34.270 INFO:tasks.workunit.client.1.vm07.stdout:1/209: unlink d9/df/l17 0
2026-03-10T12:37:34.270 INFO:tasks.workunit.client.1.vm07.stdout:1/210: stat d9/df/d29/d2b/d3d 0
2026-03-10T12:37:34.270 INFO:tasks.workunit.client.1.vm07.stdout:3/206: truncate dc/dd/f16 4936188 0
2026-03-10T12:37:34.270 INFO:tasks.workunit.client.1.vm07.stdout:8/219: rename d1/d3/d6/f42 to d1/d3/d11/f47 0
2026-03-10T12:37:34.270 INFO:tasks.workunit.client.1.vm07.stdout:0/242: read - d0/d14/d1a/d1b/d3b/f46 zero size
2026-03-10T12:37:34.270 INFO:tasks.workunit.client.1.vm07.stdout:5/213: rename d0/d22/d18/c1d to d0/d22/d18/d19/c4d 0
2026-03-10T12:37:34.270 INFO:tasks.workunit.client.1.vm07.stdout:5/214: dwrite d0/f9 [0,4194304] 0
2026-03-10T12:37:34.270 INFO:tasks.workunit.client.0.vm00.stdout:4/36: dwrite fa [0,4194304] 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:4/37: write f8 [1323232,58113] 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:7/25: creat f9 x:0 0 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:4/38: symlink le 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:7/26: fdatasync f7 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:9/30: getdents d0/d5 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:4/39: dread f9 [0,4194304] 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:7/27: mkdir da 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:7/28: stat da 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:4/40: mkdir df 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:4/41: getdents df 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:7/29: link f9 da/fb 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:4/42: rename c2 to df/c10 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:4/43: dread f3 [0,4194304] 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:7/30: creat da/fc x:0 0 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:4/44: dread f9 [4194304,4194304] 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:7/31: unlink da/fc 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:7/32: write f7 [1224756,17188] 0
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:7/33: readlink - no filename
2026-03-10T12:37:34.271 INFO:tasks.workunit.client.0.vm00.stdout:6/39: sync
2026-03-10T12:37:34.274 INFO:tasks.workunit.client.1.vm07.stdout:0/243: mkdir d0/d14/d1a/d1b/d41/d4e 0
2026-03-10T12:37:34.275 INFO:tasks.workunit.client.1.vm07.stdout:0/244: write d0/d14/d1a/f3d [4667610,18878] 0
2026-03-10T12:37:34.279 INFO:tasks.workunit.client.1.vm07.stdout:9/134: rename d5/d13/c24 to d5/c29 0
2026-03-10T12:37:34.279 INFO:tasks.workunit.client.1.vm07.stdout:5/215: symlink d0/d22/d18/l4e 0
2026-03-10T12:37:34.279 INFO:tasks.workunit.client.1.vm07.stdout:0/245: mkdir d0/d14/d1a/d2f/d31/d4f 0
2026-03-10T12:37:34.279 INFO:tasks.workunit.client.1.vm07.stdout:4/196: sync
2026-03-10T12:37:34.283 INFO:tasks.workunit.client.1.vm07.stdout:0/246: dwrite d0/d14/d1a/d2f/d31/f4d [0,4194304] 0
2026-03-10T12:37:34.284 INFO:tasks.workunit.client.1.vm07.stdout:0/247: chown d0/d14/d1a/d1b 1892929 1
2026-03-10T12:37:34.284 INFO:tasks.workunit.client.1.vm07.stdout:0/248: readlink d0/d14/d1a/d1b/d41/l42 0
2026-03-10T12:37:34.285 INFO:tasks.workunit.client.0.vm00.stdout:4/45: link f9 df/f11 0
2026-03-10T12:37:34.286 INFO:tasks.workunit.client.0.vm00.stdout:7/34: write f6 [903822,75602] 0
2026-03-10T12:37:34.286 INFO:tasks.workunit.client.0.vm00.stdout:4/46: chown fa 32 1
2026-03-10T12:37:34.286 INFO:tasks.workunit.client.0.vm00.stdout:6/40: chown d2/l3 0 1
2026-03-10T12:37:34.287 INFO:tasks.workunit.client.0.vm00.stdout:6/41: fdatasync d2/f9 0
2026-03-10T12:37:34.289 INFO:tasks.workunit.client.1.vm07.stdout:5/216: rmdir d0/d22/d18/d19/d36 39
2026-03-10T12:37:34.290 INFO:tasks.workunit.client.0.vm00.stdout:6/42: dread d2/f9 [0,4194304] 0
2026-03-10T12:37:34.291 INFO:tasks.workunit.client.1.vm07.stdout:4/197: rmdir d0/d4/d10/d3c/d2b/d2d 39
2026-03-10T12:37:34.291 INFO:tasks.workunit.client.0.vm00.stdout:7/35: creat da/fd x:0 0 0
2026-03-10T12:37:34.292 INFO:tasks.workunit.client.1.vm07.stdout:4/198: readlink d0/d4/d5/l1e 0
2026-03-10T12:37:34.294 INFO:tasks.workunit.client.0.vm00.stdout:6/43: dread d2/f9 [0,4194304] 0
2026-03-10T12:37:34.295 INFO:tasks.workunit.client.0.vm00.stdout:7/36: creat da/fe x:0 0 0
2026-03-10T12:37:34.296 INFO:tasks.workunit.client.0.vm00.stdout:6/44: unlink l1 0
2026-03-10T12:37:34.297 INFO:tasks.workunit.client.0.vm00.stdout:6/45: read d2/f9 [885241,47938] 0
2026-03-10T12:37:34.297 INFO:tasks.workunit.client.0.vm00.stdout:6/46: fdatasync d2/f9 0
2026-03-10T12:37:34.300 INFO:tasks.workunit.client.1.vm07.stdout:5/217: dwrite d0/d22/d18/d19/d36/f3d [0,4194304] 0
2026-03-10T12:37:34.301 INFO:tasks.workunit.client.1.vm07.stdout:5/218: fdatasync d0/d22/d18/d30/f35 0
2026-03-10T12:37:34.301 INFO:tasks.workunit.client.1.vm07.stdout:5/219: readlink d0/l3 0
2026-03-10T12:37:34.302 INFO:tasks.workunit.client.0.vm00.stdout:7/37: dwrite da/fd [0,4194304] 0
2026-03-10T12:37:34.307 INFO:tasks.workunit.client.0.vm00.stdout:6/47: unlink d2/l3 0
2026-03-10T12:37:34.307 INFO:tasks.workunit.client.1.vm07.stdout:0/249: dwrite d0/f1c [8388608,4194304] 0
2026-03-10T12:37:34.307 INFO:tasks.workunit.client.0.vm00.stdout:7/38: write da/fb [580338,581] 0
2026-03-10T12:37:34.311 INFO:tasks.workunit.client.0.vm00.stdout:6/48: mkdir d2/da/dc 0
2026-03-10T12:37:34.313 INFO:tasks.workunit.client.0.vm00.stdout:6/49: dwrite d2/f9 [0,4194304] 0
2026-03-10T12:37:34.319 INFO:tasks.workunit.client.0.vm00.stdout:6/50: creat d2/da/dc/fd x:0 0 0
2026-03-10T12:37:34.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.320+0000 7f989aac4700 1 -- 192.168.123.100:0/1575521790 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9894072360 msgr2=0x7f98940770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:37:34.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.320+0000 7f989aac4700 1 --2- 192.168.123.100:0/1575521790 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9894072360 0x7f98940770e0 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f988c00d3f0 tx=0x7f988c00d700 comp rx=0 tx=0).stop
2026-03-10T12:37:34.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.321+0000 7f989aac4700 1 -- 192.168.123.100:0/1575521790 shutdown_connections
2026-03-10T12:37:34.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.321+0000 7f989aac4700 1 --2- 192.168.123.100:0/1575521790 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9894072360 0x7f98940770e0 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:37:34.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.321+0000 7f989aac4700 1 --2- 192.168.123.100:0/1575521790 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9894071980 0x7f9894071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:37:34.322 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.321+0000 7f989aac4700 1 -- 192.168.123.100:0/1575521790 >> 192.168.123.100:0/1575521790 conn(0x7f989406d1a0 msgr2=0x7f989406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:37:34.322 INFO:tasks.workunit.client.0.vm00.stdout:6/51: link d2/da/dc/fd d2/da/fe 0
2026-03-10T12:37:34.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.321+0000 7f989aac4700 1 -- 192.168.123.100:0/1575521790 shutdown_connections
2026-03-10T12:37:34.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.321+0000 7f989aac4700 1 -- 192.168.123.100:0/1575521790 wait complete.
2026-03-10T12:37:34.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.322+0000 7f989aac4700 1 Processor -- start 2026-03-10T12:37:34.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.322+0000 7f989aac4700 1 -- start start 2026-03-10T12:37:34.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.322+0000 7f989aac4700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9894071980 0x7f9894131380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:34.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.322+0000 7f989aac4700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f98941318c0 0x7f989407f550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:34.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.322+0000 7f989aac4700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9894131dc0 con 0x7f98941318c0 2026-03-10T12:37:34.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.322+0000 7f989aac4700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9894131f30 con 0x7f9894071980 2026-03-10T12:37:34.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.322+0000 7f9893fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f98941318c0 0x7f989407f550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:34.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.322+0000 7f9893fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f98941318c0 0x7f989407f550 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:59222/0 (socket says 192.168.123.100:59222) 2026-03-10T12:37:34.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.322+0000 7f9893fff700 1 -- 192.168.123.100:0/2629837578 learned_addr learned my addr 192.168.123.100:0/2629837578 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:37:34.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.323+0000 7f9898860700 1 --2- 192.168.123.100:0/2629837578 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9894071980 0x7f9894131380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:34.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.323+0000 7f9893fff700 1 -- 192.168.123.100:0/2629837578 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9894071980 msgr2=0x7f9894131380 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:34.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.323+0000 7f9893fff700 1 --2- 192.168.123.100:0/2629837578 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9894071980 0x7f9894131380 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:34.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.323+0000 7f9893fff700 1 -- 192.168.123.100:0/2629837578 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f988c007ed0 con 0x7f98941318c0 2026-03-10T12:37:34.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.325+0000 7f9893fff700 1 --2- 192.168.123.100:0/2629837578 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f98941318c0 0x7f989407f550 secure :-1 s=READY pgs=328 cs=0 l=1 rev1=1 crypto rx=0x7f988c003c30 tx=0x7f988c003d10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:37:34.326 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.325+0000 7f9891ffb700 1 -- 192.168.123.100:0/2629837578 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f988c01c070 con 0x7f98941318c0 2026-03-10T12:37:34.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.325+0000 7f989aac4700 1 -- 192.168.123.100:0/2629837578 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f989407fa90 con 0x7f98941318c0 2026-03-10T12:37:34.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.326+0000 7f989aac4700 1 -- 192.168.123.100:0/2629837578 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f989407ff50 con 0x7f98941318c0 2026-03-10T12:37:34.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.327+0000 7f9891ffb700 1 -- 192.168.123.100:0/2629837578 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f988c00fcf0 con 0x7f98941318c0 2026-03-10T12:37:34.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.327+0000 7f9891ffb700 1 -- 192.168.123.100:0/2629837578 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f988c017dd0 con 0x7f98941318c0 2026-03-10T12:37:34.329 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.328+0000 7f9891ffb700 1 -- 192.168.123.100:0/2629837578 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f988c02a430 con 0x7f98941318c0 2026-03-10T12:37:34.329 INFO:tasks.workunit.client.1.vm07.stdout:9/135: rename d5/fe to d5/f2a 0 2026-03-10T12:37:34.330 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.329+0000 7f9891ffb700 1 --2- 192.168.123.100:0/2629837578 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f987c06c7a0 0x7f987c06ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 
comp rx=0 tx=0).connect 2026-03-10T12:37:34.330 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.329+0000 7f9891ffb700 1 -- 192.168.123.100:0/2629837578 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f988c013070 con 0x7f98941318c0 2026-03-10T12:37:34.330 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.330+0000 7f989aac4700 1 -- 192.168.123.100:0/2629837578 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9880005320 con 0x7f98941318c0 2026-03-10T12:37:34.331 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.331+0000 7f9898860700 1 --2- 192.168.123.100:0/2629837578 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f987c06c7a0 0x7f987c06ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:34.332 INFO:tasks.workunit.client.1.vm07.stdout:0/250: dread d0/f15 [0,4194304] 0 2026-03-10T12:37:34.332 INFO:tasks.workunit.client.1.vm07.stdout:9/136: creat d5/d13/f2b x:0 0 0 2026-03-10T12:37:34.332 INFO:tasks.workunit.client.0.vm00.stdout:6/52: dwrite d2/da/fe [0,4194304] 0 2026-03-10T12:37:34.333 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.333+0000 7f9898860700 1 --2- 192.168.123.100:0/2629837578 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f987c06c7a0 0x7f987c06ec50 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7f9884009990 tx=0x7f9884008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:34.333 INFO:tasks.workunit.client.0.vm00.stdout:6/53: stat d2/l7 0 2026-03-10T12:37:34.333 INFO:tasks.workunit.client.1.vm07.stdout:9/137: dread - d5/d16/d23/f28 zero size 2026-03-10T12:37:34.341 INFO:tasks.workunit.client.0.vm00.stdout:6/54: creat d2/da/dc/ff x:0 0 0 2026-03-10T12:37:34.341 
INFO:tasks.workunit.client.1.vm07.stdout:9/138: dwrite d5/fb [8388608,4194304] 0 2026-03-10T12:37:34.341 INFO:tasks.workunit.client.1.vm07.stdout:4/199: read d0/d4/d10/f39 [319940,39118] 0 2026-03-10T12:37:34.343 INFO:tasks.workunit.client.1.vm07.stdout:0/251: rename d0/d14/d1a/d2f/d31/l3a to d0/d14/d1a/d2f/d31/d4f/l50 0 2026-03-10T12:37:34.343 INFO:tasks.workunit.client.1.vm07.stdout:4/200: fdatasync d0/d19/f25 0 2026-03-10T12:37:34.344 INFO:tasks.workunit.client.1.vm07.stdout:9/139: read d5/d16/d18/f1e [3678347,54008] 0 2026-03-10T12:37:34.346 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.346+0000 7f9891ffb700 1 -- 192.168.123.100:0/2629837578 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f988c00fe60 con 0x7f98941318c0 2026-03-10T12:37:34.351 INFO:tasks.workunit.client.1.vm07.stdout:0/252: mknod d0/d14/d1a/d2f/d31/c51 0 2026-03-10T12:37:34.351 INFO:tasks.workunit.client.1.vm07.stdout:9/140: rmdir d5 39 2026-03-10T12:37:34.351 INFO:tasks.workunit.client.1.vm07.stdout:9/141: readlink l3 0 2026-03-10T12:37:34.351 INFO:tasks.workunit.client.1.vm07.stdout:0/253: chown d0/d14/d1a/d1b/d41/d4e 173 1 2026-03-10T12:37:34.359 INFO:tasks.workunit.client.1.vm07.stdout:0/254: dwrite d0/d14/f37 [0,4194304] 0 2026-03-10T12:37:34.363 INFO:tasks.workunit.client.1.vm07.stdout:9/142: mkdir d5/d13/d2c 0 2026-03-10T12:37:34.363 INFO:tasks.workunit.client.1.vm07.stdout:0/255: write d0/d14/d1a/f2c [557899,68584] 0 2026-03-10T12:37:34.365 INFO:tasks.workunit.client.1.vm07.stdout:9/143: mknod d5/d13/d22/c2d 0 2026-03-10T12:37:34.367 INFO:tasks.workunit.client.1.vm07.stdout:9/144: truncate d5/d16/d18/f20 154893 0 2026-03-10T12:37:34.371 INFO:tasks.workunit.client.1.vm07.stdout:9/145: chown d5/d16/d23/d26 1 1 2026-03-10T12:37:34.371 INFO:tasks.workunit.client.1.vm07.stdout:9/146: rename d5/ld to d5/d1f/l2e 0 2026-03-10T12:37:34.374 INFO:tasks.workunit.client.1.vm07.stdout:0/256: 
dwrite d0/d14/d1a/f27 [0,4194304] 0 2026-03-10T12:37:34.374 INFO:tasks.workunit.client.1.vm07.stdout:4/201: dread d0/d4/d10/d23/f27 [0,4194304] 0 2026-03-10T12:37:34.374 INFO:tasks.workunit.client.1.vm07.stdout:9/147: unlink d5/d1f/f21 0 2026-03-10T12:37:34.377 INFO:tasks.workunit.client.1.vm07.stdout:9/148: mkdir d5/d13/d2c/d2f 0 2026-03-10T12:37:34.379 INFO:tasks.workunit.client.1.vm07.stdout:9/149: write d5/f1a [354941,116696] 0 2026-03-10T12:37:34.390 INFO:tasks.workunit.client.1.vm07.stdout:9/150: fsync d5/d13/d22/f25 0 2026-03-10T12:37:34.391 INFO:tasks.workunit.client.1.vm07.stdout:9/151: write d5/d13/f14 [635249,83115] 0 2026-03-10T12:37:34.392 INFO:tasks.workunit.client.1.vm07.stdout:4/202: dwrite d0/d4/d10/d18/f1a [0,4194304] 0 2026-03-10T12:37:34.394 INFO:tasks.workunit.client.1.vm07.stdout:9/152: write d5/fb [10698145,3814] 0 2026-03-10T12:37:34.395 INFO:tasks.workunit.client.1.vm07.stdout:4/203: truncate d0/d4/d10/d23/f40 197207 0 2026-03-10T12:37:34.395 INFO:tasks.workunit.client.1.vm07.stdout:7/187: fdatasync d0/f28 0 2026-03-10T12:37:34.399 INFO:tasks.workunit.client.1.vm07.stdout:4/204: symlink d0/d4/d10/d18/l42 0 2026-03-10T12:37:34.399 INFO:tasks.workunit.client.1.vm07.stdout:9/153: link d5/f2a d5/d13/d2c/f30 0 2026-03-10T12:37:34.402 INFO:tasks.workunit.client.1.vm07.stdout:4/205: link d0/d4/d10/f36 d0/d4/d5/f43 0 2026-03-10T12:37:34.403 INFO:tasks.workunit.client.1.vm07.stdout:9/154: mkdir d5/d1f/d31 0 2026-03-10T12:37:34.405 INFO:tasks.workunit.client.1.vm07.stdout:4/206: creat d0/d4/d5/da/f44 x:0 0 0 2026-03-10T12:37:34.406 INFO:tasks.workunit.client.1.vm07.stdout:9/155: rmdir d5 39 2026-03-10T12:37:34.410 INFO:tasks.workunit.client.1.vm07.stdout:9/156: write d5/d16/d18/f1e [3453518,126222] 0 2026-03-10T12:37:34.418 INFO:tasks.workunit.client.1.vm07.stdout:9/157: creat d5/d13/d22/f32 x:0 0 0 2026-03-10T12:37:34.459 INFO:tasks.workunit.client.0.vm00.stdout:3/38: truncate fc 903136 0 2026-03-10T12:37:34.462 
INFO:tasks.workunit.client.0.vm00.stdout:3/39: dwrite f9 [0,4194304] 0 2026-03-10T12:37:34.467 INFO:tasks.workunit.client.1.vm07.stdout:4/207: sync 2026-03-10T12:37:34.488 INFO:tasks.workunit.client.1.vm07.stdout:1/211: dread d9/fb [0,4194304] 0 2026-03-10T12:37:34.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.489+0000 7f989aac4700 1 -- 192.168.123.100:0/2629837578 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9880000bf0 con 0x7f987c06c7a0 2026-03-10T12:37:34.491 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.491+0000 7f9891ffb700 1 -- 192.168.123.100:0/2629837578 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7f9880000bf0 con 0x7f987c06c7a0 2026-03-10T12:37:34.494 INFO:tasks.workunit.client.1.vm07.stdout:5/220: dread d0/d22/d18/d30/f35 [0,4194304] 0 2026-03-10T12:37:34.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.494+0000 7f987b7fe700 1 -- 192.168.123.100:0/2629837578 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f987c06c7a0 msgr2=0x7f987c06ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:34.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.494+0000 7f987b7fe700 1 --2- 192.168.123.100:0/2629837578 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f987c06c7a0 0x7f987c06ec50 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7f9884009990 tx=0x7f9884008040 comp rx=0 tx=0).stop 2026-03-10T12:37:34.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.494+0000 7f987b7fe700 1 -- 192.168.123.100:0/2629837578 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f98941318c0 msgr2=0x7f989407f550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:34.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.494+0000 
7f987b7fe700 1 --2- 192.168.123.100:0/2629837578 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f98941318c0 0x7f989407f550 secure :-1 s=READY pgs=328 cs=0 l=1 rev1=1 crypto rx=0x7f988c003c30 tx=0x7f988c003d10 comp rx=0 tx=0).stop 2026-03-10T12:37:34.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.494+0000 7f987b7fe700 1 -- 192.168.123.100:0/2629837578 shutdown_connections 2026-03-10T12:37:34.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.494+0000 7f987b7fe700 1 --2- 192.168.123.100:0/2629837578 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f987c06c7a0 0x7f987c06ec50 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:34.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.494+0000 7f987b7fe700 1 --2- 192.168.123.100:0/2629837578 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9894071980 0x7f9894131380 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:34.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.494+0000 7f987b7fe700 1 --2- 192.168.123.100:0/2629837578 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f98941318c0 0x7f989407f550 unknown :-1 s=CLOSED pgs=328 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:34.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.494+0000 7f987b7fe700 1 -- 192.168.123.100:0/2629837578 >> 192.168.123.100:0/2629837578 conn(0x7f989406d1a0 msgr2=0x7f9894076490 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:34.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.494+0000 7f987b7fe700 1 -- 192.168.123.100:0/2629837578 shutdown_connections 2026-03-10T12:37:34.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.494+0000 7f987b7fe700 1 -- 192.168.123.100:0/2629837578 wait complete. 
2026-03-10T12:37:34.495 INFO:tasks.workunit.client.1.vm07.stdout:5/221: creat d0/d22/d18/d19/d21/d3a/f4f x:0 0 0 2026-03-10T12:37:34.496 INFO:tasks.workunit.client.1.vm07.stdout:5/222: write d0/d22/d18/d19/f23 [4077175,34360] 0 2026-03-10T12:37:34.499 INFO:tasks.workunit.client.1.vm07.stdout:5/223: rename d0/f2b to d0/d22/f50 0 2026-03-10T12:37:34.507 INFO:tasks.workunit.client.0.vm00.stdout:0/16: rmdir d3 39 2026-03-10T12:37:34.507 INFO:tasks.workunit.client.0.vm00.stdout:0/17: write f2 [819076,60966] 0 2026-03-10T12:37:34.507 INFO:tasks.workunit.client.1.vm07.stdout:5/224: symlink d0/d22/d18/d19/d36/l51 0 2026-03-10T12:37:34.507 INFO:tasks.workunit.client.1.vm07.stdout:5/225: creat d0/d22/d18/d19/d2e/f52 x:0 0 0 2026-03-10T12:37:34.507 INFO:tasks.workunit.client.1.vm07.stdout:5/226: write d0/d22/d18/d19/d2e/f52 [557910,130217] 0 2026-03-10T12:37:34.517 INFO:tasks.workunit.client.1.vm07.stdout:2/141: write d0/d19/d26/f2e [1763785,90133] 0 2026-03-10T12:37:34.517 INFO:tasks.workunit.client.1.vm07.stdout:2/142: chown d0/d19/d26 2474 1 2026-03-10T12:37:34.520 INFO:tasks.workunit.client.1.vm07.stdout:3/207: truncate dc/dd/f29 4303383 0 2026-03-10T12:37:34.523 INFO:tasks.workunit.client.1.vm07.stdout:3/208: symlink dc/dd/l4e 0 2026-03-10T12:37:34.524 INFO:tasks.workunit.client.1.vm07.stdout:3/209: rmdir dc/dd/d1f 39 2026-03-10T12:37:34.525 INFO:tasks.workunit.client.1.vm07.stdout:3/210: mknod dc/d18/c4f 0 2026-03-10T12:37:34.528 INFO:tasks.workunit.client.1.vm07.stdout:3/211: creat dc/dd/d1f/d45/f50 x:0 0 0 2026-03-10T12:37:34.529 INFO:tasks.workunit.client.1.vm07.stdout:5/227: sync 2026-03-10T12:37:34.530 INFO:tasks.workunit.client.1.vm07.stdout:3/212: creat dc/dd/d28/d3b/f51 x:0 0 0 2026-03-10T12:37:34.530 INFO:tasks.workunit.client.1.vm07.stdout:3/213: dread - dc/dd/d1f/f30 zero size 2026-03-10T12:37:34.530 INFO:tasks.workunit.client.1.vm07.stdout:2/143: sync 2026-03-10T12:37:34.531 INFO:tasks.workunit.client.1.vm07.stdout:2/144: write d0/d19/d26/f2e 
[1728306,124370] 0 2026-03-10T12:37:34.534 INFO:tasks.workunit.client.1.vm07.stdout:5/228: mkdir d0/d22/d18/d3e/d53 0 2026-03-10T12:37:34.538 INFO:tasks.workunit.client.1.vm07.stdout:2/145: symlink d0/d29/l34 0 2026-03-10T12:37:34.539 INFO:tasks.workunit.client.1.vm07.stdout:5/229: write d0/f1f [387561,48101] 0 2026-03-10T12:37:34.540 INFO:tasks.workunit.client.1.vm07.stdout:5/230: stat d0/d22/d18/d19/d21/f42 0 2026-03-10T12:37:34.542 INFO:tasks.workunit.client.1.vm07.stdout:2/146: rmdir d0/d19/d1f 39 2026-03-10T12:37:34.543 INFO:tasks.workunit.client.1.vm07.stdout:3/214: link dc/dd/d1f/l2a dc/dd/d43/l52 0 2026-03-10T12:37:34.545 INFO:tasks.workunit.client.1.vm07.stdout:5/231: rename d0/d22/d4a to d0/d22/d18/d19/d21/d54 0 2026-03-10T12:37:34.547 INFO:tasks.workunit.client.1.vm07.stdout:5/232: dread d0/f9 [0,4194304] 0 2026-03-10T12:37:34.561 INFO:tasks.workunit.client.1.vm07.stdout:2/147: symlink d0/l35 0 2026-03-10T12:37:34.563 INFO:tasks.workunit.client.1.vm07.stdout:3/215: mknod dc/d18/c53 0 2026-03-10T12:37:34.563 INFO:tasks.workunit.client.1.vm07.stdout:3/216: stat dc/dd/d1f 0 2026-03-10T12:37:34.566 INFO:tasks.workunit.client.1.vm07.stdout:2/148: dwrite d0/f15 [0,4194304] 0 2026-03-10T12:37:34.581 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:34 vm07.local ceph-mon[58582]: pgmap v152: 65 pgs: 65 active+clean; 426 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 388 KiB/s rd, 35 MiB/s wr, 355 op/s 2026-03-10T12:37:34.581 INFO:tasks.workunit.client.1.vm07.stdout:3/217: creat dc/dd/d1f/d45/f54 x:0 0 0 2026-03-10T12:37:34.581 INFO:tasks.workunit.client.1.vm07.stdout:2/149: dwrite d0/d29/f32 [0,4194304] 0 2026-03-10T12:37:34.582 INFO:tasks.workunit.client.1.vm07.stdout:3/218: dwrite dc/d18/f36 [0,4194304] 0 2026-03-10T12:37:34.582 INFO:tasks.workunit.client.1.vm07.stdout:3/219: truncate dc/dd/d1f/d45/f54 708309 0 2026-03-10T12:37:34.582 INFO:tasks.workunit.client.1.vm07.stdout:2/150: chown d0/d19/d1f 6478 1 2026-03-10T12:37:34.582 
INFO:tasks.workunit.client.1.vm07.stdout:3/220: dread - dc/d18/d24/f3f zero size 2026-03-10T12:37:34.588 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.586+0000 7f09c2624700 1 -- 192.168.123.100:0/101768431 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f09bc072330 msgr2=0x7f09bc0770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:34.588 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.586+0000 7f09c2624700 1 --2- 192.168.123.100:0/101768431 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f09bc072330 0x7f09bc0770b0 secure :-1 s=READY pgs=329 cs=0 l=1 rev1=1 crypto rx=0x7f09b400b3a0 tx=0x7f09b400b6b0 comp rx=0 tx=0).stop 2026-03-10T12:37:34.588 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.587+0000 7f09c2624700 1 -- 192.168.123.100:0/101768431 shutdown_connections 2026-03-10T12:37:34.588 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.587+0000 7f09c2624700 1 --2- 192.168.123.100:0/101768431 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f09bc072330 0x7f09bc0770b0 unknown :-1 s=CLOSED pgs=329 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:34.588 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.587+0000 7f09c2624700 1 --2- 192.168.123.100:0/101768431 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09bc071950 0x7f09bc071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:34.588 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.587+0000 7f09c2624700 1 -- 192.168.123.100:0/101768431 >> 192.168.123.100:0/101768431 conn(0x7f09bc06d1a0 msgr2=0x7f09bc06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:34.588 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.587+0000 7f09c2624700 1 -- 192.168.123.100:0/101768431 shutdown_connections 2026-03-10T12:37:34.588 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.587+0000 7f09c2624700 1 -- 192.168.123.100:0/101768431 wait complete. 2026-03-10T12:37:34.588 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.588+0000 7f09c2624700 1 Processor -- start 2026-03-10T12:37:34.588 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.588+0000 7f09c2624700 1 -- start start 2026-03-10T12:37:34.589 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.588+0000 7f09c2624700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09bc071950 0x7f09bc0825b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:34.589 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.588+0000 7f09c2624700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f09bc082af0 0x7f09bc082f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:34.589 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.588+0000 7f09c2624700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f09bc12dd80 con 0x7f09bc082af0 2026-03-10T12:37:34.589 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.588+0000 7f09c2624700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f09bc12def0 con 0x7f09bc071950 2026-03-10T12:37:34.589 INFO:tasks.workunit.client.1.vm07.stdout:2/151: mknod d0/d19/d1f/c36 0 2026-03-10T12:37:34.592 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.589+0000 7f09c1622700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09bc071950 0x7f09bc0825b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:34.592 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.589+0000 7f09c1622700 1 --2- >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09bc071950 0x7f09bc0825b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:50320/0 (socket says 192.168.123.100:50320) 2026-03-10T12:37:34.592 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.589+0000 7f09c1622700 1 -- 192.168.123.100:0/3401243365 learned_addr learned my addr 192.168.123.100:0/3401243365 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:37:34.592 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.589+0000 7f09c1622700 1 -- 192.168.123.100:0/3401243365 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f09bc082af0 msgr2=0x7f09bc082f60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:34.592 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.589+0000 7f09c1622700 1 --2- 192.168.123.100:0/3401243365 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f09bc082af0 0x7f09bc082f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:34.592 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.589+0000 7f09c1622700 1 -- 192.168.123.100:0/3401243365 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f09b400b050 con 0x7f09bc071950 2026-03-10T12:37:34.592 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.590+0000 7f09c1622700 1 --2- 192.168.123.100:0/3401243365 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09bc071950 0x7f09bc0825b0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f09b800b770 tx=0x7f09b800bb30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:34.593 INFO:tasks.workunit.client.1.vm07.stdout:2/152: dwrite d0/f14 [0,4194304] 0 2026-03-10T12:37:34.595 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.594+0000 7f09b27fc700 1 -- 192.168.123.100:0/3401243365 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f09b800f820 con 0x7f09bc071950 2026-03-10T12:37:34.595 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.594+0000 7f09c2624700 1 -- 192.168.123.100:0/3401243365 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f09bc12e1d0 con 0x7f09bc071950 2026-03-10T12:37:34.595 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.595+0000 7f09c2624700 1 -- 192.168.123.100:0/3401243365 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f09bc12e7a0 con 0x7f09bc071950 2026-03-10T12:37:34.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.595+0000 7f09b27fc700 1 -- 192.168.123.100:0/3401243365 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f09b800fe60 con 0x7f09bc071950 2026-03-10T12:37:34.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.595+0000 7f09b27fc700 1 -- 192.168.123.100:0/3401243365 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f09b800d610 con 0x7f09bc071950 2026-03-10T12:37:34.598 INFO:tasks.workunit.client.1.vm07.stdout:3/221: creat dc/d18/d24/f55 x:0 0 0 2026-03-10T12:37:34.600 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.597+0000 7f09b27fc700 1 -- 192.168.123.100:0/3401243365 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f09b8017400 con 0x7f09bc071950 2026-03-10T12:37:34.600 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.598+0000 7f09b27fc700 1 --2- 192.168.123.100:0/3401243365 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f09a806c7a0 0x7f09a806ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T12:37:34.600 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.598+0000 7f09b27fc700 1 -- 192.168.123.100:0/3401243365 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f09b808b650 con 0x7f09bc071950 2026-03-10T12:37:34.600 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.598+0000 7f09c0e21700 1 --2- 192.168.123.100:0/3401243365 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f09a806c7a0 0x7f09a806ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:34.600 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.599+0000 7f09c2624700 1 -- 192.168.123.100:0/3401243365 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f09bc07c910 con 0x7f09bc071950 2026-03-10T12:37:34.601 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.600+0000 7f09c0e21700 1 --2- 192.168.123.100:0/3401243365 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f09a806c7a0 0x7f09a806ec50 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f09b400bb30 tx=0x7f09b400bf90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:34.605 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.603+0000 7f09b27fc700 1 -- 192.168.123.100:0/3401243365 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f09b8055dc0 con 0x7f09bc071950 2026-03-10T12:37:34.620 INFO:tasks.workunit.client.1.vm07.stdout:3/222: creat dc/dd/d1f/d45/f56 x:0 0 0 2026-03-10T12:37:34.620 INFO:tasks.workunit.client.1.vm07.stdout:2/153: rename d0/d19/c24 to d0/c37 0 2026-03-10T12:37:34.620 INFO:tasks.workunit.client.1.vm07.stdout:2/154: unlink d0/c28 0 
2026-03-10T12:37:34.620 INFO:tasks.workunit.client.1.vm07.stdout:2/155: mkdir d0/d19/d26/d38 0 2026-03-10T12:37:34.620 INFO:tasks.workunit.client.1.vm07.stdout:2/156: dread - d0/d19/f2c zero size 2026-03-10T12:37:34.625 INFO:tasks.workunit.client.0.vm00.stdout:3/40: rename dd to dd/d10 22 2026-03-10T12:37:34.625 INFO:tasks.workunit.client.0.vm00.stdout:3/41: write fb [82389,129658] 0 2026-03-10T12:37:34.635 INFO:tasks.workunit.client.0.vm00.stdout:1/20: fsync f4 0 2026-03-10T12:37:34.645 INFO:tasks.workunit.client.0.vm00.stdout:3/42: mkdir dd/d11 0 2026-03-10T12:37:34.645 INFO:tasks.workunit.client.0.vm00.stdout:3/43: dread - f7 zero size 2026-03-10T12:37:34.646 INFO:tasks.workunit.client.0.vm00.stdout:1/21: mknod c8 0 2026-03-10T12:37:34.648 INFO:tasks.workunit.client.0.vm00.stdout:0/18: symlink d3/l5 0 2026-03-10T12:37:34.651 INFO:tasks.workunit.client.0.vm00.stdout:8/24: link d0/l1 d0/l2 0 2026-03-10T12:37:34.651 INFO:tasks.workunit.client.0.vm00.stdout:8/25: dwrite - no filename 2026-03-10T12:37:34.651 INFO:tasks.workunit.client.0.vm00.stdout:8/26: dread - no filename 2026-03-10T12:37:34.652 INFO:tasks.workunit.client.0.vm00.stdout:0/19: dwrite f0 [4194304,4194304] 0 2026-03-10T12:37:34.668 INFO:tasks.workunit.client.0.vm00.stdout:3/44: dwrite f7 [0,4194304] 0 2026-03-10T12:37:34.673 INFO:tasks.workunit.client.1.vm07.stdout:3/223: dread f1 [0,4194304] 0 2026-03-10T12:37:34.675 INFO:tasks.workunit.client.0.vm00.stdout:8/27: symlink d0/l3 0 2026-03-10T12:37:34.691 INFO:tasks.workunit.client.0.vm00.stdout:0/20: symlink d3/l6 0 2026-03-10T12:37:34.691 INFO:tasks.workunit.client.0.vm00.stdout:8/28: readlink d0/l2 0 2026-03-10T12:37:34.691 INFO:tasks.workunit.client.0.vm00.stdout:1/22: link c8 c9 0 2026-03-10T12:37:34.691 INFO:tasks.workunit.client.0.vm00.stdout:1/23: dread f4 [0,4194304] 0 2026-03-10T12:37:34.691 INFO:tasks.workunit.client.0.vm00.stdout:1/24: truncate f4 4565039 0 2026-03-10T12:37:34.691 INFO:tasks.workunit.client.0.vm00.stdout:1/25: read f4 
[2044700,113286] 0 2026-03-10T12:37:34.701 INFO:tasks.workunit.client.0.vm00.stdout:1/26: mkdir da 0 2026-03-10T12:37:34.701 INFO:tasks.workunit.client.0.vm00.stdout:1/27: write f3 [153202,22406] 0 2026-03-10T12:37:34.705 INFO:tasks.workunit.client.0.vm00.stdout:1/28: dread f4 [0,4194304] 0 2026-03-10T12:37:34.705 INFO:tasks.workunit.client.0.vm00.stdout:1/29: read f5 [564933,26946] 0 2026-03-10T12:37:34.711 INFO:tasks.workunit.client.0.vm00.stdout:1/30: dwrite f5 [0,4194304] 0 2026-03-10T12:37:34.730 INFO:tasks.workunit.client.0.vm00.stdout:2/23: truncate f3 29920 0 2026-03-10T12:37:34.735 INFO:tasks.workunit.client.0.vm00.stdout:2/24: creat d4/d6/fb x:0 0 0 2026-03-10T12:37:34.736 INFO:tasks.workunit.client.0.vm00.stdout:1/31: symlink da/lb 0 2026-03-10T12:37:34.742 INFO:tasks.workunit.client.0.vm00.stdout:1/32: creat da/fc x:0 0 0 2026-03-10T12:37:34.742 INFO:tasks.workunit.client.0.vm00.stdout:1/33: chown f3 241 1 2026-03-10T12:37:34.742 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.739+0000 7f09c2624700 1 -- 192.168.123.100:0/3401243365 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f09bc02ced0 con 0x7f09a806c7a0 2026-03-10T12:37:34.744 INFO:tasks.workunit.client.0.vm00.stdout:1/34: symlink da/ld 0 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (3m) 2m ago 4m 22.8M - 0.25.0 c8568f914cd2 1897443e8fdf 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (4m) 2m ago 4m 8074k - 18.2.0 dc2bc1663786 d9c35bbdf4cd 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (4m) 2m ago 4m 8208k - 18.2.0 dc2bc1663786 2a98961ae9ca 2026-03-10T12:37:34.747 
INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (4m) 2m ago 4m 7407k - 18.2.0 dc2bc1663786 4726e39e7eb0 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (4m) 2m ago 4m 7402k - 18.2.0 dc2bc1663786 f917dac1f418 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (3m) 2m ago 4m 82.0M - 9.4.7 954c08fa6188 16c12a6ce1fc 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (2m) 2m ago 2m 17.2M - 18.2.0 dc2bc1663786 13dfd2469732 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (2m) 2m ago 2m 14.2M - 18.2.0 dc2bc1663786 d65368ac6dfa 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (2m) 2m ago 2m 13.7M - 18.2.0 dc2bc1663786 1b9425223bd2 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (2m) 2m ago 2m 18.6M - 18.2.0 dc2bc1663786 9c2b36fc1ac1 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:9283,8765,8443 running (5m) 2m ago 5m 498M - 18.2.0 dc2bc1663786 8dc0a869be20 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (4m) 2m ago 4m 448M - 18.2.0 dc2bc1663786 1662ba2e507c 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (5m) 2m ago 5m 50.6M 2048M 18.2.0 dc2bc1663786 c8d836b38502 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (3m) 2m ago 3m 44.3M 2048M 18.2.0 dc2bc1663786 7712955135fc 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (4m) 2m ago 4m 14.4M - 1.5.0 0da6a335fe13 7a5c14d6ba46 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (4m) 2m ago 4m 12.2M - 1.5.0 0da6a335fe13 
2fac2415b763 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (3m) 2m ago 3m 45.5M 4096M 18.2.0 dc2bc1663786 d5b05007694d 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (3m) 2m ago 3m 45.9M 4096M 18.2.0 dc2bc1663786 5bc971fe4d49 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (3m) 2m ago 3m 46.7M 4096M 18.2.0 dc2bc1663786 a5a89ccf847e 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (3m) 2m ago 3m 44.8M 4096M 18.2.0 dc2bc1663786 0c6249fe3951 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (2m) 2m ago 2m 43.9M 4096M 18.2.0 dc2bc1663786 5c66bfb63a83 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (2m) 2m ago 2m 42.7M 4096M 18.2.0 dc2bc1663786 277bdfe4ec55 2026-03-10T12:37:34.747 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (3m) 2m ago 4m 39.1M - 2.43.0 a07b618ecd1d 5d567c813f4b 2026-03-10T12:37:34.748 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.745+0000 7f09b27fc700 1 -- 192.168.123.100:0/3401243365 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3216 (secure 0 0 0) 0x7f09bc02ced0 con 0x7f09a806c7a0 2026-03-10T12:37:34.748 INFO:tasks.workunit.client.0.vm00.stdout:1/35: mknod da/ce 0 2026-03-10T12:37:34.748 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.747+0000 7f09a7fff700 1 -- 192.168.123.100:0/3401243365 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f09a806c7a0 msgr2=0x7f09a806ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:34.748 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.747+0000 7f09a7fff700 1 --2- 192.168.123.100:0/3401243365 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f09a806c7a0 0x7f09a806ec50 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto 
rx=0x7f09b400bb30 tx=0x7f09b400bf90 comp rx=0 tx=0).stop 2026-03-10T12:37:34.748 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.747+0000 7f09a7fff700 1 -- 192.168.123.100:0/3401243365 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09bc071950 msgr2=0x7f09bc0825b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:34.748 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.747+0000 7f09a7fff700 1 --2- 192.168.123.100:0/3401243365 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09bc071950 0x7f09bc0825b0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f09b800b770 tx=0x7f09b800bb30 comp rx=0 tx=0).stop 2026-03-10T12:37:34.748 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.748+0000 7f09a7fff700 1 -- 192.168.123.100:0/3401243365 shutdown_connections 2026-03-10T12:37:34.748 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.748+0000 7f09a7fff700 1 --2- 192.168.123.100:0/3401243365 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f09a806c7a0 0x7f09a806ec50 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:34.748 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.748+0000 7f09a7fff700 1 --2- 192.168.123.100:0/3401243365 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f09bc071950 0x7f09bc0825b0 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:34.748 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.748+0000 7f09a7fff700 1 --2- 192.168.123.100:0/3401243365 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f09bc082af0 0x7f09bc082f60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:34.748 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.748+0000 7f09a7fff700 1 -- 192.168.123.100:0/3401243365 >> 192.168.123.100:0/3401243365 
conn(0x7f09bc06d1a0 msgr2=0x7f09bc0764a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:34.748 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.748+0000 7f09a7fff700 1 -- 192.168.123.100:0/3401243365 shutdown_connections 2026-03-10T12:37:34.748 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.748+0000 7f09a7fff700 1 -- 192.168.123.100:0/3401243365 wait complete. 2026-03-10T12:37:34.756 INFO:tasks.workunit.client.0.vm00.stdout:1/36: rename c9 to da/cf 0 2026-03-10T12:37:34.757 INFO:tasks.workunit.client.0.vm00.stdout:1/37: mknod da/c10 0 2026-03-10T12:37:34.765 INFO:tasks.workunit.client.0.vm00.stdout:1/38: link da/fc da/f11 0 2026-03-10T12:37:34.765 INFO:tasks.workunit.client.0.vm00.stdout:1/39: dread - da/fc zero size 2026-03-10T12:37:34.768 INFO:tasks.workunit.client.0.vm00.stdout:1/40: mkdir da/d12 0 2026-03-10T12:37:34.769 INFO:tasks.workunit.client.0.vm00.stdout:1/41: write f3 [206713,26173] 0 2026-03-10T12:37:34.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.824+0000 7f8b1ecd7700 1 -- 192.168.123.100:0/3636848367 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8b100a5800 msgr2=0x7f8b100a5c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.824+0000 7f8b1ecd7700 1 --2- 192.168.123.100:0/3636848367 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8b100a5800 0x7f8b100a5c10 secure :-1 s=READY pgs=330 cs=0 l=1 rev1=1 crypto rx=0x7f8b14009a60 tx=0x7f8b14009d70 comp rx=0 tx=0).stop 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.824+0000 7f8b1ecd7700 1 -- 192.168.123.100:0/3636848367 shutdown_connections 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.824+0000 7f8b1ecd7700 1 --2- 192.168.123.100:0/3636848367 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8b100a3e40 0x7f8b100a4290 unknown :-1 
s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.824+0000 7f8b1ecd7700 1 --2- 192.168.123.100:0/3636848367 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8b100a5800 0x7f8b100a5c10 unknown :-1 s=CLOSED pgs=330 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.824+0000 7f8b1ecd7700 1 -- 192.168.123.100:0/3636848367 >> 192.168.123.100:0/3636848367 conn(0x7f8b1009f7b0 msgr2=0x7f8b100a1c00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.824+0000 7f8b1ecd7700 1 -- 192.168.123.100:0/3636848367 shutdown_connections 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.824+0000 7f8b1ecd7700 1 -- 192.168.123.100:0/3636848367 wait complete. 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.825+0000 7f8b1ecd7700 1 Processor -- start 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.825+0000 7f8b1ecd7700 1 -- start start 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.825+0000 7f8b1ecd7700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8b100a3e40 0x7f8b100141d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.825+0000 7f8b1ecd7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8b10014710 0x7f8b10015760 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.825+0000 7f8b1ecd7700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7f8b10014c10 con 0x7f8b10014710 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.825+0000 7f8b1ecd7700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8b10014d80 con 0x7f8b100a3e40 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.826+0000 7f8b1d4d4700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8b10014710 0x7f8b10015760 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.826+0000 7f8b1d4d4700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8b10014710 0x7f8b10015760 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:59262/0 (socket says 192.168.123.100:59262) 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.826+0000 7f8b1d4d4700 1 -- 192.168.123.100:0/1835453162 learned_addr learned my addr 192.168.123.100:0/1835453162 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.826+0000 7f8b1dcd5700 1 --2- 192.168.123.100:0/1835453162 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8b100a3e40 0x7f8b100141d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.826+0000 7f8b1d4d4700 1 -- 192.168.123.100:0/1835453162 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8b100a3e40 msgr2=0x7f8b100141d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:34.827 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.826+0000 7f8b1d4d4700 1 --2- 192.168.123.100:0/1835453162 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8b100a3e40 0x7f8b100141d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.826+0000 7f8b1d4d4700 1 -- 192.168.123.100:0/1835453162 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8b14009710 con 0x7f8b10014710 2026-03-10T12:37:34.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.826+0000 7f8b1d4d4700 1 --2- 192.168.123.100:0/1835453162 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8b10014710 0x7f8b10015760 secure :-1 s=READY pgs=331 cs=0 l=1 rev1=1 crypto rx=0x7f8b18066720 tx=0x7f8b18072a20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:34.828 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.827+0000 7f8b0effd700 1 -- 192.168.123.100:0/1835453162 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8b180675b0 con 0x7f8b10014710 2026-03-10T12:37:34.828 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.828+0000 7f8b1ecd7700 1 -- 192.168.123.100:0/1835453162 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8b10015d00 con 0x7f8b10014710 2026-03-10T12:37:34.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.828+0000 7f8b1ecd7700 1 -- 192.168.123.100:0/1835453162 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8b10016220 con 0x7f8b10014710 2026-03-10T12:37:34.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.828+0000 7f8b0effd700 1 -- 192.168.123.100:0/1835453162 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 
==== 1139+0+0 (secure 0 0 0) 0x7f8b1807a910 con 0x7f8b10014710 2026-03-10T12:37:34.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.828+0000 7f8b0effd700 1 -- 192.168.123.100:0/1835453162 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8b18083960 con 0x7f8b10014710 2026-03-10T12:37:34.830 INFO:tasks.workunit.client.0.vm00.stdout:9/31: truncate d0/d5/f3 8345490 0 2026-03-10T12:37:34.830 INFO:tasks.workunit.client.0.vm00.stdout:9/32: rename d0 to d0/db 22 2026-03-10T12:37:34.832 INFO:tasks.workunit.client.0.vm00.stdout:9/33: dread d0/f4 [0,4194304] 0 2026-03-10T12:37:34.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.830+0000 7f8b0effd700 1 -- 192.168.123.100:0/1835453162 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f8b18083ac0 con 0x7f8b10014710 2026-03-10T12:37:34.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.830+0000 7f8b0effd700 1 --2- 192.168.123.100:0/1835453162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8b0406c6d0 0x7f8b0406eb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:34.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.831+0000 7f8b1dcd5700 1 --2- 192.168.123.100:0/1835453162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8b0406c6d0 0x7f8b0406eb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:34.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.831+0000 7f8b0effd700 1 -- 192.168.123.100:0/1835453162 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f8b180ef0f0 con 0x7f8b10014710 2026-03-10T12:37:34.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.831+0000 7f8b1ecd7700 1 -- 192.168.123.100:0/1835453162 
--> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8b10004f40 con 0x7f8b10014710 2026-03-10T12:37:34.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.832+0000 7f8b1dcd5700 1 --2- 192.168.123.100:0/1835453162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8b0406c6d0 0x7f8b0406eb80 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f8b1400d7d0 tx=0x7f8b14017040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:34.835 INFO:tasks.workunit.client.1.vm07.stdout:6/165: dwrite d1/d9/f22 [0,4194304] 0 2026-03-10T12:37:34.836 INFO:tasks.workunit.client.0.vm00.stdout:9/34: mkdir d0/d5/dc 0 2026-03-10T12:37:34.838 INFO:tasks.workunit.client.0.vm00.stdout:9/35: mkdir d0/dd 0 2026-03-10T12:37:34.839 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:34.838+0000 7f8b0effd700 1 -- 192.168.123.100:0/1835453162 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f8b180bd300 con 0x7f8b10014710 2026-03-10T12:37:34.844 INFO:tasks.workunit.client.1.vm07.stdout:6/166: creat d1/f38 x:0 0 0 2026-03-10T12:37:34.850 INFO:tasks.workunit.client.1.vm07.stdout:6/167: creat d1/d4/d6/f39 x:0 0 0 2026-03-10T12:37:34.854 INFO:tasks.workunit.client.1.vm07.stdout:6/168: symlink d1/d4/d6/l3a 0 2026-03-10T12:37:34.855 INFO:tasks.workunit.client.1.vm07.stdout:6/169: chown d1/d4/d6/d16/d1a/d33/f37 29840988 1 2026-03-10T12:37:34.857 INFO:tasks.workunit.client.1.vm07.stdout:9/158: truncate d5/d13/d2c/f30 1051783 0 2026-03-10T12:37:34.859 INFO:tasks.workunit.client.1.vm07.stdout:9/159: readlink d5/l7 0 2026-03-10T12:37:34.860 INFO:tasks.workunit.client.1.vm07.stdout:9/160: fsync d5/fb 0 2026-03-10T12:37:34.860 INFO:tasks.workunit.client.1.vm07.stdout:9/161: stat d5/d13/d22 0 2026-03-10T12:37:34.871 
INFO:tasks.workunit.client.1.vm07.stdout:9/162: symlink d5/d13/l33 0 2026-03-10T12:37:34.877 INFO:tasks.workunit.client.1.vm07.stdout:9/163: dread d5/d16/f19 [0,4194304] 0 2026-03-10T12:37:34.881 INFO:tasks.workunit.client.1.vm07.stdout:7/188: write d0/f29 [137849,45530] 0 2026-03-10T12:37:34.884 INFO:tasks.workunit.client.1.vm07.stdout:7/189: dread - d0/f1d zero size 2026-03-10T12:37:34.892 INFO:tasks.workunit.client.1.vm07.stdout:7/190: dwrite d0/f21 [0,4194304] 0 2026-03-10T12:37:34.902 INFO:tasks.workunit.client.1.vm07.stdout:7/191: unlink d0/f1f 0 2026-03-10T12:37:34.912 INFO:tasks.workunit.client.1.vm07.stdout:4/208: dwrite d0/d4/d10/f16 [0,4194304] 0 2026-03-10T12:37:34.930 INFO:tasks.workunit.client.1.vm07.stdout:4/209: truncate d0/d4/d5/da/f44 412863 0 2026-03-10T12:37:34.930 INFO:tasks.workunit.client.1.vm07.stdout:7/192: rename d0/f1d to d0/f2f 0 2026-03-10T12:37:34.930 INFO:tasks.workunit.client.1.vm07.stdout:7/193: truncate d0/f26 500557 0 2026-03-10T12:37:34.930 INFO:tasks.workunit.client.1.vm07.stdout:1/212: dwrite d9/fb [0,4194304] 0 2026-03-10T12:37:34.939 INFO:tasks.workunit.client.1.vm07.stdout:4/210: mknod d0/d4/d10/c45 0 2026-03-10T12:37:34.941 INFO:tasks.workunit.client.1.vm07.stdout:0/257: dread d0/d14/d1a/f24 [0,4194304] 0 2026-03-10T12:37:34.943 INFO:tasks.workunit.client.1.vm07.stdout:8/220: write d1/f3d [911591,85653] 0 2026-03-10T12:37:34.955 INFO:tasks.workunit.client.1.vm07.stdout:0/258: symlink d0/d14/d1a/d1b/l52 0 2026-03-10T12:37:34.967 INFO:tasks.workunit.client.1.vm07.stdout:7/194: creat d0/f30 x:0 0 0 2026-03-10T12:37:34.967 INFO:tasks.workunit.client.1.vm07.stdout:5/233: getdents d0/d22/d18/d3e 0 2026-03-10T12:37:34.967 INFO:tasks.workunit.client.1.vm07.stdout:4/211: chown d0/d4/d10/d3c/d2b/d2d/l3f 46 1 2026-03-10T12:37:34.974 INFO:tasks.workunit.client.1.vm07.stdout:5/234: write d0/ff [862309,74049] 0 2026-03-10T12:37:34.976 INFO:tasks.workunit.client.1.vm07.stdout:4/212: mkdir d0/d4/d10/d23/d46 0 2026-03-10T12:37:34.976 
INFO:tasks.workunit.client.0.vm00.stdout:4/47: getdents df 0 2026-03-10T12:37:34.976 INFO:tasks.workunit.client.0.vm00.stdout:4/48: chown fb 0 1 2026-03-10T12:37:34.977 INFO:tasks.workunit.client.0.vm00.stdout:7/39: getdents da 0 2026-03-10T12:37:34.979 INFO:tasks.workunit.client.1.vm07.stdout:7/195: unlink d0/c22 0 2026-03-10T12:37:34.982 INFO:tasks.workunit.client.1.vm07.stdout:8/221: link d1/f19 d1/f48 0 2026-03-10T12:37:34.984 INFO:tasks.workunit.client.1.vm07.stdout:5/235: symlink d0/d22/d18/d19/d2e/d3f/l55 0 2026-03-10T12:37:34.984 INFO:tasks.workunit.client.1.vm07.stdout:5/236: fsync d0/f1f 0 2026-03-10T12:37:34.984 INFO:tasks.workunit.client.1.vm07.stdout:5/237: chown d0/ff 5 1 2026-03-10T12:37:34.986 INFO:tasks.workunit.client.1.vm07.stdout:4/213: creat d0/d4/d10/d23/f47 x:0 0 0 2026-03-10T12:37:34.986 INFO:tasks.workunit.client.0.vm00.stdout:4/49: creat df/f12 x:0 0 0 2026-03-10T12:37:34.987 INFO:tasks.workunit.client.0.vm00.stdout:4/50: write fb [903349,72216] 0 2026-03-10T12:37:34.987 INFO:tasks.workunit.client.0.vm00.stdout:4/51: chown lc 739978 1 2026-03-10T12:37:34.987 INFO:tasks.workunit.client.0.vm00.stdout:7/40: mknod da/cf 0 2026-03-10T12:37:34.991 INFO:tasks.workunit.client.1.vm07.stdout:8/222: read d1/d3/ff [1750732,102217] 0 2026-03-10T12:37:34.995 INFO:tasks.workunit.client.0.vm00.stdout:6/55: getdents d2/da 0 2026-03-10T12:37:34.995 INFO:tasks.workunit.client.1.vm07.stdout:3/224: fsync dc/dd/d1f/d45/f56 0 2026-03-10T12:37:34.997 INFO:tasks.workunit.client.0.vm00.stdout:6/56: dread d2/da/fe [0,4194304] 0 2026-03-10T12:37:35.000 INFO:tasks.workunit.client.1.vm07.stdout:0/259: link d0/l1f d0/d14/d1a/d2f/d31/d4f/l53 0 2026-03-10T12:37:35.003 INFO:tasks.workunit.client.0.vm00.stdout:7/41: write f4 [558331,42729] 0 2026-03-10T12:37:35.011 INFO:tasks.workunit.client.0.vm00.stdout:6/57: creat d2/da/f10 x:0 0 0 2026-03-10T12:37:35.012 INFO:tasks.workunit.client.1.vm07.stdout:2/157: write d0/d19/d1f/f2f [2259131,36303] 0 2026-03-10T12:37:35.015 
INFO:tasks.workunit.client.0.vm00.stdout:6/58: dwrite d2/da/f10 [0,4194304] 0 2026-03-10T12:37:35.023 INFO:tasks.workunit.client.1.vm07.stdout:8/223: mknod d1/c49 0 2026-03-10T12:37:35.027 INFO:tasks.workunit.client.1.vm07.stdout:8/224: dwrite d1/d3/f1d [4194304,4194304] 0 2026-03-10T12:37:35.033 INFO:tasks.workunit.client.0.vm00.stdout:6/59: creat d2/da/f11 x:0 0 0 2026-03-10T12:37:35.036 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.036+0000 7f8b1ecd7700 1 -- 192.168.123.100:0/1835453162 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f8b10151380 con 0x7f8b10014710 2026-03-10T12:37:35.037 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.036+0000 7f8b0effd700 1 -- 192.168.123.100:0/1835453162 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f8b180bce90 con 0x7f8b10014710 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 
2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 14 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:37:35.038 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:37:35.041 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.040+0000 7f8b0cff9700 1 -- 192.168.123.100:0/1835453162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8b0406c6d0 msgr2=0x7f8b0406eb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:35.041 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.040+0000 7f8b0cff9700 1 --2- 192.168.123.100:0/1835453162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8b0406c6d0 0x7f8b0406eb80 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f8b1400d7d0 tx=0x7f8b14017040 comp rx=0 tx=0).stop 2026-03-10T12:37:35.041 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.041+0000 7f8b0cff9700 1 -- 192.168.123.100:0/1835453162 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8b10014710 msgr2=0x7f8b10015760 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:35.041 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.041+0000 7f8b0cff9700 1 --2- 192.168.123.100:0/1835453162 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8b10014710 0x7f8b10015760 secure :-1 s=READY pgs=331 cs=0 l=1 rev1=1 crypto rx=0x7f8b18066720 tx=0x7f8b18072a20 comp rx=0 tx=0).stop 2026-03-10T12:37:35.041 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.041+0000 7f8b0cff9700 1 -- 
192.168.123.100:0/1835453162 shutdown_connections 2026-03-10T12:37:35.042 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.041+0000 7f8b0cff9700 1 --2- 192.168.123.100:0/1835453162 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f8b0406c6d0 0x7f8b0406eb80 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.042 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.041+0000 7f8b0cff9700 1 --2- 192.168.123.100:0/1835453162 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8b100a3e40 0x7f8b100141d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.042 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.042+0000 7f8b0cff9700 1 --2- 192.168.123.100:0/1835453162 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8b10014710 0x7f8b10015760 unknown :-1 s=CLOSED pgs=331 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.042 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.042+0000 7f8b0cff9700 1 -- 192.168.123.100:0/1835453162 >> 192.168.123.100:0/1835453162 conn(0x7f8b1009f7b0 msgr2=0x7f8b100a1b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:35.042 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.042+0000 7f8b0cff9700 1 -- 192.168.123.100:0/1835453162 shutdown_connections 2026-03-10T12:37:35.043 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.042+0000 7f8b0cff9700 1 -- 192.168.123.100:0/1835453162 wait complete. 
2026-03-10T12:37:35.045 INFO:tasks.workunit.client.1.vm07.stdout:5/238: getdents d0/d22/d18/d3e 0 2026-03-10T12:37:35.046 INFO:tasks.workunit.client.0.vm00.stdout:3/45: fsync fc 0 2026-03-10T12:37:35.051 INFO:tasks.workunit.client.1.vm07.stdout:2/158: dwrite d0/d19/f22 [0,4194304] 0 2026-03-10T12:37:35.051 INFO:tasks.workunit.client.1.vm07.stdout:2/159: write d0/d19/d26/f27 [465318,84683] 0 2026-03-10T12:37:35.051 INFO:tasks.workunit.client.0.vm00.stdout:3/46: creat dd/d11/f12 x:0 0 0 2026-03-10T12:37:35.051 INFO:tasks.workunit.client.0.vm00.stdout:3/47: chown dd/ce 9479362 1 2026-03-10T12:37:35.053 INFO:tasks.workunit.client.0.vm00.stdout:3/48: rmdir dd 39 2026-03-10T12:37:35.057 INFO:tasks.workunit.client.0.vm00.stdout:3/49: mkdir dd/d11/d13 0 2026-03-10T12:37:35.058 INFO:tasks.workunit.client.0.vm00.stdout:3/50: mkdir dd/d11/d14 0 2026-03-10T12:37:35.061 INFO:tasks.workunit.client.0.vm00.stdout:3/51: dread fb [0,4194304] 0 2026-03-10T12:37:35.061 INFO:tasks.workunit.client.1.vm07.stdout:0/260: link d0/d14/d1a/f24 d0/d14/d1a/d1b/f54 0 2026-03-10T12:37:35.062 INFO:tasks.workunit.client.1.vm07.stdout:0/261: chown d0/d14/d1a/l4c 241447640 1 2026-03-10T12:37:35.063 INFO:tasks.workunit.client.0.vm00.stdout:3/52: rename fc to dd/f15 0 2026-03-10T12:37:35.063 INFO:tasks.workunit.client.1.vm07.stdout:0/262: truncate d0/d14/d1a/d2f/d31/f4d 4310307 0 2026-03-10T12:37:35.064 INFO:tasks.workunit.client.0.vm00.stdout:3/53: write dd/f15 [1459763,94311] 0 2026-03-10T12:37:35.066 INFO:tasks.workunit.client.0.vm00.stdout:3/54: creat dd/f16 x:0 0 0 2026-03-10T12:37:35.073 INFO:tasks.workunit.client.1.vm07.stdout:8/225: symlink d1/d3/l4a 0 2026-03-10T12:37:35.073 INFO:tasks.workunit.client.1.vm07.stdout:6/170: dwrite d1/d4/f31 [0,4194304] 0 2026-03-10T12:37:35.076 INFO:tasks.workunit.client.1.vm07.stdout:6/171: dwrite d1/d4/d6/f36 [0,4194304] 0 2026-03-10T12:37:35.081 INFO:tasks.workunit.client.1.vm07.stdout:6/172: write d1/d4/d6/d16/d1a/f29 [665005,93918] 0 2026-03-10T12:37:35.083 
INFO:tasks.workunit.client.1.vm07.stdout:2/160: creat d0/d19/d1f/d20/f39 x:0 0 0 2026-03-10T12:37:35.089 INFO:tasks.workunit.client.1.vm07.stdout:3/225: link dc/l13 dc/d18/d2d/d3d/l57 0 2026-03-10T12:37:35.094 INFO:tasks.workunit.client.1.vm07.stdout:1/213: readlink d9/df/d29/d2b/d31/l45 0 2026-03-10T12:37:35.094 INFO:tasks.workunit.client.1.vm07.stdout:0/263: chown d0/d14/c49 1 1 2026-03-10T12:37:35.094 INFO:tasks.workunit.client.1.vm07.stdout:0/264: readlink d0/d14/l17 0 2026-03-10T12:37:35.108 INFO:tasks.workunit.client.1.vm07.stdout:2/161: dwrite d0/d19/d1f/d20/f2b [0,4194304] 0 2026-03-10T12:37:35.108 INFO:tasks.workunit.client.0.vm00.stdout:6/60: fdatasync d2/da/f10 0 2026-03-10T12:37:35.109 INFO:tasks.workunit.client.1.vm07.stdout:2/162: read d0/f14 [3936726,95992] 0 2026-03-10T12:37:35.117 INFO:tasks.workunit.client.1.vm07.stdout:2/163: dread d0/f18 [0,4194304] 0 2026-03-10T12:37:35.119 INFO:tasks.workunit.client.1.vm07.stdout:3/226: symlink dc/dd/d1f/d45/l58 0 2026-03-10T12:37:35.121 INFO:tasks.workunit.client.1.vm07.stdout:9/164: dread d5/d13/d2c/f30 [0,4194304] 0 2026-03-10T12:37:35.122 INFO:tasks.workunit.client.0.vm00.stdout:6/61: write d2/da/dc/fd [2046315,27663] 0 2026-03-10T12:37:35.127 INFO:tasks.workunit.client.1.vm07.stdout:1/214: readlink d9/df/d29/d2b/d31/l45 0 2026-03-10T12:37:35.127 INFO:tasks.workunit.client.1.vm07.stdout:9/165: dwrite d5/f8 [0,4194304] 0 2026-03-10T12:37:35.127 INFO:tasks.workunit.client.1.vm07.stdout:9/166: chown d5/c11 26803284 1 2026-03-10T12:37:35.135 INFO:tasks.workunit.client.1.vm07.stdout:0/265: creat d0/d14/d1a/d1b/d41/f55 x:0 0 0 2026-03-10T12:37:35.138 INFO:tasks.workunit.client.1.vm07.stdout:7/196: getdents d0 0 2026-03-10T12:37:35.139 INFO:tasks.workunit.client.0.vm00.stdout:6/62: symlink d2/da/dc/l12 0 2026-03-10T12:37:35.140 INFO:tasks.workunit.client.0.vm00.stdout:6/63: write d2/da/dc/ff [598718,61188] 0 2026-03-10T12:37:35.142 INFO:tasks.workunit.client.1.vm07.stdout:4/214: write d0/d4/d10/d23/f40 
[233143,18477] 0 2026-03-10T12:37:35.145 INFO:tasks.workunit.client.0.vm00.stdout:6/64: dwrite d2/da/dc/fd [0,4194304] 0 2026-03-10T12:37:35.159 INFO:tasks.workunit.client.0.vm00.stdout:6/65: creat d2/da/dc/f13 x:0 0 0 2026-03-10T12:37:35.159 INFO:tasks.workunit.client.0.vm00.stdout:6/66: write d2/da/f11 [603684,20025] 0 2026-03-10T12:37:35.162 INFO:tasks.workunit.client.1.vm07.stdout:9/167: chown d5/c29 6863 1 2026-03-10T12:37:35.163 INFO:tasks.workunit.client.0.vm00.stdout:6/67: dread d2/da/fe [0,4194304] 0 2026-03-10T12:37:35.163 INFO:tasks.workunit.client.0.vm00.stdout:6/68: write d2/da/f11 [283734,41620] 0 2026-03-10T12:37:35.165 INFO:tasks.workunit.client.0.vm00.stdout:6/69: mkdir d2/d14 0 2026-03-10T12:37:35.165 INFO:tasks.workunit.client.1.vm07.stdout:4/215: dread d0/d4/d10/d3c/f22 [0,4194304] 0 2026-03-10T12:37:35.168 INFO:tasks.workunit.client.0.vm00.stdout:6/70: dread d2/da/fe [0,4194304] 0 2026-03-10T12:37:35.170 INFO:tasks.workunit.client.0.vm00.stdout:6/71: mkdir d2/da/dc/d15 0 2026-03-10T12:37:35.170 INFO:tasks.workunit.client.0.vm00.stdout:6/72: write d2/f9 [4296434,1112] 0 2026-03-10T12:37:35.172 INFO:tasks.workunit.client.1.vm07.stdout:8/226: rename d1/d3/d18/f1b to d1/f4b 0 2026-03-10T12:37:35.172 INFO:tasks.workunit.client.0.vm00.stdout:6/73: write d2/da/dc/fd [4860889,117408] 0 2026-03-10T12:37:35.173 INFO:tasks.workunit.client.1.vm07.stdout:6/173: getdents d1/d4/d6/d16 0 2026-03-10T12:37:35.174 INFO:tasks.workunit.client.0.vm00.stdout:6/74: rename d2/da/dc/d15 to d2/d16 0 2026-03-10T12:37:35.175 INFO:tasks.workunit.client.0.vm00.stdout:6/75: unlink d2/da/fe 0 2026-03-10T12:37:35.176 INFO:tasks.workunit.client.0.vm00.stdout:6/76: chown d2/c8 12 1 2026-03-10T12:37:35.176 INFO:tasks.workunit.client.0.vm00.stdout:6/77: creat d2/d16/f17 x:0 0 0 2026-03-10T12:37:35.181 INFO:tasks.workunit.client.0.vm00.stdout:6/78: dwrite d2/da/f11 [0,4194304] 0 2026-03-10T12:37:35.181 INFO:tasks.workunit.client.1.vm07.stdout:2/164: creat d0/d19/d26/d38/f3a x:0 0 0 
2026-03-10T12:37:35.186 INFO:tasks.workunit.client.1.vm07.stdout:3/227: unlink dc/c26 0 2026-03-10T12:37:35.186 INFO:tasks.workunit.client.1.vm07.stdout:3/228: readlink dc/d18/d2d/l4b 0 2026-03-10T12:37:35.197 INFO:tasks.workunit.client.0.vm00.stdout:6/79: creat d2/da/dc/f18 x:0 0 0 2026-03-10T12:37:35.198 INFO:tasks.workunit.client.0.vm00.stdout:6/80: creat d2/d16/f19 x:0 0 0 2026-03-10T12:37:35.199 INFO:tasks.workunit.client.1.vm07.stdout:4/216: dwrite d0/d4/d10/d23/f27 [0,4194304] 0 2026-03-10T12:37:35.199 INFO:tasks.workunit.client.0.vm00.stdout:6/81: rename d2/da/f10 to d2/da/f1a 0 2026-03-10T12:37:35.199 INFO:tasks.workunit.client.1.vm07.stdout:8/227: dwrite d1/f3d [0,4194304] 0 2026-03-10T12:37:35.201 INFO:tasks.workunit.client.1.vm07.stdout:8/228: write d1/d3/f1d [7613757,119761] 0 2026-03-10T12:37:35.201 INFO:tasks.workunit.client.1.vm07.stdout:4/217: chown d0/d4/d10/d18/f1a 96273 1 2026-03-10T12:37:35.204 INFO:tasks.workunit.client.1.vm07.stdout:4/218: chown d0/d4/d5/da/f44 494 1 2026-03-10T12:37:35.213 INFO:tasks.workunit.client.1.vm07.stdout:2/165: mknod d0/d29/c3b 0 2026-03-10T12:37:35.222 INFO:tasks.workunit.client.0.vm00.stdout:8/29: getdents d0 0 2026-03-10T12:37:35.222 INFO:tasks.workunit.client.0.vm00.stdout:8/30: dwrite - no filename 2026-03-10T12:37:35.222 INFO:tasks.workunit.client.0.vm00.stdout:8/31: chown d0/l1 482510 1 2026-03-10T12:37:35.222 INFO:tasks.workunit.client.1.vm07.stdout:3/229: readlink dc/d18/d2d/d3d/l44 0 2026-03-10T12:37:35.222 INFO:tasks.workunit.client.1.vm07.stdout:8/229: rmdir d1/d3 39 2026-03-10T12:37:35.223 INFO:tasks.workunit.client.1.vm07.stdout:3/230: dwrite dc/dd/d1f/f27 [0,4194304] 0 2026-03-10T12:37:35.223 INFO:tasks.workunit.client.1.vm07.stdout:4/219: rename d0/d4/d10/f16 to d0/d4/d5/da/f48 0 2026-03-10T12:37:35.223 INFO:tasks.workunit.client.1.vm07.stdout:1/215: getdents d9/df/d29 0 2026-03-10T12:37:35.224 INFO:tasks.workunit.client.1.vm07.stdout:4/220: mknod d0/d4/d10/d23/c49 0 2026-03-10T12:37:35.225 
INFO:tasks.workunit.client.1.vm07.stdout:6/174: creat d1/d4/f3b x:0 0 0 2026-03-10T12:37:35.225 INFO:tasks.workunit.client.1.vm07.stdout:8/230: rmdir d1/d3/d6 39 2026-03-10T12:37:35.225 INFO:tasks.workunit.client.1.vm07.stdout:3/231: fdatasync dc/dd/d1f/d45/f56 0 2026-03-10T12:37:35.226 INFO:tasks.workunit.client.1.vm07.stdout:8/231: write d1/d3/d18/f38 [1236621,56867] 0 2026-03-10T12:37:35.227 INFO:tasks.workunit.client.0.vm00.stdout:0/21: truncate f0 457398 0 2026-03-10T12:37:35.227 INFO:tasks.workunit.client.0.vm00.stdout:0/22: dread - d3/f4 zero size 2026-03-10T12:37:35.227 INFO:tasks.workunit.client.0.vm00.stdout:0/23: chown d3/l5 136 1 2026-03-10T12:37:35.228 INFO:tasks.workunit.client.0.vm00.stdout:0/24: read - d3/f4 zero size 2026-03-10T12:37:35.228 INFO:tasks.workunit.client.0.vm00.stdout:8/32: link d0/l1 d0/l4 0 2026-03-10T12:37:35.229 INFO:tasks.workunit.client.0.vm00.stdout:0/25: mkdir d3/d7 0 2026-03-10T12:37:35.229 INFO:tasks.workunit.client.0.vm00.stdout:0/26: dread - d3/f4 zero size 2026-03-10T12:37:35.230 INFO:tasks.workunit.client.0.vm00.stdout:0/27: chown c1 2469947 1 2026-03-10T12:37:35.230 INFO:tasks.workunit.client.0.vm00.stdout:0/28: dread - d3/f4 zero size 2026-03-10T12:37:35.230 INFO:tasks.workunit.client.0.vm00.stdout:0/29: chown c1 112800 1 2026-03-10T12:37:35.231 INFO:tasks.workunit.client.0.vm00.stdout:0/30: chown f0 42476 1 2026-03-10T12:37:35.231 INFO:tasks.workunit.client.0.vm00.stdout:0/31: rename d3 to d3/d8 22 2026-03-10T12:37:35.232 INFO:tasks.workunit.client.0.vm00.stdout:0/32: rename d3/l6 to d3/l9 0 2026-03-10T12:37:35.233 INFO:tasks.workunit.client.1.vm07.stdout:4/221: unlink d0/d4/d10/d23/f47 0 2026-03-10T12:37:35.234 INFO:tasks.workunit.client.1.vm07.stdout:4/222: fsync d0/d19/f25 0 2026-03-10T12:37:35.234 INFO:tasks.workunit.client.0.vm00.stdout:0/33: symlink d3/la 0 2026-03-10T12:37:35.234 INFO:tasks.workunit.client.1.vm07.stdout:6/175: creat d1/d4/d6/d16/d1a/d33/f3c x:0 0 0 2026-03-10T12:37:35.234 
INFO:tasks.workunit.client.1.vm07.stdout:4/223: chown d0/d4/d5/d34 353 1 2026-03-10T12:37:35.235 INFO:tasks.workunit.client.1.vm07.stdout:8/232: creat d1/d3/d40/f4c x:0 0 0 2026-03-10T12:37:35.235 INFO:tasks.workunit.client.1.vm07.stdout:6/176: dread - d1/d4/d6/f39 zero size 2026-03-10T12:37:35.236 INFO:tasks.workunit.client.0.vm00.stdout:2/25: dread f3 [0,4194304] 0 2026-03-10T12:37:35.236 INFO:tasks.workunit.client.0.vm00.stdout:2/26: read f3 [20088,11970] 0 2026-03-10T12:37:35.236 INFO:tasks.workunit.client.1.vm07.stdout:3/232: symlink dc/dd/d1f/l59 0 2026-03-10T12:37:35.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.241+0000 7ff24cc54700 1 -- 192.168.123.100:0/867301 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff248072330 msgr2=0x7ff2480770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:35.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.241+0000 7ff24cc54700 1 --2- 192.168.123.100:0/867301 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff248072330 0x7ff2480770b0 secure :-1 s=READY pgs=332 cs=0 l=1 rev1=1 crypto rx=0x7ff238009a60 tx=0x7ff238009d70 comp rx=0 tx=0).stop 2026-03-10T12:37:35.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.241+0000 7ff24cc54700 1 -- 192.168.123.100:0/867301 shutdown_connections 2026-03-10T12:37:35.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.241+0000 7ff24cc54700 1 --2- 192.168.123.100:0/867301 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff248072330 0x7ff2480770b0 unknown :-1 s=CLOSED pgs=332 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.241+0000 7ff24cc54700 1 --2- 192.168.123.100:0/867301 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff248071950 0x7ff248071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:37:35.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.241+0000 7ff24cc54700 1 -- 192.168.123.100:0/867301 >> 192.168.123.100:0/867301 conn(0x7ff24806d1a0 msgr2=0x7ff24806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:35.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.241+0000 7ff24cc54700 1 -- 192.168.123.100:0/867301 shutdown_connections 2026-03-10T12:37:35.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.241+0000 7ff24cc54700 1 -- 192.168.123.100:0/867301 wait complete. 2026-03-10T12:37:35.243 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.241+0000 7ff24cc54700 1 Processor -- start 2026-03-10T12:37:35.243 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.242+0000 7ff24cc54700 1 -- start start 2026-03-10T12:37:35.243 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.242+0000 7ff24cc54700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff248071950 0x7ff2480824f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:35.243 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.242+0000 7ff24cc54700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff248082a30 0x7ff248082ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:35.243 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.242+0000 7ff24cc54700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff2481b2a90 con 0x7ff248071950 2026-03-10T12:37:35.243 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.242+0000 7ff24cc54700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff2481b2bd0 con 0x7ff248082a30 2026-03-10T12:37:35.244 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.244+0000 7ff2477fe700 1 --2- >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff248071950 0x7ff2480824f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:35.244 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.244+0000 7ff2477fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff248071950 0x7ff2480824f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:59280/0 (socket says 192.168.123.100:59280) 2026-03-10T12:37:35.244 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.244+0000 7ff2477fe700 1 -- 192.168.123.100:0/3068058105 learned_addr learned my addr 192.168.123.100:0/3068058105 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:37:35.244 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.244+0000 7ff246ffd700 1 --2- 192.168.123.100:0/3068058105 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff248082a30 0x7ff248082ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:35.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.245+0000 7ff2477fe700 1 -- 192.168.123.100:0/3068058105 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff248082a30 msgr2=0x7ff248082ea0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:35.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.245+0000 7ff2477fe700 1 --2- 192.168.123.100:0/3068058105 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff248082a30 0x7ff248082ea0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.245+0000 7ff2477fe700 1 -- 
192.168.123.100:0/3068058105 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff238009710 con 0x7ff248071950 2026-03-10T12:37:35.246 INFO:tasks.workunit.client.0.vm00.stdout:1/42: fsync da/f11 0 2026-03-10T12:37:35.246 INFO:tasks.workunit.client.1.vm07.stdout:4/224: dwrite d0/d4/d10/d3c/d2b/f3b [0,4194304] 0 2026-03-10T12:37:35.246 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.246+0000 7ff2477fe700 1 --2- 192.168.123.100:0/3068058105 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff248071950 0x7ff2480824f0 secure :-1 s=READY pgs=333 cs=0 l=1 rev1=1 crypto rx=0x7ff24000ea00 tx=0x7ff24000edc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:35.247 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.247+0000 7ff244ff9700 1 -- 192.168.123.100:0/3068058105 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff24000d4e0 con 0x7ff248071950 2026-03-10T12:37:35.249 INFO:tasks.workunit.client.1.vm07.stdout:9/168: sync 2026-03-10T12:37:35.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.247+0000 7ff24cc54700 1 -- 192.168.123.100:0/3068058105 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff2481b2d10 con 0x7ff248071950 2026-03-10T12:37:35.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.247+0000 7ff24cc54700 1 -- 192.168.123.100:0/3068058105 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff2481b31d0 con 0x7ff248071950 2026-03-10T12:37:35.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.248+0000 7ff244ff9700 1 -- 192.168.123.100:0/3068058105 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff240018470 con 0x7ff248071950 2026-03-10T12:37:35.249 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.248+0000 7ff244ff9700 1 -- 192.168.123.100:0/3068058105 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff24000f660 con 0x7ff248071950 2026-03-10T12:37:35.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.248+0000 7ff22e7fc700 1 -- 192.168.123.100:0/3068058105 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff24804ea50 con 0x7ff248071950 2026-03-10T12:37:35.250 INFO:tasks.workunit.client.1.vm07.stdout:6/177: dwrite d1/d4/d6/d16/d1a/f29 [0,4194304] 0 2026-03-10T12:37:35.251 INFO:tasks.workunit.client.1.vm07.stdout:6/178: read d1/d4/f31 [29245,91915] 0 2026-03-10T12:37:35.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.251+0000 7ff244ff9700 1 -- 192.168.123.100:0/3068058105 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7ff240015070 con 0x7ff248071950 2026-03-10T12:37:35.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.251+0000 7ff244ff9700 1 --2- 192.168.123.100:0/3068058105 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff230070af0 0x7ff230072fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:35.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.251+0000 7ff244ff9700 1 -- 192.168.123.100:0/3068058105 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7ff24008c370 con 0x7ff248071950 2026-03-10T12:37:35.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.251+0000 7ff246ffd700 1 --2- 192.168.123.100:0/3068058105 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff230070af0 0x7ff230072fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T12:37:35.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.252+0000 7ff246ffd700 1 --2- 192.168.123.100:0/3068058105 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff230070af0 0x7ff230072fa0 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7ff23800b5c0 tx=0x7ff238011040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:35.263 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.255+0000 7ff244ff9700 1 -- 192.168.123.100:0/3068058105 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7ff24005a600 con 0x7ff248071950 2026-03-10T12:37:35.263 INFO:tasks.workunit.client.0.vm00.stdout:1/43: creat da/f13 x:0 0 0 2026-03-10T12:37:35.263 INFO:tasks.workunit.client.1.vm07.stdout:5/239: write d0/d22/f50 [2244297,58985] 0 2026-03-10T12:37:35.263 INFO:tasks.workunit.client.1.vm07.stdout:3/233: rename dc/dd/d1f/f2f to dc/d18/d2d/d3d/f5a 0 2026-03-10T12:37:35.263 INFO:tasks.workunit.client.1.vm07.stdout:9/169: unlink d5/d16/l1d 0 2026-03-10T12:37:35.263 INFO:tasks.workunit.client.1.vm07.stdout:9/170: write d5/d13/f2b [703179,54492] 0 2026-03-10T12:37:35.267 INFO:tasks.workunit.client.1.vm07.stdout:3/234: rmdir dc/d18/d2d 39 2026-03-10T12:37:35.268 INFO:tasks.workunit.client.1.vm07.stdout:3/235: read dc/d18/d24/f3e [1434844,94950] 0 2026-03-10T12:37:35.273 INFO:tasks.workunit.client.1.vm07.stdout:3/236: dwrite dc/f17 [4194304,4194304] 0 2026-03-10T12:37:35.278 INFO:tasks.workunit.client.0.vm00.stdout:7/42: getdents da 0 2026-03-10T12:37:35.278 INFO:tasks.workunit.client.0.vm00.stdout:1/44: chown da/cf 61 1 2026-03-10T12:37:35.278 INFO:tasks.workunit.client.1.vm07.stdout:9/171: rename d5/d13/d22/f25 to d5/d16/f34 0 2026-03-10T12:37:35.278 INFO:tasks.workunit.client.0.vm00.stdout:7/43: chown da/fb 19577 1 2026-03-10T12:37:35.280 INFO:tasks.workunit.client.0.vm00.stdout:9/36: 
dwrite d0/d5/f3 [0,4194304] 0 2026-03-10T12:37:35.280 INFO:tasks.workunit.client.1.vm07.stdout:6/179: creat d1/f3d x:0 0 0 2026-03-10T12:37:35.280 INFO:tasks.workunit.client.1.vm07.stdout:6/180: write d1/d4/d6/f2a [935648,45384] 0 2026-03-10T12:37:35.282 INFO:tasks.workunit.client.1.vm07.stdout:9/172: creat d5/d16/f35 x:0 0 0 2026-03-10T12:37:35.282 INFO:tasks.workunit.client.0.vm00.stdout:1/45: dread f4 [0,4194304] 0 2026-03-10T12:37:35.282 INFO:tasks.workunit.client.0.vm00.stdout:1/46: read - da/fc zero size 2026-03-10T12:37:35.343 INFO:tasks.workunit.client.0.vm00.stdout:4/52: truncate f3 507284 0 2026-03-10T12:37:35.343 INFO:tasks.workunit.client.0.vm00.stdout:7/44: dwrite f7 [0,4194304] 0 2026-03-10T12:37:35.343 INFO:tasks.workunit.client.0.vm00.stdout:7/45: rename f7 to da/f10 0 2026-03-10T12:37:35.343 INFO:tasks.workunit.client.0.vm00.stdout:7/46: fdatasync f9 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.0.vm00.stdout:4/53: dwrite df/f12 [0,4194304] 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.0.vm00.stdout:7/47: dread - da/fe zero size 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.0.vm00.stdout:4/54: dwrite f8 [0,4194304] 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.0.vm00.stdout:4/55: chown c6 441 1 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.0.vm00.stdout:7/48: symlink da/l11 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.0.vm00.stdout:7/49: mknod da/c12 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.1.vm07.stdout:4/225: link d0/d4/d5/da/l32 d0/d4/l4a 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.1.vm07.stdout:6/181: unlink d1/f17 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.1.vm07.stdout:6/182: dread - d1/d4/d6/d16/d1a/d33/f3c zero size 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.1.vm07.stdout:9/173: write d5/f2a [596752,116354] 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.1.vm07.stdout:4/226: truncate d0/d4/d10/f39 311106 0 2026-03-10T12:37:35.344 
INFO:tasks.workunit.client.1.vm07.stdout:6/183: unlink d1/d4/d6/f39 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.1.vm07.stdout:9/174: creat d5/d13/d22/f36 x:0 0 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.1.vm07.stdout:9/175: dwrite d5/d13/f2b [0,4194304] 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.1.vm07.stdout:4/227: link d0/d4/d5/d34/f37 d0/d4/d10/f4b 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.1.vm07.stdout:6/184: link d1/c18 d1/d4/d6/c3e 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.1.vm07.stdout:4/228: symlink d0/d4/d10/l4c 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.1.vm07.stdout:4/229: creat d0/d4/d5/da/f4d x:0 0 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.1.vm07.stdout:6/185: rename d1/d4/f2b to d1/d4/f3f 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.1.vm07.stdout:6/186: write d1/d4/f11 [1396428,128058] 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.1.vm07.stdout:6/187: unlink d1/d9/c1d 0 2026-03-10T12:37:35.344 INFO:tasks.workunit.client.1.vm07.stdout:6/188: symlink d1/d4/d6/d16/d1a/d2c/l40 0 2026-03-10T12:37:35.350 INFO:tasks.workunit.client.0.vm00.stdout:6/82: fdatasync d2/da/dc/ff 0 2026-03-10T12:37:35.354 INFO:tasks.workunit.client.0.vm00.stdout:2/27: sync 2026-03-10T12:37:35.363 INFO:tasks.workunit.client.0.vm00.stdout:2/28: write d4/d6/fb [472895,5186] 0 2026-03-10T12:37:35.363 INFO:tasks.workunit.client.0.vm00.stdout:2/29: dwrite f1 [0,4194304] 0 2026-03-10T12:37:35.363 INFO:tasks.workunit.client.0.vm00.stdout:6/83: creat d2/d14/f1b x:0 0 0 2026-03-10T12:37:35.363 INFO:tasks.workunit.client.0.vm00.stdout:2/30: rename d4/d6/ca to d4/d6/cc 0 2026-03-10T12:37:35.365 INFO:tasks.workunit.client.1.vm07.stdout:1/216: dread d9/df/f10 [0,4194304] 0 2026-03-10T12:37:35.369 INFO:tasks.workunit.client.1.vm07.stdout:5/240: sync 2026-03-10T12:37:35.374 INFO:tasks.workunit.client.1.vm07.stdout:5/241: dwrite d0/d22/d18/f4c [0,4194304] 0 2026-03-10T12:37:35.378 
INFO:tasks.workunit.client.1.vm07.stdout:1/217: rename f6 to d9/df/d29/d2b/d3d/f47 0 2026-03-10T12:37:35.391 INFO:tasks.workunit.client.1.vm07.stdout:1/218: unlink d9/f22 0 2026-03-10T12:37:35.392 INFO:tasks.workunit.client.0.vm00.stdout:6/84: creat d2/d16/f1c x:0 0 0 2026-03-10T12:37:35.392 INFO:tasks.workunit.client.0.vm00.stdout:6/85: dread d2/da/f1a [0,4194304] 0 2026-03-10T12:37:35.392 INFO:tasks.workunit.client.0.vm00.stdout:6/86: truncate d2/d16/f1c 681921 0 2026-03-10T12:37:35.392 INFO:tasks.workunit.client.0.vm00.stdout:2/31: mkdir d4/dd 0 2026-03-10T12:37:35.395 INFO:tasks.workunit.client.1.vm07.stdout:1/219: fdatasync d9/df/f15 0 2026-03-10T12:37:35.397 INFO:tasks.workunit.client.1.vm07.stdout:5/242: getdents d0/d22/d18/d19/d2e 0 2026-03-10T12:37:35.399 INFO:tasks.workunit.client.1.vm07.stdout:5/243: symlink d0/d22/d18/l56 0 2026-03-10T12:37:35.402 INFO:tasks.workunit.client.1.vm07.stdout:5/244: mknod d0/d22/d18/d19/d21/d3a/c57 0 2026-03-10T12:37:35.413 INFO:tasks.workunit.client.0.vm00.stdout:3/55: truncate dd/f15 683563 0 2026-03-10T12:37:35.414 INFO:tasks.workunit.client.0.vm00.stdout:3/56: rename dd to dd/d11/d14/d17 22 2026-03-10T12:37:35.415 INFO:tasks.workunit.client.1.vm07.stdout:1/220: link l2 d9/df/d29/d2b/d3d/l48 0 2026-03-10T12:37:35.433 INFO:tasks.workunit.client.1.vm07.stdout:3/237: dread dc/dd/f21 [4194304,4194304] 0 2026-03-10T12:37:35.456 INFO:tasks.workunit.client.1.vm07.stdout:3/238: sync 2026-03-10T12:37:35.459 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.456+0000 7ff22e7fc700 1 -- 192.168.123.100:0/3068058105 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7ff248061960 con 0x7ff248071950 2026-03-10T12:37:35.459 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:35 vm00.local ceph-mon[50686]: from='client.14650 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:37:35.459 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:35 vm00.local ceph-mon[50686]: from='client.14652 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:37:35.459 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:35 vm00.local ceph-mon[50686]: from='client.? 192.168.123.100:0/1835453162' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:37:35.462 INFO:tasks.workunit.client.1.vm07.stdout:3/239: dwrite dc/dd/d1f/f30 [0,4194304] 0 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:e12 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:epoch 12 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:36:51.752695+0000 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:37:35.465 
INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307} 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:failed 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 export targets 1 join_fscid=1 addr 
[v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons: 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:37:35.465 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.462+0000 7ff244ff9700 1 -- 192.168.123.100:0/3068058105 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1853 (secure 0 0 0) 0x7ff24005a190 con 0x7ff248071950 2026-03-10T12:37:35.466 INFO:tasks.workunit.client.1.vm07.stdout:3/240: write dc/d18/d24/f3e [4207225,109080] 0 2026-03-10T12:37:35.466 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.466+0000 7ff24cc54700 1 -- 192.168.123.100:0/3068058105 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff230070af0 msgr2=0x7ff230072fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:35.466 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.466+0000 7ff24cc54700 1 --2- 192.168.123.100:0/3068058105 >> 
[v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff230070af0 0x7ff230072fa0 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7ff23800b5c0 tx=0x7ff238011040 comp rx=0 tx=0).stop 2026-03-10T12:37:35.466 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.466+0000 7ff24cc54700 1 -- 192.168.123.100:0/3068058105 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff248071950 msgr2=0x7ff2480824f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:35.466 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.466+0000 7ff24cc54700 1 --2- 192.168.123.100:0/3068058105 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff248071950 0x7ff2480824f0 secure :-1 s=READY pgs=333 cs=0 l=1 rev1=1 crypto rx=0x7ff24000ea00 tx=0x7ff24000edc0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.466 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.466+0000 7ff24cc54700 1 -- 192.168.123.100:0/3068058105 shutdown_connections 2026-03-10T12:37:35.466 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.466+0000 7ff24cc54700 1 --2- 192.168.123.100:0/3068058105 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7ff230070af0 0x7ff230072fa0 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.466 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.466+0000 7ff24cc54700 1 --2- 192.168.123.100:0/3068058105 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff248071950 0x7ff2480824f0 unknown :-1 s=CLOSED pgs=333 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.466 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.466+0000 7ff24cc54700 1 --2- 192.168.123.100:0/3068058105 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff248082a30 0x7ff248082ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.466 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.466+0000 7ff24cc54700 1 -- 192.168.123.100:0/3068058105 >> 192.168.123.100:0/3068058105 conn(0x7ff24806d1a0 msgr2=0x7ff248076490 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:35.466 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.466+0000 7ff24cc54700 1 -- 192.168.123.100:0/3068058105 shutdown_connections 2026-03-10T12:37:35.466 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.466+0000 7ff24cc54700 1 -- 192.168.123.100:0/3068058105 wait complete. 2026-03-10T12:37:35.471 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 12 2026-03-10T12:37:35.472 INFO:tasks.workunit.client.1.vm07.stdout:0/266: dread d0/f2e [0,4194304] 0 2026-03-10T12:37:35.478 INFO:tasks.workunit.client.1.vm07.stdout:0/267: rename d0/f2e to d0/d14/d1a/d1b/d41/d4e/f56 0 2026-03-10T12:37:35.482 INFO:tasks.workunit.client.1.vm07.stdout:0/268: mknod d0/d14/d1a/d2f/d31/d4f/c57 0 2026-03-10T12:37:35.485 INFO:tasks.workunit.client.1.vm07.stdout:7/197: write d0/f26 [612633,37224] 0 2026-03-10T12:37:35.487 INFO:tasks.workunit.client.1.vm07.stdout:2/166: rmdir d0/d19/d26 39 2026-03-10T12:37:35.487 INFO:tasks.workunit.client.1.vm07.stdout:7/198: stat d0/c17 0 2026-03-10T12:37:35.488 INFO:tasks.workunit.client.1.vm07.stdout:2/167: fsync d0/d29/f32 0 2026-03-10T12:37:35.488 INFO:tasks.workunit.client.1.vm07.stdout:2/168: chown d0/d29 0 1 2026-03-10T12:37:35.489 INFO:tasks.workunit.client.1.vm07.stdout:7/199: write d0/f1e [569315,65186] 0 2026-03-10T12:37:35.490 INFO:tasks.workunit.client.1.vm07.stdout:7/200: chown d0/c17 5767412 1 2026-03-10T12:37:35.494 INFO:tasks.workunit.client.1.vm07.stdout:0/269: dread d0/d14/d1a/f2c [0,4194304] 0 2026-03-10T12:37:35.495 INFO:tasks.workunit.client.1.vm07.stdout:7/201: dwrite d0/f26 [0,4194304] 0 2026-03-10T12:37:35.496 INFO:tasks.workunit.client.1.vm07.stdout:0/270: chown d0/d14/d1a/f24 14646863 1 2026-03-10T12:37:35.496 
INFO:tasks.workunit.client.1.vm07.stdout:2/169: creat d0/d19/f3c x:0 0 0 2026-03-10T12:37:35.499 INFO:tasks.workunit.client.1.vm07.stdout:7/202: fdatasync d0/fc 0 2026-03-10T12:37:35.500 INFO:tasks.workunit.client.1.vm07.stdout:7/203: write d0/f2b [1436984,91835] 0 2026-03-10T12:37:35.501 INFO:tasks.workunit.client.1.vm07.stdout:0/271: dread d0/f21 [0,4194304] 0 2026-03-10T12:37:35.502 INFO:tasks.workunit.client.1.vm07.stdout:0/272: rename d0/d14/d1a to d0/d14/d1a/d1b/d41/d58 22 2026-03-10T12:37:35.509 INFO:tasks.workunit.client.1.vm07.stdout:7/204: link d0/f23 d0/f31 0 2026-03-10T12:37:35.516 INFO:tasks.workunit.client.1.vm07.stdout:2/170: link d0/f13 d0/d19/d26/d38/f3d 0 2026-03-10T12:37:35.516 INFO:tasks.workunit.client.1.vm07.stdout:2/171: chown d0/f2d 1378 1 2026-03-10T12:37:35.516 INFO:tasks.workunit.client.1.vm07.stdout:2/172: dread d0/d19/d1f/d20/f2b [0,4194304] 0 2026-03-10T12:37:35.518 INFO:tasks.workunit.client.1.vm07.stdout:7/205: creat d0/f32 x:0 0 0 2026-03-10T12:37:35.520 INFO:tasks.workunit.client.1.vm07.stdout:2/173: creat d0/d19/d26/f3e x:0 0 0 2026-03-10T12:37:35.528 INFO:tasks.workunit.client.1.vm07.stdout:2/174: creat d0/d19/d1f/d20/f3f x:0 0 0 2026-03-10T12:37:35.531 INFO:tasks.workunit.client.1.vm07.stdout:7/206: symlink d0/l33 0 2026-03-10T12:37:35.537 INFO:tasks.workunit.client.1.vm07.stdout:7/207: dread - d0/f30 zero size 2026-03-10T12:37:35.537 INFO:tasks.workunit.client.1.vm07.stdout:7/208: write d0/f32 [127685,81524] 0 2026-03-10T12:37:35.537 INFO:tasks.workunit.client.0.vm00.stdout:5/31: truncate f4 790370 0 2026-03-10T12:37:35.543 INFO:tasks.workunit.client.0.vm00.stdout:8/33: getdents d0 0 2026-03-10T12:37:35.543 INFO:tasks.workunit.client.0.vm00.stdout:8/34: write - no filename 2026-03-10T12:37:35.546 INFO:tasks.workunit.client.1.vm07.stdout:7/209: rename d0/c18 to d0/c34 0 2026-03-10T12:37:35.546 INFO:tasks.workunit.client.0.vm00.stdout:0/34: dwrite f0 [0,4194304] 0 2026-03-10T12:37:35.558 
INFO:tasks.workunit.client.1.vm07.stdout:7/210: mknod d0/c35 0 2026-03-10T12:37:35.559 INFO:tasks.workunit.client.1.vm07.stdout:7/211: mknod d0/c36 0 2026-03-10T12:37:35.559 INFO:tasks.workunit.client.0.vm00.stdout:0/35: readlink d3/la 0 2026-03-10T12:37:35.559 INFO:tasks.workunit.client.0.vm00.stdout:0/36: truncate d3/f4 284741 0 2026-03-10T12:37:35.559 INFO:tasks.workunit.client.0.vm00.stdout:0/37: chown d3/d7 6739131 1 2026-03-10T12:37:35.559 INFO:tasks.workunit.client.0.vm00.stdout:4/56: truncate f9 1405021 0 2026-03-10T12:37:35.559 INFO:tasks.workunit.client.0.vm00.stdout:4/57: stat le 0 2026-03-10T12:37:35.559 INFO:tasks.workunit.client.0.vm00.stdout:1/47: truncate f5 1566825 0 2026-03-10T12:37:35.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:35 vm07.local ceph-mon[58582]: from='client.14650 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:37:35.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:35 vm07.local ceph-mon[58582]: from='client.14652 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:37:35.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:35 vm07.local ceph-mon[58582]: from='client.? 
192.168.123.100:0/1835453162' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:37:35.575 INFO:tasks.workunit.client.0.vm00.stdout:7/50: truncate f6 146421 0 2026-03-10T12:37:35.578 INFO:tasks.workunit.client.1.vm07.stdout:2/175: dread d0/f13 [0,4194304] 0 2026-03-10T12:37:35.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.578+0000 7fcf6b480700 1 -- 192.168.123.100:0/3001580442 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf64072330 msgr2=0x7fcf640770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:35.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.578+0000 7fcf6b480700 1 --2- 192.168.123.100:0/3001580442 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf64072330 0x7fcf640770b0 secure :-1 s=READY pgs=334 cs=0 l=1 rev1=1 crypto rx=0x7fcf5c009230 tx=0x7fcf5c009260 comp rx=0 tx=0).stop 2026-03-10T12:37:35.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.579+0000 7fcf6b480700 1 -- 192.168.123.100:0/3001580442 shutdown_connections 2026-03-10T12:37:35.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.579+0000 7fcf6b480700 1 --2- 192.168.123.100:0/3001580442 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf64072330 0x7fcf640770b0 unknown :-1 s=CLOSED pgs=334 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.579+0000 7fcf6b480700 1 --2- 192.168.123.100:0/3001580442 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcf64071950 0x7fcf64071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.579+0000 7fcf6b480700 1 -- 192.168.123.100:0/3001580442 >> 192.168.123.100:0/3001580442 conn(0x7fcf6406d1a0 msgr2=0x7fcf6406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:35.580 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.579+0000 7fcf6b480700 1 -- 192.168.123.100:0/3001580442 shutdown_connections 2026-03-10T12:37:35.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.579+0000 7fcf6b480700 1 -- 192.168.123.100:0/3001580442 wait complete. 2026-03-10T12:37:35.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.579+0000 7fcf6b480700 1 Processor -- start 2026-03-10T12:37:35.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.579+0000 7fcf6b480700 1 -- start start 2026-03-10T12:37:35.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.579+0000 7fcf6b480700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf64071950 0x7fcf64082480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:35.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.579+0000 7fcf6b480700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcf640829c0 0x7fcf64082e30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:35.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.579+0000 7fcf6b480700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcf64083e30 con 0x7fcf64071950 2026-03-10T12:37:35.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.579+0000 7fcf6b480700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcf641b2a90 con 0x7fcf640829c0 2026-03-10T12:37:35.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.579+0000 7fcf69c7d700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcf640829c0 0x7fcf64082e30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:35.580 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.579+0000 7fcf69c7d700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcf640829c0 0x7fcf64082e30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:50380/0 (socket says 192.168.123.100:50380) 2026-03-10T12:37:35.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.579+0000 7fcf69c7d700 1 -- 192.168.123.100:0/3613104716 learned_addr learned my addr 192.168.123.100:0/3613104716 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:37:35.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.580+0000 7fcf69c7d700 1 -- 192.168.123.100:0/3613104716 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf64071950 msgr2=0x7fcf64082480 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:35.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.580+0000 7fcf69c7d700 1 --2- 192.168.123.100:0/3613104716 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf64071950 0x7fcf64082480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.580+0000 7fcf69c7d700 1 -- 192.168.123.100:0/3613104716 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcf5c008ee0 con 0x7fcf640829c0 2026-03-10T12:37:35.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.580+0000 7fcf69c7d700 1 --2- 192.168.123.100:0/3613104716 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcf640829c0 0x7fcf64082e30 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7fcf5c0076a0 tx=0x7fcf5c005a10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:35.581 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.580+0000 7fcf5b7fe700 1 -- 192.168.123.100:0/3613104716 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcf5c00b8b0 con 0x7fcf640829c0 2026-03-10T12:37:35.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.580+0000 7fcf6b480700 1 -- 192.168.123.100:0/3613104716 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcf641b2c30 con 0x7fcf640829c0 2026-03-10T12:37:35.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.580+0000 7fcf6b480700 1 -- 192.168.123.100:0/3613104716 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcf641b3120 con 0x7fcf640829c0 2026-03-10T12:37:35.582 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.581+0000 7fcf5b7fe700 1 -- 192.168.123.100:0/3613104716 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcf5c01e420 con 0x7fcf640829c0 2026-03-10T12:37:35.582 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.581+0000 7fcf5b7fe700 1 -- 192.168.123.100:0/3613104716 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcf5c016620 con 0x7fcf640829c0 2026-03-10T12:37:35.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.583+0000 7fcf6b480700 1 -- 192.168.123.100:0/3613104716 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcf6407c880 con 0x7fcf640829c0 2026-03-10T12:37:35.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.584+0000 7fcf5b7fe700 1 -- 192.168.123.100:0/3613104716 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fcf5c004030 con 0x7fcf640829c0 2026-03-10T12:37:35.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.585+0000 7fcf5b7fe700 1 --2- 
192.168.123.100:0/3613104716 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcf5006c7a0 0x7fcf5006ec50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:35.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.585+0000 7fcf5b7fe700 1 -- 192.168.123.100:0/3613104716 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fcf5c08fcf0 con 0x7fcf640829c0 2026-03-10T12:37:35.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.585+0000 7fcf6a47e700 1 --2- 192.168.123.100:0/3613104716 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcf5006c7a0 0x7fcf5006ec50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:35.586 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.586+0000 7fcf6a47e700 1 --2- 192.168.123.100:0/3613104716 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcf5006c7a0 0x7fcf5006ec50 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7fcf6000a990 tx=0x7fcf60005c80 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:35.588 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.588+0000 7fcf5b7fe700 1 -- 192.168.123.100:0/3613104716 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fcf5c05df80 con 0x7fcf640829c0 2026-03-10T12:37:35.605 INFO:tasks.workunit.client.1.vm07.stdout:8/233: getdents d1/d3/d40 0 2026-03-10T12:37:35.616 INFO:tasks.workunit.client.1.vm07.stdout:6/189: getdents d1 0 2026-03-10T12:37:35.616 INFO:tasks.workunit.client.1.vm07.stdout:6/190: dread - d1/d4/d6/d16/d1a/d33/f3c zero size 2026-03-10T12:37:35.625 INFO:tasks.workunit.client.1.vm07.stdout:4/230: truncate d0/d4/d10/f4b 2025180 0 
2026-03-10T12:37:35.625 INFO:tasks.workunit.client.1.vm07.stdout:9/176: truncate d5/d16/d18/f20 23167 0 2026-03-10T12:37:35.629 INFO:tasks.workunit.client.1.vm07.stdout:9/177: creat d5/d13/d22/f37 x:0 0 0 2026-03-10T12:37:35.636 INFO:tasks.workunit.client.1.vm07.stdout:4/231: creat d0/d4/d5/da/f4e x:0 0 0 2026-03-10T12:37:35.636 INFO:tasks.workunit.client.1.vm07.stdout:9/178: write d5/d13/f2b [3447927,96883] 0 2026-03-10T12:37:35.636 INFO:tasks.workunit.client.1.vm07.stdout:4/232: dread - d0/d4/d10/d23/f2e zero size 2026-03-10T12:37:35.636 INFO:tasks.workunit.client.1.vm07.stdout:9/179: write d5/d16/f34 [4632975,110875] 0 2026-03-10T12:37:35.636 INFO:tasks.workunit.client.1.vm07.stdout:9/180: chown d5/fb 384098 1 2026-03-10T12:37:35.636 INFO:tasks.workunit.client.1.vm07.stdout:4/233: write d0/d4/d5/da/f48 [3225692,127066] 0 2026-03-10T12:37:35.637 INFO:tasks.workunit.client.1.vm07.stdout:9/181: creat d5/d13/d2c/f38 x:0 0 0 2026-03-10T12:37:35.639 INFO:tasks.workunit.client.1.vm07.stdout:4/234: rename d0/d4/d10/d23/f40 to d0/d4/d10/d23/f4f 0 2026-03-10T12:37:35.642 INFO:tasks.workunit.client.1.vm07.stdout:9/182: creat d5/d13/d22/f39 x:0 0 0 2026-03-10T12:37:35.654 INFO:tasks.workunit.client.1.vm07.stdout:1/221: getdents d9/df/d29/d2b/d3d 0 2026-03-10T12:37:35.656 INFO:tasks.workunit.client.1.vm07.stdout:5/245: write d0/d22/d18/d19/d21/f2d [3248253,99533] 0 2026-03-10T12:37:35.660 INFO:tasks.workunit.client.1.vm07.stdout:5/246: dwrite d0/f1e [0,4194304] 0 2026-03-10T12:37:35.667 INFO:tasks.workunit.client.1.vm07.stdout:3/241: dwrite dc/dd/f29 [4194304,4194304] 0 2026-03-10T12:37:35.682 INFO:tasks.workunit.client.1.vm07.stdout:5/247: symlink d0/d22/d18/d3e/l58 0 2026-03-10T12:37:35.687 INFO:tasks.workunit.client.1.vm07.stdout:5/248: dwrite d0/d22/f50 [0,4194304] 0 2026-03-10T12:37:35.699 INFO:tasks.workunit.client.1.vm07.stdout:0/273: dwrite d0/d14/d1a/f27 [4194304,4194304] 0 2026-03-10T12:37:35.704 INFO:tasks.workunit.client.1.vm07.stdout:0/274: fdatasync 
d0/d14/d1a/f3d 0 2026-03-10T12:37:35.705 INFO:tasks.workunit.client.1.vm07.stdout:3/242: link dc/dd/d28/d3b/f4d dc/dd/d28/d3b/f5b 0 2026-03-10T12:37:35.708 INFO:tasks.workunit.client.1.vm07.stdout:0/275: symlink d0/d14/d1a/d2f/d31/l59 0 2026-03-10T12:37:35.715 INFO:tasks.workunit.client.1.vm07.stdout:3/243: dwrite dc/dd/f21 [8388608,4194304] 0 2026-03-10T12:37:35.722 INFO:tasks.workunit.client.1.vm07.stdout:8/234: dread d1/d3/d18/f32 [0,4194304] 0 2026-03-10T12:37:35.731 INFO:tasks.workunit.client.1.vm07.stdout:8/235: truncate d1/d3/d6/f24 1325960 0 2026-03-10T12:37:35.733 INFO:tasks.workunit.client.1.vm07.stdout:7/212: rmdir d0 39 2026-03-10T12:37:35.737 INFO:tasks.workunit.client.1.vm07.stdout:2/176: truncate d0/d19/d1f/d20/f2b 4094906 0 2026-03-10T12:37:35.753 INFO:tasks.workunit.client.1.vm07.stdout:7/213: dread d0/f14 [0,4194304] 0 2026-03-10T12:37:35.782 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.754+0000 7fcf6b480700 1 -- 192.168.123.100:0/3613104716 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fcf64061190 con 0x7fcf5006c7a0 2026-03-10T12:37:35.782 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.756+0000 7fcf5b7fe700 1 -- 192.168.123.100:0/3613104716 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+442 (secure 0 0 0) 0x7fcf64061190 con 0x7fcf5006c7a0 2026-03-10T12:37:35.782 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:37:35.782 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T12:37:35.782 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true, 2026-03-10T12:37:35.782 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T12:37:35.782 INFO:teuthology.orchestra.run.vm00.stdout: 
"services_complete": [], 2026-03-10T12:37:35.782 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "0/23 daemons upgraded", 2026-03-10T12:37:35.782 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm07", 2026-03-10T12:37:35.782 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false 2026-03-10T12:37:35.782 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:37:35.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.759+0000 7fcf6b480700 1 -- 192.168.123.100:0/3613104716 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcf5006c7a0 msgr2=0x7fcf5006ec50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:35.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.759+0000 7fcf6b480700 1 --2- 192.168.123.100:0/3613104716 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcf5006c7a0 0x7fcf5006ec50 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7fcf6000a990 tx=0x7fcf60005c80 comp rx=0 tx=0).stop 2026-03-10T12:37:35.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.759+0000 7fcf6b480700 1 -- 192.168.123.100:0/3613104716 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcf640829c0 msgr2=0x7fcf64082e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:35.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.759+0000 7fcf6b480700 1 --2- 192.168.123.100:0/3613104716 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcf640829c0 0x7fcf64082e30 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7fcf5c0076a0 tx=0x7fcf5c005a10 comp rx=0 tx=0).stop 2026-03-10T12:37:35.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.759+0000 7fcf6b480700 1 -- 192.168.123.100:0/3613104716 shutdown_connections 2026-03-10T12:37:35.783 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.759+0000 7fcf6b480700 1 --2- 192.168.123.100:0/3613104716 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fcf5006c7a0 0x7fcf5006ec50 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.759+0000 7fcf6b480700 1 --2- 192.168.123.100:0/3613104716 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcf64071950 0x7fcf64082480 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.759+0000 7fcf6b480700 1 --2- 192.168.123.100:0/3613104716 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcf640829c0 0x7fcf64082e30 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.760+0000 7fcf6b480700 1 -- 192.168.123.100:0/3613104716 >> 192.168.123.100:0/3613104716 conn(0x7fcf6406d1a0 msgr2=0x7fcf64076490 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:35.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.760+0000 7fcf6b480700 1 -- 192.168.123.100:0/3613104716 shutdown_connections 2026-03-10T12:37:35.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.760+0000 7fcf6b480700 1 -- 192.168.123.100:0/3613104716 wait complete. 
2026-03-10T12:37:35.783 INFO:tasks.workunit.client.1.vm07.stdout:6/191: write d1/f3d [1004719,96438] 0 2026-03-10T12:37:35.783 INFO:tasks.workunit.client.1.vm07.stdout:3/244: getdents dc/dd/d28 0 2026-03-10T12:37:35.783 INFO:tasks.workunit.client.1.vm07.stdout:3/245: mkdir dc/dd/d43/d5c 0 2026-03-10T12:37:35.783 INFO:tasks.workunit.client.1.vm07.stdout:6/192: creat d1/d4/d6/f41 x:0 0 0 2026-03-10T12:37:35.783 INFO:tasks.workunit.client.1.vm07.stdout:6/193: mknod d1/d4/d6/d16/d1a/c42 0 2026-03-10T12:37:35.783 INFO:tasks.workunit.client.1.vm07.stdout:3/246: symlink dc/dd/d43/d5c/l5d 0 2026-03-10T12:37:35.783 INFO:tasks.workunit.client.1.vm07.stdout:3/247: read f2 [1166100,57931] 0 2026-03-10T12:37:35.783 INFO:tasks.workunit.client.1.vm07.stdout:3/248: write dc/dd/f22 [962027,31787] 0 2026-03-10T12:37:35.783 INFO:tasks.workunit.client.1.vm07.stdout:6/194: mkdir d1/d4/d6/d43 0 2026-03-10T12:37:35.783 INFO:tasks.workunit.client.1.vm07.stdout:3/249: chown dc/dd/c1a 39027 1 2026-03-10T12:37:35.783 INFO:tasks.workunit.client.1.vm07.stdout:3/250: read dc/dd/d1f/d45/f54 [468288,21977] 0 2026-03-10T12:37:35.783 INFO:tasks.workunit.client.1.vm07.stdout:3/251: chown dc/dd/d43/d5c/l5d 4702 1 2026-03-10T12:37:35.783 INFO:tasks.workunit.client.1.vm07.stdout:3/252: stat dc/d18/d24/c33 0 2026-03-10T12:37:35.786 INFO:tasks.workunit.client.1.vm07.stdout:3/253: chown dc/c35 1 1 2026-03-10T12:37:35.789 INFO:tasks.workunit.client.1.vm07.stdout:6/195: dwrite d1/d9/fb [0,4194304] 0 2026-03-10T12:37:35.791 INFO:tasks.workunit.client.1.vm07.stdout:8/236: sync 2026-03-10T12:37:35.793 INFO:tasks.workunit.client.1.vm07.stdout:8/237: chown d1/d3/d40/f41 3213 1 2026-03-10T12:37:35.793 INFO:tasks.workunit.client.1.vm07.stdout:8/238: stat d1/d3/d6/f24 0 2026-03-10T12:37:35.800 INFO:tasks.workunit.client.1.vm07.stdout:8/239: dwrite d1/d3/d18/f38 [0,4194304] 0 2026-03-10T12:37:35.805 INFO:tasks.workunit.client.1.vm07.stdout:4/235: truncate d0/d4/d10/d23/f4f 5118 0 2026-03-10T12:37:35.807 
INFO:tasks.workunit.client.1.vm07.stdout:6/196: unlink d1/d4/d6/f36 0 2026-03-10T12:37:35.808 INFO:tasks.workunit.client.1.vm07.stdout:3/254: dread - dc/dd/d28/d3b/f5b zero size 2026-03-10T12:37:35.812 INFO:tasks.workunit.client.1.vm07.stdout:3/255: creat dc/dd/d1f/d45/f5e x:0 0 0 2026-03-10T12:37:35.813 INFO:tasks.workunit.client.1.vm07.stdout:8/240: symlink d1/l4d 0 2026-03-10T12:37:35.813 INFO:tasks.workunit.client.1.vm07.stdout:8/241: chown d1/d3/d18/l33 1142764126 1 2026-03-10T12:37:35.814 INFO:tasks.workunit.client.1.vm07.stdout:3/256: creat dc/dd/d43/d5c/f5f x:0 0 0 2026-03-10T12:37:35.822 INFO:tasks.workunit.client.1.vm07.stdout:6/197: getdents d1/d4 0 2026-03-10T12:37:35.823 INFO:tasks.workunit.client.1.vm07.stdout:5/249: read d0/d22/d18/d19/d21/f2d [3575471,87120] 0 2026-03-10T12:37:35.825 INFO:tasks.workunit.client.1.vm07.stdout:8/242: getdents d1/d3/d40 0 2026-03-10T12:37:35.825 INFO:tasks.workunit.client.1.vm07.stdout:8/243: write d1/d3/d40/f41 [659116,72777] 0 2026-03-10T12:37:35.826 INFO:tasks.workunit.client.1.vm07.stdout:8/244: chown d1/f19 403681 1 2026-03-10T12:37:35.826 INFO:tasks.workunit.client.1.vm07.stdout:8/245: readlink d1/d3/d6/l2b 0 2026-03-10T12:37:35.829 INFO:tasks.workunit.client.1.vm07.stdout:8/246: rename d1/d3/d6/c30 to d1/d3/d18/c4e 0 2026-03-10T12:37:35.829 INFO:tasks.workunit.client.1.vm07.stdout:6/198: chown d1/l21 1840149 1 2026-03-10T12:37:35.829 INFO:tasks.workunit.client.1.vm07.stdout:6/199: chown d1/d9 1 1 2026-03-10T12:37:35.830 INFO:tasks.workunit.client.1.vm07.stdout:6/200: readlink d1/d4/d6/l27 0 2026-03-10T12:37:35.835 INFO:tasks.workunit.client.1.vm07.stdout:7/214: dread d0/f3 [0,4194304] 0 2026-03-10T12:37:35.841 INFO:tasks.workunit.client.1.vm07.stdout:9/183: dread d5/d16/d18/f20 [0,4194304] 0 2026-03-10T12:37:35.842 INFO:tasks.workunit.client.1.vm07.stdout:5/250: link d0/f9 d0/d22/d18/d19/d2e/f59 0 2026-03-10T12:37:35.842 INFO:tasks.workunit.client.1.vm07.stdout:9/184: write d5/d13/d22/f36 [796172,109236] 0 
2026-03-10T12:37:35.842 INFO:tasks.workunit.client.1.vm07.stdout:5/251: dread - d0/d22/d18/d19/d21/f37 zero size 2026-03-10T12:37:35.844 INFO:tasks.workunit.client.1.vm07.stdout:1/222: write d9/f19 [3885786,4092] 0 2026-03-10T12:37:35.849 INFO:tasks.workunit.client.1.vm07.stdout:6/201: mkdir d1/d4/d44 0 2026-03-10T12:37:35.850 INFO:tasks.workunit.client.1.vm07.stdout:3/257: getdents dc 0 2026-03-10T12:37:35.855 INFO:tasks.workunit.client.1.vm07.stdout:5/252: unlink d0/d22/d18/d19/d2e/c39 0 2026-03-10T12:37:35.855 INFO:tasks.workunit.client.1.vm07.stdout:5/253: readlink d0/d22/d18/d19/d21/d3a/l43 0 2026-03-10T12:37:35.857 INFO:tasks.workunit.client.1.vm07.stdout:1/223: creat d9/df/d29/f49 x:0 0 0 2026-03-10T12:37:35.858 INFO:tasks.workunit.client.1.vm07.stdout:1/224: chown d9/df/f11 93040776 1 2026-03-10T12:37:35.860 INFO:tasks.workunit.client.1.vm07.stdout:8/247: link d1/d3/d18/f2e d1/d3/d6/f4f 0 2026-03-10T12:37:35.862 INFO:tasks.workunit.client.1.vm07.stdout:3/258: truncate f1 980136 0 2026-03-10T12:37:35.879 INFO:tasks.workunit.client.1.vm07.stdout:9/185: mknod d5/c3a 0 2026-03-10T12:37:35.879 INFO:tasks.workunit.client.1.vm07.stdout:8/248: mkdir d1/d3/d6/d50 0 2026-03-10T12:37:35.879 INFO:tasks.workunit.client.1.vm07.stdout:7/215: creat d0/f37 x:0 0 0 2026-03-10T12:37:35.879 INFO:tasks.workunit.client.1.vm07.stdout:5/254: link d0/d22/l45 d0/d22/d18/d19/d21/d54/l5a 0 2026-03-10T12:37:35.879 INFO:tasks.workunit.client.1.vm07.stdout:8/249: symlink d1/d3/l51 0 2026-03-10T12:37:35.879 INFO:tasks.workunit.client.1.vm07.stdout:7/216: mknod d0/c38 0 2026-03-10T12:37:35.879 INFO:tasks.workunit.client.1.vm07.stdout:5/255: dwrite d0/d22/d18/d19/d21/f42 [4194304,4194304] 0 2026-03-10T12:37:35.879 INFO:tasks.workunit.client.1.vm07.stdout:7/217: creat d0/f39 x:0 0 0 2026-03-10T12:37:35.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.864+0000 7f31830b9700 1 -- 192.168.123.100:0/596350187 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7f317c072440 msgr2=0x7f317c10be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:35.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.864+0000 7f31830b9700 1 --2- 192.168.123.100:0/596350187 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f317c072440 0x7f317c10be90 secure :-1 s=READY pgs=335 cs=0 l=1 rev1=1 crypto rx=0x7f3170009b00 tx=0x7f3170009e10 comp rx=0 tx=0).stop 2026-03-10T12:37:35.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.864+0000 7f31830b9700 1 -- 192.168.123.100:0/596350187 shutdown_connections 2026-03-10T12:37:35.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.864+0000 7f31830b9700 1 --2- 192.168.123.100:0/596350187 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f317c072440 0x7f317c10be90 unknown :-1 s=CLOSED pgs=335 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.864+0000 7f31830b9700 1 --2- 192.168.123.100:0/596350187 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f317c071a60 0x7f317c071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.864+0000 7f31830b9700 1 -- 192.168.123.100:0/596350187 >> 192.168.123.100:0/596350187 conn(0x7f317c06d1a0 msgr2=0x7f317c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.864+0000 7f31830b9700 1 -- 192.168.123.100:0/596350187 shutdown_connections 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.864+0000 7f31830b9700 1 -- 192.168.123.100:0/596350187 wait complete. 
2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.864+0000 7f31830b9700 1 Processor -- start 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.865+0000 7f31830b9700 1 -- start start 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.865+0000 7f31830b9700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f317c071a60 0x7f317c116c90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.865+0000 7f31830b9700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f317c072440 0x7f317c1171d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.865+0000 7f31830b9700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f317c1177a0 con 0x7f317c072440 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.865+0000 7f31830b9700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f317c117910 con 0x7f317c071a60 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.865+0000 7f31818b6700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f317c072440 0x7f317c1171d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.865+0000 7f31820b7700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f317c071a60 0x7f317c116c90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.865+0000 7f31818b6700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f317c072440 0x7f317c1171d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:59314/0 (socket says 192.168.123.100:59314) 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.865+0000 7f31818b6700 1 -- 192.168.123.100:0/4128295406 learned_addr learned my addr 192.168.123.100:0/4128295406 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.866+0000 7f31818b6700 1 -- 192.168.123.100:0/4128295406 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f317c071a60 msgr2=0x7f317c116c90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.866+0000 7f31818b6700 1 --2- 192.168.123.100:0/4128295406 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f317c071a60 0x7f317c116c90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.866+0000 7f31818b6700 1 -- 192.168.123.100:0/4128295406 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f31700097e0 con 0x7f317c072440 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.866+0000 7f31818b6700 1 --2- 192.168.123.100:0/4128295406 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f317c072440 0x7f317c1171d0 secure :-1 s=READY pgs=336 cs=0 l=1 rev1=1 crypto rx=0x7f31700052f0 tx=0x7f3170003680 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:35.880 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.866+0000 7f316f7fe700 1 -- 192.168.123.100:0/4128295406 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f317001d070 con 0x7f317c072440 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.866+0000 7f31830b9700 1 -- 192.168.123.100:0/4128295406 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f317c1b2b20 con 0x7f317c072440 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.867+0000 7f31830b9700 1 -- 192.168.123.100:0/4128295406 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f317c1b3010 con 0x7f317c072440 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.867+0000 7f316d7fa700 1 -- 192.168.123.100:0/4128295406 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3160005320 con 0x7f317c072440 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.869+0000 7f316f7fe700 1 -- 192.168.123.100:0/4128295406 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3170003c90 con 0x7f317c072440 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.869+0000 7f316f7fe700 1 -- 192.168.123.100:0/4128295406 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3170017850 con 0x7f317c072440 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.869+0000 7f316f7fe700 1 -- 192.168.123.100:0/4128295406 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7f3170017a70 con 0x7f317c072440 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.869+0000 7f316f7fe700 1 --2- 
192.168.123.100:0/4128295406 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f316806c870 0x7f316806ed20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.869+0000 7f316f7fe700 1 -- 192.168.123.100:0/4128295406 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f3170030080 con 0x7f317c072440 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.871+0000 7f316f7fe700 1 -- 192.168.123.100:0/4128295406 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7f3170058900 con 0x7f317c072440 2026-03-10T12:37:35.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.873+0000 7f31820b7700 1 --2- 192.168.123.100:0/4128295406 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f316806c870 0x7f316806ed20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:37:35.880 INFO:tasks.workunit.client.1.vm07.stdout:5/256: creat d0/d22/f5b x:0 0 0 2026-03-10T12:37:35.882 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:35.881+0000 7f31820b7700 1 --2- 192.168.123.100:0/4128295406 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f316806c870 0x7f316806ed20 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f317c1ae690 tx=0x7f3178009400 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:37:35.883 INFO:tasks.workunit.client.1.vm07.stdout:7/218: chown d0/c15 102 1 2026-03-10T12:37:35.888 INFO:tasks.workunit.client.1.vm07.stdout:5/257: mkdir d0/d22/d18/d19/d2e/d3f/d5c 0 2026-03-10T12:37:35.889 INFO:tasks.workunit.client.1.vm07.stdout:5/258: stat d0/d22/d18/d3e 0 2026-03-10T12:37:35.895 
INFO:tasks.workunit.client.1.vm07.stdout:8/250: dread d1/fc [0,4194304] 0 2026-03-10T12:37:35.896 INFO:tasks.workunit.client.1.vm07.stdout:8/251: read - d1/d3/d11/f3c zero size 2026-03-10T12:37:35.897 INFO:tasks.workunit.client.1.vm07.stdout:7/219: creat d0/f3a x:0 0 0 2026-03-10T12:37:35.897 INFO:tasks.workunit.client.1.vm07.stdout:7/220: dread - d0/f30 zero size 2026-03-10T12:37:35.908 INFO:tasks.workunit.client.1.vm07.stdout:9/186: sync 2026-03-10T12:37:35.916 INFO:tasks.workunit.client.1.vm07.stdout:8/252: mknod d1/d3/c52 0 2026-03-10T12:37:35.924 INFO:tasks.workunit.client.1.vm07.stdout:8/253: dwrite d1/d3/d40/f41 [0,4194304] 0 2026-03-10T12:37:35.924 INFO:tasks.workunit.client.0.vm00.stdout:6/87: creat d2/d16/f1d x:0 0 0 2026-03-10T12:37:35.925 INFO:tasks.workunit.client.1.vm07.stdout:5/259: getdents d0/d22/d18/d19/d21/d3a 0 2026-03-10T12:37:35.926 INFO:tasks.workunit.client.1.vm07.stdout:0/276: truncate d0/d14/d1a/f30 2504427 0 2026-03-10T12:37:35.929 INFO:tasks.workunit.client.1.vm07.stdout:2/177: write d0/d19/d26/f2e [2419429,45014] 0 2026-03-10T12:37:35.930 INFO:tasks.workunit.client.1.vm07.stdout:0/277: dwrite d0/d14/f37 [0,4194304] 0 2026-03-10T12:37:35.936 INFO:tasks.workunit.client.1.vm07.stdout:7/221: creat d0/f3b x:0 0 0 2026-03-10T12:37:35.939 INFO:tasks.workunit.client.0.vm00.stdout:2/32: unlink f2 0 2026-03-10T12:37:35.939 INFO:tasks.workunit.client.0.vm00.stdout:2/33: chown d4 26115 1 2026-03-10T12:37:35.941 INFO:tasks.workunit.client.0.vm00.stdout:3/57: rename dd/d11 to dd/d18 0 2026-03-10T12:37:35.941 INFO:tasks.workunit.client.1.vm07.stdout:5/260: sync 2026-03-10T12:37:35.941 INFO:tasks.workunit.client.1.vm07.stdout:0/278: sync 2026-03-10T12:37:35.942 INFO:tasks.workunit.client.1.vm07.stdout:5/261: chown d0/d22/d18/d19/d21 408 1 2026-03-10T12:37:35.944 INFO:tasks.workunit.client.1.vm07.stdout:2/178: creat d0/f40 x:0 0 0 2026-03-10T12:37:35.944 INFO:tasks.workunit.client.0.vm00.stdout:3/58: dwrite f9 [0,4194304] 0 2026-03-10T12:37:35.945 
INFO:tasks.workunit.client.0.vm00.stdout:3/59: chown dd/d18/d13 644740926 1 2026-03-10T12:37:35.947 INFO:tasks.workunit.client.1.vm07.stdout:0/279: write d0/f21 [203174,90228] 0 2026-03-10T12:37:35.949 INFO:tasks.workunit.client.1.vm07.stdout:5/262: mkdir d0/d22/d18/d3e/d5d 0 2026-03-10T12:37:35.950 INFO:tasks.workunit.client.0.vm00.stdout:5/32: chown c6 216 1 2026-03-10T12:37:35.950 INFO:tasks.workunit.client.0.vm00.stdout:5/33: rmdir - no directory 2026-03-10T12:37:35.953 INFO:tasks.workunit.client.1.vm07.stdout:7/222: rename d0/f26 to d0/f3c 0 2026-03-10T12:37:35.956 INFO:tasks.workunit.client.1.vm07.stdout:0/280: creat d0/d14/d1a/d2f/d31/f5a x:0 0 0 2026-03-10T12:37:35.956 INFO:tasks.workunit.client.0.vm00.stdout:8/35: symlink d0/l5 0 2026-03-10T12:37:35.956 INFO:tasks.workunit.client.0.vm00.stdout:8/36: write - no filename 2026-03-10T12:37:35.956 INFO:tasks.workunit.client.1.vm07.stdout:2/179: dread d0/d19/f1b [0,4194304] 0 2026-03-10T12:37:35.965 INFO:tasks.workunit.client.0.vm00.stdout:4/58: rename df to df/d13 22 2026-03-10T12:37:36.002 INFO:tasks.workunit.client.0.vm00.stdout:7/51: creat da/f13 x:0 0 0 2026-03-10T12:37:36.002 INFO:tasks.workunit.client.0.vm00.stdout:6/88: creat d2/d16/f1e x:0 0 0 2026-03-10T12:37:36.002 INFO:tasks.workunit.client.0.vm00.stdout:6/89: write d2/d16/f19 [520810,30866] 0 2026-03-10T12:37:36.002 INFO:tasks.workunit.client.0.vm00.stdout:6/90: dwrite d2/d14/f1b [0,4194304] 0 2026-03-10T12:37:36.002 INFO:tasks.workunit.client.0.vm00.stdout:5/34: symlink lb 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:4/236: write d0/d4/d10/d18/f3e [1274260,8386] 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:0/281: creat d0/d14/d1a/d1b/d3b/f5b x:0 0 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:8/254: dread d1/f7 [0,4194304] 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:8/255: chown d1/d3/f29 960 1 2026-03-10T12:37:36.003 
INFO:tasks.workunit.client.1.vm07.stdout:7/223: dwrite d0/f28 [0,4194304] 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:4/237: rename d0/d4/d10/f39 to d0/d4/d10/d23/f50 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:0/282: creat d0/d14/d1a/d2f/d31/d4f/f5c x:0 0 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:8/256: rename d1/d3/d11/f15 to d1/d3/d40/f53 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:4/238: symlink d0/d4/l51 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:8/257: dread - d1/d3/d11/f3c zero size 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:0/283: dread d0/f15 [0,4194304] 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:4/239: mknod d0/d4/d5/da/c52 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:7/224: symlink d0/l3d 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:8/258: rename d1/d3/d18/d3a to d1/d3/d6/d54 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:0/284: creat d0/d14/d1a/d2f/f5d x:0 0 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:4/240: creat d0/f53 x:0 0 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:4/241: truncate d0/f33 621693 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:8/259: read d1/f4b [675144,127371] 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:0/285: mknod d0/d14/c5e 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:4/242: mkdir d0/d4/d10/d3c/d2b/d54 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:8/260: mknod d1/d3/c55 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:4/243: write d0/d4/d10/d3c/d2b/f3b [1462169,1147] 0 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:0/286: chown d0/f1d 83 1 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:0/287: write d0/f1d [1288696,49177] 0 
2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:0/288: chown d0/l34 1336837 1 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:0/289: chown d0/d14/f36 1931 1 2026-03-10T12:37:36.003 INFO:tasks.workunit.client.1.vm07.stdout:0/290: write d0/d14/d1a/f3d [4059669,67083] 0 2026-03-10T12:37:36.017 INFO:tasks.workunit.client.0.vm00.stdout:0/38: getdents d3/d7 0 2026-03-10T12:37:36.017 INFO:tasks.workunit.client.0.vm00.stdout:7/52: mknod da/c14 0 2026-03-10T12:37:36.017 INFO:tasks.workunit.client.0.vm00.stdout:8/37: chown d0/l2 62484 1 2026-03-10T12:37:36.017 INFO:tasks.workunit.client.0.vm00.stdout:8/38: dwrite - no filename 2026-03-10T12:37:36.017 INFO:tasks.workunit.client.0.vm00.stdout:2/34: mkdir d4/dd/de 0 2026-03-10T12:37:36.017 INFO:tasks.workunit.client.0.vm00.stdout:0/39: write d3/f4 [1233131,87303] 0 2026-03-10T12:37:36.018 INFO:tasks.workunit.client.0.vm00.stdout:7/53: chown da/c12 226153 1 2026-03-10T12:37:36.021 INFO:tasks.workunit.client.0.vm00.stdout:0/40: dwrite f2 [0,4194304] 0 2026-03-10T12:37:36.022 INFO:tasks.workunit.client.0.vm00.stdout:2/35: dread d4/d6/fb [0,4194304] 0 2026-03-10T12:37:36.024 INFO:tasks.workunit.client.0.vm00.stdout:0/41: dwrite f0 [0,4194304] 0 2026-03-10T12:37:36.038 INFO:tasks.workunit.client.0.vm00.stdout:8/39: symlink d0/l6 0 2026-03-10T12:37:36.041 INFO:tasks.workunit.client.0.vm00.stdout:6/91: link d2/cb d2/da/c1f 0 2026-03-10T12:37:36.044 INFO:tasks.workunit.client.0.vm00.stdout:7/54: link f4 da/f15 0 2026-03-10T12:37:36.046 INFO:tasks.workunit.client.0.vm00.stdout:0/42: getdents d3/d7 0 2026-03-10T12:37:36.047 INFO:tasks.workunit.client.1.vm07.stdout:5/263: truncate d0/f9 2113367 0 2026-03-10T12:37:36.048 INFO:tasks.workunit.client.1.vm07.stdout:5/264: write d0/f1f [352396,84693] 0 2026-03-10T12:37:36.049 INFO:tasks.workunit.client.1.vm07.stdout:5/265: write d0/f1f [747373,24587] 0 2026-03-10T12:37:36.054 INFO:tasks.workunit.client.0.vm00.stdout:6/92: creat d2/d16/f20 x:0 0 0 
2026-03-10T12:37:36.054 INFO:tasks.workunit.client.0.vm00.stdout:6/93: read - d2/d16/f17 zero size 2026-03-10T12:37:36.059 INFO:tasks.workunit.client.1.vm07.stdout:6/202: dwrite d1/d4/d6/d16/d1a/d33/f3c [0,4194304] 0 2026-03-10T12:37:36.060 INFO:tasks.workunit.client.0.vm00.stdout:7/55: dwrite da/f10 [4194304,4194304] 0 2026-03-10T12:37:36.065 INFO:tasks.workunit.client.0.vm00.stdout:7/56: dwrite da/f13 [0,4194304] 0 2026-03-10T12:37:36.065 INFO:tasks.workunit.client.1.vm07.stdout:3/259: truncate dc/d18/d24/f3a 2797369 0 2026-03-10T12:37:36.067 INFO:tasks.workunit.client.1.vm07.stdout:2/180: sync 2026-03-10T12:37:36.067 INFO:tasks.workunit.client.1.vm07.stdout:4/244: sync 2026-03-10T12:37:36.074 INFO:tasks.workunit.client.0.vm00.stdout:7/57: dwrite da/fd [0,4194304] 0 2026-03-10T12:37:36.082 INFO:tasks.workunit.client.1.vm07.stdout:6/203: dread d1/d4/f11 [0,4194304] 0 2026-03-10T12:37:36.086 INFO:tasks.workunit.client.1.vm07.stdout:5/266: mknod d0/c5e 0 2026-03-10T12:37:36.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:36.086+0000 7f316d7fa700 1 -- 192.168.123.100:0/4128295406 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f3160005cc0 con 0x7f317c072440 2026-03-10T12:37:36.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:36.086+0000 7f316f7fe700 1 -- 192.168.123.100:0/4128295406 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f3170020370 con 0x7f317c072440 2026-03-10T12:37:36.090 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_OK 2026-03-10T12:37:36.091 INFO:tasks.workunit.client.1.vm07.stdout:2/181: unlink d0/f14 0 2026-03-10T12:37:36.091 INFO:tasks.workunit.client.1.vm07.stdout:2/182: stat d0/f15 0 2026-03-10T12:37:36.091 INFO:tasks.workunit.client.1.vm07.stdout:2/183: readlink d0/l35 0 2026-03-10T12:37:36.094 INFO:tasks.workunit.client.1.vm07.stdout:4/245: 
mknod d0/d4/c55 0 2026-03-10T12:37:36.095 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:36.094+0000 7f31830b9700 1 -- 192.168.123.100:0/4128295406 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f316806c870 msgr2=0x7f316806ed20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:36.095 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:36.094+0000 7f31830b9700 1 --2- 192.168.123.100:0/4128295406 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f316806c870 0x7f316806ed20 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f317c1ae690 tx=0x7f3178009400 comp rx=0 tx=0).stop 2026-03-10T12:37:36.095 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:36.094+0000 7f31830b9700 1 -- 192.168.123.100:0/4128295406 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f317c072440 msgr2=0x7f317c1171d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:37:36.095 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:36.094+0000 7f31830b9700 1 --2- 192.168.123.100:0/4128295406 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f317c072440 0x7f317c1171d0 secure :-1 s=READY pgs=336 cs=0 l=1 rev1=1 crypto rx=0x7f31700052f0 tx=0x7f3170003680 comp rx=0 tx=0).stop 2026-03-10T12:37:36.095 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:36.095+0000 7f31830b9700 1 -- 192.168.123.100:0/4128295406 shutdown_connections 2026-03-10T12:37:36.095 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:36.095+0000 7f31830b9700 1 --2- 192.168.123.100:0/4128295406 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7f316806c870 0x7f316806ed20 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:36.095 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:36.095+0000 7f31830b9700 1 --2- 192.168.123.100:0/4128295406 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f317c071a60 0x7f317c116c90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:36.095 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:36.095+0000 7f31830b9700 1 --2- 192.168.123.100:0/4128295406 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f317c072440 0x7f317c1171d0 unknown :-1 s=CLOSED pgs=336 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:37:36.095 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:36.095+0000 7f31830b9700 1 -- 192.168.123.100:0/4128295406 >> 192.168.123.100:0/4128295406 conn(0x7f317c06d1a0 msgr2=0x7f317c10a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:37:36.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:36.097+0000 7f31830b9700 1 -- 192.168.123.100:0/4128295406 shutdown_connections 2026-03-10T12:37:36.098 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:37:36.097+0000 7f31830b9700 1 -- 192.168.123.100:0/4128295406 wait complete. 
2026-03-10T12:37:36.102 INFO:tasks.workunit.client.0.vm00.stdout:7/58: dread da/f15 [0,4194304] 0 2026-03-10T12:37:36.107 INFO:tasks.workunit.client.1.vm07.stdout:6/204: link d1/d4/f3f d1/d4/d44/f45 0 2026-03-10T12:37:36.108 INFO:tasks.workunit.client.1.vm07.stdout:1/225: link d9/df/d29/d2b/d31/f3c d9/df/f4a 0 2026-03-10T12:37:36.111 INFO:tasks.workunit.client.1.vm07.stdout:3/260: getdents dc/d18/d2d/d3d 0 2026-03-10T12:37:36.116 INFO:tasks.workunit.client.1.vm07.stdout:3/261: dwrite dc/dd/f21 [4194304,4194304] 0 2026-03-10T12:37:36.119 INFO:tasks.workunit.client.0.vm00.stdout:2/36: getdents d4/d6 0 2026-03-10T12:37:36.123 INFO:tasks.workunit.client.1.vm07.stdout:5/267: dread d0/d22/d18/f20 [0,4194304] 0 2026-03-10T12:37:36.124 INFO:tasks.workunit.client.1.vm07.stdout:2/184: sync 2026-03-10T12:37:36.126 INFO:tasks.workunit.client.0.vm00.stdout:8/40: creat d0/f7 x:0 0 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.1.vm07.stdout:6/205: dread d1/f3d [0,4194304] 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.1.vm07.stdout:1/226: getdents d9/df/d29/d2b/d3d 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.1.vm07.stdout:1/227: chown d9/df/d29/d2b/d30/f38 6633 1 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.1.vm07.stdout:5/268: rename d0/d22/d18/l1a to d0/d22/d18/d30/l5f 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.1.vm07.stdout:6/206: mkdir d1/d4/d6/d46 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.1.vm07.stdout:3/262: mknod dc/dd/c60 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.1.vm07.stdout:6/207: mkdir d1/d4/d6/d16/d47 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.1.vm07.stdout:6/208: fsync d1/d4/d6/f30 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.1.vm07.stdout:3/263: creat dc/dd/d43/f61 x:0 0 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.0.vm00.stdout:2/37: creat d4/dd/ff x:0 0 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.0.vm00.stdout:7/59: creat da/f16 x:0 0 0 2026-03-10T12:37:36.145 
INFO:tasks.workunit.client.0.vm00.stdout:8/41: truncate d0/f7 189079 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.0.vm00.stdout:8/42: write d0/f7 [1033342,50509] 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.0.vm00.stdout:7/60: dwrite da/f13 [0,4194304] 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.0.vm00.stdout:2/38: rmdir d4/dd/de 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.0.vm00.stdout:8/43: creat d0/f8 x:0 0 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.0.vm00.stdout:2/39: rmdir d4/d6 39 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.0.vm00.stdout:2/40: chown d4/c8 156043 1 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.0.vm00.stdout:2/41: creat d4/dd/f10 x:0 0 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.0.vm00.stdout:8/44: creat d0/f9 x:0 0 0 2026-03-10T12:37:36.145 INFO:tasks.workunit.client.0.vm00.stdout:8/45: write d0/f9 [913747,28531] 0 2026-03-10T12:37:36.148 INFO:tasks.workunit.client.1.vm07.stdout:3/264: symlink dc/d18/d24/l62 0 2026-03-10T12:37:36.148 INFO:tasks.workunit.client.1.vm07.stdout:3/265: dread - dc/d18/f34 zero size 2026-03-10T12:37:36.149 INFO:tasks.workunit.client.1.vm07.stdout:5/269: getdents d0/d22/d18/d19/d2e/d3f 0 2026-03-10T12:37:36.150 INFO:tasks.workunit.client.1.vm07.stdout:3/266: symlink dc/d18/d24/l63 0 2026-03-10T12:37:36.151 INFO:tasks.workunit.client.0.vm00.stdout:2/42: write d4/d6/fb [930748,120392] 0 2026-03-10T12:37:36.152 INFO:tasks.workunit.client.0.vm00.stdout:2/43: read d4/d6/fb [274580,85856] 0 2026-03-10T12:37:36.152 INFO:tasks.workunit.client.1.vm07.stdout:5/270: dread d0/d22/d18/d19/d21/f42 [4194304,4194304] 0 2026-03-10T12:37:36.154 INFO:tasks.workunit.client.0.vm00.stdout:8/46: symlink d0/la 0 2026-03-10T12:37:36.154 INFO:tasks.workunit.client.1.vm07.stdout:3/267: rename dc/d18/d24/c33 to dc/dd/d43/c64 0 2026-03-10T12:37:36.158 INFO:tasks.workunit.client.0.vm00.stdout:2/44: symlink d4/l11 0 2026-03-10T12:37:36.158 
INFO:tasks.workunit.client.1.vm07.stdout:3/268: getdents dc/dd/d1f 0 2026-03-10T12:37:36.158 INFO:tasks.workunit.client.1.vm07.stdout:3/269: stat dc/c1b 0 2026-03-10T12:37:36.158 INFO:tasks.workunit.client.1.vm07.stdout:3/270: write dc/dd/d1f/d45/f56 [478718,67687] 0 2026-03-10T12:37:36.161 INFO:tasks.workunit.client.1.vm07.stdout:3/271: unlink dc/d18/l1e 0 2026-03-10T12:37:36.162 INFO:tasks.workunit.client.1.vm07.stdout:3/272: write dc/d18/d24/f3e [2096074,113506] 0 2026-03-10T12:37:36.162 INFO:tasks.workunit.client.0.vm00.stdout:2/45: dwrite d4/dd/f10 [0,4194304] 0 2026-03-10T12:37:36.163 INFO:tasks.workunit.client.0.vm00.stdout:8/47: symlink d0/lb 0 2026-03-10T12:37:36.164 INFO:tasks.workunit.client.0.vm00.stdout:0/43: fdatasync d3/f4 0 2026-03-10T12:37:36.164 INFO:tasks.workunit.client.0.vm00.stdout:8/48: chown d0/f7 635270 1 2026-03-10T12:37:36.165 INFO:tasks.workunit.client.0.vm00.stdout:8/49: chown d0/lb 13 1 2026-03-10T12:37:36.165 INFO:tasks.workunit.client.0.vm00.stdout:0/44: truncate d3/f4 1902793 0 2026-03-10T12:37:36.168 INFO:tasks.workunit.client.0.vm00.stdout:2/46: dwrite f1 [0,4194304] 0 2026-03-10T12:37:36.170 INFO:tasks.workunit.client.1.vm07.stdout:2/185: sync 2026-03-10T12:37:36.177 INFO:tasks.workunit.client.1.vm07.stdout:2/186: mknod d0/d19/d1f/d20/c41 0 2026-03-10T12:37:36.177 INFO:tasks.workunit.client.0.vm00.stdout:0/45: chown d3/l9 201373 1 2026-03-10T12:37:36.179 INFO:tasks.workunit.client.0.vm00.stdout:0/46: dread f2 [0,4194304] 0 2026-03-10T12:37:36.179 INFO:tasks.workunit.client.0.vm00.stdout:0/47: stat f2 0 2026-03-10T12:37:36.187 INFO:tasks.workunit.client.1.vm07.stdout:9/187: write d5/d16/f19 [4491015,21603] 0 2026-03-10T12:37:36.187 INFO:tasks.workunit.client.1.vm07.stdout:0/291: write d0/d14/d1a/f30 [3206110,21323] 0 2026-03-10T12:37:36.187 INFO:tasks.workunit.client.1.vm07.stdout:2/187: dread d0/f4 [0,4194304] 0 2026-03-10T12:37:36.188 INFO:tasks.workunit.client.1.vm07.stdout:2/188: dread - d0/f2d zero size 
2026-03-10T12:37:36.200 INFO:tasks.workunit.client.0.vm00.stdout:0/48: getdents d3 0 2026-03-10T12:37:36.200 INFO:tasks.workunit.client.0.vm00.stdout:0/49: read f0 [2559469,25337] 0 2026-03-10T12:37:36.202 INFO:tasks.workunit.client.0.vm00.stdout:0/50: unlink f0 0 2026-03-10T12:37:36.202 INFO:tasks.workunit.client.1.vm07.stdout:7/225: dwrite d0/f23 [0,4194304] 0 2026-03-10T12:37:36.202 INFO:tasks.workunit.client.0.vm00.stdout:0/51: mkdir d3/db 0 2026-03-10T12:37:36.205 INFO:tasks.workunit.client.1.vm07.stdout:7/226: dread d0/f23 [0,4194304] 0 2026-03-10T12:37:36.206 INFO:tasks.workunit.client.0.vm00.stdout:0/52: dwrite d3/f4 [0,4194304] 0 2026-03-10T12:37:36.208 INFO:tasks.workunit.client.0.vm00.stdout:0/53: read d3/f4 [2348332,77032] 0 2026-03-10T12:37:36.208 INFO:tasks.workunit.client.0.vm00.stdout:0/54: readlink d3/la 0 2026-03-10T12:37:36.215 INFO:tasks.workunit.client.0.vm00.stdout:0/55: dwrite d3/f4 [0,4194304] 0 2026-03-10T12:37:36.221 INFO:tasks.workunit.client.1.vm07.stdout:8/261: write d1/d3/f2d [1333970,72888] 0 2026-03-10T12:37:36.221 INFO:tasks.workunit.client.0.vm00.stdout:0/56: symlink d3/d7/lc 0 2026-03-10T12:37:36.221 INFO:tasks.workunit.client.1.vm07.stdout:4/246: dread d0/d4/d10/d23/f50 [0,4194304] 0 2026-03-10T12:37:36.221 INFO:tasks.workunit.client.0.vm00.stdout:0/57: dread d3/f4 [0,4194304] 0 2026-03-10T12:37:36.222 INFO:tasks.workunit.client.1.vm07.stdout:9/188: mknod d5/d13/d2c/d2f/c3b 0 2026-03-10T12:37:36.229 INFO:tasks.workunit.client.0.vm00.stdout:0/58: creat d3/fd x:0 0 0 2026-03-10T12:37:36.234 INFO:tasks.workunit.client.0.vm00.stdout:0/59: dread f2 [0,4194304] 0 2026-03-10T12:37:36.234 INFO:tasks.workunit.client.1.vm07.stdout:0/292: rename d0/d14/d1a/d1b to d0/d14/d5f 0 2026-03-10T12:37:36.234 INFO:tasks.workunit.client.1.vm07.stdout:7/227: mknod d0/c3e 0 2026-03-10T12:37:36.234 INFO:tasks.workunit.client.1.vm07.stdout:7/228: fsync d0/f39 0 2026-03-10T12:37:36.234 INFO:tasks.workunit.client.1.vm07.stdout:8/262: link d1/f2 
d1/d3/d6/d50/f56 0 2026-03-10T12:37:36.237 INFO:tasks.workunit.client.1.vm07.stdout:8/263: dread d1/d3/d18/f38 [0,4194304] 0 2026-03-10T12:37:36.238 INFO:tasks.workunit.client.1.vm07.stdout:8/264: fsync d1/d3/f25 0 2026-03-10T12:37:36.241 INFO:tasks.workunit.client.0.vm00.stdout:0/60: dwrite f2 [0,4194304] 0 2026-03-10T12:37:36.244 INFO:tasks.workunit.client.0.vm00.stdout:0/61: symlink d3/le 0 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.0.vm00.stdout:0/62: dwrite d3/f4 [0,4194304] 0 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.0.vm00.stdout:0/63: link d3/l5 d3/lf 0 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.0.vm00.stdout:0/64: dread f2 [0,4194304] 0 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.0.vm00.stdout:0/65: write f2 [1535030,93695] 0 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.0.vm00.stdout:0/66: write d3/f4 [1979361,9944] 0 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.0.vm00.stdout:0/67: stat d3/d7/lc 0 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.1.vm07.stdout:8/265: creat d1/d3/f57 x:0 0 0 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.1.vm07.stdout:8/266: read d1/d3/d18/f38 [174260,39371] 0 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.1.vm07.stdout:8/267: dread - d1/d3/f57 zero size 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.1.vm07.stdout:8/268: mknod d1/d3/d40/c58 0 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.1.vm07.stdout:8/269: creat d1/d3/f59 x:0 0 0 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.1.vm07.stdout:8/270: dread d1/fc [0,4194304] 0 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.1.vm07.stdout:8/271: creat d1/d3/d40/f5a x:0 0 0 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.1.vm07.stdout:8/272: creat d1/d3/d40/f5b x:0 0 0 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.1.vm07.stdout:8/273: symlink d1/d3/d11/l5c 0 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.1.vm07.stdout:8/274: read d1/f19 [1566820,52695] 0 2026-03-10T12:37:36.263 
INFO:tasks.workunit.client.1.vm07.stdout:8/275: dread - d1/d3/f57 zero size 2026-03-10T12:37:36.263 INFO:tasks.workunit.client.0.vm00.stdout:0/68: creat d3/d7/f10 x:0 0 0 2026-03-10T12:37:36.264 INFO:tasks.workunit.client.0.vm00.stdout:0/69: chown d3/l5 2991745 1 2026-03-10T12:37:36.265 INFO:tasks.workunit.client.0.vm00.stdout:0/70: write d3/fd [341489,82878] 0 2026-03-10T12:37:36.267 INFO:tasks.workunit.client.1.vm07.stdout:8/276: dwrite d1/d3/d6/d50/f56 [0,4194304] 0 2026-03-10T12:37:36.268 INFO:tasks.workunit.client.0.vm00.stdout:0/71: dwrite d3/f4 [0,4194304] 0 2026-03-10T12:37:36.270 INFO:tasks.workunit.client.0.vm00.stdout:0/72: chown d3/f4 41873323 1 2026-03-10T12:37:36.271 INFO:tasks.workunit.client.0.vm00.stdout:0/73: creat d3/d7/f11 x:0 0 0 2026-03-10T12:37:36.274 INFO:tasks.workunit.client.1.vm07.stdout:7/229: sync 2026-03-10T12:37:36.279 INFO:tasks.workunit.client.1.vm07.stdout:8/277: dread d1/d3/d40/f41 [0,4194304] 0 2026-03-10T12:37:36.328 INFO:tasks.workunit.client.1.vm07.stdout:8/278: dread d1/d3/d6/f24 [0,4194304] 0 2026-03-10T12:37:36.328 INFO:tasks.workunit.client.1.vm07.stdout:8/279: chown d1/d3/f8 167874 1 2026-03-10T12:37:36.332 INFO:tasks.workunit.client.1.vm07.stdout:8/280: mkdir d1/d3/d5d 0 2026-03-10T12:37:36.335 INFO:tasks.workunit.client.1.vm07.stdout:8/281: creat d1/d3/d6/d50/f5e x:0 0 0 2026-03-10T12:37:36.337 INFO:tasks.workunit.client.1.vm07.stdout:8/282: fsync d1/f3d 0 2026-03-10T12:37:36.337 INFO:tasks.workunit.client.1.vm07.stdout:8/283: stat d1/d3/c52 0 2026-03-10T12:37:36.402 INFO:tasks.workunit.client.1.vm07.stdout:9/189: dread d5/f1c [4194304,4194304] 0 2026-03-10T12:37:36.404 INFO:tasks.workunit.client.1.vm07.stdout:9/190: dread d5/d13/d22/f36 [0,4194304] 0 2026-03-10T12:37:36.405 INFO:tasks.workunit.client.1.vm07.stdout:9/191: symlink d5/d13/d2c/l3c 0 2026-03-10T12:37:36.406 INFO:tasks.workunit.client.0.vm00.stdout:2/47: dread d4/d6/fb [0,4194304] 0 2026-03-10T12:37:36.408 INFO:tasks.workunit.client.1.vm07.stdout:9/192: 
creat d5/d1f/f3d x:0 0 0
2026-03-10T12:37:36.411 INFO:tasks.workunit.client.1.vm07.stdout:9/193: unlink d5/d13/d22/f37 0
2026-03-10T12:37:36.424 INFO:tasks.workunit.client.1.vm07.stdout:9/194: mkdir d5/d13/d2c/d2f/d3e 0
2026-03-10T12:37:36.424 INFO:tasks.workunit.client.1.vm07.stdout:9/195: readlink d5/lf 0
2026-03-10T12:37:36.429 INFO:tasks.workunit.client.0.vm00.stdout:1/48: write da/f11 [592856,82864] 0
2026-03-10T12:37:36.431 INFO:tasks.workunit.client.1.vm07.stdout:9/196: link d5/d1f/l2e d5/d13/d22/l3f 0
2026-03-10T12:37:36.433 INFO:tasks.workunit.client.1.vm07.stdout:9/197: creat d5/d16/d23/d26/f40 x:0 0 0
2026-03-10T12:37:36.436 INFO:tasks.workunit.client.1.vm07.stdout:8/284: read d1/d3/d11/f43 [394276,113102] 0
2026-03-10T12:37:36.436 INFO:tasks.workunit.client.1.vm07.stdout:9/198: dwrite d5/d13/d2c/f30 [0,4194304] 0
2026-03-10T12:37:36.439 INFO:tasks.workunit.client.1.vm07.stdout:8/285: write d1/d3/f29 [1381587,47286] 0
2026-03-10T12:37:36.443 INFO:tasks.workunit.client.1.vm07.stdout:9/199: dwrite d5/d16/f19 [4194304,4194304] 0
2026-03-10T12:37:36.444 INFO:tasks.workunit.client.0.vm00.stdout:9/37: truncate d0/d5/f3 5077643 0
2026-03-10T12:37:36.445 INFO:tasks.workunit.client.1.vm07.stdout:8/286: creat d1/d3/d5d/f5f x:0 0 0
2026-03-10T12:37:36.449 INFO:tasks.workunit.client.1.vm07.stdout:8/287: read d1/f2 [12640,110351] 0
2026-03-10T12:37:36.457 INFO:tasks.workunit.client.0.vm00.stdout:3/60: truncate f7 3370177 0
2026-03-10T12:37:36.457 INFO:tasks.workunit.client.0.vm00.stdout:3/61: truncate dd/f16 816585 0
2026-03-10T12:37:36.457 INFO:tasks.workunit.client.1.vm07.stdout:9/200: rename d5/d13/d2c/f30 to d5/d13/d2c/f41 0
2026-03-10T12:37:36.457 INFO:tasks.workunit.client.1.vm07.stdout:9/201: rename d5/d16/d23/f28 to d5/d16/d23/d26/f42 0
2026-03-10T12:37:36.458 INFO:tasks.workunit.client.1.vm07.stdout:9/202: write d5/d16/d23/d26/f40 [653996,62245] 0
2026-03-10T12:37:36.459 INFO:tasks.workunit.client.1.vm07.stdout:9/203: readlink d5/lf 0
2026-03-10T12:37:36.466 INFO:tasks.workunit.client.1.vm07.stdout:9/204: dwrite d5/d13/d22/f39 [0,4194304] 0
2026-03-10T12:37:36.476 INFO:tasks.workunit.client.1.vm07.stdout:9/205: dwrite d5/d16/d23/d26/f42 [0,4194304] 0
2026-03-10T12:37:36.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:36 vm00.local ceph-mon[50686]: from='client.24445 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T12:37:36.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:36 vm00.local ceph-mon[50686]: pgmap v153: 65 pgs: 65 active+clean; 662 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 2.1 MiB/s rd, 66 MiB/s wr, 400 op/s
2026-03-10T12:37:36.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:36 vm00.local ceph-mon[50686]: from='client.? 192.168.123.100:0/3068058105' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-10T12:37:36.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:36 vm00.local ceph-mon[50686]: from='client.? 192.168.123.100:0/4128295406' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-10T12:37:36.488 INFO:tasks.workunit.client.1.vm07.stdout:9/206: rename d5/f2a to d5/d1f/d31/f43 0
2026-03-10T12:37:36.493 INFO:tasks.workunit.client.1.vm07.stdout:9/207: rmdir d5/d16/d23 39
2026-03-10T12:37:36.498 INFO:tasks.workunit.client.0.vm00.stdout:4/59: fsync f9 0
2026-03-10T12:37:36.498 INFO:tasks.workunit.client.0.vm00.stdout:4/60: rmdir df 39
2026-03-10T12:37:36.498 INFO:tasks.workunit.client.0.vm00.stdout:9/38: creat d0/dd/fe x:0 0 0
2026-03-10T12:37:36.498 INFO:tasks.workunit.client.0.vm00.stdout:9/39: rename d0 to d0/d5/df 22
2026-03-10T12:37:36.501 INFO:tasks.workunit.client.1.vm07.stdout:0/293: read d0/d14/d1a/f30 [2585344,96551] 0
2026-03-10T12:37:36.501 INFO:tasks.workunit.client.0.vm00.stdout:9/40: link d0/c6 d0/dd/c10 0
2026-03-10T12:37:36.504 INFO:tasks.workunit.client.1.vm07.stdout:0/294: mkdir d0/d14/d1a/d2f/d31/d4f/d60 0
2026-03-10T12:37:36.504 INFO:tasks.workunit.client.1.vm07.stdout:0/295: write d0/d14/d5f/d3b/f46 [385361,2575] 0
2026-03-10T12:37:36.505 INFO:tasks.workunit.client.1.vm07.stdout:0/296: chown d0/d14/d1a/d2f/c40 383808849 1
2026-03-10T12:37:36.512 INFO:tasks.workunit.client.0.vm00.stdout:9/41: link d0/dd/c10 d0/c11 0
2026-03-10T12:37:36.517 INFO:tasks.workunit.client.0.vm00.stdout:3/62: dread dd/f15 [0,4194304] 0
2026-03-10T12:37:36.517 INFO:tasks.workunit.client.0.vm00.stdout:3/63: fsync dd/d18/f12 0
2026-03-10T12:37:36.517 INFO:tasks.workunit.client.1.vm07.stdout:3/273: write dc/d18/d24/f3a [2459150,7297] 0
2026-03-10T12:37:36.517 INFO:tasks.workunit.client.1.vm07.stdout:3/274: creat dc/dd/d43/d5c/f65 x:0 0 0
2026-03-10T12:37:36.517 INFO:tasks.workunit.client.1.vm07.stdout:3/275: truncate dc/dd/f19 3813585 0
2026-03-10T12:37:36.517 INFO:tasks.workunit.client.1.vm07.stdout:3/276: chown dc/d18/d24/f49 15247549 1
2026-03-10T12:37:36.521 INFO:tasks.workunit.client.0.vm00.stdout:6/94: truncate d2/d14/f1b 2453301 0
2026-03-10T12:37:36.521 INFO:tasks.workunit.client.0.vm00.stdout:7/61: getdents da 0
2026-03-10T12:37:36.522 INFO:tasks.workunit.client.0.vm00.stdout:3/64: mknod dd/c19 0
2026-03-10T12:37:36.523 INFO:tasks.workunit.client.0.vm00.stdout:3/65: read f9 [2485453,24958] 0
2026-03-10T12:37:36.523 INFO:tasks.workunit.client.1.vm07.stdout:1/228: dwrite d9/df/f13 [0,4194304] 0
2026-03-10T12:37:36.524 INFO:tasks.workunit.client.1.vm07.stdout:1/229: stat d9/f36 0
2026-03-10T12:37:36.525 INFO:tasks.workunit.client.0.vm00.stdout:6/95: stat d2/da/c1f 0
2026-03-10T12:37:36.526 INFO:tasks.workunit.client.0.vm00.stdout:8/50: rmdir d0 39
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.0.vm00.stdout:2/48: write d4/dd/f10 [4936077,38189] 0
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.0.vm00.stdout:7/62: creat da/f17 x:0 0 0
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.0.vm00.stdout:3/66: mknod dd/d18/c1a 0
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.0.vm00.stdout:6/96: rename d2/da/dc/ff to d2/d14/f21 0
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.0.vm00.stdout:8/51: fdatasync d0/f8 0
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.0.vm00.stdout:8/52: dread d0/f7 [0,4194304] 0
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.0.vm00.stdout:8/53: write d0/f8 [642889,8679] 0
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.1.vm07.stdout:6/209: dread d1/d4/f3f [0,4194304] 0
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.1.vm07.stdout:3/277: symlink dc/d18/d24/l66 0
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.1.vm07.stdout:6/210: rmdir d1/d4/d6/d16/d1a/d2c 39
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.1.vm07.stdout:6/211: readlink d1/d4/d6/d16/l24 0
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.1.vm07.stdout:1/230: symlink d9/df/d29/d2b/d31/l4b 0
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.1.vm07.stdout:1/231: read - d9/df/d29/f49 zero size
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.1.vm07.stdout:6/212: rename d1 to d1/d4/d6/d16/d1a/d33/d48 22
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.1.vm07.stdout:3/278: chown dc/dd/f19 1161 1
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.1.vm07.stdout:3/279: write dc/dd/d1f/d45/f56 [778353,111647] 0
2026-03-10T12:37:36.536 INFO:tasks.workunit.client.1.vm07.stdout:3/280: write dc/dd/d1f/f27 [2296142,9207] 0
2026-03-10T12:37:36.537 INFO:tasks.workunit.client.1.vm07.stdout:1/232: dwrite d9/df/f13 [4194304,4194304] 0
2026-03-10T12:37:36.542 INFO:tasks.workunit.client.0.vm00.stdout:3/67: mknod dd/c1b 0
2026-03-10T12:37:36.542 INFO:tasks.workunit.client.0.vm00.stdout:3/68: write dd/f16 [1469691,95134] 0
2026-03-10T12:37:36.543 INFO:tasks.workunit.client.0.vm00.stdout:3/69: stat dd/c1b 0
2026-03-10T12:37:36.543 INFO:tasks.workunit.client.1.vm07.stdout:3/281: creat dc/dd/d28/f67 x:0 0 0
2026-03-10T12:37:36.546 INFO:tasks.workunit.client.1.vm07.stdout:6/213: unlink d1/c18 0
2026-03-10T12:37:36.547 INFO:tasks.workunit.client.1.vm07.stdout:6/214: truncate d1/d4/d6/d16/d1a/d33/f37 984523 0
2026-03-10T12:37:36.550 INFO:tasks.workunit.client.0.vm00.stdout:7/63: symlink da/l18 0
2026-03-10T12:37:36.551 INFO:tasks.workunit.client.1.vm07.stdout:1/233: creat d9/df/d29/d2b/d3d/f4c x:0 0 0
2026-03-10T12:37:36.552 INFO:tasks.workunit.client.1.vm07.stdout:1/234: write d9/df/f26 [328699,47135] 0
2026-03-10T12:37:36.553 INFO:tasks.workunit.client.1.vm07.stdout:6/215: mkdir d1/d4/d6/d16/d49 0
2026-03-10T12:37:36.553 INFO:tasks.workunit.client.0.vm00.stdout:3/70: dwrite f9 [0,4194304] 0
2026-03-10T12:37:36.555 INFO:tasks.workunit.client.0.vm00.stdout:3/71: truncate dd/d18/f12 483598 0
2026-03-10T12:37:36.555 INFO:tasks.workunit.client.0.vm00.stdout:3/72: read f9 [2057819,85614] 0
2026-03-10T12:37:36.556 INFO:tasks.workunit.client.1.vm07.stdout:6/216: dread d1/d4/d6/d16/d1a/f29 [0,4194304] 0
2026-03-10T12:37:36.557 INFO:tasks.workunit.client.0.vm00.stdout:8/54: symlink d0/lc 0
2026-03-10T12:37:36.558 INFO:tasks.workunit.client.0.vm00.stdout:7/64: symlink da/l19 0
2026-03-10T12:37:36.559 INFO:tasks.workunit.client.0.vm00.stdout:8/55: mkdir d0/dd 0
2026-03-10T12:37:36.562 INFO:tasks.workunit.client.0.vm00.stdout:8/56: dwrite d0/f9 [0,4194304] 0
2026-03-10T12:37:36.565 INFO:tasks.workunit.client.0.vm00.stdout:8/57: read d0/f7 [678996,3953] 0
2026-03-10T12:37:36.565 INFO:tasks.workunit.client.0.vm00.stdout:7/65: creat da/f1a x:0 0 0
2026-03-10T12:37:36.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:36 vm07.local ceph-mon[58582]: from='client.24445 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T12:37:36.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:36 vm07.local ceph-mon[58582]: pgmap v153: 65 pgs: 65 active+clean; 662 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 2.1 MiB/s rd, 66 MiB/s wr, 400 op/s
2026-03-10T12:37:36.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:36 vm07.local ceph-mon[58582]: from='client.? 192.168.123.100:0/3068058105' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-10T12:37:36.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:36 vm07.local ceph-mon[58582]: from='client.? 192.168.123.100:0/4128295406' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-10T12:37:36.566 INFO:tasks.workunit.client.0.vm00.stdout:8/58: rename d0/lb to d0/dd/le 0
2026-03-10T12:37:36.569 INFO:tasks.workunit.client.1.vm07.stdout:1/235: link d9/df/d29/d2c/l3f d9/df/l4d 0
2026-03-10T12:37:36.569 INFO:tasks.workunit.client.0.vm00.stdout:7/66: mkdir da/d1b 0
2026-03-10T12:37:36.576 INFO:tasks.workunit.client.0.vm00.stdout:7/67: link da/c12 da/c1c 0
2026-03-10T12:37:36.579 INFO:tasks.workunit.client.0.vm00.stdout:0/74: fdatasync d3/f4 0
2026-03-10T12:37:36.585 INFO:tasks.workunit.client.1.vm07.stdout:1/236: dread d9/f19 [0,4194304] 0
2026-03-10T12:37:36.585 INFO:tasks.workunit.client.1.vm07.stdout:1/237: rename d9/df/f13 to d9/df/d29/d2b/f4e 0
2026-03-10T12:37:36.585 INFO:tasks.workunit.client.0.vm00.stdout:0/75: dread - d3/d7/f11 zero size
2026-03-10T12:37:36.585 INFO:tasks.workunit.client.0.vm00.stdout:0/76: mknod d3/c12 0
2026-03-10T12:37:36.585 INFO:tasks.workunit.client.0.vm00.stdout:0/77: dwrite d3/d7/f11 [0,4194304] 0
2026-03-10T12:37:36.587 INFO:tasks.workunit.client.0.vm00.stdout:0/78: mknod d3/db/c13 0
2026-03-10T12:37:36.587 INFO:tasks.workunit.client.1.vm07.stdout:1/238: dwrite d9/f1a [0,4194304] 0
2026-03-10T12:37:36.588 INFO:tasks.workunit.client.0.vm00.stdout:0/79: read - d3/d7/f10 zero size
2026-03-10T12:37:36.588 INFO:tasks.workunit.client.0.vm00.stdout:0/80: rename d3/db to d3/db/d14 22
2026-03-10T12:37:36.588 INFO:tasks.workunit.client.0.vm00.stdout:0/81: stat d3/fd 0
2026-03-10T12:37:36.589 INFO:tasks.workunit.client.0.vm00.stdout:0/82: write f2 [1495823,30647] 0
2026-03-10T12:37:36.593 INFO:tasks.workunit.client.1.vm07.stdout:1/239: unlink d9/fb 0
2026-03-10T12:37:36.598 INFO:tasks.workunit.client.0.vm00.stdout:0/83: unlink d3/c12 0
2026-03-10T12:37:36.598 INFO:tasks.workunit.client.1.vm07.stdout:1/240: unlink d9/df/d29/d2c/l40 0
2026-03-10T12:37:36.599 INFO:tasks.workunit.client.0.vm00.stdout:0/84: dwrite d3/fd [0,4194304] 0
2026-03-10T12:37:36.606 INFO:tasks.workunit.client.1.vm07.stdout:1/241: mkdir d9/d2d/d4f 0
2026-03-10T12:37:36.608 INFO:tasks.workunit.client.0.vm00.stdout:0/85: creat d3/d7/f15 x:0 0 0
2026-03-10T12:37:36.610 INFO:tasks.workunit.client.1.vm07.stdout:2/189: rename d0/d19 to d0/d42 0
2026-03-10T12:37:36.611 INFO:tasks.workunit.client.0.vm00.stdout:0/86: dwrite d3/d7/f11 [0,4194304] 0
2026-03-10T12:37:36.611 INFO:tasks.workunit.client.0.vm00.stdout:0/87: dread - d3/d7/f10 zero size
2026-03-10T12:37:36.615 INFO:tasks.workunit.client.0.vm00.stdout:5/35: truncate f4 123182 0
2026-03-10T12:37:36.624 INFO:tasks.workunit.client.0.vm00.stdout:0/88: creat d3/db/f16 x:0 0 0
2026-03-10T12:37:36.624 INFO:tasks.workunit.client.0.vm00.stdout:1/49: fdatasync f5 0
2026-03-10T12:37:36.624 INFO:tasks.workunit.client.0.vm00.stdout:5/36: link ca cc 0
2026-03-10T12:37:36.624 INFO:tasks.workunit.client.0.vm00.stdout:5/37: unlink f2 0
2026-03-10T12:37:36.625 INFO:tasks.workunit.client.1.vm07.stdout:7/230: truncate d0/f31 2914646 0
2026-03-10T12:37:36.625 INFO:tasks.workunit.client.1.vm07.stdout:3/282: dread dc/f17 [0,4194304] 0
2026-03-10T12:37:36.625 INFO:tasks.workunit.client.1.vm07.stdout:7/231: truncate d0/f39 170969 0
2026-03-10T12:37:36.625 INFO:tasks.workunit.client.1.vm07.stdout:8/288: getdents d1/d3/d11 0
2026-03-10T12:37:36.625 INFO:tasks.workunit.client.1.vm07.stdout:1/242: fdatasync d9/df/f4a 0
2026-03-10T12:37:36.625 INFO:tasks.workunit.client.0.vm00.stdout:5/38: creat fd x:0 0 0
2026-03-10T12:37:36.625 INFO:tasks.workunit.client.0.vm00.stdout:5/39: stat c7 0
2026-03-10T12:37:36.626 INFO:tasks.workunit.client.1.vm07.stdout:5/271: write d0/d22/d18/d19/d2e/f59 [605682,99061] 0
2026-03-10T12:37:36.627 INFO:tasks.workunit.client.1.vm07.stdout:5/272: write d0/f9 [1814081,48073] 0
2026-03-10T12:37:36.630 INFO:tasks.workunit.client.1.vm07.stdout:3/283: rename dc/dd/d28/d3b/f51 to dc/dd/d1f/d45/f68 0
2026-03-10T12:37:36.630 INFO:tasks.workunit.client.1.vm07.stdout:3/284: fdatasync dc/d18/d24/f2c 0
2026-03-10T12:37:36.631 INFO:tasks.workunit.client.1.vm07.stdout:3/285: write dc/dd/f21 [2263280,25497] 0
2026-03-10T12:37:36.631 INFO:tasks.workunit.client.1.vm07.stdout:7/232: dwrite d0/f10 [0,4194304] 0
2026-03-10T12:37:36.633 INFO:tasks.workunit.client.1.vm07.stdout:3/286: fsync dc/d18/d24/f49 0
2026-03-10T12:37:36.637 INFO:tasks.workunit.client.1.vm07.stdout:8/289: mknod d1/d3/d40/c60 0
2026-03-10T12:37:36.639 INFO:tasks.workunit.client.1.vm07.stdout:7/233: dread d0/f3c [0,4194304] 0
2026-03-10T12:37:36.650 INFO:tasks.workunit.client.1.vm07.stdout:3/287: dwrite dc/dd/d1f/f30 [0,4194304] 0
2026-03-10T12:37:36.650 INFO:tasks.workunit.client.1.vm07.stdout:3/288: symlink dc/dd/d43/d5c/l69 0
2026-03-10T12:37:36.653 INFO:tasks.workunit.client.0.vm00.stdout:8/59: fdatasync d0/f8 0
2026-03-10T12:37:36.654 INFO:tasks.workunit.client.1.vm07.stdout:5/273: link d0/l2a d0/d22/d18/d19/d2e/d3f/d5c/l60 0
2026-03-10T12:37:36.657 INFO:tasks.workunit.client.0.vm00.stdout:8/60: symlink d0/dd/lf 0
2026-03-10T12:37:36.659 INFO:tasks.workunit.client.1.vm07.stdout:3/289: unlink dc/d18/d2d/d3d/l44 0
2026-03-10T12:37:36.667 INFO:tasks.workunit.client.0.vm00.stdout:8/61: dwrite d0/f7 [0,4194304] 0
2026-03-10T12:37:36.667 INFO:tasks.workunit.client.0.vm00.stdout:8/62: truncate d0/f7 4889781 0
2026-03-10T12:37:36.667 INFO:tasks.workunit.client.0.vm00.stdout:8/63: write d0/f7 [69880,2043] 0
2026-03-10T12:37:36.667 INFO:tasks.workunit.client.1.vm07.stdout:8/290: dread d1/d3/ff [0,4194304] 0
2026-03-10T12:37:36.668 INFO:tasks.workunit.client.1.vm07.stdout:5/274: dwrite d0/f13 [0,4194304] 0
2026-03-10T12:37:36.668 INFO:tasks.workunit.client.1.vm07.stdout:3/290: dread dc/f17 [0,4194304] 0
2026-03-10T12:37:36.671 INFO:tasks.workunit.client.0.vm00.stdout:8/64: creat d0/f10 x:0 0 0
2026-03-10T12:37:36.672 INFO:tasks.workunit.client.0.vm00.stdout:8/65: write d0/f7 [1027398,121400] 0
2026-03-10T12:37:36.677 INFO:tasks.workunit.client.1.vm07.stdout:8/291: symlink d1/d3/d40/l61 0
2026-03-10T12:37:36.686 INFO:tasks.workunit.client.0.vm00.stdout:8/66: creat d0/f11 x:0 0 0
2026-03-10T12:37:36.686 INFO:tasks.workunit.client.1.vm07.stdout:5/275: creat d0/d22/d18/d19/d21/f61 x:0 0 0
2026-03-10T12:37:36.686 INFO:tasks.workunit.client.1.vm07.stdout:5/276: creat d0/d22/d18/d19/d2e/f62 x:0 0 0
2026-03-10T12:37:36.686 INFO:tasks.workunit.client.1.vm07.stdout:5/277: fdatasync d0/d22/d18/d19/d2e/f62 0
2026-03-10T12:37:36.686 INFO:tasks.workunit.client.1.vm07.stdout:5/278: write d0/fa [1860936,95702] 0
2026-03-10T12:37:36.690 INFO:tasks.workunit.client.0.vm00.stdout:8/67: mkdir d0/d12 0
2026-03-10T12:37:36.703 INFO:tasks.workunit.client.1.vm07.stdout:3/291: dread dc/d18/d24/f3a [0,4194304] 0
2026-03-10T12:37:36.704 INFO:tasks.workunit.client.1.vm07.stdout:3/292: chown dc/dd/d1f/d45/f50 601919 1
2026-03-10T12:37:36.709 INFO:tasks.workunit.client.1.vm07.stdout:3/293: symlink dc/dd/d28/d3b/l6a 0
2026-03-10T12:37:36.710 INFO:tasks.workunit.client.1.vm07.stdout:3/294: dread dc/dd/d1f/d45/f54 [0,4194304] 0
2026-03-10T12:37:36.712 INFO:tasks.workunit.client.1.vm07.stdout:8/292: dread d1/d3/f1d [0,4194304] 0
2026-03-10T12:37:36.713 INFO:tasks.workunit.client.1.vm07.stdout:8/293: chown d1/d3/d18/f38 19576 1
2026-03-10T12:37:36.714 INFO:tasks.workunit.client.1.vm07.stdout:8/294: dread - d1/d3/d40/f4c zero size
2026-03-10T12:37:36.715 INFO:tasks.workunit.client.1.vm07.stdout:8/295: chown d1/f4b 3973513 1
2026-03-10T12:37:36.729 INFO:tasks.workunit.client.1.vm07.stdout:5/279: dread d0/d22/d18/d19/d2e/f52 [0,4194304] 0
2026-03-10T12:37:36.786 INFO:tasks.workunit.client.1.vm07.stdout:5/280: sync
2026-03-10T12:37:36.790 INFO:tasks.workunit.client.1.vm07.stdout:5/281: dwrite d0/d22/f5b [0,4194304] 0
2026-03-10T12:37:36.825 INFO:tasks.workunit.client.0.vm00.stdout:0/89: fsync f2 0
2026-03-10T12:37:36.834 INFO:tasks.workunit.client.0.vm00.stdout:0/90: mkdir d3/db/d17 0
2026-03-10T12:37:36.840 INFO:tasks.workunit.client.0.vm00.stdout:0/91: link d3/l9 d3/l18 0
2026-03-10T12:37:36.843 INFO:tasks.workunit.client.0.vm00.stdout:0/92: creat d3/db/d17/f19 x:0 0 0
2026-03-10T12:37:36.844 INFO:tasks.workunit.client.0.vm00.stdout:0/93: symlink d3/d7/l1a 0
2026-03-10T12:37:36.846 INFO:tasks.workunit.client.0.vm00.stdout:0/94: mkdir d3/d1b 0
2026-03-10T12:37:36.846 INFO:tasks.workunit.client.0.vm00.stdout:0/95: chown d3/d7/f11 1 1
2026-03-10T12:37:36.857 INFO:tasks.workunit.client.0.vm00.stdout:0/96: creat d3/d7/f1c x:0 0 0
2026-03-10T12:37:36.857 INFO:tasks.workunit.client.0.vm00.stdout:0/97: creat d3/f1d x:0 0 0
2026-03-10T12:37:36.857 INFO:tasks.workunit.client.0.vm00.stdout:0/98: symlink d3/d1b/l1e 0
2026-03-10T12:37:36.857 INFO:tasks.workunit.client.0.vm00.stdout:0/99: fdatasync d3/d7/f1c 0
2026-03-10T12:37:36.857 INFO:tasks.workunit.client.0.vm00.stdout:3/73: dread dd/f16 [0,4194304] 0
2026-03-10T12:37:36.857 INFO:tasks.workunit.client.0.vm00.stdout:0/100: mknod d3/d7/c1f 0
2026-03-10T12:37:36.861 INFO:tasks.workunit.client.0.vm00.stdout:3/74: dwrite fb [0,4194304] 0
2026-03-10T12:37:36.866 INFO:tasks.workunit.client.0.vm00.stdout:0/101: symlink d3/db/d17/l20 0
2026-03-10T12:37:36.871 INFO:tasks.workunit.client.0.vm00.stdout:0/102: dread - d3/d7/f15 zero size
2026-03-10T12:37:36.871 INFO:tasks.workunit.client.0.vm00.stdout:0/103: dread - d3/db/d17/f19 zero size
2026-03-10T12:37:36.871 INFO:tasks.workunit.client.0.vm00.stdout:9/42: dread d0/d5/f3 [0,4194304] 0
2026-03-10T12:37:36.877 INFO:tasks.workunit.client.0.vm00.stdout:0/104: rename d3/d7/l1a to d3/d1b/l21 0
2026-03-10T12:37:36.877 INFO:tasks.workunit.client.0.vm00.stdout:9/43: fdatasync d0/d5/f3 0
2026-03-10T12:37:36.880 INFO:tasks.workunit.client.0.vm00.stdout:0/105: dwrite d3/d7/f15 [0,4194304] 0
2026-03-10T12:37:36.890 INFO:tasks.workunit.client.0.vm00.stdout:9/44: unlink d0/d5/f3 0
2026-03-10T12:37:36.892 INFO:tasks.workunit.client.0.vm00.stdout:0/106: mkdir d3/d22 0
2026-03-10T12:37:36.892 INFO:tasks.workunit.client.0.vm00.stdout:9/45: symlink d0/dd/l12 0
2026-03-10T12:37:36.894 INFO:tasks.workunit.client.0.vm00.stdout:0/107: mknod d3/db/c23 0
2026-03-10T12:37:36.898 INFO:tasks.workunit.client.0.vm00.stdout:0/108: readlink d3/l9 0
2026-03-10T12:37:36.898 INFO:tasks.workunit.client.0.vm00.stdout:9/46: link d0/dd/l12 d0/dd/l13 0
2026-03-10T12:37:36.898 INFO:tasks.workunit.client.0.vm00.stdout:9/47: write d0/f4 [4485038,57319] 0
2026-03-10T12:37:36.898 INFO:tasks.workunit.client.0.vm00.stdout:0/109: mkdir d3/db/d24 0
2026-03-10T12:37:36.900 INFO:tasks.workunit.client.0.vm00.stdout:9/48: mknod d0/d5/c14 0
2026-03-10T12:37:36.902 INFO:tasks.workunit.client.0.vm00.stdout:0/110: mkdir d3/db/d24/d25 0
2026-03-10T12:37:36.907 INFO:tasks.workunit.client.1.vm07.stdout:9/208: getdents d5 0
2026-03-10T12:37:36.907 INFO:tasks.workunit.client.0.vm00.stdout:9/49: chown d0/dd/c10 1884318 1
2026-03-10T12:37:36.907 INFO:tasks.workunit.client.0.vm00.stdout:9/50: stat d0/d5/c14 0
2026-03-10T12:37:36.907 INFO:tasks.workunit.client.0.vm00.stdout:0/111: dwrite d3/f1d [0,4194304] 0
2026-03-10T12:37:36.908 INFO:tasks.workunit.client.1.vm07.stdout:9/209: dwrite d5/d16/f19 [4194304,4194304] 0
2026-03-10T12:37:36.915 INFO:tasks.workunit.client.0.vm00.stdout:0/112: symlink d3/db/l26 0
2026-03-10T12:37:36.915 INFO:tasks.workunit.client.0.vm00.stdout:0/113: chown d3/db/c13 0 1
2026-03-10T12:37:36.916 INFO:tasks.workunit.client.0.vm00.stdout:9/51: rename d0/c11 to d0/d5/c15 0
2026-03-10T12:37:36.916 INFO:tasks.workunit.client.1.vm07.stdout:0/297: write d0/d14/d5f/d41/d4e/f56 [1622158,94646] 0
2026-03-10T12:37:36.917 INFO:tasks.workunit.client.0.vm00.stdout:9/52: write d0/dd/fe [841880,94635] 0
2026-03-10T12:37:36.917 INFO:tasks.workunit.client.1.vm07.stdout:0/298: dread - d0/d14/d5f/d41/f55 zero size
2026-03-10T12:37:36.917 INFO:tasks.workunit.client.0.vm00.stdout:9/53: truncate d0/dd/fe 1470158 0
2026-03-10T12:37:36.918 INFO:tasks.workunit.client.1.vm07.stdout:0/299: write d0/d14/f37 [3451437,23281] 0
2026-03-10T12:37:36.919 INFO:tasks.workunit.client.1.vm07.stdout:0/300: fsync d0/d14/d1a/f27 0
2026-03-10T12:37:36.921 INFO:tasks.workunit.client.0.vm00.stdout:9/54: dwrite d0/dd/fe [0,4194304] 0
2026-03-10T12:37:36.926 INFO:tasks.workunit.client.0.vm00.stdout:9/55: read d0/dd/fe [1454873,84194] 0
2026-03-10T12:37:36.930 INFO:tasks.workunit.client.0.vm00.stdout:9/56: dread d0/dd/fe [0,4194304] 0
2026-03-10T12:37:36.933 INFO:tasks.workunit.client.1.vm07.stdout:0/301: creat d0/d14/d1a/d2f/d31/d4f/f61 x:0 0 0
2026-03-10T12:37:36.942 INFO:tasks.workunit.client.1.vm07.stdout:6/217: write d1/f26 [3434240,111994] 0
2026-03-10T12:37:36.945 INFO:tasks.workunit.client.1.vm07.stdout:6/218: write d1/d4/d6/d16/d1a/d33/f37 [1795372,76851] 0
2026-03-10T12:37:36.953 INFO:tasks.workunit.client.0.vm00.stdout:2/49: dwrite f3 [0,4194304] 0
2026-03-10T12:37:36.967 INFO:tasks.workunit.client.0.vm00.stdout:3/75: getdents dd/d18 0
2026-03-10T12:37:36.968 INFO:tasks.workunit.client.0.vm00.stdout:2/50: creat d4/f12 x:0 0 0
2026-03-10T12:37:36.969 INFO:tasks.workunit.client.0.vm00.stdout:2/51: stat f3 0
2026-03-10T12:37:36.969 INFO:tasks.workunit.client.0.vm00.stdout:3/76: fdatasync dd/f16 0
2026-03-10T12:37:36.969 INFO:tasks.workunit.client.0.vm00.stdout:2/52: chown d4/c5 56375284 1
2026-03-10T12:37:36.970 INFO:tasks.workunit.client.0.vm00.stdout:3/77: dread dd/f15 [0,4194304] 0
2026-03-10T12:37:36.970 INFO:tasks.workunit.client.0.vm00.stdout:3/78: read f9 [1297284,82947] 0
2026-03-10T12:37:36.970 INFO:tasks.workunit.client.0.vm00.stdout:3/79: chown dd/d18/d13 2135320589 1
2026-03-10T12:37:36.971 INFO:tasks.workunit.client.0.vm00.stdout:6/97: rmdir d2/da/dc 39
2026-03-10T12:37:36.972 INFO:tasks.workunit.client.0.vm00.stdout:2/53: rename d4/c8 to d4/d6/c13 0
2026-03-10T12:37:36.972 INFO:tasks.workunit.client.0.vm00.stdout:6/98: rename d2/d14 to d2/d14/d22 22
2026-03-10T12:37:36.976 INFO:tasks.workunit.client.0.vm00.stdout:6/99: dwrite d2/d16/f20 [0,4194304] 0
2026-03-10T12:37:36.978 INFO:tasks.workunit.client.0.vm00.stdout:2/54: chown f1 4 1
2026-03-10T12:37:36.979 INFO:tasks.workunit.client.0.vm00.stdout:2/55: dread d4/d6/fb [0,4194304] 0
2026-03-10T12:37:36.979 INFO:tasks.workunit.client.0.vm00.stdout:2/56: chown d4/d6/fb 3525665 1
2026-03-10T12:37:36.980 INFO:tasks.workunit.client.1.vm07.stdout:1/243: rmdir d9 39
2026-03-10T12:37:36.993 INFO:tasks.workunit.client.0.vm00.stdout:2/57: readlink d4/l9 0
2026-03-10T12:37:36.993 INFO:tasks.workunit.client.0.vm00.stdout:2/58: readlink d4/d6/l7 0
2026-03-10T12:37:36.998 INFO:tasks.workunit.client.0.vm00.stdout:6/100: link d2/da/dc/f13 d2/d16/f23 0
2026-03-10T12:37:36.999 INFO:tasks.workunit.client.0.vm00.stdout:6/101: unlink d2/d14/f21 0
2026-03-10T12:37:37.001 INFO:tasks.workunit.client.1.vm07.stdout:1/244: read d9/f1b [1452911,43886] 0
2026-03-10T12:37:37.003 INFO:tasks.workunit.client.0.vm00.stdout:2/59: link d4/dd/ff d4/f14 0
2026-03-10T12:37:37.008 INFO:tasks.workunit.client.1.vm07.stdout:1/245: dwrite d9/df/d29/d2b/f32 [0,4194304] 0
2026-03-10T12:37:37.009 INFO:tasks.workunit.client.0.vm00.stdout:6/102: creat d2/d14/f24 x:0 0 0
2026-03-10T12:37:37.009 INFO:tasks.workunit.client.0.vm00.stdout:6/103: write d2/d14/f24 [967722,73416] 0
2026-03-10T12:37:37.011 INFO:tasks.workunit.client.1.vm07.stdout:6/219: dread d1/d4/f19 [0,4194304] 0
2026-03-10T12:37:37.011 INFO:tasks.workunit.client.1.vm07.stdout:6/220: chown d1/l21 49430628 1
2026-03-10T12:37:37.012 INFO:tasks.workunit.client.1.vm07.stdout:4/247: dwrite d0/d4/d5/f43 [0,4194304] 0
2026-03-10T12:37:37.013 INFO:tasks.workunit.client.0.vm00.stdout:7/68: truncate da/f13 2413693 0
2026-03-10T12:37:37.013 INFO:tasks.workunit.client.0.vm00.stdout:2/60: dwrite d4/dd/f10 [4194304,4194304] 0
2026-03-10T12:37:37.015 INFO:tasks.workunit.client.0.vm00.stdout:2/61: dread - d4/dd/ff zero size
2026-03-10T12:37:37.016 INFO:tasks.workunit.client.0.vm00.stdout:2/62: chown d4/f14 31 1
2026-03-10T12:37:37.023 INFO:tasks.workunit.client.1.vm07.stdout:4/248: dwrite d0/d4/d5/da/f4e [0,4194304] 0
2026-03-10T12:37:37.030 INFO:tasks.workunit.client.0.vm00.stdout:1/50: truncate f5 1140167 0
2026-03-10T12:37:37.036 INFO:tasks.workunit.client.0.vm00.stdout:0/114: dwrite d3/d7/f11 [4194304,4194304] 0
2026-03-10T12:37:37.038 INFO:tasks.workunit.client.0.vm00.stdout:0/115: dwrite d3/f4 [0,4194304] 0
2026-03-10T12:37:37.044 INFO:tasks.workunit.client.0.vm00.stdout:0/116: rename d3 to d3/d22/d27 22
2026-03-10T12:37:37.045 INFO:tasks.workunit.client.0.vm00.stdout:0/117: write d3/f1d [3260380,91641] 0
2026-03-10T12:37:37.045 INFO:tasks.workunit.client.0.vm00.stdout:6/104: creat d2/da/dc/f25 x:0 0 0
2026-03-10T12:37:37.045 INFO:tasks.workunit.client.1.vm07.stdout:1/246: sync
2026-03-10T12:37:37.048 INFO:tasks.workunit.client.1.vm07.stdout:1/247: chown d9/df/d29/d2b/d31/f35 2016 1
2026-03-10T12:37:37.048 INFO:tasks.workunit.client.0.vm00.stdout:2/63: mknod d4/d6/c15 0
2026-03-10T12:37:37.048 INFO:tasks.workunit.client.0.vm00.stdout:2/64: chown d4/c5 86052 1
2026-03-10T12:37:37.053 INFO:tasks.workunit.client.1.vm07.stdout:4/249: creat d0/d4/d10/d23/d46/f56 x:0 0 0
2026-03-10T12:37:37.056 INFO:tasks.workunit.client.0.vm00.stdout:0/118: unlink d3/f1d 0
2026-03-10T12:37:37.056 INFO:tasks.workunit.client.0.vm00.stdout:2/65: dwrite f3 [0,4194304] 0
2026-03-10T12:37:37.058 INFO:tasks.workunit.client.0.vm00.stdout:8/68: truncate d0/f7 91525 0
2026-03-10T12:37:37.058 INFO:tasks.workunit.client.0.vm00.stdout:8/69: dread - d0/f10 zero size
2026-03-10T12:37:37.062 INFO:tasks.workunit.client.0.vm00.stdout:7/69: creat da/d1b/f1d x:0 0 0
2026-03-10T12:37:37.069 INFO:tasks.workunit.client.1.vm07.stdout:7/234: write d0/fc [1853618,38658] 0
2026-03-10T12:37:37.069 INFO:tasks.workunit.client.1.vm07.stdout:7/235: chown d0/f32 853 1
2026-03-10T12:37:37.069 INFO:tasks.workunit.client.1.vm07.stdout:7/236: write d0/f37 [912723,34800] 0
2026-03-10T12:37:37.069 INFO:tasks.workunit.client.1.vm07.stdout:1/248: link d9/df/d29/d2b/d31/l45 d9/df/d29/d2b/d30/l50 0
2026-03-10T12:37:37.070 INFO:tasks.workunit.client.0.vm00.stdout:0/119: symlink d3/l28 0
2026-03-10T12:37:37.071 INFO:tasks.workunit.client.0.vm00.stdout:0/120: truncate d3/db/f16 922362 0
2026-03-10T12:37:37.073 INFO:tasks.workunit.client.0.vm00.stdout:2/66: write d4/d6/fb [807760,99948] 0
2026-03-10T12:37:37.075 INFO:tasks.workunit.client.1.vm07.stdout:7/237: rename d0/f23 to d0/f3f 0
2026-03-10T12:37:37.075 INFO:tasks.workunit.client.0.vm00.stdout:2/67: dread f1 [0,4194304] 0
2026-03-10T12:37:37.075 INFO:tasks.workunit.client.0.vm00.stdout:2/68: chown f1 231118 1
2026-03-10T12:37:37.076 INFO:tasks.workunit.client.0.vm00.stdout:2/69: write f1 [304678,4841] 0
2026-03-10T12:37:37.076 INFO:tasks.workunit.client.0.vm00.stdout:2/70: readlink d4/l9 0
2026-03-10T12:37:37.079 INFO:tasks.workunit.client.0.vm00.stdout:7/70: creat da/d1b/f1e x:0 0 0
2026-03-10T12:37:37.082 INFO:tasks.workunit.client.1.vm07.stdout:7/238: link d0/f20 d0/f40 0
2026-03-10T12:37:37.083 INFO:tasks.workunit.client.0.vm00.stdout:0/121: symlink d3/d7/l29 0
2026-03-10T12:37:37.083 INFO:tasks.workunit.client.0.vm00.stdout:0/122: chown f2 28 1
2026-03-10T12:37:37.083 INFO:tasks.workunit.client.0.vm00.stdout:6/105: link d2/d16/f1c d2/da/f26 0
2026-03-10T12:37:37.083 INFO:tasks.workunit.client.1.vm07.stdout:3/295: write dc/d18/d24/f3f [1043564,123406] 0
2026-03-10T12:37:37.085 INFO:tasks.workunit.client.0.vm00.stdout:2/71: creat d4/d6/f16 x:0 0 0
2026-03-10T12:37:37.087 INFO:tasks.workunit.client.0.vm00.stdout:7/71: write da/fb [1555042,104873] 0
2026-03-10T12:37:37.087 INFO:tasks.workunit.client.0.vm00.stdout:7/72: stat f1 0
2026-03-10T12:37:37.088 INFO:tasks.workunit.client.0.vm00.stdout:0/123: creat d3/d1b/f2a x:0 0 0
2026-03-10T12:37:37.089 INFO:tasks.workunit.client.0.vm00.stdout:6/106: creat d2/da/dc/f27 x:0 0 0
2026-03-10T12:37:37.089 INFO:tasks.workunit.client.0.vm00.stdout:6/107: dread - d2/da/dc/f13 zero size
2026-03-10T12:37:37.090 INFO:tasks.workunit.client.0.vm00.stdout:6/108: write d2/d16/f1d [456735,20238] 0
2026-03-10T12:37:37.091 INFO:tasks.workunit.client.1.vm07.stdout:8/296: dwrite d1/f3f [0,4194304] 0
2026-03-10T12:37:37.092 INFO:tasks.workunit.client.1.vm07.stdout:8/297: fdatasync d1/d3/d5d/f5f 0
2026-03-10T12:37:37.093 INFO:tasks.workunit.client.0.vm00.stdout:0/124: creat d3/d1b/f2b x:0 0 0
2026-03-10T12:37:37.094 INFO:tasks.workunit.client.1.vm07.stdout:7/239: rename d0/l2a to d0/l41 0
2026-03-10T12:37:37.098 INFO:tasks.workunit.client.0.vm00.stdout:6/109: unlink d2/da/f1a 0
2026-03-10T12:37:37.100 INFO:tasks.workunit.client.0.vm00.stdout:2/72: link f3 d4/dd/f17 0
2026-03-10T12:37:37.103 INFO:tasks.workunit.client.0.vm00.stdout:6/110: creat d2/da/dc/f28 x:0 0 0
2026-03-10T12:37:37.110 INFO:tasks.workunit.client.1.vm07.stdout:7/240: unlink d0/f32 0
2026-03-10T12:37:37.110 INFO:tasks.workunit.client.1.vm07.stdout:5/282: truncate d0/fd 484755 0
2026-03-10T12:37:37.110 INFO:tasks.workunit.client.0.vm00.stdout:0/125: mknod d3/db/d24/d25/c2c 0
2026-03-10T12:37:37.110 INFO:tasks.workunit.client.0.vm00.stdout:6/111: mkdir d2/d16/d29 0
2026-03-10T12:37:37.110 INFO:tasks.workunit.client.0.vm00.stdout:0/126: mknod d3/d7/c2d 0
2026-03-10T12:37:37.110 INFO:tasks.workunit.client.0.vm00.stdout:6/112: rename d2/da/dc/f18 to d2/d16/f2a 0
2026-03-10T12:37:37.110 INFO:tasks.workunit.client.0.vm00.stdout:6/113: dread - d2/d16/f23 zero size
2026-03-10T12:37:37.110 INFO:tasks.workunit.client.0.vm00.stdout:6/114: creat d2/d14/f2b x:0 0 0
2026-03-10T12:37:37.111 INFO:tasks.workunit.client.0.vm00.stdout:6/115: truncate d2/d16/f1e 289405 0
2026-03-10T12:37:37.111 INFO:tasks.workunit.client.0.vm00.stdout:6/116: dread - d2/d16/f17 zero size
2026-03-10T12:37:37.111 INFO:tasks.workunit.client.1.vm07.stdout:5/283: dread d0/d22/d18/d19/d21/f38 [0,4194304] 0
2026-03-10T12:37:37.111 INFO:tasks.workunit.client.0.vm00.stdout:0/127: link d3/d7/f1c d3/d22/f2e 0
2026-03-10T12:37:37.112 INFO:tasks.workunit.client.0.vm00.stdout:6/117: creat d2/da/f2c x:0 0 0
2026-03-10T12:37:37.116 INFO:tasks.workunit.client.1.vm07.stdout:5/284: dwrite d0/d22/d18/d19/d21/f37 [0,4194304] 0
2026-03-10T12:37:37.129 INFO:tasks.workunit.client.1.vm07.stdout:5/285: mkdir d0/d22/d18/d19/d2e/d3f/d63 0
2026-03-10T12:37:37.131 INFO:tasks.workunit.client.1.vm07.stdout:5/286: write d0/d22/d18/d30/f35 [884854,88742] 0
2026-03-10T12:37:37.132 INFO:tasks.workunit.client.1.vm07.stdout:7/241: creat d0/f42 x:0 0 0
2026-03-10T12:37:37.135 INFO:tasks.workunit.client.1.vm07.stdout:5/287: mknod d0/d22/d18/d19/d21/d3a/c64 0
2026-03-10T12:37:37.140 INFO:tasks.workunit.client.1.vm07.stdout:7/242: dwrite d0/f2b [0,4194304] 0
2026-03-10T12:37:37.142 INFO:tasks.workunit.client.1.vm07.stdout:7/243: write d0/f3a [656578,89191] 0
2026-03-10T12:37:37.145 INFO:tasks.workunit.client.1.vm07.stdout:7/244: unlink d0/f31 0
2026-03-10T12:37:37.146 INFO:tasks.workunit.client.1.vm07.stdout:7/245: symlink d0/l43 0
2026-03-10T12:37:37.147 INFO:tasks.workunit.client.1.vm07.stdout:7/246: write d0/f2b [4872708,28458] 0
2026-03-10T12:37:37.298 INFO:tasks.workunit.client.0.vm00.stdout:2/73: sync
2026-03-10T12:37:37.301 INFO:tasks.workunit.client.0.vm00.stdout:2/74: dwrite d4/dd/f10 [4194304,4194304] 0
2026-03-10T12:37:37.324 INFO:tasks.workunit.client.0.vm00.stdout:7/73: read da/fb [678292,60675] 0
2026-03-10T12:37:37.325 INFO:tasks.workunit.client.0.vm00.stdout:7/74: write da/fb [1050443,128256] 0
2026-03-10T12:37:37.326 INFO:tasks.workunit.client.0.vm00.stdout:7/75: unlink f4 0
2026-03-10T12:37:37.328 INFO:tasks.workunit.client.0.vm00.stdout:7/76: symlink da/l1f 0
2026-03-10T12:37:37.338 INFO:tasks.workunit.client.0.vm00.stdout:7/77: sync
2026-03-10T12:37:37.339 INFO:tasks.workunit.client.0.vm00.stdout:7/78: creat da/d1b/f20 x:0 0 0
2026-03-10T12:37:37.340 INFO:tasks.workunit.client.0.vm00.stdout:7/79: chown da/c12 59780 1
2026-03-10T12:37:37.393 INFO:tasks.workunit.client.0.vm00.stdout:9/57: getdents d0 0
2026-03-10T12:37:37.394 INFO:tasks.workunit.client.1.vm07.stdout:9/210: truncate d5/d1f/d31/f43 2418146 0
2026-03-10T12:37:37.396 INFO:tasks.workunit.client.0.vm00.stdout:4/61: write df/f11 [320756,25644] 0
2026-03-10T12:37:37.397 INFO:tasks.workunit.client.0.vm00.stdout:4/62: read fa [194990,59027] 0
2026-03-10T12:37:37.401 INFO:tasks.workunit.client.0.vm00.stdout:4/63: dread fa [0,4194304] 0
2026-03-10T12:37:37.401 INFO:tasks.workunit.client.0.vm00.stdout:4/64: chown f8 2009199 1
2026-03-10T12:37:37.404 INFO:tasks.workunit.client.1.vm07.stdout:5/288: mknod d0/d22/c65 0
2026-03-10T12:37:37.404 INFO:tasks.workunit.client.0.vm00.stdout:4/65: dwrite df/f11 [0,4194304] 0
2026-03-10T12:37:37.407 INFO:tasks.workunit.client.0.vm00.stdout:4/66: dread fa [0,4194304] 0
2026-03-10T12:37:37.412 INFO:tasks.workunit.client.1.vm07.stdout:2/190: mknod d0/d42/c43 0
2026-03-10T12:37:37.415 INFO:tasks.workunit.client.1.vm07.stdout:0/302: mkdir d0/d62 0
2026-03-10T12:37:37.415 INFO:tasks.workunit.client.0.vm00.stdout:5/40: write f4 [485263,86803] 0
2026-03-10T12:37:37.416 INFO:tasks.workunit.client.0.vm00.stdout:7/80: dwrite da/f13 [0,4194304] 0
2026-03-10T12:37:37.421 INFO:tasks.workunit.client.1.vm07.stdout:0/303: dwrite d0/d14/d1a/d2f/d31/d4f/f61 [0,4194304] 0
2026-03-10T12:37:37.424 INFO:tasks.workunit.client.0.vm00.stdout:8/70: dwrite d0/f7 [0,4194304] 0
2026-03-10T12:37:37.439 INFO:tasks.workunit.client.0.vm00.stdout:3/80: dwrite f7 [0,4194304] 0
2026-03-10T12:37:37.444 INFO:tasks.workunit.client.1.vm07.stdout:0/304: symlink d0/d14/d1a/d2f/l63 0
2026-03-10T12:37:37.444 INFO:tasks.workunit.client.0.vm00.stdout:2/75: getdents d4/d6 0
2026-03-10T12:37:37.448 INFO:tasks.workunit.client.0.vm00.stdout:2/76: dwrite d4/d6/f16 [0,4194304] 0
2026-03-10T12:37:37.448 INFO:tasks.workunit.client.1.vm07.stdout:0/305: getdents d0/d62 0
2026-03-10T12:37:37.450 INFO:tasks.workunit.client.0.vm00.stdout:2/77: rename d4/d6 to d4/d6/d18 22
2026-03-10T12:37:37.450 INFO:tasks.workunit.client.0.vm00.stdout:2/78: fsync d4/f12 0
2026-03-10T12:37:37.453 INFO:tasks.workunit.client.1.vm07.stdout:0/306: dwrite d0/f15 [0,4194304] 0
2026-03-10T12:37:37.453 INFO:tasks.workunit.client.0.vm00.stdout:0/128: getdents d3/d1b 0
2026-03-10T12:37:37.465 INFO:tasks.workunit.client.0.vm00.stdout:6/118: truncate d2/d14/f1b 1475158 0
2026-03-10T12:37:37.467 INFO:tasks.workunit.client.1.vm07.stdout:4/250: write d0/d4/d5/d34/f37 [1857094,117790] 0
2026-03-10T12:37:37.468 INFO:tasks.workunit.client.1.vm07.stdout:4/251: write d0/d4/d5/da/f4d [929009,36189] 0
2026-03-10T12:37:37.474 INFO:tasks.workunit.client.1.vm07.stdout:7/247: dwrite d0/f3f [0,4194304] 0
2026-03-10T12:37:37.474 INFO:tasks.workunit.client.0.vm00.stdout:6/119: dread d2/d16/f19 [0,4194304] 0
2026-03-10T12:37:37.475 INFO:tasks.workunit.client.0.vm00.stdout:6/120: dread - d2/da/dc/f25 zero size
2026-03-10T12:37:37.475 INFO:tasks.workunit.client.0.vm00.stdout:6/121: fsync d2/da/dc/f25 0
2026-03-10T12:37:37.476 INFO:tasks.workunit.client.0.vm00.stdout:6/122: stat d2/d14 0
2026-03-10T12:37:37.480 INFO:tasks.workunit.client.1.vm07.stdout:4/252: symlink d0/d4/d10/l57 0
2026-03-10T12:37:37.486 INFO:tasks.workunit.client.1.vm07.stdout:4/253: mknod d0/c58 0
2026-03-10T12:37:37.486 INFO:tasks.workunit.client.1.vm07.stdout:4/254: write d0/d4/d10/d23/d46/f56 [548472,120095] 0
2026-03-10T12:37:37.487 INFO:tasks.workunit.client.1.vm07.stdout:4/255: truncate d0/d4/d5/da/f44 810091 0
2026-03-10T12:37:37.489 INFO:tasks.workunit.client.1.vm07.stdout:7/248: symlink d0/l44 0
2026-03-10T12:37:37.495 INFO:tasks.workunit.client.1.vm07.stdout:3/296: write dc/dd/f1d [501358,96088] 0
2026-03-10T12:37:37.500 INFO:tasks.workunit.client.1.vm07.stdout:3/297: write dc/dd/d28/d3b/f5b [1600226,102291] 0
2026-03-10T12:37:37.509 INFO:tasks.workunit.client.1.vm07.stdout:7/249: mknod d0/c45 0
2026-03-10T12:37:37.510 INFO:tasks.workunit.client.1.vm07.stdout:7/250: readlink d0/l44 0
2026-03-10T12:37:37.510 INFO:tasks.workunit.client.1.vm07.stdout:3/298: fdatasync dc/dd/d1f/d45/f68 0
2026-03-10T12:37:37.510 INFO:tasks.workunit.client.1.vm07.stdout:4/256: dread d0/d4/d10/d18/f1a [0,4194304] 0
2026-03-10T12:37:37.510 INFO:tasks.workunit.client.1.vm07.stdout:5/289: getdents d0/d22/d18/d19/d2e/d3f 0
2026-03-10T12:37:37.510 INFO:tasks.workunit.client.1.vm07.stdout:3/299: symlink dc/dd/d28/l6b 0
2026-03-10T12:37:37.510 INFO:tasks.workunit.client.1.vm07.stdout:8/298: link d1/d3/d6/l17 d1/d3/l62 0
2026-03-10T12:37:37.510 INFO:tasks.workunit.client.1.vm07.stdout:4/257: symlink d0/d4/d5/l59 0
2026-03-10T12:37:37.511 INFO:tasks.workunit.client.0.vm00.stdout:5/41: rename fd to fe 0
2026-03-10T12:37:37.511 INFO:tasks.workunit.client.0.vm00.stdout:7/81: symlink da/d1b/l21 0
2026-03-10T12:37:37.511 INFO:tasks.workunit.client.1.vm07.stdout:3/300: mknod dc/d18/d24/c6c 0
2026-03-10T12:37:37.512 INFO:tasks.workunit.client.1.vm07.stdout:8/299: mknod d1/d3/d6/d54/c63 0
2026-03-10T12:37:37.513 INFO:tasks.workunit.client.0.vm00.stdout:2/79: rename f3 to d4/d6/f19 0
2026-03-10T12:37:37.514 INFO:tasks.workunit.client.0.vm00.stdout:0/129: creat d3/db/d24/f2f x:0 0 0
2026-03-10T12:37:37.515 INFO:tasks.workunit.client.1.vm07.stdout:4/258: mknod d0/d4/d10/d3c/d2b/d2d/c5a 0
2026-03-10T12:37:37.516 INFO:tasks.workunit.client.1.vm07.stdout:5/290: mknod d0/c66 0
2026-03-10T12:37:37.517 INFO:tasks.workunit.client.1.vm07.stdout:5/291: chown d0/d22/d18/d19/f23 1 1
2026-03-10T12:37:37.517 INFO:tasks.workunit.client.1.vm07.stdout:3/301: rename dc/dd/d43/d5c/f5f to dc/dd/d1f/f6d 0
2026-03-10T12:37:37.518 INFO:tasks.workunit.client.1.vm07.stdout:8/300: mknod d1/d3/d11/c64 0
2026-03-10T12:37:37.518 INFO:tasks.workunit.client.0.vm00.stdout:5/42: stat cc 0
2026-03-10T12:37:37.518
INFO:tasks.workunit.client.0.vm00.stdout:5/43: dread - fe zero size 2026-03-10T12:37:37.519 INFO:tasks.workunit.client.1.vm07.stdout:4/259: mknod d0/d4/d10/d3c/c5b 0 2026-03-10T12:37:37.520 INFO:tasks.workunit.client.0.vm00.stdout:3/81: mknod dd/d18/d13/c1c 0 2026-03-10T12:37:37.521 INFO:tasks.workunit.client.1.vm07.stdout:4/260: mkdir d0/d5c 0 2026-03-10T12:37:37.521 INFO:tasks.workunit.client.0.vm00.stdout:0/130: creat d3/db/d17/f30 x:0 0 0 2026-03-10T12:37:37.522 INFO:tasks.workunit.client.0.vm00.stdout:8/71: creat d0/f13 x:0 0 0 2026-03-10T12:37:37.525 INFO:tasks.workunit.client.1.vm07.stdout:4/261: dwrite d0/d4/d10/d3c/d2b/f3b [0,4194304] 0 2026-03-10T12:37:37.525 INFO:tasks.workunit.client.0.vm00.stdout:8/72: read d0/f7 [1274547,113520] 0 2026-03-10T12:37:37.525 INFO:tasks.workunit.client.0.vm00.stdout:3/82: mkdir dd/d18/d13/d1d 0 2026-03-10T12:37:37.525 INFO:tasks.workunit.client.0.vm00.stdout:0/131: creat d3/d7/f31 x:0 0 0 2026-03-10T12:37:37.525 INFO:tasks.workunit.client.0.vm00.stdout:3/83: symlink dd/d18/d13/l1e 0 2026-03-10T12:37:37.526 INFO:tasks.workunit.client.0.vm00.stdout:3/84: write dd/d18/f12 [495055,96655] 0 2026-03-10T12:37:37.527 INFO:tasks.workunit.client.1.vm07.stdout:4/262: chown d0/d4/d10/d18/f3e 7442172 1 2026-03-10T12:37:37.528 INFO:tasks.workunit.client.1.vm07.stdout:4/263: fdatasync d0/d4/d5/f43 0 2026-03-10T12:37:37.529 INFO:tasks.workunit.client.1.vm07.stdout:4/264: chown d0/d4/d10/d18/f3e 201890187 1 2026-03-10T12:37:37.531 INFO:tasks.workunit.client.0.vm00.stdout:3/85: chown dd/cf 87114 1 2026-03-10T12:37:37.532 INFO:tasks.workunit.client.0.vm00.stdout:0/132: rename d3/lf to d3/d1b/l32 0 2026-03-10T12:37:37.533 INFO:tasks.workunit.client.0.vm00.stdout:3/86: symlink dd/l1f 0 2026-03-10T12:37:37.534 INFO:tasks.workunit.client.0.vm00.stdout:0/133: mkdir d3/d33 0 2026-03-10T12:37:37.539 INFO:tasks.workunit.client.0.vm00.stdout:0/134: chown d3/d1b/f2b 13752846 1 2026-03-10T12:37:37.539 INFO:tasks.workunit.client.0.vm00.stdout:0/135: 
readlink d3/la 0 2026-03-10T12:37:37.542 INFO:tasks.workunit.client.1.vm07.stdout:4/265: creat d0/d4/d5/d34/f5d x:0 0 0 2026-03-10T12:37:37.542 INFO:tasks.workunit.client.0.vm00.stdout:0/136: mknod d3/db/d24/d25/c34 0 2026-03-10T12:37:37.547 INFO:tasks.workunit.client.0.vm00.stdout:0/137: dwrite d3/d7/f15 [0,4194304] 0 2026-03-10T12:37:37.547 INFO:tasks.workunit.client.0.vm00.stdout:3/87: mknod dd/d18/d14/c20 0 2026-03-10T12:37:37.547 INFO:tasks.workunit.client.1.vm07.stdout:6/221: mkdir d1/d4/d4a 0 2026-03-10T12:37:37.547 INFO:tasks.workunit.client.1.vm07.stdout:1/249: mknod d9/df/c51 0 2026-03-10T12:37:37.547 INFO:tasks.workunit.client.1.vm07.stdout:4/266: symlink d0/d4/d5/d34/l5e 0 2026-03-10T12:37:37.551 INFO:tasks.workunit.client.0.vm00.stdout:0/138: mknod d3/d22/c35 0 2026-03-10T12:37:37.559 INFO:tasks.workunit.client.0.vm00.stdout:3/88: dwrite f7 [4194304,4194304] 0 2026-03-10T12:37:37.559 INFO:tasks.workunit.client.1.vm07.stdout:6/222: readlink d1/d4/d6/l32 0 2026-03-10T12:37:37.559 INFO:tasks.workunit.client.1.vm07.stdout:4/267: mkdir d0/d4/d10/d5f 0 2026-03-10T12:37:37.559 INFO:tasks.workunit.client.1.vm07.stdout:6/223: rename d1/d4/d6/l3a to d1/d4/d6/l4b 0 2026-03-10T12:37:37.562 INFO:tasks.workunit.client.1.vm07.stdout:0/307: sync 2026-03-10T12:37:37.562 INFO:tasks.workunit.client.1.vm07.stdout:0/308: chown d0/d14 59007376 1 2026-03-10T12:37:37.564 INFO:tasks.workunit.client.0.vm00.stdout:0/139: rename d3/d22/c35 to d3/db/d17/c36 0 2026-03-10T12:37:37.565 INFO:tasks.workunit.client.0.vm00.stdout:0/140: truncate d3/d1b/f2b 623821 0 2026-03-10T12:37:37.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:37 vm07.local ceph-mon[58582]: from='client.24455 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:37:37.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:37 vm07.local ceph-mon[58582]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 
up:standby 2026-03-10T12:37:37.569 INFO:tasks.workunit.client.1.vm07.stdout:6/224: mknod d1/d4/d6/d46/c4c 0 2026-03-10T12:37:37.569 INFO:tasks.workunit.client.0.vm00.stdout:0/141: dwrite d3/d1b/f2a [0,4194304] 0 2026-03-10T12:37:37.572 INFO:tasks.workunit.client.1.vm07.stdout:6/225: rename d1/d9 to d1/d4/d6/d46/d4d 0 2026-03-10T12:37:37.573 INFO:tasks.workunit.client.0.vm00.stdout:0/142: dread d3/fd [0,4194304] 0 2026-03-10T12:37:37.575 INFO:tasks.workunit.client.1.vm07.stdout:6/226: rename d1/d4/d6/d16/d47 to d1/d4/d6/d4e 0 2026-03-10T12:37:37.577 INFO:tasks.workunit.client.1.vm07.stdout:5/292: sync 2026-03-10T12:37:37.577 INFO:tasks.workunit.client.0.vm00.stdout:3/89: creat dd/d18/f21 x:0 0 0 2026-03-10T12:37:37.578 INFO:tasks.workunit.client.1.vm07.stdout:6/227: dread d1/d4/f3f [0,4194304] 0 2026-03-10T12:37:37.580 INFO:tasks.workunit.client.1.vm07.stdout:5/293: mkdir d0/d22/d18/d19/d2e/d67 0 2026-03-10T12:37:37.584 INFO:tasks.workunit.client.0.vm00.stdout:0/143: creat d3/d1b/f37 x:0 0 0 2026-03-10T12:37:37.587 INFO:tasks.workunit.client.0.vm00.stdout:3/90: creat dd/d18/d13/f22 x:0 0 0 2026-03-10T12:37:37.603 INFO:tasks.workunit.client.0.vm00.stdout:3/91: write dd/d18/f21 [963530,126213] 0 2026-03-10T12:37:37.604 INFO:tasks.workunit.client.0.vm00.stdout:3/92: symlink dd/d18/d13/d1d/l23 0 2026-03-10T12:37:37.604 INFO:tasks.workunit.client.0.vm00.stdout:0/144: dread d3/db/f16 [0,4194304] 0 2026-03-10T12:37:37.604 INFO:tasks.workunit.client.0.vm00.stdout:0/145: mkdir d3/d1b/d38 0 2026-03-10T12:37:37.604 INFO:tasks.workunit.client.0.vm00.stdout:0/146: getdents d3/d33 0 2026-03-10T12:37:37.604 INFO:tasks.workunit.client.1.vm07.stdout:5/294: unlink d0/d22/d18/d19/d2e/d3f/l46 0 2026-03-10T12:37:37.604 INFO:tasks.workunit.client.1.vm07.stdout:6/228: dwrite d1/d4/d6/f30 [0,4194304] 0 2026-03-10T12:37:37.604 INFO:tasks.workunit.client.1.vm07.stdout:2/191: creat d0/f44 x:0 0 0 2026-03-10T12:37:37.604 INFO:tasks.workunit.client.1.vm07.stdout:2/192: readlink d0/d29/l34 0 
2026-03-10T12:37:37.604 INFO:tasks.workunit.client.1.vm07.stdout:5/295: mknod d0/d22/d18/d19/d21/d3a/c68 0 2026-03-10T12:37:37.604 INFO:tasks.workunit.client.1.vm07.stdout:2/193: stat d0/cd 0 2026-03-10T12:37:37.604 INFO:tasks.workunit.client.1.vm07.stdout:5/296: symlink d0/d22/d18/d30/l69 0 2026-03-10T12:37:37.604 INFO:tasks.workunit.client.1.vm07.stdout:5/297: unlink d0/d22/d18/d30/l5f 0 2026-03-10T12:37:37.604 INFO:tasks.workunit.client.1.vm07.stdout:6/229: dwrite d1/f1e [0,4194304] 0 2026-03-10T12:37:37.607 INFO:tasks.workunit.client.1.vm07.stdout:2/194: dwrite d0/f4 [0,4194304] 0 2026-03-10T12:37:37.610 INFO:tasks.workunit.client.0.vm00.stdout:0/147: creat d3/d1b/d38/f39 x:0 0 0 2026-03-10T12:37:37.610 INFO:tasks.workunit.client.0.vm00.stdout:0/148: chown d3/d7/lc 63 1 2026-03-10T12:37:37.611 INFO:tasks.workunit.client.0.vm00.stdout:0/149: read - d3/d7/f1c zero size 2026-03-10T12:37:37.616 INFO:tasks.workunit.client.0.vm00.stdout:0/150: mkdir d3/d22/d3a 0 2026-03-10T12:37:37.617 INFO:tasks.workunit.client.0.vm00.stdout:0/151: write d3/f4 [1054737,73057] 0 2026-03-10T12:37:37.618 INFO:tasks.workunit.client.1.vm07.stdout:4/268: dread d0/d4/d5/da/f48 [0,4194304] 0 2026-03-10T12:37:37.625 INFO:tasks.workunit.client.0.vm00.stdout:0/152: dwrite d3/db/d24/f2f [0,4194304] 0 2026-03-10T12:37:37.627 INFO:tasks.workunit.client.0.vm00.stdout:0/153: fsync d3/db/f16 0 2026-03-10T12:37:37.628 INFO:tasks.workunit.client.0.vm00.stdout:0/154: chown d3/d1b/d38/f39 45 1 2026-03-10T12:37:37.628 INFO:tasks.workunit.client.0.vm00.stdout:0/155: chown d3/d7 43211568 1 2026-03-10T12:37:37.628 INFO:tasks.workunit.client.0.vm00.stdout:0/156: dread - d3/d1b/f37 zero size 2026-03-10T12:37:37.630 INFO:tasks.workunit.client.0.vm00.stdout:0/157: link d3/l9 d3/d7/l3b 0 2026-03-10T12:37:37.638 INFO:tasks.workunit.client.1.vm07.stdout:4/269: creat d0/d4/d10/d3c/d2b/f60 x:0 0 0 2026-03-10T12:37:37.638 INFO:tasks.workunit.client.1.vm07.stdout:4/270: write d0/d4/d10/d3c/f22 [902734,64153] 0 
2026-03-10T12:37:37.638 INFO:tasks.workunit.client.0.vm00.stdout:0/158: stat d3/l18 0 2026-03-10T12:37:37.638 INFO:tasks.workunit.client.0.vm00.stdout:0/159: chown d3/d1b/d38/f39 1319 1 2026-03-10T12:37:37.638 INFO:tasks.workunit.client.0.vm00.stdout:0/160: dwrite d3/d1b/f37 [0,4194304] 0 2026-03-10T12:37:37.638 INFO:tasks.workunit.client.1.vm07.stdout:4/271: write d0/d4/d5/da/f4d [1058544,104638] 0 2026-03-10T12:37:37.640 INFO:tasks.workunit.client.1.vm07.stdout:4/272: mknod d0/d4/c61 0 2026-03-10T12:37:37.645 INFO:tasks.workunit.client.1.vm07.stdout:2/195: dread d0/f15 [0,4194304] 0 2026-03-10T12:37:37.651 INFO:tasks.workunit.client.1.vm07.stdout:2/196: mkdir d0/d45 0 2026-03-10T12:37:37.651 INFO:tasks.workunit.client.1.vm07.stdout:2/197: creat d0/f46 x:0 0 0 2026-03-10T12:37:37.651 INFO:tasks.workunit.client.1.vm07.stdout:2/198: symlink d0/d42/d1f/d20/l47 0 2026-03-10T12:37:37.651 INFO:tasks.workunit.client.1.vm07.stdout:2/199: unlink d0/lf 0 2026-03-10T12:37:37.653 INFO:tasks.workunit.client.1.vm07.stdout:2/200: read d0/f18 [595882,108855] 0 2026-03-10T12:37:37.662 INFO:tasks.workunit.client.1.vm07.stdout:2/201: rmdir d0/d42/d26 39 2026-03-10T12:37:37.668 INFO:tasks.workunit.client.1.vm07.stdout:2/202: creat d0/d42/d26/f48 x:0 0 0 2026-03-10T12:37:37.670 INFO:tasks.workunit.client.1.vm07.stdout:2/203: dread d0/f4 [0,4194304] 0 2026-03-10T12:37:37.676 INFO:tasks.workunit.client.0.vm00.stdout:5/44: dread f4 [0,4194304] 0 2026-03-10T12:37:37.677 INFO:tasks.workunit.client.0.vm00.stdout:5/45: stat lb 0 2026-03-10T12:37:37.680 INFO:tasks.workunit.client.1.vm07.stdout:0/309: write d0/d14/d1a/d2f/d31/d4f/f61 [4366572,83576] 0 2026-03-10T12:37:37.684 INFO:tasks.workunit.client.1.vm07.stdout:8/301: getdents d1/d3 0 2026-03-10T12:37:37.689 INFO:tasks.workunit.client.1.vm07.stdout:0/310: unlink d0/d14/d5f/c23 0 2026-03-10T12:37:37.689 INFO:tasks.workunit.client.1.vm07.stdout:7/251: dwrite d0/f20 [0,4194304] 0 2026-03-10T12:37:37.689 
INFO:tasks.workunit.client.1.vm07.stdout:0/311: chown d0/l34 161962 1 2026-03-10T12:37:37.689 INFO:tasks.workunit.client.1.vm07.stdout:0/312: fsync d0/d14/f37 0 2026-03-10T12:37:37.691 INFO:tasks.workunit.client.1.vm07.stdout:8/302: mkdir d1/d3/d5d/d65 0 2026-03-10T12:37:37.692 INFO:tasks.workunit.client.1.vm07.stdout:8/303: read - d1/d3/d11/f47 zero size 2026-03-10T12:37:37.693 INFO:tasks.workunit.client.1.vm07.stdout:7/252: rename d0/l44 to d0/l46 0 2026-03-10T12:37:37.694 INFO:tasks.workunit.client.1.vm07.stdout:7/253: stat d0/f3c 0 2026-03-10T12:37:37.699 INFO:tasks.workunit.client.1.vm07.stdout:7/254: dwrite d0/f3b [0,4194304] 0 2026-03-10T12:37:37.705 INFO:tasks.workunit.client.1.vm07.stdout:9/211: dwrite d5/d1f/d31/f43 [0,4194304] 0 2026-03-10T12:37:37.706 INFO:tasks.workunit.client.1.vm07.stdout:9/212: creat d5/d13/d2c/f44 x:0 0 0 2026-03-10T12:37:37.706 INFO:tasks.workunit.client.1.vm07.stdout:7/255: mkdir d0/d47 0 2026-03-10T12:37:37.706 INFO:tasks.workunit.client.1.vm07.stdout:9/213: chown d5/d16/d18/f20 2685016 1 2026-03-10T12:37:37.708 INFO:tasks.workunit.client.1.vm07.stdout:7/256: mkdir d0/d47/d48 0 2026-03-10T12:37:37.709 INFO:tasks.workunit.client.1.vm07.stdout:9/214: link d5/fb d5/f45 0 2026-03-10T12:37:37.713 INFO:tasks.workunit.client.1.vm07.stdout:9/215: creat d5/d16/d23/d26/f46 x:0 0 0 2026-03-10T12:37:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:37 vm00.local ceph-mon[50686]: from='client.24455 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:37:37.738 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:37 vm00.local ceph-mon[50686]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:37:37.769 INFO:tasks.workunit.client.0.vm00.stdout:5/46: sync 2026-03-10T12:37:37.770 INFO:tasks.workunit.client.1.vm07.stdout:9/216: sync 2026-03-10T12:37:37.770 INFO:tasks.workunit.client.0.vm00.stdout:5/47: creat ff x:0 
0 0 2026-03-10T12:37:37.771 INFO:tasks.workunit.client.1.vm07.stdout:9/217: fsync d5/d16/d18/f1e 0 2026-03-10T12:37:37.771 INFO:tasks.workunit.client.0.vm00.stdout:5/48: mknod c10 0 2026-03-10T12:37:37.772 INFO:tasks.workunit.client.0.vm00.stdout:5/49: read f4 [510349,38247] 0 2026-03-10T12:37:37.772 INFO:tasks.workunit.client.0.vm00.stdout:5/50: fdatasync ff 0 2026-03-10T12:37:37.792 INFO:tasks.workunit.client.1.vm07.stdout:0/313: dread d0/d14/f19 [0,4194304] 0 2026-03-10T12:37:37.793 INFO:tasks.workunit.client.1.vm07.stdout:9/218: dread d5/d13/f2b [0,4194304] 0 2026-03-10T12:37:37.793 INFO:tasks.workunit.client.0.vm00.stdout:1/51: dwrite f5 [0,4194304] 0 2026-03-10T12:37:37.793 INFO:tasks.workunit.client.1.vm07.stdout:0/314: chown d0/d14/d1a/d2f 46405663 1 2026-03-10T12:37:37.795 INFO:tasks.workunit.client.0.vm00.stdout:1/52: dread f4 [0,4194304] 0 2026-03-10T12:37:37.797 INFO:tasks.workunit.client.1.vm07.stdout:9/219: symlink d5/d13/l47 0 2026-03-10T12:37:37.798 INFO:tasks.workunit.client.1.vm07.stdout:0/315: mknod d0/d14/d1a/d2f/d31/d4f/c64 0 2026-03-10T12:37:37.798 INFO:tasks.workunit.client.0.vm00.stdout:1/53: creat da/f14 x:0 0 0 2026-03-10T12:37:37.798 INFO:tasks.workunit.client.0.vm00.stdout:1/54: stat da/f13 0 2026-03-10T12:37:37.799 INFO:tasks.workunit.client.1.vm07.stdout:0/316: write d0/d14/d1a/f27 [7285131,42281] 0 2026-03-10T12:37:37.800 INFO:tasks.workunit.client.0.vm00.stdout:1/55: symlink da/d12/l15 0 2026-03-10T12:37:37.800 INFO:tasks.workunit.client.1.vm07.stdout:0/317: write d0/d14/d5f/d41/f55 [351877,85698] 0 2026-03-10T12:37:37.802 INFO:tasks.workunit.client.0.vm00.stdout:1/56: dread f4 [0,4194304] 0 2026-03-10T12:37:37.803 INFO:tasks.workunit.client.1.vm07.stdout:0/318: chown d0/c2 42593 1 2026-03-10T12:37:37.803 INFO:tasks.workunit.client.0.vm00.stdout:1/57: dread f5 [0,4194304] 0 2026-03-10T12:37:37.805 INFO:tasks.workunit.client.1.vm07.stdout:9/220: link d5/c11 d5/d16/c48 0 2026-03-10T12:37:37.806 
INFO:tasks.workunit.client.0.vm00.stdout:6/123: rmdir d2 39 2026-03-10T12:37:37.810 INFO:tasks.workunit.client.0.vm00.stdout:5/51: sync 2026-03-10T12:37:37.811 INFO:tasks.workunit.client.1.vm07.stdout:9/221: symlink d5/d13/d2c/d2f/d3e/l49 0 2026-03-10T12:37:37.811 INFO:tasks.workunit.client.0.vm00.stdout:5/52: write ff [379810,13118] 0 2026-03-10T12:37:37.821 INFO:tasks.workunit.client.0.vm00.stdout:1/58: rename da/lb to da/l16 0 2026-03-10T12:37:37.829 INFO:tasks.workunit.client.0.vm00.stdout:5/53: dwrite fe [0,4194304] 0 2026-03-10T12:37:37.832 INFO:tasks.workunit.client.0.vm00.stdout:1/59: mknod da/d12/c17 0 2026-03-10T12:37:37.833 INFO:tasks.workunit.client.0.vm00.stdout:6/124: symlink d2/d14/l2d 0 2026-03-10T12:37:37.834 INFO:tasks.workunit.client.0.vm00.stdout:1/60: chown da/f13 13532 1 2026-03-10T12:37:37.835 INFO:tasks.workunit.client.0.vm00.stdout:1/61: chown da/f14 1865708518 1 2026-03-10T12:37:37.835 INFO:tasks.workunit.client.1.vm07.stdout:1/250: dwrite d9/df/d29/d2b/d31/f3c [0,4194304] 0 2026-03-10T12:37:37.836 INFO:tasks.workunit.client.0.vm00.stdout:1/62: read - da/f13 zero size 2026-03-10T12:37:37.836 INFO:tasks.workunit.client.1.vm07.stdout:1/251: dread - d9/df/d29/d2b/d3d/f43 zero size 2026-03-10T12:37:37.836 INFO:tasks.workunit.client.0.vm00.stdout:5/54: dwrite ff [0,4194304] 0 2026-03-10T12:37:37.838 INFO:tasks.workunit.client.0.vm00.stdout:6/125: rmdir d2/d14 39 2026-03-10T12:37:37.839 INFO:tasks.workunit.client.0.vm00.stdout:6/126: dread - d2/d16/f2a zero size 2026-03-10T12:37:37.839 INFO:tasks.workunit.client.0.vm00.stdout:5/55: write ff [1929149,22718] 0 2026-03-10T12:37:37.840 INFO:tasks.workunit.client.0.vm00.stdout:5/56: chown ca 189467940 1 2026-03-10T12:37:37.840 INFO:tasks.workunit.client.0.vm00.stdout:6/127: write d2/f9 [492208,58016] 0 2026-03-10T12:37:37.840 INFO:tasks.workunit.client.0.vm00.stdout:5/57: stat l9 0 2026-03-10T12:37:37.841 INFO:tasks.workunit.client.1.vm07.stdout:1/252: creat d9/f52 x:0 0 0 2026-03-10T12:37:37.844 
INFO:tasks.workunit.client.0.vm00.stdout:5/58: rename f4 to f11 0 2026-03-10T12:37:37.855 INFO:tasks.workunit.client.1.vm07.stdout:3/302: read dc/dd/f22 [210745,15483] 0 2026-03-10T12:37:37.865 INFO:tasks.workunit.client.1.vm07.stdout:3/303: mknod dc/dd/c6e 0 2026-03-10T12:37:37.865 INFO:tasks.workunit.client.0.vm00.stdout:6/128: creat d2/d14/f2e x:0 0 0 2026-03-10T12:37:37.866 INFO:tasks.workunit.client.1.vm07.stdout:3/304: mkdir dc/dd/d1f/d6f 0 2026-03-10T12:37:37.866 INFO:tasks.workunit.client.1.vm07.stdout:3/305: chown dc/d18/d24/l66 833197 1 2026-03-10T12:37:37.866 INFO:tasks.workunit.client.0.vm00.stdout:5/59: creat f12 x:0 0 0 2026-03-10T12:37:37.868 INFO:tasks.workunit.client.1.vm07.stdout:3/306: creat dc/dd/d28/d3b/f70 x:0 0 0 2026-03-10T12:37:37.868 INFO:tasks.workunit.client.1.vm07.stdout:3/307: stat dc/f17 0 2026-03-10T12:37:37.873 INFO:tasks.workunit.client.1.vm07.stdout:3/308: dwrite dc/dd/d28/f67 [0,4194304] 0 2026-03-10T12:37:37.880 INFO:tasks.workunit.client.1.vm07.stdout:3/309: creat dc/d18/d2d/f71 x:0 0 0 2026-03-10T12:37:37.880 INFO:tasks.workunit.client.1.vm07.stdout:3/310: write dc/dd/d28/d3b/f70 [707715,82696] 0 2026-03-10T12:37:37.882 INFO:tasks.workunit.client.1.vm07.stdout:3/311: mkdir dc/d18/d24/d72 0 2026-03-10T12:37:37.897 INFO:tasks.workunit.client.1.vm07.stdout:5/298: write d0/d22/f50 [1250951,123228] 0 2026-03-10T12:37:37.898 INFO:tasks.workunit.client.0.vm00.stdout:6/129: mkdir d2/da/dc/d2f 0 2026-03-10T12:37:37.903 INFO:tasks.workunit.client.1.vm07.stdout:5/299: dwrite d0/f47 [0,4194304] 0 2026-03-10T12:37:37.906 INFO:tasks.workunit.client.1.vm07.stdout:5/300: write d0/d22/d18/d19/d21/f37 [311546,65938] 0 2026-03-10T12:37:37.907 INFO:tasks.workunit.client.1.vm07.stdout:6/230: rename d1/d4/d6/l4b to d1/d4/d6/d16/d1a/l4f 0 2026-03-10T12:37:37.917 INFO:tasks.workunit.client.0.vm00.stdout:5/60: rename ca to c13 0 2026-03-10T12:37:37.917 INFO:tasks.workunit.client.0.vm00.stdout:5/61: chown l9 168206 1 2026-03-10T12:37:37.921 
INFO:tasks.workunit.client.0.vm00.stdout:5/62: write f11 [668041,114471] 0 2026-03-10T12:37:37.922 INFO:tasks.workunit.client.0.vm00.stdout:5/63: dread - f12 zero size 2026-03-10T12:37:37.924 INFO:tasks.workunit.client.0.vm00.stdout:6/130: dwrite d2/d16/f2a [0,4194304] 0 2026-03-10T12:37:37.942 INFO:tasks.workunit.client.0.vm00.stdout:6/131: dread d2/d14/f24 [0,4194304] 0 2026-03-10T12:37:37.942 INFO:tasks.workunit.client.0.vm00.stdout:6/132: dread - d2/d16/f23 zero size 2026-03-10T12:37:37.942 INFO:tasks.workunit.client.0.vm00.stdout:6/133: write d2/d14/f2e [486135,42629] 0 2026-03-10T12:37:37.945 INFO:tasks.workunit.client.0.vm00.stdout:6/134: creat d2/f30 x:0 0 0 2026-03-10T12:37:37.946 INFO:tasks.workunit.client.0.vm00.stdout:6/135: stat d2/c5 0 2026-03-10T12:37:37.953 INFO:tasks.workunit.client.0.vm00.stdout:2/80: unlink d4/d6/f19 0 2026-03-10T12:37:37.954 INFO:tasks.workunit.client.1.vm07.stdout:8/304: write d1/d3/d40/f41 [5079516,130557] 0 2026-03-10T12:37:37.955 INFO:tasks.workunit.client.1.vm07.stdout:8/305: write d1/d3/d5d/f5f [335733,1201] 0 2026-03-10T12:37:37.955 INFO:tasks.workunit.client.1.vm07.stdout:8/306: stat d1/d3/f57 0 2026-03-10T12:37:37.957 INFO:tasks.workunit.client.1.vm07.stdout:9/222: fsync d5/fb 0 2026-03-10T12:37:37.957 INFO:tasks.workunit.client.1.vm07.stdout:8/307: dread d1/f48 [0,4194304] 0 2026-03-10T12:37:37.957 INFO:tasks.workunit.client.0.vm00.stdout:2/81: dwrite d4/d6/f16 [4194304,4194304] 0 2026-03-10T12:37:37.958 INFO:tasks.workunit.client.1.vm07.stdout:8/308: dread - d1/d3/d11/f3c zero size 2026-03-10T12:37:37.962 INFO:tasks.workunit.client.0.vm00.stdout:2/82: symlink d4/l1a 0 2026-03-10T12:37:37.963 INFO:tasks.workunit.client.0.vm00.stdout:7/82: chown da/f15 472630827 1 2026-03-10T12:37:37.963 INFO:tasks.workunit.client.0.vm00.stdout:2/83: read d4/d6/fb [148876,53851] 0 2026-03-10T12:37:37.964 INFO:tasks.workunit.client.0.vm00.stdout:2/84: read - d4/dd/ff zero size 2026-03-10T12:37:37.966 
INFO:tasks.workunit.client.1.vm07.stdout:7/257: truncate d0/f21 2789324 0 2026-03-10T12:37:37.966 INFO:tasks.workunit.client.0.vm00.stdout:2/85: creat d4/dd/f1b x:0 0 0 2026-03-10T12:37:37.968 INFO:tasks.workunit.client.0.vm00.stdout:9/58: truncate d0/dd/fe 3522495 0 2026-03-10T12:37:37.969 INFO:tasks.workunit.client.0.vm00.stdout:7/83: fdatasync da/fb 0 2026-03-10T12:37:37.970 INFO:tasks.workunit.client.0.vm00.stdout:2/86: unlink d4/dd/f1b 0 2026-03-10T12:37:37.971 INFO:tasks.workunit.client.0.vm00.stdout:4/67: truncate f9 200612 0 2026-03-10T12:37:37.971 INFO:tasks.workunit.client.1.vm07.stdout:4/273: rename d0/d4/d10/d3c/d2b/f3b to d0/d4/d10/d3c/d2b/d54/f62 0 2026-03-10T12:37:37.972 INFO:tasks.workunit.client.1.vm07.stdout:6/231: dread d1/d4/f19 [0,4194304] 0 2026-03-10T12:37:37.973 INFO:tasks.workunit.client.1.vm07.stdout:2/204: chown d0/d42/d26/f3e 131358567 1 2026-03-10T12:37:37.977 INFO:tasks.workunit.client.1.vm07.stdout:7/258: symlink d0/d47/l49 0 2026-03-10T12:37:37.978 INFO:tasks.workunit.client.0.vm00.stdout:2/87: symlink d4/dd/l1c 0 2026-03-10T12:37:37.994 INFO:tasks.workunit.client.0.vm00.stdout:8/73: getdents d0 0 2026-03-10T12:37:37.994 INFO:tasks.workunit.client.0.vm00.stdout:8/74: dread d0/f8 [0,4194304] 0 2026-03-10T12:37:37.994 INFO:tasks.workunit.client.0.vm00.stdout:8/75: symlink d0/l14 0 2026-03-10T12:37:37.994 INFO:tasks.workunit.client.0.vm00.stdout:8/76: dread - d0/f13 zero size 2026-03-10T12:37:37.994 INFO:tasks.workunit.client.0.vm00.stdout:8/77: truncate d0/f13 793520 0 2026-03-10T12:37:37.994 INFO:tasks.workunit.client.0.vm00.stdout:0/161: rename d3/db/d17 to d3/d7/d3c 0 2026-03-10T12:37:37.994 INFO:tasks.workunit.client.0.vm00.stdout:0/162: symlink d3/d7/d3c/l3d 0 2026-03-10T12:37:37.995 INFO:tasks.workunit.client.1.vm07.stdout:0/319: rename d0/d14/d5f/d41/d4e to d0/d62/d65 0 2026-03-10T12:37:37.995 INFO:tasks.workunit.client.1.vm07.stdout:0/320: write d0/d14/d1a/d2f/d31/f4d [2169049,10412] 0 2026-03-10T12:37:37.995 
INFO:tasks.workunit.client.1.vm07.stdout:6/232: creat d1/d4/d6/d16/f50 x:0 0 0 2026-03-10T12:37:37.995 INFO:tasks.workunit.client.1.vm07.stdout:6/233: stat d1/d4/d6/d46/c4c 0 2026-03-10T12:37:37.995 INFO:tasks.workunit.client.1.vm07.stdout:6/234: chown d1/d4/d6/d16/d1a/d33/f3c 0 1 2026-03-10T12:37:37.995 INFO:tasks.workunit.client.1.vm07.stdout:9/223: symlink d5/d1f/l4a 0 2026-03-10T12:37:37.995 INFO:tasks.workunit.client.1.vm07.stdout:9/224: truncate d5/d1f/d31/f43 4696411 0 2026-03-10T12:37:37.995 INFO:tasks.workunit.client.1.vm07.stdout:5/301: rename d0/d22/f5b to d0/d22/d18/d19/d2e/d3f/f6a 0 2026-03-10T12:37:37.995 INFO:tasks.workunit.client.1.vm07.stdout:0/321: rmdir d0/d14/d5f/d41 39 2026-03-10T12:37:37.995 INFO:tasks.workunit.client.1.vm07.stdout:4/274: creat d0/d4/d10/d5f/f63 x:0 0 0 2026-03-10T12:37:37.995 INFO:tasks.workunit.client.1.vm07.stdout:4/275: dread - d0/d4/d10/d5f/f63 zero size 2026-03-10T12:37:37.996 INFO:tasks.workunit.client.0.vm00.stdout:7/84: rename da/f1a to da/d1b/f22 0 2026-03-10T12:37:37.998 INFO:tasks.workunit.client.1.vm07.stdout:4/276: dwrite d0/d4/d10/d3c/d2b/f60 [0,4194304] 0 2026-03-10T12:37:37.998 INFO:tasks.workunit.client.0.vm00.stdout:0/163: rename d3/d1b/l21 to d3/d33/l3e 0 2026-03-10T12:37:37.999 INFO:tasks.workunit.client.0.vm00.stdout:3/93: getdents dd/d18/d13/d1d 0 2026-03-10T12:37:38.001 INFO:tasks.workunit.client.0.vm00.stdout:3/94: write dd/d18/d13/f22 [532435,1416] 0 2026-03-10T12:37:38.002 INFO:tasks.workunit.client.1.vm07.stdout:8/309: link d1/c49 d1/d3/d6/d50/c66 0 2026-03-10T12:37:38.003 INFO:tasks.workunit.client.1.vm07.stdout:7/259: mknod d0/d47/d48/c4a 0 2026-03-10T12:37:38.003 INFO:tasks.workunit.client.0.vm00.stdout:3/95: symlink dd/d18/d13/d1d/l24 0 2026-03-10T12:37:38.009 INFO:tasks.workunit.client.1.vm07.stdout:7/260: fdatasync d0/f1e 0 2026-03-10T12:37:38.011 INFO:tasks.workunit.client.1.vm07.stdout:6/235: dwrite d1/f3d [0,4194304] 0 2026-03-10T12:37:38.016 INFO:tasks.workunit.client.1.vm07.stdout:0/322: 
mknod d0/d14/d1a/d2f/d31/d4f/d60/c66 0 2026-03-10T12:37:38.016 INFO:tasks.workunit.client.1.vm07.stdout:7/261: dwrite d0/f20 [0,4194304] 0 2026-03-10T12:37:38.017 INFO:tasks.workunit.client.1.vm07.stdout:0/323: write d0/d14/d1a/f3d [4402955,33022] 0 2026-03-10T12:37:38.019 INFO:tasks.workunit.client.1.vm07.stdout:0/324: dread - d0/d14/d1a/d2f/d31/d4f/f5c zero size 2026-03-10T12:37:38.023 INFO:tasks.workunit.client.1.vm07.stdout:7/262: dread d0/f39 [0,4194304] 0 2026-03-10T12:37:38.026 INFO:tasks.workunit.client.1.vm07.stdout:6/236: dwrite d1/d4/d6/d16/d1a/d33/f37 [0,4194304] 0 2026-03-10T12:37:38.044 INFO:tasks.workunit.client.1.vm07.stdout:8/310: creat d1/d3/d5d/d65/f67 x:0 0 0 2026-03-10T12:37:38.044 INFO:tasks.workunit.client.1.vm07.stdout:2/205: dread d0/d42/f22 [0,4194304] 0 2026-03-10T12:37:38.044 INFO:tasks.workunit.client.1.vm07.stdout:7/263: unlink d0/f30 0 2026-03-10T12:37:38.044 INFO:tasks.workunit.client.1.vm07.stdout:0/325: rename d0/d14/d1a/d2f/d31/c51 to d0/d14/d1a/d2f/d31/c67 0 2026-03-10T12:37:38.044 INFO:tasks.workunit.client.1.vm07.stdout:7/264: chown d0/f2f 16462394 1 2026-03-10T12:37:38.044 INFO:tasks.workunit.client.1.vm07.stdout:7/265: creat d0/d47/d48/f4b x:0 0 0 2026-03-10T12:37:38.046 INFO:tasks.workunit.client.1.vm07.stdout:6/237: getdents d1 0 2026-03-10T12:37:38.046 INFO:tasks.workunit.client.1.vm07.stdout:7/266: creat d0/d47/d48/f4c x:0 0 0 2026-03-10T12:37:38.047 INFO:tasks.workunit.client.1.vm07.stdout:6/238: write d1/d4/d6/d46/d4d/f22 [1455162,84868] 0 2026-03-10T12:37:38.049 INFO:tasks.workunit.client.1.vm07.stdout:0/326: rename d0/l1f to d0/d14/d5f/l68 0 2026-03-10T12:37:38.052 INFO:tasks.workunit.client.1.vm07.stdout:7/267: link d0/f3 d0/f4d 0 2026-03-10T12:37:38.053 INFO:tasks.workunit.client.1.vm07.stdout:7/268: creat d0/f4e x:0 0 0 2026-03-10T12:37:38.059 INFO:tasks.workunit.client.0.vm00.stdout:1/63: fdatasync f5 0 2026-03-10T12:37:38.069 INFO:tasks.workunit.client.1.vm07.stdout:1/253: write d9/fc [239925,17514] 0 
2026-03-10T12:37:38.073 INFO:tasks.workunit.client.1.vm07.stdout:1/254: dwrite d9/df/d29/d2b/d3d/f43 [0,4194304] 0 2026-03-10T12:37:38.074 INFO:tasks.workunit.client.0.vm00.stdout:6/136: dwrite d2/d14/f1b [0,4194304] 0 2026-03-10T12:37:38.077 INFO:tasks.workunit.client.1.vm07.stdout:1/255: truncate d9/f1f 832086 0 2026-03-10T12:37:38.078 INFO:tasks.workunit.client.1.vm07.stdout:1/256: stat d9/df/d29/d2b/d30 0 2026-03-10T12:37:38.079 INFO:tasks.workunit.client.1.vm07.stdout:1/257: symlink d9/df/d29/d2b/d31/l53 0 2026-03-10T12:37:38.080 INFO:tasks.workunit.client.0.vm00.stdout:6/137: chown d2/da/dc/f28 3 1 2026-03-10T12:37:38.083 INFO:tasks.workunit.client.1.vm07.stdout:1/258: dread d9/f1a [0,4194304] 0 2026-03-10T12:37:38.087 INFO:tasks.workunit.client.1.vm07.stdout:1/259: dwrite d9/fc [0,4194304] 0 2026-03-10T12:37:38.089 INFO:tasks.workunit.client.0.vm00.stdout:0/164: sync 2026-03-10T12:37:38.089 INFO:tasks.workunit.client.0.vm00.stdout:8/78: sync 2026-03-10T12:37:38.093 INFO:tasks.workunit.client.1.vm07.stdout:1/260: dwrite d9/fc [0,4194304] 0 2026-03-10T12:37:38.095 INFO:tasks.workunit.client.1.vm07.stdout:1/261: read - d9/df/d29/d2b/d31/f35 zero size 2026-03-10T12:37:38.106 INFO:tasks.workunit.client.1.vm07.stdout:1/262: mkdir d9/df/d54 0 2026-03-10T12:37:38.106 INFO:tasks.workunit.client.1.vm07.stdout:1/263: mkdir d9/df/d55 0 2026-03-10T12:37:38.106 INFO:tasks.workunit.client.1.vm07.stdout:1/264: stat d9/df/d29/d2c 0 2026-03-10T12:37:38.106 INFO:tasks.workunit.client.0.vm00.stdout:6/138: mkdir d2/d16/d29/d31 0 2026-03-10T12:37:38.106 INFO:tasks.workunit.client.0.vm00.stdout:8/79: dwrite d0/f10 [0,4194304] 0 2026-03-10T12:37:38.106 INFO:tasks.workunit.client.0.vm00.stdout:0/165: link d3/d1b/f2b d3/db/d24/d25/f3f 0 2026-03-10T12:37:38.106 INFO:tasks.workunit.client.0.vm00.stdout:0/166: write d3/d1b/f37 [2638348,111082] 0 2026-03-10T12:37:38.106 INFO:tasks.workunit.client.0.vm00.stdout:0/167: write d3/d7/f31 [593784,53570] 0 2026-03-10T12:37:38.106 
INFO:tasks.workunit.client.0.vm00.stdout:0/168: fdatasync d3/d7/f15 0 2026-03-10T12:37:38.106 INFO:tasks.workunit.client.0.vm00.stdout:6/139: truncate d2/d16/f1c 269207 0 2026-03-10T12:37:38.112 INFO:tasks.workunit.client.0.vm00.stdout:6/140: truncate d2/d16/f19 1513652 0 2026-03-10T12:37:38.112 INFO:tasks.workunit.client.1.vm07.stdout:4/277: sync 2026-03-10T12:37:38.115 INFO:tasks.workunit.client.0.vm00.stdout:8/80: rename d0/la to d0/d12/l15 0 2026-03-10T12:37:38.116 INFO:tasks.workunit.client.0.vm00.stdout:6/141: unlink d2/d14/f24 0 2026-03-10T12:37:38.117 INFO:tasks.workunit.client.0.vm00.stdout:6/142: write d2/d14/f2e [22304,130196] 0 2026-03-10T12:37:38.117 INFO:tasks.workunit.client.0.vm00.stdout:5/64: fdatasync f11 0 2026-03-10T12:37:38.117 INFO:tasks.workunit.client.0.vm00.stdout:5/65: write ff [1160231,122516] 0 2026-03-10T12:37:38.118 INFO:tasks.workunit.client.0.vm00.stdout:5/66: chown f12 15 1 2026-03-10T12:37:38.119 INFO:tasks.workunit.client.0.vm00.stdout:8/81: dread d0/f8 [0,4194304] 0 2026-03-10T12:37:38.121 INFO:tasks.workunit.client.0.vm00.stdout:6/143: unlink d2/c8 0 2026-03-10T12:37:38.123 INFO:tasks.workunit.client.1.vm07.stdout:4/278: dwrite d0/f53 [0,4194304] 0 2026-03-10T12:37:38.123 INFO:tasks.workunit.client.1.vm07.stdout:0/327: dread d0/d62/d65/f56 [0,4194304] 0 2026-03-10T12:37:38.124 INFO:tasks.workunit.client.1.vm07.stdout:0/328: fdatasync d0/f15 0 2026-03-10T12:37:38.125 INFO:tasks.workunit.client.0.vm00.stdout:6/144: dwrite d2/d16/f20 [0,4194304] 0 2026-03-10T12:37:38.127 INFO:tasks.workunit.client.1.vm07.stdout:4/279: symlink d0/d4/d5/d34/l64 0 2026-03-10T12:37:38.130 INFO:tasks.workunit.client.1.vm07.stdout:4/280: creat d0/d4/d10/d3c/d2b/d2d/f65 x:0 0 0 2026-03-10T12:37:38.136 INFO:tasks.workunit.client.0.vm00.stdout:6/145: creat d2/d14/f32 x:0 0 0 2026-03-10T12:37:38.136 INFO:tasks.workunit.client.0.vm00.stdout:6/146: mknod d2/da/dc/c33 0 2026-03-10T12:37:38.136 INFO:tasks.workunit.client.1.vm07.stdout:4/281: write d0/d4/d10/f4b 
[1194484,101688] 0 2026-03-10T12:37:38.136 INFO:tasks.workunit.client.1.vm07.stdout:4/282: readlink d0/d4/d5/da/l17 0 2026-03-10T12:37:38.136 INFO:tasks.workunit.client.1.vm07.stdout:4/283: dread - d0/d4/d10/d3c/d2b/d2d/f65 zero size 2026-03-10T12:37:38.137 INFO:tasks.workunit.client.1.vm07.stdout:4/284: dwrite d0/d4/d10/d3c/f22 [0,4194304] 0 2026-03-10T12:37:38.137 INFO:tasks.workunit.client.0.vm00.stdout:6/147: dwrite d2/da/dc/f28 [0,4194304] 0 2026-03-10T12:37:38.141 INFO:tasks.workunit.client.0.vm00.stdout:6/148: mkdir d2/d16/d29/d31/d34 0 2026-03-10T12:37:38.142 INFO:tasks.workunit.client.0.vm00.stdout:6/149: readlink d2/da/dc/l12 0 2026-03-10T12:37:38.152 INFO:tasks.workunit.client.0.vm00.stdout:8/82: sync 2026-03-10T12:37:38.156 INFO:tasks.workunit.client.1.vm07.stdout:3/312: truncate dc/dd/d1f/f27 702987 0 2026-03-10T12:37:38.161 INFO:tasks.workunit.client.0.vm00.stdout:8/83: creat d0/f16 x:0 0 0 2026-03-10T12:37:38.161 INFO:tasks.workunit.client.1.vm07.stdout:3/313: creat dc/d18/d2d/d3d/f73 x:0 0 0 2026-03-10T12:37:38.161 INFO:tasks.workunit.client.1.vm07.stdout:3/314: mknod dc/d18/c74 0 2026-03-10T12:37:38.164 INFO:tasks.workunit.client.1.vm07.stdout:3/315: dwrite dc/dd/d28/d3b/f4c [0,4194304] 0 2026-03-10T12:37:38.174 INFO:tasks.workunit.client.0.vm00.stdout:8/84: mkdir d0/d12/d17 0 2026-03-10T12:37:38.175 INFO:tasks.workunit.client.1.vm07.stdout:3/316: fdatasync dc/dd/d28/d3b/f4d 0 2026-03-10T12:37:38.195 INFO:tasks.workunit.client.0.vm00.stdout:5/67: dread f11 [0,4194304] 0 2026-03-10T12:37:38.198 INFO:tasks.workunit.client.0.vm00.stdout:1/64: dread da/f11 [0,4194304] 0 2026-03-10T12:37:38.199 INFO:tasks.workunit.client.0.vm00.stdout:1/65: write f4 [3338235,18745] 0 2026-03-10T12:37:38.203 INFO:tasks.workunit.client.0.vm00.stdout:5/68: mknod c14 0 2026-03-10T12:37:38.205 INFO:tasks.workunit.client.0.vm00.stdout:1/66: rename da/ld to da/d12/l18 0 2026-03-10T12:37:38.207 INFO:tasks.workunit.client.0.vm00.stdout:5/69: dwrite f11 [0,4194304] 0 
2026-03-10T12:37:38.209 INFO:tasks.workunit.client.0.vm00.stdout:1/67: rename c7 to da/c19 0 2026-03-10T12:37:38.211 INFO:tasks.workunit.client.0.vm00.stdout:1/68: creat da/d12/f1a x:0 0 0 2026-03-10T12:37:38.213 INFO:tasks.workunit.client.1.vm07.stdout:6/239: sync 2026-03-10T12:37:38.213 INFO:tasks.workunit.client.1.vm07.stdout:1/265: sync 2026-03-10T12:37:38.214 INFO:tasks.workunit.client.1.vm07.stdout:4/285: sync 2026-03-10T12:37:38.214 INFO:tasks.workunit.client.1.vm07.stdout:3/317: sync 2026-03-10T12:37:38.217 INFO:tasks.workunit.client.0.vm00.stdout:1/69: rename da/f11 to da/d12/f1b 0 2026-03-10T12:37:38.218 INFO:tasks.workunit.client.1.vm07.stdout:4/286: dwrite d0/d4/d10/d23/f27 [0,4194304] 0 2026-03-10T12:37:38.224 INFO:tasks.workunit.client.0.vm00.stdout:1/70: mknod da/d12/c1c 0 2026-03-10T12:37:38.234 INFO:tasks.workunit.client.0.vm00.stdout:1/71: link da/d12/f1a da/d12/f1d 0 2026-03-10T12:37:38.234 INFO:tasks.workunit.client.0.vm00.stdout:1/72: stat da/c10 0 2026-03-10T12:37:38.240 INFO:tasks.workunit.client.0.vm00.stdout:2/88: truncate d4/d6/f16 5974259 0 2026-03-10T12:37:38.242 INFO:tasks.workunit.client.0.vm00.stdout:2/89: creat d4/f1d x:0 0 0 2026-03-10T12:37:38.268 INFO:tasks.workunit.client.0.vm00.stdout:2/90: sync 2026-03-10T12:37:38.268 INFO:tasks.workunit.client.0.vm00.stdout:2/91: chown d4/l1a 93336231 1 2026-03-10T12:37:38.281 INFO:tasks.workunit.client.1.vm07.stdout:1/266: rename c3 to d9/df/d29/d2b/d3d/c56 0 2026-03-10T12:37:38.284 INFO:tasks.workunit.client.1.vm07.stdout:6/240: creat d1/d4/d6/d4e/f51 x:0 0 0 2026-03-10T12:37:38.284 INFO:tasks.workunit.client.1.vm07.stdout:3/318: rename dc/d18/c4f to dc/dd/d43/c75 0 2026-03-10T12:37:38.284 INFO:tasks.workunit.client.1.vm07.stdout:3/319: chown dc/dd/d28/d3b/f4c 13 1 2026-03-10T12:37:38.286 INFO:tasks.workunit.client.1.vm07.stdout:3/320: dwrite dc/d18/f34 [0,4194304] 0 2026-03-10T12:37:38.288 INFO:tasks.workunit.client.0.vm00.stdout:2/92: unlink d4/f14 0 2026-03-10T12:37:38.293 
INFO:tasks.workunit.client.0.vm00.stdout:2/93: symlink d4/d6/l1e 0 2026-03-10T12:37:38.293 INFO:tasks.workunit.client.0.vm00.stdout:2/94: chown d4/d6/cc 60138 1 2026-03-10T12:37:38.293 INFO:tasks.workunit.client.0.vm00.stdout:2/95: readlink d4/d6/l7 0 2026-03-10T12:37:38.299 INFO:tasks.workunit.client.0.vm00.stdout:7/85: dread - da/f17 zero size 2026-03-10T12:37:38.306 INFO:tasks.workunit.client.1.vm07.stdout:4/287: unlink d0/d4/d10/d3c/d2b/d54/f62 0 2026-03-10T12:37:38.308 INFO:tasks.workunit.client.0.vm00.stdout:7/86: dwrite da/d1b/f22 [0,4194304] 0 2026-03-10T12:37:38.320 INFO:tasks.workunit.client.0.vm00.stdout:7/87: creat da/f23 x:0 0 0 2026-03-10T12:37:38.322 INFO:tasks.workunit.client.0.vm00.stdout:7/88: rename da/l18 to da/d1b/l24 0 2026-03-10T12:37:38.323 INFO:tasks.workunit.client.0.vm00.stdout:7/89: mkdir da/d25 0 2026-03-10T12:37:38.324 INFO:tasks.workunit.client.0.vm00.stdout:7/90: dread - da/d1b/f1e zero size 2026-03-10T12:37:38.328 INFO:tasks.workunit.client.0.vm00.stdout:7/91: dwrite f9 [0,4194304] 0 2026-03-10T12:37:38.335 INFO:tasks.workunit.client.0.vm00.stdout:7/92: mkdir da/d26 0 2026-03-10T12:37:38.336 INFO:tasks.workunit.client.0.vm00.stdout:7/93: write da/fb [1887894,95707] 0 2026-03-10T12:37:38.338 INFO:tasks.workunit.client.0.vm00.stdout:7/94: rename da/d1b/f1d to da/d26/f27 0 2026-03-10T12:37:38.340 INFO:tasks.workunit.client.0.vm00.stdout:7/95: rename da/c1c to da/d25/c28 0 2026-03-10T12:37:38.341 INFO:tasks.workunit.client.0.vm00.stdout:7/96: fdatasync da/f13 0 2026-03-10T12:37:38.345 INFO:tasks.workunit.client.0.vm00.stdout:7/97: dwrite da/fe [0,4194304] 0 2026-03-10T12:37:38.346 INFO:tasks.workunit.client.0.vm00.stdout:7/98: creat da/d25/f29 x:0 0 0 2026-03-10T12:37:38.347 INFO:tasks.workunit.client.0.vm00.stdout:7/99: mknod da/d26/c2a 0 2026-03-10T12:37:38.348 INFO:tasks.workunit.client.0.vm00.stdout:7/100: creat da/d25/f2b x:0 0 0 2026-03-10T12:37:38.349 INFO:tasks.workunit.client.0.vm00.stdout:7/101: write da/d26/f27 [902158,32292] 
0 2026-03-10T12:37:38.349 INFO:tasks.workunit.client.0.vm00.stdout:7/102: fdatasync da/d1b/f1e 0 2026-03-10T12:37:38.350 INFO:tasks.workunit.client.0.vm00.stdout:7/103: write da/fb [2633529,74310] 0 2026-03-10T12:37:38.354 INFO:tasks.workunit.client.0.vm00.stdout:7/104: dwrite da/f23 [0,4194304] 0 2026-03-10T12:37:38.370 INFO:tasks.workunit.client.0.vm00.stdout:0/169: dwrite d3/d7/f1c [0,4194304] 0 2026-03-10T12:37:38.370 INFO:tasks.workunit.client.0.vm00.stdout:7/105: mkdir da/d25/d2c 0 2026-03-10T12:37:38.376 INFO:tasks.workunit.client.0.vm00.stdout:7/106: mkdir da/d1b/d2d 0 2026-03-10T12:37:38.386 INFO:tasks.workunit.client.0.vm00.stdout:7/107: fsync da/d1b/f1e 0 2026-03-10T12:37:38.387 INFO:tasks.workunit.client.0.vm00.stdout:7/108: mkdir da/d25/d2e 0 2026-03-10T12:37:38.387 INFO:tasks.workunit.client.0.vm00.stdout:0/170: truncate f2 3767247 0 2026-03-10T12:37:38.387 INFO:tasks.workunit.client.0.vm00.stdout:7/109: dwrite da/d26/f27 [0,4194304] 0 2026-03-10T12:37:38.390 INFO:tasks.workunit.client.0.vm00.stdout:7/110: mknod da/d26/c2f 0 2026-03-10T12:37:38.396 INFO:tasks.workunit.client.0.vm00.stdout:7/111: rename da/d1b/f20 to da/d25/d2c/f30 0 2026-03-10T12:37:38.418 INFO:tasks.workunit.client.1.vm07.stdout:1/267: creat d9/df/d54/f57 x:0 0 0 2026-03-10T12:37:38.421 INFO:tasks.workunit.client.1.vm07.stdout:1/268: dread d9/f19 [0,4194304] 0 2026-03-10T12:37:38.428 INFO:tasks.workunit.client.1.vm07.stdout:6/241: dwrite d1/f3d [4194304,4194304] 0 2026-03-10T12:37:38.429 INFO:tasks.workunit.client.1.vm07.stdout:8/311: truncate d1/d3/f1d 4599842 0 2026-03-10T12:37:38.430 INFO:tasks.workunit.client.1.vm07.stdout:6/242: write d1/f34 [4468816,115569] 0 2026-03-10T12:37:38.430 INFO:tasks.workunit.client.1.vm07.stdout:2/206: write d0/d42/d26/d38/f3d [3864390,51277] 0 2026-03-10T12:37:38.449 INFO:tasks.workunit.client.1.vm07.stdout:7/269: truncate d0/f40 1578547 0 2026-03-10T12:37:38.449 INFO:tasks.workunit.client.1.vm07.stdout:7/270: chown d0/d47/d48 72 1 
2026-03-10T12:37:38.454 INFO:tasks.workunit.client.1.vm07.stdout:4/288: write d0/d4/d5/da/f48 [1094785,6317] 0 2026-03-10T12:37:38.456 INFO:tasks.workunit.client.1.vm07.stdout:1/269: write d9/df/d29/d2b/d3d/f47 [2119408,81599] 0 2026-03-10T12:37:38.460 INFO:tasks.workunit.client.1.vm07.stdout:2/207: mkdir d0/d42/d1f/d20/d49 0 2026-03-10T12:37:38.461 INFO:tasks.workunit.client.1.vm07.stdout:2/208: fdatasync d0/d42/d26/f48 0 2026-03-10T12:37:38.464 INFO:tasks.workunit.client.1.vm07.stdout:0/329: truncate d0/f1d 225304 0 2026-03-10T12:37:38.470 INFO:tasks.workunit.client.1.vm07.stdout:6/243: rename d1/d4/d6/d16/d1a/d2c/l40 to d1/d4/d4a/l52 0 2026-03-10T12:37:38.474 INFO:tasks.workunit.client.1.vm07.stdout:6/244: dwrite d1/f26 [0,4194304] 0 2026-03-10T12:37:38.485 INFO:tasks.workunit.client.1.vm07.stdout:8/312: creat d1/f68 x:0 0 0 2026-03-10T12:37:38.487 INFO:tasks.workunit.client.1.vm07.stdout:6/245: mkdir d1/d4/d6/d53 0 2026-03-10T12:37:38.488 INFO:tasks.workunit.client.1.vm07.stdout:1/270: creat d9/df/f58 x:0 0 0 2026-03-10T12:37:38.489 INFO:tasks.workunit.client.1.vm07.stdout:8/313: rmdir d1/d3/d5d/d65 39 2026-03-10T12:37:38.490 INFO:tasks.workunit.client.1.vm07.stdout:4/289: getdents d0/d4/d10/d3c 0 2026-03-10T12:37:38.493 INFO:tasks.workunit.client.1.vm07.stdout:1/271: unlink l7 0 2026-03-10T12:37:38.495 INFO:tasks.workunit.client.1.vm07.stdout:8/314: symlink d1/d3/d6/d50/l69 0 2026-03-10T12:37:38.497 INFO:tasks.workunit.client.1.vm07.stdout:4/290: mkdir d0/d4/d5/da/d66 0 2026-03-10T12:37:38.500 INFO:tasks.workunit.client.1.vm07.stdout:8/315: symlink d1/d3/d6/d54/l6a 0 2026-03-10T12:37:38.502 INFO:tasks.workunit.client.1.vm07.stdout:4/291: dwrite d0/d4/d5/da/f15 [8388608,4194304] 0 2026-03-10T12:37:38.505 INFO:tasks.workunit.client.1.vm07.stdout:4/292: read d0/d4/d5/da/f15 [894222,54685] 0 2026-03-10T12:37:38.505 INFO:tasks.workunit.client.1.vm07.stdout:6/246: dread d1/d4/d6/d46/d4d/fb [0,4194304] 0 2026-03-10T12:37:38.507 
INFO:tasks.workunit.client.1.vm07.stdout:6/247: write d1/f34 [4670473,103572] 0 2026-03-10T12:37:38.510 INFO:tasks.workunit.client.1.vm07.stdout:4/293: dread d0/d4/d5/f43 [0,4194304] 0 2026-03-10T12:37:38.514 INFO:tasks.workunit.client.1.vm07.stdout:8/316: creat d1/f6b x:0 0 0 2026-03-10T12:37:38.525 INFO:tasks.workunit.client.1.vm07.stdout:4/294: mknod d0/d4/d10/d5f/c67 0 2026-03-10T12:37:38.525 INFO:tasks.workunit.client.1.vm07.stdout:4/295: dread - d0/d4/d10/d23/f2e zero size 2026-03-10T12:37:38.525 INFO:tasks.workunit.client.1.vm07.stdout:6/248: link d1/d4/f19 d1/d4/d6/d16/f54 0 2026-03-10T12:37:38.525 INFO:tasks.workunit.client.1.vm07.stdout:0/330: sync 2026-03-10T12:37:38.530 INFO:tasks.workunit.client.1.vm07.stdout:6/249: dwrite d1/d4/d6/f2a [0,4194304] 0 2026-03-10T12:37:38.533 INFO:tasks.workunit.client.0.vm00.stdout:6/150: write d2/d16/f23 [621921,111361] 0 2026-03-10T12:37:38.534 INFO:tasks.workunit.client.1.vm07.stdout:0/331: dwrite d0/d14/d5f/d3b/f46 [0,4194304] 0 2026-03-10T12:37:38.535 INFO:tasks.workunit.client.0.vm00.stdout:6/151: symlink d2/da/dc/l35 0 2026-03-10T12:37:38.537 INFO:tasks.workunit.client.0.vm00.stdout:6/152: getdents d2/d16 0 2026-03-10T12:37:38.537 INFO:tasks.workunit.client.1.vm07.stdout:0/332: write d0/d14/d5f/d3b/f4b [769126,65407] 0 2026-03-10T12:37:38.538 INFO:tasks.workunit.client.0.vm00.stdout:6/153: mknod d2/d16/c36 0 2026-03-10T12:37:38.550 INFO:tasks.workunit.client.1.vm07.stdout:0/333: write d0/d62/d65/f56 [812001,27302] 0 2026-03-10T12:37:38.551 INFO:tasks.workunit.client.1.vm07.stdout:0/334: truncate d0/d14/d1a/f30 3964359 0 2026-03-10T12:37:38.551 INFO:tasks.workunit.client.1.vm07.stdout:0/335: creat d0/d14/f69 x:0 0 0 2026-03-10T12:37:38.551 INFO:tasks.workunit.client.1.vm07.stdout:0/336: write d0/f1c [2513999,48697] 0 2026-03-10T12:37:38.551 INFO:tasks.workunit.client.1.vm07.stdout:0/337: write d0/d14/f37 [1403563,49664] 0 2026-03-10T12:37:38.551 INFO:tasks.workunit.client.1.vm07.stdout:0/338: mkdir 
d0/d14/d5f/d41/d6a 0 2026-03-10T12:37:38.551 INFO:tasks.workunit.client.0.vm00.stdout:6/154: stat d2/da 0 2026-03-10T12:37:38.551 INFO:tasks.workunit.client.0.vm00.stdout:6/155: chown d2/f30 12738031 1 2026-03-10T12:37:38.551 INFO:tasks.workunit.client.0.vm00.stdout:6/156: rename d2/d16/f1d to d2/d16/d29/f37 0 2026-03-10T12:37:38.551 INFO:tasks.workunit.client.0.vm00.stdout:6/157: mkdir d2/da/dc/d2f/d38 0 2026-03-10T12:37:38.551 INFO:tasks.workunit.client.0.vm00.stdout:6/158: chown d2/d16/d29 898981947 1 2026-03-10T12:37:38.551 INFO:tasks.workunit.client.0.vm00.stdout:6/159: fsync d2/d14/f2b 0 2026-03-10T12:37:38.562 INFO:tasks.workunit.client.0.vm00.stdout:5/70: truncate fe 832881 0 2026-03-10T12:37:38.562 INFO:tasks.workunit.client.0.vm00.stdout:8/85: write d0/f9 [5231088,27555] 0 2026-03-10T12:37:38.563 INFO:tasks.workunit.client.0.vm00.stdout:5/71: read ff [1866479,35547] 0 2026-03-10T12:37:38.566 INFO:tasks.workunit.client.0.vm00.stdout:8/86: dwrite d0/f10 [4194304,4194304] 0 2026-03-10T12:37:38.572 INFO:tasks.workunit.client.0.vm00.stdout:9/59: dwrite d0/dd/fe [0,4194304] 0 2026-03-10T12:37:38.575 INFO:tasks.workunit.client.0.vm00.stdout:4/68: write fa [217553,101581] 0 2026-03-10T12:37:38.576 INFO:tasks.workunit.client.0.vm00.stdout:1/73: truncate f4 1610427 0 2026-03-10T12:37:38.576 INFO:tasks.workunit.client.0.vm00.stdout:4/69: readlink le 0 2026-03-10T12:37:38.576 INFO:tasks.workunit.client.0.vm00.stdout:1/74: write da/f14 [307540,61712] 0 2026-03-10T12:37:38.584 INFO:tasks.workunit.client.0.vm00.stdout:5/72: symlink l15 0 2026-03-10T12:37:38.584 INFO:tasks.workunit.client.0.vm00.stdout:9/60: rename d0/dd to d0/d5/d16 0 2026-03-10T12:37:38.591 INFO:tasks.workunit.client.0.vm00.stdout:7/112: fsync da/d25/f29 0 2026-03-10T12:37:38.593 INFO:tasks.workunit.client.0.vm00.stdout:9/61: creat d0/f17 x:0 0 0 2026-03-10T12:37:38.596 INFO:tasks.workunit.client.0.vm00.stdout:9/62: dwrite d0/f17 [0,4194304] 0 2026-03-10T12:37:38.613 
INFO:tasks.workunit.client.0.vm00.stdout:7/113: symlink da/d25/d2c/l31 0 2026-03-10T12:37:38.614 INFO:tasks.workunit.client.0.vm00.stdout:0/171: dwrite d3/fd [0,4194304] 0 2026-03-10T12:37:38.620 INFO:tasks.workunit.client.0.vm00.stdout:6/160: dwrite d2/da/f26 [0,4194304] 0 2026-03-10T12:37:38.622 INFO:tasks.workunit.client.0.vm00.stdout:5/73: link f12 f16 0 2026-03-10T12:37:38.637 INFO:tasks.workunit.client.0.vm00.stdout:1/75: link c2 da/c1e 0 2026-03-10T12:37:38.639 INFO:tasks.workunit.client.1.vm07.stdout:3/321: read dc/dd/f20 [2436292,121422] 0 2026-03-10T12:37:38.644 INFO:tasks.workunit.client.0.vm00.stdout:0/172: mkdir d3/d40 0 2026-03-10T12:37:38.646 INFO:tasks.workunit.client.0.vm00.stdout:8/87: getdents d0 0 2026-03-10T12:37:38.650 INFO:tasks.workunit.client.1.vm07.stdout:3/322: dwrite dc/dd/f29 [4194304,4194304] 0 2026-03-10T12:37:38.663 INFO:tasks.workunit.client.0.vm00.stdout:9/63: symlink d0/d5/dc/l18 0 2026-03-10T12:37:38.665 INFO:tasks.workunit.client.0.vm00.stdout:4/70: getdents df 0 2026-03-10T12:37:38.669 INFO:tasks.workunit.client.1.vm07.stdout:2/209: dread d0/d42/f1b [0,4194304] 0 2026-03-10T12:37:38.672 INFO:tasks.workunit.client.0.vm00.stdout:1/76: rename da/d12/f1b to da/d12/f1f 0 2026-03-10T12:37:38.677 INFO:tasks.workunit.client.0.vm00.stdout:7/114: rename da/d1b/l24 to da/d26/l32 0 2026-03-10T12:37:38.678 INFO:tasks.workunit.client.0.vm00.stdout:7/115: write da/d1b/f22 [653680,98192] 0 2026-03-10T12:37:38.679 INFO:tasks.workunit.client.0.vm00.stdout:4/71: dread fb [0,4194304] 0 2026-03-10T12:37:38.683 INFO:tasks.workunit.client.0.vm00.stdout:4/72: dwrite f3 [0,4194304] 0 2026-03-10T12:37:38.692 INFO:tasks.workunit.client.1.vm07.stdout:2/210: creat d0/f4a x:0 0 0 2026-03-10T12:37:38.692 INFO:tasks.workunit.client.1.vm07.stdout:2/211: chown d0/d42/d1f/d20/c41 85401885 1 2026-03-10T12:37:38.692 INFO:tasks.workunit.client.0.vm00.stdout:6/161: mkdir d2/d39 0 2026-03-10T12:37:38.692 INFO:tasks.workunit.client.0.vm00.stdout:6/162: write d2/da/f11 
[3140153,16800] 0 2026-03-10T12:37:38.692 INFO:tasks.workunit.client.0.vm00.stdout:9/64: mkdir d0/d5/d16/d19 0 2026-03-10T12:37:38.692 INFO:tasks.workunit.client.0.vm00.stdout:9/65: truncate d0/d5/d16/fe 4369551 0 2026-03-10T12:37:38.693 INFO:tasks.workunit.client.0.vm00.stdout:9/66: write d0/f4 [27673,73990] 0 2026-03-10T12:37:38.693 INFO:tasks.workunit.client.0.vm00.stdout:9/67: read d0/f17 [2928936,67176] 0 2026-03-10T12:37:38.697 INFO:tasks.workunit.client.0.vm00.stdout:9/68: dwrite d0/f4 [0,4194304] 0 2026-03-10T12:37:38.714 INFO:tasks.workunit.client.0.vm00.stdout:1/77: creat da/d12/f20 x:0 0 0 2026-03-10T12:37:38.715 INFO:tasks.workunit.client.0.vm00.stdout:1/78: truncate da/d12/f1d 266828 0 2026-03-10T12:37:38.717 INFO:tasks.workunit.client.0.vm00.stdout:0/173: rename d3/la to d3/d7/l41 0 2026-03-10T12:37:38.720 INFO:tasks.workunit.client.0.vm00.stdout:0/174: dwrite d3/d7/f10 [0,4194304] 0 2026-03-10T12:37:38.729 INFO:tasks.workunit.client.0.vm00.stdout:4/73: rmdir df 39 2026-03-10T12:37:38.732 INFO:tasks.workunit.client.0.vm00.stdout:5/74: link l15 l17 0 2026-03-10T12:37:38.733 INFO:tasks.workunit.client.0.vm00.stdout:5/75: dread fe [0,4194304] 0 2026-03-10T12:37:38.733 INFO:tasks.workunit.client.0.vm00.stdout:5/76: write f11 [2207580,95767] 0 2026-03-10T12:37:38.733 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:38 vm00.local ceph-mon[50686]: pgmap v154: 65 pgs: 65 active+clean; 724 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 4.7 MiB/s rd, 80 MiB/s wr, 331 op/s 2026-03-10T12:37:38.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:38 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:37:38.736 INFO:tasks.workunit.client.1.vm07.stdout:9/225: dread d5/d13/f1b [0,4194304] 0 2026-03-10T12:37:38.737 INFO:tasks.workunit.client.1.vm07.stdout:9/226: write d5/d16/f19 [1651652,10153] 0 2026-03-10T12:37:38.748 
INFO:tasks.workunit.client.1.vm07.stdout:9/227: dwrite d5/d16/d23/d26/f46 [0,4194304] 0 2026-03-10T12:37:38.820 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:38 vm07.local ceph-mon[58582]: pgmap v154: 65 pgs: 65 active+clean; 724 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 4.7 MiB/s rd, 80 MiB/s wr, 331 op/s 2026-03-10T12:37:38.820 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:38 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:37:38.845 INFO:tasks.workunit.client.0.vm00.stdout:9/69: creat d0/f1a x:0 0 0 2026-03-10T12:37:38.845 INFO:tasks.workunit.client.0.vm00.stdout:9/70: dread - d0/f1a zero size 2026-03-10T12:37:38.847 INFO:tasks.workunit.client.0.vm00.stdout:1/79: fsync f5 0 2026-03-10T12:37:38.847 INFO:tasks.workunit.client.0.vm00.stdout:8/88: rename d0/l4 to d0/d12/d17/l18 0 2026-03-10T12:37:38.854 INFO:tasks.workunit.client.0.vm00.stdout:0/175: creat d3/d22/f42 x:0 0 0 2026-03-10T12:37:38.860 INFO:tasks.workunit.client.0.vm00.stdout:7/116: mknod da/d1b/d2d/c33 0 2026-03-10T12:37:38.869 INFO:tasks.workunit.client.0.vm00.stdout:0/176: dwrite d3/d7/d3c/f30 [0,4194304] 0 2026-03-10T12:37:38.869 INFO:tasks.workunit.client.0.vm00.stdout:5/77: mknod c18 0 2026-03-10T12:37:38.869 INFO:tasks.workunit.client.0.vm00.stdout:0/177: write d3/d7/d3c/f19 [114466,44259] 0 2026-03-10T12:37:38.869 INFO:tasks.workunit.client.0.vm00.stdout:6/163: rename d2/da/f26 to d2/da/dc/d2f/f3a 0 2026-03-10T12:37:38.869 INFO:tasks.workunit.client.0.vm00.stdout:1/80: write da/d12/f1f [1369518,3675] 0 2026-03-10T12:37:38.869 INFO:tasks.workunit.client.0.vm00.stdout:6/164: write d2/d14/f1b [1328698,4755] 0 2026-03-10T12:37:38.869 INFO:tasks.workunit.client.0.vm00.stdout:7/117: chown da/l1f 62222 1 2026-03-10T12:37:38.875 INFO:tasks.workunit.client.0.vm00.stdout:7/118: dwrite f9 [0,4194304] 0 2026-03-10T12:37:38.882 
INFO:tasks.workunit.client.0.vm00.stdout:4/74: creat df/f14 x:0 0 0 2026-03-10T12:37:38.882 INFO:tasks.workunit.client.0.vm00.stdout:4/75: read f3 [1727352,84417] 0 2026-03-10T12:37:38.885 INFO:tasks.workunit.client.0.vm00.stdout:0/178: creat d3/db/d24/d25/f43 x:0 0 0 2026-03-10T12:37:38.886 INFO:tasks.workunit.client.0.vm00.stdout:0/179: dread d3/db/f16 [0,4194304] 0 2026-03-10T12:37:38.892 INFO:tasks.workunit.client.0.vm00.stdout:9/71: rename d0/d5/d16/fe to d0/d5/d16/d19/f1b 0 2026-03-10T12:37:38.894 INFO:tasks.workunit.client.0.vm00.stdout:1/81: mkdir da/d21 0 2026-03-10T12:37:38.896 INFO:tasks.workunit.client.0.vm00.stdout:6/165: creat d2/d14/f3b x:0 0 0 2026-03-10T12:37:38.900 INFO:tasks.workunit.client.0.vm00.stdout:6/166: dwrite d2/f30 [0,4194304] 0 2026-03-10T12:37:38.918 INFO:tasks.workunit.client.0.vm00.stdout:0/180: mkdir d3/d1b/d38/d44 0 2026-03-10T12:37:38.975 INFO:tasks.workunit.client.0.vm00.stdout:5/78: sync 2026-03-10T12:37:38.975 INFO:tasks.workunit.client.0.vm00.stdout:7/119: sync 2026-03-10T12:37:39.017 INFO:tasks.workunit.client.1.vm07.stdout:2/212: dread d0/d42/f1e [0,4194304] 0 2026-03-10T12:37:39.018 INFO:tasks.workunit.client.1.vm07.stdout:5/302: dread d0/d22/f16 [0,4194304] 0 2026-03-10T12:37:39.039 INFO:tasks.workunit.client.1.vm07.stdout:6/250: chown d1/d4/d4a/l52 1 1 2026-03-10T12:37:39.043 INFO:tasks.workunit.client.1.vm07.stdout:6/251: dwrite d1/d4/d6/d46/d4d/f22 [4194304,4194304] 0 2026-03-10T12:37:39.048 INFO:tasks.workunit.client.1.vm07.stdout:3/323: read dc/dd/f20 [1040614,48614] 0 2026-03-10T12:37:39.050 INFO:tasks.workunit.client.1.vm07.stdout:8/317: dread d1/d3/d18/f2e [0,4194304] 0 2026-03-10T12:37:39.059 INFO:tasks.workunit.client.1.vm07.stdout:9/228: chown d5/d1f/l2e 0 1 2026-03-10T12:37:39.062 INFO:tasks.workunit.client.1.vm07.stdout:1/272: write d9/fe [3364468,111733] 0 2026-03-10T12:37:39.105 INFO:tasks.workunit.client.1.vm07.stdout:4/296: dread d0/d19/f25 [0,4194304] 0 2026-03-10T12:37:39.153 
INFO:tasks.workunit.client.1.vm07.stdout:9/229: sync 2026-03-10T12:37:39.156 INFO:tasks.workunit.client.0.vm00.stdout:3/96: dread dd/d18/d13/f22 [0,4194304] 0 2026-03-10T12:37:39.320 INFO:tasks.workunit.client.0.vm00.stdout:2/96: truncate d4/dd/f17 111502 0 2026-03-10T12:37:39.333 INFO:tasks.workunit.client.0.vm00.stdout:1/82: creat da/f22 x:0 0 0 2026-03-10T12:37:39.337 INFO:tasks.workunit.client.0.vm00.stdout:7/120: dread - da/f16 zero size 2026-03-10T12:37:39.337 INFO:tasks.workunit.client.0.vm00.stdout:5/79: rename fe to f19 0 2026-03-10T12:37:39.337 INFO:tasks.workunit.client.0.vm00.stdout:3/97: chown c1 6231072 1 2026-03-10T12:37:39.338 INFO:tasks.workunit.client.0.vm00.stdout:3/98: write f7 [8640401,23858] 0 2026-03-10T12:37:39.339 INFO:tasks.workunit.client.0.vm00.stdout:3/99: write f7 [4916926,27366] 0 2026-03-10T12:37:39.350 INFO:tasks.workunit.client.0.vm00.stdout:6/167: symlink d2/da/l3c 0 2026-03-10T12:37:39.351 INFO:tasks.workunit.client.0.vm00.stdout:5/80: write ff [3847442,60704] 0 2026-03-10T12:37:39.354 INFO:tasks.workunit.client.0.vm00.stdout:7/121: mknod da/d26/c34 0 2026-03-10T12:37:39.365 INFO:tasks.workunit.client.0.vm00.stdout:1/83: rename da/c19 to da/c23 0 2026-03-10T12:37:39.367 INFO:tasks.workunit.client.1.vm07.stdout:0/339: dwrite d0/d14/d5f/f54 [0,4194304] 0 2026-03-10T12:37:39.369 INFO:tasks.workunit.client.0.vm00.stdout:1/84: dwrite da/f22 [0,4194304] 0 2026-03-10T12:37:39.375 INFO:tasks.workunit.client.0.vm00.stdout:8/89: dread d0/f9 [0,4194304] 0 2026-03-10T12:37:39.376 INFO:tasks.workunit.client.0.vm00.stdout:2/97: rename d4/d6/c15 to d4/c1f 0 2026-03-10T12:37:39.383 INFO:tasks.workunit.client.0.vm00.stdout:9/72: getdents d0/d5/dc 0 2026-03-10T12:37:39.383 INFO:tasks.workunit.client.0.vm00.stdout:9/73: stat d0/f17 0 2026-03-10T12:37:39.385 INFO:tasks.workunit.client.0.vm00.stdout:1/85: mkdir da/d24 0 2026-03-10T12:37:39.386 INFO:tasks.workunit.client.0.vm00.stdout:1/86: write f5 [1567250,65968] 0 2026-03-10T12:37:39.388 
INFO:tasks.workunit.client.0.vm00.stdout:6/168: symlink d2/d16/d29/d31/d34/l3d 0 2026-03-10T12:37:39.389 INFO:tasks.workunit.client.1.vm07.stdout:6/252: creat d1/d4/d4a/f55 x:0 0 0 2026-03-10T12:37:39.389 INFO:tasks.workunit.client.0.vm00.stdout:5/81: unlink cc 0 2026-03-10T12:37:39.389 INFO:tasks.workunit.client.0.vm00.stdout:5/82: stat c7 0 2026-03-10T12:37:39.389 INFO:tasks.workunit.client.0.vm00.stdout:5/83: stat f11 0 2026-03-10T12:37:39.390 INFO:tasks.workunit.client.0.vm00.stdout:8/90: read d0/f10 [1747861,86779] 0 2026-03-10T12:37:39.392 INFO:tasks.workunit.client.1.vm07.stdout:6/253: dwrite d1/f38 [0,4194304] 0 2026-03-10T12:37:39.393 INFO:tasks.workunit.client.0.vm00.stdout:4/76: mknod df/c15 0 2026-03-10T12:37:39.394 INFO:tasks.workunit.client.0.vm00.stdout:4/77: chown lc 348242 1 2026-03-10T12:37:39.394 INFO:tasks.workunit.client.0.vm00.stdout:1/87: chown da/c1e 23116 1 2026-03-10T12:37:39.395 INFO:tasks.workunit.client.0.vm00.stdout:1/88: fsync da/f14 0 2026-03-10T12:37:39.395 INFO:tasks.workunit.client.0.vm00.stdout:6/169: stat d2/c6 0 2026-03-10T12:37:39.398 INFO:tasks.workunit.client.0.vm00.stdout:5/84: mkdir d1a 0 2026-03-10T12:37:39.399 INFO:tasks.workunit.client.1.vm07.stdout:4/297: creat d0/d4/d10/d3c/f68 x:0 0 0 2026-03-10T12:37:39.399 INFO:tasks.workunit.client.0.vm00.stdout:1/89: dwrite da/d12/f1f [0,4194304] 0 2026-03-10T12:37:39.400 INFO:tasks.workunit.client.1.vm07.stdout:4/298: fsync d0/d4/d5/da/f4d 0 2026-03-10T12:37:39.400 INFO:tasks.workunit.client.0.vm00.stdout:1/90: chown da/c10 44754 1 2026-03-10T12:37:39.401 INFO:tasks.workunit.client.1.vm07.stdout:4/299: write d0/d4/d10/d23/d46/f56 [907160,43238] 0 2026-03-10T12:37:39.403 INFO:tasks.workunit.client.0.vm00.stdout:5/85: dread f11 [0,4194304] 0 2026-03-10T12:37:39.403 INFO:tasks.workunit.client.0.vm00.stdout:1/91: dread f5 [0,4194304] 0 2026-03-10T12:37:39.410 INFO:tasks.workunit.client.1.vm07.stdout:9/230: rename d5 to d5/d4b 22 2026-03-10T12:37:39.411 
INFO:tasks.workunit.client.0.vm00.stdout:4/78: dread df/f12 [0,4194304] 0 2026-03-10T12:37:39.411 INFO:tasks.workunit.client.1.vm07.stdout:9/231: fsync d5/d16/d23/d26/f40 0 2026-03-10T12:37:39.411 INFO:tasks.workunit.client.0.vm00.stdout:4/79: fdatasync df/f12 0 2026-03-10T12:37:39.412 INFO:tasks.workunit.client.0.vm00.stdout:4/80: write df/f14 [127350,20326] 0 2026-03-10T12:37:39.415 INFO:tasks.workunit.client.0.vm00.stdout:6/170: mknod d2/da/dc/d2f/c3e 0 2026-03-10T12:37:39.420 INFO:tasks.workunit.client.1.vm07.stdout:2/213: mkdir d0/d42/d26/d4b 0 2026-03-10T12:37:39.420 INFO:tasks.workunit.client.0.vm00.stdout:6/171: dread - d2/d14/f2b zero size 2026-03-10T12:37:39.420 INFO:tasks.workunit.client.1.vm07.stdout:6/254: creat d1/d4/d4a/f56 x:0 0 0 2026-03-10T12:37:39.423 INFO:tasks.workunit.client.0.vm00.stdout:7/122: link da/f17 da/f35 0 2026-03-10T12:37:39.430 INFO:tasks.workunit.client.0.vm00.stdout:6/172: rename d2/d16/d29/f37 to d2/d14/f3f 0 2026-03-10T12:37:39.431 INFO:tasks.workunit.client.0.vm00.stdout:8/91: creat d0/f19 x:0 0 0 2026-03-10T12:37:39.431 INFO:tasks.workunit.client.1.vm07.stdout:1/273: getdents d9/df/d55 0 2026-03-10T12:37:39.432 INFO:tasks.workunit.client.1.vm07.stdout:4/300: chown d0/d4/l4a 271698 1 2026-03-10T12:37:39.435 INFO:tasks.workunit.client.1.vm07.stdout:9/232: symlink d5/d13/d2c/l4c 0 2026-03-10T12:37:39.435 INFO:tasks.workunit.client.0.vm00.stdout:6/173: creat d2/da/dc/f40 x:0 0 0 2026-03-10T12:37:39.438 INFO:tasks.workunit.client.1.vm07.stdout:1/274: mkdir d9/df/d29/d2c/d59 0 2026-03-10T12:37:39.439 INFO:tasks.workunit.client.0.vm00.stdout:5/86: rmdir d1a 0 2026-03-10T12:37:39.442 INFO:tasks.workunit.client.0.vm00.stdout:4/81: rename df/f14 to df/f16 0 2026-03-10T12:37:39.442 INFO:tasks.workunit.client.0.vm00.stdout:4/82: chown f9 495 1 2026-03-10T12:37:39.444 INFO:tasks.workunit.client.0.vm00.stdout:5/87: creat f1b x:0 0 0 2026-03-10T12:37:39.444 INFO:tasks.workunit.client.1.vm07.stdout:9/233: symlink d5/d16/d18/l4d 0 
2026-03-10T12:37:39.444 INFO:tasks.workunit.client.0.vm00.stdout:5/88: write f1b [1021947,107419] 0 2026-03-10T12:37:39.444 INFO:tasks.workunit.client.1.vm07.stdout:9/234: readlink l3 0 2026-03-10T12:37:39.445 INFO:tasks.workunit.client.1.vm07.stdout:9/235: chown l3 600322178 1 2026-03-10T12:37:39.445 INFO:tasks.workunit.client.1.vm07.stdout:8/318: getdents d1/d3/d5d 0 2026-03-10T12:37:39.446 INFO:tasks.workunit.client.0.vm00.stdout:3/100: sync 2026-03-10T12:37:39.448 INFO:tasks.workunit.client.0.vm00.stdout:5/89: dwrite ff [0,4194304] 0 2026-03-10T12:37:39.448 INFO:tasks.workunit.client.0.vm00.stdout:8/92: rename d0/f16 to d0/d12/f1a 0 2026-03-10T12:37:39.455 INFO:tasks.workunit.client.0.vm00.stdout:4/83: symlink df/l17 0 2026-03-10T12:37:39.459 INFO:tasks.workunit.client.1.vm07.stdout:9/236: dwrite d5/d16/f34 [0,4194304] 0 2026-03-10T12:37:39.461 INFO:tasks.workunit.client.1.vm07.stdout:9/237: creat d5/d1f/f4e x:0 0 0 2026-03-10T12:37:39.462 INFO:tasks.workunit.client.1.vm07.stdout:4/301: getdents d0/d4/d5 0 2026-03-10T12:37:39.462 INFO:tasks.workunit.client.0.vm00.stdout:6/174: dread d2/da/f11 [0,4194304] 0 2026-03-10T12:37:39.462 INFO:tasks.workunit.client.1.vm07.stdout:9/238: read - d5/d13/d2c/f38 zero size 2026-03-10T12:37:39.462 INFO:tasks.workunit.client.1.vm07.stdout:4/302: chown d0/d4/d10/l57 146172054 1 2026-03-10T12:37:39.464 INFO:tasks.workunit.client.0.vm00.stdout:3/101: creat dd/f25 x:0 0 0 2026-03-10T12:37:39.467 INFO:tasks.workunit.client.0.vm00.stdout:8/93: mknod d0/d12/d17/c1b 0 2026-03-10T12:37:39.492 INFO:tasks.workunit.client.1.vm07.stdout:9/239: mkdir d5/d13/d2c/d2f/d4f 0 2026-03-10T12:37:39.492 INFO:tasks.workunit.client.1.vm07.stdout:9/240: rename d5/c10 to d5/d16/d23/c50 0 2026-03-10T12:37:39.492 INFO:tasks.workunit.client.1.vm07.stdout:4/303: link d0/d4/d10/d3c/d2b/d2d/l3d d0/d4/d10/d5f/l69 0 2026-03-10T12:37:39.492 INFO:tasks.workunit.client.1.vm07.stdout:4/304: symlink d0/d4/d10/d18/l6a 0 2026-03-10T12:37:39.492 
INFO:tasks.workunit.client.1.vm07.stdout:9/241: dwrite d5/d16/d18/f1e [4194304,4194304] 0
2026-03-10T12:37:39.492 INFO:tasks.workunit.client.1.vm07.stdout:4/305: dread - d0/d4/d10/d3c/d2b/d2d/f65 zero size
2026-03-10T12:37:39.492 INFO:tasks.workunit.client.1.vm07.stdout:9/242: mknod d5/d1f/c51 0
2026-03-10T12:37:39.492 INFO:tasks.workunit.client.1.vm07.stdout:6/255: dread d1/d4/d6/d16/d1a/f29 [0,4194304] 0
2026-03-10T12:37:39.492 INFO:tasks.workunit.client.1.vm07.stdout:4/306: creat d0/d4/d10/f6b x:0 0 0
2026-03-10T12:37:39.492 INFO:tasks.workunit.client.0.vm00.stdout:6/175: creat d2/d16/f41 x:0 0 0
2026-03-10T12:37:39.492 INFO:tasks.workunit.client.0.vm00.stdout:6/176: write d2/d16/f2a [3871019,61650] 0
2026-03-10T12:37:39.492 INFO:tasks.workunit.client.0.vm00.stdout:3/102: write dd/d18/d13/f22 [825541,115545] 0
2026-03-10T12:37:39.492 INFO:tasks.workunit.client.0.vm00.stdout:8/94: readlink d0/l1 0
2026-03-10T12:37:39.492 INFO:tasks.workunit.client.0.vm00.stdout:4/84: symlink df/l18 0
2026-03-10T12:37:39.492 INFO:tasks.workunit.client.0.vm00.stdout:4/85: chown f3 3418945 1
2026-03-10T12:37:39.492 INFO:tasks.workunit.client.0.vm00.stdout:3/103: mknod dd/d18/d13/c26 0
2026-03-10T12:37:39.492 INFO:tasks.workunit.client.0.vm00.stdout:8/95: write d0/f11 [899718,9495] 0
2026-03-10T12:37:39.492 INFO:tasks.workunit.client.0.vm00.stdout:3/104: dwrite dd/d18/d13/f22 [0,4194304] 0
2026-03-10T12:37:39.492 INFO:tasks.workunit.client.0.vm00.stdout:6/177: mkdir d2/d42 0
2026-03-10T12:37:39.494 INFO:tasks.workunit.client.1.vm07.stdout:4/307: dwrite d0/d4/d10/d3c/d2b/f60 [0,4194304] 0
2026-03-10T12:37:39.496 INFO:tasks.workunit.client.0.vm00.stdout:3/105: mkdir dd/d27 0
2026-03-10T12:37:39.496 INFO:tasks.workunit.client.0.vm00.stdout:3/106: read - dd/f25 zero size
2026-03-10T12:37:39.496 INFO:tasks.workunit.client.1.vm07.stdout:4/308: write d0/d19/f25 [2020132,72221] 0
2026-03-10T12:37:39.496 INFO:tasks.workunit.client.0.vm00.stdout:3/107: read f7 [5578271,110863] 0
2026-03-10T12:37:39.497 INFO:tasks.workunit.client.0.vm00.stdout:6/178: creat d2/da/dc/f43 x:0 0 0
2026-03-10T12:37:39.503 INFO:tasks.workunit.client.0.vm00.stdout:8/96: mkdir d0/dd/d1c 0
2026-03-10T12:37:39.505 INFO:tasks.workunit.client.0.vm00.stdout:4/86: creat df/f19 x:0 0 0
2026-03-10T12:37:39.508 INFO:tasks.workunit.client.1.vm07.stdout:9/243: mknod d5/d13/d2c/d2f/d4f/c52 0
2026-03-10T12:37:39.510 INFO:tasks.workunit.client.0.vm00.stdout:6/179: write d2/da/dc/d2f/f3a [1658063,98539] 0
2026-03-10T12:37:39.512 INFO:tasks.workunit.client.1.vm07.stdout:6/256: symlink d1/d4/d6/d16/d49/l57 0
2026-03-10T12:37:39.513 INFO:tasks.workunit.client.1.vm07.stdout:6/257: chown d1/d4/d4a/l52 2087434 1
2026-03-10T12:37:39.513 INFO:tasks.workunit.client.0.vm00.stdout:6/180: dwrite d2/da/dc/f43 [0,4194304] 0
2026-03-10T12:37:39.517 INFO:tasks.workunit.client.0.vm00.stdout:6/181: dread d2/d16/f19 [0,4194304] 0
2026-03-10T12:37:39.517 INFO:tasks.workunit.client.1.vm07.stdout:9/244: creat d5/d13/d2c/d2f/d3e/f53 x:0 0 0
2026-03-10T12:37:39.520 INFO:tasks.workunit.client.1.vm07.stdout:6/258: dwrite d1/d4/d6/f41 [0,4194304] 0
2026-03-10T12:37:39.520 INFO:tasks.workunit.client.0.vm00.stdout:3/108: symlink dd/d27/l28 0
2026-03-10T12:37:39.525 INFO:tasks.workunit.client.1.vm07.stdout:9/245: rename d5/c29 to d5/d16/d18/c54 0
2026-03-10T12:37:39.529 INFO:tasks.workunit.client.0.vm00.stdout:3/109: rename l4 to dd/d18/l29 0
2026-03-10T12:37:39.529 INFO:tasks.workunit.client.1.vm07.stdout:9/246: dwrite d5/d16/d23/d26/f42 [0,4194304] 0
2026-03-10T12:37:39.539 INFO:tasks.workunit.client.1.vm07.stdout:6/259: truncate d1/d4/d6/f13 4954300 0
2026-03-10T12:37:39.541 INFO:tasks.workunit.client.1.vm07.stdout:4/309: dread d0/d4/d5/da/f15 [12582912,4194304] 0
2026-03-10T12:37:39.542 INFO:tasks.workunit.client.0.vm00.stdout:3/110: mkdir dd/d2a 0
2026-03-10T12:37:39.543 INFO:tasks.workunit.client.0.vm00.stdout:3/111: dread - dd/f25 zero size
2026-03-10T12:37:39.543 INFO:tasks.workunit.client.0.vm00.stdout:3/112: write dd/d18/d13/f22 [3063699,12279] 0
2026-03-10T12:37:39.544 INFO:tasks.workunit.client.1.vm07.stdout:9/247: symlink d5/d13/l55 0
2026-03-10T12:37:39.545 INFO:tasks.workunit.client.1.vm07.stdout:9/248: readlink d5/d13/d22/l3f 0
2026-03-10T12:37:39.548 INFO:tasks.workunit.client.0.vm00.stdout:3/113: rmdir dd/d18/d14 39
2026-03-10T12:37:39.549 INFO:tasks.workunit.client.1.vm07.stdout:4/310: unlink d0/d4/d5/da/l32 0
2026-03-10T12:37:39.549 INFO:tasks.workunit.client.0.vm00.stdout:4/87: link df/c10 df/c1a 0
2026-03-10T12:37:39.552 INFO:tasks.workunit.client.1.vm07.stdout:1/275: sync
2026-03-10T12:37:39.552 INFO:tasks.workunit.client.1.vm07.stdout:8/319: dread d1/d3/f29 [0,4194304] 0
2026-03-10T12:37:39.554 INFO:tasks.workunit.client.0.vm00.stdout:3/114: dwrite dd/f15 [0,4194304] 0
2026-03-10T12:37:39.558 INFO:tasks.workunit.client.1.vm07.stdout:4/311: creat d0/d4/d10/d3c/f6c x:0 0 0
2026-03-10T12:37:39.569 INFO:tasks.workunit.client.1.vm07.stdout:1/276: chown d9/f1f 2 1
2026-03-10T12:37:39.569 INFO:tasks.workunit.client.1.vm07.stdout:4/312: dwrite d0/d4/d10/d3c/d2b/d2d/f65 [0,4194304] 0
2026-03-10T12:37:39.569 INFO:tasks.workunit.client.1.vm07.stdout:8/320: chown d1/d3/d6/d50/c66 530500899 1
2026-03-10T12:37:39.569 INFO:tasks.workunit.client.1.vm07.stdout:4/313: fdatasync d0/d4/d10/d3c/f6c 0
2026-03-10T12:37:39.571 INFO:tasks.workunit.client.1.vm07.stdout:4/314: chown d0/d4/d10/d23/d46 2134535 1
2026-03-10T12:37:39.572 INFO:tasks.workunit.client.1.vm07.stdout:4/315: write d0/d4/d10/d3c/f6c [201025,45980] 0
2026-03-10T12:37:39.574 INFO:tasks.workunit.client.1.vm07.stdout:1/277: dwrite d9/df/f26 [0,4194304] 0
2026-03-10T12:37:39.580 INFO:tasks.workunit.client.1.vm07.stdout:4/316: dwrite d0/f53 [0,4194304] 0
2026-03-10T12:37:39.591 INFO:tasks.workunit.client.1.vm07.stdout:8/321: unlink d1/d3/d40/f53 0
2026-03-10T12:37:39.596 INFO:tasks.workunit.client.1.vm07.stdout:1/278: dwrite d9/f1a [0,4194304] 0
2026-03-10T12:37:39.603 INFO:tasks.workunit.client.1.vm07.stdout:1/279: dwrite d9/df/d29/d2b/f32 [4194304,4194304] 0
2026-03-10T12:37:39.615 INFO:tasks.workunit.client.0.vm00.stdout:0/181: dwrite d3/db/d24/d25/f3f [0,4194304] 0
2026-03-10T12:37:39.626 INFO:tasks.workunit.client.0.vm00.stdout:0/182: getdents d3/db 0
2026-03-10T12:37:39.627 INFO:tasks.workunit.client.0.vm00.stdout:0/183: chown d3/db/f16 6106 1
2026-03-10T12:37:39.629 INFO:tasks.workunit.client.0.vm00.stdout:0/184: read d3/d7/f31 [71528,48169] 0
2026-03-10T12:37:39.631 INFO:tasks.workunit.client.0.vm00.stdout:0/185: link f2 d3/db/f45 0
2026-03-10T12:37:39.632 INFO:tasks.workunit.client.0.vm00.stdout:0/186: stat d3/d1b/d38 0
2026-03-10T12:37:39.633 INFO:tasks.workunit.client.0.vm00.stdout:0/187: creat d3/d22/f46 x:0 0 0
2026-03-10T12:37:39.633 INFO:tasks.workunit.client.1.vm07.stdout:7/271: dwrite d0/f21 [0,4194304] 0
2026-03-10T12:37:39.633 INFO:tasks.workunit.client.0.vm00.stdout:0/188: write d3/d1b/d38/f39 [738579,69771] 0
2026-03-10T12:37:39.634 INFO:tasks.workunit.client.0.vm00.stdout:0/189: chown d3/db/l26 7631 1
2026-03-10T12:37:39.636 INFO:tasks.workunit.client.1.vm07.stdout:5/303: dwrite d0/d22/f27 [0,4194304] 0
2026-03-10T12:37:39.639 INFO:tasks.workunit.client.0.vm00.stdout:0/190: mknod d3/d40/c47 0
2026-03-10T12:37:39.640 INFO:tasks.workunit.client.0.vm00.stdout:0/191: write d3/d1b/f2a [339907,60064] 0
2026-03-10T12:37:39.641 INFO:tasks.workunit.client.0.vm00.stdout:0/192: fdatasync d3/d1b/f37 0
2026-03-10T12:37:39.644 INFO:tasks.workunit.client.0.vm00.stdout:0/193: getdents d3/d1b/d38/d44 0
2026-03-10T12:37:39.657 INFO:tasks.workunit.client.0.vm00.stdout:0/194: dread d3/db/f16 [0,4194304] 0
2026-03-10T12:37:39.657 INFO:tasks.workunit.client.0.vm00.stdout:0/195: creat d3/d22/d3a/f48 x:0 0 0
2026-03-10T12:37:39.657 INFO:tasks.workunit.client.0.vm00.stdout:0/196: creat d3/d1b/d38/d44/f49 x:0 0 0
2026-03-10T12:37:39.657 INFO:tasks.workunit.client.0.vm00.stdout:0/197: unlink d3/d22/d3a/f48 0
2026-03-10T12:37:39.657 INFO:tasks.workunit.client.0.vm00.stdout:0/198: mknod d3/d33/c4a 0
2026-03-10T12:37:39.657 INFO:tasks.workunit.client.0.vm00.stdout:6/182: getdents d2/d16/d29/d31/d34 0
2026-03-10T12:37:39.657 INFO:tasks.workunit.client.0.vm00.stdout:0/199: unlink d3/d1b/d38/f39 0
2026-03-10T12:37:39.659 INFO:tasks.workunit.client.0.vm00.stdout:9/74: write d0/f17 [4236751,66186] 0
2026-03-10T12:37:39.665 INFO:tasks.workunit.client.0.vm00.stdout:9/75: dwrite d0/f17 [0,4194304] 0
2026-03-10T12:37:39.665 INFO:tasks.workunit.client.0.vm00.stdout:0/200: mkdir d3/d7/d3c/d4b 0
2026-03-10T12:37:39.666 INFO:tasks.workunit.client.0.vm00.stdout:9/76: creat d0/d5/d16/f1c x:0 0 0
2026-03-10T12:37:39.666 INFO:tasks.workunit.client.0.vm00.stdout:9/77: fdatasync d0/d5/d16/f1c 0
2026-03-10T12:37:39.667 INFO:tasks.workunit.client.0.vm00.stdout:6/183: rmdir d2/da/dc/d2f/d38 0
2026-03-10T12:37:39.668 INFO:tasks.workunit.client.0.vm00.stdout:6/184: dread - d2/d16/f17 zero size
2026-03-10T12:37:39.674 INFO:tasks.workunit.client.0.vm00.stdout:0/201: getdents d3/db/d24 0
2026-03-10T12:37:39.679 INFO:tasks.workunit.client.0.vm00.stdout:0/202: chown d3/db/d24/d25/c34 399909 1
2026-03-10T12:37:39.679 INFO:tasks.workunit.client.0.vm00.stdout:9/78: dwrite d0/f4 [0,4194304] 0
2026-03-10T12:37:39.680 INFO:tasks.workunit.client.0.vm00.stdout:6/185: dwrite d2/d14/f3f [0,4194304] 0
2026-03-10T12:37:39.680 INFO:tasks.workunit.client.1.vm07.stdout:1/280: dread d9/df/f24 [0,4194304] 0
2026-03-10T12:37:39.680 INFO:tasks.workunit.client.1.vm07.stdout:5/304: sync
2026-03-10T12:37:39.683 INFO:tasks.workunit.client.1.vm07.stdout:5/305: symlink d0/d22/d18/d19/d2e/d67/l6b 0
2026-03-10T12:37:39.688 INFO:tasks.workunit.client.1.vm07.stdout:5/306: rename d0/d22/d18/d19/c4d to d0/d22/d18/d19/d21/d54/c6c 0
2026-03-10T12:37:39.688 INFO:tasks.workunit.client.1.vm07.stdout:5/307: stat d0/l3 0
2026-03-10T12:37:39.689 INFO:tasks.workunit.client.1.vm07.stdout:5/308: creat d0/d22/d18/d3e/d5d/f6d x:0 0 0
2026-03-10T12:37:39.690 INFO:tasks.workunit.client.1.vm07.stdout:5/309: chown d0/d22/d18/d19/d2e/f59 1022749 1
2026-03-10T12:37:39.691 INFO:tasks.workunit.client.0.vm00.stdout:6/186: dwrite d2/d16/f23 [0,4194304] 0
2026-03-10T12:37:39.698 INFO:tasks.workunit.client.1.vm07.stdout:5/310: link d0/d22/d18/l4e d0/d22/d18/d19/d2e/d3f/d5c/l6e 0
2026-03-10T12:37:39.704 INFO:tasks.workunit.client.1.vm07.stdout:5/311: mknod d0/d22/d18/d19/d36/c6f 0
2026-03-10T12:37:39.704 INFO:tasks.workunit.client.1.vm07.stdout:5/312: creat d0/f70 x:0 0 0
2026-03-10T12:37:39.704 INFO:tasks.workunit.client.1.vm07.stdout:5/313: write d0/ff [8906815,21483] 0
2026-03-10T12:37:39.704 INFO:tasks.workunit.client.0.vm00.stdout:0/203: mkdir d3/d7/d4c 0
2026-03-10T12:37:39.704 INFO:tasks.workunit.client.0.vm00.stdout:0/204: stat d3/d7/d3c/l20 0
2026-03-10T12:37:39.704 INFO:tasks.workunit.client.0.vm00.stdout:6/187: creat d2/da/dc/d2f/f44 x:0 0 0
2026-03-10T12:37:39.704 INFO:tasks.workunit.client.0.vm00.stdout:0/205: creat d3/d33/f4d x:0 0 0
2026-03-10T12:37:39.705 INFO:tasks.workunit.client.1.vm07.stdout:1/281: sync
2026-03-10T12:37:39.707 INFO:tasks.workunit.client.1.vm07.stdout:5/314: dwrite d0/f47 [0,4194304] 0
2026-03-10T12:37:39.714 INFO:tasks.workunit.client.1.vm07.stdout:5/315: write d0/d22/d18/f4c [25082,75880] 0
2026-03-10T12:37:39.714 INFO:tasks.workunit.client.0.vm00.stdout:0/206: unlink d3/db/d24/d25/c2c 0
2026-03-10T12:37:39.714 INFO:tasks.workunit.client.0.vm00.stdout:9/79: getdents d0/d5/d16/d19 0
2026-03-10T12:37:39.716 INFO:tasks.workunit.client.1.vm07.stdout:5/316: mknod d0/d22/d18/c71 0
2026-03-10T12:37:39.730 INFO:tasks.workunit.client.0.vm00.stdout:9/80: mkdir d0/d5/d16/d1d 0
2026-03-10T12:37:39.732 INFO:tasks.workunit.client.0.vm00.stdout:0/207: getdents d3/d1b 0
2026-03-10T12:37:39.733 INFO:tasks.workunit.client.0.vm00.stdout:0/208: fsync d3/d22/f42 0
2026-03-10T12:37:39.737 INFO:tasks.workunit.client.0.vm00.stdout:0/209: dread d3/d1b/f2b [0,4194304] 0
2026-03-10T12:37:39.741 INFO:tasks.workunit.client.0.vm00.stdout:0/210: creat d3/d40/f4e x:0 0 0
2026-03-10T12:37:39.741 INFO:tasks.workunit.client.0.vm00.stdout:0/211: fdatasync d3/d7/d3c/f30 0
2026-03-10T12:37:39.745 INFO:tasks.workunit.client.0.vm00.stdout:9/81: getdents d0/d5 0
2026-03-10T12:37:39.746 INFO:tasks.workunit.client.0.vm00.stdout:9/82: chown d0/d5/d16/f1c 3046547 1
2026-03-10T12:37:39.757 INFO:tasks.workunit.client.0.vm00.stdout:8/97: read d0/f9 [1554300,91646] 0
2026-03-10T12:37:39.762 INFO:tasks.workunit.client.0.vm00.stdout:9/83: rmdir d0/d5/d16/d1d 0
2026-03-10T12:37:39.763 INFO:tasks.workunit.client.0.vm00.stdout:9/84: write d0/f4 [3929688,60657] 0
2026-03-10T12:37:39.767 INFO:tasks.workunit.client.0.vm00.stdout:9/85: mkdir d0/d5/d16/d1e 0
2026-03-10T12:37:39.768 INFO:tasks.workunit.client.0.vm00.stdout:9/86: readlink d0/l8 0
2026-03-10T12:37:39.770 INFO:tasks.workunit.client.0.vm00.stdout:9/87: symlink d0/d5/d16/d19/l1f 0
2026-03-10T12:37:39.773 INFO:tasks.workunit.client.0.vm00.stdout:9/88: creat d0/d5/d16/d19/f20 x:0 0 0
2026-03-10T12:37:39.777 INFO:tasks.workunit.client.0.vm00.stdout:9/89: dwrite d0/d5/d16/d19/f1b [0,4194304] 0
2026-03-10T12:37:39.778 INFO:tasks.workunit.client.0.vm00.stdout:9/90: write d0/f17 [2993175,93710] 0
2026-03-10T12:37:39.790 INFO:tasks.workunit.client.0.vm00.stdout:6/188: dread d2/d16/f1e [0,4194304] 0
2026-03-10T12:37:39.799 INFO:tasks.workunit.client.0.vm00.stdout:6/189: creat d2/da/dc/f45 x:0 0 0
2026-03-10T12:37:39.814 INFO:tasks.workunit.client.0.vm00.stdout:6/190: creat d2/d39/f46 x:0 0 0
2026-03-10T12:37:39.815 INFO:tasks.workunit.client.0.vm00.stdout:6/191: creat d2/d16/f47 x:0 0 0
2026-03-10T12:37:39.815 INFO:tasks.workunit.client.0.vm00.stdout:6/192: dwrite d2/d16/f1e [0,4194304] 0
2026-03-10T12:37:39.815 INFO:tasks.workunit.client.0.vm00.stdout:6/193: mkdir d2/d16/d29/d31/d48 0
2026-03-10T12:37:39.815 INFO:tasks.workunit.client.0.vm00.stdout:6/194: mknod d2/d16/d29/d31/d48/c49 0
2026-03-10T12:37:39.815 INFO:tasks.workunit.client.0.vm00.stdout:6/195: rename d2/d16/f1c to d2/d39/f4a 0
2026-03-10T12:37:39.816 INFO:tasks.workunit.client.0.vm00.stdout:6/196: mknod d2/d16/c4b 0
2026-03-10T12:37:39.817 INFO:tasks.workunit.client.0.vm00.stdout:6/197: truncate d2/da/dc/f45 857007 0
2026-03-10T12:37:39.890 INFO:tasks.workunit.client.0.vm00.stdout:6/198: sync
2026-03-10T12:37:39.891 INFO:tasks.workunit.client.0.vm00.stdout:6/199: chown d2/d16/d29 1616770 1
2026-03-10T12:37:39.891 INFO:tasks.workunit.client.0.vm00.stdout:6/200: write d2/d14/f2e [906872,96738] 0
2026-03-10T12:37:39.893 INFO:tasks.workunit.client.0.vm00.stdout:6/201: creat d2/d16/d29/f4c x:0 0 0
2026-03-10T12:37:39.893 INFO:tasks.workunit.client.0.vm00.stdout:6/202: fsync d2/da/dc/f25 0
2026-03-10T12:37:39.897 INFO:tasks.workunit.client.0.vm00.stdout:6/203: link d2/d16/f19 d2/d16/d29/d31/d48/f4d 0
2026-03-10T12:37:39.916 INFO:tasks.workunit.client.1.vm07.stdout:9/249: read d5/f45 [11572724,8898] 0
2026-03-10T12:37:39.919 INFO:tasks.workunit.client.1.vm07.stdout:9/250: creat d5/d1f/d31/f56 x:0 0 0
2026-03-10T12:37:39.923 INFO:tasks.workunit.client.0.vm00.stdout:7/123: write da/d26/f27 [4931151,125241] 0
2026-03-10T12:37:39.923 INFO:tasks.workunit.client.0.vm00.stdout:7/124: mknod da/d25/c36 0
2026-03-10T12:37:39.924 INFO:tasks.workunit.client.1.vm07.stdout:9/251: write d5/d16/f34 [994732,17771] 0
2026-03-10T12:37:39.924 INFO:tasks.workunit.client.1.vm07.stdout:9/252: write d5/d13/d22/f39 [3128974,76505] 0
2026-03-10T12:37:39.931 INFO:tasks.workunit.client.1.vm07.stdout:9/253: dwrite d5/d1f/d31/f56 [0,4194304] 0
2026-03-10T12:37:39.970 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:39 vm07.local ceph-mon[58582]: pgmap v155: 65 pgs: 65 active+clean; 724 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 4.7 MiB/s rd, 80 MiB/s wr, 275 op/s
2026-03-10T12:37:39.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:39 vm00.local ceph-mon[50686]: pgmap v155: 65 pgs: 65 active+clean; 724 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 4.7 MiB/s rd, 80 MiB/s wr, 275 op/s
2026-03-10T12:37:39.994 INFO:tasks.workunit.client.0.vm00.stdout:1/92: dread f3 [0,4194304] 0
2026-03-10T12:37:39.995 INFO:tasks.workunit.client.0.vm00.stdout:1/93: mknod da/d12/c25 0
2026-03-10T12:37:39.995 INFO:tasks.workunit.client.1.vm07.stdout:0/340: write d0/d14/f36 [184503,122957] 0
2026-03-10T12:37:39.995 INFO:tasks.workunit.client.0.vm00.stdout:1/94: mkdir da/d12/d26 0
2026-03-10T12:37:39.996 INFO:tasks.workunit.client.0.vm00.stdout:1/95: dread - da/f13 zero size
2026-03-10T12:37:39.997 INFO:tasks.workunit.client.0.vm00.stdout:1/96: mkdir da/d21/d27 0
2026-03-10T12:37:40.003 INFO:tasks.workunit.client.0.vm00.stdout:1/97: mkdir da/d24/d28 0
2026-03-10T12:37:40.003 INFO:tasks.workunit.client.0.vm00.stdout:5/90: write f12 [339488,1618] 0
2026-03-10T12:37:40.003 INFO:tasks.workunit.client.1.vm07.stdout:1/282: dread d9/fd [0,4194304] 0
2026-03-10T12:37:40.004 INFO:tasks.workunit.client.0.vm00.stdout:5/91: symlink l1c 0
2026-03-10T12:37:40.004 INFO:tasks.workunit.client.0.vm00.stdout:1/98: mknod da/d24/d28/c29 0
2026-03-10T12:37:40.007 INFO:tasks.workunit.client.1.vm07.stdout:1/283: mkdir d9/d2d/d4f/d5a 0
2026-03-10T12:37:40.007 INFO:tasks.workunit.client.0.vm00.stdout:1/99: symlink da/d24/d28/l2a 0
2026-03-10T12:37:40.008 INFO:tasks.workunit.client.0.vm00.stdout:5/92: dread f11 [0,4194304] 0
2026-03-10T12:37:40.010 INFO:tasks.workunit.client.0.vm00.stdout:1/100: dread da/d12/f1f [0,4194304] 0
2026-03-10T12:37:40.010 INFO:tasks.workunit.client.1.vm07.stdout:1/284: read d9/f36 [2167959,96891] 0
2026-03-10T12:37:40.024 INFO:tasks.workunit.client.0.vm00.stdout:1/101: write da/d12/f20 [491087,11158] 0
2026-03-10T12:37:40.024 INFO:tasks.workunit.client.0.vm00.stdout:1/102: chown da/d24/d28/l2a 15 1
2026-03-10T12:37:40.024 INFO:tasks.workunit.client.0.vm00.stdout:5/93: symlink l1d 0
2026-03-10T12:37:40.024 INFO:tasks.workunit.client.0.vm00.stdout:5/94: rename ff to f1e 0
2026-03-10T12:37:40.024 INFO:tasks.workunit.client.0.vm00.stdout:5/95: mkdir d1f 0
2026-03-10T12:37:40.024 INFO:tasks.workunit.client.1.vm07.stdout:1/285: dwrite d9/df/d29/d2b/f32 [4194304,4194304] 0
2026-03-10T12:37:40.024 INFO:tasks.workunit.client.1.vm07.stdout:1/286: write d9/df/d29/f49 [368695,18715] 0
2026-03-10T12:37:40.024 INFO:tasks.workunit.client.1.vm07.stdout:1/287: mknod d9/df/d54/c5b 0
2026-03-10T12:37:40.024 INFO:tasks.workunit.client.1.vm07.stdout:1/288: write d9/df/d29/f49 [1018071,86557] 0
2026-03-10T12:37:40.024 INFO:tasks.workunit.client.1.vm07.stdout:1/289: chown d9/df/d29/d2b/d3d/c46 0 1
2026-03-10T12:37:40.042 INFO:tasks.workunit.client.0.vm00.stdout:8/98: getdents d0/dd 0
2026-03-10T12:37:40.044 INFO:tasks.workunit.client.1.vm07.stdout:1/290: dread d9/f36 [0,4194304] 0
2026-03-10T12:37:40.044 INFO:tasks.workunit.client.1.vm07.stdout:1/291: stat d9/df/d29/d2b/d3d/f43 0
2026-03-10T12:37:40.047 INFO:tasks.workunit.client.0.vm00.stdout:8/99: rmdir d0/dd/d1c 0
2026-03-10T12:37:40.049 INFO:tasks.workunit.client.1.vm07.stdout:1/292: mknod d9/d2d/c5c 0
2026-03-10T12:37:40.049 INFO:tasks.workunit.client.1.vm07.stdout:3/324: dwrite f1 [0,4194304] 0
2026-03-10T12:37:40.052 INFO:tasks.workunit.client.0.vm00.stdout:8/100: dwrite d0/f8 [0,4194304] 0
2026-03-10T12:37:40.055 INFO:tasks.workunit.client.1.vm07.stdout:1/293: mknod d9/df/d54/c5d 0
2026-03-10T12:37:40.056 INFO:tasks.workunit.client.1.vm07.stdout:3/325: write dc/dd/d1f/d45/f50 [921256,71768] 0
2026-03-10T12:37:40.058 INFO:tasks.workunit.client.1.vm07.stdout:0/341: sync
2026-03-10T12:37:40.058 INFO:tasks.workunit.client.0.vm00.stdout:8/101: dread d0/f13 [0,4194304] 0
2026-03-10T12:37:40.063 INFO:tasks.workunit.client.0.vm00.stdout:8/102: dwrite d0/f8 [0,4194304] 0
2026-03-10T12:37:40.069 INFO:tasks.workunit.client.0.vm00.stdout:8/103: creat d0/d12/d17/f1d x:0 0 0
2026-03-10T12:37:40.070 INFO:tasks.workunit.client.1.vm07.stdout:0/342: sync
2026-03-10T12:37:40.070 INFO:tasks.workunit.client.0.vm00.stdout:8/104: rename d0/l5 to d0/dd/l1e 0
2026-03-10T12:37:40.072 INFO:tasks.workunit.client.1.vm07.stdout:0/343: readlink d0/d14/d1a/d2f/d31/d4f/l53 0
2026-03-10T12:37:40.072 INFO:tasks.workunit.client.0.vm00.stdout:8/105: dread d0/f13 [0,4194304] 0
2026-03-10T12:37:40.077 INFO:tasks.workunit.client.1.vm07.stdout:0/344: getdents d0/d14/d1a/d2f 0
2026-03-10T12:37:40.078 INFO:tasks.workunit.client.1.vm07.stdout:0/345: mkdir d0/d14/d1a/d2f/d31/d6b 0
2026-03-10T12:37:40.079 INFO:tasks.workunit.client.1.vm07.stdout:0/346: dread - d0/d14/d1a/d2f/f5d zero size
2026-03-10T12:37:40.079 INFO:tasks.workunit.client.0.vm00.stdout:8/106: mknod d0/d12/c1f 0
2026-03-10T12:37:40.080 INFO:tasks.workunit.client.0.vm00.stdout:8/107: fdatasync d0/d12/d17/f1d 0
2026-03-10T12:37:40.081 INFO:tasks.workunit.client.0.vm00.stdout:8/108: read d0/f10 [6058996,49112] 0
2026-03-10T12:37:40.081 INFO:tasks.workunit.client.0.vm00.stdout:8/109: dread - d0/f19 zero size
2026-03-10T12:37:40.083 INFO:tasks.workunit.client.0.vm00.stdout:8/110: rename d0/f13 to d0/dd/f20 0
2026-03-10T12:37:40.095 INFO:tasks.workunit.client.0.vm00.stdout:8/111: dread d0/f7 [0,4194304] 0
2026-03-10T12:37:40.095 INFO:tasks.workunit.client.0.vm00.stdout:8/112: symlink d0/d12/d17/l21 0
2026-03-10T12:37:40.095 INFO:tasks.workunit.client.0.vm00.stdout:8/113: write d0/f8 [1071822,119012] 0
2026-03-10T12:37:40.095 INFO:tasks.workunit.client.0.vm00.stdout:8/114: rename d0/f19 to d0/f22 0
2026-03-10T12:37:40.096 INFO:tasks.workunit.client.0.vm00.stdout:8/115: readlink d0/l6 0
2026-03-10T12:37:40.096 INFO:tasks.workunit.client.1.vm07.stdout:2/214: dwrite d0/f12 [0,4194304] 0
2026-03-10T12:37:40.099 INFO:tasks.workunit.client.1.vm07.stdout:2/215: dread - d0/d42/d26/f48 zero size
2026-03-10T12:37:40.110 INFO:tasks.workunit.client.1.vm07.stdout:1/294: dread d9/fe [0,4194304] 0
2026-03-10T12:37:40.113 INFO:tasks.workunit.client.1.vm07.stdout:1/295: creat d9/d2d/d4f/f5e x:0 0 0
2026-03-10T12:37:40.121 INFO:tasks.workunit.client.1.vm07.stdout:1/296: symlink d9/df/d29/d2b/d31/l5f 0
2026-03-10T12:37:40.134 INFO:tasks.workunit.client.1.vm07.stdout:9/254: rename d5/d13/d2c/d2f to d5/d13/d57 0
2026-03-10T12:37:40.134 INFO:tasks.workunit.client.1.vm07.stdout:9/255: dread - d5/d16/f35 zero size
2026-03-10T12:37:40.134 INFO:tasks.workunit.client.1.vm07.stdout:1/297: symlink d9/d2d/d4f/d5a/l60 0
2026-03-10T12:37:40.134 INFO:tasks.workunit.client.1.vm07.stdout:1/298: chown d9/df/d29/d2b/d31/f35 0 1
2026-03-10T12:37:40.134 INFO:tasks.workunit.client.1.vm07.stdout:0/347: rename d0/d62/d65/f56 to d0/d14/d5f/d3b/f6c 0
2026-03-10T12:37:40.134 INFO:tasks.workunit.client.1.vm07.stdout:9/256: creat d5/d13/d57/d4f/f58 x:0 0 0
2026-03-10T12:37:40.134 INFO:tasks.workunit.client.1.vm07.stdout:1/299: write d9/f19 [1871795,198] 0
2026-03-10T12:37:40.134 INFO:tasks.workunit.client.1.vm07.stdout:1/300: chown d9/d2d/c5c 30 1
2026-03-10T12:37:40.135 INFO:tasks.workunit.client.1.vm07.stdout:1/301: stat d9/df/c39 0
2026-03-10T12:37:40.142 INFO:tasks.workunit.client.1.vm07.stdout:4/317: dwrite d0/d4/d10/d3c/d2b/d2d/f65 [4194304,4194304] 0
2026-03-10T12:37:40.145 INFO:tasks.workunit.client.1.vm07.stdout:8/322: write d1/f2 [2861435,79250] 0
2026-03-10T12:37:40.157 INFO:tasks.workunit.client.0.vm00.stdout:5/96: fdatasync f16 0
2026-03-10T12:37:40.159 INFO:tasks.workunit.client.0.vm00.stdout:5/97: rename c13 to d1f/c20 0
2026-03-10T12:37:40.159 INFO:tasks.workunit.client.1.vm07.stdout:2/216: dread d0/d42/d1f/d20/f2b [0,4194304] 0
2026-03-10T12:37:40.167 INFO:tasks.workunit.client.1.vm07.stdout:4/318: mkdir d0/d4/d10/d5f/d6d 0
2026-03-10T12:37:40.170 INFO:tasks.workunit.client.1.vm07.stdout:8/323: mkdir d1/d3/d6c 0
2026-03-10T12:37:40.170 INFO:tasks.workunit.client.1.vm07.stdout:2/217: write d0/d42/d26/f27 [1597824,95312] 0
2026-03-10T12:37:40.171 INFO:tasks.workunit.client.1.vm07.stdout:2/218: fdatasync d0/f4a 0
2026-03-10T12:37:40.174 INFO:tasks.workunit.client.1.vm07.stdout:1/302: creat d9/f61 x:0 0 0
2026-03-10T12:37:40.184 INFO:tasks.workunit.client.0.vm00.stdout:3/115: truncate dd/f15 2164431 0
2026-03-10T12:37:40.184 INFO:tasks.workunit.client.0.vm00.stdout:4/88: write fb [1639174,9320] 0
2026-03-10T12:37:40.185 INFO:tasks.workunit.client.0.vm00.stdout:3/116: chown dd/d18/d13/c1c 347 1
2026-03-10T12:37:40.185 INFO:tasks.workunit.client.0.vm00.stdout:3/117: chown dd/f15 125032370 1
2026-03-10T12:37:40.187 INFO:tasks.workunit.client.0.vm00.stdout:4/89: creat df/f1b x:0 0 0
2026-03-10T12:37:40.188 INFO:tasks.workunit.client.0.vm00.stdout:4/90: truncate df/f19 284114 0
2026-03-10T12:37:40.189 INFO:tasks.workunit.client.0.vm00.stdout:3/118: mkdir dd/d18/d14/d2b 0
2026-03-10T12:37:40.189 INFO:tasks.workunit.client.1.vm07.stdout:8/324: unlink d1/d3/l62 0
2026-03-10T12:37:40.192 INFO:tasks.workunit.client.1.vm07.stdout:1/303: fsync d9/fe 0
2026-03-10T12:37:40.194 INFO:tasks.workunit.client.0.vm00.stdout:3/119: unlink dd/d18/f21 0
2026-03-10T12:37:40.195 INFO:tasks.workunit.client.0.vm00.stdout:3/120: dread dd/f16 [0,4194304] 0
2026-03-10T12:37:40.196 INFO:tasks.workunit.client.1.vm07.stdout:0/348: link d0/d14/d1a/d2f/d31/l3c d0/d14/d1a/d2f/d31/d4f/d60/l6d 0
2026-03-10T12:37:40.196 INFO:tasks.workunit.client.0.vm00.stdout:2/98: dread d4/dd/f17 [0,4194304] 0
2026-03-10T12:37:40.197 INFO:tasks.workunit.client.0.vm00.stdout:2/99: chown d4/dd/f17 3594266 1
2026-03-10T12:37:40.197 INFO:tasks.workunit.client.0.vm00.stdout:3/121: dread dd/f16 [0,4194304] 0
2026-03-10T12:37:40.198 INFO:tasks.workunit.client.1.vm07.stdout:2/219: rmdir d0/d42/d1f/d20/d49 0
2026-03-10T12:37:40.199 INFO:tasks.workunit.client.1.vm07.stdout:1/304: dwrite d9/df/d54/f57 [0,4194304] 0
2026-03-10T12:37:40.203 INFO:tasks.workunit.client.1.vm07.stdout:0/349: getdents d0/d14/d1a/d2f 0
2026-03-10T12:37:40.209 INFO:tasks.workunit.client.0.vm00.stdout:4/91: dwrite df/f11 [0,4194304] 0
2026-03-10T12:37:40.214 INFO:tasks.workunit.client.0.vm00.stdout:3/122: mkdir dd/d27/d2c 0
2026-03-10T12:37:40.214 INFO:tasks.workunit.client.1.vm07.stdout:1/305: truncate d9/df/d29/f49 2095938 0
2026-03-10T12:37:40.214 INFO:tasks.workunit.client.1.vm07.stdout:0/350: dread d0/d14/d5f/d3b/f46 [0,4194304] 0
2026-03-10T12:37:40.214 INFO:tasks.workunit.client.1.vm07.stdout:0/351: dwrite d0/d14/d5f/f54 [0,4194304] 0
2026-03-10T12:37:40.214 INFO:tasks.workunit.client.0.vm00.stdout:2/100: truncate d4/dd/f17 905508 0
2026-03-10T12:37:40.214 INFO:tasks.workunit.client.0.vm00.stdout:0/212: write d3/db/f45 [157696,52584] 0
2026-03-10T12:37:40.216 INFO:tasks.workunit.client.0.vm00.stdout:0/213: dwrite d3/d1b/d38/d44/f49 [0,4194304] 0
2026-03-10T12:37:40.224 INFO:tasks.workunit.client.0.vm00.stdout:2/101: unlink d4/f12 0
2026-03-10T12:37:40.226 INFO:tasks.workunit.client.0.vm00.stdout:9/91: rmdir d0/d5/d16 39
2026-03-10T12:37:40.234 INFO:tasks.workunit.client.0.vm00.stdout:6/204: rmdir d2/d39 39
2026-03-10T12:37:40.234 INFO:tasks.workunit.client.0.vm00.stdout:6/205: fsync d2/d16/f2a 0
2026-03-10T12:37:40.235 INFO:tasks.workunit.client.0.vm00.stdout:6/206: fdatasync d2/da/dc/f45 0
2026-03-10T12:37:40.235 INFO:tasks.workunit.client.0.vm00.stdout:6/207: truncate d2/da/dc/f27 47441 0
2026-03-10T12:37:40.240 INFO:tasks.workunit.client.1.vm07.stdout:0/352: creat d0/d62/f6e x:0 0 0
2026-03-10T12:37:40.240 INFO:tasks.workunit.client.1.vm07.stdout:5/317: read d0/d22/d18/f4c [3862311,127780] 0
2026-03-10T12:37:40.243 INFO:tasks.workunit.client.1.vm07.stdout:0/353: dread d0/d14/d5f/d3b/f6c [0,4194304] 0
2026-03-10T12:37:40.245 INFO:tasks.workunit.client.1.vm07.stdout:1/306: link c4 d9/df/d29/d2c/d59/c62 0
2026-03-10T12:37:40.245 INFO:tasks.workunit.client.0.vm00.stdout:9/92: dwrite d0/d5/d16/d19/f20 [0,4194304] 0
2026-03-10T12:37:40.247 INFO:tasks.workunit.client.0.vm00.stdout:9/93: stat d0/d5/d16/d19 0
2026-03-10T12:37:40.247 INFO:tasks.workunit.client.1.vm07.stdout:0/354: write d0/d14/d1a/d2f/d31/f4d [3188693,112159] 0
2026-03-10T12:37:40.247 INFO:tasks.workunit.client.0.vm00.stdout:9/94: read d0/d5/d16/d19/f1b [1916466,21617] 0
2026-03-10T12:37:40.248 INFO:tasks.workunit.client.1.vm07.stdout:0/355: write d0/d14/d5f/d41/f55 [1147280,68316] 0
2026-03-10T12:37:40.249 INFO:tasks.workunit.client.0.vm00.stdout:4/92: creat df/f1c x:0 0 0
2026-03-10T12:37:40.255 INFO:tasks.workunit.client.0.vm00.stdout:7/125: write f8 [129695,106744] 0
2026-03-10T12:37:40.263 INFO:tasks.workunit.client.1.vm07.stdout:8/325: dread d1/d3/d18/f32 [0,4194304] 0
2026-03-10T12:37:40.263 INFO:tasks.workunit.client.1.vm07.stdout:5/318: mkdir d0/d22/d18/d19/d72 0
2026-03-10T12:37:40.263 INFO:tasks.workunit.client.0.vm00.stdout:2/102: mknod d4/dd/c20 0
2026-03-10T12:37:40.263 INFO:tasks.workunit.client.0.vm00.stdout:7/126: write da/d25/f2b [823270,94998] 0
2026-03-10T12:37:40.263 INFO:tasks.workunit.client.0.vm00.stdout:7/127: dwrite da/fe [0,4194304] 0
2026-03-10T12:37:40.264 INFO:tasks.workunit.client.0.vm00.stdout:9/95: creat d0/f21 x:0 0 0
2026-03-10T12:37:40.265 INFO:tasks.workunit.client.0.vm00.stdout:9/96: fsync d0/f1a 0
2026-03-10T12:37:40.265 INFO:tasks.workunit.client.0.vm00.stdout:3/123: link dd/d18/l29 dd/d18/d13/d1d/l2d 0
2026-03-10T12:37:40.265 INFO:tasks.workunit.client.0.vm00.stdout:9/97: dread - d0/f1a zero size
2026-03-10T12:37:40.268 INFO:tasks.workunit.client.0.vm00.stdout:7/128: dwrite f1 [0,4194304] 0
2026-03-10T12:37:40.269 INFO:tasks.workunit.client.1.vm07.stdout:5/319: dwrite d0/f1e [0,4194304] 0
2026-03-10T12:37:40.271 INFO:tasks.workunit.client.0.vm00.stdout:6/208: fdatasync d2/da/dc/fd 0
2026-03-10T12:37:40.272 INFO:tasks.workunit.client.1.vm07.stdout:5/320: write d0/f70 [67716,48592] 0
2026-03-10T12:37:40.276 INFO:tasks.workunit.client.0.vm00.stdout:4/93: mknod df/c1d 0
2026-03-10T12:37:40.280 INFO:tasks.workunit.client.0.vm00.stdout:9/98: mknod d0/d5/d16/d19/c22 0
2026-03-10T12:37:40.283 INFO:tasks.workunit.client.0.vm00.stdout:7/129: mkdir da/d26/d37 0
2026-03-10T12:37:40.283 INFO:tasks.workunit.client.1.vm07.stdout:8/326: chown d1/d3/c23 20602 1
2026-03-10T12:37:40.284 INFO:tasks.workunit.client.0.vm00.stdout:7/130: write da/fb [2197981,69135] 0
2026-03-10T12:37:40.285 INFO:tasks.workunit.client.0.vm00.stdout:6/209: mknod d2/d16/d29/d31/d48/c4e 0
2026-03-10T12:37:40.287 INFO:tasks.workunit.client.1.vm07.stdout:5/321: unlink d0/d22/d18/d19/d36/c6f 0
2026-03-10T12:37:40.290 INFO:tasks.workunit.client.0.vm00.stdout:2/103: mknod d4/dd/c21 0
2026-03-10T12:37:40.290 INFO:tasks.workunit.client.0.vm00.stdout:4/94: write df/f16 [118865,63136] 0
2026-03-10T12:37:40.290 INFO:tasks.workunit.client.0.vm00.stdout:4/95: write fa [5083431,89563] 0
2026-03-10T12:37:40.293 INFO:tasks.workunit.client.0.vm00.stdout:9/99: mknod d0/d5/d16/d19/c23 0
2026-03-10T12:37:40.295 INFO:tasks.workunit.client.0.vm00.stdout:7/131: creat da/d1b/d2d/f38 x:0 0 0
2026-03-10T12:37:40.295 INFO:tasks.workunit.client.0.vm00.stdout:7/132: stat da/f16 0
2026-03-10T12:37:40.299 INFO:tasks.workunit.client.1.vm07.stdout:5/322: creat d0/d22/d18/d19/d2e/d3f/d63/f73 x:0 0 0
2026-03-10T12:37:40.301 INFO:tasks.workunit.client.1.vm07.stdout:0/356: getdents d0/d14 0
2026-03-10T12:37:40.302 INFO:tasks.workunit.client.0.vm00.stdout:4/96: unlink df/c15 0
2026-03-10T12:37:40.302 INFO:tasks.workunit.client.0.vm00.stdout:4/97: chown df/f1c 6777330 1
2026-03-10T12:37:40.307 INFO:tasks.workunit.client.1.vm07.stdout:5/323: unlink d0/d22/d18/d19/d21/d3a/c68 0
2026-03-10T12:37:40.307 INFO:tasks.workunit.client.1.vm07.stdout:0/357: creat d0/d14/d1a/d2f/d31/f6f x:0 0 0
2026-03-10T12:37:40.307 INFO:tasks.workunit.client.0.vm00.stdout:7/133: creat da/d1b/f39 x:0 0 0
2026-03-10T12:37:40.307 INFO:tasks.workunit.client.0.vm00.stdout:9/100: creat d0/d5/d16/f24 x:0 0 0
2026-03-10T12:37:40.308 INFO:tasks.workunit.client.0.vm00.stdout:7/134: fsync da/d1b/f22 0
2026-03-10T12:37:40.310 INFO:tasks.workunit.client.1.vm07.stdout:0/358: read d0/d14/d1a/f27 [3718870,25335] 0
2026-03-10T12:37:40.312 INFO:tasks.workunit.client.0.vm00.stdout:6/210: fdatasync d2/d16/f19 0
2026-03-10T12:37:40.312 INFO:tasks.workunit.client.0.vm00.stdout:2/104: fdatasync d4/dd/ff 0
2026-03-10T12:37:40.313 INFO:tasks.workunit.client.0.vm00.stdout:4/98: dwrite df/f19 [0,4194304] 0
2026-03-10T12:37:40.315 INFO:tasks.workunit.client.0.vm00.stdout:4/99: write df/f1b [243859,65729] 0
2026-03-10T12:37:40.315 INFO:tasks.workunit.client.0.vm00.stdout:2/105: stat d4/d6/f16 0
2026-03-10T12:37:40.321 INFO:tasks.workunit.client.1.vm07.stdout:0/359: creat d0/d14/d1a/d2f/d31/d4f/f70 x:0 0 0
2026-03-10T12:37:40.322 INFO:tasks.workunit.client.0.vm00.stdout:7/135: dread da/d1b/f22 [0,4194304] 0
2026-03-10T12:37:40.323 INFO:tasks.workunit.client.1.vm07.stdout:9/257: dread d5/f1a [0,4194304] 0
2026-03-10T12:37:40.323 INFO:tasks.workunit.client.0.vm00.stdout:6/211: dwrite d2/d16/f17 [0,4194304] 0
2026-03-10T12:37:40.325 INFO:tasks.workunit.client.0.vm00.stdout:7/136: write da/fb [3859063,125561] 0
2026-03-10T12:37:40.327 INFO:tasks.workunit.client.1.vm07.stdout:0/360: mknod d0/d14/d5f/d41/c71 0
2026-03-10T12:37:40.329 INFO:tasks.workunit.client.0.vm00.stdout:4/100: dwrite f3 [4194304,4194304] 0
2026-03-10T12:37:40.334 INFO:tasks.workunit.client.1.vm07.stdout:9/258: rename d5/d16/f34 to d5/d13/f59 0
2026-03-10T12:37:40.338 INFO:tasks.workunit.client.0.vm00.stdout:2/106: rename d4/d6/fb to d4/d6/f22 0
2026-03-10T12:37:40.338 INFO:tasks.workunit.client.1.vm07.stdout:9/259: dread - d5/d1f/f4e zero size
2026-03-10T12:37:40.338 INFO:tasks.workunit.client.1.vm07.stdout:9/260: dread - d5/d1f/f3d zero size
2026-03-10T12:37:40.338 INFO:tasks.workunit.client.1.vm07.stdout:5/324: getdents d0/d22/d18/d3e/d5d 0
2026-03-10T12:37:40.338 INFO:tasks.workunit.client.1.vm07.stdout:9/261: write d5/d1f/f4e [713290,23313] 0
2026-03-10T12:37:40.338 INFO:tasks.workunit.client.1.vm07.stdout:5/325: read - d0/d22/d18/d19/d21/f61 zero size
2026-03-10T12:37:40.338 INFO:tasks.workunit.client.1.vm07.stdout:9/262: dread d5/f1a [0,4194304] 0
2026-03-10T12:37:40.338 INFO:tasks.workunit.client.0.vm00.stdout:2/107: unlink d4/l11 0
2026-03-10T12:37:40.340 INFO:tasks.workunit.client.1.vm07.stdout:0/361: unlink d0/d14/f69 0
2026-03-10T12:37:40.340 INFO:tasks.workunit.client.1.vm07.stdout:9/263: chown d5/d16/d18 2177 1
2026-03-10T12:37:40.341 INFO:tasks.workunit.client.0.vm00.stdout:2/108: mknod d4/c23 0
2026-03-10T12:37:40.343 INFO:tasks.workunit.client.1.vm07.stdout:1/307: dread d9/df/d29/d2b/d30/f38 [0,4194304] 0
2026-03-10T12:37:40.353 INFO:tasks.workunit.client.1.vm07.stdout:5/326: mknod d0/d22/d18/d19/d36/c74 0
2026-03-10T12:37:40.354 INFO:tasks.workunit.client.1.vm07.stdout:5/327: read d0/f1e [3423808,102128] 0
2026-03-10T12:37:40.357 INFO:tasks.workunit.client.1.vm07.stdout:1/308: write d9/fd [1426093,60989] 0
2026-03-10T12:37:40.359 INFO:tasks.workunit.client.1.vm07.stdout:3/326: dwrite dc/dd/d1f/f27 [0,4194304] 0
2026-03-10T12:37:40.365 INFO:tasks.workunit.client.1.vm07.stdout:5/328: dwrite d0/fa [0,4194304] 0
2026-03-10T12:37:40.365 INFO:tasks.workunit.client.1.vm07.stdout:5/329: truncate d0/f1f 1183071 0
2026-03-10T12:37:40.368 INFO:tasks.workunit.client.1.vm07.stdout:3/327: dwrite dc/dd/d1f/d45/f56 [0,4194304] 0
2026-03-10T12:37:40.379 INFO:tasks.workunit.client.1.vm07.stdout:3/328: mkdir dc/dd/d43/d76 0
2026-03-10T12:37:40.380 INFO:tasks.workunit.client.0.vm00.stdout:7/137: dread da/f10 [4194304,4194304] 0
2026-03-10T12:37:40.381 INFO:tasks.workunit.client.0.vm00.stdout:7/138: dread - da/f35 zero size
2026-03-10T12:37:40.382 INFO:tasks.workunit.client.1.vm07.stdout:1/309: rename d9/df/c1d to d9/df/c63 0
2026-03-10T12:37:40.385 INFO:tasks.workunit.client.1.vm07.stdout:1/310: fdatasync d9/df/d29/d2b/f4e 0
2026-03-10T12:37:40.387 INFO:tasks.workunit.client.1.vm07.stdout:1/311: unlink d9/df/d29/d2b/d30/l44 0
2026-03-10T12:37:40.388 INFO:tasks.workunit.client.0.vm00.stdout:5/98: rename f1e to d1f/f21 0
2026-03-10T12:37:40.389 INFO:tasks.workunit.client.0.vm00.stdout:1/103: truncate da/fc 814527 0
2026-03-10T12:37:40.391 INFO:tasks.workunit.client.0.vm00.stdout:1/104: symlink da/d21/l2b 0
2026-03-10T12:37:40.392 INFO:tasks.workunit.client.0.vm00.stdout:0/214: rename d3/l5 to d3/d1b/d38/l4f 0
2026-03-10T12:37:40.392 INFO:tasks.workunit.client.0.vm00.stdout:0/215: fdatasync d3/db/d24/d25/f43 0
2026-03-10T12:37:40.393 INFO:tasks.workunit.client.1.vm07.stdout:3/329: getdents dc/dd/d43/d5c 0
2026-03-10T12:37:40.393 INFO:tasks.workunit.client.0.vm00.stdout:0/216: dread - d3/d33/f4d zero size
2026-03-10T12:37:40.393 INFO:tasks.workunit.client.0.vm00.stdout:0/217: write d3/d22/f2e [2424373,81225] 0
2026-03-10T12:37:40.395 INFO:tasks.workunit.client.0.vm00.stdout:1/105: symlink da/d24/l2c 0
2026-03-10T12:37:40.397 INFO:tasks.workunit.client.1.vm07.stdout:3/330: dwrite dc/dd/d1f/f30 [0,4194304] 0
2026-03-10T12:37:40.397 INFO:tasks.workunit.client.0.vm00.stdout:5/99: link f11 d1f/f22 0
2026-03-10T12:37:40.398 INFO:tasks.workunit.client.1.vm07.stdout:1/312: unlink d9/df/d29/d2b/d31/c41 0
2026-03-10T12:37:40.399 INFO:tasks.workunit.client.0.vm00.stdout:9/101: rename d0/d5/d16/l12 to d0/d5/dc/l25 0
2026-03-10T12:37:40.400 INFO:tasks.workunit.client.0.vm00.stdout:0/218: dwrite d3/db/d24/d25/f43 [0,4194304] 0
2026-03-10T12:37:40.404 INFO:tasks.workunit.client.0.vm00.stdout:5/100: readlink l17 0
2026-03-10T12:37:40.409 INFO:tasks.workunit.client.0.vm00.stdout:6/212: rename d2/da/dc/f28 to d2/da/dc/d2f/f4f 0
2026-03-10T12:37:40.410 INFO:tasks.workunit.client.0.vm00.stdout:6/213: stat d2/d14/f3b 0
2026-03-10T12:37:40.413 INFO:tasks.workunit.client.1.vm07.stdout:1/313: rename d9/d2d/c5c to d9/df/d29/d2c/c64 0
2026-03-10T12:37:40.425 INFO:tasks.workunit.client.0.vm00.stdout:6/214: symlink d2/da/dc/l50 0
2026-03-10T12:37:40.425 INFO:tasks.workunit.client.0.vm00.stdout:0/219: dwrite d3/d1b/f2b [0,4194304] 0
2026-03-10T12:37:40.425 INFO:tasks.workunit.client.0.vm00.stdout:9/102: rename d0/d5/d16/f1c to d0/d5/f26 0
2026-03-10T12:37:40.425 INFO:tasks.workunit.client.0.vm00.stdout:9/103: dwrite d0/f21 [0,4194304] 0
2026-03-10T12:37:40.425 INFO:tasks.workunit.client.0.vm00.stdout:9/104: stat d0/d5/c14 0
2026-03-10T12:37:40.426 INFO:tasks.workunit.client.1.vm07.stdout:1/314: write d9/f52 [481123,15578] 0
2026-03-10T12:37:40.426 INFO:tasks.workunit.client.1.vm07.stdout:3/331: symlink dc/d18/d24/d72/l77 0
2026-03-10T12:37:40.426 INFO:tasks.workunit.client.1.vm07.stdout:1/315: rename d9/df/d29/d2b/d31/f35 to d9/d2d/d4f/d5a/f65 0
2026-03-10T12:37:40.426 INFO:tasks.workunit.client.1.vm07.stdout:1/316: truncate d9/f61 788406 0
2026-03-10T12:37:40.426 INFO:tasks.workunit.client.1.vm07.stdout:1/317: fsync d9/df/d29/f49 0
2026-03-10T12:37:40.426 INFO:tasks.workunit.client.1.vm07.stdout:6/260: write d1/d4/d6/f13 [3366776,130437] 0
2026-03-10T12:37:40.431 INFO:tasks.workunit.client.0.vm00.stdout:5/101: rename c18 to d1f/c23 0
2026-03-10T12:37:40.435 INFO:tasks.workunit.client.0.vm00.stdout:5/102: symlink d1f/l24 0
2026-03-10T12:37:40.436 INFO:tasks.workunit.client.0.vm00.stdout:5/103: stat c6 0
2026-03-10T12:37:40.436 INFO:tasks.workunit.client.0.vm00.stdout:9/105: mkdir d0/d5/d16/d1e/d27 0
2026-03-10T12:37:40.437 INFO:tasks.workunit.client.0.vm00.stdout:5/104: rmdir d1f 39
2026-03-10T12:37:40.439 INFO:tasks.workunit.client.1.vm07.stdout:4/319: dwrite d0/d4/d10/d23/f50 [0,4194304] 0
2026-03-10T12:37:40.441 INFO:tasks.workunit.client.0.vm00.stdout:9/106: dwrite d0/f17 [0,4194304] 0
2026-03-10T12:37:40.445 INFO:tasks.workunit.client.0.vm00.stdout:9/107: chown d0/f21 46 1
2026-03-10T12:37:40.466 INFO:tasks.workunit.client.0.vm00.stdout:9/108: creat d0/d5/d16/d1e/d27/f28 x:0 0 0
2026-03-10T12:37:40.466 INFO:tasks.workunit.client.0.vm00.stdout:9/109: mknod d0/d5/dc/c29 0
2026-03-10T12:37:40.466 INFO:tasks.workunit.client.0.vm00.stdout:9/110: creat d0/d5/dc/f2a x:0 0 0
2026-03-10T12:37:40.466 INFO:tasks.workunit.client.0.vm00.stdout:9/111: mkdir d0/d5/d16/d1e/d2b 0
2026-03-10T12:37:40.466 INFO:tasks.workunit.client.0.vm00.stdout:9/112: symlink d0/l2c 0
2026-03-10T12:37:40.466 INFO:tasks.workunit.client.0.vm00.stdout:9/113: write d0/f4 [1610998,111410] 0
2026-03-10T12:37:40.466 INFO:tasks.workunit.client.0.vm00.stdout:9/114: readlink d0/d5/dc/l18 0
2026-03-10T12:37:40.466 INFO:tasks.workunit.client.0.vm00.stdout:9/115: creat d0/d5/dc/f2d x:0 0 0
2026-03-10T12:37:40.479 INFO:tasks.workunit.client.1.vm07.stdout:3/332: sync
2026-03-10T12:37:40.480 INFO:tasks.workunit.client.1.vm07.stdout:2/220: dwrite d0/d42/d26/f2e [4194304,4194304] 0
2026-03-10T12:37:40.481 INFO:tasks.workunit.client.1.vm07.stdout:2/221: write d0/f44 [587383,39136] 0
2026-03-10T12:37:40.484 INFO:tasks.workunit.client.1.vm07.stdout:3/333: rename dc/dd/d43/d5c/l69 to dc/d18/d2d/l78 0
2026-03-10T12:37:40.488 INFO:tasks.workunit.client.1.vm07.stdout:3/334: chown dc/d18/d2d 51668012 1
2026-03-10T12:37:40.490 INFO:tasks.workunit.client.1.vm07.stdout:3/335: chown dc/dd/d28 83 1
2026-03-10T12:37:40.492 INFO:tasks.workunit.client.1.vm07.stdout:3/336: truncate dc/dd/d28/d3b/f4c 4227905 0
2026-03-10T12:37:40.495 INFO:tasks.workunit.client.1.vm07.stdout:3/337: chown dc/dd/d43/c64 2459 1
2026-03-10T12:37:40.501 INFO:tasks.workunit.client.1.vm07.stdout:3/338: creat dc/d18/f79 x:0 0 0
2026-03-10T12:37:40.566 INFO:tasks.workunit.client.0.vm00.stdout:5/105: creat d1f/f25 x:0 0 0
2026-03-10T12:37:40.571 INFO:tasks.workunit.client.0.vm00.stdout:8/116: write d0/f9 [2008444,127522] 0
2026-03-10T12:37:40.576 INFO:tasks.workunit.client.0.vm00.stdout:8/117: getdents d0/d12 0
2026-03-10T12:37:40.585 INFO:tasks.workunit.client.0.vm00.stdout:8/118: creat d0/d12/f23 x:0 0 0
2026-03-10T12:37:40.595 INFO:tasks.workunit.client.0.vm00.stdout:8/119: rmdir d0/dd 39
2026-03-10T12:37:40.595 INFO:tasks.workunit.client.0.vm00.stdout:8/120: mknod d0/d12/c24 0
2026-03-10T12:37:40.595 INFO:tasks.workunit.client.0.vm00.stdout:8/121: write d0/d12/d17/f1d [9869,20334] 0
2026-03-10T12:37:40.595 INFO:tasks.workunit.client.0.vm00.stdout:8/122: dwrite d0/f22 [0,4194304] 0
2026-03-10T12:37:40.601
INFO:tasks.workunit.client.1.vm07.stdout:8/327: write d1/d3/d6/f24 [1281854,110020] 0 2026-03-10T12:37:40.603 INFO:tasks.workunit.client.1.vm07.stdout:8/328: truncate d1/d3/d11/f47 635059 0 2026-03-10T12:37:40.607 INFO:tasks.workunit.client.1.vm07.stdout:9/264: unlink d5/d13/f59 0 2026-03-10T12:37:40.613 INFO:tasks.workunit.client.1.vm07.stdout:5/330: rmdir d0/d22/d18/d19/d36 39 2026-03-10T12:37:40.615 INFO:tasks.workunit.client.1.vm07.stdout:0/362: chown d0/d14/d5f/d41/f55 0 1 2026-03-10T12:37:40.620 INFO:tasks.workunit.client.1.vm07.stdout:5/331: mkdir d0/d22/d18/d19/d36/d75 0 2026-03-10T12:37:40.622 INFO:tasks.workunit.client.1.vm07.stdout:0/363: link d0/d14/d5f/l1e d0/d14/d1a/d2f/d31/d4f/d60/l72 0 2026-03-10T12:37:40.627 INFO:tasks.workunit.client.0.vm00.stdout:7/139: dread da/fd [0,4194304] 0 2026-03-10T12:37:40.631 INFO:tasks.workunit.client.1.vm07.stdout:5/332: dread d0/d22/d18/d19/f23 [0,4194304] 0 2026-03-10T12:37:40.632 INFO:tasks.workunit.client.1.vm07.stdout:5/333: chown d0/d22/d18/d19/d36/c74 66564202 1 2026-03-10T12:37:40.632 INFO:tasks.workunit.client.1.vm07.stdout:5/334: fdatasync d0/ff 0 2026-03-10T12:37:40.636 INFO:tasks.workunit.client.1.vm07.stdout:0/364: dread d0/d14/d1a/d2f/d31/f4d [0,4194304] 0 2026-03-10T12:37:40.636 INFO:tasks.workunit.client.1.vm07.stdout:0/365: write d0/d62/f6e [521358,84286] 0 2026-03-10T12:37:40.638 INFO:tasks.workunit.client.1.vm07.stdout:0/366: truncate d0/d14/d1a/d2f/d31/d4f/f5c 828150 0 2026-03-10T12:37:40.639 INFO:tasks.workunit.client.1.vm07.stdout:0/367: truncate d0/d14/d1a/d2f/d31/f6f 175763 0 2026-03-10T12:37:40.648 INFO:tasks.workunit.client.1.vm07.stdout:0/368: link d0/d14/d1a/d2f/d31/d4f/d60/l72 d0/d14/d5f/d41/l73 0 2026-03-10T12:37:40.651 INFO:tasks.workunit.client.1.vm07.stdout:0/369: dwrite d0/d14/d5f/d41/f55 [0,4194304] 0 2026-03-10T12:37:40.661 INFO:tasks.workunit.client.1.vm07.stdout:0/370: dread d0/f1d [0,4194304] 0 2026-03-10T12:37:40.665 INFO:tasks.workunit.client.1.vm07.stdout:0/371: dwrite 
d0/d14/d1a/d2f/f5d [0,4194304] 0 2026-03-10T12:37:40.669 INFO:tasks.workunit.client.1.vm07.stdout:0/372: mkdir d0/d14/d5f/d41/d6a/d74 0 2026-03-10T12:37:40.670 INFO:tasks.workunit.client.1.vm07.stdout:0/373: creat d0/d14/d1a/d2f/d31/d4f/d60/f75 x:0 0 0 2026-03-10T12:37:40.687 INFO:tasks.workunit.client.0.vm00.stdout:9/116: fdatasync d0/f4 0 2026-03-10T12:37:40.687 INFO:tasks.workunit.client.0.vm00.stdout:9/117: dread - d0/d5/dc/f2a zero size 2026-03-10T12:37:40.688 INFO:tasks.workunit.client.0.vm00.stdout:9/118: symlink d0/d5/d16/l2e 0 2026-03-10T12:37:40.759 INFO:tasks.workunit.client.0.vm00.stdout:9/119: sync 2026-03-10T12:37:40.762 INFO:tasks.workunit.client.0.vm00.stdout:9/120: dwrite d0/f4 [4194304,4194304] 0 2026-03-10T12:37:40.796 INFO:tasks.workunit.client.1.vm07.stdout:4/320: fdatasync d0/d4/d10/d23/f50 0 2026-03-10T12:37:40.797 INFO:tasks.workunit.client.1.vm07.stdout:4/321: write d0/d4/d5/da/f4d [1297758,98612] 0 2026-03-10T12:37:40.799 INFO:tasks.workunit.client.1.vm07.stdout:4/322: chown d0/c29 11149414 1 2026-03-10T12:37:40.805 INFO:tasks.workunit.client.1.vm07.stdout:4/323: creat d0/d4/d5/da/f6e x:0 0 0 2026-03-10T12:37:40.809 INFO:tasks.workunit.client.1.vm07.stdout:4/324: mknod d0/d4/d10/d5f/d6d/c6f 0 2026-03-10T12:37:40.813 INFO:tasks.workunit.client.1.vm07.stdout:4/325: dwrite d0/d4/d10/d3c/f6c [0,4194304] 0 2026-03-10T12:37:40.834 INFO:tasks.workunit.client.1.vm07.stdout:4/326: dread d0/d4/d5/da/f4e [0,4194304] 0 2026-03-10T12:37:40.835 INFO:tasks.workunit.client.1.vm07.stdout:4/327: write d0/d4/d5/d34/f37 [2719122,38847] 0 2026-03-10T12:37:40.843 INFO:tasks.workunit.client.1.vm07.stdout:4/328: symlink d0/d4/d10/d5f/l70 0 2026-03-10T12:37:40.846 INFO:tasks.workunit.client.1.vm07.stdout:4/329: dwrite d0/d4/d10/d23/f27 [0,4194304] 0 2026-03-10T12:37:40.848 INFO:tasks.workunit.client.1.vm07.stdout:4/330: chown d0/d4/d10/d18/c28 3 1 2026-03-10T12:37:40.849 INFO:tasks.workunit.client.1.vm07.stdout:4/331: chown d0/d4/d5/da/f4e 0 1 
2026-03-10T12:37:40.853 INFO:tasks.workunit.client.1.vm07.stdout:7/272: dread d0/f1e [0,4194304] 0 2026-03-10T12:37:40.860 INFO:tasks.workunit.client.1.vm07.stdout:1/318: dwrite d9/df/f15 [0,4194304] 0 2026-03-10T12:37:40.861 INFO:tasks.workunit.client.1.vm07.stdout:1/319: dread - d9/df/f58 zero size 2026-03-10T12:37:40.862 INFO:tasks.workunit.client.1.vm07.stdout:6/261: truncate d1/f38 1769441 0 2026-03-10T12:37:40.867 INFO:tasks.workunit.client.1.vm07.stdout:4/332: creat d0/d4/d10/d5f/d6d/f71 x:0 0 0 2026-03-10T12:37:40.874 INFO:tasks.workunit.client.1.vm07.stdout:1/320: symlink d9/df/d29/d2c/l66 0 2026-03-10T12:37:40.876 INFO:tasks.workunit.client.0.vm00.stdout:3/124: dwrite f7 [0,4194304] 0 2026-03-10T12:37:40.879 INFO:tasks.workunit.client.1.vm07.stdout:3/339: truncate dc/d18/f34 3471488 0 2026-03-10T12:37:40.879 INFO:tasks.workunit.client.1.vm07.stdout:9/265: rmdir d5/d13 39 2026-03-10T12:37:40.882 INFO:tasks.workunit.client.1.vm07.stdout:2/222: dwrite d0/d42/f2c [0,4194304] 0 2026-03-10T12:37:40.885 INFO:tasks.workunit.client.1.vm07.stdout:8/329: dwrite d1/d3/ff [0,4194304] 0 2026-03-10T12:37:40.886 INFO:tasks.workunit.client.1.vm07.stdout:8/330: chown d1/d3/d6/d50/l69 26964 1 2026-03-10T12:37:40.891 INFO:tasks.workunit.client.1.vm07.stdout:5/335: truncate d0/d22/d18/d19/d2e/d3f/f6a 3339455 0 2026-03-10T12:37:40.897 INFO:tasks.workunit.client.1.vm07.stdout:2/223: dwrite d0/d42/d26/f2e [8388608,4194304] 0 2026-03-10T12:37:40.904 INFO:tasks.workunit.client.1.vm07.stdout:5/336: dwrite d0/d22/d18/d19/d21/f38 [4194304,4194304] 0 2026-03-10T12:37:40.909 INFO:tasks.workunit.client.1.vm07.stdout:4/333: symlink d0/d4/d10/d5f/d6d/l72 0 2026-03-10T12:37:40.909 INFO:tasks.workunit.client.1.vm07.stdout:0/374: rename d0/d14/d1a to d0/d14/d5f/d76 0 2026-03-10T12:37:40.919 INFO:tasks.workunit.client.1.vm07.stdout:8/331: dwrite d1/d3/d40/f4c [0,4194304] 0 2026-03-10T12:37:40.929 INFO:tasks.workunit.client.1.vm07.stdout:4/334: symlink d0/d4/d10/d18/l73 0 
2026-03-10T12:37:40.930 INFO:tasks.workunit.client.1.vm07.stdout:7/273: creat d0/f4f x:0 0 0 2026-03-10T12:37:40.930 INFO:tasks.workunit.client.1.vm07.stdout:4/335: fdatasync d0/d4/d10/d3c/f68 0 2026-03-10T12:37:40.931 INFO:tasks.workunit.client.1.vm07.stdout:9/266: creat d5/d13/d57/d3e/f5a x:0 0 0 2026-03-10T12:37:40.931 INFO:tasks.workunit.client.1.vm07.stdout:4/336: dread - d0/d4/d5/da/f6e zero size 2026-03-10T12:37:40.933 INFO:tasks.workunit.client.1.vm07.stdout:4/337: write d0/d4/d10/d18/f3e [2477073,57683] 0 2026-03-10T12:37:40.934 INFO:tasks.workunit.client.1.vm07.stdout:5/337: creat d0/d22/d18/d19/d2e/d3f/d5c/f76 x:0 0 0 2026-03-10T12:37:40.935 INFO:tasks.workunit.client.1.vm07.stdout:5/338: chown d0/c66 11707686 1 2026-03-10T12:37:40.937 INFO:tasks.workunit.client.1.vm07.stdout:1/321: rename d9/df/d29/d2b/d31/l4b to d9/df/d29/d2b/d31/l67 0 2026-03-10T12:37:40.938 INFO:tasks.workunit.client.1.vm07.stdout:4/338: dread d0/f53 [0,4194304] 0 2026-03-10T12:37:40.948 INFO:tasks.workunit.client.1.vm07.stdout:1/322: dread d9/f1b [0,4194304] 0 2026-03-10T12:37:40.948 INFO:tasks.workunit.client.1.vm07.stdout:1/323: write d9/fe [4218778,72061] 0 2026-03-10T12:37:40.948 INFO:tasks.workunit.client.0.vm00.stdout:4/101: getdents df 0 2026-03-10T12:37:40.948 INFO:tasks.workunit.client.0.vm00.stdout:4/102: chown df/f16 1044 1 2026-03-10T12:37:40.948 INFO:tasks.workunit.client.0.vm00.stdout:4/103: creat df/f1e x:0 0 0 2026-03-10T12:37:40.948 INFO:tasks.workunit.client.0.vm00.stdout:4/104: truncate df/f16 200148 0 2026-03-10T12:37:40.948 INFO:tasks.workunit.client.0.vm00.stdout:4/105: readlink lc 0 2026-03-10T12:37:40.949 INFO:tasks.workunit.client.1.vm07.stdout:9/267: symlink d5/d13/l5b 0 2026-03-10T12:37:40.950 INFO:tasks.workunit.client.1.vm07.stdout:9/268: write d5/d1f/d31/f43 [2911926,4536] 0 2026-03-10T12:37:40.953 INFO:tasks.workunit.client.1.vm07.stdout:8/332: link d1/f19 d1/d3/d18/f6d 0 2026-03-10T12:37:40.956 INFO:tasks.workunit.client.1.vm07.stdout:7/274: mkdir 
d0/d50 0 2026-03-10T12:37:40.960 INFO:tasks.workunit.client.0.vm00.stdout:3/125: dread f9 [0,4194304] 0 2026-03-10T12:37:40.960 INFO:tasks.workunit.client.0.vm00.stdout:3/126: fdatasync fb 0 2026-03-10T12:37:40.964 INFO:tasks.workunit.client.0.vm00.stdout:3/127: symlink dd/d27/l2e 0 2026-03-10T12:37:40.966 INFO:tasks.workunit.client.1.vm07.stdout:9/269: creat d5/d16/d23/d26/f5c x:0 0 0 2026-03-10T12:37:40.970 INFO:tasks.workunit.client.0.vm00.stdout:2/109: truncate d4/d6/f22 1003420 0 2026-03-10T12:37:40.972 INFO:tasks.workunit.client.1.vm07.stdout:8/333: symlink d1/d3/d18/l6e 0 2026-03-10T12:37:40.973 INFO:tasks.workunit.client.0.vm00.stdout:3/128: dread fb [0,4194304] 0 2026-03-10T12:37:40.973 INFO:tasks.workunit.client.1.vm07.stdout:8/334: dread d1/d3/d18/f2e [0,4194304] 0 2026-03-10T12:37:40.973 INFO:tasks.workunit.client.1.vm07.stdout:9/270: dwrite d5/d16/d18/f1e [4194304,4194304] 0 2026-03-10T12:37:40.973 INFO:tasks.workunit.client.0.vm00.stdout:2/110: symlink d4/d6/l24 0 2026-03-10T12:37:40.974 INFO:tasks.workunit.client.0.vm00.stdout:3/129: creat dd/d18/d14/f2f x:0 0 0 2026-03-10T12:37:40.976 INFO:tasks.workunit.client.0.vm00.stdout:2/111: mknod d4/dd/c25 0 2026-03-10T12:37:40.978 INFO:tasks.workunit.client.1.vm07.stdout:1/324: creat d9/df/d29/d2c/d59/f68 x:0 0 0 2026-03-10T12:37:40.979 INFO:tasks.workunit.client.1.vm07.stdout:7/275: read d0/f39 [6427,4891] 0 2026-03-10T12:37:40.980 INFO:tasks.workunit.client.1.vm07.stdout:7/276: write d0/f21 [3182405,36894] 0 2026-03-10T12:37:40.981 INFO:tasks.workunit.client.0.vm00.stdout:2/112: dwrite d4/f1d [0,4194304] 0 2026-03-10T12:37:40.984 INFO:tasks.workunit.client.0.vm00.stdout:3/130: link dd/d18/d13/d1d/l24 dd/d18/l30 0 2026-03-10T12:37:40.987 INFO:tasks.workunit.client.1.vm07.stdout:9/271: symlink d5/d13/d22/l5d 0 2026-03-10T12:37:40.988 INFO:tasks.workunit.client.0.vm00.stdout:3/131: dwrite fb [0,4194304] 0 2026-03-10T12:37:40.989 INFO:tasks.workunit.client.0.vm00.stdout:3/132: truncate dd/d18/f12 1012055 0 
2026-03-10T12:37:40.990 INFO:tasks.workunit.client.0.vm00.stdout:2/113: write d4/dd/f10 [6163363,29728] 0 2026-03-10T12:37:40.998 INFO:tasks.workunit.client.1.vm07.stdout:7/277: creat d0/d47/f51 x:0 0 0 2026-03-10T12:37:40.998 INFO:tasks.workunit.client.1.vm07.stdout:1/325: dread - d9/df/f21 zero size 2026-03-10T12:37:40.999 INFO:tasks.workunit.client.1.vm07.stdout:9/272: rmdir d5/d13/d57 39 2026-03-10T12:37:41.000 INFO:tasks.workunit.client.0.vm00.stdout:2/114: dwrite d4/d6/f16 [4194304,4194304] 0 2026-03-10T12:37:41.005 INFO:tasks.workunit.client.0.vm00.stdout:3/133: creat dd/d18/d14/d2b/f31 x:0 0 0 2026-03-10T12:37:41.007 INFO:tasks.workunit.client.1.vm07.stdout:8/335: dwrite d1/d3/f8 [0,4194304] 0 2026-03-10T12:37:41.007 INFO:tasks.workunit.client.1.vm07.stdout:8/336: fdatasync d1/d3/d6/d50/f5e 0 2026-03-10T12:37:41.007 INFO:tasks.workunit.client.0.vm00.stdout:3/134: getdents dd/d2a 0 2026-03-10T12:37:41.007 INFO:tasks.workunit.client.0.vm00.stdout:3/135: chown dd/d18/d14/d2b/f31 1041943 1 2026-03-10T12:37:41.014 INFO:tasks.workunit.client.1.vm07.stdout:8/337: dread d1/d3/d40/f4c [0,4194304] 0 2026-03-10T12:37:41.019 INFO:tasks.workunit.client.0.vm00.stdout:6/215: rmdir d2/da/dc/d2f 39 2026-03-10T12:37:41.024 INFO:tasks.workunit.client.0.vm00.stdout:6/216: truncate d2/d14/f3b 794510 0 2026-03-10T12:37:41.024 INFO:tasks.workunit.client.0.vm00.stdout:0/220: truncate d3/d1b/f2a 3035456 0 2026-03-10T12:37:41.024 INFO:tasks.workunit.client.0.vm00.stdout:0/221: write f2 [2742031,73742] 0 2026-03-10T12:37:41.028 INFO:tasks.workunit.client.0.vm00.stdout:0/222: creat d3/f50 x:0 0 0 2026-03-10T12:37:41.030 INFO:tasks.workunit.client.1.vm07.stdout:7/278: mkdir d0/d52 0 2026-03-10T12:37:41.033 INFO:tasks.workunit.client.0.vm00.stdout:2/115: dread f1 [0,4194304] 0 2026-03-10T12:37:41.035 INFO:tasks.workunit.client.0.vm00.stdout:0/223: symlink d3/d1b/d38/l51 0 2026-03-10T12:37:41.039 INFO:tasks.workunit.client.0.vm00.stdout:0/224: rmdir d3/db 39 2026-03-10T12:37:41.039 
INFO:tasks.workunit.client.1.vm07.stdout:7/279: creat d0/d47/d48/f53 x:0 0 0 2026-03-10T12:37:41.040 INFO:tasks.workunit.client.0.vm00.stdout:9/121: getdents d0 0 2026-03-10T12:37:41.041 INFO:tasks.workunit.client.0.vm00.stdout:9/122: chown d0/d5/d16 2825770 1 2026-03-10T12:37:41.046 INFO:tasks.workunit.client.0.vm00.stdout:0/225: unlink d3/fd 0 2026-03-10T12:37:41.053 INFO:tasks.workunit.client.0.vm00.stdout:0/226: dwrite d3/d1b/f2b [0,4194304] 0 2026-03-10T12:37:41.053 INFO:tasks.workunit.client.1.vm07.stdout:8/338: getdents d1/d3/d6 0 2026-03-10T12:37:41.053 INFO:tasks.workunit.client.1.vm07.stdout:8/339: chown d1/d3/d40/c58 18560458 1 2026-03-10T12:37:41.053 INFO:tasks.workunit.client.1.vm07.stdout:8/340: fsync d1/f6b 0 2026-03-10T12:37:41.055 INFO:tasks.workunit.client.0.vm00.stdout:5/106: dwrite d1f/f22 [0,4194304] 0 2026-03-10T12:37:41.057 INFO:tasks.workunit.client.0.vm00.stdout:5/107: write f11 [4947287,71252] 0 2026-03-10T12:37:41.063 INFO:tasks.workunit.client.0.vm00.stdout:9/123: mkdir d0/d5/d16/d19/d2f 0 2026-03-10T12:37:41.071 INFO:tasks.workunit.client.1.vm07.stdout:8/341: mknod d1/d3/d6/c6f 0 2026-03-10T12:37:41.071 INFO:tasks.workunit.client.1.vm07.stdout:8/342: read d1/d3/d18/f32 [1879287,66353] 0 2026-03-10T12:37:41.071 INFO:tasks.workunit.client.1.vm07.stdout:7/280: rmdir d0/d50 0 2026-03-10T12:37:41.071 INFO:tasks.workunit.client.0.vm00.stdout:8/123: truncate d0/f22 2835534 0 2026-03-10T12:37:41.071 INFO:tasks.workunit.client.0.vm00.stdout:8/124: fdatasync d0/f9 0 2026-03-10T12:37:41.071 INFO:tasks.workunit.client.0.vm00.stdout:7/140: write f6 [414289,69134] 0 2026-03-10T12:37:41.076 INFO:tasks.workunit.client.1.vm07.stdout:7/281: unlink d0/f4d 0 2026-03-10T12:37:41.076 INFO:tasks.workunit.client.0.vm00.stdout:5/108: dwrite d1f/f21 [0,4194304] 0 2026-03-10T12:37:41.079 INFO:tasks.workunit.client.1.vm07.stdout:7/282: rename d0/f29 to d0/d47/d48/f54 0 2026-03-10T12:37:41.081 INFO:tasks.workunit.client.1.vm07.stdout:7/283: symlink d0/d52/l55 0 
2026-03-10T12:37:41.081 INFO:tasks.workunit.client.1.vm07.stdout:7/284: read d0/f1e [399943,124817] 0 2026-03-10T12:37:41.084 INFO:tasks.workunit.client.1.vm07.stdout:7/285: creat d0/f56 x:0 0 0 2026-03-10T12:37:41.086 INFO:tasks.workunit.client.0.vm00.stdout:7/141: mknod da/d25/d2c/c3a 0 2026-03-10T12:37:41.087 INFO:tasks.workunit.client.0.vm00.stdout:7/142: read - da/d1b/f39 zero size 2026-03-10T12:37:41.088 INFO:tasks.workunit.client.0.vm00.stdout:9/124: link d0/f1a d0/d5/d16/f30 0 2026-03-10T12:37:41.089 INFO:tasks.workunit.client.0.vm00.stdout:9/125: truncate d0/f1a 1017112 0 2026-03-10T12:37:41.090 INFO:tasks.workunit.client.0.vm00.stdout:9/126: truncate d0/d5/d16/f30 1721047 0 2026-03-10T12:37:41.090 INFO:tasks.workunit.client.0.vm00.stdout:9/127: chown d0/d5/d16/f30 9234727 1 2026-03-10T12:37:41.092 INFO:tasks.workunit.client.0.vm00.stdout:0/227: sync 2026-03-10T12:37:41.094 INFO:tasks.workunit.client.0.vm00.stdout:9/128: dwrite d0/d5/dc/f2a [0,4194304] 0 2026-03-10T12:37:41.098 INFO:tasks.workunit.client.0.vm00.stdout:8/125: mknod d0/dd/c25 0 2026-03-10T12:37:41.101 INFO:tasks.workunit.client.0.vm00.stdout:7/143: dwrite da/d25/d2c/f30 [0,4194304] 0 2026-03-10T12:37:41.108 INFO:tasks.workunit.client.0.vm00.stdout:9/129: dwrite d0/f21 [0,4194304] 0 2026-03-10T12:37:41.112 INFO:tasks.workunit.client.1.vm07.stdout:7/286: dread d0/f37 [0,4194304] 0 2026-03-10T12:37:41.131 INFO:tasks.workunit.client.0.vm00.stdout:7/144: symlink da/d1b/d2d/l3b 0 2026-03-10T12:37:41.131 INFO:tasks.workunit.client.0.vm00.stdout:7/145: write f6 [711309,116501] 0 2026-03-10T12:37:41.131 INFO:tasks.workunit.client.0.vm00.stdout:7/146: dread da/fe [0,4194304] 0 2026-03-10T12:37:41.132 INFO:tasks.workunit.client.1.vm07.stdout:4/339: fsync d0/d4/d10/d18/f3e 0 2026-03-10T12:37:41.132 INFO:tasks.workunit.client.1.vm07.stdout:4/340: rmdir d0/d4/d5/da 39 2026-03-10T12:37:41.132 INFO:tasks.workunit.client.1.vm07.stdout:7/287: mkdir d0/d57 0 2026-03-10T12:37:41.132 
INFO:tasks.workunit.client.1.vm07.stdout:4/341: mknod d0/d4/d5/da/d66/c74 0 2026-03-10T12:37:41.132 INFO:tasks.workunit.client.1.vm07.stdout:4/342: creat d0/d4/d5/f75 x:0 0 0 2026-03-10T12:37:41.132 INFO:tasks.workunit.client.1.vm07.stdout:4/343: mkdir d0/d4/d10/d23/d46/d76 0 2026-03-10T12:37:41.132 INFO:tasks.workunit.client.1.vm07.stdout:4/344: mknod d0/d4/c77 0 2026-03-10T12:37:41.132 INFO:tasks.workunit.client.1.vm07.stdout:4/345: dwrite d0/d4/d5/d34/f37 [0,4194304] 0 2026-03-10T12:37:41.132 INFO:tasks.workunit.client.1.vm07.stdout:4/346: dread d0/d4/d10/d3c/d2b/d2d/f65 [0,4194304] 0 2026-03-10T12:37:41.139 INFO:tasks.workunit.client.0.vm00.stdout:7/147: symlink da/d1b/l3c 0 2026-03-10T12:37:41.144 INFO:tasks.workunit.client.0.vm00.stdout:9/130: creat d0/d5/d16/d1e/d2b/f31 x:0 0 0 2026-03-10T12:37:41.149 INFO:tasks.workunit.client.1.vm07.stdout:0/375: creat d0/d14/d5f/d41/f77 x:0 0 0 2026-03-10T12:37:41.151 INFO:tasks.workunit.client.1.vm07.stdout:0/376: write d0/d14/d5f/d76/d2f/d31/d4f/f61 [1863122,73116] 0 2026-03-10T12:37:41.152 INFO:tasks.workunit.client.0.vm00.stdout:9/131: creat d0/d5/d16/d19/f32 x:0 0 0 2026-03-10T12:37:41.152 INFO:tasks.workunit.client.1.vm07.stdout:0/377: write d0/d62/f6e [667005,22985] 0 2026-03-10T12:37:41.161 INFO:tasks.workunit.client.1.vm07.stdout:0/378: rename d0/d14/d5f/d76/f24 to d0/d14/d5f/d76/f78 0 2026-03-10T12:37:41.172 INFO:tasks.workunit.client.1.vm07.stdout:0/379: dread d0/d14/d5f/d76/f27 [4194304,4194304] 0 2026-03-10T12:37:41.190 INFO:tasks.workunit.client.0.vm00.stdout:4/106: getdents df 0 2026-03-10T12:37:41.193 INFO:tasks.workunit.client.0.vm00.stdout:4/107: mkdir df/d1f 0 2026-03-10T12:37:41.195 INFO:tasks.workunit.client.0.vm00.stdout:4/108: creat df/f20 x:0 0 0 2026-03-10T12:37:41.251 INFO:tasks.workunit.client.1.vm07.stdout:6/262: truncate d1/d4/d6/f2a 1406432 0 2026-03-10T12:37:41.255 INFO:tasks.workunit.client.1.vm07.stdout:2/224: dwrite d0/d42/d1f/d20/f2b [0,4194304] 0 2026-03-10T12:37:41.260 
INFO:tasks.workunit.client.1.vm07.stdout:2/225: unlink d0/d29/f2a 0 2026-03-10T12:37:41.263 INFO:tasks.workunit.client.1.vm07.stdout:2/226: fdatasync d0/d42/f1e 0 2026-03-10T12:37:41.265 INFO:tasks.workunit.client.1.vm07.stdout:2/227: truncate d0/d42/f1b 1081553 0 2026-03-10T12:37:41.267 INFO:tasks.workunit.client.1.vm07.stdout:2/228: write d0/f40 [130111,61608] 0 2026-03-10T12:37:41.270 INFO:tasks.workunit.client.1.vm07.stdout:2/229: truncate d0/d42/d26/f48 96596 0 2026-03-10T12:37:41.272 INFO:tasks.workunit.client.1.vm07.stdout:2/230: dread d0/f12 [0,4194304] 0 2026-03-10T12:37:41.274 INFO:tasks.workunit.client.1.vm07.stdout:2/231: dread d0/f12 [0,4194304] 0 2026-03-10T12:37:41.349 INFO:tasks.workunit.client.1.vm07.stdout:6/263: sync 2026-03-10T12:37:41.352 INFO:tasks.workunit.client.1.vm07.stdout:6/264: rename d1/d4/d6/d16/d1a/c42 to d1/d4/d6/d4e/c58 0 2026-03-10T12:37:41.354 INFO:tasks.workunit.client.1.vm07.stdout:6/265: fsync d1/d4/d6/d16/d1a/f29 0 2026-03-10T12:37:41.360 INFO:tasks.workunit.client.1.vm07.stdout:3/340: truncate dc/f17 3021900 0 2026-03-10T12:37:41.363 INFO:tasks.workunit.client.1.vm07.stdout:6/266: dread d1/d4/f11 [0,4194304] 0 2026-03-10T12:37:41.369 INFO:tasks.workunit.client.1.vm07.stdout:3/341: dwrite dc/dd/d1f/d45/f68 [0,4194304] 0 2026-03-10T12:37:41.370 INFO:tasks.workunit.client.1.vm07.stdout:3/342: dread - dc/dd/f41 zero size 2026-03-10T12:37:41.372 INFO:tasks.workunit.client.1.vm07.stdout:3/343: mkdir dc/dd/d28/d7a 0 2026-03-10T12:37:41.381 INFO:tasks.workunit.client.1.vm07.stdout:3/344: dread dc/d18/d24/f3e [0,4194304] 0 2026-03-10T12:37:41.383 INFO:tasks.workunit.client.1.vm07.stdout:3/345: stat dc/dd/d28/l2b 0 2026-03-10T12:37:41.383 INFO:tasks.workunit.client.1.vm07.stdout:3/346: readlink dc/l38 0 2026-03-10T12:37:41.383 INFO:tasks.workunit.client.1.vm07.stdout:3/347: chown dc/dd/d43/l52 2235 1 2026-03-10T12:37:41.383 INFO:tasks.workunit.client.1.vm07.stdout:3/348: mknod dc/d18/d2d/d3d/c7b 0 2026-03-10T12:37:41.384 
INFO:tasks.workunit.client.1.vm07.stdout:3/349: write dc/d18/d24/f3e [3820448,8243] 0 2026-03-10T12:37:41.391 INFO:tasks.workunit.client.0.vm00.stdout:1/106: write da/fc [1007304,24958] 0 2026-03-10T12:37:41.394 INFO:tasks.workunit.client.0.vm00.stdout:1/107: mknod da/d24/d28/c2d 0 2026-03-10T12:37:41.397 INFO:tasks.workunit.client.0.vm00.stdout:1/108: creat da/d12/d26/f2e x:0 0 0 2026-03-10T12:37:41.397 INFO:tasks.workunit.client.0.vm00.stdout:1/109: chown da/fc 461076 1 2026-03-10T12:37:41.399 INFO:tasks.workunit.client.0.vm00.stdout:1/110: symlink da/d24/l2f 0 2026-03-10T12:37:41.400 INFO:tasks.workunit.client.0.vm00.stdout:1/111: dread f4 [0,4194304] 0 2026-03-10T12:37:41.401 INFO:tasks.workunit.client.0.vm00.stdout:1/112: stat da/d12/d26 0 2026-03-10T12:37:41.401 INFO:tasks.workunit.client.0.vm00.stdout:1/113: truncate f3 1153116 0 2026-03-10T12:37:41.402 INFO:tasks.workunit.client.0.vm00.stdout:1/114: write da/d12/f1f [166813,44410] 0 2026-03-10T12:37:41.406 INFO:tasks.workunit.client.0.vm00.stdout:1/115: creat da/d12/f30 x:0 0 0 2026-03-10T12:37:41.406 INFO:tasks.workunit.client.0.vm00.stdout:1/116: chown da 895252189 1 2026-03-10T12:37:41.407 INFO:tasks.workunit.client.0.vm00.stdout:1/117: dread da/d12/f1d [0,4194304] 0 2026-03-10T12:37:41.411 INFO:tasks.workunit.client.0.vm00.stdout:1/118: dwrite da/d12/f30 [0,4194304] 0 2026-03-10T12:37:41.445 INFO:tasks.workunit.client.1.vm07.stdout:1/326: truncate d9/fc 3138481 0 2026-03-10T12:37:41.446 INFO:tasks.workunit.client.0.vm00.stdout:7/148: rmdir da/d25 39 2026-03-10T12:37:41.446 INFO:tasks.workunit.client.1.vm07.stdout:5/339: dwrite d0/d22/d18/d19/f2c [0,4194304] 0 2026-03-10T12:37:41.447 INFO:tasks.workunit.client.0.vm00.stdout:7/149: readlink da/d25/d2c/l31 0 2026-03-10T12:37:41.447 INFO:tasks.workunit.client.0.vm00.stdout:7/150: stat da/d1b 0 2026-03-10T12:37:41.448 INFO:tasks.workunit.client.0.vm00.stdout:7/151: chown da/d1b/f22 14 1 2026-03-10T12:37:41.449 INFO:tasks.workunit.client.1.vm07.stdout:9/273: 
truncate d5/f1c 4921603 0 2026-03-10T12:37:41.449 INFO:tasks.workunit.client.1.vm07.stdout:5/340: fdatasync d0/f9 0 2026-03-10T12:37:41.452 INFO:tasks.workunit.client.1.vm07.stdout:4/347: fdatasync d0/d4/d10/f4b 0 2026-03-10T12:37:41.454 INFO:tasks.workunit.client.0.vm00.stdout:5/109: truncate d1f/f21 2466734 0 2026-03-10T12:37:41.458 INFO:tasks.workunit.client.0.vm00.stdout:5/110: mkdir d1f/d26 0 2026-03-10T12:37:41.463 INFO:tasks.workunit.client.0.vm00.stdout:3/136: unlink dd/f16 0 2026-03-10T12:37:41.463 INFO:tasks.workunit.client.0.vm00.stdout:5/111: creat d1f/f27 x:0 0 0 2026-03-10T12:37:41.463 INFO:tasks.workunit.client.1.vm07.stdout:9/274: mkdir d5/d1f/d5e 0 2026-03-10T12:37:41.464 INFO:tasks.workunit.client.0.vm00.stdout:7/152: getdents da/d25 0 2026-03-10T12:37:41.464 INFO:tasks.workunit.client.1.vm07.stdout:9/275: write d5/d1f/d31/f43 [2668486,83077] 0 2026-03-10T12:37:41.466 INFO:tasks.workunit.client.0.vm00.stdout:7/153: mknod da/d26/c3d 0 2026-03-10T12:37:41.468 INFO:tasks.workunit.client.0.vm00.stdout:5/112: creat d1f/d26/f28 x:0 0 0 2026-03-10T12:37:41.469 INFO:tasks.workunit.client.0.vm00.stdout:5/113: stat c10 0 2026-03-10T12:37:41.470 INFO:tasks.workunit.client.0.vm00.stdout:7/154: mknod da/d1b/c3e 0 2026-03-10T12:37:41.473 INFO:tasks.workunit.client.1.vm07.stdout:4/348: dread d0/d4/d10/d3c/f22 [0,4194304] 0 2026-03-10T12:37:41.474 INFO:tasks.workunit.client.0.vm00.stdout:8/126: dwrite d0/dd/f20 [0,4194304] 0 2026-03-10T12:37:41.474 INFO:tasks.workunit.client.0.vm00.stdout:7/155: dread f1 [0,4194304] 0 2026-03-10T12:37:41.475 INFO:tasks.workunit.client.1.vm07.stdout:9/276: dread d5/f8 [4194304,4194304] 0 2026-03-10T12:37:41.475 INFO:tasks.workunit.client.0.vm00.stdout:7/156: chown da/f16 11865583 1 2026-03-10T12:37:41.477 INFO:tasks.workunit.client.1.vm07.stdout:0/380: truncate d0/d14/d5f/d76/f78 3038839 0 2026-03-10T12:37:41.479 INFO:tasks.workunit.client.1.vm07.stdout:5/341: mkdir d0/d22/d18/d19/d36/d75/d77 0 2026-03-10T12:37:41.483 
INFO:tasks.workunit.client.0.vm00.stdout:6/217: mkdir d2/d51 0 2026-03-10T12:37:41.487 INFO:tasks.workunit.client.1.vm07.stdout:5/342: dread d0/ff [4194304,4194304] 0 2026-03-10T12:37:41.490 INFO:tasks.workunit.client.0.vm00.stdout:2/116: rename d4/c1f to d4/dd/c26 0 2026-03-10T12:37:41.490 INFO:tasks.workunit.client.1.vm07.stdout:2/232: write d0/d42/f1e [1145859,50209] 0 2026-03-10T12:37:41.494 INFO:tasks.workunit.client.0.vm00.stdout:3/137: getdents dd/d18/d13 0 2026-03-10T12:37:41.498 INFO:tasks.workunit.client.0.vm00.stdout:0/228: rename d3/d33/c4a to d3/d7/c52 0 2026-03-10T12:37:41.498 INFO:tasks.workunit.client.0.vm00.stdout:0/229: dread - d3/f50 zero size 2026-03-10T12:37:41.502 INFO:tasks.workunit.client.1.vm07.stdout:9/277: rename d5/d16/d23/d26/f40 to d5/d13/d22/f5f 0 2026-03-10T12:37:41.504 INFO:tasks.workunit.client.1.vm07.stdout:0/381: mkdir d0/d14/d5f/d76/d2f/d31/d79 0 2026-03-10T12:37:41.509 INFO:tasks.workunit.client.1.vm07.stdout:0/382: dwrite d0/f15 [0,4194304] 0 2026-03-10T12:37:41.511 INFO:tasks.workunit.client.0.vm00.stdout:0/230: creat d3/d33/f53 x:0 0 0 2026-03-10T12:37:41.512 INFO:tasks.workunit.client.0.vm00.stdout:0/231: write d3/d22/f2e [2902219,105504] 0 2026-03-10T12:37:41.513 INFO:tasks.workunit.client.0.vm00.stdout:0/232: dread - d3/d22/f46 zero size 2026-03-10T12:37:41.516 INFO:tasks.workunit.client.0.vm00.stdout:7/157: truncate da/d1b/f22 209540 0 2026-03-10T12:37:41.517 INFO:tasks.workunit.client.1.vm07.stdout:5/343: write d0/d22/d18/d19/d21/f2d [4002797,119175] 0 2026-03-10T12:37:41.518 INFO:tasks.workunit.client.0.vm00.stdout:3/138: mknod dd/d2a/c32 0 2026-03-10T12:37:41.522 INFO:tasks.workunit.client.0.vm00.stdout:9/132: rename d0/d5/d16/l13 to d0/d5/d16/d1e/l33 0 2026-03-10T12:37:41.524 INFO:tasks.workunit.client.0.vm00.stdout:2/117: link d4/c5 d4/d6/c27 0 2026-03-10T12:37:41.526 INFO:tasks.workunit.client.0.vm00.stdout:9/133: dwrite d0/f21 [4194304,4194304] 0 2026-03-10T12:37:41.528 
INFO:tasks.workunit.client.0.vm00.stdout:3/139: rmdir dd/d18 39
2026-03-10T12:37:41.530 INFO:tasks.workunit.client.1.vm07.stdout:4/349: mkdir d0/d4/d5/d78 0
2026-03-10T12:37:41.534 INFO:tasks.workunit.client.0.vm00.stdout:3/140: dwrite dd/f25 [0,4194304] 0
2026-03-10T12:37:41.534 INFO:tasks.workunit.client.0.vm00.stdout:6/218: link d2/d14/l2d d2/da/l52 0
2026-03-10T12:37:41.534 INFO:tasks.workunit.client.1.vm07.stdout:4/350: stat d0/d4/d10/d3c/f68 0
2026-03-10T12:37:41.534 INFO:tasks.workunit.client.1.vm07.stdout:4/351: truncate d0/d4/d5/d34/f5d 206493 0
2026-03-10T12:37:41.541 INFO:tasks.workunit.client.0.vm00.stdout:4/109: rename lc to df/d1f/l21 0
2026-03-10T12:37:41.542 INFO:tasks.workunit.client.0.vm00.stdout:2/118: write f1 [2976086,130923] 0
2026-03-10T12:37:41.544 INFO:tasks.workunit.client.0.vm00.stdout:7/158: mkdir da/d3f 0
2026-03-10T12:37:41.548 INFO:tasks.workunit.client.0.vm00.stdout:1/119: rename da/d12/f1f to da/d12/d26/f31 0
2026-03-10T12:37:41.548 INFO:tasks.workunit.client.1.vm07.stdout:9/278: symlink d5/d16/d23/d26/l60 0
2026-03-10T12:37:41.548 INFO:tasks.workunit.client.0.vm00.stdout:1/120: write da/f14 [916773,17791] 0
2026-03-10T12:37:41.549 INFO:tasks.workunit.client.0.vm00.stdout:4/110: mkdir df/d1f/d22 0
2026-03-10T12:37:41.550 INFO:tasks.workunit.client.0.vm00.stdout:7/159: chown da/l19 83801952 1
2026-03-10T12:37:41.554 INFO:tasks.workunit.client.1.vm07.stdout:0/383: stat d0/d14/d5f/d76/d2f/d31/d4f/l50 0
2026-03-10T12:37:41.554 INFO:tasks.workunit.client.0.vm00.stdout:9/134: link d0/f17 d0/d5/d16/f34 0
2026-03-10T12:37:41.554 INFO:tasks.workunit.client.0.vm00.stdout:9/135: read d0/d5/d16/f34 [1311258,51983] 0
2026-03-10T12:37:41.557 INFO:tasks.workunit.client.0.vm00.stdout:6/219: unlink d2/da/dc/d2f/c3e 0
2026-03-10T12:37:41.557 INFO:tasks.workunit.client.0.vm00.stdout:6/220: write d2/d14/f32 [654345,38512] 0
2026-03-10T12:37:41.560 INFO:tasks.workunit.client.1.vm07.stdout:5/344: symlink d0/d22/d18/d19/d2e/l78 0
2026-03-10T12:37:41.561 INFO:tasks.workunit.client.1.vm07.stdout:5/345: readlink d0/d22/d18/d19/d21/d3a/l43 0
2026-03-10T12:37:41.563 INFO:tasks.workunit.client.0.vm00.stdout:1/121: dwrite f4 [0,4194304] 0
2026-03-10T12:37:41.563 INFO:tasks.workunit.client.1.vm07.stdout:1/327: link d9/df/d29/d2b/d30/l50 d9/l69 0
2026-03-10T12:37:41.564 INFO:tasks.workunit.client.0.vm00.stdout:4/111: dwrite df/f11 [0,4194304] 0
2026-03-10T12:37:41.565 INFO:tasks.workunit.client.0.vm00.stdout:4/112: readlink l5 0
2026-03-10T12:37:41.566 INFO:tasks.workunit.client.0.vm00.stdout:4/113: write f3 [5115303,97279] 0
2026-03-10T12:37:41.566 INFO:tasks.workunit.client.0.vm00.stdout:7/160: read da/fe [2898882,26692] 0
2026-03-10T12:37:41.570 INFO:tasks.workunit.client.0.vm00.stdout:2/119: creat d4/f28 x:0 0 0
2026-03-10T12:37:41.570 INFO:tasks.workunit.client.1.vm07.stdout:4/352: symlink d0/d4/d10/d3c/l79 0
2026-03-10T12:37:41.571 INFO:tasks.workunit.client.0.vm00.stdout:1/122: creat da/d24/f32 x:0 0 0
2026-03-10T12:37:41.576 INFO:tasks.workunit.client.1.vm07.stdout:5/346: unlink d0/f1e 0
2026-03-10T12:37:41.576 INFO:tasks.workunit.client.0.vm00.stdout:9/136: mkdir d0/d5/d16/d19/d2f/d35 0
2026-03-10T12:37:41.577 INFO:tasks.workunit.client.0.vm00.stdout:9/137: write d0/d5/dc/f2a [3096468,84119] 0
2026-03-10T12:37:41.578 INFO:tasks.workunit.client.0.vm00.stdout:6/221: symlink d2/d51/l53 0
2026-03-10T12:37:41.580 INFO:tasks.workunit.client.1.vm07.stdout:5/347: dwrite d0/d22/f27 [0,4194304] 0
2026-03-10T12:37:41.580 INFO:tasks.workunit.client.0.vm00.stdout:7/161: fsync f1 0
2026-03-10T12:37:41.583 INFO:tasks.workunit.client.1.vm07.stdout:9/279: mknod d5/d13/c61 0
2026-03-10T12:37:41.595 INFO:tasks.workunit.client.0.vm00.stdout:9/138: creat d0/d5/d16/d1e/d2b/f36 x:0 0 0
2026-03-10T12:37:41.595 INFO:tasks.workunit.client.0.vm00.stdout:6/222: stat d2/d39/f46 0
2026-03-10T12:37:41.595 INFO:tasks.workunit.client.0.vm00.stdout:4/114: symlink df/l23 0
2026-03-10T12:37:41.595 INFO:tasks.workunit.client.0.vm00.stdout:4/115: dwrite f3 [4194304,4194304] 0
2026-03-10T12:37:41.595 INFO:tasks.workunit.client.1.vm07.stdout:0/384: getdents d0/d14/d5f/d76/d2f/d31/d6b 0
2026-03-10T12:37:41.595 INFO:tasks.workunit.client.1.vm07.stdout:0/385: truncate d0/d14/d5f/d76/d2f/d31/d4f/f5c 1410482 0
2026-03-10T12:37:41.595 INFO:tasks.workunit.client.1.vm07.stdout:5/348: chown d0/d22/d18/l4e 136 1
2026-03-10T12:37:41.595 INFO:tasks.workunit.client.1.vm07.stdout:1/328: rename d9/c14 to d9/df/c6a 0
2026-03-10T12:37:41.596 INFO:tasks.workunit.client.1.vm07.stdout:5/349: creat d0/d22/d18/d19/d36/f79 x:0 0 0
2026-03-10T12:37:41.596 INFO:tasks.workunit.client.0.vm00.stdout:7/162: rmdir da/d1b/d2d 39
2026-03-10T12:37:41.598 INFO:tasks.workunit.client.1.vm07.stdout:4/353: rename d0/d4/d10/d23 to d0/d4/d7a 0
2026-03-10T12:37:41.599 INFO:tasks.workunit.client.1.vm07.stdout:4/354: write d0/d4/d5/f43 [419295,68367] 0
2026-03-10T12:37:41.600 INFO:tasks.workunit.client.0.vm00.stdout:7/163: dwrite da/f16 [0,4194304] 0
2026-03-10T12:37:41.602 INFO:tasks.workunit.client.0.vm00.stdout:2/120: symlink d4/dd/l29 0
2026-03-10T12:37:41.608 INFO:tasks.workunit.client.0.vm00.stdout:9/139: mknod d0/d5/d16/d19/d2f/c37 0
2026-03-10T12:37:41.610 INFO:tasks.workunit.client.0.vm00.stdout:9/140: dread d0/d5/dc/f2a [0,4194304] 0
2026-03-10T12:37:41.616 INFO:tasks.workunit.client.0.vm00.stdout:9/141: dread - d0/d5/d16/d1e/d2b/f31 zero size
2026-03-10T12:37:41.617 INFO:tasks.workunit.client.0.vm00.stdout:2/121: rmdir d4/dd 39
2026-03-10T12:37:41.626 INFO:tasks.workunit.client.0.vm00.stdout:9/142: readlink d0/d5/dc/l25 0
2026-03-10T12:37:41.627 INFO:tasks.workunit.client.0.vm00.stdout:9/143: dread - d0/d5/d16/d19/f32 zero size
2026-03-10T12:37:41.629 INFO:tasks.workunit.client.1.vm07.stdout:4/355: symlink d0/d4/d10/d3c/d2b/d54/l7b 0
2026-03-10T12:37:41.629 INFO:tasks.workunit.client.1.vm07.stdout:4/356: chown d0/d5c 40809148 1
2026-03-10T12:37:41.630 INFO:tasks.workunit.client.1.vm07.stdout:1/329: mkdir d9/df/d29/d6b 0
2026-03-10T12:37:41.631 INFO:tasks.workunit.client.1.vm07.stdout:9/280: getdents d5/d1f/d31 0
2026-03-10T12:37:41.631 INFO:tasks.workunit.client.0.vm00.stdout:9/144: dwrite d0/f4 [8388608,4194304] 0
2026-03-10T12:37:41.632 INFO:tasks.workunit.client.0.vm00.stdout:7/164: mkdir da/d1b/d40 0
2026-03-10T12:37:41.634 INFO:tasks.workunit.client.0.vm00.stdout:2/122: chown d4/dd/c25 25 1
2026-03-10T12:37:41.638 INFO:tasks.workunit.client.1.vm07.stdout:5/350: unlink d0/d22/d18/d19/d21/d54/l5a 0
2026-03-10T12:37:41.654 INFO:tasks.workunit.client.0.vm00.stdout:2/123: mknod d4/d6/c2a 0
2026-03-10T12:37:41.654 INFO:tasks.workunit.client.0.vm00.stdout:2/124: dread - d4/dd/ff zero size
2026-03-10T12:37:41.656 INFO:tasks.workunit.client.1.vm07.stdout:3/350: truncate dc/dd/d1f/d45/f56 3218393 0
2026-03-10T12:37:41.656 INFO:tasks.workunit.client.1.vm07.stdout:3/351: stat c9 0
2026-03-10T12:37:41.660 INFO:tasks.workunit.client.0.vm00.stdout:6/223: getdents d2/d16 0
2026-03-10T12:37:41.662 INFO:tasks.workunit.client.0.vm00.stdout:7/165: mkdir da/d41 0
2026-03-10T12:37:41.663 INFO:tasks.workunit.client.0.vm00.stdout:7/166: write f9 [1044968,62850] 0
2026-03-10T12:37:41.664 INFO:tasks.workunit.client.1.vm07.stdout:4/357: mkdir d0/d5c/d7c 0
2026-03-10T12:37:41.665 INFO:tasks.workunit.client.1.vm07.stdout:4/358: write d0/d4/d7a/f2e [520720,28547] 0
2026-03-10T12:37:41.666 INFO:tasks.workunit.client.0.vm00.stdout:7/167: dwrite da/d25/f2b [0,4194304] 0
2026-03-10T12:37:41.668 INFO:tasks.workunit.client.1.vm07.stdout:5/351: link d0/l3 d0/d22/d18/d19/d21/d54/l7a 0
2026-03-10T12:37:41.668 INFO:tasks.workunit.client.1.vm07.stdout:5/352: chown d0/d22/d18/d19/d21/d3a/c57 498116216 1
2026-03-10T12:37:41.670 INFO:tasks.workunit.client.0.vm00.stdout:9/145: rename d0/d5/dc/l25 to d0/d5/dc/l38 0
2026-03-10T12:37:41.671 INFO:tasks.workunit.client.0.vm00.stdout:1/123: fsync da/d12/d26/f2e 0
2026-03-10T12:37:41.673 INFO:tasks.workunit.client.0.vm00.stdout:9/146: dwrite d0/d5/d16/d19/f32 [0,4194304] 0
2026-03-10T12:37:41.675 INFO:tasks.workunit.client.0.vm00.stdout:9/147: write d0/f4 [5272176,18121] 0
2026-03-10T12:37:41.687 INFO:tasks.workunit.client.0.vm00.stdout:9/148: dwrite d0/d5/d16/f24 [0,4194304] 0
2026-03-10T12:37:41.688 INFO:tasks.workunit.client.0.vm00.stdout:2/125: creat d4/d6/f2b x:0 0 0
2026-03-10T12:37:41.688 INFO:tasks.workunit.client.0.vm00.stdout:2/126: write d4/f28 [373062,104425] 0
2026-03-10T12:37:41.688 INFO:tasks.workunit.client.0.vm00.stdout:2/127: dwrite d4/f1d [4194304,4194304] 0
2026-03-10T12:37:41.688 INFO:tasks.workunit.client.0.vm00.stdout:2/128: chown d4/l1a 0 1
2026-03-10T12:37:41.701 INFO:tasks.workunit.client.1.vm07.stdout:9/281: sync
2026-03-10T12:37:41.705 INFO:tasks.workunit.client.1.vm07.stdout:9/282: dwrite d5/d1f/d31/f43 [0,4194304] 0
2026-03-10T12:37:41.706 INFO:tasks.workunit.client.0.vm00.stdout:1/124: mkdir da/d21/d33 0
2026-03-10T12:37:41.708 INFO:tasks.workunit.client.0.vm00.stdout:1/125: dread da/d12/f30 [0,4194304] 0
2026-03-10T12:37:41.710 INFO:tasks.workunit.client.1.vm07.stdout:4/359: fsync d0/d4/d10/f36 0
2026-03-10T12:37:41.713 INFO:tasks.workunit.client.1.vm07.stdout:4/360: link d0/d4/d10/d5f/c67 d0/d19/c7d 0
2026-03-10T12:37:41.714 INFO:tasks.workunit.client.1.vm07.stdout:4/361: symlink d0/d4/d10/d5f/l7e 0
2026-03-10T12:37:41.717 INFO:tasks.workunit.client.0.vm00.stdout:9/149: creat d0/d5/d16/f39 x:0 0 0
2026-03-10T12:37:41.718 INFO:tasks.workunit.client.0.vm00.stdout:9/150: write d0/d5/d16/d1e/d27/f28 [79411,54702] 0
2026-03-10T12:37:41.741 INFO:tasks.workunit.client.1.vm07.stdout:4/362: dread d0/d4/d5/f43 [0,4194304] 0
2026-03-10T12:37:41.741 INFO:tasks.workunit.client.1.vm07.stdout:4/363: dread - d0/d4/d10/d5f/d6d/f71 zero size
2026-03-10T12:37:41.741 INFO:tasks.workunit.client.1.vm07.stdout:4/364: link d0/d4/l1d d0/d19/l7f 0
2026-03-10T12:37:41.741 INFO:tasks.workunit.client.1.vm07.stdout:4/365: dread d0/d4/d5/d34/f37 [0,4194304] 0
2026-03-10T12:37:41.741 INFO:tasks.workunit.client.1.vm07.stdout:4/366: write d0/d4/d5/da/f48 [3487065,42060] 0
2026-03-10T12:37:41.741 INFO:tasks.workunit.client.0.vm00.stdout:9/151: dwrite d0/d5/d16/d1e/d2b/f36 [0,4194304] 0
2026-03-10T12:37:41.742 INFO:tasks.workunit.client.0.vm00.stdout:9/152: dread - d0/d5/d16/d1e/d2b/f31 zero size
2026-03-10T12:37:41.742 INFO:tasks.workunit.client.0.vm00.stdout:7/168: symlink da/d1b/d40/l42 0
2026-03-10T12:37:41.742 INFO:tasks.workunit.client.0.vm00.stdout:1/126: truncate da/d12/f1a 806819 0
2026-03-10T12:37:41.742 INFO:tasks.workunit.client.0.vm00.stdout:8/127: write d0/f10 [7171573,58657] 0
2026-03-10T12:37:41.742 INFO:tasks.workunit.client.0.vm00.stdout:5/114: truncate f1b 358038 0
2026-03-10T12:37:41.742 INFO:tasks.workunit.client.0.vm00.stdout:5/115: chown c7 18 1
2026-03-10T12:37:41.742 INFO:tasks.workunit.client.0.vm00.stdout:1/127: write da/d12/f30 [3769867,106899] 0
2026-03-10T12:37:41.742 INFO:tasks.workunit.client.0.vm00.stdout:1/128: chown da/ce 92120 1
2026-03-10T12:37:41.742 INFO:tasks.workunit.client.0.vm00.stdout:8/128: mknod d0/d12/d17/c26 0
2026-03-10T12:37:41.742 INFO:tasks.workunit.client.0.vm00.stdout:7/169: creat da/d25/d2e/f43 x:0 0 0
2026-03-10T12:37:41.742 INFO:tasks.workunit.client.0.vm00.stdout:9/153: creat d0/d5/f3a x:0 0 0
2026-03-10T12:37:41.742 INFO:tasks.workunit.client.0.vm00.stdout:5/116: dwrite f19 [0,4194304] 0
2026-03-10T12:37:41.742 INFO:tasks.workunit.client.0.vm00.stdout:8/129: creat d0/d12/f27 x:0 0 0
2026-03-10T12:37:41.742 INFO:tasks.workunit.client.0.vm00.stdout:9/154: dread d0/d5/d16/d1e/d2b/f36 [0,4194304] 0
2026-03-10T12:37:41.746 INFO:tasks.workunit.client.0.vm00.stdout:2/129: link d4/d6/cc d4/dd/c2c 0
2026-03-10T12:37:41.747 INFO:tasks.workunit.client.0.vm00.stdout:2/130: write d4/d6/f2b [594491,2785] 0
2026-03-10T12:37:41.749 INFO:tasks.workunit.client.0.vm00.stdout:9/155: dwrite d0/d5/d16/d1e/d2b/f36 [0,4194304] 0
2026-03-10T12:37:41.749 INFO:tasks.workunit.client.1.vm07.stdout:4/367: sync
2026-03-10T12:37:41.754 INFO:tasks.workunit.client.0.vm00.stdout:5/117: mknod d1f/d26/c29 0
2026-03-10T12:37:41.754 INFO:tasks.workunit.client.1.vm07.stdout:4/368: unlink d0/d4/l4a 0
2026-03-10T12:37:41.754 INFO:tasks.workunit.client.1.vm07.stdout:4/369: readlink d0/d4/d5/da/l17 0
2026-03-10T12:37:41.756 INFO:tasks.workunit.client.0.vm00.stdout:7/170: dwrite da/d1b/d2d/f38 [0,4194304] 0
2026-03-10T12:37:41.757 INFO:tasks.workunit.client.1.vm07.stdout:4/370: symlink d0/d5c/d7c/l80 0
2026-03-10T12:37:41.758 INFO:tasks.workunit.client.0.vm00.stdout:2/131: mkdir d4/d6/d2d 0
2026-03-10T12:37:41.761 INFO:tasks.workunit.client.1.vm07.stdout:4/371: dwrite d0/d4/d10/d3c/d2b/d2d/f65 [4194304,4194304] 0
2026-03-10T12:37:41.766 INFO:tasks.workunit.client.1.vm07.stdout:4/372: mknod d0/d4/d5/d78/c81 0
2026-03-10T12:37:41.767 INFO:tasks.workunit.client.1.vm07.stdout:4/373: rmdir d0/d5c/d7c 39
2026-03-10T12:37:41.768 INFO:tasks.workunit.client.0.vm00.stdout:5/118: dwrite d1f/f22 [0,4194304] 0
2026-03-10T12:37:41.770 INFO:tasks.workunit.client.0.vm00.stdout:5/119: dread f19 [0,4194304] 0
2026-03-10T12:37:41.773 INFO:tasks.workunit.client.1.vm07.stdout:4/374: mknod d0/d4/d7a/c82 0
2026-03-10T12:37:41.773 INFO:tasks.workunit.client.1.vm07.stdout:4/375: fdatasync d0/d4/d10/d18/f3e 0
2026-03-10T12:37:41.775 INFO:tasks.workunit.client.1.vm07.stdout:4/376: truncate d0/d4/d7a/f2e 578563 0
2026-03-10T12:37:41.776 INFO:tasks.workunit.client.1.vm07.stdout:4/377: mknod d0/d19/c83 0
2026-03-10T12:37:41.777 INFO:tasks.workunit.client.1.vm07.stdout:4/378: fdatasync d0/d4/d7a/f27 0
2026-03-10T12:37:41.777 INFO:tasks.workunit.client.0.vm00.stdout:1/129: rmdir da/d21/d33 0
2026-03-10T12:37:41.777 INFO:tasks.workunit.client.0.vm00.stdout:7/171: chown da/d25/c28 30844 1
2026-03-10T12:37:41.779 INFO:tasks.workunit.client.0.vm00.stdout:7/172: dread f1 [4194304,4194304] 0
2026-03-10T12:37:41.781 INFO:tasks.workunit.client.1.vm07.stdout:4/379: dwrite d0/d4/d10/d3c/f6c [0,4194304] 0
2026-03-10T12:37:41.782 INFO:tasks.workunit.client.0.vm00.stdout:7/173: stat da/d1b/d40/l42 0
2026-03-10T12:37:41.785 INFO:tasks.workunit.client.1.vm07.stdout:4/380: chown d0/d4/d5/d78/c81 107 1
2026-03-10T12:37:41.787 INFO:tasks.workunit.client.0.vm00.stdout:5/120: mkdir d1f/d26/d2a 0
2026-03-10T12:37:41.791 INFO:tasks.workunit.client.0.vm00.stdout:2/132: creat d4/d6/f2e x:0 0 0
2026-03-10T12:37:41.791 INFO:tasks.workunit.client.1.vm07.stdout:4/381: unlink d0/d19/l7f 0
2026-03-10T12:37:41.791 INFO:tasks.workunit.client.0.vm00.stdout:1/130: mkdir da/d12/d34 0
2026-03-10T12:37:41.791 INFO:tasks.workunit.client.0.vm00.stdout:1/131: stat da/d21 0
2026-03-10T12:37:41.793 INFO:tasks.workunit.client.0.vm00.stdout:9/156: creat d0/d5/f3b x:0 0 0
2026-03-10T12:37:41.794 INFO:tasks.workunit.client.0.vm00.stdout:1/132: symlink da/d21/l35 0
2026-03-10T12:37:41.795 INFO:tasks.workunit.client.0.vm00.stdout:9/157: write d0/d5/d16/d19/f1b [1139863,86520] 0
2026-03-10T12:37:41.796 INFO:tasks.workunit.client.0.vm00.stdout:9/158: truncate d0/d5/dc/f2d 355173 0
2026-03-10T12:37:41.799 INFO:tasks.workunit.client.0.vm00.stdout:9/159: dread d0/d5/d16/f34 [0,4194304] 0
2026-03-10T12:37:41.802 INFO:tasks.workunit.client.0.vm00.stdout:5/121: dread f12 [0,4194304] 0
2026-03-10T12:37:41.803 INFO:tasks.workunit.client.0.vm00.stdout:1/133: unlink f4 0
2026-03-10T12:37:41.804 INFO:tasks.workunit.client.0.vm00.stdout:9/160: mknod d0/d5/d16/d19/d2f/d35/c3c 0
2026-03-10T12:37:41.805 INFO:tasks.workunit.client.0.vm00.stdout:5/122: dread f19 [0,4194304] 0
2026-03-10T12:37:41.806 INFO:tasks.workunit.client.0.vm00.stdout:5/123: mkdir d1f/d26/d2b 0
2026-03-10T12:37:41.806 INFO:tasks.workunit.client.0.vm00.stdout:1/134: getdents da/d24/d28 0
2026-03-10T12:37:41.807 INFO:tasks.workunit.client.0.vm00.stdout:5/124: chown d1f/d26/d2b 0 1
2026-03-10T12:37:41.807 INFO:tasks.workunit.client.0.vm00.stdout:5/125: read - d1f/f27 zero size
2026-03-10T12:37:41.844 INFO:tasks.workunit.client.0.vm00.stdout:0/233: rmdir d3/d7 39
2026-03-10T12:37:41.849 INFO:tasks.workunit.client.1.vm07.stdout:7/288: write d0/d47/d48/f54 [397625,121812] 0
2026-03-10T12:37:41.850 INFO:tasks.workunit.client.1.vm07.stdout:7/289: stat d0/c15 0
2026-03-10T12:37:41.852 INFO:tasks.workunit.client.1.vm07.stdout:7/290: dread d0/f27 [0,4194304] 0
2026-03-10T12:37:41.853 INFO:tasks.workunit.client.1.vm07.stdout:7/291: dread d0/f37 [0,4194304] 0
2026-03-10T12:37:41.853 INFO:tasks.workunit.client.1.vm07.stdout:7/292: readlink d0/l3d 0
2026-03-10T12:37:41.854 INFO:tasks.workunit.client.1.vm07.stdout:7/293: creat d0/d47/f58 x:0 0 0
2026-03-10T12:37:41.855 INFO:tasks.workunit.client.1.vm07.stdout:7/294: creat d0/d47/f59 x:0 0 0
2026-03-10T12:37:41.856 INFO:tasks.workunit.client.1.vm07.stdout:7/295: write d0/d47/d48/f4b [1021252,69957] 0
2026-03-10T12:37:41.857 INFO:tasks.workunit.client.1.vm07.stdout:7/296: mknod d0/d52/c5a 0
2026-03-10T12:37:41.857 INFO:tasks.workunit.client.0.vm00.stdout:3/141: truncate dd/f25 2952542 0
2026-03-10T12:37:41.862 INFO:tasks.workunit.client.0.vm00.stdout:3/142: readlink dd/d18/l30 0
2026-03-10T12:37:41.863 INFO:tasks.workunit.client.0.vm00.stdout:3/143: mknod dd/d27/c33 0
2026-03-10T12:37:41.872 INFO:tasks.workunit.client.0.vm00.stdout:3/144: chown dd 505 1
2026-03-10T12:37:41.872 INFO:tasks.workunit.client.0.vm00.stdout:3/145: mkdir dd/d27/d2c/d34 0
2026-03-10T12:37:41.872 INFO:tasks.workunit.client.0.vm00.stdout:3/146: dwrite fb [0,4194304] 0
2026-03-10T12:37:41.879 INFO:tasks.workunit.client.0.vm00.stdout:3/147: creat dd/d27/f35 x:0 0 0
2026-03-10T12:37:41.881 INFO:tasks.workunit.client.0.vm00.stdout:3/148: symlink dd/d18/d14/l36 0
2026-03-10T12:37:41.884 INFO:tasks.workunit.client.0.vm00.stdout:3/149: link dd/d18/d13/d1d/l23 dd/d18/d13/d1d/l37 0
2026-03-10T12:37:41.894 INFO:tasks.workunit.client.0.vm00.stdout:3/150: chown dd/d18/d13/c26 7 1
2026-03-10T12:37:41.894 INFO:tasks.workunit.client.0.vm00.stdout:3/151: mkdir dd/d27/d2c/d34/d38 0
2026-03-10T12:37:41.895 INFO:tasks.workunit.client.0.vm00.stdout:7/174: sync
2026-03-10T12:37:41.917 INFO:tasks.workunit.client.0.vm00.stdout:4/116: fdatasync df/f11 0
2026-03-10T12:37:41.918 INFO:tasks.workunit.client.0.vm00.stdout:2/133: fdatasync d4/f1d 0
2026-03-10T12:37:41.922 INFO:tasks.workunit.client.0.vm00.stdout:4/117: mkdir df/d24 0
2026-03-10T12:37:41.922 INFO:tasks.workunit.client.0.vm00.stdout:4/118: write df/f20 [276124,2139] 0
2026-03-10T12:37:41.924 INFO:tasks.workunit.client.0.vm00.stdout:4/119: mkdir df/d1f/d25 0
2026-03-10T12:37:41.925 INFO:tasks.workunit.client.0.vm00.stdout:4/120: mkdir df/d1f/d22/d26 0
2026-03-10T12:37:41.928 INFO:tasks.workunit.client.0.vm00.stdout:2/134: read d4/d6/f2b [358257,87344] 0
2026-03-10T12:37:41.932 INFO:tasks.workunit.client.0.vm00.stdout:2/135: symlink d4/d6/d2d/l2f 0
2026-03-10T12:37:41.933 INFO:tasks.workunit.client.0.vm00.stdout:2/136: creat d4/d6/f30 x:0 0 0
2026-03-10T12:37:41.933 INFO:tasks.workunit.client.1.vm07.stdout:2/233: write d0/d42/d26/d38/f3a [43940,73603] 0
2026-03-10T12:37:41.934 INFO:tasks.workunit.client.1.vm07.stdout:2/234: stat d0/d42/d1f/d20/c41 0
2026-03-10T12:37:41.937 INFO:tasks.workunit.client.0.vm00.stdout:2/137: dread d4/dd/f17 [0,4194304] 0
2026-03-10T12:37:41.942 INFO:tasks.workunit.client.1.vm07.stdout:2/235: mknod d0/d45/c4c 0
2026-03-10T12:37:41.950 INFO:tasks.workunit.client.0.vm00.stdout:5/126: rename f1b to d1f/f2c 0
2026-03-10T12:37:41.951 INFO:tasks.workunit.client.0.vm00.stdout:8/130: rmdir d0/d12/d17 39
2026-03-10T12:37:41.965 INFO:tasks.workunit.client.1.vm07.stdout:5/353: dwrite d0/d22/d18/f20 [0,4194304] 0
2026-03-10T12:37:41.966 INFO:tasks.workunit.client.0.vm00.stdout:4/121: rename l5 to df/d24/l27 0
2026-03-10T12:37:41.978 INFO:tasks.workunit.client.1.vm07.stdout:4/382: truncate d0/d4/d10/d3c/d2b/d2d/f65 4443818 0
2026-03-10T12:37:41.979 INFO:tasks.workunit.client.0.vm00.stdout:7/175: dwrite da/d1b/d2d/f38 [4194304,4194304] 0
2026-03-10T12:37:41.979 INFO:tasks.workunit.client.0.vm00.stdout:5/127: symlink d1f/d26/l2d 0
2026-03-10T12:37:41.979 INFO:tasks.workunit.client.1.vm07.stdout:4/383: write d0/d4/d5/da/f44 [961338,129040] 0
2026-03-10T12:37:41.981 INFO:tasks.workunit.client.0.vm00.stdout:9/161: getdents d0/d5 0
2026-03-10T12:37:41.981 INFO:tasks.workunit.client.0.vm00.stdout:7/176: dwrite f6 [0,4194304] 0
2026-03-10T12:37:41.982 INFO:tasks.workunit.client.0.vm00.stdout:9/162: read - d0/d5/f3a zero size
2026-03-10T12:37:41.982 INFO:tasks.workunit.client.0.vm00.stdout:4/122: dwrite df/f1c [0,4194304] 0
2026-03-10T12:37:41.983 INFO:tasks.workunit.client.0.vm00.stdout:8/131: dwrite d0/f7 [0,4194304] 0
2026-03-10T12:37:41.986 INFO:tasks.workunit.client.1.vm07.stdout:5/354: dread d0/d22/d18/d30/f35 [0,4194304] 0
2026-03-10T12:37:41.989 INFO:tasks.workunit.client.1.vm07.stdout:4/384: unlink d0/d4/d5/da/f48 0
2026-03-10T12:37:41.998 INFO:tasks.workunit.client.0.vm00.stdout:5/128: rmdir d1f/d26/d2a 0
2026-03-10T12:37:42.000 INFO:tasks.workunit.client.0.vm00.stdout:7/177: creat da/d1b/d40/f44 x:0 0 0
2026-03-10T12:37:42.001 INFO:tasks.workunit.client.0.vm00.stdout:7/178: readlink da/l1f 0
2026-03-10T12:37:42.003 INFO:tasks.workunit.client.1.vm07.stdout:7/297: truncate d0/f37 281313 0
2026-03-10T12:37:42.004 INFO:tasks.workunit.client.0.vm00.stdout:3/152: truncate dd/d18/d13/f22 2312883 0
2026-03-10T12:37:42.004 INFO:tasks.workunit.client.1.vm07.stdout:7/298: write d0/f40 [991625,110579] 0
2026-03-10T12:37:42.005 INFO:tasks.workunit.client.0.vm00.stdout:3/153: chown dd/d18/d13/c26 32353454 1
2026-03-10T12:37:42.008 INFO:tasks.workunit.client.1.vm07.stdout:7/299: dwrite d0/f3b [0,4194304] 0
2026-03-10T12:37:42.014 INFO:tasks.workunit.client.0.vm00.stdout:5/129: mkdir d1f/d26/d2e 0
2026-03-10T12:37:42.014 INFO:tasks.workunit.client.0.vm00.stdout:5/130: mknod d1f/d26/c2f 0
2026-03-10T12:37:42.014 INFO:tasks.workunit.client.0.vm00.stdout:5/131: truncate d1f/f27 162388 0
2026-03-10T12:37:42.016 INFO:tasks.workunit.client.0.vm00.stdout:8/132: creat d0/f28 x:0 0 0
2026-03-10T12:37:42.016 INFO:tasks.workunit.client.1.vm07.stdout:7/300: mknod d0/d47/d48/c5b 0
2026-03-10T12:37:42.017 INFO:tasks.workunit.client.0.vm00.stdout:7/179: truncate da/f15 1252315 0
2026-03-10T12:37:42.019 INFO:tasks.workunit.client.0.vm00.stdout:5/132: creat d1f/f30 x:0 0 0
2026-03-10T12:37:42.019 INFO:tasks.workunit.client.0.vm00.stdout:5/133: write d1f/f27 [410448,16344] 0
2026-03-10T12:37:42.022 INFO:tasks.workunit.client.1.vm07.stdout:7/301: dwrite d0/f3a [0,4194304] 0
2026-03-10T12:37:42.027 INFO:tasks.workunit.client.1.vm07.stdout:7/302: truncate d0/f2b 5001596 0
2026-03-10T12:37:42.033 INFO:tasks.workunit.client.0.vm00.stdout:8/133: symlink d0/l29 0
2026-03-10T12:37:42.040 INFO:tasks.workunit.client.1.vm07.stdout:5/355: dread d0/d22/d18/d19/d21/f42 [4194304,4194304] 0
2026-03-10T12:37:42.040 INFO:tasks.workunit.client.1.vm07.stdout:5/356: dwrite d0/f1f [0,4194304] 0
2026-03-10T12:37:42.040 INFO:tasks.workunit.client.0.vm00.stdout:4/123: sync
2026-03-10T12:37:42.040 INFO:tasks.workunit.client.0.vm00.stdout:7/180: symlink da/d1b/l45 0
2026-03-10T12:37:42.042 INFO:tasks.workunit.client.0.vm00.stdout:7/181: mkdir da/d25/d2c/d46 0
2026-03-10T12:37:42.042 INFO:tasks.workunit.client.0.vm00.stdout:7/182: read f1 [2814230,118711] 0
2026-03-10T12:37:42.043 INFO:tasks.workunit.client.0.vm00.stdout:7/183: fsync da/d25/f29 0
2026-03-10T12:37:42.044 INFO:tasks.workunit.client.0.vm00.stdout:4/124: creat df/d1f/d22/d26/f28 x:0 0 0
2026-03-10T12:37:42.050 INFO:tasks.workunit.client.0.vm00.stdout:4/125: creat df/f29 x:0 0 0
2026-03-10T12:37:42.058 INFO:tasks.workunit.client.0.vm00.stdout:4/126: rename c4 to df/d1f/d25/c2a 0
2026-03-10T12:37:42.058 INFO:tasks.workunit.client.0.vm00.stdout:4/127: read - df/d1f/d22/d26/f28 zero size
2026-03-10T12:37:42.058 INFO:tasks.workunit.client.0.vm00.stdout:4/128: fsync fb 0
2026-03-10T12:37:42.058 INFO:tasks.workunit.client.0.vm00.stdout:4/129: mknod df/d24/c2b 0
2026-03-10T12:37:42.149 INFO:tasks.workunit.client.0.vm00.stdout:1/135: write da/d12/f1d [789380,2886] 0
2026-03-10T12:37:42.149 INFO:tasks.workunit.client.1.vm07.stdout:6/267: dread d1/d4/f19 [0,4194304] 0
2026-03-10T12:37:42.150 INFO:tasks.workunit.client.0.vm00.stdout:2/138: dwrite d4/dd/ff [0,4194304] 0
2026-03-10T12:37:42.151 INFO:tasks.workunit.client.1.vm07.stdout:6/268: fsync d1/d4/f3f 0
2026-03-10T12:37:42.154 INFO:tasks.workunit.client.0.vm00.stdout:2/139: dwrite d4/d6/f30 [0,4194304] 0
2026-03-10T12:37:42.154 INFO:tasks.workunit.client.1.vm07.stdout:8/343: dread d1/d3/d18/f38 [0,4194304] 0
2026-03-10T12:37:42.156 INFO:tasks.workunit.client.0.vm00.stdout:1/136: dread da/d12/f1a [0,4194304] 0
2026-03-10T12:37:42.157 INFO:tasks.workunit.client.1.vm07.stdout:8/344: mkdir d1/d3/d6/d50/d70 0
2026-03-10T12:37:42.159 INFO:tasks.workunit.client.1.vm07.stdout:8/345: creat d1/d3/f71 x:0 0 0
2026-03-10T12:37:42.160 INFO:tasks.workunit.client.1.vm07.stdout:8/346: chown d1/d3/d6/f4f 222066398 1
2026-03-10T12:37:42.161 INFO:tasks.workunit.client.0.vm00.stdout:1/137: dwrite da/d24/f32 [0,4194304] 0
2026-03-10T12:37:42.161 INFO:tasks.workunit.client.1.vm07.stdout:8/347: chown d1/d3/d6/d50/l69 3 1
2026-03-10T12:37:42.165 INFO:tasks.workunit.client.0.vm00.stdout:9/163: dwrite d0/f1a [0,4194304] 0
2026-03-10T12:37:42.166 INFO:tasks.workunit.client.0.vm00.stdout:9/164: chown d0/d5/f26 1588 1
2026-03-10T12:37:42.167 INFO:tasks.workunit.client.0.vm00.stdout:9/165: write d0/d5/d16/f24 [1020821,12300] 0
2026-03-10T12:37:42.167 INFO:tasks.workunit.client.0.vm00.stdout:9/166: write d0/f21 [6062955,20329] 0
2026-03-10T12:37:42.175 INFO:tasks.workunit.client.0.vm00.stdout:8/134: rmdir d0 39
2026-03-10T12:37:42.175 INFO:tasks.workunit.client.1.vm07.stdout:9/283: symlink d5/d16/l62 0
2026-03-10T12:37:42.176 INFO:tasks.workunit.client.1.vm07.stdout:9/284: dread - d5/d16/f35 zero size
2026-03-10T12:37:42.177 INFO:tasks.workunit.client.0.vm00.stdout:3/154: dread dd/d18/f12 [0,4194304] 0
2026-03-10T12:37:42.177 INFO:tasks.workunit.client.0.vm00.stdout:3/155: stat dd/d18/d13/l1e 0
2026-03-10T12:37:42.178 INFO:tasks.workunit.client.1.vm07.stdout:9/285: creat d5/d13/d57/d4f/f63 x:0 0 0
2026-03-10T12:37:42.179 INFO:tasks.workunit.client.1.vm07.stdout:9/286: mkdir d5/d1f/d31/d64 0
2026-03-10T12:37:42.180 INFO:tasks.workunit.client.1.vm07.stdout:3/352: symlink dc/dd/d1f/l7c 0
2026-03-10T12:37:42.181 INFO:tasks.workunit.client.1.vm07.stdout:3/353: chown dc/dd/f29 6513812 1
2026-03-10T12:37:42.181 INFO:tasks.workunit.client.1.vm07.stdout:3/354: readlink dc/d18/d24/d72/l77 0
2026-03-10T12:37:42.188 INFO:tasks.workunit.client.0.vm00.stdout:7/184: truncate f6 3044024 0
2026-03-10T12:37:42.188 INFO:tasks.workunit.client.0.vm00.stdout:7/185: chown da/d25/c28 18 1
2026-03-10T12:37:42.189 INFO:tasks.workunit.client.0.vm00.stdout:7/186: dread - da/d1b/f1e zero size
2026-03-10T12:37:42.191 INFO:tasks.workunit.client.0.vm00.stdout:1/138: symlink da/d24/l36 0
2026-03-10T12:37:42.198 INFO:tasks.workunit.client.1.vm07.stdout:3/355: dread dc/dd/f21 [4194304,4194304] 0
2026-03-10T12:37:42.198 INFO:tasks.workunit.client.1.vm07.stdout:3/356: stat dc/dd/c6e 0
2026-03-10T12:37:42.198 INFO:tasks.workunit.client.0.vm00.stdout:9/167: rmdir d0/d5/dc 39
2026-03-10T12:37:42.198 INFO:tasks.workunit.client.0.vm00.stdout:3/156: mkdir dd/d2a/d39 0
2026-03-10T12:37:42.198 INFO:tasks.workunit.client.0.vm00.stdout:8/135: dwrite d0/f11 [0,4194304] 0
2026-03-10T12:37:42.203 INFO:tasks.workunit.client.0.vm00.stdout:9/168: mkdir d0/d3d 0
2026-03-10T12:37:42.208 INFO:tasks.workunit.client.0.vm00.stdout:3/157: mknod dd/d27/d2c/c3a 0
2026-03-10T12:37:42.208 INFO:tasks.workunit.client.0.vm00.stdout:3/158: symlink dd/d27/l3b 0
2026-03-10T12:37:42.208 INFO:tasks.workunit.client.0.vm00.stdout:3/159: dread f9 [0,4194304] 0
2026-03-10T12:37:42.209 INFO:tasks.workunit.client.1.vm07.stdout:3/357: symlink dc/l7d 0
2026-03-10T12:37:42.210 INFO:tasks.workunit.client.1.vm07.stdout:3/358: dread - dc/dd/f41 zero size
2026-03-10T12:37:42.211 INFO:tasks.workunit.client.0.vm00.stdout:3/160: dwrite dd/d27/f35 [0,4194304] 0
2026-03-10T12:37:42.214 INFO:tasks.workunit.client.1.vm07.stdout:3/359: dwrite dc/dd/d1f/f27 [0,4194304] 0
2026-03-10T12:37:42.219 INFO:tasks.workunit.client.0.vm00.stdout:9/169: dwrite d0/d5/d16/f39 [0,4194304] 0
2026-03-10T12:37:42.222 INFO:tasks.workunit.client.0.vm00.stdout:7/187: getdents da/d26 0
2026-03-10T12:37:42.226 INFO:tasks.workunit.client.0.vm00.stdout:7/188: dwrite da/d25/d2e/f43 [0,4194304] 0
2026-03-10T12:37:42.232 INFO:tasks.workunit.client.0.vm00.stdout:1/139: getdents da/d21 0
2026-03-10T12:37:42.232 INFO:tasks.workunit.client.0.vm00.stdout:1/140: truncate da/d12/f1d 913588 0
2026-03-10T12:37:42.246 INFO:tasks.workunit.client.0.vm00.stdout:8/136: link d0/d12/d17/f1d d0/d12/f2a 0
2026-03-10T12:37:42.256 INFO:tasks.workunit.client.0.vm00.stdout:1/141: dwrite da/d12/d26/f31 [0,4194304] 0
2026-03-10T12:37:42.260 INFO:tasks.workunit.client.0.vm00.stdout:9/170: symlink d0/d5/l3e 0
2026-03-10T12:37:42.263 INFO:tasks.workunit.client.1.vm07.stdout:0/386: rename d0/d14/d5f/d76/d2f/d31/d4f/c64 to d0/d14/d5f/c7a 0
2026-03-10T12:37:42.263 INFO:tasks.workunit.client.0.vm00.stdout:7/189: mkdir da/d47 0
2026-03-10T12:37:42.264 INFO:tasks.workunit.client.1.vm07.stdout:2/236: rename d0/l35 to d0/d42/d1f/d20/l4d 0
2026-03-10T12:37:42.265 INFO:tasks.workunit.client.1.vm07.stdout:0/387: creat d0/d14/d5f/d76/d2f/d31/d79/f7b x:0 0 0
2026-03-10T12:37:42.266 INFO:tasks.workunit.client.1.vm07.stdout:5/357: rename d0/d22/d18/l56 to d0/d22/d18/d19/d2e/d67/l7b 0
2026-03-10T12:37:42.267 INFO:tasks.workunit.client.1.vm07.stdout:5/358: chown d0/f9 19227 1
2026-03-10T12:37:42.267 INFO:tasks.workunit.client.0.vm00.stdout:1/142: dwrite da/d12/f1d [0,4194304] 0
2026-03-10T12:37:42.269 INFO:tasks.workunit.client.0.vm00.stdout:1/143: dread da/d12/f1d [0,4194304] 0
2026-03-10T12:37:42.269 INFO:tasks.workunit.client.0.vm00.stdout:1/144: read da/d12/f30 [3372036,99126] 0
2026-03-10T12:37:42.271 INFO:tasks.workunit.client.1.vm07.stdout:2/237: mkdir d0/d42/d4e 0
2026-03-10T12:37:42.276 INFO:tasks.workunit.client.0.vm00.stdout:9/171: creat d0/d5/d16/d19/d2f/f3f x:0 0 0
2026-03-10T12:37:42.276 INFO:tasks.workunit.client.1.vm07.stdout:6/269: rename d1/d4/d6/d16/f54 to d1/d4/d6/d16/d1a/d2c/f59 0
2026-03-10T12:37:42.276 INFO:tasks.workunit.client.1.vm07.stdout:0/388: unlink d0/d14/d5f/d76/d2f/d31/d4f/d60/l72 0
2026-03-10T12:37:42.276 INFO:tasks.workunit.client.1.vm07.stdout:2/238: mkdir d0/d42/d26/d38/d4f 0
2026-03-10T12:37:42.276 INFO:tasks.workunit.client.1.vm07.stdout:2/239: stat d0/d42/f1e 0
2026-03-10T12:37:42.276 INFO:tasks.workunit.client.1.vm07.stdout:2/240: chown d0/f12 0 1
2026-03-10T12:37:42.277 INFO:tasks.workunit.client.0.vm00.stdout:9/172: dwrite d0/d5/d16/d19/d2f/f3f [0,4194304] 0
2026-03-10T12:37:42.292 INFO:tasks.workunit.client.0.vm00.stdout:1/145: read f5 [2934603,40103] 0
2026-03-10T12:37:42.294 INFO:tasks.workunit.client.1.vm07.stdout:9/287: rename d5/d1f/f4e to d5/f65 0
2026-03-10T12:37:42.294 INFO:tasks.workunit.client.0.vm00.stdout:7/190: mkdir da/d41/d48 0
2026-03-10T12:37:42.294 INFO:tasks.workunit.client.1.vm07.stdout:9/288: truncate d5/d13/d2c/f44 701721 0
2026-03-10T12:37:42.295 INFO:tasks.workunit.client.1.vm07.stdout:6/270: creat d1/d4/f5a x:0 0 0
2026-03-10T12:37:42.295 INFO:tasks.workunit.client.1.vm07.stdout:6/271: chown d1/d4/d6/d4e/f51 3462 1
2026-03-10T12:37:42.296 INFO:tasks.workunit.client.0.vm00.stdout:8/137: getdents d0/dd 0
2026-03-10T12:37:42.297 INFO:tasks.workunit.client.1.vm07.stdout:2/241: rename d0/d42/d26/f27 to d0/d42/d26/f50 0
2026-03-10T12:37:42.298 INFO:tasks.workunit.client.0.vm00.stdout:9/173: mknod d0/d5/d16/d19/d2f/d35/c40 0
2026-03-10T12:37:42.299 INFO:tasks.workunit.client.1.vm07.stdout:6/272: mkdir d1/d4/d6/d16/d1a/d2c/d5b 0
2026-03-10T12:37:42.300 INFO:tasks.workunit.client.1.vm07.stdout:0/389: truncate d0/d14/f19 555053 0
2026-03-10T12:37:42.303 INFO:tasks.workunit.client.0.vm00.stdout:8/138: creat d0/dd/f2b x:0 0 0
2026-03-10T12:37:42.303 INFO:tasks.workunit.client.1.vm07.stdout:0/390: dwrite d0/d14/d5f/d76/d2f/d31/f6f [0,4194304] 0
2026-03-10T12:37:42.306 INFO:tasks.workunit.client.0.vm00.stdout:8/139: dwrite d0/f10 [0,4194304] 0
2026-03-10T12:37:42.309 INFO:tasks.workunit.client.0.vm00.stdout:1/146: write f3 [1318140,100038] 0
2026-03-10T12:37:42.310 INFO:tasks.workunit.client.1.vm07.stdout:9/289: creat d5/d1f/d31/d64/f66 x:0 0 0
2026-03-10T12:37:42.310 INFO:tasks.workunit.client.0.vm00.stdout:1/147: truncate da/f13 259189 0
2026-03-10T12:37:42.318 INFO:tasks.workunit.client.0.vm00.stdout:9/174: creat d0/d5/dc/f41 x:0 0 0
2026-03-10T12:37:42.320 INFO:tasks.workunit.client.1.vm07.stdout:6/273: symlink d1/d4/d6/d16/d49/l5c 0
2026-03-10T12:37:42.321 INFO:tasks.workunit.client.0.vm00.stdout:9/175: dwrite d0/d5/d16/f24 [0,4194304] 0
2026-03-10T12:37:42.321 INFO:tasks.workunit.client.1.vm07.stdout:0/391: truncate d0/d14/d5f/d76/f2c 1450813 0
2026-03-10T12:37:42.322 INFO:tasks.workunit.client.0.vm00.stdout:8/140: symlink d0/d12/l2c 0
2026-03-10T12:37:42.322 INFO:tasks.workunit.client.0.vm00.stdout:9/176: chown d0/d5/d16/f34 16370609 1
2026-03-10T12:37:42.325 INFO:tasks.workunit.client.1.vm07.stdout:6/274: fdatasync d1/d4/d6/d16/d1a/d2c/f59 0
2026-03-10T12:37:42.326 INFO:tasks.workunit.client.0.vm00.stdout:8/141: chown d0/l14 363 1
2026-03-10T12:37:42.336 INFO:tasks.workunit.client.0.vm00.stdout:9/177: dwrite d0/d5/d16/d19/f20 [4194304,4194304] 0
2026-03-10T12:37:42.336 INFO:tasks.workunit.client.0.vm00.stdout:9/178: readlink d0/d5/d16/l2e 0
2026-03-10T12:37:42.336 INFO:tasks.workunit.client.0.vm00.stdout:9/179: write d0/d5/dc/f41 [197423,128017] 0
2026-03-10T12:37:42.336 INFO:tasks.workunit.client.0.vm00.stdout:1/148: rmdir da/d12/d34 0
2026-03-10T12:37:42.336 INFO:tasks.workunit.client.0.vm00.stdout:1/149: chown da/d24/d28 1 1
2026-03-10T12:37:42.336 INFO:tasks.workunit.client.0.vm00.stdout:9/180: read - d0/d5/f26 zero size
2026-03-10T12:37:42.337 INFO:tasks.workunit.client.0.vm00.stdout:1/150: truncate f5 3099999 0
2026-03-10T12:37:42.339 INFO:tasks.workunit.client.0.vm00.stdout:9/181: dread d0/d5/d16/f39 [0,4194304] 0
2026-03-10T12:37:42.339 INFO:tasks.workunit.client.0.vm00.stdout:9/182: read d0/d5/d16/d19/f1b [3294295,125178] 0
2026-03-10T12:37:42.344 INFO:tasks.workunit.client.0.vm00.stdout:1/151: creat da/d24/d28/f37 x:0 0 0
2026-03-10T12:37:42.347 INFO:tasks.workunit.client.0.vm00.stdout:1/152: write da/f14 [1302730,33560] 0
2026-03-10T12:37:42.351 INFO:tasks.workunit.client.0.vm00.stdout:1/153: write da/d12/f1d [1979944,94630] 0
2026-03-10T12:37:42.352 INFO:tasks.workunit.client.0.vm00.stdout:1/154: write da/fc [3698617,67395] 0
2026-03-10T12:37:42.354 INFO:tasks.workunit.client.0.vm00.stdout:1/155: dread da/d12/f1d [0,4194304] 0
2026-03-10T12:37:42.360 INFO:tasks.workunit.client.0.vm00.stdout:1/156: fdatasync f5 0
2026-03-10T12:37:42.361 INFO:tasks.workunit.client.0.vm00.stdout:1/157: mknod da/d24/d28/c38 0
2026-03-10T12:37:42.363 INFO:tasks.workunit.client.0.vm00.stdout:1/158: dread da/fc [0,4194304] 0
2026-03-10T12:37:42.364 INFO:tasks.workunit.client.0.vm00.stdout:1/159: mkdir da/d21/d39 0
2026-03-10T12:37:42.365 INFO:tasks.workunit.client.0.vm00.stdout:1/160: rename da/d24/d28/c38 to da/d12/c3a 0
2026-03-10T12:37:42.366 INFO:tasks.workunit.client.0.vm00.stdout:1/161: readlink da/d24/l2c 0
2026-03-10T12:37:42.436 INFO:tasks.workunit.client.0.vm00.stdout:8/142: sync
2026-03-10T12:37:42.440 INFO:tasks.workunit.client.0.vm00.stdout:8/143: mkdir d0/d12/d2d 0
2026-03-10T12:37:42.440 INFO:tasks.workunit.client.1.vm07.stdout:7/303: truncate d0/f40 343787 0
2026-03-10T12:37:42.441 INFO:tasks.workunit.client.1.vm07.stdout:7/304: symlink d0/d47/l5c 0
2026-03-10T12:37:42.442 INFO:tasks.workunit.client.1.vm07.stdout:7/305: readlink d0/l24 0
2026-03-10T12:37:42.442 INFO:tasks.workunit.client.0.vm00.stdout:8/144: dwrite d0/f28 [0,4194304] 0
2026-03-10T12:37:42.447 INFO:tasks.workunit.client.0.vm00.stdout:5/134: dwrite d1f/f2c [0,4194304] 0
2026-03-10T12:37:42.450 INFO:tasks.workunit.client.0.vm00.stdout:8/145: dread d0/f28 [0,4194304] 0
2026-03-10T12:37:42.450 INFO:tasks.workunit.client.0.vm00.stdout:8/146: stat d0/f9 0
2026-03-10T12:37:42.451 INFO:tasks.workunit.client.0.vm00.stdout:4/130: getdents df 0
2026-03-10T12:37:42.451 INFO:tasks.workunit.client.1.vm07.stdout:0/392: sync
2026-03-10T12:37:42.451 INFO:tasks.workunit.client.1.vm07.stdout:6/275: sync
2026-03-10T12:37:42.454 INFO:tasks.workunit.client.0.vm00.stdout:7/191: write f6 [1030476,91084] 0
2026-03-10T12:37:42.455 INFO:tasks.workunit.client.0.vm00.stdout:7/192: chown da/d47 1 1
2026-03-10T12:37:42.455 INFO:tasks.workunit.client.0.vm00.stdout:7/193: stat da/d3f 0
2026-03-10T12:37:42.456 INFO:tasks.workunit.client.0.vm00.stdout:5/135: symlink d1f/l31 0
2026-03-10T12:37:42.456 INFO:tasks.workunit.client.1.vm07.stdout:0/393: getdents d0 0
2026-03-10T12:37:42.457 INFO:tasks.workunit.client.0.vm00.stdout:8/147: write d0/f22 [2198487,31906] 0
2026-03-10T12:37:42.458 INFO:tasks.workunit.client.1.vm07.stdout:0/394: dread d0/d14/d5f/d3b/f4b [0,4194304] 0
2026-03-10T12:37:42.460 INFO:tasks.workunit.client.0.vm00.stdout:5/136: creat d1f/f32 x:0 0 0
2026-03-10T12:37:42.460 INFO:tasks.workunit.client.0.vm00.stdout:8/148: creat d0/d12/d17/f2e x:0 0 0
2026-03-10T12:37:42.460 INFO:tasks.workunit.client.0.vm00.stdout:7/194: creat da/d47/f49 x:0 0 0
2026-03-10T12:37:42.460 INFO:tasks.workunit.client.0.vm00.stdout:5/137: readlink l17 0
2026-03-10T12:37:42.461 INFO:tasks.workunit.client.0.vm00.stdout:8/149: dread - d0/d12/d17/f2e zero size
2026-03-10T12:37:42.461 INFO:tasks.workunit.client.0.vm00.stdout:8/150: chown d0 221 1
2026-03-10T12:37:42.461 INFO:tasks.workunit.client.0.vm00.stdout:7/195: chown da/f13 1 1
2026-03-10T12:37:42.462 INFO:tasks.workunit.client.1.vm07.stdout:0/395: rmdir d0/d14/d5f/d76/d2f/d31/d4f 39
2026-03-10T12:37:42.462 INFO:tasks.workunit.client.0.vm00.stdout:5/138: dread d1f/f21 [0,4194304] 0
2026-03-10T12:37:42.464 INFO:tasks.workunit.client.0.vm00.stdout:8/151: unlink d0/d12/d17/l21 0
2026-03-10T12:37:42.468 INFO:tasks.workunit.client.0.vm00.stdout:7/196: creat da/d26/d37/f4a x:0 0 0
2026-03-10T12:37:42.481 INFO:tasks.workunit.client.0.vm00.stdout:5/139: symlink d1f/d26/d2e/l33 0
2026-03-10T12:37:42.481 INFO:tasks.workunit.client.0.vm00.stdout:5/140: dwrite d1f/f25 [0,4194304] 0
2026-03-10T12:37:42.481 INFO:tasks.workunit.client.0.vm00.stdout:8/152: mknod d0/c2f 0
2026-03-10T12:37:42.481 INFO:tasks.workunit.client.0.vm00.stdout:7/197: rename f6 to da/d41/f4b 0
2026-03-10T12:37:42.481 INFO:tasks.workunit.client.0.vm00.stdout:7/198: write da/d26/d37/f4a [387247,67538] 0
2026-03-10T12:37:42.486 INFO:tasks.workunit.client.0.vm00.stdout:7/199: mkdir da/d25/d2e/d4c 0
2026-03-10T12:37:42.486 INFO:tasks.workunit.client.0.vm00.stdout:7/200: fdatasync da/f10 0
2026-03-10T12:37:42.488 INFO:tasks.workunit.client.0.vm00.stdout:7/201: dread da/fe [0,4194304] 0
2026-03-10T12:37:42.493 INFO:tasks.workunit.client.1.vm07.stdout:4/385: unlink d0/d4/d10/d3c/f22 0
2026-03-10T12:37:42.493 INFO:tasks.workunit.client.0.vm00.stdout:7/202: getdents da/d1b/d2d 0
2026-03-10T12:37:42.494 INFO:tasks.workunit.client.1.vm07.stdout:4/386: mknod d0/d4/d10/d18/c84 0
2026-03-10T12:37:42.494 INFO:tasks.workunit.client.0.vm00.stdout:7/203: mknod da/c4d 0
2026-03-10T12:37:42.495 INFO:tasks.workunit.client.1.vm07.stdout:4/387: chown d0/d4/d5/da/l26 11608 1
2026-03-10T12:37:42.495 INFO:tasks.workunit.client.1.vm07.stdout:4/388: chown d0/d4/d7a/d46/d76 13 1
2026-03-10T12:37:42.496 INFO:tasks.workunit.client.1.vm07.stdout:4/389: creat d0/d4/d7a/d46/f85 x:0 0 0
2026-03-10T12:37:42.497 INFO:tasks.workunit.client.0.vm00.stdout:7/204: getdents da/d1b/d40 0
2026-03-10T12:37:42.497 INFO:tasks.workunit.client.0.vm00.stdout:7/205: stat da/d25/d2c/c3a 0
2026-03-10T12:37:42.498 INFO:tasks.workunit.client.0.vm00.stdout:7/206: creat da/d25/f4e x:0 0 0
2026-03-10T12:37:42.499 INFO:tasks.workunit.client.1.vm07.stdout:4/390: dread d0/d4/d7a/f2e [0,4194304] 0
2026-03-10T12:37:42.500 INFO:tasks.workunit.client.1.vm07.stdout:4/391: chown d0/d4/d5/l20 209 1
2026-03-10T12:37:42.500 INFO:tasks.workunit.client.0.vm00.stdout:7/207: dread da/d25/d2e/f43 [0,4194304] 0
2026-03-10T12:37:42.501 INFO:tasks.workunit.client.0.vm00.stdout:7/208: creat da/d25/d2c/f4f x:0 0 0
2026-03-10T12:37:42.502 INFO:tasks.workunit.client.0.vm00.stdout:7/209: mkdir da/d26/d50 0
2026-03-10T12:37:42.503 INFO:tasks.workunit.client.1.vm07.stdout:4/392: dwrite d0/d4/d5/f43 [0,4194304] 0
2026-03-10T12:37:42.505 INFO:tasks.workunit.client.0.vm00.stdout:7/210: creat da/d41/d48/f51 x:0 0 0
2026-03-10T12:37:42.507 INFO:tasks.workunit.client.1.vm07.stdout:4/393: link d0/d4/d5/d78/c81 d0/d4/d7a/d46/d76/c86 0
2026-03-10T12:37:42.511 INFO:tasks.workunit.client.1.vm07.stdout:4/394: dwrite d0/d4/d10/d5f/d6d/f71 [0,4194304] 0
2026-03-10T12:37:42.512 INFO:tasks.workunit.client.1.vm07.stdout:4/395: fdatasync d0/d4/d5/da/f44 0
2026-03-10T12:37:42.515 INFO:tasks.workunit.client.1.vm07.stdout:4/396: unlink d0/d4/d10/f6b 0
2026-03-10T12:37:42.516 INFO:tasks.workunit.client.0.vm00.stdout:7/211: dwrite da/f10 [0,4194304] 0
2026-03-10T12:37:42.517 INFO:tasks.workunit.client.1.vm07.stdout:4/397: rename d0/d4/d5/da/f4e to d0/d4/d7a/f87 0
2026-03-10T12:37:42.519 INFO:tasks.workunit.client.1.vm07.stdout:6/276: sync
2026-03-10T12:37:42.525 INFO:tasks.workunit.client.1.vm07.stdout:4/398: symlink d0/d4/l88 0
2026-03-10T12:37:42.526 INFO:tasks.workunit.client.1.vm07.stdout:4/399: write d0/d4/d10/d5f/d6d/f71 [2865029,31242] 0
2026-03-10T12:37:42.531 INFO:tasks.workunit.client.0.vm00.stdout:7/212: truncate da/f15 729263 0
2026-03-10T12:37:42.532 INFO:tasks.workunit.client.0.vm00.stdout:7/213: truncate da/d1b/f39 389160 0
2026-03-10T12:37:42.532 INFO:tasks.workunit.client.0.vm00.stdout:7/214: write da/d1b/d40/f44 [823313,14811] 0
2026-03-10T12:37:42.535
INFO:tasks.workunit.client.0.vm00.stdout:7/215: fdatasync da/d1b/d2d/f38 0 2026-03-10T12:37:42.536 INFO:tasks.workunit.client.0.vm00.stdout:7/216: symlink da/d26/l52 0 2026-03-10T12:37:42.538 INFO:tasks.workunit.client.1.vm07.stdout:4/400: chown d0/d4/d10/d5f/c67 821105 1 2026-03-10T12:37:42.539 INFO:tasks.workunit.client.0.vm00.stdout:7/217: dwrite da/d25/f4e [0,4194304] 0 2026-03-10T12:37:42.547 INFO:tasks.workunit.client.0.vm00.stdout:7/218: symlink da/d25/l53 0 2026-03-10T12:37:42.550 INFO:tasks.workunit.client.1.vm07.stdout:4/401: dread d0/d4/d5/da/f4d [0,4194304] 0 2026-03-10T12:37:42.556 INFO:tasks.workunit.client.1.vm07.stdout:4/402: read d0/d4/d5/d34/f5d [8921,52433] 0 2026-03-10T12:37:42.560 INFO:tasks.workunit.client.1.vm07.stdout:4/403: truncate d0/d4/d5/d34/f37 3548652 0 2026-03-10T12:37:42.561 INFO:tasks.workunit.client.1.vm07.stdout:4/404: unlink d0/f33 0 2026-03-10T12:37:42.562 INFO:tasks.workunit.client.1.vm07.stdout:4/405: write d0/d4/d5/f43 [1180421,81423] 0 2026-03-10T12:37:42.564 INFO:tasks.workunit.client.1.vm07.stdout:4/406: read d0/d4/d10/d18/f3e [2635792,86570] 0 2026-03-10T12:37:42.565 INFO:tasks.workunit.client.1.vm07.stdout:4/407: dread d0/d4/d5/d34/f5d [0,4194304] 0 2026-03-10T12:37:42.565 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:42 vm07.local ceph-mon[58582]: pgmap v156: 65 pgs: 65 active+clean; 1.1 GiB data, 4.3 GiB used, 116 GiB / 120 GiB avail; 16 MiB/s rd, 135 MiB/s wr, 355 op/s 2026-03-10T12:37:42.566 INFO:tasks.workunit.client.1.vm07.stdout:4/408: write d0/d4/d5/da/f6e [857935,16234] 0 2026-03-10T12:37:42.568 INFO:tasks.workunit.client.1.vm07.stdout:4/409: write d0/d4/d10/f36 [4133790,33561] 0 2026-03-10T12:37:42.570 INFO:tasks.workunit.client.1.vm07.stdout:4/410: getdents d0/d4/d10/d3c/d2b 0 2026-03-10T12:37:42.578 INFO:tasks.workunit.client.1.vm07.stdout:4/411: dwrite d0/d4/d7a/f4f [0,4194304] 0 2026-03-10T12:37:42.582 INFO:tasks.workunit.client.1.vm07.stdout:4/412: mknod d0/d4/d10/c89 0 2026-03-10T12:37:42.583 
INFO:tasks.workunit.client.1.vm07.stdout:4/413: rename d0/d4/d7a/ld to d0/l8a 0 2026-03-10T12:37:42.584 INFO:tasks.workunit.client.1.vm07.stdout:4/414: symlink d0/d4/d10/d5f/l8b 0 2026-03-10T12:37:42.587 INFO:tasks.workunit.client.1.vm07.stdout:4/415: creat d0/d4/d5/da/d66/f8c x:0 0 0 2026-03-10T12:37:42.617 INFO:tasks.workunit.client.1.vm07.stdout:9/290: creat d5/d13/f67 x:0 0 0 2026-03-10T12:37:42.617 INFO:tasks.workunit.client.1.vm07.stdout:8/348: write d1/f19 [664142,51425] 0 2026-03-10T12:37:42.621 INFO:tasks.workunit.client.1.vm07.stdout:9/291: mkdir d5/d16/d23/d26/d68 0 2026-03-10T12:37:42.621 INFO:tasks.workunit.client.1.vm07.stdout:9/292: fsync d5/d1f/f3d 0 2026-03-10T12:37:42.623 INFO:tasks.workunit.client.1.vm07.stdout:8/349: creat d1/d3/d6/d54/f72 x:0 0 0 2026-03-10T12:37:42.639 INFO:tasks.workunit.client.1.vm07.stdout:1/330: creat d9/f6c x:0 0 0 2026-03-10T12:37:42.639 INFO:tasks.workunit.client.1.vm07.stdout:9/293: dwrite d5/d13/d57/d4f/f63 [0,4194304] 0 2026-03-10T12:37:42.646 INFO:tasks.workunit.client.1.vm07.stdout:8/350: creat d1/d3/f73 x:0 0 0 2026-03-10T12:37:42.655 INFO:tasks.workunit.client.1.vm07.stdout:9/294: mkdir d5/d69 0 2026-03-10T12:37:42.660 INFO:tasks.workunit.client.1.vm07.stdout:8/351: truncate d1/d3/f1d 573262 0 2026-03-10T12:37:42.667 INFO:tasks.workunit.client.1.vm07.stdout:9/295: mkdir d5/d13/d57/d4f/d6a 0 2026-03-10T12:37:42.667 INFO:tasks.workunit.client.1.vm07.stdout:8/352: fdatasync d1/d3/d40/f4c 0 2026-03-10T12:37:42.677 INFO:tasks.workunit.client.1.vm07.stdout:9/296: dwrite d5/f65 [0,4194304] 0 2026-03-10T12:37:42.690 INFO:tasks.workunit.client.1.vm07.stdout:8/353: fsync d1/d3/f2d 0 2026-03-10T12:37:42.690 INFO:tasks.workunit.client.1.vm07.stdout:8/354: stat d1/d3/d6/d54 0 2026-03-10T12:37:42.694 INFO:tasks.workunit.client.1.vm07.stdout:8/355: rmdir d1/d3/d5d/d65 39 2026-03-10T12:37:42.704 INFO:tasks.workunit.client.0.vm00.stdout:6/224: write d2/d16/f17 [4594489,14146] 0 2026-03-10T12:37:42.704 
INFO:tasks.workunit.client.0.vm00.stdout:6/225: unlink d2/da/dc/c33 0 2026-03-10T12:37:42.704 INFO:tasks.workunit.client.0.vm00.stdout:6/226: creat d2/d16/d29/f54 x:0 0 0 2026-03-10T12:37:42.704 INFO:tasks.workunit.client.1.vm07.stdout:8/356: truncate d1/d3/f16 2716351 0 2026-03-10T12:37:42.704 INFO:tasks.workunit.client.1.vm07.stdout:8/357: stat d1/d3/d18/l31 0 2026-03-10T12:37:42.704 INFO:tasks.workunit.client.1.vm07.stdout:8/358: chown d1/d3/d6/d54/c63 5 1 2026-03-10T12:37:42.704 INFO:tasks.workunit.client.1.vm07.stdout:8/359: creat d1/d3/d6c/f74 x:0 0 0 2026-03-10T12:37:42.704 INFO:tasks.workunit.client.1.vm07.stdout:8/360: creat d1/d3/d18/f75 x:0 0 0 2026-03-10T12:37:42.704 INFO:tasks.workunit.client.1.vm07.stdout:8/361: chown d1/d3/d11/l1a 163604875 1 2026-03-10T12:37:42.704 INFO:tasks.workunit.client.1.vm07.stdout:8/362: chown d1/d3/d40/l61 1 1 2026-03-10T12:37:42.705 INFO:tasks.workunit.client.0.vm00.stdout:6/227: mknod d2/d16/d29/d31/d48/c55 0 2026-03-10T12:37:42.706 INFO:tasks.workunit.client.0.vm00.stdout:6/228: chown d2/da/dc/f13 17560687 1 2026-03-10T12:37:42.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:42 vm00.local ceph-mon[50686]: pgmap v156: 65 pgs: 65 active+clean; 1.1 GiB data, 4.3 GiB used, 116 GiB / 120 GiB avail; 16 MiB/s rd, 135 MiB/s wr, 355 op/s 2026-03-10T12:37:42.774 INFO:tasks.workunit.client.1.vm07.stdout:9/297: sync 2026-03-10T12:37:42.774 INFO:tasks.workunit.client.1.vm07.stdout:6/277: read d1/d4/d6/d16/d1a/d33/f37 [2464836,14164] 0 2026-03-10T12:37:42.785 INFO:tasks.workunit.client.1.vm07.stdout:3/360: dwrite dc/dd/d1f/d45/f56 [0,4194304] 0 2026-03-10T12:37:42.789 INFO:tasks.workunit.client.1.vm07.stdout:3/361: read dc/dd/f21 [7980383,36088] 0 2026-03-10T12:37:42.789 INFO:tasks.workunit.client.1.vm07.stdout:5/359: truncate d0/d22/d18/d19/d21/f2d 2787079 0 2026-03-10T12:37:42.795 INFO:tasks.workunit.client.1.vm07.stdout:6/278: read d1/d4/d6/d16/d1a/d33/f37 [171072,86303] 0 2026-03-10T12:37:42.796 
INFO:tasks.workunit.client.1.vm07.stdout:6/279: fdatasync d1/d4/f3f 0 2026-03-10T12:37:42.799 INFO:tasks.workunit.client.1.vm07.stdout:3/362: mknod dc/d18/d2d/d3d/c7e 0 2026-03-10T12:37:42.815 INFO:tasks.workunit.client.1.vm07.stdout:7/306: write d0/f40 [815431,90254] 0 2026-03-10T12:37:42.820 INFO:tasks.workunit.client.1.vm07.stdout:4/416: getdents d0/d4/d10/d18 0 2026-03-10T12:37:42.840 INFO:tasks.workunit.client.1.vm07.stdout:3/363: creat dc/dd/d28/d7a/f7f x:0 0 0 2026-03-10T12:37:42.842 INFO:tasks.workunit.client.1.vm07.stdout:5/360: rename d0/d22/d18/d19/d2e/d3f/l55 to d0/d22/d18/d19/l7c 0 2026-03-10T12:37:42.852 INFO:tasks.workunit.client.0.vm00.stdout:6/229: dread d2/d16/f20 [0,4194304] 0 2026-03-10T12:37:42.855 INFO:tasks.workunit.client.0.vm00.stdout:9/183: getdents d0/d5/dc 0 2026-03-10T12:37:42.856 INFO:tasks.workunit.client.0.vm00.stdout:9/184: creat d0/d5/d16/d1e/d2b/f42 x:0 0 0 2026-03-10T12:37:42.858 INFO:tasks.workunit.client.0.vm00.stdout:9/185: mkdir d0/d3d/d43 0 2026-03-10T12:37:42.860 INFO:tasks.workunit.client.0.vm00.stdout:9/186: rename d0/d5/l3e to d0/d5/l44 0 2026-03-10T12:37:42.861 INFO:tasks.workunit.client.0.vm00.stdout:9/187: write d0/d5/d16/d19/d2f/f3f [2402044,48248] 0 2026-03-10T12:37:42.863 INFO:tasks.workunit.client.0.vm00.stdout:9/188: dread d0/d5/d16/f39 [0,4194304] 0 2026-03-10T12:37:42.869 INFO:tasks.workunit.client.0.vm00.stdout:1/162: getdents da/d12 0 2026-03-10T12:37:42.869 INFO:tasks.workunit.client.0.vm00.stdout:1/163: chown da/d24 2991791 1 2026-03-10T12:37:42.869 INFO:tasks.workunit.client.0.vm00.stdout:1/164: dread - da/d24/d28/f37 zero size 2026-03-10T12:37:42.870 INFO:tasks.workunit.client.0.vm00.stdout:1/165: chown da/cf 3 1 2026-03-10T12:37:42.871 INFO:tasks.workunit.client.0.vm00.stdout:6/230: dread d2/d14/f2e [0,4194304] 0 2026-03-10T12:37:42.872 INFO:tasks.workunit.client.1.vm07.stdout:4/417: mkdir d0/d4/d10/d8d 0 2026-03-10T12:37:42.875 INFO:tasks.workunit.client.1.vm07.stdout:7/307: link d0/f10 d0/d52/f5d 0 
2026-03-10T12:37:42.876 INFO:tasks.workunit.client.1.vm07.stdout:7/308: write d0/f21 [2146990,68516] 0 2026-03-10T12:37:42.877 INFO:tasks.workunit.client.0.vm00.stdout:9/189: rename d0/d5/d16/d1e/d2b/f31 to d0/d5/d16/d19/d2f/d35/f45 0 2026-03-10T12:37:42.878 INFO:tasks.workunit.client.1.vm07.stdout:1/331: dwrite d9/f36 [0,4194304] 0 2026-03-10T12:37:42.878 INFO:tasks.workunit.client.0.vm00.stdout:9/190: truncate d0/d5/dc/f2a 4817535 0 2026-03-10T12:37:42.878 INFO:tasks.workunit.client.0.vm00.stdout:2/140: fsync d4/d6/f30 0 2026-03-10T12:37:42.879 INFO:tasks.workunit.client.0.vm00.stdout:9/191: write d0/d5/d16/f24 [1657599,47432] 0 2026-03-10T12:37:42.880 INFO:tasks.workunit.client.0.vm00.stdout:1/166: symlink da/d21/d39/l3b 0 2026-03-10T12:37:42.882 INFO:tasks.workunit.client.0.vm00.stdout:8/153: truncate d0/f28 3205553 0 2026-03-10T12:37:42.886 INFO:tasks.workunit.client.0.vm00.stdout:2/141: mkdir d4/d6/d2d/d31 0 2026-03-10T12:37:42.890 INFO:tasks.workunit.client.0.vm00.stdout:9/192: truncate d0/d5/f3a 120695 0 2026-03-10T12:37:42.894 INFO:tasks.workunit.client.0.vm00.stdout:6/231: creat d2/da/dc/d2f/f56 x:0 0 0 2026-03-10T12:37:42.894 INFO:tasks.workunit.client.1.vm07.stdout:5/361: rename d0/d22/d18/d19/d2e/d3f/d63/f73 to d0/d22/d18/d19/d21/d54/f7d 0 2026-03-10T12:37:42.894 INFO:tasks.workunit.client.1.vm07.stdout:4/418: mkdir d0/d8e 0 2026-03-10T12:37:42.895 INFO:tasks.workunit.client.0.vm00.stdout:9/193: dwrite d0/d5/f3a [0,4194304] 0 2026-03-10T12:37:42.897 INFO:tasks.workunit.client.1.vm07.stdout:4/419: dwrite d0/d4/d5/da/d66/f8c [0,4194304] 0 2026-03-10T12:37:42.899 INFO:tasks.workunit.client.0.vm00.stdout:6/232: dwrite d2/d14/f3b [0,4194304] 0 2026-03-10T12:37:42.899 INFO:tasks.workunit.client.1.vm07.stdout:1/332: write d9/df/d29/d2b/d31/f3c [1981086,23344] 0 2026-03-10T12:37:42.899 INFO:tasks.workunit.client.1.vm07.stdout:4/420: stat d0/d4/d5/da/l26 0 2026-03-10T12:37:42.905 INFO:tasks.workunit.client.0.vm00.stdout:8/154: dread d0/f8 [0,4194304] 0 
2026-03-10T12:37:42.906 INFO:tasks.workunit.client.0.vm00.stdout:8/155: stat d0/d12/d17/l18 0 2026-03-10T12:37:42.907 INFO:tasks.workunit.client.0.vm00.stdout:1/167: creat da/d24/d28/f3c x:0 0 0 2026-03-10T12:37:42.909 INFO:tasks.workunit.client.1.vm07.stdout:1/333: dwrite d9/df/f4a [0,4194304] 0 2026-03-10T12:37:42.909 INFO:tasks.workunit.client.0.vm00.stdout:1/168: dread da/fc [0,4194304] 0 2026-03-10T12:37:42.911 INFO:tasks.workunit.client.0.vm00.stdout:1/169: dread da/fc [0,4194304] 0 2026-03-10T12:37:42.914 INFO:tasks.workunit.client.0.vm00.stdout:9/194: rename d0/d5/d16/d19/d2f/d35/c40 to d0/d5/d16/d1e/c46 0 2026-03-10T12:37:42.916 INFO:tasks.workunit.client.0.vm00.stdout:2/142: mkdir d4/d6/d2d/d31/d32 0 2026-03-10T12:37:42.918 INFO:tasks.workunit.client.0.vm00.stdout:1/170: symlink da/d24/d28/l3d 0 2026-03-10T12:37:42.926 INFO:tasks.workunit.client.0.vm00.stdout:4/131: truncate df/f20 97927 0 2026-03-10T12:37:42.927 INFO:tasks.workunit.client.0.vm00.stdout:9/195: dread - d0/d5/f3b zero size 2026-03-10T12:37:42.929 INFO:tasks.workunit.client.0.vm00.stdout:9/196: dread d0/d5/f3a [0,4194304] 0 2026-03-10T12:37:42.930 INFO:tasks.workunit.client.0.vm00.stdout:9/197: write d0/f4 [427461,56807] 0 2026-03-10T12:37:42.932 INFO:tasks.workunit.client.1.vm07.stdout:7/309: rename d0/l3d to d0/d47/d48/l5e 0 2026-03-10T12:37:42.932 INFO:tasks.workunit.client.0.vm00.stdout:1/171: truncate da/d12/f30 2890127 0 2026-03-10T12:37:42.935 INFO:tasks.workunit.client.0.vm00.stdout:1/172: dwrite da/f13 [0,4194304] 0 2026-03-10T12:37:42.940 INFO:tasks.workunit.client.0.vm00.stdout:1/173: dread da/f13 [0,4194304] 0 2026-03-10T12:37:42.944 INFO:tasks.workunit.client.0.vm00.stdout:1/174: dread da/d12/d26/f31 [0,4194304] 0 2026-03-10T12:37:42.945 INFO:tasks.workunit.client.0.vm00.stdout:1/175: read da/d24/f32 [2395988,52896] 0 2026-03-10T12:37:42.947 INFO:tasks.workunit.client.0.vm00.stdout:6/233: mknod d2/d16/d29/d31/c57 0 2026-03-10T12:37:42.948 
INFO:tasks.workunit.client.0.vm00.stdout:4/132: mknod df/d1f/d22/d26/c2c 0 2026-03-10T12:37:42.951 INFO:tasks.workunit.client.1.vm07.stdout:8/363: truncate d1/f7 3376335 0 2026-03-10T12:37:42.951 INFO:tasks.workunit.client.1.vm07.stdout:9/298: write d5/d13/d22/f36 [771427,40661] 0 2026-03-10T12:37:42.952 INFO:tasks.workunit.client.1.vm07.stdout:9/299: stat d5/d13/l15 0 2026-03-10T12:37:42.954 INFO:tasks.workunit.client.0.vm00.stdout:2/143: mknod d4/d6/d2d/d31/d32/c33 0 2026-03-10T12:37:42.957 INFO:tasks.workunit.client.0.vm00.stdout:2/144: write d4/d6/f2e [1043254,97945] 0 2026-03-10T12:37:42.958 INFO:tasks.workunit.client.0.vm00.stdout:9/198: chown d0/d5/dc/l38 0 1 2026-03-10T12:37:42.958 INFO:tasks.workunit.client.0.vm00.stdout:5/141: truncate f11 3982737 0 2026-03-10T12:37:42.958 INFO:tasks.workunit.client.0.vm00.stdout:5/142: read d1f/f2c [437582,29303] 0 2026-03-10T12:37:42.959 INFO:tasks.workunit.client.0.vm00.stdout:5/143: write f16 [403935,95625] 0 2026-03-10T12:37:42.960 INFO:tasks.workunit.client.1.vm07.stdout:0/396: dwrite d0/d14/f19 [0,4194304] 0 2026-03-10T12:37:42.960 INFO:tasks.workunit.client.0.vm00.stdout:5/144: write d1f/f32 [697822,39922] 0 2026-03-10T12:37:42.960 INFO:tasks.workunit.client.0.vm00.stdout:5/145: truncate f12 1527905 0 2026-03-10T12:37:42.961 INFO:tasks.workunit.client.1.vm07.stdout:6/280: dwrite d1/d4/f3b [0,4194304] 0 2026-03-10T12:37:42.963 INFO:tasks.workunit.client.1.vm07.stdout:4/421: mkdir d0/d4/d5/d8f 0 2026-03-10T12:37:42.967 INFO:tasks.workunit.client.1.vm07.stdout:4/422: readlink d0/d4/d5/l59 0 2026-03-10T12:37:42.968 INFO:tasks.workunit.client.0.vm00.stdout:4/133: rename df/d1f/d22/d26/f28 to df/d1f/d22/f2d 0 2026-03-10T12:37:42.974 INFO:tasks.workunit.client.0.vm00.stdout:2/145: creat d4/d6/f34 x:0 0 0 2026-03-10T12:37:42.988 INFO:tasks.workunit.client.0.vm00.stdout:2/146: write d4/dd/f10 [9056658,114342] 0 2026-03-10T12:37:42.988 INFO:tasks.workunit.client.0.vm00.stdout:7/219: truncate da/d25/f4e 1369938 0 
2026-03-10T12:37:42.988 INFO:tasks.workunit.client.0.vm00.stdout:5/146: fdatasync f19 0 2026-03-10T12:37:42.988 INFO:tasks.workunit.client.0.vm00.stdout:4/134: rename df/d1f/d25 to df/d1f/d22/d26/d2e 0 2026-03-10T12:37:42.988 INFO:tasks.workunit.client.0.vm00.stdout:4/135: dwrite df/f11 [0,4194304] 0 2026-03-10T12:37:42.990 INFO:tasks.workunit.client.0.vm00.stdout:2/147: chown d4/d6/cc 62 1 2026-03-10T12:37:42.990 INFO:tasks.workunit.client.0.vm00.stdout:4/136: dwrite fa [0,4194304] 0 2026-03-10T12:37:42.991 INFO:tasks.workunit.client.1.vm07.stdout:8/364: dwrite d1/d3/d11/f47 [0,4194304] 0 2026-03-10T12:37:42.991 INFO:tasks.workunit.client.0.vm00.stdout:2/148: write d4/dd/f10 [6306496,15834] 0 2026-03-10T12:37:42.992 INFO:tasks.workunit.client.0.vm00.stdout:2/149: readlink d4/dd/l29 0 2026-03-10T12:37:42.995 INFO:tasks.workunit.client.1.vm07.stdout:8/365: truncate d1/d3/d18/f2e 793590 0 2026-03-10T12:37:42.999 INFO:tasks.workunit.client.0.vm00.stdout:5/147: mknod d1f/d26/c34 0 2026-03-10T12:37:43.007 INFO:tasks.workunit.client.0.vm00.stdout:2/150: symlink d4/d6/l35 0 2026-03-10T12:37:43.007 INFO:tasks.workunit.client.0.vm00.stdout:2/151: stat d4/d6/c2a 0 2026-03-10T12:37:43.007 INFO:tasks.workunit.client.0.vm00.stdout:9/199: link d0/d5/d16/d19/f1b d0/d5/d16/d1e/d2b/f47 0 2026-03-10T12:37:43.007 INFO:tasks.workunit.client.0.vm00.stdout:2/152: unlink d4/d6/c13 0 2026-03-10T12:37:43.007 INFO:tasks.workunit.client.0.vm00.stdout:2/153: rename d4/d6/d2d/d31 to d4/d6/d2d/d31/d36 22 2026-03-10T12:37:43.007 INFO:tasks.workunit.client.0.vm00.stdout:5/148: mkdir d1f/d26/d2b/d35 0 2026-03-10T12:37:43.007 INFO:tasks.workunit.client.0.vm00.stdout:9/200: dwrite d0/d5/d16/d1e/d2b/f42 [0,4194304] 0 2026-03-10T12:37:43.007 INFO:tasks.workunit.client.0.vm00.stdout:1/176: sync 2026-03-10T12:37:43.009 INFO:tasks.workunit.client.0.vm00.stdout:9/201: truncate d0/d5/d16/f30 5070767 0 2026-03-10T12:37:43.011 INFO:tasks.workunit.client.1.vm07.stdout:9/300: mkdir d5/d1f/d5e/d6b 0 
2026-03-10T12:37:43.011 INFO:tasks.workunit.client.1.vm07.stdout:7/310: creat d0/f5f x:0 0 0 2026-03-10T12:37:43.013 INFO:tasks.workunit.client.0.vm00.stdout:6/234: creat d2/d51/f58 x:0 0 0 2026-03-10T12:37:43.014 INFO:tasks.workunit.client.1.vm07.stdout:0/397: mkdir d0/d14/d7c 0 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.1.vm07.stdout:9/301: write d5/d13/d22/f5f [1711738,47350] 0 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.1.vm07.stdout:9/302: fdatasync d5/d16/f35 0 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.1.vm07.stdout:9/303: write d5/d16/d23/d26/f46 [2139795,95539] 0 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.1.vm07.stdout:0/398: symlink d0/d62/l7d 0 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.1.vm07.stdout:8/366: write d1/d3/d5d/d65/f67 [473742,29804] 0 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.0.vm00.stdout:1/177: write da/d12/f1d [1998715,126053] 0 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.0.vm00.stdout:2/154: rename d4/d6/cc to d4/c37 0 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.0.vm00.stdout:9/202: creat d0/d5/f48 x:0 0 0 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.0.vm00.stdout:9/203: chown d0/f1a 478229 1 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.0.vm00.stdout:5/149: link d1f/d26/c34 d1f/d26/d2b/d35/c36 0 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.0.vm00.stdout:9/204: rename d0/d5/f48 to d0/d5/d16/f49 0 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.0.vm00.stdout:2/155: mkdir d4/dd/d38 0 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.0.vm00.stdout:6/235: creat d2/d16/d29/d31/d48/f59 x:0 0 0 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.0.vm00.stdout:9/205: creat d0/d5/d16/d19/d2f/d35/f4a x:0 0 0 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.0.vm00.stdout:6/236: creat d2/d16/d29/d31/d48/f5a x:0 0 0 2026-03-10T12:37:43.028 INFO:tasks.workunit.client.0.vm00.stdout:9/206: dwrite d0/d5/f3a [0,4194304] 0 2026-03-10T12:37:43.032 
INFO:tasks.workunit.client.0.vm00.stdout:2/156: dread d4/d6/f22 [0,4194304] 0 2026-03-10T12:37:43.036 INFO:tasks.workunit.client.0.vm00.stdout:3/161: dwrite dd/d18/d13/f22 [0,4194304] 0 2026-03-10T12:37:43.040 INFO:tasks.workunit.client.0.vm00.stdout:6/237: mknod d2/da/dc/c5b 0 2026-03-10T12:37:43.045 INFO:tasks.workunit.client.1.vm07.stdout:0/399: chown d0/d14/d5f/d3b/f5b 87 1 2026-03-10T12:37:43.048 INFO:tasks.workunit.client.0.vm00.stdout:3/162: creat dd/d18/d14/f3c x:0 0 0 2026-03-10T12:37:43.048 INFO:tasks.workunit.client.0.vm00.stdout:3/163: chown dd/d2a/d39 42 1 2026-03-10T12:37:43.049 INFO:tasks.workunit.client.0.vm00.stdout:3/164: write dd/d18/d14/f2f [165232,93617] 0 2026-03-10T12:37:43.050 INFO:tasks.workunit.client.0.vm00.stdout:2/157: creat d4/f39 x:0 0 0 2026-03-10T12:37:43.051 INFO:tasks.workunit.client.0.vm00.stdout:3/165: mkdir dd/d3d 0 2026-03-10T12:37:43.052 INFO:tasks.workunit.client.1.vm07.stdout:3/364: dwrite dc/dd/d1f/d45/f5e [0,4194304] 0 2026-03-10T12:37:43.052 INFO:tasks.workunit.client.0.vm00.stdout:2/158: mkdir d4/d6/d2d/d3a 0 2026-03-10T12:37:43.055 INFO:tasks.workunit.client.0.vm00.stdout:2/159: dwrite d4/d6/f30 [0,4194304] 0 2026-03-10T12:37:43.057 INFO:tasks.workunit.client.0.vm00.stdout:2/160: write d4/dd/f10 [5702599,55346] 0 2026-03-10T12:37:43.060 INFO:tasks.workunit.client.0.vm00.stdout:2/161: mkdir d4/d6/d2d/d31/d32/d3b 0 2026-03-10T12:37:43.063 INFO:tasks.workunit.client.0.vm00.stdout:2/162: dwrite d4/d6/f16 [0,4194304] 0 2026-03-10T12:37:43.065 INFO:tasks.workunit.client.0.vm00.stdout:2/163: fsync d4/f1d 0 2026-03-10T12:37:43.065 INFO:tasks.workunit.client.0.vm00.stdout:2/164: write d4/d6/f34 [267694,66747] 0 2026-03-10T12:37:43.078 INFO:tasks.workunit.client.1.vm07.stdout:9/304: mkdir d5/d13/d6c 0 2026-03-10T12:37:43.092 INFO:tasks.workunit.client.0.vm00.stdout:0/234: creat d3/d22/f54 x:0 0 0 2026-03-10T12:37:43.096 INFO:tasks.workunit.client.1.vm07.stdout:3/365: creat dc/d18/d2d/f80 x:0 0 0 2026-03-10T12:37:43.113 
INFO:tasks.workunit.client.1.vm07.stdout:8/367: symlink d1/d3/d5d/l76 0 2026-03-10T12:37:43.113 INFO:tasks.workunit.client.0.vm00.stdout:0/235: unlink d3/d7/l3b 0 2026-03-10T12:37:43.113 INFO:tasks.workunit.client.0.vm00.stdout:0/236: creat d3/d22/f55 x:0 0 0 2026-03-10T12:37:43.113 INFO:tasks.workunit.client.0.vm00.stdout:0/237: write d3/d1b/f2b [1767242,65695] 0 2026-03-10T12:37:43.113 INFO:tasks.workunit.client.0.vm00.stdout:0/238: creat d3/d1b/f56 x:0 0 0 2026-03-10T12:37:43.114 INFO:tasks.workunit.client.1.vm07.stdout:0/400: mknod d0/d14/d5f/d76/d2f/d31/d6b/c7e 0 2026-03-10T12:37:43.114 INFO:tasks.workunit.client.1.vm07.stdout:3/366: mkdir dc/dd/d43/d5c/d81 0 2026-03-10T12:37:43.114 INFO:tasks.workunit.client.1.vm07.stdout:9/305: symlink d5/d1f/d5e/d6b/l6d 0 2026-03-10T12:37:43.114 INFO:tasks.workunit.client.1.vm07.stdout:8/368: rename d1/d3/f29 to d1/d3/d11/f77 0 2026-03-10T12:37:43.114 INFO:tasks.workunit.client.1.vm07.stdout:4/423: dwrite d0/d4/d10/d18/f1a [0,4194304] 0 2026-03-10T12:37:43.114 INFO:tasks.workunit.client.1.vm07.stdout:8/369: readlink d1/d3/d6/d50/l69 0 2026-03-10T12:37:43.114 INFO:tasks.workunit.client.1.vm07.stdout:0/401: unlink d0/d14/d5f/d76/f2c 0 2026-03-10T12:37:43.114 INFO:tasks.workunit.client.1.vm07.stdout:9/306: symlink d5/d16/d18/l6e 0 2026-03-10T12:37:43.115 INFO:tasks.workunit.client.1.vm07.stdout:3/367: symlink dc/d18/l82 0 2026-03-10T12:37:43.116 INFO:tasks.workunit.client.0.vm00.stdout:0/239: creat d3/d1b/f57 x:0 0 0 2026-03-10T12:37:43.117 INFO:tasks.workunit.client.1.vm07.stdout:4/424: write d0/d4/d10/d3c/d2b/d2d/f65 [3408383,11825] 0 2026-03-10T12:37:43.119 INFO:tasks.workunit.client.1.vm07.stdout:4/425: read d0/d4/d7a/f4f [643435,112693] 0 2026-03-10T12:37:43.121 INFO:tasks.workunit.client.1.vm07.stdout:4/426: fdatasync d0/d4/d10/d3c/f68 0 2026-03-10T12:37:43.124 INFO:tasks.workunit.client.0.vm00.stdout:3/166: sync 2026-03-10T12:37:43.124 INFO:tasks.workunit.client.0.vm00.stdout:3/167: fdatasync dd/d18/d14/d2b/f31 0 
2026-03-10T12:37:43.125 INFO:tasks.workunit.client.1.vm07.stdout:0/402: symlink d0/d14/d5f/d76/d2f/d31/d6b/l7f 0 2026-03-10T12:37:43.125 INFO:tasks.workunit.client.0.vm00.stdout:3/168: write fb [490380,34181] 0 2026-03-10T12:37:43.125 INFO:tasks.workunit.client.1.vm07.stdout:9/307: mknod d5/d16/d23/c6f 0 2026-03-10T12:37:43.126 INFO:tasks.workunit.client.0.vm00.stdout:0/240: mkdir d3/d7/d58 0 2026-03-10T12:37:43.129 INFO:tasks.workunit.client.0.vm00.stdout:0/241: dwrite d3/d22/f42 [0,4194304] 0 2026-03-10T12:37:43.131 INFO:tasks.workunit.client.0.vm00.stdout:2/165: sync 2026-03-10T12:37:43.131 INFO:tasks.workunit.client.0.vm00.stdout:0/242: write d3/d7/f15 [519291,75685] 0 2026-03-10T12:37:43.132 INFO:tasks.workunit.client.1.vm07.stdout:4/427: symlink d0/d4/d5/da/d66/l90 0 2026-03-10T12:37:43.133 INFO:tasks.workunit.client.1.vm07.stdout:0/403: write d0/d14/d5f/d76/f30 [4877390,771] 0 2026-03-10T12:37:43.135 INFO:tasks.workunit.client.0.vm00.stdout:0/243: creat d3/d40/f59 x:0 0 0 2026-03-10T12:37:43.135 INFO:tasks.workunit.client.0.vm00.stdout:3/169: dwrite dd/f25 [0,4194304] 0 2026-03-10T12:37:43.135 INFO:tasks.workunit.client.1.vm07.stdout:3/368: symlink dc/dd/d1f/d6f/l83 0 2026-03-10T12:37:43.137 INFO:tasks.workunit.client.1.vm07.stdout:3/369: write dc/dd/d1f/d45/f5e [685466,48766] 0 2026-03-10T12:37:43.137 INFO:tasks.workunit.client.0.vm00.stdout:3/170: dread - dd/d18/d14/d2b/f31 zero size 2026-03-10T12:37:43.139 INFO:tasks.workunit.client.0.vm00.stdout:2/166: creat d4/dd/f3c x:0 0 0 2026-03-10T12:37:43.139 INFO:tasks.workunit.client.0.vm00.stdout:2/167: chown d4/dd/c21 20024 1 2026-03-10T12:37:43.140 INFO:tasks.workunit.client.1.vm07.stdout:0/404: rename d0/lc to d0/d14/d5f/d3b/l80 0 2026-03-10T12:37:43.141 INFO:tasks.workunit.client.0.vm00.stdout:3/171: write dd/d18/f12 [1879940,8785] 0 2026-03-10T12:37:43.141 INFO:tasks.workunit.client.1.vm07.stdout:8/370: link d1/d3/d5d/l76 d1/d3/d6/l78 0 2026-03-10T12:37:43.142 
INFO:tasks.workunit.client.0.vm00.stdout:2/168: dwrite d4/dd/f3c [0,4194304] 0 2026-03-10T12:37:43.142 INFO:tasks.workunit.client.1.vm07.stdout:3/370: rename dc/d18/d24/c6c to dc/dd/d1f/d6f/c84 0 2026-03-10T12:37:43.143 INFO:tasks.workunit.client.0.vm00.stdout:0/244: dwrite d3/d1b/d38/d44/f49 [0,4194304] 0 2026-03-10T12:37:43.143 INFO:tasks.workunit.client.1.vm07.stdout:4/428: creat d0/d19/f91 x:0 0 0 2026-03-10T12:37:43.150 INFO:tasks.workunit.client.0.vm00.stdout:0/245: fsync d3/db/d24/d25/f43 0 2026-03-10T12:37:43.151 INFO:tasks.workunit.client.0.vm00.stdout:0/246: dread - d3/d22/f54 zero size 2026-03-10T12:37:43.157 INFO:tasks.workunit.client.0.vm00.stdout:0/247: dwrite d3/db/f16 [0,4194304] 0 2026-03-10T12:37:43.170 INFO:tasks.workunit.client.1.vm07.stdout:4/429: dread d0/d4/d5/f43 [0,4194304] 0 2026-03-10T12:37:43.170 INFO:tasks.workunit.client.1.vm07.stdout:4/430: chown d0/d4/d5/l20 7359384 1 2026-03-10T12:37:43.170 INFO:tasks.workunit.client.0.vm00.stdout:3/172: unlink dd/d27/l28 0 2026-03-10T12:37:43.171 INFO:tasks.workunit.client.0.vm00.stdout:3/173: readlink dd/d27/l2e 0 2026-03-10T12:37:43.171 INFO:tasks.workunit.client.0.vm00.stdout:0/248: stat d3/d7/d3c/f30 0 2026-03-10T12:37:43.171 INFO:tasks.workunit.client.0.vm00.stdout:3/174: truncate dd/d18/d14/d2b/f31 90657 0 2026-03-10T12:37:43.171 INFO:tasks.workunit.client.0.vm00.stdout:8/156: dwrite d0/f28 [0,4194304] 0 2026-03-10T12:37:43.173 INFO:tasks.workunit.client.0.vm00.stdout:1/178: truncate da/d12/f30 2945067 0 2026-03-10T12:37:43.174 INFO:tasks.workunit.client.1.vm07.stdout:4/431: dread d0/d4/d5/da/f6e [0,4194304] 0 2026-03-10T12:37:43.175 INFO:tasks.workunit.client.0.vm00.stdout:1/179: dread da/f13 [0,4194304] 0 2026-03-10T12:37:43.179 INFO:tasks.workunit.client.0.vm00.stdout:1/180: mknod da/d12/d26/c3e 0 2026-03-10T12:37:43.181 INFO:tasks.workunit.client.0.vm00.stdout:3/175: creat dd/d3d/f3e x:0 0 0 2026-03-10T12:37:43.181 INFO:tasks.workunit.client.0.vm00.stdout:3/176: fsync f7 0 
2026-03-10T12:37:43.181 INFO:tasks.workunit.client.0.vm00.stdout:3/177: stat dd/d27/d2c/d34 0
2026-03-10T12:37:43.184 INFO:tasks.workunit.client.0.vm00.stdout:3/178: dread dd/d18/d13/f22 [0,4194304] 0
2026-03-10T12:37:43.185 INFO:tasks.workunit.client.0.vm00.stdout:0/249: mkdir d3/d1b/d38/d44/d5a 0
2026-03-10T12:37:43.187 INFO:tasks.workunit.client.0.vm00.stdout:3/179: dread dd/d27/f35 [0,4194304] 0
2026-03-10T12:37:43.187 INFO:tasks.workunit.client.1.vm07.stdout:3/371: creat dc/dd/f85 x:0 0 0
2026-03-10T12:37:43.188 INFO:tasks.workunit.client.0.vm00.stdout:3/180: write dd/d18/d13/f22 [2572647,58679] 0
2026-03-10T12:37:43.191 INFO:tasks.workunit.client.0.vm00.stdout:0/250: dwrite d3/d7/f1c [0,4194304] 0
2026-03-10T12:37:43.192 INFO:tasks.workunit.client.0.vm00.stdout:0/251: dread - d3/d22/f54 zero size
2026-03-10T12:37:43.194 INFO:tasks.workunit.client.0.vm00.stdout:1/181: symlink da/d24/l3f 0
2026-03-10T12:37:43.194 INFO:tasks.workunit.client.0.vm00.stdout:1/182: chown da/d12/c1c 15601983 1
2026-03-10T12:37:43.197 INFO:tasks.workunit.client.0.vm00.stdout:1/183: write da/f13 [1409720,23481] 0
2026-03-10T12:37:43.198 INFO:tasks.workunit.client.0.vm00.stdout:1/184: dread f5 [0,4194304] 0
2026-03-10T12:37:43.200 INFO:tasks.workunit.client.1.vm07.stdout:3/372: rename l7 to dc/d18/d24/d72/l86 0
2026-03-10T12:37:43.201 INFO:tasks.workunit.client.1.vm07.stdout:3/373: chown dc/dd/l48 512304 1
2026-03-10T12:37:43.202 INFO:tasks.workunit.client.1.vm07.stdout:4/432: mknod d0/d5c/d7c/c92 0
2026-03-10T12:37:43.202 INFO:tasks.workunit.client.0.vm00.stdout:1/185: dwrite da/d24/d28/f3c [0,4194304] 0
2026-03-10T12:37:43.204 INFO:tasks.workunit.client.0.vm00.stdout:3/181: symlink dd/d2a/d39/l3f 0
2026-03-10T12:37:43.210 INFO:tasks.workunit.client.1.vm07.stdout:4/433: creat d0/d4/d7a/f93 x:0 0 0
2026-03-10T12:37:43.210 INFO:tasks.workunit.client.0.vm00.stdout:3/182: read f9 [2481282,2699] 0
2026-03-10T12:37:43.210 INFO:tasks.workunit.client.0.vm00.stdout:3/183: write dd/d3d/f3e [91291,97887] 0
2026-03-10T12:37:43.210 INFO:tasks.workunit.client.0.vm00.stdout:1/186: dread da/d24/f32 [0,4194304] 0
2026-03-10T12:37:43.210 INFO:tasks.workunit.client.0.vm00.stdout:3/184: rmdir dd/d18/d13 39
2026-03-10T12:37:43.218 INFO:tasks.workunit.client.1.vm07.stdout:5/362: dwrite d0/d22/d18/d19/f23 [0,4194304] 0
2026-03-10T12:37:43.219 INFO:tasks.workunit.client.0.vm00.stdout:1/187: creat da/d12/d26/f40 x:0 0 0
2026-03-10T12:37:43.221 INFO:tasks.workunit.client.1.vm07.stdout:4/434: getdents d0/d5c 0
2026-03-10T12:37:43.228 INFO:tasks.workunit.client.1.vm07.stdout:4/435: chown d0/c41 3593 1
2026-03-10T12:37:43.231 INFO:tasks.workunit.client.1.vm07.stdout:4/436: creat d0/d4/d5/d34/f94 x:0 0 0
2026-03-10T12:37:43.246 INFO:tasks.workunit.client.1.vm07.stdout:4/437: mkdir d0/d4/d5/da/d95 0
2026-03-10T12:37:43.254 INFO:tasks.workunit.client.1.vm07.stdout:4/438: symlink d0/d4/d10/d5f/l96 0
2026-03-10T12:37:43.360 INFO:tasks.workunit.client.0.vm00.stdout:4/137: read fb [711944,70854] 0
2026-03-10T12:37:43.361 INFO:tasks.workunit.client.0.vm00.stdout:4/138: chown df/l23 481 1
2026-03-10T12:37:43.361 INFO:tasks.workunit.client.0.vm00.stdout:4/139: write df/f1b [523617,46632] 0
2026-03-10T12:37:43.365 INFO:tasks.workunit.client.0.vm00.stdout:4/140: dwrite df/f1b [0,4194304] 0
2026-03-10T12:37:43.375 INFO:tasks.workunit.client.1.vm07.stdout:2/242: dread d0/d42/d26/f2e [0,4194304] 0
2026-03-10T12:37:43.375 INFO:tasks.workunit.client.1.vm07.stdout:2/243: chown d0/lb 203692728 1
2026-03-10T12:37:43.375 INFO:tasks.workunit.client.0.vm00.stdout:4/141: dwrite df/d1f/d22/f2d [0,4194304] 0
2026-03-10T12:37:43.375 INFO:tasks.workunit.client.0.vm00.stdout:4/142: chown df/f1c 3 1
2026-03-10T12:37:43.375 INFO:tasks.workunit.client.0.vm00.stdout:4/143: truncate fb 2422334 0
2026-03-10T12:37:43.379 INFO:tasks.workunit.client.0.vm00.stdout:4/144: getdents df/d1f/d22 0
2026-03-10T12:37:43.379 INFO:tasks.workunit.client.1.vm07.stdout:2/244: truncate d0/f1d 273336 0
2026-03-10T12:37:43.380 INFO:tasks.workunit.client.1.vm07.stdout:2/245: chown d0/d42 59467 1
2026-03-10T12:37:43.383 INFO:tasks.workunit.client.0.vm00.stdout:4/145: dread f9 [4194304,4194304] 0
2026-03-10T12:37:43.383 INFO:tasks.workunit.client.0.vm00.stdout:4/146: readlink df/l23 0
2026-03-10T12:37:43.400 INFO:tasks.workunit.client.1.vm07.stdout:2/246: creat d0/d42/d26/d4b/f51 x:0 0 0
2026-03-10T12:37:43.410 INFO:tasks.workunit.client.1.vm07.stdout:2/247: rename d0/d42/f1e to d0/d42/d26/f52 0
2026-03-10T12:37:43.412 INFO:tasks.workunit.client.1.vm07.stdout:2/248: dwrite d0/d42/d26/d38/f3a [0,4194304] 0
2026-03-10T12:37:43.416 INFO:tasks.workunit.client.1.vm07.stdout:2/249: dread d0/f40 [0,4194304] 0
2026-03-10T12:37:43.452 INFO:tasks.workunit.client.0.vm00.stdout:4/147: read fb [230258,86265] 0
2026-03-10T12:37:43.452 INFO:tasks.workunit.client.0.vm00.stdout:4/148: readlink df/l23 0
2026-03-10T12:37:43.453 INFO:tasks.workunit.client.0.vm00.stdout:4/149: creat df/d24/f2f x:0 0 0
2026-03-10T12:37:43.455 INFO:tasks.workunit.client.0.vm00.stdout:4/150: rmdir df/d1f/d22 39
2026-03-10T12:37:43.456 INFO:tasks.workunit.client.0.vm00.stdout:4/151: fsync df/f16 0
2026-03-10T12:37:43.461 INFO:tasks.workunit.client.0.vm00.stdout:4/152: creat df/d1f/d22/f30 x:0 0 0
2026-03-10T12:37:43.462 INFO:tasks.workunit.client.0.vm00.stdout:4/153: read fa [84072,71374] 0
2026-03-10T12:37:43.463 INFO:tasks.workunit.client.0.vm00.stdout:4/154: creat df/d1f/d22/d26/f31 x:0 0 0
2026-03-10T12:37:43.464 INFO:tasks.workunit.client.0.vm00.stdout:5/150: getdents d1f/d26 0
2026-03-10T12:37:43.465 INFO:tasks.workunit.client.0.vm00.stdout:4/155: rmdir df/d1f/d22/d26/d2e 39
2026-03-10T12:37:43.466 INFO:tasks.workunit.client.0.vm00.stdout:5/151: mkdir d1f/d26/d2b/d37 0
2026-03-10T12:37:43.466 INFO:tasks.workunit.client.0.vm00.stdout:5/152: dread - d1f/d26/f28 zero size
2026-03-10T12:37:43.471 INFO:tasks.workunit.client.0.vm00.stdout:5/153: link d1f/f21 d1f/d26/d2b/d37/f38 0
2026-03-10T12:37:43.476 INFO:tasks.workunit.client.0.vm00.stdout:5/154: unlink lb 0
2026-03-10T12:37:43.476 INFO:tasks.workunit.client.0.vm00.stdout:5/155: dwrite d1f/f27 [0,4194304] 0
2026-03-10T12:37:43.478 INFO:tasks.workunit.client.1.vm07.stdout:7/311: dwrite d0/f10 [0,4194304] 0
2026-03-10T12:37:43.480 INFO:tasks.workunit.client.0.vm00.stdout:5/156: dwrite d1f/f25 [4194304,4194304] 0
2026-03-10T12:37:43.480 INFO:tasks.workunit.client.1.vm07.stdout:7/312: mknod d0/d52/c60 0
2026-03-10T12:37:43.489 INFO:tasks.workunit.client.1.vm07.stdout:7/313: read d0/f3 [1221315,14054] 0
2026-03-10T12:37:43.489 INFO:tasks.workunit.client.1.vm07.stdout:6/281: dread d1/d4/d6/f30 [0,4194304] 0
2026-03-10T12:37:43.490 INFO:tasks.workunit.client.1.vm07.stdout:6/282: chown d1/d4/d6/l1b 54 1
2026-03-10T12:37:43.493 INFO:tasks.workunit.client.1.vm07.stdout:6/283: chown d1/l14 101200727 1
2026-03-10T12:37:43.495 INFO:tasks.workunit.client.0.vm00.stdout:5/157: unlink d1f/d26/c29 0
2026-03-10T12:37:43.496 INFO:tasks.workunit.client.0.vm00.stdout:5/158: chown d1f 125630 1
2026-03-10T12:37:43.496 INFO:tasks.workunit.client.1.vm07.stdout:6/284: unlink d1/d4/d6/d16/l24 0
2026-03-10T12:37:43.497 INFO:tasks.workunit.client.0.vm00.stdout:5/159: mkdir d1f/d39 0
2026-03-10T12:37:43.500 INFO:tasks.workunit.client.0.vm00.stdout:9/207: truncate d0/d5/d16/f30 3854615 0
2026-03-10T12:37:43.503 INFO:tasks.workunit.client.0.vm00.stdout:9/208: dread d0/d5/d16/d19/d2f/f3f [0,4194304] 0
2026-03-10T12:37:43.503 INFO:tasks.workunit.client.1.vm07.stdout:6/285: dwrite d1/d4/f3b [0,4194304] 0
2026-03-10T12:37:43.503 INFO:tasks.workunit.client.1.vm07.stdout:7/314: mkdir d0/d61 0
2026-03-10T12:37:43.503 INFO:tasks.workunit.client.1.vm07.stdout:6/286: dread - d1/d4/d4a/f56 zero size
2026-03-10T12:37:43.503 INFO:tasks.workunit.client.1.vm07.stdout:7/315: write d0/d47/d48/f54 [790042,34257] 0
2026-03-10T12:37:43.506 INFO:tasks.workunit.client.0.vm00.stdout:9/209: dwrite d0/d5/f3b [0,4194304] 0
2026-03-10T12:37:43.506 INFO:tasks.workunit.client.1.vm07.stdout:6/287: fdatasync d1/d4/f5a 0
2026-03-10T12:37:43.510 INFO:tasks.workunit.client.0.vm00.stdout:9/210: mknod d0/d5/d16/d1e/d27/c4b 0
2026-03-10T12:37:43.511 INFO:tasks.workunit.client.0.vm00.stdout:5/160: dread f12 [0,4194304] 0
2026-03-10T12:37:43.511 INFO:tasks.workunit.client.0.vm00.stdout:5/161: readlink d1f/d26/l2d 0
2026-03-10T12:37:43.511 INFO:tasks.workunit.client.0.vm00.stdout:9/211: write d0/d5/d16/d19/f32 [1712231,6703] 0
2026-03-10T12:37:43.512 INFO:tasks.workunit.client.0.vm00.stdout:5/162: truncate f12 2103053 0
2026-03-10T12:37:43.512 INFO:tasks.workunit.client.0.vm00.stdout:5/163: write d1f/f32 [672154,92579] 0
2026-03-10T12:37:43.519 INFO:tasks.workunit.client.0.vm00.stdout:5/164: write d1f/f21 [489034,89169] 0
2026-03-10T12:37:43.520 INFO:tasks.workunit.client.1.vm07.stdout:6/288: chown d1/d4/c1c 11190 1
2026-03-10T12:37:43.522 INFO:tasks.workunit.client.0.vm00.stdout:8/157: dwrite d0/d12/f2a [0,4194304] 0
2026-03-10T12:37:43.522 INFO:tasks.workunit.client.0.vm00.stdout:2/169: truncate d4/dd/f3c 353595 0
2026-03-10T12:37:43.526 INFO:tasks.workunit.client.0.vm00.stdout:5/165: creat d1f/d26/d2e/f3a x:0 0 0
2026-03-10T12:37:43.536 INFO:tasks.workunit.client.1.vm07.stdout:7/316: mkdir d0/d57/d62 0
2026-03-10T12:37:43.537 INFO:tasks.workunit.client.0.vm00.stdout:5/166: dread d1f/f21 [0,4194304] 0
2026-03-10T12:37:43.537 INFO:tasks.workunit.client.0.vm00.stdout:5/167: chown c7 36499 1
2026-03-10T12:37:43.537 INFO:tasks.workunit.client.0.vm00.stdout:5/168: dread - d1f/d26/f28 zero size
2026-03-10T12:37:43.537 INFO:tasks.workunit.client.0.vm00.stdout:8/158: unlink d0/d12/f1a 0
2026-03-10T12:37:43.537 INFO:tasks.workunit.client.0.vm00.stdout:8/159: dwrite d0/d12/d17/f1d [0,4194304] 0
2026-03-10T12:37:43.539 INFO:tasks.workunit.client.1.vm07.stdout:6/289: mknod d1/d4/d6/d46/c5d 0
2026-03-10T12:37:43.540 INFO:tasks.workunit.client.1.vm07.stdout:6/290: write d1/d4/d6/f13 [1016393,64299] 0
2026-03-10T12:37:43.547 INFO:tasks.workunit.client.0.vm00.stdout:8/160: mknod d0/c30 0
2026-03-10T12:37:43.550 INFO:tasks.workunit.client.0.vm00.stdout:8/161: link d0/dd/le d0/d12/d17/l31 0
2026-03-10T12:37:43.550 INFO:tasks.workunit.client.1.vm07.stdout:7/317: fdatasync d0/f2f 0
2026-03-10T12:37:43.551 INFO:tasks.workunit.client.0.vm00.stdout:8/162: mkdir d0/d12/d17/d32 0
2026-03-10T12:37:43.551 INFO:tasks.workunit.client.1.vm07.stdout:7/318: read - d0/f4e zero size
2026-03-10T12:37:43.552 INFO:tasks.workunit.client.0.vm00.stdout:8/163: creat d0/d12/d2d/f33 x:0 0 0
2026-03-10T12:37:43.554 INFO:tasks.workunit.client.0.vm00.stdout:8/164: dread d0/d12/d17/f1d [0,4194304] 0
2026-03-10T12:37:43.579 INFO:tasks.workunit.client.1.vm07.stdout:7/319: rename d0/l2c to d0/d61/l63 0
2026-03-10T12:37:43.583 INFO:tasks.workunit.client.1.vm07.stdout:6/291: truncate d1/d4/f19 1877580 0
2026-03-10T12:37:43.587 INFO:tasks.workunit.client.1.vm07.stdout:6/292: dread d1/d4/d6/d46/d4d/fb [0,4194304] 0
2026-03-10T12:37:43.588 INFO:tasks.workunit.client.1.vm07.stdout:7/320: dread - d0/f42 zero size
2026-03-10T12:37:43.597 INFO:tasks.workunit.client.1.vm07.stdout:7/321: creat d0/d61/f64 x:0 0 0
2026-03-10T12:37:43.607 INFO:tasks.workunit.client.1.vm07.stdout:6/293: creat d1/d4/d6/d53/f5e x:0 0 0
2026-03-10T12:37:43.607 INFO:tasks.workunit.client.1.vm07.stdout:6/294: read - d1/d4/d6/d4e/f51 zero size
2026-03-10T12:37:43.607 INFO:tasks.workunit.client.1.vm07.stdout:6/295: dread d1/d4/f11 [0,4194304] 0
2026-03-10T12:37:43.607 INFO:tasks.workunit.client.1.vm07.stdout:6/296: fsync d1/f26 0
2026-03-10T12:37:43.607 INFO:tasks.workunit.client.1.vm07.stdout:7/322: rmdir d0/d52 39
2026-03-10T12:37:43.615 INFO:tasks.workunit.client.0.vm00.stdout:8/165: sync
2026-03-10T12:37:43.617 INFO:tasks.workunit.client.0.vm00.stdout:8/166: readlink d0/l2 0
2026-03-10T12:37:43.617 INFO:tasks.workunit.client.0.vm00.stdout:8/167: chown d0/f10 28 1
2026-03-10T12:37:43.617 INFO:tasks.workunit.client.0.vm00.stdout:8/168: stat d0/l29 0
2026-03-10T12:37:43.627 INFO:tasks.workunit.client.0.vm00.stdout:8/169: dread d0/f22 [0,4194304] 0
2026-03-10T12:37:43.627 INFO:tasks.workunit.client.0.vm00.stdout:8/170: write d0/f10 [3698770,128519] 0
2026-03-10T12:37:43.631 INFO:tasks.workunit.client.0.vm00.stdout:8/171: dwrite d0/f10 [0,4194304] 0
2026-03-10T12:37:43.634 INFO:tasks.workunit.client.0.vm00.stdout:8/172: dwrite d0/dd/f2b [0,4194304] 0
2026-03-10T12:37:43.635 INFO:tasks.workunit.client.0.vm00.stdout:4/156: truncate df/f11 1433995 0
2026-03-10T12:37:43.635 INFO:tasks.workunit.client.0.vm00.stdout:8/173: chown d0/c2f 127373 1
2026-03-10T12:37:43.642 INFO:tasks.workunit.client.0.vm00.stdout:8/174: dwrite d0/d12/d17/f2e [0,4194304] 0
2026-03-10T12:37:43.650 INFO:tasks.workunit.client.0.vm00.stdout:8/175: link d0/dd/f20 d0/d12/f34 0
2026-03-10T12:37:43.651 INFO:tasks.workunit.client.0.vm00.stdout:8/176: mknod d0/d12/c35 0
2026-03-10T12:37:43.719 INFO:tasks.workunit.client.0.vm00.stdout:2/170: truncate d4/f28 257847 0
2026-03-10T12:37:43.720 INFO:tasks.workunit.client.0.vm00.stdout:5/169: fsync f12 0
2026-03-10T12:37:43.720 INFO:tasks.workunit.client.0.vm00.stdout:5/170: fdatasync d1f/d26/d2e/f3a 0
2026-03-10T12:37:43.721 INFO:tasks.workunit.client.0.vm00.stdout:2/171: creat d4/d6/d2d/f3d x:0 0 0
2026-03-10T12:37:43.722 INFO:tasks.workunit.client.0.vm00.stdout:5/171: rmdir d1f/d26/d2b 39
2026-03-10T12:37:43.722 INFO:tasks.workunit.client.0.vm00.stdout:2/172: chown d4/c5 180895 1
2026-03-10T12:37:43.726 INFO:tasks.workunit.client.0.vm00.stdout:5/172: dwrite d1f/d26/d2b/d37/f38 [0,4194304] 0
2026-03-10T12:37:43.731 INFO:tasks.workunit.client.0.vm00.stdout:5/173: symlink d1f/d26/d2e/l3b 0
2026-03-10T12:37:43.750 INFO:tasks.workunit.client.0.vm00.stdout:4/157: dread fa [0,4194304] 0
2026-03-10T12:37:43.751 INFO:tasks.workunit.client.0.vm00.stdout:4/158: mkdir df/d32 0
2026-03-10T12:37:43.752 INFO:tasks.workunit.client.0.vm00.stdout:4/159: dread - df/d1f/d22/d26/f31 zero size
2026-03-10T12:37:43.752 INFO:tasks.workunit.client.0.vm00.stdout:4/160: fsync df/d24/f2f 0
2026-03-10T12:37:43.757 INFO:tasks.workunit.client.1.vm07.stdout:8/371: getdents d1/d3/d6 0
2026-03-10T12:37:43.758 INFO:tasks.workunit.client.0.vm00.stdout:4/161: link df/f19 df/d24/f33 0
2026-03-10T12:37:43.758 INFO:tasks.workunit.client.1.vm07.stdout:9/308: truncate d5/d13/d22/f5f 441179 0
2026-03-10T12:37:43.760 INFO:tasks.workunit.client.0.vm00.stdout:4/162: unlink df/d1f/d22/f2d 0
2026-03-10T12:37:43.761 INFO:tasks.workunit.client.0.vm00.stdout:4/163: unlink c6 0
2026-03-10T12:37:43.761 INFO:tasks.workunit.client.0.vm00.stdout:4/164: dread - df/d24/f2f zero size
2026-03-10T12:37:43.762 INFO:tasks.workunit.client.0.vm00.stdout:4/165: chown df/d1f/d22/d26/f31 60 1
2026-03-10T12:37:43.762 INFO:tasks.workunit.client.1.vm07.stdout:9/309: creat d5/d1f/d31/d64/f70 x:0 0 0
2026-03-10T12:37:43.767 INFO:tasks.workunit.client.0.vm00.stdout:4/166: symlink df/d24/l34 0
2026-03-10T12:37:43.767 INFO:tasks.workunit.client.1.vm07.stdout:3/374: getdents dc/dd/d1f/d6f 0
2026-03-10T12:37:43.767 INFO:tasks.workunit.client.1.vm07.stdout:3/375: chown dc/dd/f1d 0 1
2026-03-10T12:37:43.768 INFO:tasks.workunit.client.0.vm00.stdout:4/167: read df/d24/f33 [2820113,12459] 0
2026-03-10T12:37:43.769 INFO:tasks.workunit.client.0.vm00.stdout:4/168: fdatasync fa 0
2026-03-10T12:37:43.770 INFO:tasks.workunit.client.1.vm07.stdout:8/372: creat d1/f79 x:0 0 0
2026-03-10T12:37:43.770 INFO:tasks.workunit.client.1.vm07.stdout:0/405: dwrite d0/d14/d5f/d76/d2f/d31/d4f/d60/f75 [0,4194304] 0
2026-03-10T12:37:43.772 INFO:tasks.workunit.client.0.vm00.stdout:4/169: dread df/f1b [0,4194304] 0
2026-03-10T12:37:43.776 INFO:tasks.workunit.client.0.vm00.stdout:4/170: dwrite df/d24/f33 [0,4194304] 0
2026-03-10T12:37:43.785 INFO:tasks.workunit.client.0.vm00.stdout:4/171: truncate df/d1f/d22/f30 693140 0
2026-03-10T12:37:43.785 INFO:tasks.workunit.client.0.vm00.stdout:4/172: symlink df/d1f/l35 0
2026-03-10T12:37:43.821 INFO:tasks.workunit.client.0.vm00.stdout:4/173: sync
2026-03-10T12:37:43.822 INFO:tasks.workunit.client.0.vm00.stdout:0/252: rename d3/d1b to d3/d7/d4c/d5b 0
2026-03-10T12:37:43.826 INFO:tasks.workunit.client.0.vm00.stdout:3/185: rename dd/d27/l3b to dd/d18/d14/l40 0
2026-03-10T12:37:43.827 INFO:tasks.workunit.client.1.vm07.stdout:5/363: truncate d0/fa 937181 0
2026-03-10T12:37:43.828 INFO:tasks.workunit.client.1.vm07.stdout:5/364: fdatasync d0/d22/f50 0
2026-03-10T12:37:43.828 INFO:tasks.workunit.client.0.vm00.stdout:4/174: mkdir df/d1f/d36 0
2026-03-10T12:37:43.828 INFO:tasks.workunit.client.1.vm07.stdout:8/373: dread d1/f19 [0,4194304] 0
2026-03-10T12:37:43.831 INFO:tasks.workunit.client.0.vm00.stdout:1/188: rename da/d12/l15 to da/d21/d27/l41 0
2026-03-10T12:37:43.843 INFO:tasks.workunit.client.1.vm07.stdout:4/439: write d0/d4/d5/d34/f5d [527119,10664] 0
2026-03-10T12:37:43.843 INFO:tasks.workunit.client.0.vm00.stdout:3/186: readlink dd/d18/d13/d1d/l37 0
2026-03-10T12:37:43.843 INFO:tasks.workunit.client.0.vm00.stdout:1/189: mkdir da/d12/d26/d42 0
2026-03-10T12:37:43.843 INFO:tasks.workunit.client.0.vm00.stdout:4/175: mknod df/d1f/d22/d26/d2e/c37 0
2026-03-10T12:37:43.843 INFO:tasks.workunit.client.0.vm00.stdout:9/212: rename d0/d5/d16/d1e/c46 to d0/d5/d16/d19/d2f/d35/c4c 0
2026-03-10T12:37:43.843 INFO:tasks.workunit.client.0.vm00.stdout:4/176: fdatasync df/f1b 0
2026-03-10T12:37:43.843 INFO:tasks.workunit.client.0.vm00.stdout:4/177: dread df/d1f/d22/f30 [0,4194304] 0
2026-03-10T12:37:43.843 INFO:tasks.workunit.client.0.vm00.stdout:1/190: rename da/ce to da/d12/d26/c43 0
2026-03-10T12:37:43.845 INFO:tasks.workunit.client.0.vm00.stdout:9/213: mknod d0/d3d/d43/c4d 0
2026-03-10T12:37:43.847 INFO:tasks.workunit.client.0.vm00.stdout:3/187: sync
2026-03-10T12:37:43.848 INFO:tasks.workunit.client.0.vm00.stdout:3/188: fsync fb 0
2026-03-10T12:37:43.852 INFO:tasks.workunit.client.0.vm00.stdout:9/214: mkdir d0/d5/d16/d19/d2f/d35/d4e 0
2026-03-10T12:37:43.853 INFO:tasks.workunit.client.0.vm00.stdout:4/178: creat df/d1f/d22/d26/f38 x:0 0 0
2026-03-10T12:37:43.854 INFO:tasks.workunit.client.0.vm00.stdout:3/189: mknod dd/d18/d13/d1d/c41 0
2026-03-10T12:37:43.855 INFO:tasks.workunit.client.0.vm00.stdout:1/191: truncate f5 2890076 0
2026-03-10T12:37:43.855 INFO:tasks.workunit.client.0.vm00.stdout:1/192: write da/d12/f1d [4217513,54910] 0
2026-03-10T12:37:43.856 INFO:tasks.workunit.client.0.vm00.stdout:1/193: readlink da/d24/l36 0
2026-03-10T12:37:43.857 INFO:tasks.workunit.client.1.vm07.stdout:0/406: mknod d0/d14/d5f/d41/d6a/d74/c81 0
2026-03-10T12:37:43.857 INFO:tasks.workunit.client.0.vm00.stdout:9/215: sync
2026-03-10T12:37:43.859 INFO:tasks.workunit.client.0.vm00.stdout:3/190: creat dd/d18/d13/d1d/f42 x:0 0 0
2026-03-10T12:37:43.862 INFO:tasks.workunit.client.0.vm00.stdout:1/194: mkdir da/d24/d28/d44 0
2026-03-10T12:37:43.862 INFO:tasks.workunit.client.0.vm00.stdout:4/179: creat df/d1f/d22/d26/f39 x:0 0 0
2026-03-10T12:37:43.862 INFO:tasks.workunit.client.0.vm00.stdout:1/195: chown da/d21 3 1
2026-03-10T12:37:43.863 INFO:tasks.workunit.client.1.vm07.stdout:5/365: mknod d0/d22/d18/d19/d2e/d3f/c7e 0
2026-03-10T12:37:43.864 INFO:tasks.workunit.client.0.vm00.stdout:1/196: creat da/d24/f45 x:0 0 0
2026-03-10T12:37:43.864 INFO:tasks.workunit.client.1.vm07.stdout:5/366: stat d0/d22/d18/d19/d2e/f59 0
2026-03-10T12:37:43.865 INFO:tasks.workunit.client.0.vm00.stdout:9/216: link d0/d5/d16/f34 d0/d5/d16/d19/d2f/f4f 0
2026-03-10T12:37:43.865 INFO:tasks.workunit.client.1.vm07.stdout:8/374: read d1/d3/d11/f43 [426943,76658] 0
2026-03-10T12:37:43.865 INFO:tasks.workunit.client.1.vm07.stdout:5/367: stat d0/d22/d18/d3e/d5d/f6d 0
2026-03-10T12:37:43.865 INFO:tasks.workunit.client.0.vm00.stdout:9/217: chown d0/l2c 11761774 1
2026-03-10T12:37:43.865 INFO:tasks.workunit.client.0.vm00.stdout:4/180: mkdir df/d1f/d36/d3a 0
2026-03-10T12:37:43.866 INFO:tasks.workunit.client.0.vm00.stdout:9/218: fsync d0/d5/d16/d19/d2f/d35/f45 0
2026-03-10T12:37:43.867 INFO:tasks.workunit.client.1.vm07.stdout:9/310: link d5/d13/d22/l3f d5/d16/d18/l71 0
2026-03-10T12:37:43.867 INFO:tasks.workunit.client.0.vm00.stdout:9/219: mkdir d0/d5/d16/d19/d50 0
2026-03-10T12:37:43.873 INFO:tasks.workunit.client.1.vm07.stdout:8/375: write d1/f48 [4588651,58071] 0
2026-03-10T12:37:43.874 INFO:tasks.workunit.client.1.vm07.stdout:4/440: creat d0/d4/d10/d8d/f97 x:0 0 0
2026-03-10T12:37:43.875 INFO:tasks.workunit.client.1.vm07.stdout:3/376: getdents dc/dd/d43 0
2026-03-10T12:37:43.882 INFO:tasks.workunit.client.0.vm00.stdout:1/197: sync
2026-03-10T12:37:43.882 INFO:tasks.workunit.client.0.vm00.stdout:9/220: sync
2026-03-10T12:37:43.883 INFO:tasks.workunit.client.0.vm00.stdout:9/221: read d0/d5/d16/f39 [2220306,109868] 0
2026-03-10T12:37:43.883 INFO:tasks.workunit.client.0.vm00.stdout:9/222: dread - d0/d5/d16/d19/d2f/d35/f45 zero size
2026-03-10T12:37:43.888 INFO:tasks.workunit.client.0.vm00.stdout:9/223: read d0/d5/d16/d1e/d2b/f47 [3705853,49688] 0
2026-03-10T12:37:43.890 INFO:tasks.workunit.client.0.vm00.stdout:9/224: link d0/d5/d16/d19/c22 d0/d5/d16/c51 0
2026-03-10T12:37:43.891 INFO:tasks.workunit.client.0.vm00.stdout:9/225: creat d0/d5/d16/d1e/d27/f52 x:0 0 0
2026-03-10T12:37:43.899 INFO:tasks.workunit.client.1.vm07.stdout:0/407: mknod d0/d14/d7c/c82 0
2026-03-10T12:37:43.904 INFO:tasks.workunit.client.1.vm07.stdout:0/408: dwrite d0/d14/d5f/d76/d2f/d31/d79/f7b [0,4194304] 0
2026-03-10T12:37:43.904 INFO:tasks.workunit.client.0.vm00.stdout:8/177: truncate d0/f10 4311400 0
2026-03-10T12:37:43.905 INFO:tasks.workunit.client.0.vm00.stdout:8/178: chown d0/d12/d17/f2e 57524078 1
2026-03-10T12:37:43.905 INFO:tasks.workunit.client.1.vm07.stdout:0/409: read d0/d14/d5f/d76/d2f/d31/f6f [173877,68102] 0
2026-03-10T12:37:43.905 INFO:tasks.workunit.client.0.vm00.stdout:8/179: write d0/f7 [3563640,87826] 0
2026-03-10T12:37:43.906 INFO:tasks.workunit.client.0.vm00.stdout:8/180: chown d0/d12/d2d/f33 79463492 1
2026-03-10T12:37:43.909 INFO:tasks.workunit.client.0.vm00.stdout:8/181: rmdir d0/dd 39
2026-03-10T12:37:43.910 INFO:tasks.workunit.client.0.vm00.stdout:8/182: write d0/f7 [2488602,44805] 0
2026-03-10T12:37:43.910 INFO:tasks.workunit.client.0.vm00.stdout:8/183: read d0/f9 [5051221,82963] 0
2026-03-10T12:37:43.911 INFO:tasks.workunit.client.1.vm07.stdout:0/410: dwrite d0/d14/d5f/d76/d2f/d31/d4f/f70 [0,4194304] 0
2026-03-10T12:37:43.911 INFO:tasks.workunit.client.0.vm00.stdout:8/184: read d0/d12/d17/f2e [785951,8215] 0
2026-03-10T12:37:43.911 INFO:tasks.workunit.client.0.vm00.stdout:8/185: stat d0/d12 0
2026-03-10T12:37:43.916 INFO:tasks.workunit.client.1.vm07.stdout:3/377: mknod dc/dd/d1f/d6f/c87 0
2026-03-10T12:37:43.918 INFO:tasks.workunit.client.0.vm00.stdout:8/186: rename d0/d12/d17/d32 to d0/d12/d36 0
2026-03-10T12:37:43.920 INFO:tasks.workunit.client.0.vm00.stdout:6/238: creat d2/d51/f5c x:0 0 0
2026-03-10T12:37:43.920 INFO:tasks.workunit.client.0.vm00.stdout:6/239: readlink d2/l7 0
2026-03-10T12:37:43.921 INFO:tasks.workunit.client.0.vm00.stdout:8/187: mknod d0/dd/c37 0
2026-03-10T12:37:43.923 INFO:tasks.workunit.client.0.vm00.stdout:6/240: dwrite d2/f30 [0,4194304] 0
2026-03-10T12:37:43.925 INFO:tasks.workunit.client.0.vm00.stdout:6/241: fdatasync d2/d16/f1e 0
2026-03-10T12:37:43.932 INFO:tasks.workunit.client.0.vm00.stdout:2/173: truncate d4/dd/f3c 94672 0
2026-03-10T12:37:43.933 INFO:tasks.workunit.client.0.vm00.stdout:2/174: write d4/d6/f16 [8064298,48243] 0
2026-03-10T12:37:43.934 INFO:tasks.workunit.client.0.vm00.stdout:2/175: write d4/d6/f34 [699436,45263] 0
2026-03-10T12:37:43.935 INFO:tasks.workunit.client.1.vm07.stdout:7/323: write d0/fc [458827,106228] 0
2026-03-10T12:37:43.936 INFO:tasks.workunit.client.0.vm00.stdout:5/174: write d1f/f22 [1947334,44726] 0
2026-03-10T12:37:43.941 INFO:tasks.workunit.client.0.vm00.stdout:2/176: write d4/d6/f2b [1036390,80733] 0
2026-03-10T12:37:43.942 INFO:tasks.workunit.client.0.vm00.stdout:5/175: creat d1f/d26/d2e/f3c x:0 0 0
2026-03-10T12:37:43.943 INFO:tasks.workunit.client.0.vm00.stdout:5/176: read - d1f/d26/d2e/f3a zero size
2026-03-10T12:37:43.944 INFO:tasks.workunit.client.0.vm00.stdout:5/177: symlink d1f/d26/l3d 0
2026-03-10T12:37:43.946 INFO:tasks.workunit.client.1.vm07.stdout:2/250: write d0/f1d [717164,5344] 0
2026-03-10T12:37:43.946 INFO:tasks.workunit.client.0.vm00.stdout:5/178: dwrite f19 [0,4194304] 0
2026-03-10T12:37:43.950 INFO:tasks.workunit.client.0.vm00.stdout:5/179: dwrite d1f/d26/f28 [0,4194304] 0
2026-03-10T12:37:43.954 INFO:tasks.workunit.client.0.vm00.stdout:5/180: fsync d1f/f32 0
2026-03-10T12:37:43.961 INFO:tasks.workunit.client.0.vm00.stdout:5/181: mknod d1f/c3e 0
2026-03-10T12:37:43.967 INFO:tasks.workunit.client.0.vm00.stdout:5/182: dread f12 [0,4194304] 0
2026-03-10T12:37:43.972 INFO:tasks.workunit.client.0.vm00.stdout:5/183: symlink d1f/d26/d2b/l3f 0
2026-03-10T12:37:43.972 INFO:tasks.workunit.client.0.vm00.stdout:5/184: symlink d1f/d39/l40 0
2026-03-10T12:37:43.973 INFO:tasks.workunit.client.0.vm00.stdout:5/185: dread d1f/d26/d2b/d37/f38 [0,4194304] 0
2026-03-10T12:37:43.973 INFO:tasks.workunit.client.0.vm00.stdout:5/186: stat d1f/d26/d2e 0
2026-03-10T12:37:43.976 INFO:tasks.workunit.client.0.vm00.stdout:2/177: dread d4/d6/f34 [0,4194304] 0
2026-03-10T12:37:43.976 INFO:tasks.workunit.client.0.vm00.stdout:2/178: dread - d4/f39 zero size
2026-03-10T12:37:43.979 INFO:tasks.workunit.client.0.vm00.stdout:0/253: write d3/d7/d4c/d5b/f56 [616320,79692] 0
2026-03-10T12:37:43.980 INFO:tasks.workunit.client.0.vm00.stdout:0/254: readlink d3/d7/d4c/d5b/l1e 0
2026-03-10T12:37:43.981 INFO:tasks.workunit.client.0.vm00.stdout:2/179: getdents d4/d6/d2d/d31/d32/d3b 0
2026-03-10T12:37:43.982 INFO:tasks.workunit.client.0.vm00.stdout:2/180: read d4/d6/f30 [3812777,27836] 0
2026-03-10T12:37:43.985 INFO:tasks.workunit.client.0.vm00.stdout:2/181: dread d4/d6/f30 [0,4194304] 0
2026-03-10T12:37:43.988 INFO:tasks.workunit.client.0.vm00.stdout:0/255: read d3/db/d24/d25/f3f [1284808,113066] 0
2026-03-10T12:37:43.989 INFO:tasks.workunit.client.1.vm07.stdout:5/368: dwrite d0/d22/d18/d19/d36/f3d [0,4194304] 0
2026-03-10T12:37:43.996 INFO:tasks.workunit.client.0.vm00.stdout:2/182: creat d4/dd/f3e x:0 0 0
2026-03-10T12:37:43.997 INFO:tasks.workunit.client.0.vm00.stdout:2/183: fsync d4/d6/f2e 0
2026-03-10T12:37:43.997 INFO:tasks.workunit.client.0.vm00.stdout:2/184: write d4/f1d [8111887,59099] 0
2026-03-10T12:37:43.998 INFO:tasks.workunit.client.0.vm00.stdout:2/185: write d4/d6/f16 [1689197,1719] 0
2026-03-10T12:37:43.999 INFO:tasks.workunit.client.0.vm00.stdout:2/186: chown d4/l9 1339 1
2026-03-10T12:37:44.005 INFO:tasks.workunit.client.0.vm00.stdout:2/187: rmdir d4/d6/d2d/d31/d32/d3b 0
2026-03-10T12:37:44.006 INFO:tasks.workunit.client.0.vm00.stdout:2/188: truncate d4/d6/f34 1241374 0
2026-03-10T12:37:44.007 INFO:tasks.workunit.client.0.vm00.stdout:2/189: creat d4/dd/d38/f3f x:0 0 0
2026-03-10T12:37:44.008 INFO:tasks.workunit.client.0.vm00.stdout:2/190: fdatasync d4/d6/f22 0
2026-03-10T12:37:44.022 INFO:tasks.workunit.client.1.vm07.stdout:6/297: dwrite d1/d4/d6/d16/d1a/d2c/f59 [0,4194304] 0
2026-03-10T12:37:44.024 INFO:tasks.workunit.client.0.vm00.stdout:0/256: read - d3/f50 zero size
2026-03-10T12:37:44.026 INFO:tasks.workunit.client.1.vm07.stdout:6/298: dwrite d1/d4/d44/f45 [0,4194304] 0
2026-03-10T12:37:44.028 INFO:tasks.workunit.client.1.vm07.stdout:6/299: write d1/d4/d6/d4e/f51 [342606,95486] 0
2026-03-10T12:37:44.041 INFO:tasks.workunit.client.1.vm07.stdout:4/441: rename d0/l8a to d0/d4/d5/d34/l98 0
2026-03-10T12:37:44.054 INFO:tasks.workunit.client.0.vm00.stdout:3/191: rmdir dd/d18/d14 39
2026-03-10T12:37:44.054 INFO:tasks.workunit.client.0.vm00.stdout:3/192: readlink dd/d18/d13/d1d/l24 0
2026-03-10T12:37:44.055 INFO:tasks.workunit.client.0.vm00.stdout:3/193: fdatasync f7 0
2026-03-10T12:37:44.055 INFO:tasks.workunit.client.0.vm00.stdout:3/194: chown dd/d18/d13/f22 8 1
2026-03-10T12:37:44.058 INFO:tasks.workunit.client.0.vm00.stdout:0/257: fsync d3/d7/f11 0
2026-03-10T12:37:44.059 INFO:tasks.workunit.client.0.vm00.stdout:3/195: dwrite dd/d18/d13/d1d/f42 [0,4194304] 0
2026-03-10T12:37:44.064 INFO:tasks.workunit.client.0.vm00.stdout:3/196: mkdir dd/d18/d13/d1d/d43 0
2026-03-10T12:37:44.065 INFO:tasks.workunit.client.0.vm00.stdout:3/197: creat dd/d27/f44 x:0 0 0
2026-03-10T12:37:44.066 INFO:tasks.workunit.client.0.vm00.stdout:3/198: mkdir dd/d27/d2c/d34/d45 0
2026-03-10T12:37:44.076 INFO:tasks.workunit.client.0.vm00.stdout:6/242: write d2/f9 [4862061,120568] 0
2026-03-10T12:37:44.077 INFO:tasks.workunit.client.0.vm00.stdout:6/243: write d2/da/dc/d2f/f56 [881528,105610] 0
2026-03-10T12:37:44.077 INFO:tasks.workunit.client.0.vm00.stdout:6/244: write d2/da/dc/f27 [940778,111016] 0
2026-03-10T12:37:44.080 INFO:tasks.workunit.client.1.vm07.stdout:1/334: dread d9/df/d54/f57 [0,4194304] 0
2026-03-10T12:37:44.096 INFO:tasks.workunit.client.0.vm00.stdout:4/181: rmdir df/d1f/d22/d26 39
2026-03-10T12:37:44.099 INFO:tasks.workunit.client.0.vm00.stdout:1/198: write da/d24/f32 [1912731,41714] 0
2026-03-10T12:37:44.110 INFO:tasks.workunit.client.1.vm07.stdout:5/369: mknod d0/d22/d18/d19/d21/d3a/c7f 0
2026-03-10T12:37:44.110 INFO:tasks.workunit.client.0.vm00.stdout:0/258: symlink d3/d7/d4c/d5b/d38/d44/d5a/l5c 0
2026-03-10T12:37:44.111 INFO:tasks.workunit.client.0.vm00.stdout:1/199: dwrite da/d24/f45 [0,4194304] 0
2026-03-10T12:37:44.111 INFO:tasks.workunit.client.0.vm00.stdout:1/200: mknod da/d12/d26/d42/c46 0
2026-03-10T12:37:44.111 INFO:tasks.workunit.client.0.vm00.stdout:1/201: creat da/d24/f47 x:0 0 0
2026-03-10T12:37:44.111 INFO:tasks.workunit.client.0.vm00.stdout:1/202: chown da/f22 75 1
2026-03-10T12:37:44.111 INFO:tasks.workunit.client.0.vm00.stdout:3/199: sync
2026-03-10T12:37:44.112 INFO:tasks.workunit.client.0.vm00.stdout:4/182: dread df/f16 [0,4194304] 0
2026-03-10T12:37:44.113 INFO:tasks.workunit.client.0.vm00.stdout:2/191: truncate d4/dd/f3c 869442 0
2026-03-10T12:37:44.114 INFO:tasks.workunit.client.1.vm07.stdout:0/411: truncate d0/d14/d5f/d76/f78 1713248 0
2026-03-10T12:37:44.116 INFO:tasks.workunit.client.0.vm00.stdout:9/226: write d0/d5/d16/f30 [706074,128123] 0
2026-03-10T12:37:44.118 INFO:tasks.workunit.client.0.vm00.stdout:6/245: dread d2/da/dc/d2f/f4f [0,4194304] 0
2026-03-10T12:37:44.123 INFO:tasks.workunit.client.0.vm00.stdout:1/203: symlink da/d21/d27/l48 0
2026-03-10T12:37:44.124 INFO:tasks.workunit.client.0.vm00.stdout:1/204: dread - da/d24/f47 zero size
2026-03-10T12:37:44.126 INFO:tasks.workunit.client.0.vm00.stdout:1/205: dread da/f13 [0,4194304] 0
2026-03-10T12:37:44.126 INFO:tasks.workunit.client.0.vm00.stdout:1/206: write da/d24/f32 [1167825,24458] 0
2026-03-10T12:37:44.129 INFO:tasks.workunit.client.0.vm00.stdout:2/192: mkdir d4/d6/d2d/d31/d32/d40 0
2026-03-10T12:37:44.130 INFO:tasks.workunit.client.0.vm00.stdout:8/188: mkdir d0/dd/d38 0
2026-03-10T12:37:44.131 INFO:tasks.workunit.client.0.vm00.stdout:9/227: mkdir d0/d3d/d43/d53 0
2026-03-10T12:37:44.132 INFO:tasks.workunit.client.0.vm00.stdout:3/200: mknod dd/d18/d14/c46 0
2026-03-10T12:37:44.135 INFO:tasks.workunit.client.0.vm00.stdout:1/207: mknod da/d12/d26/d42/c49 0
2026-03-10T12:37:44.139 INFO:tasks.workunit.client.0.vm00.stdout:2/193: mkdir d4/d6/d41 0
2026-03-10T12:37:44.142 INFO:tasks.workunit.client.0.vm00.stdout:9/228: creat d0/d3d/d43/f54 x:0 0 0
2026-03-10T12:37:44.145 INFO:tasks.workunit.client.0.vm00.stdout:9/229: dwrite d0/d5/d16/d19/f32 [0,4194304] 0
2026-03-10T12:37:44.146 INFO:tasks.workunit.client.0.vm00.stdout:9/230: chown d0/d5/d16/d19/d2f/f3f 201163612 1
2026-03-10T12:37:44.147 INFO:tasks.workunit.client.0.vm00.stdout:9/231: write d0/d5/d16/d1e/d2b/f42 [2819858,9803] 0
2026-03-10T12:37:44.148 INFO:tasks.workunit.client.1.vm07.stdout:6/300: creat d1/d4/d6/d16/f5f x:0 0 0
2026-03-10T12:37:44.148 INFO:tasks.workunit.client.1.vm07.stdout:6/301: stat d1/d4/d6/d16/d1a/d2c/f59 0
2026-03-10T12:37:44.152 INFO:tasks.workunit.client.0.vm00.stdout:1/208: mkdir da/d24/d4a 0
2026-03-10T12:37:44.155 INFO:tasks.workunit.client.0.vm00.stdout:1/209: dwrite da/f14 [0,4194304] 0
2026-03-10T12:37:44.155 INFO:tasks.workunit.client.0.vm00.stdout:1/210: stat da/d24/d28/c29 0
2026-03-10T12:37:44.156 INFO:tasks.workunit.client.0.vm00.stdout:1/211: write da/d24/d28/f37 [648535,76399] 0
2026-03-10T12:37:44.160 INFO:tasks.workunit.client.0.vm00.stdout:2/194: write d4/d6/f34 [2222852,54487] 0
2026-03-10T12:37:44.165 INFO:tasks.workunit.client.0.vm00.stdout:5/187: truncate d1f/f27 3262003 0
2026-03-10T12:37:44.165 INFO:tasks.workunit.client.0.vm00.stdout:5/188: chown c10 1384198 1
2026-03-10T12:37:44.166 INFO:tasks.workunit.client.1.vm07.stdout:9/311: getdents d5/d16/d18 0
2026-03-10T12:37:44.166 INFO:tasks.workunit.client.0.vm00.stdout:9/232: symlink d0/d5/d16/d19/d2f/l55 0
2026-03-10T12:37:44.170 INFO:tasks.workunit.client.0.vm00.stdout:1/212: dread f5 [0,4194304] 0
2026-03-10T12:37:44.171 INFO:tasks.workunit.client.0.vm00.stdout:2/195: mknod d4/d6/d2d/c42 0
2026-03-10T12:37:44.176 INFO:tasks.workunit.client.0.vm00.stdout:1/213: symlink da/d12/l4b 0
2026-03-10T12:37:44.177 INFO:tasks.workunit.client.1.vm07.stdout:7/324: truncate d0/f13 1103822 0
2026-03-10T12:37:44.178 INFO:tasks.workunit.client.0.vm00.stdout:5/189: creat d1f/d26/d2b/d35/f41 x:0 0 0
2026-03-10T12:37:44.178 INFO:tasks.workunit.client.1.vm07.stdout:7/325: chown d0/f2f 428681 1
2026-03-10T12:37:44.179 INFO:tasks.workunit.client.0.vm00.stdout:1/214: symlink da/l4c 0
2026-03-10T12:37:44.180 INFO:tasks.workunit.client.0.vm00.stdout:5/190: creat d1f/d26/d2b/d35/f42 x:0 0 0
2026-03-10T12:37:44.181 INFO:tasks.workunit.client.0.vm00.stdout:5/191: write d1f/f22 [4430161,52457] 0
2026-03-10T12:37:44.183 INFO:tasks.workunit.client.0.vm00.stdout:1/215: mkdir da/d4d 0
2026-03-10T12:37:44.184 INFO:tasks.workunit.client.1.vm07.stdout:8/376: truncate d1/d3/d18/f32 2298963 0
2026-03-10T12:37:44.188 INFO:tasks.workunit.client.0.vm00.stdout:5/192: truncate f16 651402 0
2026-03-10T12:37:44.198 INFO:tasks.workunit.client.0.vm00.stdout:5/193: chown d1f/d26/d2b/d37/f38 5947 1
2026-03-10T12:37:44.198 INFO:tasks.workunit.client.0.vm00.stdout:5/194: mkdir d1f/d26/d2e/d43 0
2026-03-10T12:37:44.204 INFO:tasks.workunit.client.1.vm07.stdout:5/370: rmdir d0/d22/d18/d19/d21 39
2026-03-10T12:37:44.210 INFO:tasks.workunit.client.0.vm00.stdout:3/201: dread f7 [4194304,4194304] 0
2026-03-10T12:37:44.211 INFO:tasks.workunit.client.1.vm07.stdout:3/378: link dc/d18/f36 dc/dd/d28/d7a/f88 0
2026-03-10T12:37:44.212 INFO:tasks.workunit.client.0.vm00.stdout:3/202: creat dd/d27/d2c/d34/d45/f47 x:0 0 0
2026-03-10T12:37:44.213 INFO:tasks.workunit.client.0.vm00.stdout:3/203: rmdir dd/d3d 39
2026-03-10T12:37:44.215 INFO:tasks.workunit.client.1.vm07.stdout:0/412: dread d0/f15 [0,4194304] 0
2026-03-10T12:37:44.221 INFO:tasks.workunit.client.0.vm00.stdout:6/246: dread d2/d16/f2a [0,4194304] 0
2026-03-10T12:37:44.227 INFO:tasks.workunit.client.1.vm07.stdout:9/312: dread d5/d16/d23/d26/f42 [0,4194304] 0
2026-03-10T12:37:44.227 INFO:tasks.workunit.client.0.vm00.stdout:3/204: dread dd/d18/f12 [0,4194304] 0
2026-03-10T12:37:44.227 INFO:tasks.workunit.client.0.vm00.stdout:3/205: creat dd/d27/d2c/d34/d38/f48 x:0 0 0
2026-03-10T12:37:44.228 INFO:tasks.workunit.client.0.vm00.stdout:3/206: dread dd/d3d/f3e [0,4194304] 0
2026-03-10T12:37:44.232 INFO:tasks.workunit.client.0.vm00.stdout:3/207: link l6 dd/d27/d2c/d34/l49 0
2026-03-10T12:37:44.234 INFO:tasks.workunit.client.0.vm00.stdout:3/208: rename dd/d18/d13/d1d/l37 to dd/d2a/l4a 0
2026-03-10T12:37:44.234 INFO:tasks.workunit.client.0.vm00.stdout:3/209: dread - dd/d18/d14/f3c zero size
2026-03-10T12:37:44.236 INFO:tasks.workunit.client.0.vm00.stdout:3/210: mkdir dd/d27/d2c/d34/d45/d4b 0
2026-03-10T12:37:44.236 INFO:tasks.workunit.client.0.vm00.stdout:3/211: stat c1 0
2026-03-10T12:37:44.237 INFO:tasks.workunit.client.0.vm00.stdout:3/212: mknod dd/d27/d2c/d34/c4c 0
2026-03-10T12:37:44.238 INFO:tasks.workunit.client.0.vm00.stdout:3/213: write dd/d18/f12 [537804,109077] 0
2026-03-10T12:37:44.240 INFO:tasks.workunit.client.0.vm00.stdout:3/214: symlink dd/d18/d13/d1d/d43/l4d 0
2026-03-10T12:37:44.244 INFO:tasks.workunit.client.0.vm00.stdout:3/215: mkdir dd/d4e 0
2026-03-10T12:37:44.246 INFO:tasks.workunit.client.1.vm07.stdout:7/326: truncate d0/f42 313604 0
2026-03-10T12:37:44.247 INFO:tasks.workunit.client.1.vm07.stdout:7/327: chown d0/f2b 18 1
2026-03-10T12:37:44.247 INFO:tasks.workunit.client.0.vm00.stdout:3/216: dwrite dd/d18/d14/d2b/f31 [0,4194304] 0
2026-03-10T12:37:44.254 INFO:tasks.workunit.client.0.vm00.stdout:3/217: mknod dd/d18/d13/d1d/c4f 0
2026-03-10T12:37:44.255 INFO:tasks.workunit.client.0.vm00.stdout:3/218: readlink dd/d18/d13/d1d/l2d 0
2026-03-10T12:37:44.256 INFO:tasks.workunit.client.0.vm00.stdout:3/219: fsync dd/d18/d14/f3c 0
2026-03-10T12:37:44.256 INFO:tasks.workunit.client.0.vm00.stdout:3/220: fdatasync dd/f25 0
2026-03-10T12:37:44.258 INFO:tasks.workunit.client.0.vm00.stdout:3/221: dread f7 [4194304,4194304] 0
2026-03-10T12:37:44.259 INFO:tasks.workunit.client.0.vm00.stdout:3/222: write f7 [288640,64404] 0
2026-03-10T12:37:44.260 INFO:tasks.workunit.client.0.vm00.stdout:3/223: write dd/d18/d14/d2b/f31 [4018538,54547] 0
2026-03-10T12:37:44.260 INFO:tasks.workunit.client.0.vm00.stdout:3/224: dread - dd/d27/f44 zero size
2026-03-10T12:37:44.262 INFO:tasks.workunit.client.1.vm07.stdout:5/371: mkdir d0/d22/d18/d80 0
2026-03-10T12:37:44.273 INFO:tasks.workunit.client.0.vm00.stdout:3/225: creat dd/d3d/f50 x:0 0 0
2026-03-10T12:37:44.273 INFO:tasks.workunit.client.0.vm00.stdout:3/226: dwrite dd/d27/d2c/d34/d38/f48 [0,4194304] 0
2026-03-10T12:37:44.273 INFO:tasks.workunit.client.1.vm07.stdout:5/372: read d0/f1f [1925375,26934] 0
2026-03-10T12:37:44.273 INFO:tasks.workunit.client.1.vm07.stdout:3/379: rmdir dc/dd/d28 39
2026-03-10T12:37:44.273 INFO:tasks.workunit.client.1.vm07.stdout:0/413: fdatasync d0/d14/d5f/d3b/f4b 0
2026-03-10T12:37:44.273 INFO:tasks.workunit.client.1.vm07.stdout:9/313: mknod d5/d13/d57/d3e/c72 0
2026-03-10T12:37:44.275 INFO:tasks.workunit.client.1.vm07.stdout:9/314: truncate d5/d13/d2c/f44 1534668 0
2026-03-10T12:37:44.278 INFO:tasks.workunit.client.1.vm07.stdout:7/328: mknod d0/d47/c65 0
2026-03-10T12:37:44.278 INFO:tasks.workunit.client.1.vm07.stdout:1/335: creat d9/f6d x:0 0 0
2026-03-10T12:37:44.279 INFO:tasks.workunit.client.1.vm07.stdout:1/336: write d9/f6d [91368,57975] 0
2026-03-10T12:37:44.283 INFO:tasks.workunit.client.0.vm00.stdout:1/216: fsync da/f14 0
2026-03-10T12:37:44.286 INFO:tasks.workunit.client.0.vm00.stdout:1/217: symlink da/d12/d26/d42/l4e 0
2026-03-10T12:37:44.289 INFO:tasks.workunit.client.0.vm00.stdout:1/218: creat da/d21/d39/f4f x:0 0 0
2026-03-10T12:37:44.290 INFO:tasks.workunit.client.0.vm00.stdout:9/233: dread d0/d5/d16/d1e/d2b/f36 [0,4194304] 0
2026-03-10T12:37:44.292 INFO:tasks.workunit.client.0.vm00.stdout:1/219: symlink da/d24/l50 0
2026-03-10T12:37:44.293 INFO:tasks.workunit.client.0.vm00.stdout:1/220: write da/d24/d28/f37 [591527,92208] 0
2026-03-10T12:37:44.294 INFO:tasks.workunit.client.0.vm00.stdout:1/221: write da/d24/d28/f37 [1504327,126850] 0
2026-03-10T12:37:44.298 INFO:tasks.workunit.client.0.vm00.stdout:1/222: mknod da/c51 0
2026-03-10T12:37:44.301 INFO:tasks.workunit.client.0.vm00.stdout:1/223: dwrite da/d12/f1d [0,4194304] 0
2026-03-10T12:37:44.303 INFO:tasks.workunit.client.0.vm00.stdout:1/224: truncate da/d21/d39/f4f 504412 0
2026-03-10T12:37:44.305 INFO:tasks.workunit.client.0.vm00.stdout:9/234: symlink d0/d3d/d43/d53/l56 0
2026-03-10T12:37:44.315 INFO:tasks.workunit.client.1.vm07.stdout:9/315: creat d5/d13/d57/f73 x:0 0 0
2026-03-10T12:37:44.315 INFO:tasks.workunit.client.1.vm07.stdout:1/337: rmdir d9/df/d29/d2b/d3d 39
2026-03-10T12:37:44.343 INFO:tasks.workunit.client.0.vm00.stdout:0/259: dwrite d3/db/d24/f2f [4194304,4194304] 0
2026-03-10T12:37:44.349 INFO:tasks.workunit.client.0.vm00.stdout:4/183: write f8 [3047816,116072] 0
2026-03-10T12:37:44.352 INFO:tasks.workunit.client.1.vm07.stdout:4/442: dwrite d0/d4/d5/da/f6e [0,4194304] 0
2026-03-10T12:37:44.355 INFO:tasks.workunit.client.0.vm00.stdout:8/189: write d0/d12/f34 [5112755,57403] 0
2026-03-10T12:37:44.360 INFO:tasks.workunit.client.0.vm00.stdout:9/235: rename d0/d5/d16/d19/d2f to d0/d3d/d43/d53/d57 0
2026-03-10T12:37:44.362 INFO:tasks.workunit.client.0.vm00.stdout:0/260: symlink d3/db/d24/l5d 0
2026-03-10T12:37:44.364 INFO:tasks.workunit.client.1.vm07.stdout:2/251: truncate d0/d42/d1f/f2f 248917 0
2026-03-10T12:37:44.364 INFO:tasks.workunit.client.1.vm07.stdout:0/414: mkdir d0/d83 0
2026-03-10T12:37:44.365 INFO:tasks.workunit.client.1.vm07.stdout:6/302: getdents d1/d4/d6/d16 0
2026-03-10T12:37:44.365 INFO:tasks.workunit.client.1.vm07.stdout:3/380: write dc/dd/f1d [1173379,110594] 0
2026-03-10T12:37:44.366 INFO:tasks.workunit.client.1.vm07.stdout:8/377: write d1/d3/d18/f38 [3697397,78772] 0
2026-03-10T12:37:44.378 INFO:tasks.workunit.client.1.vm07.stdout:9/316: mkdir d5/d1f/d31/d74 0
2026-03-10T12:37:44.378 INFO:tasks.workunit.client.0.vm00.stdout:2/196: getdents d4/d6/d2d 0
2026-03-10T12:37:44.378 INFO:tasks.workunit.client.0.vm00.stdout:0/261: truncate d3/d40/f59 477379 0
2026-03-10T12:37:44.378 INFO:tasks.workunit.client.0.vm00.stdout:5/195: rmdir d1f 39
2026-03-10T12:37:44.378 INFO:tasks.workunit.client.0.vm00.stdout:4/184: write df/d1f/d22/d26/f39 [794972,80052] 0
2026-03-10T12:37:44.378 INFO:tasks.workunit.client.0.vm00.stdout:8/190: creat d0/d12/d36/f39 x:0 0 0
2026-03-10T12:37:44.378 INFO:tasks.workunit.client.1.vm07.stdout:8/378: dwrite d1/d3/d6/d50/f56 [4194304,4194304] 0
2026-03-10T12:37:44.382 INFO:tasks.workunit.client.0.vm00.stdout:8/191: write d0/f22 [3724812,47400] 0
2026-03-10T12:37:44.384 INFO:tasks.workunit.client.0.vm00.stdout:2/197: mkdir d4/d6/d2d/d3a/d43 0
2026-03-10T12:37:44.387 INFO:tasks.workunit.client.0.vm00.stdout:5/196: creat
d1f/d26/d2b/f44 x:0 0 0 2026-03-10T12:37:44.390 INFO:tasks.workunit.client.0.vm00.stdout:0/262: unlink d3/d7/d4c/d5b/l1e 0 2026-03-10T12:37:44.396 INFO:tasks.workunit.client.0.vm00.stdout:1/225: fdatasync da/d12/f1d 0 2026-03-10T12:37:44.399 INFO:tasks.workunit.client.0.vm00.stdout:4/185: getdents df/d24 0 2026-03-10T12:37:44.400 INFO:tasks.workunit.client.0.vm00.stdout:1/226: link f3 da/d12/d26/d42/f52 0 2026-03-10T12:37:44.401 INFO:tasks.workunit.client.0.vm00.stdout:1/227: read da/d24/d28/f3c [1982604,55054] 0 2026-03-10T12:37:44.405 INFO:tasks.workunit.client.0.vm00.stdout:3/227: dwrite dd/d27/f35 [0,4194304] 0 2026-03-10T12:37:44.407 INFO:tasks.workunit.client.0.vm00.stdout:1/228: dread da/d12/d26/f31 [0,4194304] 0 2026-03-10T12:37:44.407 INFO:tasks.workunit.client.0.vm00.stdout:4/186: symlink df/d1f/d36/d3a/l3b 0 2026-03-10T12:37:44.409 INFO:tasks.workunit.client.0.vm00.stdout:1/229: write da/d24/f45 [4605820,52526] 0 2026-03-10T12:37:44.409 INFO:tasks.workunit.client.0.vm00.stdout:1/230: truncate da/d24/f45 5358805 0 2026-03-10T12:37:44.412 INFO:tasks.workunit.client.0.vm00.stdout:1/231: dwrite da/d24/f45 [0,4194304] 0 2026-03-10T12:37:44.420 INFO:tasks.workunit.client.0.vm00.stdout:1/232: creat da/d24/f53 x:0 0 0 2026-03-10T12:37:44.421 INFO:tasks.workunit.client.0.vm00.stdout:3/228: getdents dd/d18/d14 0 2026-03-10T12:37:44.424 INFO:tasks.workunit.client.0.vm00.stdout:4/187: creat df/d1f/d22/f3c x:0 0 0 2026-03-10T12:37:44.430 INFO:tasks.workunit.client.0.vm00.stdout:3/229: dwrite dd/d27/f35 [0,4194304] 0 2026-03-10T12:37:44.430 INFO:tasks.workunit.client.0.vm00.stdout:1/233: dread da/d24/f32 [0,4194304] 0 2026-03-10T12:37:44.431 INFO:tasks.workunit.client.0.vm00.stdout:1/234: write da/d12/d26/f2e [322225,40078] 0 2026-03-10T12:37:44.438 INFO:tasks.workunit.client.0.vm00.stdout:3/230: symlink dd/d18/d14/l51 0 2026-03-10T12:37:44.438 INFO:tasks.workunit.client.0.vm00.stdout:2/198: sync 2026-03-10T12:37:44.438 INFO:tasks.workunit.client.0.vm00.stdout:5/197: 
sync 2026-03-10T12:37:44.441 INFO:tasks.workunit.client.0.vm00.stdout:5/198: dread d1f/d26/f28 [0,4194304] 0 2026-03-10T12:37:44.442 INFO:tasks.workunit.client.0.vm00.stdout:2/199: creat d4/d6/d2d/d3a/f44 x:0 0 0 2026-03-10T12:37:44.442 INFO:tasks.workunit.client.0.vm00.stdout:2/200: dread - d4/f39 zero size 2026-03-10T12:37:44.443 INFO:tasks.workunit.client.0.vm00.stdout:1/235: read da/d12/f30 [1115452,127370] 0 2026-03-10T12:37:44.446 INFO:tasks.workunit.client.0.vm00.stdout:2/201: dwrite d4/d6/f16 [4194304,4194304] 0 2026-03-10T12:37:44.452 INFO:tasks.workunit.client.0.vm00.stdout:1/236: creat da/d21/d27/f54 x:0 0 0 2026-03-10T12:37:44.461 INFO:tasks.workunit.client.1.vm07.stdout:6/303: creat d1/d4/d6/f60 x:0 0 0 2026-03-10T12:37:44.464 INFO:tasks.workunit.client.0.vm00.stdout:4/188: dread f8 [0,4194304] 0 2026-03-10T12:37:44.467 INFO:tasks.workunit.client.1.vm07.stdout:7/329: dwrite d0/f37 [0,4194304] 0 2026-03-10T12:37:44.471 INFO:tasks.workunit.client.0.vm00.stdout:0/263: mknod d3/d7/d4c/d5b/d38/d44/d5a/c5e 0 2026-03-10T12:37:44.482 INFO:tasks.workunit.client.0.vm00.stdout:2/202: creat d4/dd/f45 x:0 0 0 2026-03-10T12:37:44.482 INFO:tasks.workunit.client.0.vm00.stdout:2/203: dread d4/d6/f22 [0,4194304] 0 2026-03-10T12:37:44.482 INFO:tasks.workunit.client.0.vm00.stdout:4/189: creat df/f3d x:0 0 0 2026-03-10T12:37:44.482 INFO:tasks.workunit.client.0.vm00.stdout:2/204: creat d4/d6/d2d/d31/f46 x:0 0 0 2026-03-10T12:37:44.482 INFO:tasks.workunit.client.0.vm00.stdout:4/190: rmdir df/d1f 39 2026-03-10T12:37:44.482 INFO:tasks.workunit.client.0.vm00.stdout:4/191: dwrite fa [0,4194304] 0 2026-03-10T12:37:44.488 INFO:tasks.workunit.client.0.vm00.stdout:4/192: dread - df/f29 zero size 2026-03-10T12:37:44.489 INFO:tasks.workunit.client.0.vm00.stdout:4/193: truncate fa 4865292 0 2026-03-10T12:37:44.489 INFO:tasks.workunit.client.0.vm00.stdout:4/194: write df/f19 [1459783,37492] 0 2026-03-10T12:37:44.494 INFO:tasks.workunit.client.0.vm00.stdout:4/195: dread df/d24/f33 
[0,4194304] 0 2026-03-10T12:37:44.502 INFO:tasks.workunit.client.1.vm07.stdout:9/317: mkdir d5/d1f/d75 0 2026-03-10T12:37:44.502 INFO:tasks.workunit.client.1.vm07.stdout:9/318: chown d5/d13/d57/d3e 23185892 1 2026-03-10T12:37:44.502 INFO:tasks.workunit.client.0.vm00.stdout:4/196: mknod df/d1f/d36/c3e 0 2026-03-10T12:37:44.503 INFO:tasks.workunit.client.0.vm00.stdout:4/197: symlink df/d1f/d22/l3f 0 2026-03-10T12:37:44.503 INFO:tasks.workunit.client.0.vm00.stdout:4/198: chown df/d1f/d22/d26/d2e/c2a 19273 1 2026-03-10T12:37:44.503 INFO:tasks.workunit.client.0.vm00.stdout:4/199: mkdir df/d1f/d36/d40 0 2026-03-10T12:37:44.503 INFO:tasks.workunit.client.0.vm00.stdout:4/200: getdents df/d32 0 2026-03-10T12:37:44.505 INFO:tasks.workunit.client.1.vm07.stdout:5/373: link d0/cb d0/d22/c81 0 2026-03-10T12:37:44.506 INFO:tasks.workunit.client.1.vm07.stdout:5/374: chown d0/d22/f50 16354020 1 2026-03-10T12:37:44.507 INFO:tasks.workunit.client.1.vm07.stdout:7/330: creat d0/d61/f66 x:0 0 0 2026-03-10T12:37:44.518 INFO:tasks.workunit.client.1.vm07.stdout:0/415: rename d0/d14/d5f/d76/d2f/d31/c67 to d0/d14/d5f/d76/d2f/d31/d79/c84 0 2026-03-10T12:37:44.519 INFO:tasks.workunit.client.1.vm07.stdout:0/416: chown d0/d14/d5f/d76/d2f/l4a 7620210 1 2026-03-10T12:37:44.521 INFO:tasks.workunit.client.1.vm07.stdout:9/319: mkdir d5/d1f/d31/d76 0 2026-03-10T12:37:44.521 INFO:tasks.workunit.client.0.vm00.stdout:5/199: dread f11 [0,4194304] 0 2026-03-10T12:37:44.523 INFO:tasks.workunit.client.0.vm00.stdout:5/200: dread d1f/f25 [4194304,4194304] 0 2026-03-10T12:37:44.527 INFO:tasks.workunit.client.0.vm00.stdout:5/201: dwrite d1f/f21 [4194304,4194304] 0 2026-03-10T12:37:44.530 INFO:tasks.workunit.client.1.vm07.stdout:8/379: mknod d1/d3/c7a 0 2026-03-10T12:37:44.530 INFO:tasks.workunit.client.0.vm00.stdout:5/202: mknod d1f/d26/d2b/c45 0 2026-03-10T12:37:44.531 INFO:tasks.workunit.client.1.vm07.stdout:0/417: mkdir d0/d14/d5f/d76/d2f/d31/d79/d85 0 2026-03-10T12:37:44.531 
INFO:tasks.workunit.client.1.vm07.stdout:9/320: truncate d5/f8 8061483 0 2026-03-10T12:37:44.532 INFO:tasks.workunit.client.1.vm07.stdout:5/375: symlink d0/d22/d18/d80/l82 0 2026-03-10T12:37:44.532 INFO:tasks.workunit.client.1.vm07.stdout:7/331: mkdir d0/d67 0 2026-03-10T12:37:44.534 INFO:tasks.workunit.client.1.vm07.stdout:0/418: readlink d0/d14/d5f/l68 0 2026-03-10T12:37:44.538 INFO:tasks.workunit.client.1.vm07.stdout:6/304: rename d1/d4/f31 to d1/d4/d6/d16/d1a/d33/f61 0 2026-03-10T12:37:44.538 INFO:tasks.workunit.client.1.vm07.stdout:8/380: mkdir d1/d3/d6/d7b 0 2026-03-10T12:37:44.538 INFO:tasks.workunit.client.1.vm07.stdout:9/321: symlink d5/d13/d57/d3e/l77 0 2026-03-10T12:37:44.541 INFO:tasks.workunit.client.1.vm07.stdout:9/322: chown d5/d16/f19 203153 1 2026-03-10T12:37:44.563 INFO:tasks.workunit.client.0.vm00.stdout:9/236: rmdir d0/d3d/d43/d53/d57 39 2026-03-10T12:37:44.565 INFO:tasks.workunit.client.0.vm00.stdout:9/237: truncate d0/d5/d16/d19/f1b 2280308 0 2026-03-10T12:37:44.566 INFO:tasks.workunit.client.0.vm00.stdout:5/203: read d1f/f27 [279890,77841] 0 2026-03-10T12:37:44.575 INFO:tasks.workunit.client.1.vm07.stdout:1/338: write d9/df/f10 [924196,32829] 0 2026-03-10T12:37:44.575 INFO:tasks.workunit.client.1.vm07.stdout:1/339: dwrite d9/f1a [4194304,4194304] 0 2026-03-10T12:37:44.575 INFO:tasks.workunit.client.0.vm00.stdout:5/204: write d1f/d26/d2e/f3a [475613,8664] 0 2026-03-10T12:37:44.575 INFO:tasks.workunit.client.0.vm00.stdout:5/205: chown f19 1842 1 2026-03-10T12:37:44.575 INFO:tasks.workunit.client.0.vm00.stdout:9/238: symlink d0/d5/d16/d1e/d27/l58 0 2026-03-10T12:37:44.575 INFO:tasks.workunit.client.0.vm00.stdout:5/206: dread f16 [0,4194304] 0 2026-03-10T12:37:44.575 INFO:tasks.workunit.client.0.vm00.stdout:5/207: dread d1f/f21 [0,4194304] 0 2026-03-10T12:37:44.576 INFO:tasks.workunit.client.1.vm07.stdout:4/443: write d0/d4/d7a/f4f [1560370,54310] 0 2026-03-10T12:37:44.578 INFO:tasks.workunit.client.1.vm07.stdout:2/252: write d0/d42/d1f/d20/f3f 
[849558,75229] 0 2026-03-10T12:37:44.579 INFO:tasks.workunit.client.1.vm07.stdout:7/332: rename d0/d47/l5c to d0/d61/l68 0 2026-03-10T12:37:44.590 INFO:tasks.workunit.client.1.vm07.stdout:1/340: creat d9/d2d/d4f/d5a/f6e x:0 0 0 2026-03-10T12:37:44.593 INFO:tasks.workunit.client.0.vm00.stdout:7/220: chown da/d25/d2c 130385 1 2026-03-10T12:37:44.599 INFO:tasks.workunit.client.1.vm07.stdout:9/323: mknod d5/d13/d6c/c78 0 2026-03-10T12:37:44.600 INFO:tasks.workunit.client.1.vm07.stdout:7/333: creat d0/d61/f69 x:0 0 0 2026-03-10T12:37:44.600 INFO:tasks.workunit.client.1.vm07.stdout:1/341: fsync d9/df/d29/d2b/d30/f38 0 2026-03-10T12:37:44.601 INFO:tasks.workunit.client.1.vm07.stdout:9/324: truncate d5/d13/d2c/f41 5562750 0 2026-03-10T12:37:44.616 INFO:tasks.workunit.client.0.vm00.stdout:7/221: unlink f8 0 2026-03-10T12:37:44.617 INFO:tasks.workunit.client.1.vm07.stdout:5/376: truncate d0/d22/d18/d19/d2e/f59 1080083 0 2026-03-10T12:37:44.619 INFO:tasks.workunit.client.1.vm07.stdout:4/444: link d0/d4/d10/d18/f1a d0/d4/d10/d3c/d2b/d2d/f99 0 2026-03-10T12:37:44.621 INFO:tasks.workunit.client.1.vm07.stdout:4/445: dread - d0/d4/d10/d3c/f68 zero size 2026-03-10T12:37:44.624 INFO:tasks.workunit.client.1.vm07.stdout:7/334: symlink d0/d47/l6a 0 2026-03-10T12:37:44.628 INFO:tasks.workunit.client.1.vm07.stdout:9/325: truncate d5/d13/d22/f32 261257 0 2026-03-10T12:37:44.656 INFO:tasks.workunit.client.1.vm07.stdout:2/253: creat d0/d42/f53 x:0 0 0 2026-03-10T12:37:44.656 INFO:tasks.workunit.client.1.vm07.stdout:4/446: read d0/d4/d5/d34/f37 [2502308,12520] 0 2026-03-10T12:37:44.656 INFO:tasks.workunit.client.1.vm07.stdout:9/326: write d5/d16/d23/d26/f5c [1027844,129778] 0 2026-03-10T12:37:44.656 INFO:tasks.workunit.client.1.vm07.stdout:4/447: chown d0/d4/d5/d34/f94 409 1 2026-03-10T12:37:44.656 INFO:tasks.workunit.client.1.vm07.stdout:6/305: rename d1/f34 to d1/d4/f62 0 2026-03-10T12:37:44.656 INFO:tasks.workunit.client.1.vm07.stdout:7/335: symlink d0/d61/l6b 0 2026-03-10T12:37:44.656 
INFO:tasks.workunit.client.1.vm07.stdout:9/327: unlink d5/cc 0 2026-03-10T12:37:44.656 INFO:tasks.workunit.client.1.vm07.stdout:9/328: dwrite d5/d13/f67 [0,4194304] 0 2026-03-10T12:37:44.656 INFO:tasks.workunit.client.1.vm07.stdout:7/336: link d0/d52/f5d d0/d57/d62/f6c 0 2026-03-10T12:37:44.656 INFO:tasks.workunit.client.1.vm07.stdout:4/448: getdents d0/d4/d5/da/d66 0 2026-03-10T12:37:44.664 INFO:tasks.workunit.client.1.vm07.stdout:9/329: dread d5/d13/d22/f36 [0,4194304] 0 2026-03-10T12:37:44.664 INFO:tasks.workunit.client.1.vm07.stdout:4/449: dread d0/d4/d10/d3c/d2b/d2d/f99 [0,4194304] 0 2026-03-10T12:37:44.665 INFO:tasks.workunit.client.1.vm07.stdout:9/330: write d5/d1f/d31/f56 [1630806,65] 0 2026-03-10T12:37:44.665 INFO:tasks.workunit.client.1.vm07.stdout:7/337: dwrite d0/d57/d62/f6c [0,4194304] 0 2026-03-10T12:37:44.680 INFO:tasks.workunit.client.1.vm07.stdout:9/331: fsync d5/f1c 0 2026-03-10T12:37:44.682 INFO:tasks.workunit.client.1.vm07.stdout:9/332: symlink d5/d13/d57/l79 0 2026-03-10T12:37:44.686 INFO:tasks.workunit.client.1.vm07.stdout:9/333: dwrite d5/d13/d2c/f38 [0,4194304] 0 2026-03-10T12:37:44.689 INFO:tasks.workunit.client.1.vm07.stdout:9/334: mkdir d5/d13/d6c/d7a 0 2026-03-10T12:37:44.689 INFO:tasks.workunit.client.1.vm07.stdout:9/335: readlink d5/d13/d57/d3e/l49 0 2026-03-10T12:37:44.691 INFO:tasks.workunit.client.1.vm07.stdout:9/336: mknod d5/d1f/d5e/d6b/c7b 0 2026-03-10T12:37:44.718 INFO:tasks.workunit.client.1.vm07.stdout:9/337: write d5/d1f/f3d [69395,27193] 0 2026-03-10T12:37:44.719 INFO:tasks.workunit.client.1.vm07.stdout:9/338: readlink d5/d1f/l2e 0 2026-03-10T12:37:44.719 INFO:tasks.workunit.client.1.vm07.stdout:9/339: symlink d5/d16/d23/l7c 0 2026-03-10T12:37:44.719 INFO:tasks.workunit.client.1.vm07.stdout:2/254: dread d0/f1d [0,4194304] 0 2026-03-10T12:37:44.719 INFO:tasks.workunit.client.1.vm07.stdout:9/340: stat d5/d13/f14 0 2026-03-10T12:37:44.719 INFO:tasks.workunit.client.1.vm07.stdout:2/255: readlink d0/d42/l21 0 
2026-03-10T12:37:44.719 INFO:tasks.workunit.client.1.vm07.stdout:2/256: fsync d0/d42/d1f/d20/f3f 0 2026-03-10T12:37:44.719 INFO:tasks.workunit.client.1.vm07.stdout:2/257: chown d0/d42/d1f/d20/f2b 191 1 2026-03-10T12:37:44.719 INFO:tasks.workunit.client.1.vm07.stdout:2/258: rmdir d0/d45 39 2026-03-10T12:37:44.719 INFO:tasks.workunit.client.1.vm07.stdout:2/259: mkdir d0/d45/d54 0 2026-03-10T12:37:44.719 INFO:tasks.workunit.client.1.vm07.stdout:2/260: rename d0/d42/d1f/c36 to d0/d42/c55 0 2026-03-10T12:37:44.719 INFO:tasks.workunit.client.1.vm07.stdout:2/261: mkdir d0/d42/d4e/d56 0 2026-03-10T12:37:44.729 INFO:tasks.workunit.client.0.vm00.stdout:7/222: sync 2026-03-10T12:37:44.733 INFO:tasks.workunit.client.0.vm00.stdout:7/223: dwrite da/d25/d2c/f4f [0,4194304] 0 2026-03-10T12:37:44.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:44 vm00.local ceph-mon[50686]: pgmap v157: 65 pgs: 65 active+clean; 1.1 GiB data, 4.7 GiB used, 115 GiB / 120 GiB avail; 17 MiB/s rd, 118 MiB/s wr, 244 op/s 2026-03-10T12:37:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:44 vm07.local ceph-mon[58582]: pgmap v157: 65 pgs: 65 active+clean; 1.1 GiB data, 4.7 GiB used, 115 GiB / 120 GiB avail; 17 MiB/s rd, 118 MiB/s wr, 244 op/s 2026-03-10T12:37:44.826 INFO:tasks.workunit.client.0.vm00.stdout:1/237: fsync da/d12/d26/d42/f52 0 2026-03-10T12:37:44.836 INFO:tasks.workunit.client.0.vm00.stdout:0/264: dwrite d3/f4 [0,4194304] 0 2026-03-10T12:37:44.836 INFO:tasks.workunit.client.0.vm00.stdout:1/238: write f3 [1937175,81596] 0 2026-03-10T12:37:44.836 INFO:tasks.workunit.client.0.vm00.stdout:1/239: rmdir da/d24 39 2026-03-10T12:37:44.838 INFO:tasks.workunit.client.0.vm00.stdout:1/240: dwrite da/d12/f1d [0,4194304] 0 2026-03-10T12:37:44.843 INFO:tasks.workunit.client.0.vm00.stdout:0/265: creat d3/d7/d4c/d5b/f5f x:0 0 0 2026-03-10T12:37:44.844 INFO:tasks.workunit.client.0.vm00.stdout:1/241: dread da/d12/f20 [0,4194304] 0 2026-03-10T12:37:44.845 
INFO:tasks.workunit.client.0.vm00.stdout:1/242: truncate da/d12/d26/f2e 1167550 0 2026-03-10T12:37:44.849 INFO:tasks.workunit.client.0.vm00.stdout:0/266: getdents d3 0 2026-03-10T12:37:44.849 INFO:tasks.workunit.client.0.vm00.stdout:1/243: dwrite da/d21/d27/f54 [0,4194304] 0 2026-03-10T12:37:44.850 INFO:tasks.workunit.client.0.vm00.stdout:0/267: stat d3/d7/d3c/f19 0 2026-03-10T12:37:44.852 INFO:tasks.workunit.client.0.vm00.stdout:1/244: creat da/d21/d39/f55 x:0 0 0 2026-03-10T12:37:44.924 INFO:tasks.workunit.client.0.vm00.stdout:0/268: read d3/db/f45 [3225800,18674] 0 2026-03-10T12:37:44.928 INFO:tasks.workunit.client.0.vm00.stdout:0/269: dwrite d3/d7/d4c/d5b/f57 [0,4194304] 0 2026-03-10T12:37:44.931 INFO:tasks.workunit.client.0.vm00.stdout:0/270: mknod d3/db/d24/c60 0 2026-03-10T12:37:44.934 INFO:tasks.workunit.client.0.vm00.stdout:0/271: dwrite d3/d22/f42 [0,4194304] 0 2026-03-10T12:37:44.939 INFO:tasks.workunit.client.0.vm00.stdout:0/272: symlink d3/d33/l61 0 2026-03-10T12:37:44.985 INFO:tasks.workunit.client.0.vm00.stdout:4/201: dwrite fb [0,4194304] 0 2026-03-10T12:37:44.985 INFO:tasks.workunit.client.1.vm07.stdout:3/381: truncate dc/dd/d1f/d45/f56 2157890 0 2026-03-10T12:37:44.994 INFO:tasks.workunit.client.1.vm07.stdout:3/382: fsync f1 0 2026-03-10T12:37:45.028 INFO:tasks.workunit.client.1.vm07.stdout:8/381: dwrite d1/f3e [0,4194304] 0 2026-03-10T12:37:45.028 INFO:tasks.workunit.client.1.vm07.stdout:4/450: rename d0/d4/d10/d18 to d0/d4/d10/d9a 0 2026-03-10T12:37:45.028 INFO:tasks.workunit.client.1.vm07.stdout:4/451: rename d0/d5c/d7c/c92 to d0/d4/d10/d8d/c9b 0 2026-03-10T12:37:45.028 INFO:tasks.workunit.client.1.vm07.stdout:5/377: dwrite d0/fa [0,4194304] 0 2026-03-10T12:37:45.029 INFO:tasks.workunit.client.1.vm07.stdout:5/378: dread d0/d22/d18/f20 [0,4194304] 0 2026-03-10T12:37:45.029 INFO:tasks.workunit.client.1.vm07.stdout:4/452: mkdir d0/d4/d10/d3c/d2b/d2d/d9c 0 2026-03-10T12:37:45.029 INFO:tasks.workunit.client.1.vm07.stdout:4/453: chown d0/d4/d10/c89 
524 1 2026-03-10T12:37:45.029 INFO:tasks.workunit.client.1.vm07.stdout:3/383: read dc/d18/d24/f2c [1299252,3209] 0 2026-03-10T12:37:45.032 INFO:tasks.workunit.client.1.vm07.stdout:5/379: symlink d0/d22/d18/d3e/d5d/l83 0 2026-03-10T12:37:45.032 INFO:tasks.workunit.client.1.vm07.stdout:4/454: fsync d0/d4/d10/d3c/d2b/d2d/f99 0 2026-03-10T12:37:45.033 INFO:tasks.workunit.client.1.vm07.stdout:4/455: readlink d0/d4/d5/da/l1b 0 2026-03-10T12:37:45.037 INFO:tasks.workunit.client.1.vm07.stdout:3/384: unlink dc/dd/c31 0 2026-03-10T12:37:45.038 INFO:tasks.workunit.client.1.vm07.stdout:4/456: symlink d0/d4/d5/da/d95/l9d 0 2026-03-10T12:37:45.043 INFO:tasks.workunit.client.1.vm07.stdout:5/380: creat d0/d22/d18/d3e/d53/f84 x:0 0 0 2026-03-10T12:37:45.073 INFO:tasks.workunit.client.1.vm07.stdout:3/385: mknod dc/dd/d28/c89 0 2026-03-10T12:37:45.074 INFO:tasks.workunit.client.1.vm07.stdout:5/381: stat d0/d22/d18/d19/d21/d54/c6c 0 2026-03-10T12:37:45.074 INFO:tasks.workunit.client.1.vm07.stdout:3/386: stat dc/dd/d1f/l23 0 2026-03-10T12:37:45.082 INFO:tasks.workunit.client.1.vm07.stdout:4/457: dread d0/d4/d10/d3c/d2b/f60 [0,4194304] 0 2026-03-10T12:37:45.090 INFO:tasks.workunit.client.1.vm07.stdout:4/458: symlink d0/d4/d5/da/d66/l9e 0 2026-03-10T12:37:45.094 INFO:tasks.workunit.client.1.vm07.stdout:4/459: dwrite d0/d4/d10/f36 [0,4194304] 0 2026-03-10T12:37:45.111 INFO:tasks.workunit.client.1.vm07.stdout:5/382: rmdir d0/d22/d18/d19/d2e/d3f/d63 0 2026-03-10T12:37:45.123 INFO:tasks.workunit.client.1.vm07.stdout:5/383: creat d0/d22/d18/d19/d21/d3a/f85 x:0 0 0 2026-03-10T12:37:45.130 INFO:tasks.workunit.client.1.vm07.stdout:4/460: rename d0/d4/d5/da/l1b to d0/d4/d10/l9f 0 2026-03-10T12:37:45.141 INFO:tasks.workunit.client.1.vm07.stdout:5/384: truncate d0/fd 477564 0 2026-03-10T12:37:45.143 INFO:tasks.workunit.client.1.vm07.stdout:5/385: unlink d0/d22/d18/d19/d21/f2d 0 2026-03-10T12:37:45.146 INFO:tasks.workunit.client.1.vm07.stdout:5/386: creat d0/d22/d18/f86 x:0 0 0 
2026-03-10T12:37:45.147 INFO:tasks.workunit.client.1.vm07.stdout:5/387: dread - d0/d22/d18/d19/d21/d54/f7d zero size 2026-03-10T12:37:45.147 INFO:tasks.workunit.client.1.vm07.stdout:5/388: chown d0/d22/d18/d3e/l40 992021 1 2026-03-10T12:37:45.157 INFO:tasks.workunit.client.1.vm07.stdout:5/389: link d0/d22/d18/d19/d21/d3a/f4f d0/d22/d18/d19/d2e/d3f/f87 0 2026-03-10T12:37:45.163 INFO:tasks.workunit.client.1.vm07.stdout:4/461: dread d0/d4/d5/da/f15 [8388608,4194304] 0 2026-03-10T12:37:45.163 INFO:tasks.workunit.client.1.vm07.stdout:4/462: fsync d0/d4/d5/d34/f94 0 2026-03-10T12:37:45.163 INFO:tasks.workunit.client.1.vm07.stdout:5/390: rename d0/fa to d0/d22/d18/d19/d2e/f88 0 2026-03-10T12:37:45.163 INFO:tasks.workunit.client.1.vm07.stdout:4/463: dread d0/d4/d10/d9a/f1a [4194304,4194304] 0 2026-03-10T12:37:45.165 INFO:tasks.workunit.client.1.vm07.stdout:2/262: write d0/f46 [987397,81176] 0 2026-03-10T12:37:45.169 INFO:tasks.workunit.client.0.vm00.stdout:0/273: truncate d3/db/f16 374234 0 2026-03-10T12:37:45.171 INFO:tasks.workunit.client.1.vm07.stdout:2/263: chown d0/d42/d1f/d20/f2b 7536 1 2026-03-10T12:37:45.209 INFO:tasks.workunit.client.1.vm07.stdout:5/391: creat d0/d22/f89 x:0 0 0 2026-03-10T12:37:45.209 INFO:tasks.workunit.client.1.vm07.stdout:4/464: creat d0/d4/d7a/d46/d76/fa0 x:0 0 0 2026-03-10T12:37:45.209 INFO:tasks.workunit.client.1.vm07.stdout:5/392: rename d0/d22/d18/d19/f23 to d0/d22/d18/d19/d21/d54/f8a 0 2026-03-10T12:37:45.209 INFO:tasks.workunit.client.1.vm07.stdout:5/393: creat d0/d22/d18/d80/f8b x:0 0 0 2026-03-10T12:37:45.209 INFO:tasks.workunit.client.1.vm07.stdout:5/394: chown d0/d22/d18/d19/d2e/d3f/d5c/f76 346865512 1 2026-03-10T12:37:45.209 INFO:tasks.workunit.client.1.vm07.stdout:2/264: mkdir d0/d42/d4e/d56/d57 0 2026-03-10T12:37:45.209 INFO:tasks.workunit.client.1.vm07.stdout:5/395: rmdir d0/d22/d18/d3e/d5d 39 2026-03-10T12:37:45.209 INFO:tasks.workunit.client.1.vm07.stdout:2/265: creat d0/d42/d26/d4b/f58 x:0 0 0 2026-03-10T12:37:45.209 
INFO:tasks.workunit.client.1.vm07.stdout:2/266: mknod d0/d42/d26/d4b/c59 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/274: link d3/d7/c1f d3/d7/d4c/d5b/d38/d44/c62 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/275: readlink d3/d7/d3c/l20 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/276: creat d3/d7/d58/f63 x:0 0 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/277: stat d3/d7/d4c/d5b/d38/d44 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/278: creat d3/d33/f64 x:0 0 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/279: mkdir d3/d40/d65 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/280: dwrite d3/d22/f54 [0,4194304] 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/281: mknod d3/db/d24/d25/c66 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/282: chown d3/d40/d65 780340 1 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/283: chown d3/d7/d4c/d5b/d38 3070 1 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/284: symlink d3/db/d24/d25/l67 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/285: write d3/db/d24/f2f [448420,75809] 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/286: write d3/d7/d3c/f30 [2462808,90191] 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/287: creat d3/d7/d58/f68 x:0 0 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/288: getdents d3/d22 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/289: symlink d3/d7/d4c/d5b/d38/d44/l69 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/290: read d3/d7/d4c/d5b/f2a [1739451,25317] 0 2026-03-10T12:37:45.210 INFO:tasks.workunit.client.0.vm00.stdout:0/291: dread - d3/d7/d4c/d5b/f5f zero size 2026-03-10T12:37:45.213 INFO:tasks.workunit.client.0.vm00.stdout:0/292: rmdir 
d3/d22 39 2026-03-10T12:37:45.216 INFO:tasks.workunit.client.0.vm00.stdout:0/293: write d3/d7/d4c/d5b/f37 [4802003,12427] 0 2026-03-10T12:37:45.221 INFO:tasks.workunit.client.0.vm00.stdout:0/294: mknod d3/db/d24/c6a 0 2026-03-10T12:37:45.222 INFO:tasks.workunit.client.0.vm00.stdout:0/295: mknod d3/d7/d3c/c6b 0 2026-03-10T12:37:45.225 INFO:tasks.workunit.client.0.vm00.stdout:0/296: mknod d3/c6c 0 2026-03-10T12:37:45.228 INFO:tasks.workunit.client.0.vm00.stdout:0/297: mknod d3/d33/c6d 0 2026-03-10T12:37:45.271 INFO:tasks.workunit.client.1.vm07.stdout:3/387: dread dc/dd/d28/d3b/f4d [0,4194304] 0 2026-03-10T12:37:45.274 INFO:tasks.workunit.client.1.vm07.stdout:3/388: dwrite f1 [4194304,4194304] 0 2026-03-10T12:37:45.283 INFO:tasks.workunit.client.1.vm07.stdout:3/389: fdatasync dc/dd/f41 0 2026-03-10T12:37:45.288 INFO:tasks.workunit.client.1.vm07.stdout:3/390: fdatasync dc/dd/f41 0 2026-03-10T12:37:45.294 INFO:tasks.workunit.client.1.vm07.stdout:3/391: link dc/dd/d1f/l23 dc/dd/d43/l8a 0 2026-03-10T12:37:45.301 INFO:tasks.workunit.client.1.vm07.stdout:3/392: dread dc/dd/d28/f46 [0,4194304] 0 2026-03-10T12:37:45.432 INFO:tasks.workunit.client.1.vm07.stdout:9/341: dread d5/d16/d18/f1e [4194304,4194304] 0 2026-03-10T12:37:45.434 INFO:tasks.workunit.client.1.vm07.stdout:9/342: mkdir d5/d1f/d7d 0 2026-03-10T12:37:45.435 INFO:tasks.workunit.client.1.vm07.stdout:9/343: readlink d5/l7 0 2026-03-10T12:37:45.440 INFO:tasks.workunit.client.1.vm07.stdout:9/344: symlink d5/d16/d23/l7e 0 2026-03-10T12:37:45.443 INFO:tasks.workunit.client.1.vm07.stdout:9/345: rename d5/d13/d57/d3e/f5a to d5/d1f/d7d/f7f 0 2026-03-10T12:37:45.451 INFO:tasks.workunit.client.1.vm07.stdout:9/346: dread d5/fb [0,4194304] 0 2026-03-10T12:37:45.464 INFO:tasks.workunit.client.1.vm07.stdout:9/347: fdatasync d5/d13/d2c/f44 0 2026-03-10T12:37:45.469 INFO:tasks.workunit.client.1.vm07.stdout:9/348: link d5/d13/l55 d5/l80 0 2026-03-10T12:37:45.470 INFO:tasks.workunit.client.1.vm07.stdout:9/349: truncate 
d5/d16/d18/f20 318641 0 2026-03-10T12:37:45.471 INFO:tasks.workunit.client.1.vm07.stdout:9/350: mknod d5/d1f/d31/d74/c81 0 2026-03-10T12:37:45.477 INFO:tasks.workunit.client.1.vm07.stdout:9/351: dread d5/d1f/d31/f56 [0,4194304] 0 2026-03-10T12:37:45.482 INFO:tasks.workunit.client.1.vm07.stdout:9/352: write d5/d13/d57/d4f/f58 [394902,536] 0 2026-03-10T12:37:45.484 INFO:tasks.workunit.client.1.vm07.stdout:9/353: chown d5/c11 0 1 2026-03-10T12:37:45.484 INFO:tasks.workunit.client.1.vm07.stdout:9/354: chown d5/d16/d23/c6f 6619858 1 2026-03-10T12:37:45.490 INFO:tasks.workunit.client.0.vm00.stdout:1/245: truncate da/d12/d26/f31 2041183 0 2026-03-10T12:37:45.493 INFO:tasks.workunit.client.0.vm00.stdout:1/246: mkdir da/d24/d28/d56 0 2026-03-10T12:37:45.494 INFO:tasks.workunit.client.0.vm00.stdout:1/247: creat da/d12/d26/f57 x:0 0 0 2026-03-10T12:37:45.496 INFO:tasks.workunit.client.0.vm00.stdout:1/248: creat da/d24/d28/d56/f58 x:0 0 0 2026-03-10T12:37:45.496 INFO:tasks.workunit.client.1.vm07.stdout:9/355: dread d5/d13/f67 [0,4194304] 0 2026-03-10T12:37:45.497 INFO:tasks.workunit.client.0.vm00.stdout:1/249: unlink da/d12/c17 0 2026-03-10T12:37:45.498 INFO:tasks.workunit.client.1.vm07.stdout:9/356: dread d5/d13/d22/f36 [0,4194304] 0 2026-03-10T12:37:45.505 INFO:tasks.workunit.client.1.vm07.stdout:9/357: rmdir d5/d16/d23/d26 39 2026-03-10T12:37:45.507 INFO:tasks.workunit.client.0.vm00.stdout:0/298: creat d3/db/f6e x:0 0 0 2026-03-10T12:37:45.509 INFO:tasks.workunit.client.0.vm00.stdout:0/299: fdatasync d3/d7/d4c/d5b/f5f 0 2026-03-10T12:37:45.512 INFO:tasks.workunit.client.1.vm07.stdout:5/396: getdents d0/d22/d18/d80 0 2026-03-10T12:37:45.512 INFO:tasks.workunit.client.0.vm00.stdout:0/300: unlink d3/l18 0 2026-03-10T12:37:45.514 INFO:tasks.workunit.client.1.vm07.stdout:9/358: link d5/d13/f14 d5/d1f/d31/f82 0 2026-03-10T12:37:45.515 INFO:tasks.workunit.client.0.vm00.stdout:0/301: dwrite d3/d22/f42 [0,4194304] 0 2026-03-10T12:37:45.530 
INFO:tasks.workunit.client.1.vm07.stdout:2/267: dwrite d0/d42/f3c [0,4194304] 0 2026-03-10T12:37:45.530 INFO:tasks.workunit.client.1.vm07.stdout:5/397: dwrite d0/d22/f89 [0,4194304] 0 2026-03-10T12:37:45.530 INFO:tasks.workunit.client.0.vm00.stdout:0/302: mkdir d3/d22/d3a/d6f 0 2026-03-10T12:37:45.530 INFO:tasks.workunit.client.0.vm00.stdout:0/303: dread - d3/d7/d58/f68 zero size 2026-03-10T12:37:45.549 INFO:tasks.workunit.client.1.vm07.stdout:3/393: dwrite dc/dd/d28/d3b/f4d [0,4194304] 0 2026-03-10T12:37:45.550 INFO:tasks.workunit.client.0.vm00.stdout:0/304: link d3/db/d24/f2f d3/d7/f70 0 2026-03-10T12:37:45.551 INFO:tasks.workunit.client.1.vm07.stdout:2/268: dread d0/d42/d1f/d20/f2b [0,4194304] 0 2026-03-10T12:37:45.561 INFO:tasks.workunit.client.0.vm00.stdout:0/305: fdatasync d3/d7/d3c/f19 0 2026-03-10T12:37:45.562 INFO:tasks.workunit.client.0.vm00.stdout:0/306: fsync d3/d7/d3c/f30 0 2026-03-10T12:37:45.568 INFO:tasks.workunit.client.0.vm00.stdout:1/250: write da/f13 [1601946,108195] 0 2026-03-10T12:37:45.568 INFO:tasks.workunit.client.1.vm07.stdout:3/394: mknod dc/c8b 0 2026-03-10T12:37:45.570 INFO:tasks.workunit.client.0.vm00.stdout:0/307: creat d3/d22/f71 x:0 0 0 2026-03-10T12:37:45.571 INFO:tasks.workunit.client.0.vm00.stdout:0/308: fsync d3/d7/d4c/d5b/d38/d44/f49 0 2026-03-10T12:37:45.575 INFO:tasks.workunit.client.0.vm00.stdout:1/251: dread da/d12/f30 [0,4194304] 0 2026-03-10T12:37:45.575 INFO:tasks.workunit.client.0.vm00.stdout:1/252: chown da/d24/d28/d56 2888460 1 2026-03-10T12:37:45.575 INFO:tasks.workunit.client.0.vm00.stdout:1/253: chown da/d24/d28/l2a 6 1 2026-03-10T12:37:45.576 INFO:tasks.workunit.client.1.vm07.stdout:3/395: creat dc/dd/d1f/d6f/f8c x:0 0 0 2026-03-10T12:37:45.576 INFO:tasks.workunit.client.0.vm00.stdout:1/254: fsync da/d24/f45 0 2026-03-10T12:37:45.578 INFO:tasks.workunit.client.0.vm00.stdout:1/255: getdents da/d12/d26 0 2026-03-10T12:37:45.580 INFO:tasks.workunit.client.1.vm07.stdout:2/269: creat d0/d42/d26/f5a x:0 0 0 
2026-03-10T12:37:45.585 INFO:tasks.workunit.client.1.vm07.stdout:3/396: rename dc/d18/l40 to dc/dd/d28/l8d 0
2026-03-10T12:37:45.585 INFO:tasks.workunit.client.1.vm07.stdout:3/397: read dc/dd/d28/f46 [3771324,128265] 0
2026-03-10T12:37:45.590 INFO:tasks.workunit.client.1.vm07.stdout:3/398: dwrite dc/d18/f79 [0,4194304] 0
2026-03-10T12:37:45.592 INFO:tasks.workunit.client.1.vm07.stdout:2/270: dwrite d0/d42/d26/d4b/f51 [0,4194304] 0
2026-03-10T12:37:45.594 INFO:tasks.workunit.client.0.vm00.stdout:7/224: link da/d26/c34 da/d26/d50/c54 0
2026-03-10T12:37:45.596 INFO:tasks.workunit.client.1.vm07.stdout:3/399: stat dc/d18/l82 0
2026-03-10T12:37:45.609 INFO:tasks.workunit.client.0.vm00.stdout:0/309: link d3/d40/f59 d3/d7/d3c/f72 0
2026-03-10T12:37:45.610 INFO:tasks.workunit.client.0.vm00.stdout:0/310: chown d3/d7/d3c/c6b 604 1
2026-03-10T12:37:45.614 INFO:tasks.workunit.client.1.vm07.stdout:2/271: mkdir d0/d5b 0
2026-03-10T12:37:45.618 INFO:tasks.workunit.client.0.vm00.stdout:7/225: dread f1 [4194304,4194304] 0
2026-03-10T12:37:45.629 INFO:tasks.workunit.client.1.vm07.stdout:2/272: creat d0/d42/d26/d38/d4f/f5c x:0 0 0
2026-03-10T12:37:45.629 INFO:tasks.workunit.client.1.vm07.stdout:2/273: mkdir d0/d42/d26/d38/d4f/d5d 0
2026-03-10T12:37:45.629 INFO:tasks.workunit.client.0.vm00.stdout:7/226: fsync da/d1b/f39 0
2026-03-10T12:37:45.629 INFO:tasks.workunit.client.0.vm00.stdout:7/227: dwrite da/d25/d2e/f43 [0,4194304] 0
2026-03-10T12:37:45.629 INFO:tasks.workunit.client.0.vm00.stdout:7/228: chown da/c4d 15623 1
2026-03-10T12:37:45.629 INFO:tasks.workunit.client.0.vm00.stdout:7/229: write da/d25/d2c/f4f [2379818,24829] 0
2026-03-10T12:37:45.631 INFO:tasks.workunit.client.1.vm07.stdout:2/274: rename d0/d42/d26/d4b/c59 to d0/d45/d54/c5e 0
2026-03-10T12:37:45.632 INFO:tasks.workunit.client.1.vm07.stdout:2/275: write d0/d42/d26/f5a [324246,87591] 0
2026-03-10T12:37:45.638 INFO:tasks.workunit.client.1.vm07.stdout:2/276: creat d0/d42/f5f x:0 0 0
2026-03-10T12:37:45.656 INFO:tasks.workunit.client.0.vm00.stdout:6/247: rename d2/d51/f58 to d2/d14/f5d 0
2026-03-10T12:37:45.657 INFO:tasks.workunit.client.0.vm00.stdout:6/248: stat d2/d16/d29/d31 0
2026-03-10T12:37:45.658 INFO:tasks.workunit.client.1.vm07.stdout:9/359: write d5/d16/f19 [7678030,116078] 0
2026-03-10T12:37:45.658 INFO:tasks.workunit.client.1.vm07.stdout:9/360: chown d5/f45 476825476 1
2026-03-10T12:37:45.659 INFO:tasks.workunit.client.0.vm00.stdout:8/192: rename d0/l2 to d0/d12/l3a 0
2026-03-10T12:37:45.662 INFO:tasks.workunit.client.1.vm07.stdout:9/361: creat d5/d13/d22/f83 x:0 0 0
2026-03-10T12:37:45.663 INFO:tasks.workunit.client.0.vm00.stdout:3/231: rename dd/d27/l2e to dd/d27/d2c/d34/d45/l52 0
2026-03-10T12:37:45.674 INFO:tasks.workunit.client.0.vm00.stdout:3/232: write dd/d27/f35 [3900581,9603] 0
2026-03-10T12:37:45.674 INFO:tasks.workunit.client.0.vm00.stdout:3/233: truncate dd/d27/d2c/d34/d38/f48 4457285 0
2026-03-10T12:37:45.674 INFO:tasks.workunit.client.0.vm00.stdout:8/193: symlink d0/d12/d17/l3b 0
2026-03-10T12:37:45.674 INFO:tasks.workunit.client.0.vm00.stdout:6/249: creat d2/f5e x:0 0 0
2026-03-10T12:37:45.674 INFO:tasks.workunit.client.0.vm00.stdout:6/250: chown d2/da/c1f 29920520 1
2026-03-10T12:37:45.674 INFO:tasks.workunit.client.0.vm00.stdout:6/251: chown d2/da/dc/c5b 40036 1
2026-03-10T12:37:45.674 INFO:tasks.workunit.client.0.vm00.stdout:2/205: rename d4/d6/c27 to d4/d6/d2d/d31/d32/c47 0
2026-03-10T12:37:45.674 INFO:tasks.workunit.client.0.vm00.stdout:6/252: dread - d2/d39/f46 zero size
2026-03-10T12:37:45.674 INFO:tasks.workunit.client.0.vm00.stdout:3/234: unlink dd/d18/c1a 0
2026-03-10T12:37:45.674 INFO:tasks.workunit.client.0.vm00.stdout:6/253: symlink d2/d16/d29/l5f 0
2026-03-10T12:37:45.675 INFO:tasks.workunit.client.1.vm07.stdout:0/419: sync
2026-03-10T12:37:45.676 INFO:tasks.workunit.client.0.vm00.stdout:9/239: rename d0/d3d/d43/d53/d57/d35 to d0/d3d/d59 0
2026-03-10T12:37:45.679 INFO:tasks.workunit.client.0.vm00.stdout:9/240: dwrite d0/f21 [4194304,4194304] 0
2026-03-10T12:37:45.682 INFO:tasks.workunit.client.1.vm07.stdout:1/342: sync
2026-03-10T12:37:45.682 INFO:tasks.workunit.client.1.vm07.stdout:7/338: sync
2026-03-10T12:37:45.682 INFO:tasks.workunit.client.1.vm07.stdout:4/465: sync
2026-03-10T12:37:45.682 INFO:tasks.workunit.client.1.vm07.stdout:8/382: sync
2026-03-10T12:37:45.682 INFO:tasks.workunit.client.1.vm07.stdout:6/306: sync
2026-03-10T12:37:45.684 INFO:tasks.workunit.client.0.vm00.stdout:3/235: creat dd/d3d/f53 x:0 0 0
2026-03-10T12:37:45.684 INFO:tasks.workunit.client.0.vm00.stdout:9/241: dread d0/d5/dc/f41 [0,4194304] 0
2026-03-10T12:37:45.685 INFO:tasks.workunit.client.1.vm07.stdout:1/343: chown d9/df/f15 1206884 1
2026-03-10T12:37:45.685 INFO:tasks.workunit.client.1.vm07.stdout:5/398: dwrite d0/f47 [0,4194304] 0
2026-03-10T12:37:45.686 INFO:tasks.workunit.client.1.vm07.stdout:4/466: chown d0/d4/l88 121 1
2026-03-10T12:37:45.687 INFO:tasks.workunit.client.1.vm07.stdout:0/420: mkdir d0/d14/d5f/d41/d86 0
2026-03-10T12:37:45.689 INFO:tasks.workunit.client.1.vm07.stdout:4/467: readlink d0/d4/d5/d34/l5e 0
2026-03-10T12:37:45.690 INFO:tasks.workunit.client.0.vm00.stdout:5/208: rename f16 to d1f/f46 0
2026-03-10T12:37:45.697 INFO:tasks.workunit.client.0.vm00.stdout:4/202: rename df/d24 to df/d1f/d36/d3a/d41 0
2026-03-10T12:37:45.699 INFO:tasks.workunit.client.1.vm07.stdout:5/399: dwrite d0/d22/d18/d19/d36/f3d [0,4194304] 0
2026-03-10T12:37:45.699 INFO:tasks.workunit.client.0.vm00.stdout:6/254: mknod d2/da/dc/c60 0
2026-03-10T12:37:45.706 INFO:tasks.workunit.client.0.vm00.stdout:2/206: dread d4/f28 [0,4194304] 0
2026-03-10T12:37:45.708 INFO:tasks.workunit.client.0.vm00.stdout:2/207: truncate d4/d6/f2e 2060388 0
2026-03-10T12:37:45.709 INFO:tasks.workunit.client.0.vm00.stdout:2/208: dwrite d4/d6/f2b [0,4194304] 0
2026-03-10T12:37:45.712 INFO:tasks.workunit.client.0.vm00.stdout:9/242: creat d0/d3d/d43/d53/f5a x:0 0 0
2026-03-10T12:37:45.713 INFO:tasks.workunit.client.0.vm00.stdout:9/243: fsync d0/d5/f3b 0
2026-03-10T12:37:45.714 INFO:tasks.workunit.client.0.vm00.stdout:5/209: link d1f/d26/d2e/l33 d1f/d39/l47 0
2026-03-10T12:37:45.715 INFO:tasks.workunit.client.0.vm00.stdout:9/244: dread - d0/d3d/d59/f4a zero size
2026-03-10T12:37:45.718 INFO:tasks.workunit.client.0.vm00.stdout:0/311: dwrite d3/d7/d4c/d5b/f2b [4194304,4194304] 0
2026-03-10T12:37:45.719 INFO:tasks.workunit.client.1.vm07.stdout:3/400: dwrite dc/dd/f41 [0,4194304] 0
2026-03-10T12:37:45.719 INFO:tasks.workunit.client.0.vm00.stdout:1/256: dwrite f5 [0,4194304] 0
2026-03-10T12:37:45.724 INFO:tasks.workunit.client.0.vm00.stdout:7/230: write da/d1b/f22 [651081,123670] 0
2026-03-10T12:37:45.729 INFO:tasks.workunit.client.0.vm00.stdout:7/231: dwrite da/f16 [0,4194304] 0
2026-03-10T12:37:45.730 INFO:tasks.workunit.client.0.vm00.stdout:7/232: stat da/d1b/f39 0
2026-03-10T12:37:45.733 INFO:tasks.workunit.client.0.vm00.stdout:2/209: symlink d4/d6/d2d/d31/d32/l48 0
2026-03-10T12:37:45.737 INFO:tasks.workunit.client.1.vm07.stdout:2/277: write d0/d42/d1f/d20/f39 [913691,27367] 0
2026-03-10T12:37:45.738 INFO:tasks.workunit.client.1.vm07.stdout:2/278: readlink d0/l11 0
2026-03-10T12:37:45.740 INFO:tasks.workunit.client.0.vm00.stdout:4/203: rename df/d1f/d22/d26/f38 to df/f42 0
2026-03-10T12:37:45.741 INFO:tasks.workunit.client.0.vm00.stdout:5/210: write d1f/f22 [2043168,120006] 0
2026-03-10T12:37:45.749 INFO:tasks.workunit.client.0.vm00.stdout:1/257: mkdir da/d24/d28/d44/d59 0
2026-03-10T12:37:45.765 INFO:tasks.workunit.client.1.vm07.stdout:9/362: truncate d5/d16/f19 2565202 0
2026-03-10T12:37:45.765 INFO:tasks.workunit.client.1.vm07.stdout:1/344: chown d9/l25 3 1
2026-03-10T12:37:45.765 INFO:tasks.workunit.client.1.vm07.stdout:4/468: creat d0/fa1 x:0 0 0
2026-03-10T12:37:45.765 INFO:tasks.workunit.client.0.vm00.stdout:1/258: chown da/d21/d27/f54 0 1
2026-03-10T12:37:45.765 INFO:tasks.workunit.client.0.vm00.stdout:2/210: creat d4/d6/d41/f49 x:0 0 0
2026-03-10T12:37:45.765 INFO:tasks.workunit.client.0.vm00.stdout:4/204: mknod df/d32/c43 0
2026-03-10T12:37:45.765 INFO:tasks.workunit.client.0.vm00.stdout:9/245: link d0/d5/d16/d19/c23 d0/d3d/d43/c5b 0
2026-03-10T12:37:45.765 INFO:tasks.workunit.client.0.vm00.stdout:9/246: dread d0/d5/f3b [0,4194304] 0
2026-03-10T12:37:45.767 INFO:tasks.workunit.client.0.vm00.stdout:0/312: dread d3/f4 [0,4194304] 0
2026-03-10T12:37:45.768 INFO:tasks.workunit.client.1.vm07.stdout:5/400: creat d0/d22/d18/d19/d2e/d3f/d5c/f8c x:0 0 0
2026-03-10T12:37:45.768 INFO:tasks.workunit.client.1.vm07.stdout:6/307: chown d1/d4/d4a 517578 1
2026-03-10T12:37:45.769 INFO:tasks.workunit.client.0.vm00.stdout:2/211: mknod d4/d6/d2d/c4a 0
2026-03-10T12:37:45.771 INFO:tasks.workunit.client.0.vm00.stdout:5/211: creat d1f/d26/f48 x:0 0 0
2026-03-10T12:37:45.773 INFO:tasks.workunit.client.0.vm00.stdout:2/212: symlink d4/d6/l4b 0
2026-03-10T12:37:45.773 INFO:tasks.workunit.client.0.vm00.stdout:2/213: write d4/dd/f45 [398703,79664] 0
2026-03-10T12:37:45.773 INFO:tasks.workunit.client.0.vm00.stdout:0/313: creat d3/d7/d4c/f73 x:0 0 0
2026-03-10T12:37:45.774 INFO:tasks.workunit.client.0.vm00.stdout:2/214: chown d4/d6/d41/f49 432302 1
2026-03-10T12:37:45.776 INFO:tasks.workunit.client.0.vm00.stdout:5/212: mknod d1f/d26/d2b/c49 0
2026-03-10T12:37:45.778 INFO:tasks.workunit.client.0.vm00.stdout:2/215: creat d4/d6/d41/f4c x:0 0 0
2026-03-10T12:37:45.779 INFO:tasks.workunit.client.0.vm00.stdout:5/213: creat d1f/f4a x:0 0 0
2026-03-10T12:37:45.780 INFO:tasks.workunit.client.0.vm00.stdout:5/214: creat d1f/d26/d2b/f4b x:0 0 0
2026-03-10T12:37:45.781 INFO:tasks.workunit.client.1.vm07.stdout:2/279: rename d0/d42/f3c to d0/d42/d4e/d56/f60 0
2026-03-10T12:37:45.781 INFO:tasks.workunit.client.0.vm00.stdout:2/216: symlink d4/d6/d2d/d31/d32/d40/l4d 0
2026-03-10T12:37:45.781 INFO:tasks.workunit.client.0.vm00.stdout:2/217: fsync d4/d6/d2d/d31/f46 0
2026-03-10T12:37:45.788 INFO:tasks.workunit.client.0.vm00.stdout:2/218: creat d4/d6/f4e x:0 0 0
2026-03-10T12:37:45.788 INFO:tasks.workunit.client.0.vm00.stdout:2/219: stat d4/d6 0
2026-03-10T12:37:45.788 INFO:tasks.workunit.client.0.vm00.stdout:2/220: fsync f1 0
2026-03-10T12:37:45.793 INFO:tasks.workunit.client.0.vm00.stdout:2/221: fsync d4/d6/f30 0
2026-03-10T12:37:45.794 INFO:tasks.workunit.client.0.vm00.stdout:5/215: dread d1f/f2c [0,4194304] 0
2026-03-10T12:37:45.795 INFO:tasks.workunit.client.0.vm00.stdout:2/222: rename d4/d6/d2d/d31/d32/d40/l4d to d4/d6/d2d/d31/l4f 0
2026-03-10T12:37:45.796 INFO:tasks.workunit.client.0.vm00.stdout:2/223: fsync d4/d6/d2d/d31/f46 0
2026-03-10T12:37:45.798 INFO:tasks.workunit.client.1.vm07.stdout:7/339: mkdir d0/d57/d62/d6d 0
2026-03-10T12:37:45.798 INFO:tasks.workunit.client.0.vm00.stdout:5/216: creat d1f/d26/d2b/d37/f4c x:0 0 0
2026-03-10T12:37:45.798 INFO:tasks.workunit.client.0.vm00.stdout:5/217: read - d1f/f4a zero size
2026-03-10T12:37:45.798 INFO:tasks.workunit.client.1.vm07.stdout:4/469: rmdir d0/d4/d10 39
2026-03-10T12:37:45.801 INFO:tasks.workunit.client.0.vm00.stdout:0/314: fdatasync d3/d40/f4e 0
2026-03-10T12:37:45.801 INFO:tasks.workunit.client.0.vm00.stdout:5/218: rename d1f/d26/d2b/c49 to d1f/d26/c4d 0
2026-03-10T12:37:45.801 INFO:tasks.workunit.client.0.vm00.stdout:5/219: chown d1f/d39 186969553 1
2026-03-10T12:37:45.805 INFO:tasks.workunit.client.0.vm00.stdout:5/220: dwrite d1f/f30 [0,4194304] 0
2026-03-10T12:37:45.810 INFO:tasks.workunit.client.0.vm00.stdout:5/221: mknod d1f/d26/d2e/c4e 0
2026-03-10T12:37:45.810 INFO:tasks.workunit.client.0.vm00.stdout:5/222: read - d1f/d26/d2b/f44 zero size
2026-03-10T12:37:45.814 INFO:tasks.workunit.client.0.vm00.stdout:5/223: mknod d1f/d26/c4f 0
2026-03-10T12:37:45.823 INFO:tasks.workunit.client.0.vm00.stdout:5/224: creat d1f/d26/d2b/d35/f50 x:0 0 0
2026-03-10T12:37:45.826 INFO:tasks.workunit.client.1.vm07.stdout:1/345: dread d9/f1f [0,4194304] 0
2026-03-10T12:37:45.826 INFO:tasks.workunit.client.0.vm00.stdout:0/315: getdents d3/d7/d3c 0
2026-03-10T12:37:45.830 INFO:tasks.workunit.client.0.vm00.stdout:5/225: unlink d1f/c20 0
2026-03-10T12:37:45.853 INFO:tasks.workunit.client.1.vm07.stdout:7/340: unlink d0/d47/d48/f4c 0
2026-03-10T12:37:45.853 INFO:tasks.workunit.client.1.vm07.stdout:9/363: creat d5/d1f/d31/d76/f84 x:0 0 0
2026-03-10T12:37:45.853 INFO:tasks.workunit.client.1.vm07.stdout:7/341: truncate d0/d61/f69 1023983 0
2026-03-10T12:37:45.853 INFO:tasks.workunit.client.1.vm07.stdout:6/308: getdents d1/d4/d6/d16/d1a/d2c/d5b 0
2026-03-10T12:37:45.853 INFO:tasks.workunit.client.0.vm00.stdout:0/316: rmdir d3/d22/d3a/d6f 0
2026-03-10T12:37:45.853 INFO:tasks.workunit.client.0.vm00.stdout:0/317: dwrite d3/db/f6e [0,4194304] 0
2026-03-10T12:37:45.853 INFO:tasks.workunit.client.0.vm00.stdout:0/318: mkdir d3/d7/d3c/d74 0
2026-03-10T12:37:45.853 INFO:tasks.workunit.client.0.vm00.stdout:0/319: fdatasync d3/db/d24/f2f 0
2026-03-10T12:37:45.853 INFO:tasks.workunit.client.1.vm07.stdout:4/470: link d0/d4/d7a/f27 d0/d4/d7a/d46/d76/fa2 0
2026-03-10T12:37:45.854 INFO:tasks.workunit.client.1.vm07.stdout:2/280: rmdir d0/d42/d4e/d56/d57 0
2026-03-10T12:37:45.854 INFO:tasks.workunit.client.1.vm07.stdout:6/309: unlink d1/d4/d6/l27 0
2026-03-10T12:37:45.856 INFO:tasks.workunit.client.1.vm07.stdout:9/364: read d5/d13/d22/f32 [109402,124263] 0
2026-03-10T12:37:45.856 INFO:tasks.workunit.client.1.vm07.stdout:2/281: truncate d0/d42/d26/d4b/f58 880865 0
2026-03-10T12:37:45.857 INFO:tasks.workunit.client.1.vm07.stdout:2/282: read d0/d42/d26/d38/f3a [289718,127905] 0
2026-03-10T12:37:45.858 INFO:tasks.workunit.client.1.vm07.stdout:2/283: dread - d0/d42/f5f zero size
2026-03-10T12:37:45.861 INFO:tasks.workunit.client.1.vm07.stdout:5/401: rename d0/d22/d18/d19/l7c to d0/d22/d18/d19/l8d 0
2026-03-10T12:37:45.861 INFO:tasks.workunit.client.0.vm00.stdout:9/247: dread d0/d5/d16/d19/f20 [4194304,4194304] 0
2026-03-10T12:37:45.862 INFO:tasks.workunit.client.1.vm07.stdout:4/471: creat d0/d4/d5/d34/fa3 x:0 0 0
2026-03-10T12:37:45.864 INFO:tasks.workunit.client.1.vm07.stdout:6/310: rename d1/d4/d6/d4e/f51 to d1/d4/d6/d16/f63 0
2026-03-10T12:37:45.866 INFO:tasks.workunit.client.1.vm07.stdout:9/365: mkdir d5/d13/d57/d3e/d85 0
2026-03-10T12:37:45.867 INFO:tasks.workunit.client.1.vm07.stdout:9/366: readlink d5/d13/d22/l5d 0
2026-03-10T12:37:45.868 INFO:tasks.workunit.client.1.vm07.stdout:2/284: creat d0/d42/d4e/f61 x:0 0 0
2026-03-10T12:37:45.870 INFO:tasks.workunit.client.1.vm07.stdout:4/472: chown d0/d4/d10/d3c/d2b/d2d/f65 476 1
2026-03-10T12:37:45.871 INFO:tasks.workunit.client.1.vm07.stdout:6/311: rmdir d1/d4/d4a 39
2026-03-10T12:37:45.873 INFO:tasks.workunit.client.1.vm07.stdout:4/473: symlink d0/d4/d7a/d46/d76/la4 0
2026-03-10T12:37:45.874 INFO:tasks.workunit.client.1.vm07.stdout:9/367: creat d5/d16/d23/d26/f86 x:0 0 0
2026-03-10T12:37:45.876 INFO:tasks.workunit.client.1.vm07.stdout:4/474: symlink d0/d4/la5 0
2026-03-10T12:37:45.876 INFO:tasks.workunit.client.1.vm07.stdout:9/368: stat d5/d13/d57/d3e/f53 0
2026-03-10T12:37:45.878 INFO:tasks.workunit.client.1.vm07.stdout:4/475: chown d0/d4/d10/d9a/l6a 1548 1
2026-03-10T12:37:45.878 INFO:tasks.workunit.client.1.vm07.stdout:9/369: write d5/d16/d23/d26/f46 [3773083,68613] 0
2026-03-10T12:37:45.878 INFO:tasks.workunit.client.1.vm07.stdout:6/312: dwrite d1/d4/f5a [0,4194304] 0
2026-03-10T12:37:45.882 INFO:tasks.workunit.client.1.vm07.stdout:9/370: symlink d5/d13/d6c/l87 0
2026-03-10T12:37:45.882 INFO:tasks.workunit.client.1.vm07.stdout:9/371: chown d5/d13/d57 666 1
2026-03-10T12:37:45.892 INFO:tasks.workunit.client.1.vm07.stdout:6/313: rmdir d1/d4/d6/d16/d1a/d2c/d5b 0
2026-03-10T12:37:45.902 INFO:tasks.workunit.client.1.vm07.stdout:6/314: mkdir d1/d4/d6/d4e/d64 0
2026-03-10T12:37:45.902 INFO:tasks.workunit.client.1.vm07.stdout:2/285: dread d0/f18 [0,4194304] 0
2026-03-10T12:37:45.902 INFO:tasks.workunit.client.1.vm07.stdout:2/286: chown d0/c16 570054 1
2026-03-10T12:37:45.902 INFO:tasks.workunit.client.1.vm07.stdout:2/287: mkdir d0/d42/d26/d38/d4f/d62 0
2026-03-10T12:37:45.902 INFO:tasks.workunit.client.1.vm07.stdout:6/315: dwrite d1/d4/d6/d16/d1a/d33/f37 [0,4194304] 0
2026-03-10T12:37:45.915 INFO:tasks.workunit.client.1.vm07.stdout:9/372: dread d5/f1c [0,4194304] 0
2026-03-10T12:37:45.925 INFO:tasks.workunit.client.0.vm00.stdout:0/320: dread d3/d7/f31 [0,4194304] 0
2026-03-10T12:37:45.929 INFO:tasks.workunit.client.0.vm00.stdout:0/321: dwrite d3/d7/d4c/d5b/f37 [4194304,4194304] 0
2026-03-10T12:37:45.934 INFO:tasks.workunit.client.0.vm00.stdout:0/322: mknod d3/d7/d4c/d5b/c75 0
2026-03-10T12:37:45.941 INFO:tasks.workunit.client.0.vm00.stdout:0/323: creat d3/d7/d4c/f76 x:0 0 0
2026-03-10T12:37:45.941 INFO:tasks.workunit.client.0.vm00.stdout:0/324: read d3/d7/f1c [494695,8855] 0
2026-03-10T12:37:45.952 INFO:tasks.workunit.client.1.vm07.stdout:5/402: dread d0/d22/d18/f4c [0,4194304] 0
2026-03-10T12:37:45.957 INFO:tasks.workunit.client.0.vm00.stdout:2/224: sync
2026-03-10T12:37:45.965 INFO:tasks.workunit.client.0.vm00.stdout:5/226: sync
2026-03-10T12:37:45.969 INFO:tasks.workunit.client.0.vm00.stdout:5/227: unlink d1f/d26/d2b/f4b 0
2026-03-10T12:37:46.000 INFO:tasks.workunit.client.1.vm07.stdout:0/421: dread d0/d14/d5f/d76/f78 [0,4194304] 0
2026-03-10T12:37:46.018 INFO:tasks.workunit.client.0.vm00.stdout:8/194: getdents d0 0
2026-03-10T12:37:46.019 INFO:tasks.workunit.client.0.vm00.stdout:3/236: getdents dd/d18 0
2026-03-10T12:37:46.019 INFO:tasks.workunit.client.0.vm00.stdout:3/237: write dd/f25 [2582097,101958] 0
2026-03-10T12:37:46.024 INFO:tasks.workunit.client.0.vm00.stdout:3/238: creat dd/d3d/f54 x:0 0 0
2026-03-10T12:37:46.027 INFO:tasks.workunit.client.0.vm00.stdout:3/239: dwrite dd/d3d/f54 [0,4194304] 0
2026-03-10T12:37:46.068 INFO:tasks.workunit.client.0.vm00.stdout:6/255: dwrite d2/da/dc/f25 [0,4194304] 0
2026-03-10T12:37:46.071 INFO:tasks.workunit.client.0.vm00.stdout:6/256: truncate d2/d16/d29/d31/d48/f59 815772 0
2026-03-10T12:37:46.072 INFO:tasks.workunit.client.0.vm00.stdout:6/257: mknod d2/d39/c61 0
2026-03-10T12:37:46.077 INFO:tasks.workunit.client.0.vm00.stdout:7/233: dwrite da/d25/f2b [4194304,4194304] 0
2026-03-10T12:37:46.078 INFO:tasks.workunit.client.1.vm07.stdout:8/383: write d1/d3/f1d [1428911,113050] 0
2026-03-10T12:37:46.079 INFO:tasks.workunit.client.1.vm07.stdout:3/401: write dc/d18/f36 [4007239,20750] 0
2026-03-10T12:37:46.090 INFO:tasks.workunit.client.0.vm00.stdout:1/259: write da/d12/f30 [2191550,120533] 0
2026-03-10T12:37:46.092 INFO:tasks.workunit.client.1.vm07.stdout:8/384: creat d1/d3/d6/d7b/f7c x:0 0 0
2026-03-10T12:37:46.095 INFO:tasks.workunit.client.0.vm00.stdout:4/205: write df/f12 [4073180,11111] 0
2026-03-10T12:37:46.096 INFO:tasks.workunit.client.0.vm00.stdout:4/206: fdatasync df/d1f/d36/d3a/d41/f33 0
2026-03-10T12:37:46.097 INFO:tasks.workunit.client.0.vm00.stdout:4/207: read - df/f3d zero size
2026-03-10T12:37:46.100 INFO:tasks.workunit.client.0.vm00.stdout:4/208: dwrite fa [0,4194304] 0
2026-03-10T12:37:46.101 INFO:tasks.workunit.client.0.vm00.stdout:4/209: write df/d1f/d22/f3c [911671,78003] 0
2026-03-10T12:37:46.106 INFO:tasks.workunit.client.1.vm07.stdout:8/385: dwrite d1/d3/d40/f4c [0,4194304] 0
2026-03-10T12:37:46.107 INFO:tasks.workunit.client.0.vm00.stdout:4/210: creat df/d1f/d36/d3a/f44 x:0 0 0
2026-03-10T12:37:46.108 INFO:tasks.workunit.client.1.vm07.stdout:3/402: truncate dc/f17 1372367 0
2026-03-10T12:37:46.112 INFO:tasks.workunit.client.0.vm00.stdout:4/211: stat df/d1f/l21 0
2026-03-10T12:37:46.114 INFO:tasks.workunit.client.0.vm00.stdout:4/212: fsync df/f1c 0
2026-03-10T12:37:46.117 INFO:tasks.workunit.client.0.vm00.stdout:4/213: rmdir df/d1f/d22/d26 39
2026-03-10T12:37:46.120 INFO:tasks.workunit.client.0.vm00.stdout:4/214: stat df/d1f/d22/d26/c2c 0
2026-03-10T12:37:46.121 INFO:tasks.workunit.client.1.vm07.stdout:4/476: truncate d0/d4/d7a/f50 594824 0
2026-03-10T12:37:46.123 INFO:tasks.workunit.client.0.vm00.stdout:4/215: dwrite df/d1f/d22/d26/f39 [0,4194304] 0
2026-03-10T12:37:46.124 INFO:tasks.workunit.client.0.vm00.stdout:4/216: chown df/d1f/l21 1 1
2026-03-10T12:37:46.125 INFO:tasks.workunit.client.0.vm00.stdout:4/217: chown df/d1f/d22/f3c 11573 1
2026-03-10T12:37:46.125 INFO:tasks.workunit.client.0.vm00.stdout:4/218: chown df/d32 598144 1
2026-03-10T12:37:46.128 INFO:tasks.workunit.client.1.vm07.stdout:1/346: write d9/f1f [1576365,70301] 0
2026-03-10T12:37:46.132 INFO:tasks.workunit.client.0.vm00.stdout:7/234: unlink da/d41/d48/f51 0
2026-03-10T12:37:46.139 INFO:tasks.workunit.client.1.vm07.stdout:1/347: rmdir d9/df/d54 39
2026-03-10T12:37:46.144 INFO:tasks.workunit.client.0.vm00.stdout:4/219: dread df/d1f/d22/f30 [0,4194304] 0
2026-03-10T12:37:46.144 INFO:tasks.workunit.client.1.vm07.stdout:8/386: rename d1/d3/f25 to d1/d3/d6/d54/f7d 0
2026-03-10T12:37:46.144 INFO:tasks.workunit.client.1.vm07.stdout:7/342: dwrite d0/f39 [0,4194304] 0
2026-03-10T12:37:46.146 INFO:tasks.workunit.client.0.vm00.stdout:4/220: mknod df/d1f/d36/d40/c45 0
2026-03-10T12:37:46.146 INFO:tasks.workunit.client.0.vm00.stdout:4/221: stat df/f1c 0
2026-03-10T12:37:46.148 INFO:tasks.workunit.client.0.vm00.stdout:5/228: truncate d1f/d26/f28 157143 0
2026-03-10T12:37:46.149 INFO:tasks.workunit.client.0.vm00.stdout:5/229: chown d1f/d26/d2b/f44 62326 1
2026-03-10T12:37:46.149 INFO:tasks.workunit.client.0.vm00.stdout:5/230: read f11 [1050726,12023] 0
2026-03-10T12:37:46.151 INFO:tasks.workunit.client.0.vm00.stdout:4/222: symlink df/d1f/d36/d3a/d41/l46 0
2026-03-10T12:37:46.165 INFO:tasks.workunit.client.0.vm00.stdout:4/223: link df/d1f/d22/f30 df/d1f/d36/d3a/d41/f47 0
2026-03-10T12:37:46.165 INFO:tasks.workunit.client.0.vm00.stdout:7/235: symlink da/d25/d2e/d4c/l55 0
2026-03-10T12:37:46.172 INFO:tasks.workunit.client.0.vm00.stdout:3/240: truncate dd/d27/f35 3549355 0
2026-03-10T12:37:46.173 INFO:tasks.workunit.client.0.vm00.stdout:4/224: mknod df/c48 0
2026-03-10T12:37:46.174 INFO:tasks.workunit.client.0.vm00.stdout:4/225: write fb [2038773,37956] 0
2026-03-10T12:37:46.175 INFO:tasks.workunit.client.1.vm07.stdout:4/477: getdents d0/d4/d5/da/d66 0
2026-03-10T12:37:46.179 INFO:tasks.workunit.client.0.vm00.stdout:4/226: write df/f42 [128118,77366] 0
2026-03-10T12:37:46.217 INFO:tasks.workunit.client.1.vm07.stdout:4/478: dwrite d0/d4/d5/da/d66/f8c [4194304,4194304] 0
2026-03-10T12:37:46.217 INFO:tasks.workunit.client.1.vm07.stdout:4/479: chown d0 840 1
2026-03-10T12:37:46.217 INFO:tasks.workunit.client.1.vm07.stdout:2/288: write d0/d42/d26/f3e [902804,34079] 0
2026-03-10T12:37:46.217 INFO:tasks.workunit.client.1.vm07.stdout:9/373: dwrite d5/f65 [4194304,4194304] 0
2026-03-10T12:37:46.217 INFO:tasks.workunit.client.1.vm07.stdout:6/316: dwrite d1/d4/d4a/f55 [0,4194304] 0
2026-03-10T12:37:46.217 INFO:tasks.workunit.client.1.vm07.stdout:6/317: dread - d1/d4/d6/d53/f5e zero size
2026-03-10T12:37:46.217 INFO:tasks.workunit.client.1.vm07.stdout:4/480: fsync d0/d4/d5/da/f4d 0
2026-03-10T12:37:46.217 INFO:tasks.workunit.client.1.vm07.stdout:2/289: rename d0/d45/c4c to d0/d42/d26/d38/d4f/d62/c63 0
2026-03-10T12:37:46.217 INFO:tasks.workunit.client.1.vm07.stdout:9/374: creat d5/d13/d57/d4f/f88 x:0 0 0
2026-03-10T12:37:46.217 INFO:tasks.workunit.client.1.vm07.stdout:9/375: readlink d5/d16/l62 0
2026-03-10T12:37:46.217 INFO:tasks.workunit.client.0.vm00.stdout:4/227: chown df/d1f/d36/d3a/d41/l46 1 1
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:4/228: write df/d1f/d36/d3a/d41/f33 [1318786,98531] 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:3/241: getdents dd/d18/d13/d1d 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:3/242: chown dd/d18/d14/c20 247046 1
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:3/243: stat dd/l1f 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:1/260: dwrite da/f22 [0,4194304] 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:0/325: dwrite d3/d40/f4e [0,4194304] 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:0/326: write d3/f50 [1036187,60305] 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:0/327: write d3/d7/d4c/d5b/d38/d44/f49 [3233768,57011] 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:4/229: creat df/d1f/d36/d40/f49 x:0 0 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:1/261: mkdir da/d24/d5a 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:1/262: creat da/d12/d26/d42/f5b x:0 0 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:1/263: unlink da/d24/d28/d56/f58 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:1/264: stat da/f13 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:0/328: mkdir d3/db/d77 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:1/265: dread - da/d24/f47 zero size
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:0/329: dread - d3/d7/d58/f63 zero size
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:1/266: mknod da/d21/c5c 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:1/267: mkdir da/d24/d28/d44/d5d 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:1/268: symlink da/d24/d28/d44/d59/l5e 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:1/269: mknod da/d12/d26/d42/c5f 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:0/330: dread d3/d7/d4c/d5b/f2a [0,4194304] 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:1/270: write da/d24/f53 [857947,27916] 0
2026-03-10T12:37:46.218 INFO:tasks.workunit.client.0.vm00.stdout:0/331: dwrite d3/d7/d4c/d5b/f56 [0,4194304] 0
2026-03-10T12:37:46.219 INFO:tasks.workunit.client.1.vm07.stdout:0/422: dwrite d0/d14/d5f/d76/f3d [4194304,4194304] 0
2026-03-10T12:37:46.220 INFO:tasks.workunit.client.1.vm07.stdout:5/403: dwrite d0/f1f [4194304,4194304] 0
2026-03-10T12:37:46.221 INFO:tasks.workunit.client.0.vm00.stdout:0/332: write d3/d40/f4e [3272298,76604] 0
2026-03-10T12:37:46.222 INFO:tasks.workunit.client.1.vm07.stdout:5/404: read - d0/d22/d18/d19/d21/d54/f7d zero size
2026-03-10T12:37:46.226 INFO:tasks.workunit.client.1.vm07.stdout:2/290: mkdir d0/d29/d64 0
2026-03-10T12:37:46.228 INFO:tasks.workunit.client.1.vm07.stdout:6/318: mkdir d1/d4/d6/d43/d65 0
2026-03-10T12:37:46.238 INFO:tasks.workunit.client.1.vm07.stdout:9/376: truncate d5/d13/d22/f39 972441 0
2026-03-10T12:37:46.254 INFO:tasks.workunit.client.1.vm07.stdout:0/423: mkdir d0/d14/d5f/d76/d2f/d31/d4f/d60/d87 0
2026-03-10T12:37:46.277 INFO:tasks.workunit.client.1.vm07.stdout:1/348: getdents d9/df/d29/d2c/d59 0
2026-03-10T12:37:46.277 INFO:tasks.workunit.client.1.vm07.stdout:2/291: creat d0/d42/d26/d38/d4f/f65 x:0 0 0
2026-03-10T12:37:46.277 INFO:tasks.workunit.client.1.vm07.stdout:6/319: mkdir d1/d4/d6/d53/d66 0
2026-03-10T12:37:46.277 INFO:tasks.workunit.client.1.vm07.stdout:9/377: chown d5/l80 0 1
2026-03-10T12:37:46.277 INFO:tasks.workunit.client.1.vm07.stdout:6/320: truncate d1/d4/d4a/f56 735885 0
2026-03-10T12:37:46.277 INFO:tasks.workunit.client.1.vm07.stdout:5/405: symlink d0/d22/d18/d3e/l8e 0
2026-03-10T12:37:46.277 INFO:tasks.workunit.client.1.vm07.stdout:5/406: chown d0/d22/d18/d19/d72 0 1
2026-03-10T12:37:46.277 INFO:tasks.workunit.client.1.vm07.stdout:0/424: symlink d0/d14/d5f/d76/d2f/d31/d79/d85/l88 0
2026-03-10T12:37:46.277 INFO:tasks.workunit.client.1.vm07.stdout:1/349: creat d9/df/d55/f6f x:0 0 0
2026-03-10T12:37:46.277 INFO:tasks.workunit.client.1.vm07.stdout:1/350: write d9/df/f15 [192931,32423] 0
2026-03-10T12:37:46.277 INFO:tasks.workunit.client.1.vm07.stdout:6/321: truncate d1/f1e 2307493 0
2026-03-10T12:37:46.277 INFO:tasks.workunit.client.1.vm07.stdout:6/322: readlink d1/l14 0
2026-03-10T12:37:46.277 INFO:tasks.workunit.client.1.vm07.stdout:5/407: getdents d0/d22 0
2026-03-10T12:37:46.277 INFO:tasks.workunit.client.1.vm07.stdout:0/425: getdents d0/d62 0
2026-03-10T12:37:46.277 INFO:tasks.workunit.client.1.vm07.stdout:0/426: truncate d0/d14/d5f/d76/d2f/d31/d79/f7b 4257360 0
2026-03-10T12:37:46.278 INFO:tasks.workunit.client.0.vm00.stdout:4/230: fdatasync df/d1f/d22/f3c 0
2026-03-10T12:37:46.279 INFO:tasks.workunit.client.0.vm00.stdout:4/231: fdatasync df/f1e 0
2026-03-10T12:37:46.279 INFO:tasks.workunit.client.0.vm00.stdout:4/232: stat le 0
2026-03-10T12:37:46.280 INFO:tasks.workunit.client.0.vm00.stdout:4/233: truncate df/f3d 12667 0
2026-03-10T12:37:46.282 INFO:tasks.workunit.client.0.vm00.stdout:4/234: dread df/f11 [0,4194304] 0
2026-03-10T12:37:46.283 INFO:tasks.workunit.client.0.vm00.stdout:4/235: symlink df/d32/l4a 0
2026-03-10T12:37:46.286 INFO:tasks.workunit.client.0.vm00.stdout:4/236: dwrite df/f1c [4194304,4194304] 0
2026-03-10T12:37:46.290 INFO:tasks.workunit.client.0.vm00.stdout:4/237: symlink df/d1f/d22/l4b 0
2026-03-10T12:37:46.297 INFO:tasks.workunit.client.0.vm00.stdout:4/238: creat df/d1f/d22/f4c x:0 0 0
2026-03-10T12:37:46.297 INFO:tasks.workunit.client.0.vm00.stdout:4/239: stat df/d32/c43 0
2026-03-10T12:37:46.297 INFO:tasks.workunit.client.0.vm00.stdout:4/240: write df/f16 [597942,109971] 0
2026-03-10T12:37:46.298 INFO:tasks.workunit.client.0.vm00.stdout:4/241: creat df/d1f/f4d x:0 0 0
2026-03-10T12:37:46.299 INFO:tasks.workunit.client.0.vm00.stdout:4/242: write df/d1f/d36/d40/f49 [258562,115609] 0
2026-03-10T12:37:46.300 INFO:tasks.workunit.client.0.vm00.stdout:4/243: stat df/d1f/d22/d26/d2e/c37 0
2026-03-10T12:37:46.304 INFO:tasks.workunit.client.0.vm00.stdout:7/236: sync
2026-03-10T12:37:46.383 INFO:tasks.workunit.client.0.vm00.stdout:0/333: sync
2026-03-10T12:37:46.383 INFO:tasks.workunit.client.0.vm00.stdout:7/237: sync
2026-03-10T12:37:46.386 INFO:tasks.workunit.client.0.vm00.stdout:7/238: chown da/d26/d50/c54 0 1
2026-03-10T12:37:46.388 INFO:tasks.workunit.client.0.vm00.stdout:7/239: write da/d25/d2c/f30 [3035788,8013] 0
2026-03-10T12:37:46.388 INFO:tasks.workunit.client.0.vm00.stdout:0/334: dwrite d3/d7/d4c/d5b/f2b [0,4194304] 0
2026-03-10T12:37:46.389 INFO:tasks.workunit.client.0.vm00.stdout:0/335: truncate d3/d33/f64 873722 0
2026-03-10T12:37:46.389 INFO:tasks.workunit.client.0.vm00.stdout:7/240: mkdir da/d26/d37/d56 0
2026-03-10T12:37:46.389 INFO:tasks.workunit.client.0.vm00.stdout:0/336: chown d3/d33/f53 1 1
2026-03-10T12:37:46.391 INFO:tasks.workunit.client.0.vm00.stdout:7/241: symlink da/d3f/l57 0
2026-03-10T12:37:46.391 INFO:tasks.workunit.client.0.vm00.stdout:7/242: write da/d25/d2e/f43 [4320776,79409] 0
2026-03-10T12:37:46.391 INFO:tasks.workunit.client.0.vm00.stdout:0/337: write d3/d22/f46 [338765,51205] 0
2026-03-10T12:37:46.398 INFO:tasks.workunit.client.0.vm00.stdout:7/243: dwrite f9 [0,4194304] 0
2026-03-10T12:37:46.402 INFO:tasks.workunit.client.0.vm00.stdout:7/244: dwrite f9 [0,4194304] 0
2026-03-10T12:37:46.408 INFO:tasks.workunit.client.0.vm00.stdout:0/338: creat d3/d7/d3c/d74/f78 x:0 0 0
2026-03-10T12:37:46.422 INFO:tasks.workunit.client.0.vm00.stdout:7/245: mkdir da/d25/d2c/d58 0
2026-03-10T12:37:46.423 INFO:tasks.workunit.client.0.vm00.stdout:0/339: creat d3/d7/d3c/d4b/f79 x:0 0 0
2026-03-10T12:37:46.426 INFO:tasks.workunit.client.0.vm00.stdout:0/340: creat d3/d40/f7a x:0 0 0
2026-03-10T12:37:46.427 INFO:tasks.workunit.client.1.vm07.stdout:7/343: sync
2026-03-10T12:37:46.427 INFO:tasks.workunit.client.0.vm00.stdout:7/246: write da/d25/f4e [621323,51452] 0
2026-03-10T12:37:46.428 INFO:tasks.workunit.client.1.vm07.stdout:0/427: dread d0/f1d [0,4194304] 0
2026-03-10T12:37:46.428 INFO:tasks.workunit.client.1.vm07.stdout:7/344: fdatasync d0/f56 0
2026-03-10T12:37:46.429 INFO:tasks.workunit.client.0.vm00.stdout:0/341: rmdir d3/d7/d3c 39
2026-03-10T12:37:46.434 INFO:tasks.workunit.client.0.vm00.stdout:0/342: symlink d3/d7/d3c/l7b 0
2026-03-10T12:37:46.436 INFO:tasks.workunit.client.0.vm00.stdout:0/343: mknod d3/db/d24/c7c 0
2026-03-10T12:37:46.438 INFO:tasks.workunit.client.1.vm07.stdout:0/428: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/f89 x:0 0 0
2026-03-10T12:37:46.438 INFO:tasks.workunit.client.1.vm07.stdout:7/345: dwrite d0/d47/f59 [0,4194304] 0
2026-03-10T12:37:46.453 INFO:tasks.workunit.client.0.vm00.stdout:1/271: dread da/d12/d26/f2e [0,4194304] 0
2026-03-10T12:37:46.454 INFO:tasks.workunit.client.0.vm00.stdout:1/272: symlink da/d21/d27/l60 0
2026-03-10T12:37:46.457 INFO:tasks.workunit.client.1.vm07.stdout:7/346: dwrite d0/d61/f66 [0,4194304] 0
2026-03-10T12:37:46.465 INFO:tasks.workunit.client.1.vm07.stdout:0/429: getdents d0/d14/d5f/d76/d2f/d31 0
2026-03-10T12:37:46.472 INFO:tasks.workunit.client.1.vm07.stdout:0/430: readlink d0/d14/l17 0
2026-03-10T12:37:46.479 INFO:tasks.workunit.client.0.vm00.stdout:8/195: dread d0/f10 [0,4194304] 0
2026-03-10T12:37:46.479 INFO:tasks.workunit.client.0.vm00.stdout:8/196: chown d0/l14 71797 1
2026-03-10T12:37:46.490 INFO:tasks.workunit.client.0.vm00.stdout:3/244: dread f7 [0,4194304] 0
2026-03-10T12:37:46.502 INFO:tasks.workunit.client.1.vm07.stdout:0/431: dread d0/d14/d5f/d76/d2f/d31/f4d [0,4194304] 0
2026-03-10T12:37:46.502 INFO:tasks.workunit.client.0.vm00.stdout:3/245: readlink dd/d27/d2c/d34/l49 0
2026-03-10T12:37:46.502 INFO:tasks.workunit.client.0.vm00.stdout:3/246: dread dd/d3d/f54 [0,4194304] 0
2026-03-10T12:37:46.508 INFO:tasks.workunit.client.0.vm00.stdout:4/244: fdatasync df/f16 0
2026-03-10T12:37:46.510 INFO:tasks.workunit.client.0.vm00.stdout:4/245: creat df/f4e x:0 0 0
2026-03-10T12:37:46.574 INFO:tasks.workunit.client.0.vm00.stdout:3/247: write dd/d27/f35 [2516497,64895] 0
2026-03-10T12:37:46.576 INFO:tasks.workunit.client.0.vm00.stdout:9/248: rename d0/d5/f3a to d0/d5/d16/d19/f5c 0
2026-03-10T12:37:46.578 INFO:tasks.workunit.client.0.vm00.stdout:2/225: rename d4/d6/d41/f49 to d4/d6/d2d/d31/d32/d40/f50 0
2026-03-10T12:37:46.580 INFO:tasks.workunit.client.0.vm00.stdout:2/226: dread - d4/d6/d2d/d31/f46 zero size
2026-03-10T12:37:46.581 INFO:tasks.workunit.client.0.vm00.stdout:9/249: creat d0/f5d x:0 0 0
2026-03-10T12:37:46.582 INFO:tasks.workunit.client.0.vm00.stdout:5/231: rename l1c to d1f/d26/d2b/d35/l51 0
2026-03-10T12:37:46.583 INFO:tasks.workunit.client.0.vm00.stdout:5/232: fsync d1f/d26/d2e/f3a 0
2026-03-10T12:37:46.585 INFO:tasks.workunit.client.0.vm00.stdout:9/250: unlink d0/d3d/d43/d53/f5a 0
2026-03-10T12:37:46.585 INFO:tasks.workunit.client.0.vm00.stdout:9/251: dread - d0/d3d/d59/f45 zero size
2026-03-10T12:37:46.586 INFO:tasks.workunit.client.0.vm00.stdout:1/273: rename da/d12/d26/d42/l4e to da/d24/d28/d44/d5d/l61 0
2026-03-10T12:37:46.587 INFO:tasks.workunit.client.0.vm00.stdout:3/248: dread dd/d18/d13/d1d/f42 [0,4194304] 0
2026-03-10T12:37:46.587 INFO:tasks.workunit.client.0.vm00.stdout:1/274: dread da/d12/f20 [0,4194304] 0
2026-03-10T12:37:46.588 INFO:tasks.workunit.client.0.vm00.stdout:1/275: dread - da/d12/d26/f57 zero size
2026-03-10T12:37:46.591 INFO:tasks.workunit.client.0.vm00.stdout:3/249: mkdir dd/d18/d13/d1d/d43/d55 0
2026-03-10T12:37:46.592 INFO:tasks.workunit.client.0.vm00.stdout:1/276: stat da/d12/d26/f31 0
2026-03-10T12:37:46.593 INFO:tasks.workunit.client.0.vm00.stdout:3/250: creat dd/d27/f56 x:0 0 0
2026-03-10T12:37:46.594 INFO:tasks.workunit.client.0.vm00.stdout:1/277: creat da/d12/f62 x:0 0 0
2026-03-10T12:37:46.600 INFO:tasks.workunit.client.0.vm00.stdout:3/251: symlink dd/d2a/l57 0
2026-03-10T12:37:46.607 INFO:tasks.workunit.client.1.vm07.stdout:3/403: dwrite dc/d18/d24/f55 [0,4194304] 0
2026-03-10T12:37:46.607 INFO:tasks.workunit.client.1.vm07.stdout:3/404: dread - dc/dd/d43/f61 zero size
2026-03-10T12:37:46.608 INFO:tasks.workunit.client.0.vm00.stdout:1/278: symlink da/d24/l63 0
2026-03-10T12:37:46.608 INFO:tasks.workunit.client.0.vm00.stdout:6/258: getdents d2/d51 0
2026-03-10T12:37:46.608 INFO:tasks.workunit.client.0.vm00.stdout:1/279: getdents da/d24/d4a 0
2026-03-10T12:37:46.608 INFO:tasks.workunit.client.0.vm00.stdout:1/280: dread da/d12/f1d [0,4194304] 0
2026-03-10T12:37:46.608 INFO:tasks.workunit.client.0.vm00.stdout:1/281: write da/d12/d26/f40 [713985,98172] 0 2026-03-10T12:37:46.617 INFO:tasks.workunit.client.1.vm07.stdout:3/405: write dc/dd/d28/d3b/f5b [3502859,108146] 0 2026-03-10T12:37:46.617 INFO:tasks.workunit.client.0.vm00.stdout:9/252: dread d0/f17 [0,4194304] 0 2026-03-10T12:37:46.619 INFO:tasks.workunit.client.0.vm00.stdout:9/253: getdents d0/d3d 0 2026-03-10T12:37:46.621 INFO:tasks.workunit.client.0.vm00.stdout:9/254: mknod d0/d5/d16/d1e/c5e 0 2026-03-10T12:37:46.624 INFO:tasks.workunit.client.0.vm00.stdout:5/233: sync 2026-03-10T12:37:46.624 INFO:tasks.workunit.client.0.vm00.stdout:5/234: write d1f/d26/f48 [1043108,37025] 0 2026-03-10T12:37:46.625 INFO:tasks.workunit.client.0.vm00.stdout:9/255: dwrite d0/f21 [4194304,4194304] 0 2026-03-10T12:37:46.626 INFO:tasks.workunit.client.0.vm00.stdout:9/256: readlink d0/d5/dc/l18 0 2026-03-10T12:37:46.627 INFO:tasks.workunit.client.0.vm00.stdout:9/257: creat d0/d5/d16/d1e/d2b/f5f x:0 0 0 2026-03-10T12:37:46.635 INFO:tasks.workunit.client.0.vm00.stdout:3/252: dread dd/d18/f12 [0,4194304] 0 2026-03-10T12:37:46.636 INFO:tasks.workunit.client.0.vm00.stdout:3/253: unlink c1 0 2026-03-10T12:37:46.638 INFO:tasks.workunit.client.0.vm00.stdout:3/254: link l6 dd/d18/d14/l58 0 2026-03-10T12:37:46.644 INFO:tasks.workunit.client.0.vm00.stdout:3/255: link dd/d2a/l57 dd/d4e/l59 0 2026-03-10T12:37:46.645 INFO:tasks.workunit.client.1.vm07.stdout:0/432: sync 2026-03-10T12:37:46.648 INFO:tasks.workunit.client.0.vm00.stdout:3/256: dwrite dd/d27/f56 [0,4194304] 0 2026-03-10T12:37:46.651 INFO:tasks.workunit.client.1.vm07.stdout:0/433: chown d0/d14/d5f/d76/d2f/d31/d4f/f61 0 1 2026-03-10T12:37:46.659 INFO:tasks.workunit.client.0.vm00.stdout:3/257: rmdir dd/d27/d2c/d34/d45/d4b 0 2026-03-10T12:37:46.659 INFO:tasks.workunit.client.0.vm00.stdout:3/258: mknod dd/d27/d2c/c5a 0 2026-03-10T12:37:46.663 INFO:tasks.workunit.client.0.vm00.stdout:6/259: sync 2026-03-10T12:37:46.667 
INFO:tasks.workunit.client.0.vm00.stdout:6/260: dwrite d2/d14/f2e [0,4194304] 0 2026-03-10T12:37:46.669 INFO:tasks.workunit.client.0.vm00.stdout:6/261: creat d2/d16/d29/d31/d48/f62 x:0 0 0 2026-03-10T12:37:46.670 INFO:tasks.workunit.client.0.vm00.stdout:6/262: chown d2/da/c1f 1034932 1 2026-03-10T12:37:46.671 INFO:tasks.workunit.client.0.vm00.stdout:4/246: rmdir df/d1f/d36/d40 39 2026-03-10T12:37:46.674 INFO:tasks.workunit.client.0.vm00.stdout:4/247: creat df/f4f x:0 0 0 2026-03-10T12:37:46.676 INFO:tasks.workunit.client.0.vm00.stdout:4/248: dread df/d1f/d22/d26/f39 [0,4194304] 0 2026-03-10T12:37:46.679 INFO:tasks.workunit.client.0.vm00.stdout:8/197: truncate d0/f11 3965994 0 2026-03-10T12:37:46.686 INFO:tasks.workunit.client.0.vm00.stdout:6/263: creat d2/d51/f63 x:0 0 0 2026-03-10T12:37:46.686 INFO:tasks.workunit.client.0.vm00.stdout:6/264: creat d2/d16/d29/f64 x:0 0 0 2026-03-10T12:37:46.686 INFO:tasks.workunit.client.0.vm00.stdout:6/265: symlink d2/d16/l65 0 2026-03-10T12:37:46.687 INFO:tasks.workunit.client.0.vm00.stdout:9/258: sync 2026-03-10T12:37:46.687 INFO:tasks.workunit.client.0.vm00.stdout:3/259: sync 2026-03-10T12:37:46.688 INFO:tasks.workunit.client.0.vm00.stdout:1/282: dread da/f13 [0,4194304] 0 2026-03-10T12:37:46.688 INFO:tasks.workunit.client.0.vm00.stdout:1/283: chown da/f13 85053 1 2026-03-10T12:37:46.693 INFO:tasks.workunit.client.0.vm00.stdout:6/266: symlink d2/d42/l66 0 2026-03-10T12:37:46.694 INFO:tasks.workunit.client.0.vm00.stdout:3/260: dwrite dd/d18/d14/f3c [0,4194304] 0 2026-03-10T12:37:46.694 INFO:tasks.workunit.client.0.vm00.stdout:8/198: link d0/dd/c37 d0/d12/d36/c3c 0 2026-03-10T12:37:46.697 INFO:tasks.workunit.client.0.vm00.stdout:6/267: mknod d2/d16/d29/c67 0 2026-03-10T12:37:46.698 INFO:tasks.workunit.client.0.vm00.stdout:9/259: rmdir d0/d3d/d43/d53/d57 39 2026-03-10T12:37:46.703 INFO:tasks.workunit.client.0.vm00.stdout:9/260: dwrite d0/d3d/d59/f4a [0,4194304] 0 2026-03-10T12:37:46.704 
INFO:tasks.workunit.client.0.vm00.stdout:8/199: read d0/dd/f2b [918892,124898] 0 2026-03-10T12:37:46.706 INFO:tasks.workunit.client.0.vm00.stdout:8/200: chown d0/d12/d17 76804 1 2026-03-10T12:37:46.709 INFO:tasks.workunit.client.0.vm00.stdout:6/268: rename d2/da/dc/f43 to d2/f68 0 2026-03-10T12:37:46.723 INFO:tasks.workunit.client.0.vm00.stdout:1/284: link da/d12/f20 da/d12/f64 0 2026-03-10T12:37:46.723 INFO:tasks.workunit.client.0.vm00.stdout:3/261: creat dd/d18/d13/d1d/f5b x:0 0 0 2026-03-10T12:37:46.723 INFO:tasks.workunit.client.0.vm00.stdout:3/262: chown f9 4974 1 2026-03-10T12:37:46.723 INFO:tasks.workunit.client.0.vm00.stdout:3/263: dwrite dd/d27/f35 [4194304,4194304] 0 2026-03-10T12:37:46.726 INFO:tasks.workunit.client.0.vm00.stdout:9/261: sync 2026-03-10T12:37:46.731 INFO:tasks.workunit.client.0.vm00.stdout:2/227: dwrite d4/f28 [0,4194304] 0 2026-03-10T12:37:46.731 INFO:tasks.workunit.client.0.vm00.stdout:1/285: write da/d12/f64 [507356,52631] 0 2026-03-10T12:37:46.736 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:46 vm00.local ceph-mon[50686]: pgmap v158: 65 pgs: 65 active+clean; 1.3 GiB data, 5.3 GiB used, 115 GiB / 120 GiB avail; 26 MiB/s rd, 155 MiB/s wr, 294 op/s 2026-03-10T12:37:46.736 INFO:tasks.workunit.client.0.vm00.stdout:1/286: dwrite f5 [0,4194304] 0 2026-03-10T12:37:46.737 INFO:tasks.workunit.client.0.vm00.stdout:3/264: dwrite dd/d18/d14/f2f [0,4194304] 0 2026-03-10T12:37:46.742 INFO:tasks.workunit.client.0.vm00.stdout:9/262: rmdir d0/d3d/d59 39 2026-03-10T12:37:46.742 INFO:tasks.workunit.client.0.vm00.stdout:1/287: dread - da/d12/d26/d42/f5b zero size 2026-03-10T12:37:46.746 INFO:tasks.workunit.client.0.vm00.stdout:1/288: chown da/d24/d28/f3c 268430 1 2026-03-10T12:37:46.748 INFO:tasks.workunit.client.0.vm00.stdout:2/228: dwrite d4/dd/f10 [4194304,4194304] 0 2026-03-10T12:37:46.748 INFO:tasks.workunit.client.0.vm00.stdout:8/201: creat d0/dd/d38/f3d x:0 0 0 2026-03-10T12:37:46.754 INFO:tasks.workunit.client.0.vm00.stdout:2/229: dwrite 
d4/dd/ff [0,4194304] 0 2026-03-10T12:37:46.761 INFO:tasks.workunit.client.0.vm00.stdout:9/263: creat d0/d5/d16/d1e/f60 x:0 0 0 2026-03-10T12:37:46.763 INFO:tasks.workunit.client.0.vm00.stdout:1/289: rename da/d21/l35 to da/d24/d28/d44/d5d/l65 0 2026-03-10T12:37:46.763 INFO:tasks.workunit.client.0.vm00.stdout:1/290: read - da/d12/d26/f57 zero size 2026-03-10T12:37:46.764 INFO:tasks.workunit.client.0.vm00.stdout:3/265: symlink dd/d18/d14/d2b/l5c 0 2026-03-10T12:37:46.765 INFO:tasks.workunit.client.0.vm00.stdout:3/266: write dd/d27/d2c/d34/d45/f47 [575743,101529] 0 2026-03-10T12:37:46.765 INFO:tasks.workunit.client.0.vm00.stdout:3/267: read - dd/d3d/f53 zero size 2026-03-10T12:37:46.768 INFO:tasks.workunit.client.0.vm00.stdout:3/268: dwrite dd/d27/d2c/d34/d45/f47 [0,4194304] 0 2026-03-10T12:37:46.772 INFO:tasks.workunit.client.0.vm00.stdout:1/291: creat da/d12/f66 x:0 0 0 2026-03-10T12:37:46.773 INFO:tasks.workunit.client.0.vm00.stdout:2/230: mkdir d4/d6/d2d/d3a/d43/d51 0 2026-03-10T12:37:46.773 INFO:tasks.workunit.client.0.vm00.stdout:3/269: readlink l6 0 2026-03-10T12:37:46.775 INFO:tasks.workunit.client.0.vm00.stdout:9/264: dwrite d0/d5/dc/f2a [0,4194304] 0 2026-03-10T12:37:46.786 INFO:tasks.workunit.client.0.vm00.stdout:1/292: rename da/d12/d26/d42 to da/d24/d28/d67 0 2026-03-10T12:37:46.787 INFO:tasks.workunit.client.0.vm00.stdout:1/293: dread - da/d12/f66 zero size 2026-03-10T12:37:46.788 INFO:tasks.workunit.client.0.vm00.stdout:2/231: symlink d4/d6/d2d/d31/d32/d40/l52 0 2026-03-10T12:37:46.788 INFO:tasks.workunit.client.0.vm00.stdout:2/232: stat d4/c23 0 2026-03-10T12:37:46.797 INFO:tasks.workunit.client.0.vm00.stdout:9/265: dwrite d0/d5/d16/f39 [4194304,4194304] 0 2026-03-10T12:37:46.800 INFO:tasks.workunit.client.0.vm00.stdout:3/270: mkdir dd/d4e/d5d 0 2026-03-10T12:37:46.809 INFO:tasks.workunit.client.1.vm07.stdout:8/387: write d1/f36 [3805632,22044] 0 2026-03-10T12:37:46.809 INFO:tasks.workunit.client.0.vm00.stdout:3/271: truncate dd/d27/d2c/d34/d38/f48 
4515396 0 2026-03-10T12:37:46.809 INFO:tasks.workunit.client.0.vm00.stdout:9/266: dwrite d0/d5/d16/f39 [0,4194304] 0 2026-03-10T12:37:46.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:46 vm07.local ceph-mon[58582]: pgmap v158: 65 pgs: 65 active+clean; 1.3 GiB data, 5.3 GiB used, 115 GiB / 120 GiB avail; 26 MiB/s rd, 155 MiB/s wr, 294 op/s 2026-03-10T12:37:46.817 INFO:tasks.workunit.client.0.vm00.stdout:3/272: creat dd/d2a/d39/f5e x:0 0 0 2026-03-10T12:37:46.820 INFO:tasks.workunit.client.0.vm00.stdout:3/273: dwrite dd/d18/d13/d1d/f5b [0,4194304] 0 2026-03-10T12:37:46.831 INFO:tasks.workunit.client.0.vm00.stdout:9/267: unlink d0/d5/c15 0 2026-03-10T12:37:46.831 INFO:tasks.workunit.client.0.vm00.stdout:9/268: symlink d0/d5/d16/d1e/l61 0 2026-03-10T12:37:46.831 INFO:tasks.workunit.client.0.vm00.stdout:9/269: symlink d0/l62 0 2026-03-10T12:37:46.832 INFO:tasks.workunit.client.0.vm00.stdout:9/270: unlink d0/d5/l44 0 2026-03-10T12:37:46.833 INFO:tasks.workunit.client.0.vm00.stdout:9/271: chown d0/d5/d16/f34 452 1 2026-03-10T12:37:46.849 INFO:tasks.workunit.client.1.vm07.stdout:8/388: creat d1/d3/d40/f7e x:0 0 0 2026-03-10T12:37:46.854 INFO:tasks.workunit.client.1.vm07.stdout:8/389: stat d1/d3/l4a 0 2026-03-10T12:37:46.865 INFO:tasks.workunit.client.1.vm07.stdout:8/390: creat d1/d3/d6/d50/d70/f7f x:0 0 0 2026-03-10T12:37:46.872 INFO:tasks.workunit.client.1.vm07.stdout:8/391: unlink d1/d3/d18/f6d 0 2026-03-10T12:37:46.872 INFO:tasks.workunit.client.1.vm07.stdout:8/392: fsync d1/f79 0 2026-03-10T12:37:46.873 INFO:tasks.workunit.client.1.vm07.stdout:8/393: fdatasync d1/fc 0 2026-03-10T12:37:46.880 INFO:tasks.workunit.client.0.vm00.stdout:9/272: sync 2026-03-10T12:37:46.887 INFO:tasks.workunit.client.0.vm00.stdout:9/273: dwrite d0/d3d/d59/f45 [0,4194304] 0 2026-03-10T12:37:46.889 INFO:tasks.workunit.client.0.vm00.stdout:9/274: mknod d0/d3d/c63 0 2026-03-10T12:37:46.891 INFO:tasks.workunit.client.0.vm00.stdout:9/275: symlink d0/d5/d16/d1e/l64 0 
2026-03-10T12:37:46.907 INFO:tasks.workunit.client.1.vm07.stdout:8/394: getdents d1/d3/d6/d50 0 2026-03-10T12:37:46.907 INFO:tasks.workunit.client.0.vm00.stdout:9/276: creat d0/d5/d16/d19/f65 x:0 0 0 2026-03-10T12:37:46.907 INFO:tasks.workunit.client.0.vm00.stdout:9/277: fdatasync d0/d5/d16/d1e/d2b/f36 0 2026-03-10T12:37:46.908 INFO:tasks.workunit.client.1.vm07.stdout:8/395: creat d1/d3/d6/d50/f80 x:0 0 0 2026-03-10T12:37:46.909 INFO:tasks.workunit.client.0.vm00.stdout:9/278: read d0/d5/d16/d19/f5c [381163,119559] 0 2026-03-10T12:37:46.911 INFO:tasks.workunit.client.1.vm07.stdout:8/396: readlink d1/d3/d6/d54/l6a 0 2026-03-10T12:37:46.912 INFO:tasks.workunit.client.0.vm00.stdout:9/279: creat d0/d3d/d43/d53/f66 x:0 0 0 2026-03-10T12:37:46.913 INFO:tasks.workunit.client.0.vm00.stdout:9/280: fsync d0/d5/d16/f30 0 2026-03-10T12:37:46.916 INFO:tasks.workunit.client.0.vm00.stdout:9/281: creat d0/d3d/d43/d53/d57/f67 x:0 0 0 2026-03-10T12:37:46.942 INFO:tasks.workunit.client.0.vm00.stdout:3/274: dread dd/d18/d13/f22 [0,4194304] 0 2026-03-10T12:37:46.945 INFO:tasks.workunit.client.0.vm00.stdout:3/275: mknod dd/d4e/d5d/c5f 0 2026-03-10T12:37:46.950 INFO:tasks.workunit.client.0.vm00.stdout:3/276: dwrite dd/d27/f35 [4194304,4194304] 0 2026-03-10T12:37:46.956 INFO:tasks.workunit.client.0.vm00.stdout:3/277: creat dd/d27/d2c/d34/f60 x:0 0 0 2026-03-10T12:37:46.959 INFO:tasks.workunit.client.0.vm00.stdout:3/278: dread dd/d27/f56 [0,4194304] 0 2026-03-10T12:37:46.961 INFO:tasks.workunit.client.0.vm00.stdout:3/279: mknod dd/d27/d2c/d34/d45/c61 0 2026-03-10T12:37:46.977 INFO:tasks.workunit.client.0.vm00.stdout:3/280: dread fb [0,4194304] 0 2026-03-10T12:37:46.978 INFO:tasks.workunit.client.0.vm00.stdout:3/281: symlink dd/d3d/l62 0 2026-03-10T12:37:46.979 INFO:tasks.workunit.client.0.vm00.stdout:3/282: creat dd/d27/d2c/d34/d38/f63 x:0 0 0 2026-03-10T12:37:46.983 INFO:tasks.workunit.client.0.vm00.stdout:3/283: dwrite dd/d2a/d39/f5e [0,4194304] 0 2026-03-10T12:37:46.984 
INFO:tasks.workunit.client.0.vm00.stdout:3/284: write dd/f25 [3809404,128219] 0 2026-03-10T12:37:46.985 INFO:tasks.workunit.client.0.vm00.stdout:3/285: stat dd/d3d/f50 0 2026-03-10T12:37:46.989 INFO:tasks.workunit.client.0.vm00.stdout:3/286: dwrite dd/d27/f56 [0,4194304] 0 2026-03-10T12:37:47.071 INFO:tasks.workunit.client.1.vm07.stdout:8/397: sync 2026-03-10T12:37:47.075 INFO:tasks.workunit.client.1.vm07.stdout:8/398: dwrite d1/d3/d5d/d65/f67 [0,4194304] 0 2026-03-10T12:37:47.078 INFO:tasks.workunit.client.1.vm07.stdout:8/399: dread - d1/f79 zero size 2026-03-10T12:37:47.078 INFO:tasks.workunit.client.1.vm07.stdout:8/400: fdatasync d1/f79 0 2026-03-10T12:37:47.129 INFO:tasks.workunit.client.0.vm00.stdout:1/294: dread f3 [0,4194304] 0 2026-03-10T12:37:47.131 INFO:tasks.workunit.client.0.vm00.stdout:1/295: creat da/d24/d5a/f68 x:0 0 0 2026-03-10T12:37:47.134 INFO:tasks.workunit.client.0.vm00.stdout:1/296: dwrite da/d12/f64 [0,4194304] 0 2026-03-10T12:37:47.145 INFO:tasks.workunit.client.0.vm00.stdout:1/297: dwrite da/d21/d39/f55 [0,4194304] 0 2026-03-10T12:37:47.149 INFO:tasks.workunit.client.1.vm07.stdout:4/481: write d0/f53 [4675781,49191] 0 2026-03-10T12:37:47.162 INFO:tasks.workunit.client.0.vm00.stdout:1/298: sync 2026-03-10T12:37:47.163 INFO:tasks.workunit.client.0.vm00.stdout:1/299: creat da/d12/d26/f69 x:0 0 0 2026-03-10T12:37:47.165 INFO:tasks.workunit.client.0.vm00.stdout:1/300: mkdir da/d21/d27/d6a 0 2026-03-10T12:37:47.166 INFO:tasks.workunit.client.0.vm00.stdout:1/301: creat da/d21/d27/d6a/f6b x:0 0 0 2026-03-10T12:37:47.168 INFO:tasks.workunit.client.0.vm00.stdout:1/302: dread da/d21/d39/f4f [0,4194304] 0 2026-03-10T12:37:47.169 INFO:tasks.workunit.client.0.vm00.stdout:1/303: creat da/d24/d28/d67/f6c x:0 0 0 2026-03-10T12:37:47.173 INFO:tasks.workunit.client.0.vm00.stdout:1/304: dwrite da/d24/d5a/f68 [0,4194304] 0 2026-03-10T12:37:47.247 INFO:tasks.workunit.client.1.vm07.stdout:4/482: sync 2026-03-10T12:37:47.248 
INFO:tasks.workunit.client.1.vm07.stdout:4/483: mknod d0/d19/ca6 0 2026-03-10T12:37:47.250 INFO:tasks.workunit.client.1.vm07.stdout:4/484: mkdir d0/d4/d10/d3c/d2b/d2d/da7 0 2026-03-10T12:37:47.277 INFO:tasks.workunit.client.0.vm00.stdout:5/235: chown d1f/d26/f28 3 1 2026-03-10T12:37:47.280 INFO:tasks.workunit.client.0.vm00.stdout:5/236: rmdir d1f 39 2026-03-10T12:37:47.283 INFO:tasks.workunit.client.0.vm00.stdout:5/237: creat d1f/d26/d2b/f52 x:0 0 0 2026-03-10T12:37:47.287 INFO:tasks.workunit.client.0.vm00.stdout:5/238: dread f12 [0,4194304] 0 2026-03-10T12:37:47.288 INFO:tasks.workunit.client.0.vm00.stdout:5/239: mkdir d1f/d26/d2b/d35/d53 0 2026-03-10T12:37:47.291 INFO:tasks.workunit.client.0.vm00.stdout:5/240: dwrite d1f/d26/d2e/f3c [0,4194304] 0 2026-03-10T12:37:47.294 INFO:tasks.workunit.client.0.vm00.stdout:5/241: readlink d1f/d39/l47 0 2026-03-10T12:37:47.306 INFO:tasks.workunit.client.1.vm07.stdout:9/378: rmdir d5 39 2026-03-10T12:37:47.306 INFO:tasks.workunit.client.0.vm00.stdout:5/242: mkdir d1f/d39/d54 0 2026-03-10T12:37:47.306 INFO:tasks.workunit.client.0.vm00.stdout:5/243: mknod d1f/d26/d2b/d35/c55 0 2026-03-10T12:37:47.306 INFO:tasks.workunit.client.0.vm00.stdout:5/244: dread d1f/f46 [0,4194304] 0 2026-03-10T12:37:47.306 INFO:tasks.workunit.client.0.vm00.stdout:5/245: getdents d1f/d26/d2e 0 2026-03-10T12:37:47.306 INFO:tasks.workunit.client.0.vm00.stdout:5/246: dread - d1f/d26/d2b/d35/f41 zero size 2026-03-10T12:37:47.306 INFO:tasks.workunit.client.0.vm00.stdout:5/247: write d1f/d26/d2b/d35/f50 [145817,25004] 0 2026-03-10T12:37:47.306 INFO:tasks.workunit.client.0.vm00.stdout:5/248: dwrite d1f/d26/d2b/f52 [0,4194304] 0 2026-03-10T12:37:47.306 INFO:tasks.workunit.client.0.vm00.stdout:5/249: readlink d1f/d26/l2d 0 2026-03-10T12:37:47.310 INFO:tasks.workunit.client.0.vm00.stdout:5/250: dwrite d1f/f30 [0,4194304] 0 2026-03-10T12:37:47.382 INFO:tasks.workunit.client.0.vm00.stdout:5/251: sync 2026-03-10T12:37:47.383 
INFO:tasks.workunit.client.0.vm00.stdout:5/252: link c14 d1f/d39/d54/c56 0 2026-03-10T12:37:47.399 INFO:tasks.workunit.client.0.vm00.stdout:4/249: dwrite df/d1f/d22/f30 [0,4194304] 0 2026-03-10T12:37:47.402 INFO:tasks.workunit.client.0.vm00.stdout:8/202: rmdir d0/dd/d38 39 2026-03-10T12:37:47.404 INFO:tasks.workunit.client.0.vm00.stdout:8/203: read d0/d12/d17/f1d [1312531,93395] 0 2026-03-10T12:37:47.408 INFO:tasks.workunit.client.0.vm00.stdout:8/204: write d0/d12/f27 [97601,129656] 0 2026-03-10T12:37:47.409 INFO:tasks.workunit.client.1.vm07.stdout:9/379: sync 2026-03-10T12:37:47.412 INFO:tasks.workunit.client.0.vm00.stdout:4/250: sync 2026-03-10T12:37:47.412 INFO:tasks.workunit.client.1.vm07.stdout:9/380: mkdir d5/d13/d6c/d89 0 2026-03-10T12:37:47.414 INFO:tasks.workunit.client.1.vm07.stdout:9/381: unlink d5/d13/d2c/l4c 0 2026-03-10T12:37:47.415 INFO:tasks.workunit.client.1.vm07.stdout:9/382: chown d5/d13/l55 257 1 2026-03-10T12:37:47.418 INFO:tasks.workunit.client.0.vm00.stdout:2/233: dwrite d4/dd/f3c [0,4194304] 0 2026-03-10T12:37:47.421 INFO:tasks.workunit.client.0.vm00.stdout:2/234: write d4/dd/f3c [1477436,99454] 0 2026-03-10T12:37:47.427 INFO:tasks.workunit.client.1.vm07.stdout:9/383: dread d5/d1f/d31/f56 [0,4194304] 0 2026-03-10T12:37:47.428 INFO:tasks.workunit.client.0.vm00.stdout:3/287: rename dd/d2a/d39 to dd/d64 0 2026-03-10T12:37:47.428 INFO:tasks.workunit.client.0.vm00.stdout:2/235: dwrite d4/d6/d41/f4c [0,4194304] 0 2026-03-10T12:37:47.434 INFO:tasks.workunit.client.0.vm00.stdout:8/205: dread d0/d12/d17/f2e [0,4194304] 0 2026-03-10T12:37:47.437 INFO:tasks.workunit.client.1.vm07.stdout:3/406: dread dc/dd/f16 [0,4194304] 0 2026-03-10T12:37:47.437 INFO:tasks.workunit.client.0.vm00.stdout:4/251: link df/d1f/d36/d3a/d41/f33 df/d1f/d22/d26/d2e/f50 0 2026-03-10T12:37:47.439 INFO:tasks.workunit.client.0.vm00.stdout:3/288: mkdir dd/d3d/d65 0 2026-03-10T12:37:47.440 INFO:tasks.workunit.client.0.vm00.stdout:3/289: write fb [2022895,126579] 0 
2026-03-10T12:37:47.441 INFO:tasks.workunit.client.0.vm00.stdout:3/290: write dd/d18/d14/f3c [237519,105176] 0 2026-03-10T12:37:47.444 INFO:tasks.workunit.client.0.vm00.stdout:9/282: rmdir d0/d5 39 2026-03-10T12:37:47.454 INFO:tasks.workunit.client.0.vm00.stdout:2/236: mkdir d4/d53 0 2026-03-10T12:37:47.456 INFO:tasks.workunit.client.0.vm00.stdout:3/291: dwrite dd/f15 [0,4194304] 0 2026-03-10T12:37:47.457 INFO:tasks.workunit.client.1.vm07.stdout:3/407: rename dc/dd/d43/d5c/d81 to dc/dd/d28/d7a/d8e 0 2026-03-10T12:37:47.458 INFO:tasks.workunit.client.1.vm07.stdout:2/292: write d0/d42/d1f/f2f [1233313,126872] 0 2026-03-10T12:37:47.459 INFO:tasks.workunit.client.1.vm07.stdout:3/408: read - dc/dd/f85 zero size 2026-03-10T12:37:47.459 INFO:tasks.workunit.client.0.vm00.stdout:2/237: symlink d4/d6/d41/l54 0 2026-03-10T12:37:47.462 INFO:tasks.workunit.client.0.vm00.stdout:4/252: creat df/d1f/d36/f51 x:0 0 0 2026-03-10T12:37:47.462 INFO:tasks.workunit.client.0.vm00.stdout:4/253: stat le 0 2026-03-10T12:37:47.472 INFO:tasks.workunit.client.1.vm07.stdout:1/351: write d9/fd [2106600,127241] 0 2026-03-10T12:37:47.472 INFO:tasks.workunit.client.1.vm07.stdout:3/409: mknod dc/dd/d28/d3b/c8f 0 2026-03-10T12:37:47.472 INFO:tasks.workunit.client.1.vm07.stdout:5/408: write d0/f9 [1288550,18965] 0 2026-03-10T12:37:47.478 INFO:tasks.workunit.client.0.vm00.stdout:3/292: mkdir dd/d18/d13/d1d/d43/d55/d66 0 2026-03-10T12:37:47.479 INFO:tasks.workunit.client.0.vm00.stdout:5/253: fdatasync d1f/d26/d2b/f52 0 2026-03-10T12:37:47.483 INFO:tasks.workunit.client.0.vm00.stdout:5/254: dread d1f/d26/d2e/f3c [0,4194304] 0 2026-03-10T12:37:47.485 INFO:tasks.workunit.client.0.vm00.stdout:1/305: truncate da/d12/f1a 3193941 0 2026-03-10T12:37:47.486 INFO:tasks.workunit.client.0.vm00.stdout:2/238: symlink d4/d53/l55 0 2026-03-10T12:37:47.487 INFO:tasks.workunit.client.0.vm00.stdout:9/283: getdents d0/d3d/d43/d53 0 2026-03-10T12:37:47.492 INFO:tasks.workunit.client.1.vm07.stdout:5/409: rename 
d0/d22/d18/d19/d2e/c41 to d0/d22/d18/d19/d2e/c8f 0 2026-03-10T12:37:47.492 INFO:tasks.workunit.client.0.vm00.stdout:9/284: dwrite d0/d3d/d59/f4a [0,4194304] 0 2026-03-10T12:37:47.494 INFO:tasks.workunit.client.0.vm00.stdout:3/293: mknod dd/d2a/c67 0 2026-03-10T12:37:47.494 INFO:tasks.workunit.client.0.vm00.stdout:3/294: chown dd/cf 19327 1 2026-03-10T12:37:47.498 INFO:tasks.workunit.client.0.vm00.stdout:2/239: sync 2026-03-10T12:37:47.501 INFO:tasks.workunit.client.0.vm00.stdout:3/295: symlink dd/d18/d13/l68 0 2026-03-10T12:37:47.503 INFO:tasks.workunit.client.0.vm00.stdout:4/254: creat df/d1f/d22/f52 x:0 0 0 2026-03-10T12:37:47.503 INFO:tasks.workunit.client.0.vm00.stdout:4/255: stat df/f4e 0 2026-03-10T12:37:47.506 INFO:tasks.workunit.client.0.vm00.stdout:2/240: symlink d4/d6/d41/l56 0 2026-03-10T12:37:47.507 INFO:tasks.workunit.client.0.vm00.stdout:2/241: rename d4/d6 to d4/d6/d2d/d57 22 2026-03-10T12:37:47.508 INFO:tasks.workunit.client.0.vm00.stdout:2/242: write d4/d6/d2d/d3a/f44 [38631,86083] 0 2026-03-10T12:37:47.508 INFO:tasks.workunit.client.0.vm00.stdout:2/243: dread - d4/dd/d38/f3f zero size 2026-03-10T12:37:47.513 INFO:tasks.workunit.client.0.vm00.stdout:5/255: creat d1f/d39/d54/f57 x:0 0 0 2026-03-10T12:37:47.515 INFO:tasks.workunit.client.0.vm00.stdout:9/285: dread - d0/d5/f26 zero size 2026-03-10T12:37:47.520 INFO:tasks.workunit.client.0.vm00.stdout:9/286: chown d0/d5/d16/d19 2 1 2026-03-10T12:37:47.520 INFO:tasks.workunit.client.0.vm00.stdout:3/296: creat dd/d18/d13/d1d/f69 x:0 0 0 2026-03-10T12:37:47.521 INFO:tasks.workunit.client.0.vm00.stdout:3/297: stat dd/ce 0 2026-03-10T12:37:47.523 INFO:tasks.workunit.client.0.vm00.stdout:2/244: dread d4/d6/f2b [0,4194304] 0 2026-03-10T12:37:47.523 INFO:tasks.workunit.client.0.vm00.stdout:2/245: dread - d4/dd/d38/f3f zero size 2026-03-10T12:37:47.524 INFO:tasks.workunit.client.0.vm00.stdout:5/256: mkdir d1f/d26/d2e/d58 0 2026-03-10T12:37:47.527 INFO:tasks.workunit.client.0.vm00.stdout:2/246: dwrite d4/dd/f10 
[0,4194304] 0 2026-03-10T12:37:47.534 INFO:tasks.workunit.client.1.vm07.stdout:3/410: rename dc/d18/d2d/d3d/l57 to dc/d18/d2d/d3d/l90 0 2026-03-10T12:37:47.537 INFO:tasks.workunit.client.0.vm00.stdout:9/287: creat d0/d3d/d43/f68 x:0 0 0 2026-03-10T12:37:47.538 INFO:tasks.workunit.client.0.vm00.stdout:3/298: mkdir dd/d4e/d6a 0 2026-03-10T12:37:47.540 INFO:tasks.workunit.client.1.vm07.stdout:6/323: dwrite d1/f38 [0,4194304] 0 2026-03-10T12:37:47.547 INFO:tasks.workunit.client.0.vm00.stdout:7/247: dwrite da/f17 [0,4194304] 0 2026-03-10T12:37:47.549 INFO:tasks.workunit.client.0.vm00.stdout:3/299: dwrite dd/d18/d13/d1d/f69 [0,4194304] 0 2026-03-10T12:37:47.554 INFO:tasks.workunit.client.0.vm00.stdout:5/257: creat d1f/f59 x:0 0 0 2026-03-10T12:37:47.555 INFO:tasks.workunit.client.0.vm00.stdout:5/258: stat d1f/d26/c2f 0 2026-03-10T12:37:47.555 INFO:tasks.workunit.client.0.vm00.stdout:5/259: write d1f/d26/d2b/f52 [3629039,5057] 0 2026-03-10T12:37:47.556 INFO:tasks.workunit.client.0.vm00.stdout:5/260: write f19 [351085,94580] 0 2026-03-10T12:37:47.556 INFO:tasks.workunit.client.0.vm00.stdout:5/261: chown d1f/d39/l40 30 1 2026-03-10T12:37:47.557 INFO:tasks.workunit.client.0.vm00.stdout:5/262: write d1f/f59 [291105,79023] 0 2026-03-10T12:37:47.558 INFO:tasks.workunit.client.0.vm00.stdout:3/300: rename f9 to dd/d18/d13/f6b 0 2026-03-10T12:37:47.561 INFO:tasks.workunit.client.0.vm00.stdout:5/263: rename d1f/c3e to d1f/d39/c5a 0 2026-03-10T12:37:47.561 INFO:tasks.workunit.client.0.vm00.stdout:3/301: symlink dd/d4e/d5d/l6c 0 2026-03-10T12:37:47.561 INFO:tasks.workunit.client.0.vm00.stdout:0/344: truncate d3/d7/d4c/d5b/f2b 7714613 0 2026-03-10T12:37:47.563 INFO:tasks.workunit.client.1.vm07.stdout:6/324: creat d1/d4/d6/d16/d49/f67 x:0 0 0 2026-03-10T12:37:47.565 INFO:tasks.workunit.client.1.vm07.stdout:6/325: fsync d1/d4/d6/f60 0 2026-03-10T12:37:47.565 INFO:tasks.workunit.client.0.vm00.stdout:9/288: getdents d0/d5/d16/d1e 0 2026-03-10T12:37:47.565 
INFO:tasks.workunit.client.0.vm00.stdout:3/302: mknod dd/d4e/d6a/c6d 0 2026-03-10T12:37:47.566 INFO:tasks.workunit.client.0.vm00.stdout:3/303: fsync dd/d3d/f53 0 2026-03-10T12:37:47.569 INFO:tasks.workunit.client.0.vm00.stdout:3/304: rename dd/d18/d14/f2f to dd/d4e/d5d/f6e 0 2026-03-10T12:37:47.569 INFO:tasks.workunit.client.0.vm00.stdout:5/264: dread d1f/d26/d2e/f3a [0,4194304] 0 2026-03-10T12:37:47.572 INFO:tasks.workunit.client.0.vm00.stdout:9/289: dwrite d0/d5/d16/f39 [0,4194304] 0 2026-03-10T12:37:47.582 INFO:tasks.workunit.client.1.vm07.stdout:3/411: dread dc/dd/f29 [4194304,4194304] 0 2026-03-10T12:37:47.583 INFO:tasks.workunit.client.0.vm00.stdout:3/305: rename dd/c19 to dd/d3d/d65/c6f 0 2026-03-10T12:37:47.583 INFO:tasks.workunit.client.0.vm00.stdout:9/290: symlink d0/d5/l69 0 2026-03-10T12:37:47.583 INFO:tasks.workunit.client.0.vm00.stdout:3/306: write dd/d18/d13/f22 [2570379,79618] 0 2026-03-10T12:37:47.583 INFO:tasks.workunit.client.0.vm00.stdout:3/307: stat dd/d2a 0 2026-03-10T12:37:47.583 INFO:tasks.workunit.client.0.vm00.stdout:9/291: write d0/d5/d16/d19/f20 [923287,114766] 0 2026-03-10T12:37:47.583 INFO:tasks.workunit.client.0.vm00.stdout:3/308: symlink dd/d18/d13/l70 0 2026-03-10T12:37:47.583 INFO:tasks.workunit.client.0.vm00.stdout:0/345: creat d3/db/d24/d25/f7d x:0 0 0 2026-03-10T12:37:47.583 INFO:tasks.workunit.client.0.vm00.stdout:9/292: rename d0/d5/d16/l2e to d0/d5/d16/d19/d50/l6a 0 2026-03-10T12:37:47.584 INFO:tasks.workunit.client.0.vm00.stdout:3/309: creat dd/d4e/d5d/f71 x:0 0 0 2026-03-10T12:37:47.587 INFO:tasks.workunit.client.0.vm00.stdout:3/310: creat dd/d27/f72 x:0 0 0 2026-03-10T12:37:47.588 INFO:tasks.workunit.client.0.vm00.stdout:3/311: rmdir dd/d18/d13 39 2026-03-10T12:37:47.593 INFO:tasks.workunit.client.0.vm00.stdout:3/312: truncate dd/d18/d13/d1d/f42 4248153 0 2026-03-10T12:37:47.594 INFO:tasks.workunit.client.0.vm00.stdout:3/313: mkdir dd/d3d/d73 0 2026-03-10T12:37:47.602 INFO:tasks.workunit.client.1.vm07.stdout:3/412: creat 
dc/dd/d1f/f91 x:0 0 0 2026-03-10T12:37:47.604 INFO:tasks.workunit.client.0.vm00.stdout:3/314: link ca dd/d4e/c74 0 2026-03-10T12:37:47.604 INFO:tasks.workunit.client.0.vm00.stdout:3/315: creat dd/d27/d2c/d34/d45/f75 x:0 0 0 2026-03-10T12:37:47.604 INFO:tasks.workunit.client.0.vm00.stdout:0/346: unlink d3/d33/f53 0 2026-03-10T12:37:47.607 INFO:tasks.workunit.client.0.vm00.stdout:3/316: creat dd/d27/d2c/d34/d38/f76 x:0 0 0 2026-03-10T12:37:47.610 INFO:tasks.workunit.client.0.vm00.stdout:0/347: creat d3/d7/d4c/d5b/d38/d44/d5a/f7e x:0 0 0 2026-03-10T12:37:47.612 INFO:tasks.workunit.client.0.vm00.stdout:3/317: dwrite dd/d27/f44 [0,4194304] 0 2026-03-10T12:37:47.621 INFO:tasks.workunit.client.0.vm00.stdout:3/318: read dd/d3d/f3e [141514,99891] 0 2026-03-10T12:37:47.623 INFO:tasks.workunit.client.0.vm00.stdout:3/319: write dd/d27/d2c/d34/f60 [824125,104711] 0 2026-03-10T12:37:47.624 INFO:tasks.workunit.client.0.vm00.stdout:0/348: stat d3/d7/d4c/d5b/l32 0 2026-03-10T12:37:47.624 INFO:tasks.workunit.client.1.vm07.stdout:3/413: symlink dc/dd/d43/d76/l92 0 2026-03-10T12:37:47.624 INFO:tasks.workunit.client.0.vm00.stdout:0/349: write d3/d40/f4e [4640160,96474] 0 2026-03-10T12:37:47.625 INFO:tasks.workunit.client.0.vm00.stdout:0/350: write d3/d7/d3c/f30 [3768607,71743] 0 2026-03-10T12:37:47.626 INFO:tasks.workunit.client.1.vm07.stdout:6/326: dread d1/f3d [4194304,4194304] 0 2026-03-10T12:37:47.627 INFO:tasks.workunit.client.0.vm00.stdout:3/320: dread dd/d18/d13/d1d/f42 [0,4194304] 0 2026-03-10T12:37:47.632 INFO:tasks.workunit.client.1.vm07.stdout:3/414: mknod dc/dd/d1f/c93 0 2026-03-10T12:37:47.636 INFO:tasks.workunit.client.0.vm00.stdout:0/351: stat d3/l9 0 2026-03-10T12:37:47.649 INFO:tasks.workunit.client.0.vm00.stdout:0/352: unlink d3/f50 0 2026-03-10T12:37:47.654 INFO:tasks.workunit.client.0.vm00.stdout:0/353: unlink d3/db/f6e 0 2026-03-10T12:37:47.663 INFO:tasks.workunit.client.0.vm00.stdout:3/321: dread dd/d18/d13/f6b [0,4194304] 0 2026-03-10T12:37:47.666 
INFO:tasks.workunit.client.0.vm00.stdout:3/322: dwrite dd/f15 [4194304,4194304] 0 2026-03-10T12:37:47.668 INFO:tasks.workunit.client.0.vm00.stdout:3/323: symlink dd/d27/l77 0 2026-03-10T12:37:47.669 INFO:tasks.workunit.client.0.vm00.stdout:3/324: write dd/d18/d13/d1d/f5b [1419362,80520] 0 2026-03-10T12:37:47.683 INFO:tasks.workunit.client.0.vm00.stdout:4/256: dread df/f3d [0,4194304] 0 2026-03-10T12:37:47.685 INFO:tasks.workunit.client.0.vm00.stdout:4/257: unlink df/d1f/d22/d26/d2e/c37 0 2026-03-10T12:37:47.686 INFO:tasks.workunit.client.0.vm00.stdout:4/258: unlink df/d1f/d36/d3a/l3b 0 2026-03-10T12:37:47.690 INFO:tasks.workunit.client.0.vm00.stdout:4/259: dwrite df/f1e [0,4194304] 0 2026-03-10T12:37:47.695 INFO:tasks.workunit.client.0.vm00.stdout:4/260: mknod df/d1f/d36/c53 0 2026-03-10T12:37:47.699 INFO:tasks.workunit.client.0.vm00.stdout:4/261: dwrite df/f16 [0,4194304] 0 2026-03-10T12:37:47.702 INFO:tasks.workunit.client.0.vm00.stdout:4/262: mknod df/d1f/d36/d3a/d41/c54 0 2026-03-10T12:37:47.708 INFO:tasks.workunit.client.1.vm07.stdout:7/347: truncate d0/f39 3503096 0 2026-03-10T12:37:47.709 INFO:tasks.workunit.client.1.vm07.stdout:7/348: write d0/d47/f59 [3743263,5992] 0 2026-03-10T12:37:47.731 INFO:tasks.workunit.client.1.vm07.stdout:7/349: fdatasync d0/f14 0 2026-03-10T12:37:47.737 INFO:tasks.workunit.client.1.vm07.stdout:0/434: write d0/d14/d5f/d76/f78 [2205070,84267] 0 2026-03-10T12:37:47.747 INFO:tasks.workunit.client.1.vm07.stdout:7/350: read d0/f13 [484178,71190] 0 2026-03-10T12:37:47.757 INFO:tasks.workunit.client.0.vm00.stdout:8/206: truncate d0/f7 343324 0 2026-03-10T12:37:47.757 INFO:tasks.workunit.client.1.vm07.stdout:7/351: mknod d0/d67/c6e 0 2026-03-10T12:37:47.758 INFO:tasks.workunit.client.0.vm00.stdout:8/207: mkdir d0/d12/d36/d3e 0 2026-03-10T12:37:47.759 INFO:tasks.workunit.client.0.vm00.stdout:8/208: rmdir d0/d12/d17 39 2026-03-10T12:37:47.759 INFO:tasks.workunit.client.0.vm00.stdout:8/209: stat d0/f22 0 2026-03-10T12:37:47.760 
INFO:tasks.workunit.client.0.vm00.stdout:6/269: write d2/da/dc/fd [3815503,94489] 0 2026-03-10T12:37:47.760 INFO:tasks.workunit.client.0.vm00.stdout:8/210: dread - d0/d12/d36/f39 zero size 2026-03-10T12:37:47.762 INFO:tasks.workunit.client.0.vm00.stdout:8/211: readlink d0/l1 0 2026-03-10T12:37:47.767 INFO:tasks.workunit.client.1.vm07.stdout:7/352: dwrite d0/d52/f5d [0,4194304] 0 2026-03-10T12:37:47.767 INFO:tasks.workunit.client.0.vm00.stdout:6/270: dwrite d2/d16/d29/d31/d48/f62 [0,4194304] 0 2026-03-10T12:37:47.767 INFO:tasks.workunit.client.0.vm00.stdout:6/271: chown d2/d16/d29/f4c 85 1 2026-03-10T12:37:47.769 INFO:tasks.workunit.client.0.vm00.stdout:8/212: dwrite d0/d12/d36/f39 [0,4194304] 0 2026-03-10T12:37:47.771 INFO:tasks.workunit.client.0.vm00.stdout:8/213: readlink d0/l6 0 2026-03-10T12:37:47.776 INFO:tasks.workunit.client.0.vm00.stdout:6/272: symlink d2/da/dc/d2f/l69 0 2026-03-10T12:37:47.781 INFO:tasks.workunit.client.0.vm00.stdout:8/214: dread d0/f9 [4194304,4194304] 0 2026-03-10T12:37:47.786 INFO:tasks.workunit.client.1.vm07.stdout:7/353: rename d0/d57/d62/d6d to d0/d67/d6f 0 2026-03-10T12:37:47.786 INFO:tasks.workunit.client.0.vm00.stdout:8/215: link d0/d12/l15 d0/l3f 0 2026-03-10T12:37:47.786 INFO:tasks.workunit.client.0.vm00.stdout:8/216: write d0/d12/d2d/f33 [501046,39911] 0 2026-03-10T12:37:47.786 INFO:tasks.workunit.client.0.vm00.stdout:8/217: chown d0/d12/d17/f2e 38151 1 2026-03-10T12:37:47.786 INFO:tasks.workunit.client.0.vm00.stdout:8/218: write d0/d12/d2d/f33 [129815,72451] 0 2026-03-10T12:37:47.788 INFO:tasks.workunit.client.0.vm00.stdout:8/219: symlink d0/d12/d36/l40 0 2026-03-10T12:37:47.788 INFO:tasks.workunit.client.0.vm00.stdout:8/220: stat d0/f8 0 2026-03-10T12:37:47.789 INFO:tasks.workunit.client.0.vm00.stdout:8/221: creat d0/d12/d36/f41 x:0 0 0 2026-03-10T12:37:47.793 INFO:tasks.workunit.client.0.vm00.stdout:8/222: dwrite d0/f9 [4194304,4194304] 0 2026-03-10T12:37:47.795 INFO:tasks.workunit.client.0.vm00.stdout:8/223: creat 
d0/d12/d36/d3e/f42 x:0 0 0 2026-03-10T12:37:47.797 INFO:tasks.workunit.client.0.vm00.stdout:8/224: mkdir d0/d12/d43 0 2026-03-10T12:37:47.800 INFO:tasks.workunit.client.0.vm00.stdout:8/225: dwrite d0/d12/d36/d3e/f42 [0,4194304] 0 2026-03-10T12:37:47.805 INFO:tasks.workunit.client.0.vm00.stdout:8/226: getdents d0/d12/d17 0 2026-03-10T12:37:47.806 INFO:tasks.workunit.client.0.vm00.stdout:8/227: link d0/f10 d0/d12/d2d/f44 0 2026-03-10T12:37:47.807 INFO:tasks.workunit.client.0.vm00.stdout:6/273: sync 2026-03-10T12:37:47.809 INFO:tasks.workunit.client.0.vm00.stdout:4/263: dread df/d1f/d22/d26/d2e/f50 [0,4194304] 0 2026-03-10T12:37:47.810 INFO:tasks.workunit.client.0.vm00.stdout:6/274: chown d2/da/dc/l50 69578 1 2026-03-10T12:37:47.810 INFO:tasks.workunit.client.0.vm00.stdout:8/228: creat d0/d12/d43/f45 x:0 0 0 2026-03-10T12:37:47.810 INFO:tasks.workunit.client.0.vm00.stdout:6/275: chown d2/d16/c36 440 1 2026-03-10T12:37:47.811 INFO:tasks.workunit.client.0.vm00.stdout:8/229: write d0/d12/f27 [449293,36627] 0 2026-03-10T12:37:47.812 INFO:tasks.workunit.client.0.vm00.stdout:8/230: stat d0/d12/d36/f39 0 2026-03-10T12:37:47.818 INFO:tasks.workunit.client.0.vm00.stdout:8/231: write d0/dd/f2b [3719571,95351] 0 2026-03-10T12:37:47.819 INFO:tasks.workunit.client.0.vm00.stdout:8/232: read - d0/d12/f23 zero size 2026-03-10T12:37:47.822 INFO:tasks.workunit.client.0.vm00.stdout:8/233: mkdir d0/d46 0 2026-03-10T12:37:47.825 INFO:tasks.workunit.client.0.vm00.stdout:8/234: dwrite d0/d12/f34 [0,4194304] 0 2026-03-10T12:37:47.853 INFO:tasks.workunit.client.0.vm00.stdout:6/276: dread d2/d14/f1b [0,4194304] 0 2026-03-10T12:37:47.856 INFO:tasks.workunit.client.0.vm00.stdout:6/277: rmdir d2/d16/d29/d31/d48 39 2026-03-10T12:37:47.857 INFO:tasks.workunit.client.1.vm07.stdout:4/485: write d0/d19/f25 [5031165,91210] 0 2026-03-10T12:37:47.857 INFO:tasks.workunit.client.1.vm07.stdout:4/486: stat d0/d4/la5 0 2026-03-10T12:37:47.861 INFO:tasks.workunit.client.0.vm00.stdout:8/235: dread d0/d12/f2a 
[0,4194304] 0 2026-03-10T12:37:47.863 INFO:tasks.workunit.client.1.vm07.stdout:4/487: creat d0/d4/d5/da/d66/fa8 x:0 0 0 2026-03-10T12:37:47.863 INFO:tasks.workunit.client.0.vm00.stdout:8/236: symlink d0/d12/d36/l47 0 2026-03-10T12:37:47.863 INFO:tasks.workunit.client.0.vm00.stdout:8/237: read - d0/d12/f23 zero size 2026-03-10T12:37:47.864 INFO:tasks.workunit.client.0.vm00.stdout:8/238: write d0/d12/f34 [267658,36968] 0 2026-03-10T12:37:47.883 INFO:tasks.workunit.client.0.vm00.stdout:6/278: rename d2/da/dc/d2f/f44 to d2/da/f6a 0 2026-03-10T12:37:47.884 INFO:tasks.workunit.client.1.vm07.stdout:4/488: mknod d0/d19/ca9 0 2026-03-10T12:37:47.889 INFO:tasks.workunit.client.1.vm07.stdout:4/489: symlink d0/d4/d10/d3c/d2b/d54/laa 0 2026-03-10T12:37:47.897 INFO:tasks.workunit.client.1.vm07.stdout:9/384: truncate d5/f65 7567360 0 2026-03-10T12:37:47.913 INFO:tasks.workunit.client.1.vm07.stdout:6/327: read d1/f1e [1463257,18301] 0 2026-03-10T12:37:47.914 INFO:tasks.workunit.client.1.vm07.stdout:6/328: write d1/d4/d4a/f56 [354872,100129] 0 2026-03-10T12:37:47.914 INFO:tasks.workunit.client.1.vm07.stdout:6/329: readlink d1/d4/d6/lf 0 2026-03-10T12:37:47.920 INFO:tasks.workunit.client.1.vm07.stdout:2/293: truncate d0/d42/d26/f50 583911 0 2026-03-10T12:37:47.923 INFO:tasks.workunit.client.1.vm07.stdout:1/352: dwrite d9/df/d29/d2b/f4e [4194304,4194304] 0 2026-03-10T12:37:47.925 INFO:tasks.workunit.client.1.vm07.stdout:6/330: dwrite d1/d4/d6/d46/d4d/f22 [4194304,4194304] 0 2026-03-10T12:37:47.929 INFO:tasks.workunit.client.1.vm07.stdout:2/294: truncate d0/f40 679674 0 2026-03-10T12:37:47.935 INFO:tasks.workunit.client.1.vm07.stdout:1/353: read d9/fc [93923,107187] 0 2026-03-10T12:37:47.935 INFO:tasks.workunit.client.1.vm07.stdout:2/295: write d0/d42/d4e/d56/f60 [32769,19268] 0 2026-03-10T12:37:47.938 INFO:tasks.workunit.client.1.vm07.stdout:1/354: chown d9/f1a 0 1 2026-03-10T12:37:47.938 INFO:tasks.workunit.client.1.vm07.stdout:1/355: chown d9/f1a 30725 1 2026-03-10T12:37:47.942 
INFO:tasks.workunit.client.1.vm07.stdout:1/356: dwrite d9/fd [0,4194304] 0 2026-03-10T12:37:47.945 INFO:tasks.workunit.client.1.vm07.stdout:2/296: dread - d0/f2d zero size 2026-03-10T12:37:48.016 INFO:tasks.workunit.client.0.vm00.stdout:0/354: dwrite d3/db/d24/f2f [4194304,4194304] 0 2026-03-10T12:37:48.027 INFO:tasks.workunit.client.0.vm00.stdout:0/355: dread d3/d22/f54 [0,4194304] 0 2026-03-10T12:37:48.040 INFO:tasks.workunit.client.0.vm00.stdout:0/356: link d3/d7/d4c/d5b/d38/l4f d3/d40/l7f 0 2026-03-10T12:37:48.040 INFO:tasks.workunit.client.0.vm00.stdout:0/357: write d3/d40/f7a [663356,54234] 0 2026-03-10T12:37:48.045 INFO:tasks.workunit.client.1.vm07.stdout:5/410: unlink d0/d22/d18/d19/f2c 0 2026-03-10T12:37:48.049 INFO:tasks.workunit.client.0.vm00.stdout:0/358: getdents d3/d40/d65 0 2026-03-10T12:37:48.050 INFO:tasks.workunit.client.1.vm07.stdout:5/411: write d0/d22/d18/f20 [3935038,120442] 0 2026-03-10T12:37:48.057 INFO:tasks.workunit.client.1.vm07.stdout:1/357: creat d9/df/d29/f70 x:0 0 0 2026-03-10T12:37:48.092 INFO:tasks.workunit.client.1.vm07.stdout:5/412: write d0/d22/d18/d19/d21/d54/f7d [1017209,68322] 0 2026-03-10T12:37:48.092 INFO:tasks.workunit.client.1.vm07.stdout:5/413: symlink d0/d22/d18/d3e/d53/l90 0 2026-03-10T12:37:48.092 INFO:tasks.workunit.client.1.vm07.stdout:1/358: mknod d9/df/d29/c71 0 2026-03-10T12:37:48.098 INFO:tasks.workunit.client.1.vm07.stdout:1/359: dread d9/df/f4a [0,4194304] 0 2026-03-10T12:37:48.100 INFO:tasks.workunit.client.1.vm07.stdout:1/360: creat d9/df/d29/d2b/d31/f72 x:0 0 0 2026-03-10T12:37:48.102 INFO:tasks.workunit.client.1.vm07.stdout:1/361: unlink d9/fc 0 2026-03-10T12:37:48.106 INFO:tasks.workunit.client.1.vm07.stdout:1/362: dread d9/f1b [0,4194304] 0 2026-03-10T12:37:48.107 INFO:tasks.workunit.client.1.vm07.stdout:1/363: truncate d9/d2d/d4f/d5a/f65 709253 0 2026-03-10T12:37:48.108 INFO:tasks.workunit.client.1.vm07.stdout:1/364: chown d9/df/d29/d2b/d3d 0 1 2026-03-10T12:37:48.111 
INFO:tasks.workunit.client.1.vm07.stdout:1/365: dwrite d9/f1a [0,4194304] 0 2026-03-10T12:37:48.122 INFO:tasks.workunit.client.1.vm07.stdout:1/366: creat d9/df/d29/d2c/d59/f73 x:0 0 0 2026-03-10T12:37:48.125 INFO:tasks.workunit.client.1.vm07.stdout:1/367: rename l2 to d9/df/d29/d2b/d3d/l74 0 2026-03-10T12:37:48.126 INFO:tasks.workunit.client.1.vm07.stdout:1/368: truncate d9/df/d29/d2b/f32 6156985 0 2026-03-10T12:37:48.127 INFO:tasks.workunit.client.1.vm07.stdout:1/369: mkdir d9/d2d/d4f/d75 0 2026-03-10T12:37:48.128 INFO:tasks.workunit.client.1.vm07.stdout:1/370: dread - d9/df/d55/f6f zero size 2026-03-10T12:37:48.129 INFO:tasks.workunit.client.1.vm07.stdout:1/371: dread - d9/df/f58 zero size 2026-03-10T12:37:48.145 INFO:tasks.workunit.client.1.vm07.stdout:2/297: sync 2026-03-10T12:37:48.146 INFO:tasks.workunit.client.1.vm07.stdout:2/298: read d0/f1d [361755,32436] 0 2026-03-10T12:37:48.149 INFO:tasks.workunit.client.1.vm07.stdout:2/299: mknod d0/d5b/c66 0 2026-03-10T12:37:48.151 INFO:tasks.workunit.client.1.vm07.stdout:2/300: creat d0/d29/d64/f67 x:0 0 0 2026-03-10T12:37:48.155 INFO:tasks.workunit.client.1.vm07.stdout:2/301: chown d0/d42/d26/d4b/f58 7 1 2026-03-10T12:37:48.155 INFO:tasks.workunit.client.1.vm07.stdout:2/302: mknod d0/d42/d1f/d20/c68 0 2026-03-10T12:37:48.160 INFO:tasks.workunit.client.1.vm07.stdout:2/303: truncate d0/f18 1420313 0 2026-03-10T12:37:48.161 INFO:tasks.workunit.client.1.vm07.stdout:2/304: fdatasync d0/f2d 0 2026-03-10T12:37:48.166 INFO:tasks.workunit.client.0.vm00.stdout:2/247: truncate d4/d6/f34 495360 0 2026-03-10T12:37:48.168 INFO:tasks.workunit.client.0.vm00.stdout:2/248: creat d4/dd/d38/f58 x:0 0 0 2026-03-10T12:37:48.169 INFO:tasks.workunit.client.0.vm00.stdout:2/249: symlink d4/d53/l59 0 2026-03-10T12:37:48.169 INFO:tasks.workunit.client.0.vm00.stdout:2/250: fsync d4/dd/f45 0 2026-03-10T12:37:48.173 INFO:tasks.workunit.client.0.vm00.stdout:2/251: dwrite d4/dd/f3e [0,4194304] 0 2026-03-10T12:37:48.174 
INFO:tasks.workunit.client.0.vm00.stdout:5/265: write d1f/f27 [2461522,48600] 0 2026-03-10T12:37:48.180 INFO:tasks.workunit.client.0.vm00.stdout:9/293: write d0/d5/d16/f34 [1958324,63831] 0 2026-03-10T12:37:48.180 INFO:tasks.workunit.client.0.vm00.stdout:3/325: truncate dd/d3d/f54 3341394 0 2026-03-10T12:37:48.181 INFO:tasks.workunit.client.0.vm00.stdout:9/294: write d0/f4 [12959009,112713] 0 2026-03-10T12:37:48.182 INFO:tasks.workunit.client.0.vm00.stdout:9/295: write d0/d3d/d43/f68 [552063,56039] 0 2026-03-10T12:37:48.185 INFO:tasks.workunit.client.0.vm00.stdout:2/252: creat d4/dd/d38/f5a x:0 0 0 2026-03-10T12:37:48.189 INFO:tasks.workunit.client.0.vm00.stdout:9/296: creat d0/d5/d16/d1e/d2b/f6b x:0 0 0 2026-03-10T12:37:48.190 INFO:tasks.workunit.client.1.vm07.stdout:0/435: creat d0/d14/d5f/d76/f8a x:0 0 0 2026-03-10T12:37:48.192 INFO:tasks.workunit.client.0.vm00.stdout:5/266: mkdir d1f/d26/d2b/d35/d53/d5b 0 2026-03-10T12:37:48.195 INFO:tasks.workunit.client.1.vm07.stdout:0/436: mknod d0/d14/d5f/d41/d6a/d74/c8b 0 2026-03-10T12:37:48.196 INFO:tasks.workunit.client.0.vm00.stdout:4/264: dwrite df/d1f/d36/d3a/d41/f33 [0,4194304] 0 2026-03-10T12:37:48.197 INFO:tasks.workunit.client.0.vm00.stdout:2/253: symlink d4/dd/l5b 0 2026-03-10T12:37:48.197 INFO:tasks.workunit.client.0.vm00.stdout:4/265: readlink df/d1f/d36/d3a/d41/l34 0 2026-03-10T12:37:48.208 INFO:tasks.workunit.client.1.vm07.stdout:0/437: unlink d0/d14/d5f/l68 0 2026-03-10T12:37:48.208 INFO:tasks.workunit.client.0.vm00.stdout:3/326: dread dd/f25 [0,4194304] 0 2026-03-10T12:37:48.209 INFO:tasks.workunit.client.0.vm00.stdout:3/327: write dd/d18/d13/f22 [524139,126255] 0 2026-03-10T12:37:48.211 INFO:tasks.workunit.client.0.vm00.stdout:4/266: mknod df/d1f/d36/d3a/c55 0 2026-03-10T12:37:48.218 INFO:tasks.workunit.client.1.vm07.stdout:8/401: rmdir d1/d3/d6c 39 2026-03-10T12:37:48.218 INFO:tasks.workunit.client.0.vm00.stdout:2/254: symlink d4/dd/l5c 0 2026-03-10T12:37:48.218 
INFO:tasks.workunit.client.0.vm00.stdout:4/267: fsync df/f3d 0 2026-03-10T12:37:48.218 INFO:tasks.workunit.client.0.vm00.stdout:4/268: chown df/f1c 27709 1 2026-03-10T12:37:48.219 INFO:tasks.workunit.client.0.vm00.stdout:4/269: fdatasync df/d1f/d22/f3c 0 2026-03-10T12:37:48.219 INFO:tasks.workunit.client.0.vm00.stdout:4/270: dread - df/f29 zero size 2026-03-10T12:37:48.221 INFO:tasks.workunit.client.0.vm00.stdout:5/267: rmdir d1f/d26/d2e/d43 0 2026-03-10T12:37:48.222 INFO:tasks.workunit.client.1.vm07.stdout:0/438: dwrite d0/d14/f19 [0,4194304] 0 2026-03-10T12:37:48.222 INFO:tasks.workunit.client.0.vm00.stdout:5/268: chown d1f/d39/d54 11 1 2026-03-10T12:37:48.223 INFO:tasks.workunit.client.0.vm00.stdout:5/269: write d1f/d26/d2e/f3a [133391,127948] 0 2026-03-10T12:37:48.223 INFO:tasks.workunit.client.0.vm00.stdout:5/270: stat d1f/d39/l47 0 2026-03-10T12:37:48.224 INFO:tasks.workunit.client.0.vm00.stdout:3/328: unlink dd/d27/d2c/d34/d45/l52 0 2026-03-10T12:37:48.225 INFO:tasks.workunit.client.0.vm00.stdout:8/239: truncate d0/d12/d36/f39 4149207 0 2026-03-10T12:37:48.237 INFO:tasks.workunit.client.0.vm00.stdout:1/306: write da/d12/f1d [1779121,27168] 0 2026-03-10T12:37:48.240 INFO:tasks.workunit.client.1.vm07.stdout:6/331: dread d1/d4/d6/f30 [0,4194304] 0 2026-03-10T12:37:48.243 INFO:tasks.workunit.client.0.vm00.stdout:5/271: creat d1f/d26/d2b/f5c x:0 0 0 2026-03-10T12:37:48.243 INFO:tasks.workunit.client.1.vm07.stdout:6/332: dread d1/d4/d4a/f55 [0,4194304] 0 2026-03-10T12:37:48.246 INFO:tasks.workunit.client.0.vm00.stdout:6/279: rmdir d2 39 2026-03-10T12:37:48.246 INFO:tasks.workunit.client.0.vm00.stdout:8/240: mkdir d0/d12/d17/d48 0 2026-03-10T12:37:48.247 INFO:tasks.workunit.client.0.vm00.stdout:8/241: truncate d0/d12/d2d/f33 1072208 0 2026-03-10T12:37:48.247 INFO:tasks.workunit.client.0.vm00.stdout:1/307: unlink da/d12/f1a 0 2026-03-10T12:37:48.248 INFO:tasks.workunit.client.1.vm07.stdout:0/439: unlink d0/d14/d5f/d76/d2f/d31/l59 0 2026-03-10T12:37:48.255 
INFO:tasks.workunit.client.0.vm00.stdout:8/242: dread d0/d12/d2d/f44 [0,4194304] 0 2026-03-10T12:37:48.258 INFO:tasks.workunit.client.0.vm00.stdout:1/308: rmdir da/d24/d5a 39 2026-03-10T12:37:48.262 INFO:tasks.workunit.client.0.vm00.stdout:5/272: dread d1f/f21 [0,4194304] 0 2026-03-10T12:37:48.262 INFO:tasks.workunit.client.1.vm07.stdout:4/490: truncate d0/d4/d5/da/f6e 1964654 0 2026-03-10T12:37:48.262 INFO:tasks.workunit.client.1.vm07.stdout:6/333: write d1/f26 [3872801,127172] 0 2026-03-10T12:37:48.262 INFO:tasks.workunit.client.0.vm00.stdout:5/273: rename d1f/d26/d2b/d35 to d1f/d26/d2b/d35/d53/d5d 22 2026-03-10T12:37:48.264 INFO:tasks.workunit.client.1.vm07.stdout:1/372: rename d9/df/c20 to d9/d2d/c76 0 2026-03-10T12:37:48.266 INFO:tasks.workunit.client.0.vm00.stdout:8/243: mkdir d0/d12/d2d/d49 0 2026-03-10T12:37:48.267 INFO:tasks.workunit.client.1.vm07.stdout:9/385: dwrite d5/f65 [0,4194304] 0 2026-03-10T12:37:48.268 INFO:tasks.workunit.client.1.vm07.stdout:4/491: stat d0/d4/d10/d3c/d2b/d54/laa 0 2026-03-10T12:37:48.269 INFO:tasks.workunit.client.1.vm07.stdout:4/492: chown d0/d19 9601 1 2026-03-10T12:37:48.272 INFO:tasks.workunit.client.0.vm00.stdout:1/309: fdatasync da/d12/d26/f31 0 2026-03-10T12:37:48.279 INFO:tasks.workunit.client.0.vm00.stdout:1/310: dwrite da/d24/f45 [4194304,4194304] 0 2026-03-10T12:37:48.279 INFO:tasks.workunit.client.0.vm00.stdout:1/311: write da/d12/f66 [285230,4469] 0 2026-03-10T12:37:48.280 INFO:tasks.workunit.client.0.vm00.stdout:1/312: stat da/d24/d28/f37 0 2026-03-10T12:37:48.281 INFO:tasks.workunit.client.0.vm00.stdout:1/313: write da/d12/f20 [165422,51708] 0 2026-03-10T12:37:48.283 INFO:tasks.workunit.client.0.vm00.stdout:4/271: getdents df/d1f 0 2026-03-10T12:37:48.285 INFO:tasks.workunit.client.0.vm00.stdout:5/274: creat d1f/d26/d2b/f5e x:0 0 0 2026-03-10T12:37:48.286 INFO:tasks.workunit.client.0.vm00.stdout:5/275: truncate d1f/f59 876652 0 2026-03-10T12:37:48.288 INFO:tasks.workunit.client.0.vm00.stdout:3/329: getdents 
dd/d18/d13/d1d/d43/d55 0 2026-03-10T12:37:48.299 INFO:tasks.workunit.client.0.vm00.stdout:8/244: creat d0/d12/d36/d3e/f4a x:0 0 0 2026-03-10T12:37:48.299 INFO:tasks.workunit.client.0.vm00.stdout:1/314: rmdir da/d24/d28/d67 39 2026-03-10T12:37:48.304 INFO:tasks.workunit.client.1.vm07.stdout:4/493: creat d0/d4/d10/d8d/fab x:0 0 0 2026-03-10T12:37:48.304 INFO:tasks.workunit.client.1.vm07.stdout:1/373: mkdir d9/d2d/d4f/d75/d77 0 2026-03-10T12:37:48.304 INFO:tasks.workunit.client.0.vm00.stdout:3/330: chown c0 1 1 2026-03-10T12:37:48.305 INFO:tasks.workunit.client.1.vm07.stdout:4/494: truncate d0/d4/d10/d3c/f68 779425 0 2026-03-10T12:37:48.310 INFO:tasks.workunit.client.1.vm07.stdout:7/354: rename d0/f3b to d0/f70 0 2026-03-10T12:37:48.312 INFO:tasks.workunit.client.0.vm00.stdout:5/276: dread d1f/f25 [4194304,4194304] 0 2026-03-10T12:37:48.312 INFO:tasks.workunit.client.0.vm00.stdout:1/315: creat da/d21/d27/d6a/f6d x:0 0 0 2026-03-10T12:37:48.313 INFO:tasks.workunit.client.0.vm00.stdout:1/316: chown da/d12/d26/c3e 935296584 1 2026-03-10T12:37:48.314 INFO:tasks.workunit.client.0.vm00.stdout:3/331: creat dd/d2a/f78 x:0 0 0 2026-03-10T12:37:48.315 INFO:tasks.workunit.client.1.vm07.stdout:4/495: dread d0/d19/f25 [4194304,4194304] 0 2026-03-10T12:37:48.316 INFO:tasks.workunit.client.0.vm00.stdout:5/277: creat d1f/d39/f5f x:0 0 0 2026-03-10T12:37:48.318 INFO:tasks.workunit.client.1.vm07.stdout:7/355: readlink d0/l1b 0 2026-03-10T12:37:48.318 INFO:tasks.workunit.client.0.vm00.stdout:9/297: dwrite d0/d3d/d43/d53/d57/f3f [0,4194304] 0 2026-03-10T12:37:48.320 INFO:tasks.workunit.client.0.vm00.stdout:9/298: truncate d0/d3d/d43/d53/d57/f67 705063 0 2026-03-10T12:37:48.320 INFO:tasks.workunit.client.0.vm00.stdout:9/299: fdatasync d0/d3d/d43/f68 0 2026-03-10T12:37:48.322 INFO:tasks.workunit.client.1.vm07.stdout:4/496: symlink d0/d4/d10/d5f/d6d/lac 0 2026-03-10T12:37:48.324 INFO:tasks.workunit.client.0.vm00.stdout:1/317: creat da/d21/d27/f6e x:0 0 0 2026-03-10T12:37:48.324 
INFO:tasks.workunit.client.0.vm00.stdout:4/272: creat df/d1f/d22/d26/f56 x:0 0 0 2026-03-10T12:37:48.325 INFO:tasks.workunit.client.0.vm00.stdout:3/332: symlink dd/d27/l79 0 2026-03-10T12:37:48.327 INFO:tasks.workunit.client.0.vm00.stdout:4/273: readlink df/d1f/d36/d3a/d41/l46 0 2026-03-10T12:37:48.374 INFO:tasks.workunit.client.1.vm07.stdout:7/356: unlink d0/c17 0 2026-03-10T12:37:48.374 INFO:tasks.workunit.client.1.vm07.stdout:7/357: fdatasync d0/f2f 0 2026-03-10T12:37:48.374 INFO:tasks.workunit.client.1.vm07.stdout:7/358: chown d0 193298 1 2026-03-10T12:37:48.374 INFO:tasks.workunit.client.1.vm07.stdout:5/414: dwrite d0/d22/f16 [0,4194304] 0 2026-03-10T12:37:48.374 INFO:tasks.workunit.client.1.vm07.stdout:7/359: creat d0/d67/f71 x:0 0 0 2026-03-10T12:37:48.374 INFO:tasks.workunit.client.1.vm07.stdout:7/360: stat d0/d57/d62/f6c 0 2026-03-10T12:37:48.374 INFO:tasks.workunit.client.1.vm07.stdout:5/415: symlink d0/d22/d18/d3e/l91 0 2026-03-10T12:37:48.374 INFO:tasks.workunit.client.1.vm07.stdout:8/402: write d1/f19 [1870776,30793] 0 2026-03-10T12:37:48.374 INFO:tasks.workunit.client.1.vm07.stdout:5/416: rmdir d0/d22/d18/d19/d2e/d3f/d5c 39 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:0/359: dwrite d3/d7/f31 [0,4194304] 0 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:6/280: read d2/d14/f32 [199004,30289] 0 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:6/281: truncate d2/d14/f5d 75771 0 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:1/318: dwrite da/d12/f64 [0,4194304] 0 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:9/300: rename d0/d5/d16/d1e/d2b/f42 to d0/d3d/d43/d53/d57/f6c 0 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:3/333: symlink dd/l7a 0 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:9/301: dwrite d0/d3d/d43/f54 [0,4194304] 0 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:3/334: fsync dd/d18/d13/d1d/f5b 0 
2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:9/302: truncate d0/d5/f26 797246 0 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:6/282: symlink d2/d42/l6b 0 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:6/283: chown d2/d16/c4b 538083708 1 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:6/284: chown d2/da/dc/c5b 40 1 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:3/335: getdents dd/d3d/d73 0 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:9/303: fsync d0/d5/d16/d19/f1b 0 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:3/336: chown dd/d2a/l4a 0 1 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:3/337: write dd/d18/d13/f22 [1095718,92373] 0 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:9/304: dread d0/d5/dc/f41 [0,4194304] 0 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:9/305: write d0/d5/dc/f2a [2500487,10263] 0 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.0.vm00.stdout:3/338: dread dd/d27/d2c/d34/f60 [0,4194304] 0 2026-03-10T12:37:48.375 INFO:tasks.workunit.client.1.vm07.stdout:2/305: dwrite d0/f44 [0,4194304] 0 2026-03-10T12:37:48.376 INFO:tasks.workunit.client.0.vm00.stdout:9/306: mknod d0/d5/d16/c6d 0 2026-03-10T12:37:48.376 INFO:tasks.workunit.client.1.vm07.stdout:2/306: chown d0/d42/d26/d38/f3d 17876 1 2026-03-10T12:37:48.377 INFO:tasks.workunit.client.0.vm00.stdout:9/307: truncate d0/d3d/d43/d53/f66 940904 0 2026-03-10T12:37:48.378 INFO:tasks.workunit.client.0.vm00.stdout:3/339: write dd/d18/d13/f6b [1135777,117676] 0 2026-03-10T12:37:48.390 INFO:tasks.workunit.client.0.vm00.stdout:3/340: chown dd/d2a/f78 3 1 2026-03-10T12:37:48.390 INFO:tasks.workunit.client.0.vm00.stdout:9/308: dwrite d0/d5/d16/d1e/d2b/f6b [0,4194304] 0 2026-03-10T12:37:48.390 INFO:tasks.workunit.client.0.vm00.stdout:9/309: dread d0/d3d/d43/f68 [0,4194304] 0 2026-03-10T12:37:48.394 
INFO:tasks.workunit.client.0.vm00.stdout:3/341: dwrite dd/f25 [0,4194304] 0 2026-03-10T12:37:48.395 INFO:tasks.workunit.client.0.vm00.stdout:9/310: rmdir d0/d5 39 2026-03-10T12:37:48.398 INFO:tasks.workunit.client.0.vm00.stdout:9/311: dread - d0/d5/d16/f49 zero size 2026-03-10T12:37:48.412 INFO:tasks.workunit.client.0.vm00.stdout:9/312: symlink d0/d5/d16/d1e/l6e 0 2026-03-10T12:37:48.412 INFO:tasks.workunit.client.0.vm00.stdout:9/313: creat d0/d3d/d59/d4e/f6f x:0 0 0 2026-03-10T12:37:48.412 INFO:tasks.workunit.client.0.vm00.stdout:9/314: creat d0/d3d/d59/d4e/f70 x:0 0 0 2026-03-10T12:37:48.418 INFO:tasks.workunit.client.0.vm00.stdout:8/245: sync 2026-03-10T12:37:48.420 INFO:tasks.workunit.client.0.vm00.stdout:4/274: dread df/d1f/d22/d26/f39 [0,4194304] 0 2026-03-10T12:37:48.432 INFO:tasks.workunit.client.0.vm00.stdout:4/275: mkdir df/d57 0 2026-03-10T12:37:48.432 INFO:tasks.workunit.client.0.vm00.stdout:4/276: creat df/d32/f58 x:0 0 0 2026-03-10T12:37:48.432 INFO:tasks.workunit.client.0.vm00.stdout:4/277: dread df/d1f/d22/d26/f39 [0,4194304] 0 2026-03-10T12:37:48.432 INFO:tasks.workunit.client.0.vm00.stdout:4/278: dwrite df/f42 [0,4194304] 0 2026-03-10T12:37:48.434 INFO:tasks.workunit.client.0.vm00.stdout:4/279: fdatasync df/d1f/d36/d3a/d41/f47 0 2026-03-10T12:37:48.436 INFO:tasks.workunit.client.0.vm00.stdout:4/280: symlink df/d1f/l59 0 2026-03-10T12:37:48.437 INFO:tasks.workunit.client.0.vm00.stdout:4/281: chown df/d32/l4a 6982 1 2026-03-10T12:37:48.452 INFO:tasks.workunit.client.1.vm07.stdout:6/334: sync 2026-03-10T12:37:48.457 INFO:tasks.workunit.client.1.vm07.stdout:6/335: dwrite d1/d4/d44/f45 [4194304,4194304] 0 2026-03-10T12:37:48.462 INFO:tasks.workunit.client.1.vm07.stdout:6/336: link d1/d4/d6/d16/d49/f67 d1/d4/d6/d53/d66/f68 0 2026-03-10T12:37:48.463 INFO:tasks.workunit.client.1.vm07.stdout:6/337: mknod d1/d4/d6/c69 0 2026-03-10T12:37:48.489 INFO:tasks.workunit.client.0.vm00.stdout:2/255: write d4/d6/f22 [1984998,24404] 0 2026-03-10T12:37:48.491 
INFO:tasks.workunit.client.0.vm00.stdout:2/256: creat d4/d53/f5d x:0 0 0 2026-03-10T12:37:48.499 INFO:tasks.workunit.client.1.vm07.stdout:5/417: dread d0/d22/d18/d19/d21/f42 [4194304,4194304] 0 2026-03-10T12:37:48.502 INFO:tasks.workunit.client.1.vm07.stdout:5/418: dwrite d0/d22/d18/d19/d2e/f59 [0,4194304] 0 2026-03-10T12:37:48.511 INFO:tasks.workunit.client.1.vm07.stdout:5/419: link d0/d22/d18/d19/d2e/l49 d0/d22/l92 0 2026-03-10T12:37:48.512 INFO:tasks.workunit.client.0.vm00.stdout:3/342: sync 2026-03-10T12:37:48.512 INFO:tasks.workunit.client.0.vm00.stdout:1/319: sync 2026-03-10T12:37:48.512 INFO:tasks.workunit.client.0.vm00.stdout:9/315: sync 2026-03-10T12:37:48.512 INFO:tasks.workunit.client.0.vm00.stdout:4/282: sync 2026-03-10T12:37:48.513 INFO:tasks.workunit.client.0.vm00.stdout:9/316: stat d0/d3d/d43/d53/d57/f4f 0 2026-03-10T12:37:48.519 INFO:tasks.workunit.client.0.vm00.stdout:4/283: readlink df/l18 0 2026-03-10T12:37:48.526 INFO:tasks.workunit.client.0.vm00.stdout:1/320: dwrite da/d12/f30 [0,4194304] 0 2026-03-10T12:37:48.533 INFO:tasks.workunit.client.1.vm07.stdout:5/420: getdents d0/d22/d18/d19/d36 0 2026-03-10T12:37:48.541 INFO:tasks.workunit.client.1.vm07.stdout:5/421: creat d0/d22/f93 x:0 0 0 2026-03-10T12:37:48.546 INFO:tasks.workunit.client.1.vm07.stdout:5/422: creat d0/d22/d18/d19/d2e/d67/f94 x:0 0 0 2026-03-10T12:37:48.547 INFO:tasks.workunit.client.0.vm00.stdout:9/317: sync 2026-03-10T12:37:48.554 INFO:tasks.workunit.client.0.vm00.stdout:9/318: symlink d0/d5/d16/l71 0 2026-03-10T12:37:48.554 INFO:tasks.workunit.client.0.vm00.stdout:9/319: stat d0/d5/d16/d1e 0 2026-03-10T12:37:48.559 INFO:tasks.workunit.client.0.vm00.stdout:9/320: link d0/d3d/d43/c4d d0/d5/c72 0 2026-03-10T12:37:48.563 INFO:tasks.workunit.client.0.vm00.stdout:9/321: link d0/d3d/c63 d0/d5/d16/c73 0 2026-03-10T12:37:48.564 INFO:tasks.workunit.client.1.vm07.stdout:5/423: getdents d0/d22/d18/d19/d21/d3a 0 2026-03-10T12:37:48.582 INFO:tasks.workunit.client.1.vm07.stdout:6/338: dread 
d1/f26 [0,4194304] 0 2026-03-10T12:37:48.592 INFO:tasks.workunit.client.0.vm00.stdout:9/322: sync 2026-03-10T12:37:48.595 INFO:tasks.workunit.client.0.vm00.stdout:9/323: mkdir d0/d3d/d59/d74 0 2026-03-10T12:37:48.596 INFO:tasks.workunit.client.0.vm00.stdout:9/324: write d0/d5/d16/d1e/d2b/f5f [220886,19839] 0 2026-03-10T12:37:48.611 INFO:tasks.workunit.client.0.vm00.stdout:9/325: sync 2026-03-10T12:37:48.615 INFO:tasks.workunit.client.0.vm00.stdout:9/326: dwrite d0/d5/d16/d1e/d27/f52 [0,4194304] 0 2026-03-10T12:37:48.621 INFO:tasks.workunit.client.0.vm00.stdout:9/327: write d0/d5/d16/f24 [5087958,67624] 0 2026-03-10T12:37:48.622 INFO:tasks.workunit.client.0.vm00.stdout:9/328: truncate d0/f17 4780046 0 2026-03-10T12:37:48.622 INFO:tasks.workunit.client.0.vm00.stdout:9/329: write d0/d5/d16/f34 [3946318,106760] 0 2026-03-10T12:37:48.629 INFO:tasks.workunit.client.0.vm00.stdout:9/330: creat d0/d5/d16/d1e/d27/f75 x:0 0 0 2026-03-10T12:37:48.639 INFO:tasks.workunit.client.0.vm00.stdout:5/278: write d1f/f46 [1104331,128168] 0 2026-03-10T12:37:48.641 INFO:tasks.workunit.client.0.vm00.stdout:9/331: dwrite d0/d5/d16/d1e/d2b/f47 [0,4194304] 0 2026-03-10T12:37:48.643 INFO:tasks.workunit.client.0.vm00.stdout:9/332: write d0/d5/d16/d1e/d2b/f5f [1234719,95152] 0 2026-03-10T12:37:48.654 INFO:tasks.workunit.client.0.vm00.stdout:9/333: dread d0/d5/d16/d19/f32 [0,4194304] 0 2026-03-10T12:37:48.658 INFO:tasks.workunit.client.0.vm00.stdout:5/279: mknod d1f/d26/d2b/d35/d53/c60 0 2026-03-10T12:37:48.659 INFO:tasks.workunit.client.0.vm00.stdout:5/280: fdatasync d1f/d26/d2e/f3a 0 2026-03-10T12:37:48.662 INFO:tasks.workunit.client.0.vm00.stdout:5/281: dwrite d1f/f46 [0,4194304] 0 2026-03-10T12:37:48.665 INFO:tasks.workunit.client.0.vm00.stdout:3/343: truncate dd/d4e/d5d/f6e 3083181 0 2026-03-10T12:37:48.668 INFO:tasks.workunit.client.0.vm00.stdout:2/257: dwrite d4/dd/f17 [0,4194304] 0 2026-03-10T12:37:48.668 INFO:tasks.workunit.client.0.vm00.stdout:5/282: dread d1f/f25 [4194304,4194304] 0 
2026-03-10T12:37:48.682 INFO:tasks.workunit.client.0.vm00.stdout:3/344: creat dd/d64/f7b x:0 0 0 2026-03-10T12:37:48.691 INFO:tasks.workunit.client.1.vm07.stdout:9/386: write d5/f45 [10128538,56282] 0 2026-03-10T12:37:48.691 INFO:tasks.workunit.client.0.vm00.stdout:3/345: write dd/d3d/f50 [924365,121456] 0 2026-03-10T12:37:48.691 INFO:tasks.workunit.client.0.vm00.stdout:4/284: chown df/d1f/d22/d26/d2e/f50 103393397 1 2026-03-10T12:37:48.691 INFO:tasks.workunit.client.0.vm00.stdout:4/285: fsync df/f4e 0 2026-03-10T12:37:48.691 INFO:tasks.workunit.client.0.vm00.stdout:2/258: creat d4/d6/d2d/d31/d32/d40/f5e x:0 0 0 2026-03-10T12:37:48.691 INFO:tasks.workunit.client.0.vm00.stdout:4/286: write f8 [1302793,55725] 0 2026-03-10T12:37:48.692 INFO:tasks.workunit.client.0.vm00.stdout:2/259: truncate d4/d6/f30 2086864 0 2026-03-10T12:37:48.693 INFO:tasks.workunit.client.0.vm00.stdout:9/334: getdents d0/d5/d16/d19 0 2026-03-10T12:37:48.693 INFO:tasks.workunit.client.0.vm00.stdout:9/335: write d0/d5/d16/d1e/d2b/f36 [934041,19385] 0 2026-03-10T12:37:48.694 INFO:tasks.workunit.client.0.vm00.stdout:7/248: symlink da/d41/d48/l59 0 2026-03-10T12:37:48.697 INFO:tasks.workunit.client.0.vm00.stdout:9/336: dwrite d0/d3d/d59/f45 [0,4194304] 0 2026-03-10T12:37:48.699 INFO:tasks.workunit.client.1.vm07.stdout:1/374: dwrite d9/df/d29/d2b/f32 [4194304,4194304] 0 2026-03-10T12:37:48.700 INFO:tasks.workunit.client.0.vm00.stdout:2/260: symlink d4/d6/d2d/l5f 0 2026-03-10T12:37:48.701 INFO:tasks.workunit.client.0.vm00.stdout:2/261: fdatasync d4/dd/d38/f58 0 2026-03-10T12:37:48.706 INFO:tasks.workunit.client.0.vm00.stdout:2/262: dwrite d4/f1d [0,4194304] 0 2026-03-10T12:37:48.710 INFO:tasks.workunit.client.0.vm00.stdout:2/263: write d4/d6/f4e [759526,71592] 0 2026-03-10T12:37:48.717 INFO:tasks.workunit.client.1.vm07.stdout:4/497: write d0/d4/d7a/d46/f85 [30517,48662] 0 2026-03-10T12:37:48.717 INFO:tasks.workunit.client.0.vm00.stdout:0/360: write d3/d7/d3c/f72 [160937,127752] 0 
2026-03-10T12:37:48.718 INFO:tasks.workunit.client.1.vm07.stdout:4/498: chown d0/d4/d10/d8d/fab 618104 1 2026-03-10T12:37:48.722 INFO:tasks.workunit.client.1.vm07.stdout:9/387: fsync d5/d13/d57/d3e/f53 0 2026-03-10T12:37:48.723 INFO:tasks.workunit.client.0.vm00.stdout:7/249: dwrite da/f10 [0,4194304] 0 2026-03-10T12:37:48.725 INFO:tasks.workunit.client.0.vm00.stdout:9/337: symlink d0/d3d/d59/d4e/l76 0 2026-03-10T12:37:48.727 INFO:tasks.workunit.client.0.vm00.stdout:4/287: creat df/d1f/d22/f5a x:0 0 0 2026-03-10T12:37:48.729 INFO:tasks.workunit.client.0.vm00.stdout:9/338: rmdir d0/d3d/d43/d53/d57 39 2026-03-10T12:37:48.729 INFO:tasks.workunit.client.1.vm07.stdout:4/499: truncate d0/d4/d10/d5f/d6d/f71 5068714 0 2026-03-10T12:37:48.729 INFO:tasks.workunit.client.1.vm07.stdout:7/361: write d0/f4f [85444,14580] 0 2026-03-10T12:37:48.731 INFO:tasks.workunit.client.0.vm00.stdout:2/264: creat d4/d6/d2d/d3a/d43/d51/f60 x:0 0 0 2026-03-10T12:37:48.731 INFO:tasks.workunit.client.0.vm00.stdout:2/265: write d4/dd/f45 [769201,115093] 0 2026-03-10T12:37:48.733 INFO:tasks.workunit.client.0.vm00.stdout:2/266: read - d4/f39 zero size 2026-03-10T12:37:48.735 INFO:tasks.workunit.client.0.vm00.stdout:4/288: write df/d1f/d36/d3a/f44 [824583,55893] 0 2026-03-10T12:37:48.736 INFO:tasks.workunit.client.0.vm00.stdout:4/289: fsync df/d1f/d36/f51 0 2026-03-10T12:37:48.736 INFO:tasks.workunit.client.0.vm00.stdout:2/267: dwrite d4/dd/f45 [0,4194304] 0 2026-03-10T12:37:48.739 INFO:tasks.workunit.client.0.vm00.stdout:7/250: getdents da/d1b/d2d 0 2026-03-10T12:37:48.740 INFO:tasks.workunit.client.0.vm00.stdout:7/251: chown f9 920345198 1 2026-03-10T12:37:48.740 INFO:tasks.workunit.client.0.vm00.stdout:9/339: getdents d0/d3d/d59/d74 0 2026-03-10T12:37:48.741 INFO:tasks.workunit.client.1.vm07.stdout:8/403: write d1/d3/d18/f75 [507742,120715] 0 2026-03-10T12:37:48.741 INFO:tasks.workunit.client.0.vm00.stdout:4/290: rmdir df/d32 39 2026-03-10T12:37:48.742 
INFO:tasks.workunit.client.1.vm07.stdout:2/307: write d0/d42/f1b [915199,64066] 0 2026-03-10T12:37:48.743 INFO:tasks.workunit.client.0.vm00.stdout:2/268: dread d4/dd/f17 [0,4194304] 0 2026-03-10T12:37:48.748 INFO:tasks.workunit.client.0.vm00.stdout:9/340: symlink d0/l77 0 2026-03-10T12:37:48.749 INFO:tasks.workunit.client.0.vm00.stdout:4/291: mknod df/d1f/d22/c5b 0 2026-03-10T12:37:48.751 INFO:tasks.workunit.client.0.vm00.stdout:9/341: mkdir d0/d3d/d78 0 2026-03-10T12:37:48.757 INFO:tasks.workunit.client.1.vm07.stdout:7/362: dread d0/f42 [0,4194304] 0 2026-03-10T12:37:48.757 INFO:tasks.workunit.client.0.vm00.stdout:9/342: write d0/d5/d16/d1e/d2b/f5f [1179349,76231] 0 2026-03-10T12:37:48.757 INFO:tasks.workunit.client.0.vm00.stdout:4/292: mkdir df/d32/d5c 0 2026-03-10T12:37:48.758 INFO:tasks.workunit.client.0.vm00.stdout:7/252: creat da/d25/f5a x:0 0 0 2026-03-10T12:37:48.758 INFO:tasks.workunit.client.0.vm00.stdout:7/253: dread da/f35 [0,4194304] 0 2026-03-10T12:37:48.758 INFO:tasks.workunit.client.0.vm00.stdout:9/343: truncate d0/d5/dc/f41 178844 0 2026-03-10T12:37:48.758 INFO:tasks.workunit.client.0.vm00.stdout:7/254: readlink da/d1b/d2d/l3b 0 2026-03-10T12:37:48.758 INFO:tasks.workunit.client.0.vm00.stdout:4/293: mknod df/d32/c5d 0 2026-03-10T12:37:48.758 INFO:tasks.workunit.client.0.vm00.stdout:4/294: stat df/d1f/d36/d3a/f44 0 2026-03-10T12:37:48.762 INFO:tasks.workunit.client.0.vm00.stdout:4/295: dwrite df/d1f/d36/f51 [0,4194304] 0 2026-03-10T12:37:48.764 INFO:tasks.workunit.client.1.vm07.stdout:9/388: creat d5/d13/d57/d4f/d6a/f8a x:0 0 0 2026-03-10T12:37:48.765 INFO:tasks.workunit.client.1.vm07.stdout:9/389: write d5/d1f/f3d [403889,36173] 0 2026-03-10T12:37:48.766 INFO:tasks.workunit.client.1.vm07.stdout:9/390: readlink d5/d16/d18/l4d 0 2026-03-10T12:37:48.768 INFO:tasks.workunit.client.0.vm00.stdout:2/269: sync 2026-03-10T12:37:48.769 INFO:tasks.workunit.client.0.vm00.stdout:9/344: creat d0/d3d/d43/d53/f79 x:0 0 0 2026-03-10T12:37:48.770 
INFO:tasks.workunit.client.0.vm00.stdout:9/345: chown d0/d5/dc/c29 4599404 1 2026-03-10T12:37:48.770 INFO:tasks.workunit.client.0.vm00.stdout:4/296: dread - df/d1f/f4d zero size 2026-03-10T12:37:48.774 INFO:tasks.workunit.client.0.vm00.stdout:6/285: getdents d2/d51 0 2026-03-10T12:37:48.775 INFO:tasks.workunit.client.0.vm00.stdout:7/255: symlink da/d25/l5b 0 2026-03-10T12:37:48.777 INFO:tasks.workunit.client.0.vm00.stdout:2/270: dread d4/d6/f34 [0,4194304] 0 2026-03-10T12:37:48.780 INFO:tasks.workunit.client.0.vm00.stdout:2/271: creat d4/d53/f61 x:0 0 0 2026-03-10T12:37:48.781 INFO:tasks.workunit.client.0.vm00.stdout:4/297: dread df/f11 [0,4194304] 0 2026-03-10T12:37:48.781 INFO:tasks.workunit.client.0.vm00.stdout:4/298: chown df/d32 19098 1 2026-03-10T12:37:48.783 INFO:tasks.workunit.client.0.vm00.stdout:6/286: creat d2/d39/f6c x:0 0 0 2026-03-10T12:37:48.783 INFO:tasks.workunit.client.0.vm00.stdout:1/321: dread da/d12/f1d [0,4194304] 0 2026-03-10T12:37:48.785 INFO:tasks.workunit.client.0.vm00.stdout:4/299: creat df/d1f/d36/d3a/d41/f5e x:0 0 0 2026-03-10T12:37:48.787 INFO:tasks.workunit.client.0.vm00.stdout:4/300: rename df/d32/d5c to df/d32/d5c/d5f 22 2026-03-10T12:37:48.787 INFO:tasks.workunit.client.0.vm00.stdout:2/272: creat d4/dd/f62 x:0 0 0 2026-03-10T12:37:48.788 INFO:tasks.workunit.client.0.vm00.stdout:1/322: dwrite f5 [0,4194304] 0 2026-03-10T12:37:48.788 INFO:tasks.workunit.client.0.vm00.stdout:4/301: chown df/d1f/l59 93 1 2026-03-10T12:37:48.790 INFO:tasks.workunit.client.0.vm00.stdout:4/302: rename df/d1f to df/d1f/d60 22 2026-03-10T12:37:48.791 INFO:tasks.workunit.client.0.vm00.stdout:7/256: creat da/d1b/d40/f5c x:0 0 0 2026-03-10T12:37:48.795 INFO:tasks.workunit.client.1.vm07.stdout:7/363: mknod d0/d47/d48/c72 0 2026-03-10T12:37:48.796 INFO:tasks.workunit.client.0.vm00.stdout:2/273: dwrite d4/d6/d2d/d3a/d43/d51/f60 [0,4194304] 0 2026-03-10T12:37:48.800 INFO:tasks.workunit.client.0.vm00.stdout:6/287: creat d2/d16/f6d x:0 0 0 2026-03-10T12:37:48.800 
INFO:tasks.workunit.client.0.vm00.stdout:2/274: chown d4/d53/l55 55962825 1 2026-03-10T12:37:48.802 INFO:tasks.workunit.client.1.vm07.stdout:7/364: dwrite d0/fc [0,4194304] 0 2026-03-10T12:37:48.804 INFO:tasks.workunit.client.0.vm00.stdout:1/323: sync 2026-03-10T12:37:48.804 INFO:tasks.workunit.client.1.vm07.stdout:7/365: chown d0/f70 59735 1 2026-03-10T12:37:48.808 INFO:tasks.workunit.client.0.vm00.stdout:2/275: dwrite d4/f1d [0,4194304] 0 2026-03-10T12:37:48.812 INFO:tasks.workunit.client.0.vm00.stdout:2/276: write d4/dd/d38/f3f [275222,15508] 0 2026-03-10T12:37:48.814 INFO:tasks.workunit.client.0.vm00.stdout:2/277: truncate d4/d6/d2d/d31/d32/d40/f5e 40784 0 2026-03-10T12:37:48.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:48 vm07.local ceph-mon[58582]: pgmap v159: 65 pgs: 65 active+clean; 1.5 GiB data, 5.5 GiB used, 114 GiB / 120 GiB avail; 30 MiB/s rd, 145 MiB/s wr, 272 op/s 2026-03-10T12:37:48.816 INFO:tasks.workunit.client.0.vm00.stdout:2/278: write d4/dd/d38/f3f [1308053,108012] 0 2026-03-10T12:37:48.816 INFO:tasks.workunit.client.1.vm07.stdout:5/424: write d0/f9 [4355468,93142] 0 2026-03-10T12:37:48.818 INFO:tasks.workunit.client.1.vm07.stdout:7/366: dwrite d0/f10 [0,4194304] 0 2026-03-10T12:37:48.818 INFO:tasks.workunit.client.1.vm07.stdout:5/425: chown d0/d22/d18/d19/d2e/d3f/c7e 1 1 2026-03-10T12:37:48.820 INFO:tasks.workunit.client.1.vm07.stdout:7/367: write d0/f37 [157305,66785] 0 2026-03-10T12:37:48.821 INFO:tasks.workunit.client.0.vm00.stdout:7/257: symlink da/d41/d48/l5d 0 2026-03-10T12:37:48.824 INFO:tasks.workunit.client.1.vm07.stdout:0/440: rename d0/d62 to d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c 0 2026-03-10T12:37:48.825 INFO:tasks.workunit.client.1.vm07.stdout:2/308: mknod d0/d42/d26/c69 0 2026-03-10T12:37:48.826 INFO:tasks.workunit.client.0.vm00.stdout:1/324: symlink da/d21/d27/l6f 0 2026-03-10T12:37:48.826 INFO:tasks.workunit.client.1.vm07.stdout:0/441: chown d0/d14/d5f/d3b/f6c 1841319 1 2026-03-10T12:37:48.833 
INFO:tasks.workunit.client.0.vm00.stdout:2/279: sync 2026-03-10T12:37:48.835 INFO:tasks.workunit.client.1.vm07.stdout:2/309: dwrite d0/d42/d1f/d20/f3f [0,4194304] 0 2026-03-10T12:37:48.838 INFO:tasks.workunit.client.0.vm00.stdout:4/303: rename df/d32/c5d to df/d1f/d22/d26/c61 0 2026-03-10T12:37:48.839 INFO:tasks.workunit.client.0.vm00.stdout:2/280: chown d4/d6/d2d/l5f 235560 1 2026-03-10T12:37:48.839 INFO:tasks.workunit.client.0.vm00.stdout:4/304: chown df/d1f/d36/d3a 145230 1 2026-03-10T12:37:48.843 INFO:tasks.workunit.client.0.vm00.stdout:4/305: mknod df/d32/c62 0 2026-03-10T12:37:48.843 INFO:tasks.workunit.client.0.vm00.stdout:2/281: dwrite d4/d6/f16 [8388608,4194304] 0 2026-03-10T12:37:48.850 INFO:tasks.workunit.client.0.vm00.stdout:2/282: rmdir d4/d6/d41 39 2026-03-10T12:37:48.853 INFO:tasks.workunit.client.0.vm00.stdout:2/283: unlink d4/dd/d38/f58 0 2026-03-10T12:37:48.854 INFO:tasks.workunit.client.0.vm00.stdout:4/306: getdents df/d1f/d36 0 2026-03-10T12:37:48.861 INFO:tasks.workunit.client.0.vm00.stdout:2/284: mkdir d4/dd/d63 0 2026-03-10T12:37:48.861 INFO:tasks.workunit.client.0.vm00.stdout:4/307: dread df/d1f/d36/f51 [0,4194304] 0 2026-03-10T12:37:48.863 INFO:tasks.workunit.client.0.vm00.stdout:4/308: rename df/d32/d5c to df/d63 0 2026-03-10T12:37:48.864 INFO:tasks.workunit.client.0.vm00.stdout:4/309: mkdir df/d32/d64 0 2026-03-10T12:37:48.871 INFO:tasks.workunit.client.1.vm07.stdout:8/404: creat d1/d3/d6/f81 x:0 0 0 2026-03-10T12:37:48.871 INFO:tasks.workunit.client.1.vm07.stdout:5/426: rename d0/f13 to d0/d22/d18/f95 0 2026-03-10T12:37:48.872 INFO:tasks.workunit.client.0.vm00.stdout:4/310: mkdir df/d1f/d22/d26/d65 0 2026-03-10T12:37:48.872 INFO:tasks.workunit.client.1.vm07.stdout:0/442: stat d0/d14/d5f/d76/d2f/d31/d4f/f70 0 2026-03-10T12:37:48.874 INFO:tasks.workunit.client.0.vm00.stdout:4/311: dwrite df/f12 [0,4194304] 0 2026-03-10T12:37:48.878 INFO:tasks.workunit.client.1.vm07.stdout:0/443: rmdir d0/d14 39 2026-03-10T12:37:48.879 
INFO:tasks.workunit.client.1.vm07.stdout:5/427: chown d0/d22/d18/d30/l69 153970 1 2026-03-10T12:37:48.882 INFO:tasks.workunit.client.0.vm00.stdout:7/258: dread da/d41/f4b [0,4194304] 0 2026-03-10T12:37:48.883 INFO:tasks.workunit.client.1.vm07.stdout:5/428: mknod d0/d22/d18/d19/d21/d54/c96 0 2026-03-10T12:37:48.884 INFO:tasks.workunit.client.0.vm00.stdout:7/259: creat da/d25/d2e/f5e x:0 0 0 2026-03-10T12:37:48.884 INFO:tasks.workunit.client.1.vm07.stdout:0/444: stat d0/d14/d5f/d76/d2f/d31/d4f/f70 0 2026-03-10T12:37:48.885 INFO:tasks.workunit.client.1.vm07.stdout:0/445: readlink d0/d14/l17 0 2026-03-10T12:37:48.885 INFO:tasks.workunit.client.1.vm07.stdout:5/429: creat d0/d22/d18/f97 x:0 0 0 2026-03-10T12:37:48.889 INFO:tasks.workunit.client.0.vm00.stdout:7/260: rename da/d1b/d40/l42 to da/d25/d2c/d46/l5f 0 2026-03-10T12:37:48.895 INFO:tasks.workunit.client.1.vm07.stdout:0/446: mkdir d0/d83/d8d 0 2026-03-10T12:37:48.903 INFO:tasks.workunit.client.0.vm00.stdout:7/261: rmdir da/d41 39 2026-03-10T12:37:48.908 INFO:tasks.workunit.client.1.vm07.stdout:0/447: write d0/d14/d5f/d76/d2f/d31/f6f [1484983,105434] 0 2026-03-10T12:37:48.912 INFO:tasks.workunit.client.1.vm07.stdout:5/430: dread d0/d22/d18/d30/f33 [0,4194304] 0 2026-03-10T12:37:48.921 INFO:tasks.workunit.client.1.vm07.stdout:5/431: mknod d0/d22/d18/d19/d2e/d3f/c98 0 2026-03-10T12:37:48.921 INFO:tasks.workunit.client.0.vm00.stdout:0/361: fsync d3/d7/d3c/f72 0 2026-03-10T12:37:48.921 INFO:tasks.workunit.client.0.vm00.stdout:0/362: chown d3/d22/f71 73 1 2026-03-10T12:37:48.921 INFO:tasks.workunit.client.0.vm00.stdout:0/363: symlink d3/db/l80 0 2026-03-10T12:37:48.921 INFO:tasks.workunit.client.0.vm00.stdout:0/364: creat d3/d7/d4c/d5b/d38/f81 x:0 0 0 2026-03-10T12:37:48.921 INFO:tasks.workunit.client.0.vm00.stdout:0/365: unlink d3/d33/f64 0 2026-03-10T12:37:48.926 INFO:tasks.workunit.client.0.vm00.stdout:0/366: getdents d3/d7/d3c 0 2026-03-10T12:37:48.927 INFO:tasks.workunit.client.0.vm00.stdout:0/367: mkdir 
d3/db/d77/d82 0 2026-03-10T12:37:48.929 INFO:tasks.workunit.client.0.vm00.stdout:0/368: creat d3/d22/f83 x:0 0 0 2026-03-10T12:37:48.933 INFO:tasks.workunit.client.0.vm00.stdout:0/369: getdents d3/d7/d4c 0 2026-03-10T12:37:48.936 INFO:tasks.workunit.client.0.vm00.stdout:0/370: link d3/l28 d3/db/d24/d25/l84 0 2026-03-10T12:37:48.936 INFO:tasks.workunit.client.0.vm00.stdout:0/371: readlink d3/l9 0 2026-03-10T12:37:48.938 INFO:tasks.workunit.client.1.vm07.stdout:9/391: dread d5/f45 [0,4194304] 0 2026-03-10T12:37:48.939 INFO:tasks.workunit.client.0.vm00.stdout:0/372: dread d3/d7/f31 [0,4194304] 0 2026-03-10T12:37:48.940 INFO:tasks.workunit.client.0.vm00.stdout:0/373: chown f2 32280 1 2026-03-10T12:37:48.942 INFO:tasks.workunit.client.0.vm00.stdout:0/374: mknod d3/db/d24/d25/c85 0 2026-03-10T12:37:48.943 INFO:tasks.workunit.client.1.vm07.stdout:9/392: fdatasync d5/d13/d22/f39 0 2026-03-10T12:37:48.947 INFO:tasks.workunit.client.0.vm00.stdout:0/375: creat d3/d7/d4c/d5b/d38/d44/d5a/f86 x:0 0 0 2026-03-10T12:37:48.950 INFO:tasks.workunit.client.0.vm00.stdout:0/376: mknod d3/d40/d65/c87 0 2026-03-10T12:37:48.951 INFO:tasks.workunit.client.0.vm00.stdout:0/377: truncate d3/d40/f59 782984 0 2026-03-10T12:37:48.953 INFO:tasks.workunit.client.1.vm07.stdout:9/393: mknod d5/d13/d57/d4f/c8b 0 2026-03-10T12:37:48.953 INFO:tasks.workunit.client.1.vm07.stdout:5/432: dread d0/d22/d18/d19/d21/d54/f8a [0,4194304] 0 2026-03-10T12:37:48.954 INFO:tasks.workunit.client.0.vm00.stdout:0/378: write d3/d7/d4c/d5b/d38/f81 [876467,95468] 0 2026-03-10T12:37:48.954 INFO:tasks.workunit.client.1.vm07.stdout:9/394: chown d5/d13/d57/d3e/f53 473355638 1 2026-03-10T12:37:48.961 INFO:tasks.workunit.client.0.vm00.stdout:0/379: creat d3/d7/d4c/d5b/f88 x:0 0 0 2026-03-10T12:37:48.961 INFO:tasks.workunit.client.0.vm00.stdout:0/380: dwrite d3/d7/d4c/f73 [0,4194304] 0 2026-03-10T12:37:48.961 INFO:tasks.workunit.client.1.vm07.stdout:9/395: dread d5/d13/d2c/f44 [0,4194304] 0 2026-03-10T12:37:48.984 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:48 vm00.local ceph-mon[50686]: pgmap v159: 65 pgs: 65 active+clean; 1.5 GiB data, 5.5 GiB used, 114 GiB / 120 GiB avail; 30 MiB/s rd, 145 MiB/s wr, 272 op/s 2026-03-10T12:37:48.993 INFO:tasks.workunit.client.0.vm00.stdout:0/381: rename d3/db/d24/d25/f43 to d3/d7/d4c/d5b/d38/f89 0 2026-03-10T12:37:48.994 INFO:tasks.workunit.client.0.vm00.stdout:0/382: chown d3/d7/f11 2 1 2026-03-10T12:37:48.998 INFO:tasks.workunit.client.1.vm07.stdout:9/396: symlink d5/d13/d22/l8c 0 2026-03-10T12:37:49.005 INFO:tasks.workunit.client.1.vm07.stdout:9/397: fsync d5/d1f/d31/f56 0 2026-03-10T12:37:49.008 INFO:tasks.workunit.client.1.vm07.stdout:9/398: mkdir d5/d1f/d5e/d8d 0 2026-03-10T12:37:49.015 INFO:tasks.workunit.client.1.vm07.stdout:9/399: rename d5/d13/d22/f83 to d5/d13/d57/d4f/d6a/f8e 0 2026-03-10T12:37:49.015 INFO:tasks.workunit.client.1.vm07.stdout:9/400: creat d5/d16/f8f x:0 0 0 2026-03-10T12:37:49.015 INFO:tasks.workunit.client.1.vm07.stdout:9/401: symlink d5/d1f/d31/d74/l90 2 2026-03-10T12:37:49.015 INFO:tasks.workunit.client.1.vm07.stdout:9/402: write d5/d16/d23/d26/f5c [1573732,64885] 0 2026-03-10T12:37:49.015 INFO:tasks.workunit.client.1.vm07.stdout:9/403: chown d5/d13/d57/l79 620607 1 2026-03-10T12:37:49.032 INFO:tasks.workunit.client.1.vm07.stdout:9/404: creat d5/f91 x:0 0 0 2026-03-10T12:37:49.039 INFO:tasks.workunit.client.1.vm07.stdout:9/405: mknod d5/d16/d23/d26/c92 0 2026-03-10T12:37:49.040 INFO:tasks.workunit.client.1.vm07.stdout:4/500: dread d0/d4/d10/d3c/f68 [0,4194304] 0 2026-03-10T12:37:49.040 INFO:tasks.workunit.client.1.vm07.stdout:9/406: dread - d5/d13/d57/d4f/f88 zero size 2026-03-10T12:37:49.043 INFO:tasks.workunit.client.0.vm00.stdout:9/346: dread d0/d5/d16/f39 [4194304,4194304] 0 2026-03-10T12:37:49.048 INFO:tasks.workunit.client.1.vm07.stdout:4/501: creat d0/d5c/fad x:0 0 0 2026-03-10T12:37:49.050 INFO:tasks.workunit.client.1.vm07.stdout:9/407: mkdir d5/d69/d93 0 2026-03-10T12:37:49.051 
INFO:tasks.workunit.client.1.vm07.stdout:3/415: dread dc/dd/d28/d7a/f88 [0,4194304] 0 2026-03-10T12:37:49.053 INFO:tasks.workunit.client.1.vm07.stdout:4/502: chown d0/d4/d7a/f27 327856326 1 2026-03-10T12:37:49.054 INFO:tasks.workunit.client.0.vm00.stdout:8/246: dread d0/f28 [0,4194304] 0 2026-03-10T12:37:49.054 INFO:tasks.workunit.client.0.vm00.stdout:8/247: stat d0 0 2026-03-10T12:37:49.054 INFO:tasks.workunit.client.1.vm07.stdout:4/503: write d0/d4/d5/d34/f94 [622356,7775] 0 2026-03-10T12:37:49.055 INFO:tasks.workunit.client.0.vm00.stdout:8/248: symlink d0/d12/d36/d3e/l4b 0 2026-03-10T12:37:49.056 INFO:tasks.workunit.client.0.vm00.stdout:8/249: readlink d0/lc 0 2026-03-10T12:37:49.056 INFO:tasks.workunit.client.0.vm00.stdout:8/250: dread - d0/d12/d43/f45 zero size 2026-03-10T12:37:49.057 INFO:tasks.workunit.client.0.vm00.stdout:5/283: fsync d1f/f46 0 2026-03-10T12:37:49.058 INFO:tasks.workunit.client.0.vm00.stdout:5/284: read d1f/f21 [7998797,130557] 0 2026-03-10T12:37:49.059 INFO:tasks.workunit.client.0.vm00.stdout:8/251: creat d0/d12/d17/d48/f4c x:0 0 0 2026-03-10T12:37:49.060 INFO:tasks.workunit.client.0.vm00.stdout:5/285: creat d1f/d26/d2b/d37/f61 x:0 0 0 2026-03-10T12:37:49.063 INFO:tasks.workunit.client.0.vm00.stdout:5/286: symlink d1f/d26/d2b/d37/l62 0 2026-03-10T12:37:49.070 INFO:tasks.workunit.client.1.vm07.stdout:9/408: readlink d5/l12 0 2026-03-10T12:37:49.070 INFO:tasks.workunit.client.1.vm07.stdout:3/416: creat dc/f94 x:0 0 0 2026-03-10T12:37:49.070 INFO:tasks.workunit.client.0.vm00.stdout:9/347: getdents d0/d3d/d59/d4e 0 2026-03-10T12:37:49.071 INFO:tasks.workunit.client.0.vm00.stdout:9/348: mknod d0/d5/d16/d1e/d27/c7a 0 2026-03-10T12:37:49.071 INFO:tasks.workunit.client.0.vm00.stdout:9/349: chown d0/d5/d16/f39 1675 1 2026-03-10T12:37:49.071 INFO:tasks.workunit.client.0.vm00.stdout:5/287: getdents d1f/d26/d2b 0 2026-03-10T12:37:49.071 INFO:tasks.workunit.client.0.vm00.stdout:5/288: chown c6 9 1 2026-03-10T12:37:49.071 
INFO:tasks.workunit.client.0.vm00.stdout:9/350: dwrite d0/d5/d16/d1e/d27/f52 [0,4194304] 0 2026-03-10T12:37:49.074 INFO:tasks.workunit.client.1.vm07.stdout:4/504: unlink d0/d4/d5/d34/l98 0 2026-03-10T12:37:49.075 INFO:tasks.workunit.client.1.vm07.stdout:3/417: mkdir dc/dd/d43/d76/d95 0 2026-03-10T12:37:49.076 INFO:tasks.workunit.client.1.vm07.stdout:9/409: creat d5/d13/d6c/d7a/f94 x:0 0 0 2026-03-10T12:37:49.076 INFO:tasks.workunit.client.1.vm07.stdout:9/410: read d5/f1c [563039,9765] 0 2026-03-10T12:37:49.077 INFO:tasks.workunit.client.1.vm07.stdout:4/505: chown d0/d4/d5/da/c52 169 1 2026-03-10T12:37:49.077 INFO:tasks.workunit.client.0.vm00.stdout:9/351: rmdir d0/d3d 39 2026-03-10T12:37:49.078 INFO:tasks.workunit.client.0.vm00.stdout:5/289: creat d1f/d26/d2e/d58/f63 x:0 0 0 2026-03-10T12:37:49.079 INFO:tasks.workunit.client.0.vm00.stdout:5/290: write d1f/d26/d2b/d35/f50 [742440,51148] 0 2026-03-10T12:37:49.081 INFO:tasks.workunit.client.0.vm00.stdout:5/291: mknod d1f/d26/d2b/d35/c64 0 2026-03-10T12:37:49.083 INFO:tasks.workunit.client.0.vm00.stdout:5/292: read d1f/d26/d2e/f3c [4053285,59116] 0 2026-03-10T12:37:49.084 INFO:tasks.workunit.client.0.vm00.stdout:5/293: creat d1f/d39/f65 x:0 0 0 2026-03-10T12:37:49.085 INFO:tasks.workunit.client.0.vm00.stdout:5/294: write d1f/d26/d2b/d37/f4c [709952,49103] 0 2026-03-10T12:37:49.085 INFO:tasks.workunit.client.0.vm00.stdout:5/295: chown d1f/d26/f28 415 1 2026-03-10T12:37:49.087 INFO:tasks.workunit.client.0.vm00.stdout:5/296: symlink d1f/d26/d2e/l66 0 2026-03-10T12:37:49.088 INFO:tasks.workunit.client.0.vm00.stdout:5/297: rmdir d1f/d26/d2b/d35 39 2026-03-10T12:37:49.089 INFO:tasks.workunit.client.0.vm00.stdout:5/298: mknod d1f/d39/c67 0 2026-03-10T12:37:49.092 INFO:tasks.workunit.client.0.vm00.stdout:5/299: dwrite d1f/f4a [0,4194304] 0 2026-03-10T12:37:49.092 INFO:tasks.workunit.client.1.vm07.stdout:4/506: fdatasync d0/d4/d5/da/f44 0 2026-03-10T12:37:49.092 INFO:tasks.workunit.client.1.vm07.stdout:9/411: truncate 
d5/d13/d2c/f44 31783 0 2026-03-10T12:37:49.093 INFO:tasks.workunit.client.0.vm00.stdout:5/300: dread - d1f/d26/d2b/f44 zero size 2026-03-10T12:37:49.094 INFO:tasks.workunit.client.0.vm00.stdout:5/301: write d1f/d26/d2e/f3a [302745,128075] 0 2026-03-10T12:37:49.096 INFO:tasks.workunit.client.0.vm00.stdout:5/302: rmdir d1f/d26/d2b/d35/d53 39 2026-03-10T12:37:49.096 INFO:tasks.workunit.client.1.vm07.stdout:9/412: fsync d5/d16/f35 0 2026-03-10T12:37:49.098 INFO:tasks.workunit.client.0.vm00.stdout:5/303: dread d1f/f4a [0,4194304] 0 2026-03-10T12:37:49.098 INFO:tasks.workunit.client.1.vm07.stdout:3/418: link dc/d18/d2d/d3d/f73 dc/dd/f96 0 2026-03-10T12:37:49.100 INFO:tasks.workunit.client.0.vm00.stdout:5/304: creat d1f/d26/d2b/d35/f68 x:0 0 0 2026-03-10T12:37:49.102 INFO:tasks.workunit.client.0.vm00.stdout:5/305: mknod d1f/d26/d2b/d35/c69 0 2026-03-10T12:37:49.117 INFO:tasks.workunit.client.1.vm07.stdout:4/507: dread d0/d4/d5/da/f4d [0,4194304] 0 2026-03-10T12:37:49.118 INFO:tasks.workunit.client.1.vm07.stdout:4/508: truncate d0/d4/d5/d34/fa3 3290 0 2026-03-10T12:37:49.130 INFO:tasks.workunit.client.1.vm07.stdout:4/509: read - d0/d19/f91 zero size 2026-03-10T12:37:49.134 INFO:tasks.workunit.client.1.vm07.stdout:4/510: read d0/d4/d7a/d46/f85 [61671,27785] 0 2026-03-10T12:37:49.136 INFO:tasks.workunit.client.1.vm07.stdout:9/413: dwrite d5/d13/f67 [0,4194304] 0 2026-03-10T12:37:49.138 INFO:tasks.workunit.client.1.vm07.stdout:4/511: dread d0/d4/d5/d34/fa3 [0,4194304] 0 2026-03-10T12:37:49.143 INFO:tasks.workunit.client.1.vm07.stdout:9/414: rename d5/d13/d22/f5f to d5/d13/d57/f95 0 2026-03-10T12:37:49.160 INFO:tasks.workunit.client.1.vm07.stdout:9/415: link d5/l12 d5/d1f/d31/l96 0 2026-03-10T12:37:49.160 INFO:tasks.workunit.client.0.vm00.stdout:2/285: getdents d4/d53 0 2026-03-10T12:37:49.161 INFO:tasks.workunit.client.1.vm07.stdout:9/416: dread d5/d13/d22/f32 [0,4194304] 0 2026-03-10T12:37:49.162 INFO:tasks.workunit.client.0.vm00.stdout:1/325: truncate da/d12/f30 1875363 0 
2026-03-10T12:37:49.163 INFO:tasks.workunit.client.0.vm00.stdout:2/286: symlink d4/d6/d2d/d3a/d43/d51/l64 0 2026-03-10T12:37:49.169 INFO:tasks.workunit.client.0.vm00.stdout:4/312: dwrite df/d1f/d22/d26/f39 [0,4194304] 0 2026-03-10T12:37:49.172 INFO:tasks.workunit.client.0.vm00.stdout:5/306: dread d1f/d26/d2b/f52 [0,4194304] 0 2026-03-10T12:37:49.173 INFO:tasks.workunit.client.0.vm00.stdout:4/313: rmdir df/d1f/d22 39 2026-03-10T12:37:49.176 INFO:tasks.workunit.client.0.vm00.stdout:5/307: rename d1f/d39/d54 to d1f/d6a 0 2026-03-10T12:37:49.177 INFO:tasks.workunit.client.1.vm07.stdout:9/417: mkdir d5/d69/d93/d97 0 2026-03-10T12:37:49.178 INFO:tasks.workunit.client.0.vm00.stdout:2/287: read d4/d6/d41/f4c [583858,37415] 0 2026-03-10T12:37:49.178 INFO:tasks.workunit.client.1.vm07.stdout:6/339: dread d1/d4/d6/d16/d1a/d2c/f59 [0,4194304] 0 2026-03-10T12:37:49.181 INFO:tasks.workunit.client.0.vm00.stdout:2/288: dwrite d4/dd/f3c [0,4194304] 0 2026-03-10T12:37:49.182 INFO:tasks.workunit.client.1.vm07.stdout:9/418: mknod d5/d13/d6c/d89/c98 0 2026-03-10T12:37:49.183 INFO:tasks.workunit.client.0.vm00.stdout:4/314: fdatasync df/d1f/d22/f4c 0 2026-03-10T12:37:49.183 INFO:tasks.workunit.client.1.vm07.stdout:9/419: fsync d5/d16/d23/d26/f86 0 2026-03-10T12:37:49.187 INFO:tasks.workunit.client.1.vm07.stdout:6/340: creat d1/d4/d6/d16/d1a/f6a x:0 0 0 2026-03-10T12:37:49.187 INFO:tasks.workunit.client.1.vm07.stdout:9/420: mknod d5/d69/c99 0 2026-03-10T12:37:49.190 INFO:tasks.workunit.client.0.vm00.stdout:2/289: creat d4/d6/d2d/d31/d32/d40/f65 x:0 0 0 2026-03-10T12:37:49.191 INFO:tasks.workunit.client.0.vm00.stdout:2/290: truncate d4/f28 4829269 0 2026-03-10T12:37:49.192 INFO:tasks.workunit.client.0.vm00.stdout:2/291: unlink d4/c23 0 2026-03-10T12:37:49.193 INFO:tasks.workunit.client.1.vm07.stdout:6/341: symlink d1/d4/d6/d4e/l6b 0 2026-03-10T12:37:49.194 INFO:tasks.workunit.client.0.vm00.stdout:4/315: mknod df/d57/c66 0 2026-03-10T12:37:49.196 
INFO:tasks.workunit.client.0.vm00.stdout:2/292: write d4/d6/f2b [1475650,94211] 0 2026-03-10T12:37:49.196 INFO:tasks.workunit.client.0.vm00.stdout:2/293: write f1 [2030721,43569] 0 2026-03-10T12:37:49.197 INFO:tasks.workunit.client.1.vm07.stdout:6/342: readlink d1/d4/d6/d16/l2f 0 2026-03-10T12:37:49.197 INFO:tasks.workunit.client.0.vm00.stdout:2/294: chown d4/dd/ff 14640167 1 2026-03-10T12:37:49.198 INFO:tasks.workunit.client.1.vm07.stdout:6/343: fdatasync d1/d4/f5a 0 2026-03-10T12:37:49.198 INFO:tasks.workunit.client.1.vm07.stdout:9/421: dwrite d5/d13/d57/d4f/f88 [0,4194304] 0 2026-03-10T12:37:49.201 INFO:tasks.workunit.client.1.vm07.stdout:9/422: stat d5/d16/d18/l6e 0 2026-03-10T12:37:49.240 INFO:tasks.workunit.client.1.vm07.stdout:9/423: symlink d5/d13/l9a 0 2026-03-10T12:37:49.240 INFO:tasks.workunit.client.1.vm07.stdout:6/344: dwrite d1/d4/d6/d46/d4d/f22 [4194304,4194304] 0 2026-03-10T12:37:49.240 INFO:tasks.workunit.client.1.vm07.stdout:9/424: fdatasync d5/f8 0 2026-03-10T12:37:49.240 INFO:tasks.workunit.client.1.vm07.stdout:6/345: fdatasync d1/d4/f3b 0 2026-03-10T12:37:49.240 INFO:tasks.workunit.client.1.vm07.stdout:9/425: mkdir d5/d13/d9b 0 2026-03-10T12:37:49.240 INFO:tasks.workunit.client.1.vm07.stdout:6/346: rmdir d1/d4/d4a 39 2026-03-10T12:37:49.240 INFO:tasks.workunit.client.1.vm07.stdout:6/347: dwrite d1/d4/d6/d46/d4d/f22 [4194304,4194304] 0 2026-03-10T12:37:49.240 INFO:tasks.workunit.client.1.vm07.stdout:6/348: chown d1/d4/d6/d16/d1a/d33/f3c 11298 1 2026-03-10T12:37:49.240 INFO:tasks.workunit.client.1.vm07.stdout:6/349: symlink d1/d4/d6/d16/d49/l6c 0 2026-03-10T12:37:49.240 INFO:tasks.workunit.client.1.vm07.stdout:6/350: symlink d1/d4/d6/d4e/l6d 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:4/316: unlink df/d1f/d22/d26/c61 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:4/317: creat df/d32/d64/f67 x:0 0 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:1/326: dread da/d12/d26/f31 [0,4194304] 0 
2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:4/318: truncate df/d1f/d36/d3a/d41/f2f 620343 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:4/319: dwrite df/d1f/d36/d40/f49 [0,4194304] 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:4/320: chown f3 15 1 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:4/321: dread df/d1f/d22/d26/d2e/f50 [0,4194304] 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:4/322: chown df/d32/c43 85810246 1 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:4/323: write df/d32/f58 [70643,39862] 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:1/327: read da/d12/d26/f2e [271735,36235] 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:1/328: symlink da/d24/d28/d56/l70 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:1/329: dwrite da/d12/d26/f57 [0,4194304] 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:1/330: mkdir da/d24/d5a/d71 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:1/331: mkdir da/d24/d28/d44/d5d/d72 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:1/332: mkdir da/d24/d73 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:1/333: dread da/d21/d39/f4f [0,4194304] 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:1/334: rename da/d24/f32 to da/d21/f74 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:1/335: read da/d12/d26/f31 [2037178,6412] 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:1/336: truncate da/d12/d26/f57 4264402 0 2026-03-10T12:37:49.241 INFO:tasks.workunit.client.0.vm00.stdout:1/337: dread - da/d21/d27/d6a/f6b zero size 2026-03-10T12:37:49.360 INFO:tasks.workunit.client.0.vm00.stdout:9/352: dread d0/d5/d16/d1e/d27/f28 [0,4194304] 0 2026-03-10T12:37:49.360 INFO:tasks.workunit.client.0.vm00.stdout:9/353: stat d0/d5/d16/d19/f1b 0 
2026-03-10T12:37:49.366 INFO:tasks.workunit.client.0.vm00.stdout:3/346: dread dd/d18/d13/f22 [0,4194304] 0 2026-03-10T12:37:49.367 INFO:tasks.workunit.client.0.vm00.stdout:7/262: dread da/f13 [0,4194304] 0 2026-03-10T12:37:49.373 INFO:tasks.workunit.client.0.vm00.stdout:7/263: read - da/d1b/d40/f5c zero size 2026-03-10T12:37:49.373 INFO:tasks.workunit.client.0.vm00.stdout:7/264: fsync da/f10 0 2026-03-10T12:37:49.373 INFO:tasks.workunit.client.0.vm00.stdout:7/265: mkdir da/d3f/d60 0 2026-03-10T12:37:49.373 INFO:tasks.workunit.client.0.vm00.stdout:3/347: creat dd/d18/f7c x:0 0 0 2026-03-10T12:37:49.373 INFO:tasks.workunit.client.0.vm00.stdout:3/348: creat dd/d27/d2c/f7d x:0 0 0 2026-03-10T12:37:49.374 INFO:tasks.workunit.client.0.vm00.stdout:3/349: symlink dd/d64/l7e 0 2026-03-10T12:37:49.377 INFO:tasks.workunit.client.0.vm00.stdout:3/350: rename dd/d27/d2c/d34/c4c to dd/d3d/c7f 0 2026-03-10T12:37:49.377 INFO:tasks.workunit.client.0.vm00.stdout:3/351: write dd/d27/f56 [5088829,91639] 0 2026-03-10T12:37:49.381 INFO:tasks.workunit.client.0.vm00.stdout:3/352: dwrite dd/d27/f35 [0,4194304] 0 2026-03-10T12:37:49.392 INFO:tasks.workunit.client.0.vm00.stdout:3/353: dwrite dd/d18/f12 [0,4194304] 0 2026-03-10T12:37:49.402 INFO:tasks.workunit.client.1.vm07.stdout:3/419: sync 2026-03-10T12:37:49.405 INFO:tasks.workunit.client.0.vm00.stdout:3/354: mknod dd/d18/d14/c80 0 2026-03-10T12:37:49.406 INFO:tasks.workunit.client.0.vm00.stdout:3/355: write dd/d4e/d5d/f71 [920661,91264] 0 2026-03-10T12:37:49.406 INFO:tasks.workunit.client.0.vm00.stdout:3/356: read - dd/d64/f7b zero size 2026-03-10T12:37:49.411 INFO:tasks.workunit.client.0.vm00.stdout:3/357: chown dd/d27/l79 46606060 1 2026-03-10T12:37:49.411 INFO:tasks.workunit.client.0.vm00.stdout:3/358: stat dd/d18/d14/c80 0 2026-03-10T12:37:49.411 INFO:tasks.workunit.client.1.vm07.stdout:3/420: dwrite dc/dd/d1f/d6f/f8c [0,4194304] 0 2026-03-10T12:37:49.413 INFO:tasks.workunit.client.0.vm00.stdout:8/252: truncate d0/d12/d36/f39 2281998 
0 2026-03-10T12:37:49.416 INFO:tasks.workunit.client.1.vm07.stdout:3/421: mkdir dc/d18/d2d/d3d/d97 0 2026-03-10T12:37:49.418 INFO:tasks.workunit.client.1.vm07.stdout:3/422: creat dc/dd/d1f/d6f/f98 x:0 0 0 2026-03-10T12:37:49.429 INFO:tasks.workunit.client.1.vm07.stdout:3/423: mkdir dc/d18/d99 0 2026-03-10T12:37:49.429 INFO:tasks.workunit.client.0.vm00.stdout:8/253: unlink d0/d12/d17/f2e 0 2026-03-10T12:37:49.429 INFO:tasks.workunit.client.0.vm00.stdout:7/266: dread da/d25/d2c/f4f [0,4194304] 0 2026-03-10T12:37:49.430 INFO:tasks.workunit.client.0.vm00.stdout:8/254: dread d0/f7 [0,4194304] 0 2026-03-10T12:37:49.435 INFO:tasks.workunit.client.1.vm07.stdout:1/375: truncate d9/df/d29/d2b/d31/f3c 456671 0 2026-03-10T12:37:49.437 INFO:tasks.workunit.client.0.vm00.stdout:8/255: getdents d0/d12/d43 0 2026-03-10T12:37:49.438 INFO:tasks.workunit.client.0.vm00.stdout:8/256: fsync d0/d12/d36/d3e/f4a 0 2026-03-10T12:37:49.438 INFO:tasks.workunit.client.0.vm00.stdout:8/257: dread - d0/d12/d17/d48/f4c zero size 2026-03-10T12:37:49.449 INFO:tasks.workunit.client.0.vm00.stdout:6/288: write d2/d39/f46 [561528,39237] 0 2026-03-10T12:37:49.465 INFO:tasks.workunit.client.1.vm07.stdout:1/376: readlink d9/l25 0 2026-03-10T12:37:49.469 INFO:tasks.workunit.client.0.vm00.stdout:4/324: dread df/d32/f58 [0,4194304] 0 2026-03-10T12:37:49.471 INFO:tasks.workunit.client.0.vm00.stdout:4/325: truncate df/d1f/d36/d3a/d41/f47 4848317 0 2026-03-10T12:37:49.476 INFO:tasks.workunit.client.0.vm00.stdout:6/289: creat d2/d16/d29/d31/d48/f6e x:0 0 0 2026-03-10T12:37:49.476 INFO:tasks.workunit.client.0.vm00.stdout:4/326: dwrite df/d32/f58 [0,4194304] 0 2026-03-10T12:37:49.478 INFO:tasks.workunit.client.0.vm00.stdout:4/327: creat df/d1f/d36/d3a/f68 x:0 0 0 2026-03-10T12:37:49.478 INFO:tasks.workunit.client.0.vm00.stdout:4/328: chown df/d1f/d36/d3a/f68 892554 1 2026-03-10T12:37:49.479 INFO:tasks.workunit.client.1.vm07.stdout:1/377: truncate d9/f61 378831 0 2026-03-10T12:37:49.480 
INFO:tasks.workunit.client.1.vm07.stdout:7/368: write d0/f28 [1052076,1878] 0 2026-03-10T12:37:49.481 INFO:tasks.workunit.client.1.vm07.stdout:7/369: stat d0/d57 0 2026-03-10T12:37:49.481 INFO:tasks.workunit.client.1.vm07.stdout:1/378: truncate d9/f6d 1079946 0 2026-03-10T12:37:49.485 INFO:tasks.workunit.client.1.vm07.stdout:8/405: truncate d1/f3e 291595 0 2026-03-10T12:37:49.492 INFO:tasks.workunit.client.1.vm07.stdout:1/379: chown d9/df/d29/d2b/d31/f72 354775116 1 2026-03-10T12:37:49.504 INFO:tasks.workunit.client.1.vm07.stdout:2/310: dwrite d0/f4a [0,4194304] 0 2026-03-10T12:37:49.510 INFO:tasks.workunit.client.0.vm00.stdout:7/267: rename da/d25/d2c/d46 to da/d26/d37/d61 0 2026-03-10T12:37:49.511 INFO:tasks.workunit.client.0.vm00.stdout:6/290: fdatasync d2/d16/f41 0 2026-03-10T12:37:49.513 INFO:tasks.workunit.client.1.vm07.stdout:8/406: rmdir d1/d3 39 2026-03-10T12:37:49.515 INFO:tasks.workunit.client.0.vm00.stdout:6/291: dwrite d2/d16/f6d [0,4194304] 0 2026-03-10T12:37:49.516 INFO:tasks.workunit.client.0.vm00.stdout:4/329: rename df/f11 to df/d1f/d36/f69 0 2026-03-10T12:37:49.517 INFO:tasks.workunit.client.0.vm00.stdout:4/330: write df/f3d [1002196,54095] 0 2026-03-10T12:37:49.517 INFO:tasks.workunit.client.0.vm00.stdout:4/331: truncate df/d1f/d22/f52 323283 0 2026-03-10T12:37:49.520 INFO:tasks.workunit.client.0.vm00.stdout:4/332: mknod df/d1f/d22/d26/c6a 0 2026-03-10T12:37:49.521 INFO:tasks.workunit.client.0.vm00.stdout:4/333: truncate df/d32/d64/f67 108422 0 2026-03-10T12:37:49.521 INFO:tasks.workunit.client.1.vm07.stdout:0/448: write d0/f1d [1191491,8590] 0 2026-03-10T12:37:49.526 INFO:tasks.workunit.client.0.vm00.stdout:4/334: dwrite df/f1c [4194304,4194304] 0 2026-03-10T12:37:49.527 INFO:tasks.workunit.client.1.vm07.stdout:2/311: mknod d0/d42/d1f/d20/c6a 0 2026-03-10T12:37:49.530 INFO:tasks.workunit.client.0.vm00.stdout:4/335: mkdir df/d63/d6b 0 2026-03-10T12:37:49.535 INFO:tasks.workunit.client.0.vm00.stdout:4/336: mkdir df/d6c 0 2026-03-10T12:37:49.537 
INFO:tasks.workunit.client.0.vm00.stdout:4/337: mknod df/d63/c6d 0 2026-03-10T12:37:49.538 INFO:tasks.workunit.client.0.vm00.stdout:4/338: stat df/d63/c6d 0 2026-03-10T12:37:49.538 INFO:tasks.workunit.client.0.vm00.stdout:4/339: dread - df/d1f/d22/f4c zero size 2026-03-10T12:37:49.543 INFO:tasks.workunit.client.0.vm00.stdout:4/340: dwrite f9 [0,4194304] 0 2026-03-10T12:37:49.560 INFO:tasks.workunit.client.1.vm07.stdout:8/407: creat d1/d3/d6/d50/d70/f82 x:0 0 0 2026-03-10T12:37:49.563 INFO:tasks.workunit.client.1.vm07.stdout:2/312: mknod d0/d42/d26/c6b 0 2026-03-10T12:37:49.579 INFO:tasks.workunit.client.0.vm00.stdout:7/268: rename da/fd to da/d47/f62 0 2026-03-10T12:37:49.579 INFO:tasks.workunit.client.0.vm00.stdout:0/383: getdents d3/d40/d65 0 2026-03-10T12:37:49.579 INFO:tasks.workunit.client.1.vm07.stdout:7/370: dread d0/d47/d48/f4b [0,4194304] 0 2026-03-10T12:37:49.579 INFO:tasks.workunit.client.1.vm07.stdout:5/433: dwrite d0/d22/d18/f4c [4194304,4194304] 0 2026-03-10T12:37:49.579 INFO:tasks.workunit.client.1.vm07.stdout:7/371: chown d0/d47/l49 74 1 2026-03-10T12:37:49.579 INFO:tasks.workunit.client.1.vm07.stdout:5/434: fsync d0/f1f 0 2026-03-10T12:37:49.579 INFO:tasks.workunit.client.1.vm07.stdout:8/408: rename d1/d3/d6/d50/l69 to d1/d3/d6/d7b/l83 0 2026-03-10T12:37:49.579 INFO:tasks.workunit.client.1.vm07.stdout:5/435: write d0/d22/d18/d19/d36/f3d [2552237,15481] 0 2026-03-10T12:37:49.579 INFO:tasks.workunit.client.1.vm07.stdout:5/436: readlink d0/d22/d18/d3e/l40 0 2026-03-10T12:37:49.580 INFO:tasks.workunit.client.1.vm07.stdout:7/372: dread - d0/d47/d48/f53 zero size 2026-03-10T12:37:49.583 INFO:tasks.workunit.client.1.vm07.stdout:5/437: read - d0/d22/d18/d19/d21/d3a/f4f zero size 2026-03-10T12:37:49.598 INFO:tasks.workunit.client.1.vm07.stdout:7/373: link d0/d67/f71 d0/d47/f73 0 2026-03-10T12:37:49.599 INFO:tasks.workunit.client.1.vm07.stdout:7/374: truncate d0/fc 4850844 0 2026-03-10T12:37:49.603 INFO:tasks.workunit.client.1.vm07.stdout:5/438: dwrite 
d0/d22/d18/d19/d21/d54/f8a [0,4194304] 0 2026-03-10T12:37:49.604 INFO:tasks.workunit.client.1.vm07.stdout:7/375: rename d0/c1a to d0/d57/d62/c74 0 2026-03-10T12:37:49.605 INFO:tasks.workunit.client.1.vm07.stdout:7/376: stat d0/f13 0 2026-03-10T12:37:49.755 INFO:tasks.workunit.client.0.vm00.stdout:0/384: sync 2026-03-10T12:37:49.758 INFO:tasks.workunit.client.0.vm00.stdout:0/385: unlink d3/d40/f59 0 2026-03-10T12:37:49.764 INFO:tasks.workunit.client.1.vm07.stdout:1/380: sync 2026-03-10T12:37:49.764 INFO:tasks.workunit.client.1.vm07.stdout:8/409: read d1/d3/d6/f24 [411875,120621] 0 2026-03-10T12:37:49.765 INFO:tasks.workunit.client.1.vm07.stdout:5/439: sync 2026-03-10T12:37:49.765 INFO:tasks.workunit.client.1.vm07.stdout:2/313: sync 2026-03-10T12:37:49.766 INFO:tasks.workunit.client.1.vm07.stdout:5/440: read - d0/d22/f93 zero size 2026-03-10T12:37:49.769 INFO:tasks.workunit.client.0.vm00.stdout:0/386: creat d3/db/d77/f8a x:0 0 0 2026-03-10T12:37:49.776 INFO:tasks.workunit.client.1.vm07.stdout:5/441: dread - d0/d22/d18/d19/d21/d3a/f85 zero size 2026-03-10T12:37:49.776 INFO:tasks.workunit.client.1.vm07.stdout:2/314: mkdir d0/d29/d64/d6c 0 2026-03-10T12:37:49.776 INFO:tasks.workunit.client.1.vm07.stdout:2/315: write d0/d42/f53 [630563,124201] 0 2026-03-10T12:37:49.776 INFO:tasks.workunit.client.1.vm07.stdout:2/316: dread d0/d42/d26/d4b/f58 [0,4194304] 0 2026-03-10T12:37:49.777 INFO:tasks.workunit.client.1.vm07.stdout:2/317: chown d0/f4 1 1 2026-03-10T12:37:49.795 INFO:tasks.workunit.client.0.vm00.stdout:0/387: rename d3/d7/d58/f63 to d3/d7/d4c/d5b/d38/f8b 0 2026-03-10T12:37:49.801 INFO:tasks.workunit.client.0.vm00.stdout:5/308: write d1f/f2c [5209858,97114] 0 2026-03-10T12:37:49.806 INFO:tasks.workunit.client.0.vm00.stdout:5/309: mkdir d1f/d26/d2e/d58/d6b 0 2026-03-10T12:37:49.807 INFO:tasks.workunit.client.0.vm00.stdout:5/310: truncate d1f/d26/d2b/d35/f41 192219 0 2026-03-10T12:37:49.807 INFO:tasks.workunit.client.0.vm00.stdout:5/311: dread - d1f/d26/d2b/f44 zero size 
2026-03-10T12:37:49.809 INFO:tasks.workunit.client.1.vm07.stdout:5/442: mknod d0/d22/d18/d19/d21/d3a/c99 0
2026-03-10T12:37:49.812 INFO:tasks.workunit.client.1.vm07.stdout:5/443: readlink d0/l32 0
2026-03-10T12:37:49.819 INFO:tasks.workunit.client.0.vm00.stdout:5/312: dread d1f/f27 [0,4194304] 0
2026-03-10T12:37:49.824 INFO:tasks.workunit.client.1.vm07.stdout:5/444: chown d0/d22/d18/d19/d21/d54/c6c 9977779 1
2026-03-10T12:37:49.824 INFO:tasks.workunit.client.1.vm07.stdout:9/426: write d5/d1f/d31/f82 [2145736,61385] 0
2026-03-10T12:37:49.826 INFO:tasks.workunit.client.0.vm00.stdout:0/388: rename d3/d7/f1c to d3/d22/d3a/f8c 0
2026-03-10T12:37:49.830 INFO:tasks.workunit.client.0.vm00.stdout:2/295: truncate d4/dd/f3e 1171459 0
2026-03-10T12:37:49.836 INFO:tasks.workunit.client.0.vm00.stdout:1/338: getdents da/d24 0
2026-03-10T12:37:49.842 INFO:tasks.workunit.client.0.vm00.stdout:2/296: getdents d4/d6 0
2026-03-10T12:37:49.842 INFO:tasks.workunit.client.0.vm00.stdout:9/354: truncate d0/f17 3898265 0
2026-03-10T12:37:49.842 INFO:tasks.workunit.client.0.vm00.stdout:1/339: creat da/d24/d5a/f75 x:0 0 0
2026-03-10T12:37:49.842 INFO:tasks.workunit.client.1.vm07.stdout:3/424: write dc/d18/d24/f49 [886890,26887] 0
2026-03-10T12:37:49.842 INFO:tasks.workunit.client.0.vm00.stdout:2/297: symlink d4/d6/d2d/d3a/d43/l66 0
2026-03-10T12:37:49.846 INFO:tasks.workunit.client.0.vm00.stdout:9/355: mknod d0/d5/d16/d19/d50/c7b 0
2026-03-10T12:37:49.846 INFO:tasks.workunit.client.0.vm00.stdout:9/356: chown d0/d5/d16/d1e 133837698 1
2026-03-10T12:37:49.849 INFO:tasks.workunit.client.0.vm00.stdout:1/340: creat da/d24/f76 x:0 0 0
2026-03-10T12:37:49.849 INFO:tasks.workunit.client.0.vm00.stdout:1/341: stat f5 0
2026-03-10T12:37:49.851 INFO:tasks.workunit.client.1.vm07.stdout:4/512: dwrite d0/d4/d10/d5f/d6d/f71 [0,4194304] 0
2026-03-10T12:37:49.852 INFO:tasks.workunit.client.0.vm00.stdout:1/342: mkdir da/d21/d39/d77 0
2026-03-10T12:37:49.853 INFO:tasks.workunit.client.0.vm00.stdout:1/343: dread - da/d12/d26/f69 zero size
2026-03-10T12:37:49.855 INFO:tasks.workunit.client.0.vm00.stdout:9/357: creat d0/d3d/d59/d4e/f7c x:0 0 0
2026-03-10T12:37:49.856 INFO:tasks.workunit.client.0.vm00.stdout:9/358: chown d0/d5/d16/f24 201415414 1
2026-03-10T12:37:49.864 INFO:tasks.workunit.client.0.vm00.stdout:1/344: dwrite da/d12/f20 [0,4194304] 0
2026-03-10T12:37:49.875 INFO:tasks.workunit.client.0.vm00.stdout:3/359: unlink dd/d4e/d5d/f6e 0
2026-03-10T12:37:49.878 INFO:tasks.workunit.client.0.vm00.stdout:8/258: dwrite d0/f11 [0,4194304] 0
2026-03-10T12:37:49.880 INFO:tasks.workunit.client.0.vm00.stdout:3/360: stat dd/d27/d2c/d34/l49 0
2026-03-10T12:37:49.887 INFO:tasks.workunit.client.0.vm00.stdout:3/361: creat dd/d4e/d5d/f81 x:0 0 0
2026-03-10T12:37:49.889 INFO:tasks.workunit.client.0.vm00.stdout:1/345: write da/d24/d28/d67/f52 [1437732,125772] 0
2026-03-10T12:37:49.891 INFO:tasks.workunit.client.0.vm00.stdout:8/259: creat d0/dd/f4d x:0 0 0
2026-03-10T12:37:49.892 INFO:tasks.workunit.client.0.vm00.stdout:3/362: symlink dd/d4e/d6a/l82 0
2026-03-10T12:37:49.894 INFO:tasks.workunit.client.1.vm07.stdout:6/351: dread d1/d4/d6/f13 [0,4194304] 0
2026-03-10T12:37:49.894 INFO:tasks.workunit.client.1.vm07.stdout:0/449: dwrite d0/d14/f36 [0,4194304] 0
2026-03-10T12:37:49.902 INFO:tasks.workunit.client.0.vm00.stdout:8/260: write d0/f10 [4491081,6502] 0
2026-03-10T12:37:49.905 INFO:tasks.workunit.client.0.vm00.stdout:1/346: mkdir da/d4d/d78 0
2026-03-10T12:37:49.910 INFO:tasks.workunit.client.0.vm00.stdout:3/363: creat dd/d18/f83 x:0 0 0
2026-03-10T12:37:49.928 INFO:tasks.workunit.client.0.vm00.stdout:3/364: fdatasync dd/d18/f83 0
2026-03-10T12:37:49.936 INFO:tasks.workunit.client.0.vm00.stdout:3/365: truncate dd/d4e/d5d/f81 156429 0
2026-03-10T12:37:49.937 INFO:tasks.workunit.client.1.vm07.stdout:3/425: truncate dc/dd/d28/f46 3444673 0
2026-03-10T12:37:49.937 INFO:tasks.workunit.client.1.vm07.stdout:3/426: readlink dc/dd/d1f/l59 0
2026-03-10T12:37:49.937 INFO:tasks.workunit.client.1.vm07.stdout:3/427: truncate dc/dd/f85 595742 0
2026-03-10T12:37:49.937 INFO:tasks.workunit.client.1.vm07.stdout:5/445: mknod d0/d22/d18/d19/d72/c9a 0
2026-03-10T12:37:49.937 INFO:tasks.workunit.client.1.vm07.stdout:5/446: stat d0/c5e 0
2026-03-10T12:37:49.937 INFO:tasks.workunit.client.1.vm07.stdout:2/318: link d0/d42/d26/d38/f3a d0/d42/d26/d4b/f6d 0
2026-03-10T12:37:49.937 INFO:tasks.workunit.client.1.vm07.stdout:9/427: rename l3 to d5/d13/d6c/d7a/l9c 0
2026-03-10T12:37:49.937 INFO:tasks.workunit.client.1.vm07.stdout:9/428: chown d5/d16/l62 6797442 1
2026-03-10T12:37:49.937 INFO:tasks.workunit.client.1.vm07.stdout:0/450: mkdir d0/d14/d5f/d76/d8e 0
2026-03-10T12:37:49.937 INFO:tasks.workunit.client.1.vm07.stdout:6/352: mkdir d1/d4/d6/d16/d1a/d6e 0
2026-03-10T12:37:49.937 INFO:tasks.workunit.client.1.vm07.stdout:6/353: fdatasync d1/d4/f62 0
2026-03-10T12:37:49.942 INFO:tasks.workunit.client.1.vm07.stdout:2/319: symlink d0/d42/d26/d38/l6e 0
2026-03-10T12:37:49.942 INFO:tasks.workunit.client.1.vm07.stdout:5/447: rename d0/d22/d18/d30/f33 to d0/d22/d18/d19/d21/d54/f9b 0
2026-03-10T12:37:49.943 INFO:tasks.workunit.client.1.vm07.stdout:9/429: stat d5/d13/d2c/f44 0
2026-03-10T12:37:49.944 INFO:tasks.workunit.client.0.vm00.stdout:3/366: sync
2026-03-10T12:37:49.946 INFO:tasks.workunit.client.1.vm07.stdout:2/320: write d0/d42/d4e/d56/f60 [17322,63931] 0
2026-03-10T12:37:49.946 INFO:tasks.workunit.client.0.vm00.stdout:3/367: mkdir dd/d3d/d84 0
2026-03-10T12:37:49.948 INFO:tasks.workunit.client.1.vm07.stdout:4/513: dread d0/d4/d7a/f50 [0,4194304] 0
2026-03-10T12:37:49.949 INFO:tasks.workunit.client.0.vm00.stdout:3/368: mknod dd/d18/d13/d1d/d43/d55/c85 0
2026-03-10T12:37:49.950 INFO:tasks.workunit.client.0.vm00.stdout:3/369: write dd/d3d/f53 [900744,2296] 0
2026-03-10T12:37:49.951 INFO:tasks.workunit.client.1.vm07.stdout:5/448: dread d0/d22/d18/d19/d21/d54/f8a [0,4194304] 0
2026-03-10T12:37:49.958 INFO:tasks.workunit.client.1.vm07.stdout:0/451: rename d0/d14/d5f/l52 to d0/d14/d7c/l8f 0
2026-03-10T12:37:49.958 INFO:tasks.workunit.client.0.vm00.stdout:3/370: creat dd/d18/d13/d1d/f86 x:0 0 0
2026-03-10T12:37:49.958 INFO:tasks.workunit.client.0.vm00.stdout:3/371: readlink dd/d18/d13/l68 0
2026-03-10T12:37:49.958 INFO:tasks.workunit.client.0.vm00.stdout:3/372: creat dd/d64/f87 x:0 0 0
2026-03-10T12:37:49.958 INFO:tasks.workunit.client.0.vm00.stdout:3/373: dwrite dd/d18/f83 [0,4194304] 0
2026-03-10T12:37:49.963 INFO:tasks.workunit.client.1.vm07.stdout:7/377: dwrite d0/d47/d48/f4b [0,4194304] 0
2026-03-10T12:37:49.969 INFO:tasks.workunit.client.0.vm00.stdout:3/374: dwrite dd/d18/d13/d1d/f42 [0,4194304] 0
2026-03-10T12:37:49.971 INFO:tasks.workunit.client.0.vm00.stdout:3/375: dread dd/d18/f12 [0,4194304] 0
2026-03-10T12:37:49.983 INFO:tasks.workunit.client.1.vm07.stdout:1/381: write d9/df/d29/d2b/d3d/f4c [181524,5239] 0
2026-03-10T12:37:49.983 INFO:tasks.workunit.client.1.vm07.stdout:1/382: chown d9/df/d29/d2b/d31 1303272423 1
2026-03-10T12:37:49.983 INFO:tasks.workunit.client.0.vm00.stdout:3/376: write dd/d18/f83 [683779,7182] 0
2026-03-10T12:37:49.983 INFO:tasks.workunit.client.0.vm00.stdout:5/313: dwrite f11 [0,4194304] 0
2026-03-10T12:37:49.986 INFO:tasks.workunit.client.0.vm00.stdout:5/314: rename d1f/d26/d2b/c45 to d1f/d26/d2e/c6c 0
2026-03-10T12:37:49.989 INFO:tasks.workunit.client.1.vm07.stdout:8/410: write d1/d3/d6/d54/f7d [1918293,130633] 0
2026-03-10T12:37:49.989 INFO:tasks.workunit.client.0.vm00.stdout:5/315: dwrite d1f/d6a/f57 [0,4194304] 0
2026-03-10T12:37:49.990 INFO:tasks.workunit.client.0.vm00.stdout:5/316: chown d1f/d26/d2e/f3c 205299 1
2026-03-10T12:37:50.036 INFO:tasks.workunit.client.1.vm07.stdout:2/321: creat d0/d42/d4e/d56/f6f x:0 0 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.1.vm07.stdout:4/514: creat d0/d4/d7a/d46/d76/fae x:0 0 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.1.vm07.stdout:3/428: dwrite dc/d18/d24/f3a [0,4194304] 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.1.vm07.stdout:2/322: dread d0/f4 [0,4194304] 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.1.vm07.stdout:2/323: fsync d0/d42/d1f/d20/f39 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.0.vm00.stdout:0/389: dwrite d3/d7/d4c/d5b/d38/f89 [4194304,4194304] 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.0.vm00.stdout:0/390: stat d3/db/d77 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.0.vm00.stdout:5/317: dwrite d1f/d39/f65 [0,4194304] 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.0.vm00.stdout:7/269: write da/d25/d2c/f30 [3799706,16707] 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.0.vm00.stdout:7/270: write da/d25/d2c/f30 [230007,42020] 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.0.vm00.stdout:4/341: link df/d1f/d36/f69 df/d1f/d36/d3a/f6e 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.0.vm00.stdout:5/318: mknod d1f/d26/c6d 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.0.vm00.stdout:4/342: dwrite df/f12 [0,4194304] 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.0.vm00.stdout:0/391: unlink d3/db/l26 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.0.vm00.stdout:5/319: rename d1f/d26/d2e/f3a to d1f/d26/d2b/d35/d53/d5b/f6e 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.0.vm00.stdout:5/320: mkdir d1f/d26/d6f 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.0.vm00.stdout:5/321: truncate d1f/d26/d2b/f5c 91375 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.0.vm00.stdout:5/322: read f11 [3573576,75915] 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.0.vm00.stdout:4/343: link df/f42 df/d1f/d36/f6f 0
2026-03-10T12:37:50.037 INFO:tasks.workunit.client.0.vm00.stdout:4/344: mkdir df/d1f/d22/d26/d70 0
2026-03-10T12:37:50.038 INFO:tasks.workunit.client.0.vm00.stdout:4/345: dread f9 [0,4194304] 0
2026-03-10T12:37:50.038 INFO:tasks.workunit.client.0.vm00.stdout:4/346: creat df/d6c/f71 x:0 0 0
2026-03-10T12:37:50.038 INFO:tasks.workunit.client.0.vm00.stdout:4/347: fdatasync f9 0
2026-03-10T12:37:50.038 INFO:tasks.workunit.client.0.vm00.stdout:4/348: creat df/d1f/d22/f72 x:0 0 0
2026-03-10T12:37:50.038 INFO:tasks.workunit.client.0.vm00.stdout:7/271: mkdir da/d25/d63 0
2026-03-10T12:37:50.038 INFO:tasks.workunit.client.0.vm00.stdout:4/349: rename df/d1f/d36/d40 to df/d63/d6b/d73 0
2026-03-10T12:37:50.038 INFO:tasks.workunit.client.0.vm00.stdout:4/350: mknod df/d57/c74 0
2026-03-10T12:37:50.040 INFO:tasks.workunit.client.1.vm07.stdout:8/411: symlink d1/d3/d11/l84 0
2026-03-10T12:37:50.040 INFO:tasks.workunit.client.0.vm00.stdout:4/351: dread df/d1f/d36/f69 [0,4194304] 0
2026-03-10T12:37:50.041 INFO:tasks.workunit.client.0.vm00.stdout:7/272: readlink da/d26/d37/d61/l5f 0
2026-03-10T12:37:50.042 INFO:tasks.workunit.client.0.vm00.stdout:4/352: unlink df/d57/c74 0
2026-03-10T12:37:50.046 INFO:tasks.workunit.client.1.vm07.stdout:9/430: mkdir d5/d13/d9d 0
2026-03-10T12:37:50.048 INFO:tasks.workunit.client.0.vm00.stdout:7/273: creat da/d25/d2c/d58/f64 x:0 0 0
2026-03-10T12:37:50.051 INFO:tasks.workunit.client.0.vm00.stdout:7/274: dwrite da/d25/d2c/f30 [0,4194304] 0
2026-03-10T12:37:50.059 INFO:tasks.workunit.client.0.vm00.stdout:7/275: mknod da/d26/c65 0
2026-03-10T12:37:50.061 INFO:tasks.workunit.client.1.vm07.stdout:0/452: fdatasync d0/d14/d5f/d76/d2f/d31/d4f/f61 0
2026-03-10T12:37:50.067 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:49 vm00.local ceph-mon[50686]: pgmap v160: 65 pgs: 65 active+clean; 1.5 GiB data, 5.5 GiB used, 114 GiB / 120 GiB avail; 27 MiB/s rd, 131 MiB/s wr, 223 op/s
2026-03-10T12:37:50.073 INFO:tasks.workunit.client.1.vm07.stdout:0/453: read d0/d14/d5f/d76/f27 [3853363,99793] 0
2026-03-10T12:37:50.081 INFO:tasks.workunit.client.1.vm07.stdout:5/449: sync
2026-03-10T12:37:50.081 INFO:tasks.workunit.client.1.vm07.stdout:1/383: symlink d9/df/d54/l78 0
2026-03-10T12:37:50.081 INFO:tasks.workunit.client.0.vm00.stdout:5/323: sync
2026-03-10T12:37:50.081 INFO:tasks.workunit.client.0.vm00.stdout:0/392: sync
2026-03-10T12:37:50.081 INFO:tasks.workunit.client.1.vm07.stdout:6/354: sync
2026-03-10T12:37:50.082 INFO:tasks.workunit.client.1.vm07.stdout:6/355: readlink d1/d4/d6/d16/d49/l6c 0
2026-03-10T12:37:50.083 INFO:tasks.workunit.client.0.vm00.stdout:5/324: creat d1f/d26/d2b/d35/d53/f70 x:0 0 0
2026-03-10T12:37:50.084 INFO:tasks.workunit.client.0.vm00.stdout:5/325: write d1f/f59 [1856960,25397] 0
2026-03-10T12:37:50.085 INFO:tasks.workunit.client.1.vm07.stdout:8/412: symlink d1/d3/d18/l85 0
2026-03-10T12:37:50.088 INFO:tasks.workunit.client.1.vm07.stdout:9/431: creat d5/d13/d22/f9e x:0 0 0
2026-03-10T12:37:50.089 INFO:tasks.workunit.client.0.vm00.stdout:5/326: creat d1f/d26/d2e/f71 x:0 0 0
2026-03-10T12:37:50.089 INFO:tasks.workunit.client.0.vm00.stdout:0/393: write d3/d7/f31 [4102946,20744] 0
2026-03-10T12:37:50.090 INFO:tasks.workunit.client.0.vm00.stdout:5/327: write d1f/d6a/f57 [2742783,54499] 0
2026-03-10T12:37:50.091 INFO:tasks.workunit.client.0.vm00.stdout:5/328: write d1f/d26/f48 [1405309,70935] 0
2026-03-10T12:37:50.092 INFO:tasks.workunit.client.0.vm00.stdout:5/329: chown d1f/d26/d2b/d37/f38 11727 1
2026-03-10T12:37:50.092 INFO:tasks.workunit.client.0.vm00.stdout:5/330: chown d1f/d26/d2b/d37/f61 1 1
2026-03-10T12:37:50.093 INFO:tasks.workunit.client.0.vm00.stdout:5/331: write d1f/f46 [4323456,73405] 0
2026-03-10T12:37:50.093 INFO:tasks.workunit.client.0.vm00.stdout:0/394: mknod d3/db/d77/d82/c8d 0
2026-03-10T12:37:50.103 INFO:tasks.workunit.client.0.vm00.stdout:0/395: write f2 [2505533,5122] 0
2026-03-10T12:37:50.110 INFO:tasks.workunit.client.1.vm07.stdout:9/432: creat d5/d1f/f9f x:0 0 0
2026-03-10T12:37:50.115 INFO:tasks.workunit.client.1.vm07.stdout:3/429: creat dc/dd/f9a x:0 0 0
2026-03-10T12:37:50.116 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:49 vm07.local ceph-mon[58582]: pgmap v160: 65 pgs: 65 active+clean; 1.5 GiB data, 5.5 GiB used, 114 GiB / 120 GiB avail; 27 MiB/s rd, 131 MiB/s wr, 223 op/s
2026-03-10T12:37:50.124 INFO:tasks.workunit.client.1.vm07.stdout:8/413: truncate d1/d3/d11/f35 814130 0
2026-03-10T12:37:50.124 INFO:tasks.workunit.client.1.vm07.stdout:8/414: chown d1/c44 0 1
2026-03-10T12:37:50.128 INFO:tasks.workunit.client.1.vm07.stdout:9/433: unlink d5/d13/d22/l8c 0
2026-03-10T12:37:50.128 INFO:tasks.workunit.client.1.vm07.stdout:3/430: truncate dc/d18/d2d/f71 365847 0
2026-03-10T12:37:50.128 INFO:tasks.workunit.client.1.vm07.stdout:6/356: link d1/d4/d6/d53/d66/f68 d1/d4/d6/d4e/d64/f6f 0
2026-03-10T12:37:50.128 INFO:tasks.workunit.client.1.vm07.stdout:3/431: fsync dc/dd/f41 0
2026-03-10T12:37:50.129 INFO:tasks.workunit.client.1.vm07.stdout:9/434: write d5/d16/d23/d26/f86 [52392,126380] 0
2026-03-10T12:37:50.133 INFO:tasks.workunit.client.1.vm07.stdout:6/357: mknod d1/d4/d6/d43/c70 0
2026-03-10T12:37:50.150 INFO:tasks.workunit.client.1.vm07.stdout:8/415: unlink d1/f3e 0
2026-03-10T12:37:50.157 INFO:tasks.workunit.client.1.vm07.stdout:9/435: creat d5/d16/d23/d26/d68/fa0 x:0 0 0
2026-03-10T12:37:50.157 INFO:tasks.workunit.client.1.vm07.stdout:9/436: readlink d5/d13/l15 0
2026-03-10T12:37:50.158 INFO:tasks.workunit.client.1.vm07.stdout:1/384: dread d9/df/f10 [0,4194304] 0
2026-03-10T12:37:50.158 INFO:tasks.workunit.client.1.vm07.stdout:9/437: readlink d5/d13/d57/d3e/l77 0
2026-03-10T12:37:50.162 INFO:tasks.workunit.client.1.vm07.stdout:3/432: rmdir dc/d18/d2d/d3d/d97 0
2026-03-10T12:37:50.162 INFO:tasks.workunit.client.1.vm07.stdout:3/433: stat dc/d18 0
2026-03-10T12:37:50.176 INFO:tasks.workunit.client.0.vm00.stdout:2/298: truncate d4/d6/d41/f4c 3091741 0
2026-03-10T12:37:50.179 INFO:tasks.workunit.client.0.vm00.stdout:2/299: creat d4/f67 x:0 0 0
2026-03-10T12:37:50.180 INFO:tasks.workunit.client.0.vm00.stdout:2/300: mkdir d4/d53/d68 0
2026-03-10T12:37:50.182 INFO:tasks.workunit.client.0.vm00.stdout:2/301: creat d4/d53/d68/f69 x:0 0 0
2026-03-10T12:37:50.182 INFO:tasks.workunit.client.0.vm00.stdout:2/302: chown d4/d53/d68 1 1
2026-03-10T12:37:50.200 INFO:tasks.workunit.client.1.vm07.stdout:9/438: truncate d5/d1f/d31/d64/f70 900284 0
2026-03-10T12:37:50.200 INFO:tasks.workunit.client.1.vm07.stdout:3/434: rename dc/dd/d28/d3b/f5b to dc/dd/d28/d7a/d8e/f9b 0
2026-03-10T12:37:50.200 INFO:tasks.workunit.client.1.vm07.stdout:1/385: mkdir d9/df/d79 0
2026-03-10T12:37:50.200 INFO:tasks.workunit.client.0.vm00.stdout:2/303: rename d4/d6/d2d/d31/d32/c33 to d4/d6/d2d/d31/d32/c6a 0
2026-03-10T12:37:50.200 INFO:tasks.workunit.client.0.vm00.stdout:2/304: symlink d4/dd/d63/l6b 0
2026-03-10T12:37:50.200 INFO:tasks.workunit.client.0.vm00.stdout:2/305: symlink d4/d53/l6c 0
2026-03-10T12:37:50.200 INFO:tasks.workunit.client.1.vm07.stdout:1/386: readlink d9/df/d29/d2c/l66 0
2026-03-10T12:37:50.208 INFO:tasks.workunit.client.1.vm07.stdout:9/439: creat d5/d16/d18/fa1 x:0 0 0
2026-03-10T12:37:50.209 INFO:tasks.workunit.client.0.vm00.stdout:1/347: getdents da/d4d 0
2026-03-10T12:37:50.210 INFO:tasks.workunit.client.0.vm00.stdout:1/348: write da/d12/d26/f57 [248274,84585] 0
2026-03-10T12:37:50.215 INFO:tasks.workunit.client.0.vm00.stdout:8/261: dwrite d0/f28 [0,4194304] 0
2026-03-10T12:37:50.216 INFO:tasks.workunit.client.0.vm00.stdout:8/262: write d0/f10 [1163223,87152] 0
2026-03-10T12:37:50.228 INFO:tasks.workunit.client.0.vm00.stdout:1/349: dwrite da/d12/f62 [0,4194304] 0
2026-03-10T12:37:50.236 INFO:tasks.workunit.client.0.vm00.stdout:8/263: link d0/lc d0/d12/d17/l4e 0
2026-03-10T12:37:50.237 INFO:tasks.workunit.client.0.vm00.stdout:8/264: write d0/d12/f27 [1308926,88260] 0
2026-03-10T12:37:50.237 INFO:tasks.workunit.client.0.vm00.stdout:8/265: chown d0/f11 4552975 1
2026-03-10T12:37:50.237 INFO:tasks.workunit.client.0.vm00.stdout:8/266: readlink d0/d12/d17/l3b 0
2026-03-10T12:37:50.241 INFO:tasks.workunit.client.0.vm00.stdout:8/267: symlink d0/d12/d17/d48/l4f 0
2026-03-10T12:37:50.242 INFO:tasks.workunit.client.0.vm00.stdout:1/350: link da/d12/c3a da/d24/d28/c79 0
2026-03-10T12:37:50.243 INFO:tasks.workunit.client.0.vm00.stdout:1/351: write da/d21/d27/f54 [1807493,35682] 0
2026-03-10T12:37:50.248 INFO:tasks.workunit.client.0.vm00.stdout:1/352: dwrite da/f22 [0,4194304] 0
2026-03-10T12:37:50.253 INFO:tasks.workunit.client.0.vm00.stdout:1/353: creat da/d24/d28/d44/f7a x:0 0 0
2026-03-10T12:37:50.262 INFO:tasks.workunit.client.0.vm00.stdout:1/354: read da/d24/d28/f3c [458080,75621] 0
2026-03-10T12:37:50.263 INFO:tasks.workunit.client.0.vm00.stdout:1/355: fsync da/f14 0
2026-03-10T12:37:50.264 INFO:tasks.workunit.client.0.vm00.stdout:1/356: chown da/d12/c25 2336860 1
2026-03-10T12:37:50.268 INFO:tasks.workunit.client.0.vm00.stdout:1/357: dwrite da/d24/f47 [0,4194304] 0
2026-03-10T12:37:50.271 INFO:tasks.workunit.client.0.vm00.stdout:1/358: truncate da/d21/d39/f4f 530148 0
2026-03-10T12:37:50.273 INFO:tasks.workunit.client.0.vm00.stdout:1/359: link da/c1e da/d24/c7b 0
2026-03-10T12:37:50.274 INFO:tasks.workunit.client.0.vm00.stdout:1/360: readlink da/d12/l4b 0
2026-03-10T12:37:50.279 INFO:tasks.workunit.client.0.vm00.stdout:1/361: creat da/d24/d5a/f7c x:0 0 0
2026-03-10T12:37:50.281 INFO:tasks.workunit.client.1.vm07.stdout:7/378: dwrite d0/f21 [0,4194304] 0
2026-03-10T12:37:50.282 INFO:tasks.workunit.client.0.vm00.stdout:1/362: getdents da/d24/d5a/d71 0
2026-03-10T12:37:50.289 INFO:tasks.workunit.client.1.vm07.stdout:4/515: write d0/d4/d5/d34/f5d [668836,62340] 0
2026-03-10T12:37:50.295 INFO:tasks.workunit.client.1.vm07.stdout:7/379: dwrite d0/d47/f59 [4194304,4194304] 0
2026-03-10T12:37:50.304 INFO:tasks.workunit.client.0.vm00.stdout:7/276: dwrite da/fe [0,4194304] 0
2026-03-10T12:37:50.311 INFO:tasks.workunit.client.1.vm07.stdout:7/380: fsync d0/d47/d48/f54 0
2026-03-10T12:37:50.314 INFO:tasks.workunit.client.1.vm07.stdout:7/381: chown d0/d57/d62/f6c 254381 1
2026-03-10T12:37:50.315 INFO:tasks.workunit.client.0.vm00.stdout:7/277: dread da/d25/f4e [0,4194304] 0
2026-03-10T12:37:50.318 INFO:tasks.workunit.client.1.vm07.stdout:5/450: write d0/d22/f27 [1305910,47225] 0
2026-03-10T12:37:50.318 INFO:tasks.workunit.client.0.vm00.stdout:7/278: readlink da/d1b/l45 0
2026-03-10T12:37:50.321 INFO:tasks.workunit.client.0.vm00.stdout:0/396: dwrite d3/d7/f10 [4194304,4194304] 0
2026-03-10T12:37:50.321 INFO:tasks.workunit.client.1.vm07.stdout:7/382: dwrite d0/f28 [0,4194304] 0
2026-03-10T12:37:50.328 INFO:tasks.workunit.client.1.vm07.stdout:7/383: chown d0/d47/l49 3711597 1
2026-03-10T12:37:50.334 INFO:tasks.workunit.client.1.vm07.stdout:0/454: truncate d0/d14/f36 531654 0
2026-03-10T12:37:50.334 INFO:tasks.workunit.client.1.vm07.stdout:5/451: creat d0/d22/d18/d19/d36/d75/f9c x:0 0 0
2026-03-10T12:37:50.340 INFO:tasks.workunit.client.0.vm00.stdout:0/397: dread d3/d7/d4c/d5b/d38/f81 [0,4194304] 0
2026-03-10T12:37:50.341 INFO:tasks.workunit.client.0.vm00.stdout:0/398: dread - d3/d22/f71 zero size
2026-03-10T12:37:50.348 INFO:tasks.workunit.client.1.vm07.stdout:7/384: creat d0/d57/d62/f75 x:0 0 0
2026-03-10T12:37:50.353 INFO:tasks.workunit.client.1.vm07.stdout:0/455: creat d0/d14/d7c/f90 x:0 0 0
2026-03-10T12:37:50.356 INFO:tasks.workunit.client.1.vm07.stdout:5/452: symlink d0/d22/d18/d19/d21/d54/l9d 0
2026-03-10T12:37:50.356 INFO:tasks.workunit.client.1.vm07.stdout:6/358: write d1/d4/d6/d16/f50 [648112,12013] 0
2026-03-10T12:37:50.361 INFO:tasks.workunit.client.1.vm07.stdout:7/385: unlink d0/f4e 0
2026-03-10T12:37:50.361 INFO:tasks.workunit.client.1.vm07.stdout:7/386: chown d0/f42 54672019 1
2026-03-10T12:37:50.368 INFO:tasks.workunit.client.0.vm00.stdout:7/279: symlink da/d41/d48/l66 0
2026-03-10T12:37:50.368 INFO:tasks.workunit.client.1.vm07.stdout:8/416: truncate d1/d3/f1d 507096 0
2026-03-10T12:37:50.397 INFO:tasks.workunit.client.1.vm07.stdout:4/516: fsync d0/d4/d5/d34/f5d 0
2026-03-10T12:37:50.399 INFO:tasks.workunit.client.1.vm07.stdout:4/517: dread d0/d4/d10/d5f/d6d/f71 [0,4194304] 0
2026-03-10T12:37:50.399 INFO:tasks.workunit.client.1.vm07.stdout:4/518: chown d0/d4/d5/d34/f37 0 1
2026-03-10T12:37:50.417 INFO:tasks.workunit.client.0.vm00.stdout:4/353: write df/f42 [4878337,120582] 0
2026-03-10T12:37:50.433 INFO:tasks.workunit.client.1.vm07.stdout:7/387: symlink d0/d67/d6f/l76 0
2026-03-10T12:37:50.433 INFO:tasks.workunit.client.1.vm07.stdout:3/435: write dc/dd/d43/f61 [456010,101087] 0
2026-03-10T12:37:50.433 INFO:tasks.workunit.client.1.vm07.stdout:3/436: write dc/d18/d24/f3a [4519879,23675] 0
2026-03-10T12:37:50.433 INFO:tasks.workunit.client.1.vm07.stdout:0/456: fsync d0/f15 0
2026-03-10T12:37:50.433 INFO:tasks.workunit.client.1.vm07.stdout:9/440: dwrite d5/d13/d2c/f41 [0,4194304] 0
2026-03-10T12:37:50.433 INFO:tasks.workunit.client.0.vm00.stdout:4/354: dread df/d1f/d36/f51 [0,4194304] 0
2026-03-10T12:37:50.433 INFO:tasks.workunit.client.0.vm00.stdout:4/355: dread df/d32/d64/f67 [0,4194304] 0
2026-03-10T12:37:50.433 INFO:tasks.workunit.client.0.vm00.stdout:6/292: link d2/c6 d2/d51/c6f 0
2026-03-10T12:37:50.433 INFO:tasks.workunit.client.0.vm00.stdout:4/356: creat df/d63/d6b/f75 x:0 0 0
2026-03-10T12:37:50.433 INFO:tasks.workunit.client.0.vm00.stdout:7/280: mknod da/d26/d37/d56/c67 0
2026-03-10T12:37:50.439 INFO:tasks.workunit.client.0.vm00.stdout:0/399: link c1 d3/d7/d58/c8e 0
2026-03-10T12:37:50.440 INFO:tasks.workunit.client.0.vm00.stdout:0/400: chown d3/db/d24/f2f 11 1
2026-03-10T12:37:50.440 INFO:tasks.workunit.client.1.vm07.stdout:5/453: getdents d0/d22/d18/d80 0
2026-03-10T12:37:50.446 INFO:tasks.workunit.client.0.vm00.stdout:8/268: dread d0/f10 [0,4194304] 0
2026-03-10T12:37:50.453 INFO:tasks.workunit.client.0.vm00.stdout:7/281: write da/f13 [3394982,102151] 0
2026-03-10T12:37:50.463 INFO:tasks.workunit.client.1.vm07.stdout:1/387: dread d9/df/f26 [0,4194304] 0
2026-03-10T12:37:50.464 INFO:tasks.workunit.client.1.vm07.stdout:1/388: readlink d9/df/d29/d2b/d31/l53 0
2026-03-10T12:37:50.469 INFO:tasks.workunit.client.0.vm00.stdout:7/282: dwrite da/fb [0,4194304] 0
2026-03-10T12:37:50.473 INFO:tasks.workunit.client.0.vm00.stdout:7/283: dread da/d25/d2c/f4f [0,4194304] 0
2026-03-10T12:37:50.477 INFO:tasks.workunit.client.0.vm00.stdout:5/332: truncate d1f/d6a/f57 2876539 0
2026-03-10T12:37:50.482 INFO:tasks.workunit.client.0.vm00.stdout:0/401: chown d3/d7/d3c/c36 143378071 1
2026-03-10T12:37:50.483 INFO:tasks.workunit.client.0.vm00.stdout:5/333: dwrite d1f/d26/d2b/d37/f38 [4194304,4194304] 0
2026-03-10T12:37:50.487 INFO:tasks.workunit.client.1.vm07.stdout:7/388: creat d0/d57/d62/f77 x:0 0 0
2026-03-10T12:37:50.487 INFO:tasks.workunit.client.0.vm00.stdout:5/334: dread d1f/f22 [0,4194304] 0
2026-03-10T12:37:50.496 INFO:tasks.workunit.client.0.vm00.stdout:2/306: rename d4/d6/d2d/d31/d32 to d4/d6/d41/d6d 0
2026-03-10T12:37:50.498 INFO:tasks.workunit.client.1.vm07.stdout:9/441: chown d5/lf 5105 1
2026-03-10T12:37:50.499 INFO:tasks.workunit.client.1.vm07.stdout:9/442: fsync d5/d16/d18/fa1 0
2026-03-10T12:37:50.500 INFO:tasks.workunit.client.0.vm00.stdout:2/307: creat d4/f6e x:0 0 0
2026-03-10T12:37:50.501 INFO:tasks.workunit.client.0.vm00.stdout:2/308: creat d4/d6/d2d/d3a/d43/d51/f6f x:0 0 0
2026-03-10T12:37:50.503 INFO:tasks.workunit.client.0.vm00.stdout:0/402: link d3/d33/f4d d3/d40/d65/f8f 0
2026-03-10T12:37:50.504 INFO:tasks.workunit.client.0.vm00.stdout:0/403: fdatasync d3/d7/f70 0
2026-03-10T12:37:50.511 INFO:tasks.workunit.client.1.vm07.stdout:8/417: sync
2026-03-10T12:37:50.511 INFO:tasks.workunit.client.1.vm07.stdout:8/418: dread - d1/d3/d6/d50/f5e zero size
2026-03-10T12:37:50.511 INFO:tasks.workunit.client.1.vm07.stdout:8/419: fdatasync d1/d3/d6/d7b/f7c 0
2026-03-10T12:37:50.516 INFO:tasks.workunit.client.0.vm00.stdout:0/404: mknod d3/d7/d4c/d5b/d38/c90 0
2026-03-10T12:37:50.516 INFO:tasks.workunit.client.1.vm07.stdout:4/519: rename d0/d4/d10/d9a/c28 to d0/d4/d7a/d46/caf 0
2026-03-10T12:37:50.517 INFO:tasks.workunit.client.0.vm00.stdout:0/405: write f2 [2482478,56116] 0
2026-03-10T12:37:50.524 INFO:tasks.workunit.client.0.vm00.stdout:1/363: unlink da/d21/d39/f4f 0
2026-03-10T12:37:50.526 INFO:tasks.workunit.client.0.vm00.stdout:1/364: write da/d12/f1d [3603086,70537] 0
2026-03-10T12:37:50.527 INFO:tasks.workunit.client.0.vm00.stdout:1/365: symlink da/d24/d28/d44/l7d 0
2026-03-10T12:37:50.528 INFO:tasks.workunit.client.0.vm00.stdout:1/366: chown da/d24/l2f 4491 1
2026-03-10T12:37:50.528 INFO:tasks.workunit.client.0.vm00.stdout:1/367: stat da/d21/d27/d6a/f6b 0
2026-03-10T12:37:50.529 INFO:tasks.workunit.client.0.vm00.stdout:1/368: stat da/d24/d28/c29 0
2026-03-10T12:37:50.529 INFO:tasks.workunit.client.0.vm00.stdout:1/369: read da/d12/f1d [1010134,78332] 0
2026-03-10T12:37:50.537 INFO:tasks.workunit.client.0.vm00.stdout:4/357: truncate df/f16 1555140 0
2026-03-10T12:37:50.539 INFO:tasks.workunit.client.0.vm00.stdout:4/358: write f8 [4216184,33617] 0
2026-03-10T12:37:50.545 INFO:tasks.workunit.client.0.vm00.stdout:0/406: mknod d3/d7/d4c/d5b/d38/c91 0
2026-03-10T12:37:50.564 INFO:tasks.workunit.client.0.vm00.stdout:5/335: dread f19 [0,4194304] 0
2026-03-10T12:37:50.566 INFO:tasks.workunit.client.0.vm00.stdout:5/336: mkdir d1f/d26/d2b/d35/d53/d72 0
2026-03-10T12:37:50.566 INFO:tasks.workunit.client.0.vm00.stdout:5/337: write d1f/d26/f48 [1217799,57456] 0
2026-03-10T12:37:50.567 INFO:tasks.workunit.client.0.vm00.stdout:5/338: fdatasync d1f/d26/d2b/d35/d53/f70 0
2026-03-10T12:37:50.568 INFO:tasks.workunit.client.0.vm00.stdout:5/339: fdatasync d1f/d26/d2b/d37/f61 0
2026-03-10T12:37:50.569 INFO:tasks.workunit.client.1.vm07.stdout:8/420: rename d1/d3/d18/f38 to d1/d3/d11/f86 0
2026-03-10T12:37:50.569 INFO:tasks.workunit.client.1.vm07.stdout:4/520: rmdir d0/d4/d10/d5f 39
2026-03-10T12:37:50.570 INFO:tasks.workunit.client.0.vm00.stdout:0/407: dwrite d3/d22/d3a/f8c [0,4194304] 0
2026-03-10T12:37:50.571 INFO:tasks.workunit.client.1.vm07.stdout:6/359: dwrite d1/d4/d6/d4e/d64/f6f [0,4194304] 0
2026-03-10T12:37:50.571 INFO:tasks.workunit.client.1.vm07.stdout:7/389: symlink d0/l78 0
2026-03-10T12:37:50.575 INFO:tasks.workunit.client.1.vm07.stdout:8/421: dwrite d1/f79 [0,4194304] 0
2026-03-10T12:37:50.580 INFO:tasks.workunit.client.0.vm00.stdout:1/370: sync
2026-03-10T12:37:50.581 INFO:tasks.workunit.client.0.vm00.stdout:1/371: fsync da/d12/d26/f2e 0
2026-03-10T12:37:50.589 INFO:tasks.workunit.client.0.vm00.stdout:5/340: write d1f/d6a/f57 [3062356,96756] 0
2026-03-10T12:37:50.589 INFO:tasks.workunit.client.0.vm00.stdout:5/341: chown d1f/d26/d2b/d37/f4c 0 1
2026-03-10T12:37:50.592 INFO:tasks.workunit.client.0.vm00.stdout:5/342: mkdir d1f/d26/d2b/d35/d53/d5b/d73 0
2026-03-10T12:37:50.593 INFO:tasks.workunit.client.0.vm00.stdout:5/343: read - d1f/d26/d2e/d58/f63 zero size
2026-03-10T12:37:50.594 INFO:tasks.workunit.client.1.vm07.stdout:9/443: creat d5/d69/d93/d97/fa2 x:0 0 0
2026-03-10T12:37:50.595 INFO:tasks.workunit.client.1.vm07.stdout:9/444: chown d5/d13/f67 0 1
2026-03-10T12:37:50.600 INFO:tasks.workunit.client.0.vm00.stdout:0/408: creat d3/d40/d65/f92 x:0 0 0
2026-03-10T12:37:50.600 INFO:tasks.workunit.client.0.vm00.stdout:0/409: fdatasync d3/d7/d4c/d5b/d38/d44/f49 0
2026-03-10T12:37:50.608 INFO:tasks.workunit.client.0.vm00.stdout:5/344: dread d1f/f30 [0,4194304] 0
2026-03-10T12:37:50.610 INFO:tasks.workunit.client.0.vm00.stdout:5/345: creat d1f/d6a/f74 x:0 0 0
2026-03-10T12:37:50.615 INFO:tasks.workunit.client.0.vm00.stdout:7/284: rename da/d1b/d2d to da/d25/d2c/d58/d68 0
2026-03-10T12:37:50.619 INFO:tasks.workunit.client.0.vm00.stdout:2/309: truncate d4/dd/f10 8331407 0
2026-03-10T12:37:50.620 INFO:tasks.workunit.client.0.vm00.stdout:2/310: dread - d4/dd/f62 zero size
2026-03-10T12:37:50.626 INFO:tasks.workunit.client.0.vm00.stdout:2/311: dwrite d4/d53/f61 [0,4194304] 0
2026-03-10T12:37:50.636 INFO:tasks.workunit.client.0.vm00.stdout:3/377: dread dd/d27/d2c/d34/d45/f47 [0,4194304] 0
2026-03-10T12:37:50.636 INFO:tasks.workunit.client.0.vm00.stdout:2/312: write d4/dd/f3c [1877467,128919] 0
2026-03-10T12:37:50.636 INFO:tasks.workunit.client.0.vm00.stdout:2/313: write d4/dd/d38/f3f [1221160,33639] 0
2026-03-10T12:37:50.636 INFO:tasks.workunit.client.0.vm00.stdout:2/314: fsync d4/f28 0
2026-03-10T12:37:50.636 INFO:tasks.workunit.client.0.vm00.stdout:2/315: dread - d4/d6/d41/d6d/d40/f65 zero size
2026-03-10T12:37:50.639 INFO:tasks.workunit.client.0.vm00.stdout:3/378: mknod dd/d27/d2c/d34/d45/c88 0
2026-03-10T12:37:50.645 INFO:tasks.workunit.client.0.vm00.stdout:3/379: creat dd/d27/d2c/f89 x:0 0 0
2026-03-10T12:37:50.651 INFO:tasks.workunit.client.0.vm00.stdout:3/380: mkdir dd/d3d/d8a 0
2026-03-10T12:37:50.651 INFO:tasks.workunit.client.0.vm00.stdout:3/381: dwrite f7 [0,4194304] 0
2026-03-10T12:37:50.653 INFO:tasks.workunit.client.0.vm00.stdout:3/382: dwrite dd/d2a/f78 [0,4194304] 0
2026-03-10T12:37:50.655 INFO:tasks.workunit.client.0.vm00.stdout:0/410: rename d3/db/f45 to d3/d7/d4c/d5b/d38/f93 0
2026-03-10T12:37:50.655 INFO:tasks.workunit.client.0.vm00.stdout:3/383: fdatasync dd/d27/f44 0
2026-03-10T12:37:50.658 INFO:tasks.workunit.client.0.vm00.stdout:2/316: rename d4/d6/d2d/c42 to d4/d6/d2d/d3a/d43/c70 0
2026-03-10T12:37:50.660 INFO:tasks.workunit.client.0.vm00.stdout:3/384: creat dd/d3d/d8a/f8b x:0 0 0
2026-03-10T12:37:50.663 INFO:tasks.workunit.client.0.vm00.stdout:3/385: unlink dd/d18/f12 0
2026-03-10T12:37:50.665 INFO:tasks.workunit.client.0.vm00.stdout:3/386: rmdir dd/d4e 39
2026-03-10T12:37:50.669 INFO:tasks.workunit.client.0.vm00.stdout:3/387: dwrite dd/d27/f44 [0,4194304] 0
2026-03-10T12:37:50.679 INFO:tasks.workunit.client.1.vm07.stdout:4/521: creat d0/d4/d10/d8d/fb0 x:0 0 0
2026-03-10T12:37:50.679 INFO:tasks.workunit.client.1.vm07.stdout:8/422: chown d1/d3/d6/l78 27551 1
2026-03-10T12:37:50.679 INFO:tasks.workunit.client.0.vm00.stdout:2/317: getdents d4 0
2026-03-10T12:37:50.679 INFO:tasks.workunit.client.0.vm00.stdout:2/318: fdatasync d4/d6/d2d/f3d 0
2026-03-10T12:37:50.679 INFO:tasks.workunit.client.0.vm00.stdout:6/293: mkdir d2/d51/d70 0
2026-03-10T12:37:50.679 INFO:tasks.workunit.client.0.vm00.stdout:2/319: dread d4/d6/f22 [0,4194304] 0
2026-03-10T12:37:50.684 INFO:tasks.workunit.client.0.vm00.stdout:9/359: dread d0/d3d/d43/d53/d57/f4f [0,4194304] 0
2026-03-10T12:37:50.686 INFO:tasks.workunit.client.0.vm00.stdout:2/320: creat d4/d6/d2d/d31/f71 x:0 0 0
2026-03-10T12:37:50.687 INFO:tasks.workunit.client.1.vm07.stdout:3/437: getdents dc/dd/d43/d5c 0
2026-03-10T12:37:50.692 INFO:tasks.workunit.client.0.vm00.stdout:2/321: symlink d4/dd/d38/l72 0
2026-03-10T12:37:50.693 INFO:tasks.workunit.client.0.vm00.stdout:2/322: fdatasync d4/dd/d38/f5a 0
2026-03-10T12:37:50.693 INFO:tasks.workunit.client.0.vm00.stdout:2/323: fdatasync d4/d6/f16 0
2026-03-10T12:37:50.694 INFO:tasks.workunit.client.1.vm07.stdout:0/457: write d0/d14/d5f/d41/f55 [3524160,100319] 0
2026-03-10T12:37:50.694 INFO:tasks.workunit.client.1.vm07.stdout:5/454: write d0/d22/d18/d19/d21/f61 [714815,101458] 0
2026-03-10T12:37:50.697 INFO:tasks.workunit.client.0.vm00.stdout:2/324: creat d4/f73 x:0 0 0
2026-03-10T12:37:50.701 INFO:tasks.workunit.client.0.vm00.stdout:7/285: mknod da/d26/d50/c69 0
2026-03-10T12:37:50.731 INFO:tasks.workunit.client.0.vm00.stdout:4/359: dread df/f16 [0,4194304] 0
2026-03-10T12:37:50.732 INFO:tasks.workunit.client.0.vm00.stdout:4/360: mkdir df/d32/d76 0
2026-03-10T12:37:50.733 INFO:tasks.workunit.client.0.vm00.stdout:4/361: mkdir df/d63/d77 0
2026-03-10T12:37:50.734 INFO:tasks.workunit.client.0.vm00.stdout:4/362: rmdir df/d32/d64 39
2026-03-10T12:37:50.736 INFO:tasks.workunit.client.1.vm07.stdout:2/324: dread d0/d42/f2c [0,4194304] 0
2026-03-10T12:37:50.736 INFO:tasks.workunit.client.0.vm00.stdout:4/363: mknod df/d1f/d22/d26/d2e/c78 0
2026-03-10T12:37:50.743 INFO:tasks.workunit.client.0.vm00.stdout:6/294: creat d2/d42/f71 x:0 0 0
2026-03-10T12:37:50.745 INFO:tasks.workunit.client.0.vm00.stdout:6/295: fdatasync d2/d14/f2e 0
2026-03-10T12:37:50.753 INFO:tasks.workunit.client.0.vm00.stdout:1/372: write da/f14 [2368823,78737] 0
2026-03-10T12:37:50.755 INFO:tasks.workunit.client.0.vm00.stdout:1/373: mkdir da/d24/d28/d44/d5d/d72/d7e 0
2026-03-10T12:37:50.756 INFO:tasks.workunit.client.0.vm00.stdout:1/374: stat da/d21/d27/f6e 0
2026-03-10T12:37:50.757 INFO:tasks.workunit.client.0.vm00.stdout:1/375: truncate da/d12/d26/f2e 1987841 0
2026-03-10T12:37:50.758 INFO:tasks.workunit.client.0.vm00.stdout:1/376: fdatasync da/d21/d39/f55 0
2026-03-10T12:37:50.762 INFO:tasks.workunit.client.0.vm00.stdout:1/377: chown da/d24/f47 0 1
2026-03-10T12:37:50.762 INFO:tasks.workunit.client.0.vm00.stdout:1/378: dread - da/d24/d5a/f75 zero size
2026-03-10T12:37:50.762 INFO:tasks.workunit.client.0.vm00.stdout:5/346: truncate d1f/f27 919775 0
2026-03-10T12:37:50.764 INFO:tasks.workunit.client.1.vm07.stdout:1/389: dwrite d9/d2d/d4f/d5a/f6e [0,4194304] 0
2026-03-10T12:37:50.766 INFO:tasks.workunit.client.0.vm00.stdout:5/347: creat d1f/d26/d2e/f75 x:0 0 0
2026-03-10T12:37:50.768 INFO:tasks.workunit.client.0.vm00.stdout:1/379: mknod da/d4d/d78/c7f 0
2026-03-10T12:37:50.769 INFO:tasks.workunit.client.0.vm00.stdout:5/348: mknod d1f/d26/d2e/d58/c76 0
2026-03-10T12:37:50.770 INFO:tasks.workunit.client.0.vm00.stdout:1/380: mkdir da/d24/d28/d44/d5d/d80 0
2026-03-10T12:37:50.773 INFO:tasks.workunit.client.0.vm00.stdout:5/349: getdents d1f/d26/d2b/d35/d53/d5b 0
2026-03-10T12:37:50.777 INFO:tasks.workunit.client.0.vm00.stdout:5/350: dwrite d1f/d6a/f57 [0,4194304] 0
2026-03-10T12:37:50.788 INFO:tasks.workunit.client.1.vm07.stdout:8/423: rmdir d1/d3/d6/d50 39
2026-03-10T12:37:50.789 INFO:tasks.workunit.client.0.vm00.stdout:5/351: fdatasync d1f/d26/d2b/d35/d53/f70 0
2026-03-10T12:37:50.789 INFO:tasks.workunit.client.0.vm00.stdout:5/352: dread - d1f/d26/d2b/d35/d53/f70 zero size
2026-03-10T12:37:50.789 INFO:tasks.workunit.client.0.vm00.stdout:5/353: fsync d1f/d39/f65 0
2026-03-10T12:37:50.789 INFO:tasks.workunit.client.0.vm00.stdout:9/360: dread d0/d5/d16/f24 [0,4194304] 0
2026-03-10T12:37:50.789 INFO:tasks.workunit.client.0.vm00.stdout:9/361: chown d0/d5/d16/f39 1044 1
2026-03-10T12:37:50.789 INFO:tasks.workunit.client.0.vm00.stdout:9/362: chown d0/d5/d16/d1e/d2b/f6b 382 1
2026-03-10T12:37:50.789 INFO:tasks.workunit.client.0.vm00.stdout:3/388: dread dd/f25 [0,4194304] 0
2026-03-10T12:37:50.794 INFO:tasks.workunit.client.0.vm00.stdout:8/269: dread d0/d12/d17/f1d [0,4194304] 0
2026-03-10T12:37:50.797 INFO:tasks.workunit.client.0.vm00.stdout:6/296: fdatasync d2/da/f6a 0
2026-03-10T12:37:50.800 INFO:tasks.workunit.client.0.vm00.stdout:6/297: read d2/d39/f46 [505893,113133] 0
2026-03-10T12:37:50.812 INFO:tasks.workunit.client.1.vm07.stdout:1/390: creat d9/df/d54/f7a x:0 0 0
2026-03-10T12:37:50.812 INFO:tasks.workunit.client.0.vm00.stdout:7/286: rename da/d26/l32 to da/d1b/l6a 0
2026-03-10T12:37:50.812 INFO:tasks.workunit.client.0.vm00.stdout:6/298: write d2/d16/d29/d31/d48/f59 [599537,50597] 0
2026-03-10T12:37:50.815 INFO:tasks.workunit.client.0.vm00.stdout:7/287: dwrite da/d41/f4b [0,4194304] 0
2026-03-10T12:37:50.826 INFO:tasks.workunit.client.1.vm07.stdout:8/424: mkdir d1/d3/d11/d87 0
2026-03-10T12:37:50.829 INFO:tasks.workunit.client.0.vm00.stdout:2/325: dwrite d4/d53/f61 [0,4194304] 0
2026-03-10T12:37:50.831 INFO:tasks.workunit.client.0.vm00.stdout:9/363: creat d0/d5/d16/d19/f7d x:0 0 0
2026-03-10T12:37:50.833 INFO:tasks.workunit.client.1.vm07.stdout:0/458: dread d0/f21 [0,4194304] 0
2026-03-10T12:37:50.834 INFO:tasks.workunit.client.0.vm00.stdout:3/389: creat dd/d3d/d84/f8c x:0 0 0
2026-03-10T12:37:50.836 INFO:tasks.workunit.client.0.vm00.stdout:8/270: mknod d0/dd/c50 0
2026-03-10T12:37:50.838 INFO:tasks.workunit.client.1.vm07.stdout:3/438: mkdir dc/d18/d99/d9c 0
2026-03-10T12:37:50.839 INFO:tasks.workunit.client.1.vm07.stdout:5/455: unlink d0/d22/d18/d19/c48 0
2026-03-10T12:37:50.842 INFO:tasks.workunit.client.0.vm00.stdout:8/271: unlink d0/dd/f20 0
2026-03-10T12:37:50.845 INFO:tasks.workunit.client.0.vm00.stdout:8/272: chown d0/dd/d38/f3d 6985 1
2026-03-10T12:37:50.847 INFO:tasks.workunit.client.0.vm00.stdout:8/273: chown d0/d12/d17/l31 12 1
2026-03-10T12:37:50.859 INFO:tasks.workunit.client.1.vm07.stdout:5/456: dwrite d0/d22/f16 [0,4194304] 0
2026-03-10T12:37:50.859 INFO:tasks.workunit.client.0.vm00.stdout:8/274: chown d0/f28 48387955 1
2026-03-10T12:37:50.859 INFO:tasks.workunit.client.0.vm00.stdout:8/275: write d0/f28 [4472234,63098] 0
2026-03-10T12:37:50.859 INFO:tasks.workunit.client.0.vm00.stdout:8/276: chown d0/d12/d36/d3e/f4a 407470 1
2026-03-10T12:37:50.859 INFO:tasks.workunit.client.0.vm00.stdout:8/277: dread d0/d12/d2d/f44 [0,4194304] 0
2026-03-10T12:37:50.859 INFO:tasks.workunit.client.0.vm00.stdout:8/278: mkdir d0/d12/d36/d51 0
2026-03-10T12:37:50.860 INFO:tasks.workunit.client.0.vm00.stdout:8/279: creat d0/d12/d2d/f52 x:0 0 0
2026-03-10T12:37:50.862 INFO:tasks.workunit.client.0.vm00.stdout:8/280: symlink d0/d12/d36/l53 0
2026-03-10T12:37:50.863 INFO:tasks.workunit.client.0.vm00.stdout:8/281: rmdir d0/dd/d38 39
2026-03-10T12:37:50.863 INFO:tasks.workunit.client.0.vm00.stdout:8/282: chown d0/d12/d2d 23 1
2026-03-10T12:37:50.908 INFO:tasks.workunit.client.0.vm00.stdout:7/288: sync
2026-03-10T12:37:50.908 INFO:tasks.workunit.client.0.vm00.stdout:2/326: sync
2026-03-10T12:37:50.908 INFO:tasks.workunit.client.1.vm07.stdout:2/325: sync
2026-03-10T12:37:50.917 INFO:tasks.workunit.client.1.vm07.stdout:7/390: write d0/f14 [605185,106827] 0
2026-03-10T12:37:50.917 INFO:tasks.workunit.client.0.vm00.stdout:2/327: dread d4/d6/f4e [0,4194304] 0
2026-03-10T12:37:50.920 INFO:tasks.workunit.client.0.vm00.stdout:2/328: write d4/dd/d38/f3f [146690,41790] 0
2026-03-10T12:37:50.920 INFO:tasks.workunit.client.0.vm00.stdout:2/329: write d4/d53/f5d [168517,116127] 0
2026-03-10T12:37:50.923 INFO:tasks.workunit.client.0.vm00.stdout:2/330: creat d4/d6/d2d/d3a/f74 x:0 0 0
2026-03-10T12:37:50.923 INFO:tasks.workunit.client.0.vm00.stdout:2/331: write d4/f73 [42265,31867] 0
2026-03-10T12:37:50.927 INFO:tasks.workunit.client.0.vm00.stdout:2/332: dwrite d4/d6/d2d/d3a/f74 [0,4194304] 0
2026-03-10T12:37:50.928 INFO:tasks.workunit.client.0.vm00.stdout:2/333: write d4/d6/f16 [8040720,112912] 0
2026-03-10T12:37:50.939 INFO:tasks.workunit.client.0.vm00.stdout:2/334: dread - d4/f6e zero size
2026-03-10T12:37:50.940 INFO:tasks.workunit.client.0.vm00.stdout:0/411: write d3/d7/d4c/d5b/f57 [122810,85042] 0
2026-03-10T12:37:50.951 INFO:tasks.workunit.client.1.vm07.stdout:3/439: creat dc/dd/d43/d5c/f9d x:0 0 0
2026-03-10T12:37:50.963 INFO:tasks.workunit.client.0.vm00.stdout:3/390: dread dd/d18/d13/d1d/f69 [0,4194304] 0
2026-03-10T12:37:50.964 INFO:tasks.workunit.client.1.vm07.stdout:5/457: truncate d0/ff 4742408 0
2026-03-10T12:37:50.965 INFO:tasks.workunit.client.1.vm07.stdout:5/458: fsync d0/d22/d18/d19/d36/f3d 0
2026-03-10T12:37:50.966 INFO:tasks.workunit.client.1.vm07.stdout:6/360: truncate d1/d4/d6/d16/d1a/d33/f37 945238 0
2026-03-10T12:37:50.967 INFO:tasks.workunit.client.0.vm00.stdout:3/391: link dd/d18/d13/f6b dd/d18/d14/d2b/f8d 0
2026-03-10T12:37:50.971 INFO:tasks.workunit.client.0.vm00.stdout:3/392: dwrite f7 [8388608,4194304] 0
2026-03-10T12:37:50.971 INFO:tasks.workunit.client.1.vm07.stdout:4/522: getdents d0/d4/d10/d3c/d2b 0
2026-03-10T12:37:50.972 INFO:tasks.workunit.client.0.vm00.stdout:3/393: dread - dd/d64/f87 zero size
2026-03-10T12:37:50.974 INFO:tasks.workunit.client.0.vm00.stdout:3/394: chown dd/d4e/d5d/f71 5 1
2026-03-10T12:37:50.977 INFO:tasks.workunit.client.1.vm07.stdout:9/445: write d5/d1f/d31/d64/f70 [445390,16125] 0
2026-03-10T12:37:50.977 INFO:tasks.workunit.client.0.vm00.stdout:3/395: dwrite dd/d18/f7c [0,4194304] 0
2026-03-10T12:37:50.982 INFO:tasks.workunit.client.0.vm00.stdout:3/396: link dd/d18/d13/c1c dd/d3d/c8e 0
2026-03-10T12:37:50.982 INFO:tasks.workunit.client.0.vm00.stdout:3/397: fsync dd/d3d/d8a/f8b 0
2026-03-10T12:37:50.983 INFO:tasks.workunit.client.0.vm00.stdout:3/398: write dd/d27/d2c/f7d [682445,126335] 0
2026-03-10T12:37:50.988 INFO:tasks.workunit.client.0.vm00.stdout:2/335: dread d4/f1d [4194304,4194304] 0
2026-03-10T12:37:50.988 
INFO:tasks.workunit.client.0.vm00.stdout:2/336: readlink d4/d6/d41/l56 0 2026-03-10T12:37:50.997 INFO:tasks.workunit.client.0.vm00.stdout:0/412: link d3/d7/d4c/d5b/d38/c91 d3/d7/d3c/c94 0 2026-03-10T12:37:51.007 INFO:tasks.workunit.client.0.vm00.stdout:0/413: mknod d3/d7/d4c/d5b/d38/d44/d5a/c95 0 2026-03-10T12:37:51.009 INFO:tasks.workunit.client.0.vm00.stdout:3/399: dread dd/d27/d2c/d34/f60 [0,4194304] 0 2026-03-10T12:37:51.012 INFO:tasks.workunit.client.1.vm07.stdout:7/391: chown d0/l41 0 1 2026-03-10T12:37:51.037 INFO:tasks.workunit.client.0.vm00.stdout:6/299: getdents d2/d51 0 2026-03-10T12:37:51.045 INFO:tasks.workunit.client.1.vm07.stdout:1/391: write d9/df/f26 [899087,4034] 0 2026-03-10T12:37:51.049 INFO:tasks.workunit.client.0.vm00.stdout:7/289: write da/d47/f62 [4785336,66661] 0 2026-03-10T12:37:51.049 INFO:tasks.workunit.client.0.vm00.stdout:6/300: truncate d2/f9 3524142 0 2026-03-10T12:37:51.062 INFO:tasks.workunit.client.0.vm00.stdout:7/290: symlink da/d25/l6b 0 2026-03-10T12:37:51.064 INFO:tasks.workunit.client.1.vm07.stdout:7/392: mkdir d0/d61/d79 0 2026-03-10T12:37:51.068 INFO:tasks.workunit.client.0.vm00.stdout:6/301: mknod d2/d16/d29/d31/d48/c72 0 2026-03-10T12:37:51.072 INFO:tasks.workunit.client.0.vm00.stdout:6/302: dwrite d2/da/dc/f25 [0,4194304] 0 2026-03-10T12:37:51.074 INFO:tasks.workunit.client.0.vm00.stdout:4/364: mknod df/d1f/d22/c79 0 2026-03-10T12:37:51.075 INFO:tasks.workunit.client.0.vm00.stdout:4/365: creat df/d63/d6b/d73/f7a x:0 0 0 2026-03-10T12:37:51.078 INFO:tasks.workunit.client.0.vm00.stdout:4/366: dread df/f12 [0,4194304] 0 2026-03-10T12:37:51.081 INFO:tasks.workunit.client.0.vm00.stdout:4/367: getdents df/d1f/d36 0 2026-03-10T12:37:51.084 INFO:tasks.workunit.client.0.vm00.stdout:4/368: mknod df/d32/d76/c7b 0 2026-03-10T12:37:51.085 INFO:tasks.workunit.client.0.vm00.stdout:4/369: dread - df/d1f/d36/d3a/d41/f5e zero size 2026-03-10T12:37:51.090 INFO:tasks.workunit.client.0.vm00.stdout:4/370: getdents df/d1f/d22/d26/d65 0 
2026-03-10T12:37:51.096 INFO:tasks.workunit.client.0.vm00.stdout:0/414: truncate d3/d7/d4c/d5b/f56 3160339 0 2026-03-10T12:37:51.103 INFO:tasks.workunit.client.1.vm07.stdout:6/361: mkdir d1/d4/d71 0 2026-03-10T12:37:51.104 INFO:tasks.workunit.client.1.vm07.stdout:6/362: write d1/f38 [3795864,83631] 0 2026-03-10T12:37:51.105 INFO:tasks.workunit.client.1.vm07.stdout:6/363: chown d1/d4/d6/d16/d1a/d2c 137 1 2026-03-10T12:37:51.106 INFO:tasks.workunit.client.1.vm07.stdout:3/440: dwrite dc/dd/d1f/d45/f54 [0,4194304] 0 2026-03-10T12:37:51.108 INFO:tasks.workunit.client.0.vm00.stdout:6/303: read - d2/d16/d29/f4c zero size 2026-03-10T12:37:51.108 INFO:tasks.workunit.client.1.vm07.stdout:6/364: chown d1/d4/d6/d53/f5e 1 1 2026-03-10T12:37:51.110 INFO:tasks.workunit.client.1.vm07.stdout:8/425: dwrite d1/d3/f57 [0,4194304] 0 2026-03-10T12:37:51.117 INFO:tasks.workunit.client.1.vm07.stdout:5/459: rename d0/d22/d18/d19/d2e/d3f/d5c to d0/d22/d18/d3e/d53/d9e 0 2026-03-10T12:37:51.117 INFO:tasks.workunit.client.1.vm07.stdout:5/460: dread - d0/d22/f93 zero size 2026-03-10T12:37:51.123 INFO:tasks.workunit.client.0.vm00.stdout:6/304: truncate d2/d16/d29/f4c 724147 0 2026-03-10T12:37:51.123 INFO:tasks.workunit.client.1.vm07.stdout:2/326: dwrite d0/f40 [0,4194304] 0 2026-03-10T12:37:51.123 INFO:tasks.workunit.client.1.vm07.stdout:5/461: truncate d0/d22/d18/d19/d2e/f59 4709838 0 2026-03-10T12:37:51.123 INFO:tasks.workunit.client.0.vm00.stdout:6/305: chown d2/d16/f23 24093105 1 2026-03-10T12:37:51.124 INFO:tasks.workunit.client.0.vm00.stdout:6/306: write d2/d16/d29/f64 [930810,124677] 0 2026-03-10T12:37:51.124 INFO:tasks.workunit.client.1.vm07.stdout:3/441: dwrite dc/d18/d24/f55 [0,4194304] 0 2026-03-10T12:37:51.134 INFO:tasks.workunit.client.1.vm07.stdout:2/327: write d0/f40 [2011306,46932] 0 2026-03-10T12:37:51.134 INFO:tasks.workunit.client.1.vm07.stdout:9/446: getdents d5/d1f/d75 0 2026-03-10T12:37:51.149 INFO:tasks.workunit.client.0.vm00.stdout:6/307: sync 2026-03-10T12:37:51.156 
INFO:tasks.workunit.client.0.vm00.stdout:0/415: link d3/f4 d3/d7/d4c/f96 0 2026-03-10T12:37:51.156 INFO:tasks.workunit.client.1.vm07.stdout:0/459: link d0/d14/d5f/d76/d2f/d31/l3f d0/d14/d5f/d41/d6a/d74/l91 0 2026-03-10T12:37:51.156 INFO:tasks.workunit.client.1.vm07.stdout:7/393: creat d0/d47/d48/f7a x:0 0 0 2026-03-10T12:37:51.160 INFO:tasks.workunit.client.0.vm00.stdout:6/308: mknod d2/d39/c73 0 2026-03-10T12:37:51.161 INFO:tasks.workunit.client.1.vm07.stdout:7/394: dread d0/d47/d48/f4b [0,4194304] 0 2026-03-10T12:37:51.162 INFO:tasks.workunit.client.0.vm00.stdout:0/416: chown d3/d7/c52 0 1 2026-03-10T12:37:51.162 INFO:tasks.workunit.client.0.vm00.stdout:0/417: write d3/d7/d4c/d5b/f57 [640483,66092] 0 2026-03-10T12:37:51.166 INFO:tasks.workunit.client.0.vm00.stdout:6/309: read d2/d39/f46 [145925,57628] 0 2026-03-10T12:37:51.167 INFO:tasks.workunit.client.1.vm07.stdout:6/365: symlink d1/d4/d6/d53/l72 0 2026-03-10T12:37:51.173 INFO:tasks.workunit.client.1.vm07.stdout:6/366: dwrite d1/d4/f5a [0,4194304] 0 2026-03-10T12:37:51.181 INFO:tasks.workunit.client.1.vm07.stdout:4/523: rename d0/d19/c83 to d0/d4/d10/d3c/d2b/d2d/da7/cb1 0 2026-03-10T12:37:51.186 INFO:tasks.workunit.client.0.vm00.stdout:1/381: rename da/d12/d26/f40 to da/d24/f81 0 2026-03-10T12:37:51.188 INFO:tasks.workunit.client.0.vm00.stdout:9/364: rename d0/d5/d16/d1e/d27/l58 to d0/d3d/d59/d4e/l7e 0 2026-03-10T12:37:51.190 INFO:tasks.workunit.client.0.vm00.stdout:5/354: unlink d1f/f21 0 2026-03-10T12:37:51.192 INFO:tasks.workunit.client.0.vm00.stdout:8/283: rename d0/d12/d36/d3e/l4b to d0/d46/l54 0 2026-03-10T12:37:51.193 INFO:tasks.workunit.client.0.vm00.stdout:8/284: chown d0/dd/f4d 381103 1 2026-03-10T12:37:51.194 INFO:tasks.workunit.client.1.vm07.stdout:3/442: rmdir dc 39 2026-03-10T12:37:51.194 INFO:tasks.workunit.client.0.vm00.stdout:5/355: rmdir d1f/d26/d2b/d37 39 2026-03-10T12:37:51.196 INFO:tasks.workunit.client.0.vm00.stdout:3/400: truncate dd/d27/d2c/d34/d38/f48 4324456 0 2026-03-10T12:37:51.198 
INFO:tasks.workunit.client.0.vm00.stdout:1/382: mknod da/d24/d28/d44/d5d/d80/c82 0 2026-03-10T12:37:51.198 INFO:tasks.workunit.client.1.vm07.stdout:2/328: mkdir d0/d42/d4e/d56/d70 0 2026-03-10T12:37:51.199 INFO:tasks.workunit.client.0.vm00.stdout:9/365: mkdir d0/d7f 0 2026-03-10T12:37:51.200 INFO:tasks.workunit.client.0.vm00.stdout:9/366: fdatasync d0/d3d/d59/d4e/f70 0 2026-03-10T12:37:51.203 INFO:tasks.workunit.client.1.vm07.stdout:6/367: creat d1/d4/d6/d43/f73 x:0 0 0 2026-03-10T12:37:51.206 INFO:tasks.workunit.client.0.vm00.stdout:2/337: rename d4/f28 to d4/d6/f75 0 2026-03-10T12:37:51.208 INFO:tasks.workunit.client.1.vm07.stdout:3/443: fsync dc/dd/f85 0 2026-03-10T12:37:51.208 INFO:tasks.workunit.client.1.vm07.stdout:4/524: chown d0/d4/d10/d5f/d6d/c6f 0 1 2026-03-10T12:37:51.212 INFO:tasks.workunit.client.1.vm07.stdout:1/392: getdents d9/df/d55 0 2026-03-10T12:37:51.212 INFO:tasks.workunit.client.1.vm07.stdout:6/368: creat d1/d4/d6/d16/d1a/d33/f74 x:0 0 0 2026-03-10T12:37:51.214 INFO:tasks.workunit.client.1.vm07.stdout:6/369: write d1/d4/d6/d4e/d64/f6f [66220,96717] 0 2026-03-10T12:37:51.236 INFO:tasks.workunit.client.1.vm07.stdout:0/460: creat d0/d14/d5f/d76/d2f/d31/d4f/f92 x:0 0 0 2026-03-10T12:37:51.237 INFO:tasks.workunit.client.1.vm07.stdout:4/525: mkdir d0/d4/d10/d8d/db2 0 2026-03-10T12:37:51.237 INFO:tasks.workunit.client.1.vm07.stdout:2/329: creat d0/d29/d64/d6c/f71 x:0 0 0 2026-03-10T12:37:51.237 INFO:tasks.workunit.client.1.vm07.stdout:3/444: dwrite dc/dd/d1f/f6d [0,4194304] 0 2026-03-10T12:37:51.237 INFO:tasks.workunit.client.0.vm00.stdout:7/291: rename da/d25/f4e to da/d26/d37/d56/f6c 0 2026-03-10T12:37:51.237 INFO:tasks.workunit.client.0.vm00.stdout:7/292: read - da/d25/f5a zero size 2026-03-10T12:37:51.237 INFO:tasks.workunit.client.0.vm00.stdout:4/371: rename df/d63/d6b/d73/f49 to df/d57/f7c 0 2026-03-10T12:37:51.237 INFO:tasks.workunit.client.0.vm00.stdout:5/356: rename d1f/f25 to d1f/d26/d2b/d37/f77 0 2026-03-10T12:37:51.237 
INFO:tasks.workunit.client.0.vm00.stdout:5/357: chown d1f/d26/d2e 1105114 1 2026-03-10T12:37:51.237 INFO:tasks.workunit.client.0.vm00.stdout:5/358: fdatasync d1f/f2c 0 2026-03-10T12:37:51.237 INFO:tasks.workunit.client.0.vm00.stdout:5/359: rmdir d1f/d39 39 2026-03-10T12:37:51.237 INFO:tasks.workunit.client.0.vm00.stdout:4/372: dread df/f3d [0,4194304] 0 2026-03-10T12:37:51.237 INFO:tasks.workunit.client.0.vm00.stdout:5/360: dwrite d1f/f30 [4194304,4194304] 0 2026-03-10T12:37:51.237 INFO:tasks.workunit.client.0.vm00.stdout:7/293: symlink da/d1b/l6d 0 2026-03-10T12:37:51.237 INFO:tasks.workunit.client.0.vm00.stdout:4/373: write df/f16 [2250924,49764] 0 2026-03-10T12:37:51.242 INFO:tasks.workunit.client.0.vm00.stdout:5/361: mkdir d1f/d26/d2b/d35/d78 0 2026-03-10T12:37:51.250 INFO:tasks.workunit.client.1.vm07.stdout:3/445: write dc/dd/f41 [4708441,74819] 0 2026-03-10T12:37:51.266 INFO:tasks.workunit.client.0.vm00.stdout:5/362: unlink l9 0 2026-03-10T12:37:51.266 INFO:tasks.workunit.client.0.vm00.stdout:4/374: link df/d1f/d22/d26/d2e/f50 df/d1f/d22/f7d 0 2026-03-10T12:37:51.266 INFO:tasks.workunit.client.0.vm00.stdout:4/375: dwrite df/d1f/d22/f52 [0,4194304] 0 2026-03-10T12:37:51.266 INFO:tasks.workunit.client.0.vm00.stdout:4/376: write df/d1f/d22/d26/f56 [958070,16181] 0 2026-03-10T12:37:51.266 INFO:tasks.workunit.client.0.vm00.stdout:4/377: dwrite df/d1f/d36/d3a/d41/f47 [0,4194304] 0 2026-03-10T12:37:51.266 INFO:tasks.workunit.client.0.vm00.stdout:5/363: creat d1f/d26/f79 x:0 0 0 2026-03-10T12:37:51.266 INFO:tasks.workunit.client.0.vm00.stdout:5/364: mkdir d1f/d26/d2e/d7a 0 2026-03-10T12:37:51.266 INFO:tasks.workunit.client.1.vm07.stdout:1/393: symlink d9/df/d29/l7b 0 2026-03-10T12:37:51.267 INFO:tasks.workunit.client.1.vm07.stdout:2/330: creat d0/d5b/f72 x:0 0 0 2026-03-10T12:37:51.267 INFO:tasks.workunit.client.0.vm00.stdout:6/310: sync 2026-03-10T12:37:51.280 INFO:tasks.workunit.client.1.vm07.stdout:5/462: sync 2026-03-10T12:37:51.280 
INFO:tasks.workunit.client.1.vm07.stdout:4/526: creat d0/d4/d5/da/fb3 x:0 0 0 2026-03-10T12:37:51.283 INFO:tasks.workunit.client.0.vm00.stdout:7/294: creat da/d25/d2e/d4c/f6e x:0 0 0 2026-03-10T12:37:51.293 INFO:tasks.workunit.client.0.vm00.stdout:7/295: creat da/d26/d37/f6f x:0 0 0 2026-03-10T12:37:51.300 INFO:tasks.workunit.client.1.vm07.stdout:5/463: symlink d0/d22/d18/d19/d21/d3a/l9f 0 2026-03-10T12:37:51.304 INFO:tasks.workunit.client.1.vm07.stdout:5/464: readlink d0/d22/d18/d3e/d53/d9e/l6e 0 2026-03-10T12:37:51.311 INFO:tasks.workunit.client.0.vm00.stdout:7/296: getdents da/d3f 0 2026-03-10T12:37:51.311 INFO:tasks.workunit.client.0.vm00.stdout:7/297: write da/d25/d2c/d58/f64 [806373,10532] 0 2026-03-10T12:37:51.316 INFO:tasks.workunit.client.0.vm00.stdout:7/298: dread - da/d1b/f1e zero size 2026-03-10T12:37:51.320 INFO:tasks.workunit.client.1.vm07.stdout:2/331: creat d0/f73 x:0 0 0 2026-03-10T12:37:51.320 INFO:tasks.workunit.client.0.vm00.stdout:7/299: symlink da/d26/d37/d61/l70 0 2026-03-10T12:37:51.320 INFO:tasks.workunit.client.0.vm00.stdout:7/300: write da/d25/d2c/d58/f64 [130929,73312] 0 2026-03-10T12:37:51.320 INFO:tasks.workunit.client.0.vm00.stdout:9/367: sync 2026-03-10T12:37:51.320 INFO:tasks.workunit.client.0.vm00.stdout:4/378: sync 2026-03-10T12:37:51.321 INFO:tasks.workunit.client.1.vm07.stdout:1/394: getdents d9/df/d29/d2b/d31 0 2026-03-10T12:37:51.321 INFO:tasks.workunit.client.1.vm07.stdout:4/527: creat d0/d5c/d7c/fb4 x:0 0 0 2026-03-10T12:37:51.328 INFO:tasks.workunit.client.0.vm00.stdout:4/379: dread - df/d1f/d22/d26/f31 zero size 2026-03-10T12:37:51.331 INFO:tasks.workunit.client.0.vm00.stdout:7/301: rmdir da/d1b/d40 39 2026-03-10T12:37:51.337 INFO:tasks.workunit.client.1.vm07.stdout:4/528: rmdir d0/d4/d10/d3c/d2b 39 2026-03-10T12:37:51.337 INFO:tasks.workunit.client.0.vm00.stdout:4/380: creat df/d32/d76/f7e x:0 0 0 2026-03-10T12:37:51.341 INFO:tasks.workunit.client.0.vm00.stdout:7/302: mkdir da/d3f/d71 0 2026-03-10T12:37:51.343 
INFO:tasks.workunit.client.1.vm07.stdout:4/529: unlink d0/d19/ca6 0 2026-03-10T12:37:51.344 INFO:tasks.workunit.client.0.vm00.stdout:8/285: chown d0/d12/d36/l40 746139 1 2026-03-10T12:37:51.344 INFO:tasks.workunit.client.0.vm00.stdout:4/381: link df/d32/c62 df/c7f 0 2026-03-10T12:37:51.345 INFO:tasks.workunit.client.0.vm00.stdout:6/311: rename d2/d16/d29/d31/d48 to d2/d16/d74 0 2026-03-10T12:37:51.346 INFO:tasks.workunit.client.0.vm00.stdout:7/303: unlink da/d25/d2c/d58/f64 0 2026-03-10T12:37:51.346 INFO:tasks.workunit.client.0.vm00.stdout:6/312: chown d2/da/dc/d2f/f4f 313327 1 2026-03-10T12:37:51.347 INFO:tasks.workunit.client.0.vm00.stdout:8/286: chown d0/l1 12976 1 2026-03-10T12:37:51.350 INFO:tasks.workunit.client.0.vm00.stdout:0/418: write d3/d7/f11 [2195178,28129] 0 2026-03-10T12:37:51.363 INFO:tasks.workunit.client.1.vm07.stdout:4/530: unlink d0/d4/c77 0 2026-03-10T12:37:51.363 INFO:tasks.workunit.client.1.vm07.stdout:4/531: link d0/d5c/d7c/fb4 d0/d8e/fb5 0 2026-03-10T12:37:51.363 INFO:tasks.workunit.client.0.vm00.stdout:7/304: creat da/d41/f72 x:0 0 0 2026-03-10T12:37:51.363 INFO:tasks.workunit.client.0.vm00.stdout:8/287: dwrite d0/d12/d2d/f52 [0,4194304] 0 2026-03-10T12:37:51.363 INFO:tasks.workunit.client.0.vm00.stdout:6/313: mknod d2/d16/d29/d31/c75 0 2026-03-10T12:37:51.363 INFO:tasks.workunit.client.0.vm00.stdout:7/305: mkdir da/d26/d50/d73 0 2026-03-10T12:37:51.363 INFO:tasks.workunit.client.0.vm00.stdout:8/288: chown d0/d12/f23 16 1 2026-03-10T12:37:51.365 INFO:tasks.workunit.client.0.vm00.stdout:6/314: dwrite d2/d39/f4a [0,4194304] 0 2026-03-10T12:37:51.368 INFO:tasks.workunit.client.0.vm00.stdout:6/315: chown d2/d16/d74/c55 15646645 1 2026-03-10T12:37:51.368 INFO:tasks.workunit.client.0.vm00.stdout:6/316: write d2/d16/f6d [1377135,86883] 0 2026-03-10T12:37:51.371 INFO:tasks.workunit.client.1.vm07.stdout:4/532: creat d0/d4/d10/d5f/fb6 x:0 0 0 2026-03-10T12:37:51.381 INFO:tasks.workunit.client.0.vm00.stdout:7/306: creat da/d1b/d40/f74 x:0 0 0 
2026-03-10T12:37:51.396 INFO:tasks.workunit.client.1.vm07.stdout:4/533: truncate d0/d4/d5/d34/fa3 663599 0 2026-03-10T12:37:51.397 INFO:tasks.workunit.client.0.vm00.stdout:1/383: dread f5 [0,4194304] 0 2026-03-10T12:37:51.399 INFO:tasks.workunit.client.0.vm00.stdout:9/368: rename d0/d5/d16 to d0/d3d/d43/d80 0 2026-03-10T12:37:51.399 INFO:tasks.workunit.client.0.vm00.stdout:9/369: stat d0/d3d/d59 0 2026-03-10T12:37:51.400 INFO:tasks.workunit.client.0.vm00.stdout:4/382: sync 2026-03-10T12:37:51.401 INFO:tasks.workunit.client.0.vm00.stdout:0/419: sync 2026-03-10T12:37:51.403 INFO:tasks.workunit.client.0.vm00.stdout:1/384: readlink da/d24/l2c 0 2026-03-10T12:37:51.403 INFO:tasks.workunit.client.0.vm00.stdout:1/385: write da/d21/d27/f6e [708878,124802] 0 2026-03-10T12:37:51.405 INFO:tasks.workunit.client.0.vm00.stdout:8/289: rename d0/f7 to d0/d12/d2d/f55 0 2026-03-10T12:37:51.405 INFO:tasks.workunit.client.0.vm00.stdout:8/290: stat d0/f11 0 2026-03-10T12:37:51.407 INFO:tasks.workunit.client.1.vm07.stdout:9/447: dwrite d5/d16/d23/d26/f42 [0,4194304] 0 2026-03-10T12:37:51.408 INFO:tasks.workunit.client.1.vm07.stdout:8/426: write d1/d3/d18/f32 [397155,115657] 0 2026-03-10T12:37:51.412 INFO:tasks.workunit.client.0.vm00.stdout:9/370: dwrite d0/d3d/d43/d80/f39 [8388608,4194304] 0 2026-03-10T12:37:51.413 INFO:tasks.workunit.client.1.vm07.stdout:7/395: truncate d0/d47/d48/f4b 3454566 0 2026-03-10T12:37:51.414 INFO:tasks.workunit.client.0.vm00.stdout:9/371: write d0/d3d/d43/d53/d57/f67 [79451,106076] 0 2026-03-10T12:37:51.418 INFO:tasks.workunit.client.0.vm00.stdout:1/386: creat da/d24/d28/d44/f83 x:0 0 0 2026-03-10T12:37:51.419 INFO:tasks.workunit.client.0.vm00.stdout:8/291: stat d0/d12/d36/c3c 0 2026-03-10T12:37:51.429 INFO:tasks.workunit.client.0.vm00.stdout:8/292: write d0/f11 [726226,87826] 0 2026-03-10T12:37:51.429 INFO:tasks.workunit.client.1.vm07.stdout:6/370: dwrite d1/d4/d6/f13 [0,4194304] 0 2026-03-10T12:37:51.429 INFO:tasks.workunit.client.1.vm07.stdout:0/461: dread 
d0/d14/d5f/d76/d2f/f5d [0,4194304] 0 2026-03-10T12:37:51.432 INFO:tasks.workunit.client.1.vm07.stdout:2/332: sync 2026-03-10T12:37:51.432 INFO:tasks.workunit.client.1.vm07.stdout:1/395: sync 2026-03-10T12:37:51.433 INFO:tasks.workunit.client.1.vm07.stdout:1/396: chown d9/df/d29/d6b 1288878 1 2026-03-10T12:37:51.433 INFO:tasks.workunit.client.0.vm00.stdout:9/372: symlink d0/d3d/d59/l81 0 2026-03-10T12:37:51.436 INFO:tasks.workunit.client.0.vm00.stdout:1/387: truncate da/d24/d28/f3c 3422766 0 2026-03-10T12:37:51.436 INFO:tasks.workunit.client.0.vm00.stdout:9/373: dwrite d0/d3d/d43/d53/d57/f3f [0,4194304] 0 2026-03-10T12:37:51.440 INFO:tasks.workunit.client.0.vm00.stdout:9/374: dwrite d0/d3d/d59/d4e/f6f [0,4194304] 0 2026-03-10T12:37:51.442 INFO:tasks.workunit.client.1.vm07.stdout:0/462: dread - d0/d14/d5f/d41/f77 zero size 2026-03-10T12:37:51.458 INFO:tasks.workunit.client.0.vm00.stdout:3/401: truncate dd/d4e/d5d/f81 7297 0 2026-03-10T12:37:51.459 INFO:tasks.workunit.client.0.vm00.stdout:3/402: truncate dd/d27/d2c/d34/f60 1454556 0 2026-03-10T12:37:51.465 INFO:tasks.workunit.client.0.vm00.stdout:5/365: truncate f12 2379031 0 2026-03-10T12:37:51.468 INFO:tasks.workunit.client.0.vm00.stdout:1/388: unlink f5 0 2026-03-10T12:37:51.468 INFO:tasks.workunit.client.1.vm07.stdout:4/534: symlink d0/d4/d10/d8d/db2/lb7 0 2026-03-10T12:37:51.469 INFO:tasks.workunit.client.0.vm00.stdout:9/375: symlink d0/d3d/d43/d80/d1e/l82 0 2026-03-10T12:37:51.470 INFO:tasks.workunit.client.0.vm00.stdout:2/338: fdatasync d4/d53/f61 0 2026-03-10T12:37:51.471 INFO:tasks.workunit.client.0.vm00.stdout:5/366: rename d1f/d26/d2e/c4e to d1f/d26/d2b/d35/d53/d72/c7b 0 2026-03-10T12:37:51.477 INFO:tasks.workunit.client.0.vm00.stdout:3/403: creat dd/d3d/d73/f8f x:0 0 0 2026-03-10T12:37:51.478 INFO:tasks.workunit.client.0.vm00.stdout:8/293: creat d0/f56 x:0 0 0 2026-03-10T12:37:51.478 INFO:tasks.workunit.client.0.vm00.stdout:8/294: write d0/f11 [2929268,19823] 0 2026-03-10T12:37:51.480 
INFO:tasks.workunit.client.0.vm00.stdout:3/404: creat dd/d3d/d65/f90 x:0 0 0 2026-03-10T12:37:51.481 INFO:tasks.workunit.client.0.vm00.stdout:8/295: dread d0/f28 [0,4194304] 0 2026-03-10T12:37:51.491 INFO:tasks.workunit.client.1.vm07.stdout:3/446: dwrite dc/dd/f96 [0,4194304] 0 2026-03-10T12:37:51.495 INFO:tasks.workunit.client.0.vm00.stdout:9/376: creat d0/d3d/f83 x:0 0 0 2026-03-10T12:37:51.500 INFO:tasks.workunit.client.0.vm00.stdout:9/377: mkdir d0/d3d/d59/d4e/d84 0 2026-03-10T12:37:51.501 INFO:tasks.workunit.client.0.vm00.stdout:9/378: readlink d0/d3d/d43/d80/d1e/l6e 0 2026-03-10T12:37:51.501 INFO:tasks.workunit.client.0.vm00.stdout:8/296: symlink d0/dd/l57 0 2026-03-10T12:37:51.504 INFO:tasks.workunit.client.1.vm07.stdout:3/447: dread dc/dd/d28/d7a/f88 [0,4194304] 0 2026-03-10T12:37:51.505 INFO:tasks.workunit.client.0.vm00.stdout:8/297: mkdir d0/d58 0 2026-03-10T12:37:51.506 INFO:tasks.workunit.client.0.vm00.stdout:9/379: rmdir d0/d3d/d78 0 2026-03-10T12:37:51.507 INFO:tasks.workunit.client.0.vm00.stdout:8/298: readlink d0/d12/l3a 0 2026-03-10T12:37:51.507 INFO:tasks.workunit.client.1.vm07.stdout:9/448: mkdir d5/d16/da3 0 2026-03-10T12:37:51.507 INFO:tasks.workunit.client.1.vm07.stdout:5/465: write d0/d22/d18/d19/d21/f37 [4853664,69709] 0 2026-03-10T12:37:51.508 INFO:tasks.workunit.client.0.vm00.stdout:8/299: chown d0/d12/d36/l53 475740181 1 2026-03-10T12:37:51.508 INFO:tasks.workunit.client.0.vm00.stdout:9/380: mkdir d0/d3d/d43/d80/d1e/d85 0 2026-03-10T12:37:51.511 INFO:tasks.workunit.client.0.vm00.stdout:9/381: write d0/d3d/d43/d80/f24 [3820550,5979] 0 2026-03-10T12:37:51.512 INFO:tasks.workunit.client.0.vm00.stdout:9/382: symlink d0/d3d/d43/d80/d1e/l86 0 2026-03-10T12:37:51.513 INFO:tasks.workunit.client.0.vm00.stdout:9/383: creat d0/d3d/d59/d4e/d84/f87 x:0 0 0 2026-03-10T12:37:51.514 INFO:tasks.workunit.client.0.vm00.stdout:9/384: chown d0/d3d/d43/d80/d19/d50 0 1 2026-03-10T12:37:51.514 INFO:tasks.workunit.client.1.vm07.stdout:1/397: fsync d9/df/d54/f57 0 
2026-03-10T12:37:51.514 INFO:tasks.workunit.client.0.vm00.stdout:9/385: stat d0/d3d/d59/l81 0 2026-03-10T12:37:51.521 INFO:tasks.workunit.client.1.vm07.stdout:9/449: dread d5/d13/d2c/f41 [0,4194304] 0 2026-03-10T12:37:51.523 INFO:tasks.workunit.client.0.vm00.stdout:4/383: dwrite df/f20 [0,4194304] 0 2026-03-10T12:37:51.532 INFO:tasks.workunit.client.0.vm00.stdout:9/386: dread d0/f21 [4194304,4194304] 0 2026-03-10T12:37:51.543 INFO:tasks.workunit.client.1.vm07.stdout:1/398: dread d9/df/d29/d2b/d30/f38 [0,4194304] 0 2026-03-10T12:37:51.544 INFO:tasks.workunit.client.0.vm00.stdout:4/384: dwrite df/f12 [0,4194304] 0 2026-03-10T12:37:51.549 INFO:tasks.workunit.client.0.vm00.stdout:9/387: mkdir d0/d7f/d88 0 2026-03-10T12:37:51.557 INFO:tasks.workunit.client.0.vm00.stdout:6/317: mknod d2/da/c76 0 2026-03-10T12:37:51.557 INFO:tasks.workunit.client.0.vm00.stdout:6/318: chown d2/da/dc/l50 165584135 1 2026-03-10T12:37:51.557 INFO:tasks.workunit.client.0.vm00.stdout:6/319: readlink d2/d16/l65 0 2026-03-10T12:37:51.557 INFO:tasks.workunit.client.0.vm00.stdout:7/307: mknod da/d25/d2e/d4c/c75 0 2026-03-10T12:37:51.567 INFO:tasks.workunit.client.0.vm00.stdout:1/389: dwrite da/d21/f74 [0,4194304] 0 2026-03-10T12:37:51.569 INFO:tasks.workunit.client.0.vm00.stdout:1/390: dread da/d12/f62 [0,4194304] 0 2026-03-10T12:37:51.570 INFO:tasks.workunit.client.0.vm00.stdout:0/420: truncate d3/d33/f4d 206479 0 2026-03-10T12:37:51.573 INFO:tasks.workunit.client.0.vm00.stdout:5/367: dwrite d1f/f27 [0,4194304] 0 2026-03-10T12:37:51.587 INFO:tasks.workunit.client.0.vm00.stdout:3/405: rmdir dd/d3d 39 2026-03-10T12:37:51.587 INFO:tasks.workunit.client.0.vm00.stdout:3/406: chown dd/d18/d14/d2b/l5c 1907865 1 2026-03-10T12:37:51.587 INFO:tasks.workunit.client.0.vm00.stdout:0/421: dwrite d3/d7/d4c/d5b/d38/f8b [0,4194304] 0 2026-03-10T12:37:51.587 INFO:tasks.workunit.client.0.vm00.stdout:3/407: dread dd/f25 [0,4194304] 0 2026-03-10T12:37:51.587 INFO:tasks.workunit.client.1.vm07.stdout:5/466: unlink 
d0/d22/d18/d19/d2e/d3f/c7e 0 2026-03-10T12:37:51.587 INFO:tasks.workunit.client.1.vm07.stdout:8/427: creat d1/f88 x:0 0 0 2026-03-10T12:37:51.587 INFO:tasks.workunit.client.1.vm07.stdout:9/450: mkdir d5/d13/d6c/da4 0 2026-03-10T12:37:51.587 INFO:tasks.workunit.client.1.vm07.stdout:7/396: creat d0/f7b x:0 0 0 2026-03-10T12:37:51.587 INFO:tasks.workunit.client.1.vm07.stdout:0/463: mkdir d0/d14/d5f/d76/d93 0 2026-03-10T12:37:51.587 INFO:tasks.workunit.client.1.vm07.stdout:2/333: write d0/f2d [1024702,122755] 0 2026-03-10T12:37:51.589 INFO:tasks.workunit.client.0.vm00.stdout:5/368: mknod d1f/d6a/c7c 0 2026-03-10T12:37:51.590 INFO:tasks.workunit.client.0.vm00.stdout:5/369: write d1f/d26/d2b/d35/d53/f70 [659199,42498] 0 2026-03-10T12:37:51.598 INFO:tasks.workunit.client.0.vm00.stdout:5/370: dwrite f11 [4194304,4194304] 0 2026-03-10T12:37:51.603 INFO:tasks.workunit.client.0.vm00.stdout:5/371: dwrite d1f/d26/d2e/d58/f63 [0,4194304] 0 2026-03-10T12:37:51.607 INFO:tasks.workunit.client.0.vm00.stdout:6/320: link d2/d42/f71 d2/da/f77 0 2026-03-10T12:37:51.608 INFO:tasks.workunit.client.0.vm00.stdout:5/372: truncate d1f/d26/d2e/f3c 3660287 0 2026-03-10T12:37:51.613 INFO:tasks.workunit.client.0.vm00.stdout:5/373: symlink d1f/d26/d6f/l7d 0 2026-03-10T12:37:51.614 INFO:tasks.workunit.client.0.vm00.stdout:5/374: write d1f/d6a/f57 [3288321,66017] 0 2026-03-10T12:37:51.625 INFO:tasks.workunit.client.0.vm00.stdout:4/385: sync 2026-03-10T12:37:51.625 INFO:tasks.workunit.client.0.vm00.stdout:3/408: sync 2026-03-10T12:37:51.625 INFO:tasks.workunit.client.0.vm00.stdout:0/422: sync 2026-03-10T12:37:51.625 INFO:tasks.workunit.client.0.vm00.stdout:3/409: read dd/d18/d13/f6b [1182771,58521] 0 2026-03-10T12:37:51.626 INFO:tasks.workunit.client.0.vm00.stdout:4/386: chown df/d1f/d22/d26/d2e 196 1 2026-03-10T12:37:51.626 INFO:tasks.workunit.client.0.vm00.stdout:0/423: chown d3/d7/d4c/d5b/f57 9608204 1 2026-03-10T12:37:51.634 INFO:tasks.workunit.client.0.vm00.stdout:4/387: mknod 
df/d1f/d22/d26/d65/c80 0 2026-03-10T12:37:51.634 INFO:tasks.workunit.client.0.vm00.stdout:4/388: readlink df/d1f/d22/l3f 0 2026-03-10T12:37:51.634 INFO:tasks.workunit.client.0.vm00.stdout:3/410: unlink dd/d3d/f54 0 2026-03-10T12:37:51.635 INFO:tasks.workunit.client.0.vm00.stdout:4/389: mknod df/d32/d76/c81 0 2026-03-10T12:37:51.636 INFO:tasks.workunit.client.0.vm00.stdout:3/411: write dd/d27/d2c/d34/f60 [1594175,18] 0 2026-03-10T12:37:51.637 INFO:tasks.workunit.client.0.vm00.stdout:3/412: dread - dd/d27/d2c/d34/d45/f75 zero size 2026-03-10T12:37:51.640 INFO:tasks.workunit.client.0.vm00.stdout:4/390: link df/f19 df/d32/d76/f82 0 2026-03-10T12:37:51.640 INFO:tasks.workunit.client.0.vm00.stdout:3/413: link dd/d3d/d65/f90 dd/d27/f91 0 2026-03-10T12:37:51.642 INFO:tasks.workunit.client.0.vm00.stdout:3/414: getdents dd 0 2026-03-10T12:37:51.643 INFO:tasks.workunit.client.0.vm00.stdout:3/415: mkdir dd/d64/d92 0 2026-03-10T12:37:51.644 INFO:tasks.workunit.client.0.vm00.stdout:3/416: fdatasync dd/d64/f7b 0 2026-03-10T12:37:51.645 INFO:tasks.workunit.client.0.vm00.stdout:2/339: truncate d4/f1d 7440508 0 2026-03-10T12:37:51.647 INFO:tasks.workunit.client.0.vm00.stdout:7/308: symlink da/d41/d48/l76 0 2026-03-10T12:37:51.648 INFO:tasks.workunit.client.0.vm00.stdout:2/340: mkdir d4/d53/d76 0 2026-03-10T12:37:51.650 INFO:tasks.workunit.client.0.vm00.stdout:2/341: unlink d4/d6/l7 0 2026-03-10T12:37:51.653 INFO:tasks.workunit.client.0.vm00.stdout:2/342: rmdir d4/d6/d2d/d3a/d43/d51 39 2026-03-10T12:37:51.653 INFO:tasks.workunit.client.0.vm00.stdout:2/343: read d4/d53/f61 [3449545,126130] 0 2026-03-10T12:37:51.654 INFO:tasks.workunit.client.0.vm00.stdout:0/424: write f2 [3573756,48341] 0 2026-03-10T12:37:51.661 INFO:tasks.workunit.client.1.vm07.stdout:5/467: read d0/d22/d18/d19/d2e/f52 [221593,125334] 0 2026-03-10T12:37:51.661 INFO:tasks.workunit.client.1.vm07.stdout:8/428: dread - d1/d3/d6/d50/f5e zero size 2026-03-10T12:37:51.662 INFO:tasks.workunit.client.0.vm00.stdout:2/344: 
mknod d4/d6/d41/c77 0 2026-03-10T12:37:51.662 INFO:tasks.workunit.client.0.vm00.stdout:0/425: fsync d3/d7/d4c/d5b/d38/f89 0 2026-03-10T12:37:51.662 INFO:tasks.workunit.client.0.vm00.stdout:2/345: mkdir d4/d78 0 2026-03-10T12:37:51.662 INFO:tasks.workunit.client.0.vm00.stdout:7/309: symlink da/d3f/d71/l77 0 2026-03-10T12:37:51.662 INFO:tasks.workunit.client.0.vm00.stdout:2/346: creat d4/d6/d2d/d31/f79 x:0 0 0 2026-03-10T12:37:51.662 INFO:tasks.workunit.client.0.vm00.stdout:2/347: chown d4/d6/d2d/d3a/f74 1135 1 2026-03-10T12:37:51.663 INFO:tasks.workunit.client.1.vm07.stdout:7/397: symlink d0/d67/d6f/l7c 0 2026-03-10T12:37:51.663 INFO:tasks.workunit.client.1.vm07.stdout:9/451: write d5/d1f/d7d/f7f [452024,130006] 0 2026-03-10T12:37:51.667 INFO:tasks.workunit.client.0.vm00.stdout:8/300: truncate d0/f28 3948586 0 2026-03-10T12:37:51.668 INFO:tasks.workunit.client.0.vm00.stdout:8/301: chown d0/d12/f27 66980232 1 2026-03-10T12:37:51.668 INFO:tasks.workunit.client.1.vm07.stdout:2/334: mkdir d0/d29/d64/d74 0 2026-03-10T12:37:51.673 INFO:tasks.workunit.client.1.vm07.stdout:6/371: getdents d1/d4/d6/d16/d1a 0 2026-03-10T12:37:51.677 INFO:tasks.workunit.client.1.vm07.stdout:6/372: dread d1/d4/d6/d16/d49/f67 [0,4194304] 0 2026-03-10T12:37:51.683 INFO:tasks.workunit.client.1.vm07.stdout:4/535: creat d0/d4/fb8 x:0 0 0 2026-03-10T12:37:51.685 INFO:tasks.workunit.client.0.vm00.stdout:4/391: sync 2026-03-10T12:37:51.688 INFO:tasks.workunit.client.0.vm00.stdout:2/348: sync 2026-03-10T12:37:51.688 INFO:tasks.workunit.client.1.vm07.stdout:5/468: dread - d0/d22/d18/f86 zero size 2026-03-10T12:37:51.688 INFO:tasks.workunit.client.0.vm00.stdout:6/321: dread d2/da/dc/f45 [0,4194304] 0 2026-03-10T12:37:51.689 INFO:tasks.workunit.client.1.vm07.stdout:8/429: read - d1/d3/d11/f46 zero size 2026-03-10T12:37:51.693 INFO:tasks.workunit.client.1.vm07.stdout:8/430: dread - d1/d3/d6/d50/d70/f7f zero size 2026-03-10T12:37:51.694 INFO:tasks.workunit.client.1.vm07.stdout:8/431: write d1/d3/d18/f75 
[1296466,78803] 0
2026-03-10T12:37:51.694 INFO:tasks.workunit.client.1.vm07.stdout:8/432: readlink d1/d3/d6/l78 0
2026-03-10T12:37:51.694 INFO:tasks.workunit.client.0.vm00.stdout:4/392: mknod df/d63/c83 0
2026-03-10T12:37:51.698 INFO:tasks.workunit.client.0.vm00.stdout:4/393: dwrite df/d1f/d36/d3a/d41/f2f [0,4194304] 0
2026-03-10T12:37:51.698 INFO:tasks.workunit.client.0.vm00.stdout:5/375: dread d1f/d26/d2b/d37/f4c [0,4194304] 0
2026-03-10T12:37:51.699 INFO:tasks.workunit.client.0.vm00.stdout:5/376: fdatasync d1f/d26/d2b/f44 0
2026-03-10T12:37:51.702 INFO:tasks.workunit.client.1.vm07.stdout:3/448: dwrite dc/d18/d24/f2c [4194304,4194304] 0
2026-03-10T12:37:51.711 INFO:tasks.workunit.client.0.vm00.stdout:3/417: dread dd/d27/f35 [4194304,4194304] 0
2026-03-10T12:37:51.712 INFO:tasks.workunit.client.0.vm00.stdout:2/349: creat d4/d6/d41/f7a x:0 0 0
2026-03-10T12:37:51.713 INFO:tasks.workunit.client.1.vm07.stdout:9/452: dread d5/d13/d57/f95 [0,4194304] 0
2026-03-10T12:37:51.717 INFO:tasks.workunit.client.0.vm00.stdout:3/418: mkdir dd/d64/d93 0
2026-03-10T12:37:51.718 INFO:tasks.workunit.client.0.vm00.stdout:2/350: write d4/dd/d38/f3f [2089828,50923] 0
2026-03-10T12:37:51.722 INFO:tasks.workunit.client.0.vm00.stdout:4/394: creat df/d1f/d22/d26/f84 x:0 0 0
2026-03-10T12:37:51.723 INFO:tasks.workunit.client.0.vm00.stdout:5/377: creat d1f/d26/d2b/f7e x:0 0 0
2026-03-10T12:37:51.724 INFO:tasks.workunit.client.0.vm00.stdout:5/378: dread d1f/d26/d2b/d37/f4c [0,4194304] 0
2026-03-10T12:37:51.728 INFO:tasks.workunit.client.1.vm07.stdout:5/469: creat d0/d22/d18/d19/d2e/d67/fa0 x:0 0 0
2026-03-10T12:37:51.733 INFO:tasks.workunit.client.0.vm00.stdout:2/351: creat d4/f7b x:0 0 0
2026-03-10T12:37:51.734 INFO:tasks.workunit.client.0.vm00.stdout:2/352: write d4/d6/d2d/d31/f46 [187759,86351] 0
2026-03-10T12:37:51.734 INFO:tasks.workunit.client.0.vm00.stdout:5/379: mkdir d1f/d26/d2b/d35/d78/d7f 0
2026-03-10T12:37:51.735 INFO:tasks.workunit.client.0.vm00.stdout:3/419: getdents dd/d3d/d65 0
2026-03-10T12:37:51.735 INFO:tasks.workunit.client.0.vm00.stdout:2/353: chown d4/d53 2650365 1
2026-03-10T12:37:51.735 INFO:tasks.workunit.client.0.vm00.stdout:3/420: dread - dd/d27/d2c/f89 zero size
2026-03-10T12:37:51.735 INFO:tasks.workunit.client.0.vm00.stdout:2/354: chown d4/d6/d2d/d3a/f44 0 1
2026-03-10T12:37:51.737 INFO:tasks.workunit.client.0.vm00.stdout:4/395: creat df/f85 x:0 0 0
2026-03-10T12:37:51.738 INFO:tasks.workunit.client.0.vm00.stdout:2/355: creat d4/d6/d2d/d3a/f7c x:0 0 0
2026-03-10T12:37:51.739 INFO:tasks.workunit.client.0.vm00.stdout:5/380: creat d1f/d26/d2e/d58/d6b/f80 x:0 0 0
2026-03-10T12:37:51.741 INFO:tasks.workunit.client.0.vm00.stdout:3/421: symlink dd/d18/l94 0
2026-03-10T12:37:51.743 INFO:tasks.workunit.client.0.vm00.stdout:5/381: unlink d1f/d26/d2e/d58/f63 0
2026-03-10T12:37:51.756 INFO:tasks.workunit.client.1.vm07.stdout:3/449: creat dc/dd/d28/d3b/f9e x:0 0 0
2026-03-10T12:37:51.756 INFO:tasks.workunit.client.1.vm07.stdout:9/453: symlink d5/d69/d93/la5 0
2026-03-10T12:37:51.756 INFO:tasks.workunit.client.1.vm07.stdout:1/399: getdents d9/df/d29/d2b/d30 0
2026-03-10T12:37:51.756 INFO:tasks.workunit.client.0.vm00.stdout:5/382: dread - d1f/d26/d2e/d58/d6b/f80 zero size
2026-03-10T12:37:51.756 INFO:tasks.workunit.client.0.vm00.stdout:3/422: chown dd/d27/d2c/d34/d38/f48 63378273 1
2026-03-10T12:37:51.756 INFO:tasks.workunit.client.0.vm00.stdout:3/423: write dd/d2a/f78 [3815126,96387] 0
2026-03-10T12:37:51.756 INFO:tasks.workunit.client.0.vm00.stdout:4/396: getdents df/d63/d6b/d73 0
2026-03-10T12:37:51.756 INFO:tasks.workunit.client.0.vm00.stdout:4/397: symlink df/d1f/d22/d26/d2e/l86 0
2026-03-10T12:37:51.756 INFO:tasks.workunit.client.0.vm00.stdout:5/383: link f19 d1f/d26/d2b/d37/f81 0
2026-03-10T12:37:51.756 INFO:tasks.workunit.client.0.vm00.stdout:5/384: mknod d1f/d26/d6f/c82 0
2026-03-10T12:37:51.762 INFO:tasks.workunit.client.0.vm00.stdout:4/398: mknod df/d1f/d22/d26/c87 0
2026-03-10T12:37:51.763 INFO:tasks.workunit.client.0.vm00.stdout:4/399: stat df/f85 0
2026-03-10T12:37:51.764 INFO:tasks.workunit.client.1.vm07.stdout:3/450: creat dc/dd/d28/d3b/f9f x:0 0 0
2026-03-10T12:37:51.768 INFO:tasks.workunit.client.0.vm00.stdout:4/400: dread df/d1f/d22/d26/f56 [0,4194304] 0
2026-03-10T12:37:51.770 INFO:tasks.workunit.client.1.vm07.stdout:7/398: link d0/c3e d0/c7d 0
2026-03-10T12:37:51.771 INFO:tasks.workunit.client.0.vm00.stdout:9/388: truncate d0/d3d/d59/d4e/f6f 2974044 0
2026-03-10T12:37:51.771 INFO:tasks.workunit.client.0.vm00.stdout:9/389: chown d0/d3d/d43/f68 221081619 1
2026-03-10T12:37:51.771 INFO:tasks.workunit.client.0.vm00.stdout:1/391: truncate da/d24/d5a/f68 264078 0
2026-03-10T12:37:51.772 INFO:tasks.workunit.client.0.vm00.stdout:9/390: write d0/d3d/d43/f68 [74842,109] 0
2026-03-10T12:37:51.774 INFO:tasks.workunit.client.0.vm00.stdout:3/424: dread dd/d18/d14/d2b/f31 [0,4194304] 0
2026-03-10T12:37:51.774 INFO:tasks.workunit.client.0.vm00.stdout:3/425: chown dd/d3d 2383 1
2026-03-10T12:37:51.776 INFO:tasks.workunit.client.0.vm00.stdout:3/426: write dd/d64/f87 [618190,59467] 0
2026-03-10T12:37:51.778 INFO:tasks.workunit.client.1.vm07.stdout:6/373: getdents d1/d4/d6/d53 0
2026-03-10T12:37:51.778 INFO:tasks.workunit.client.1.vm07.stdout:9/454: creat d5/d13/d6c/da4/fa6 x:0 0 0
2026-03-10T12:37:51.784 INFO:tasks.workunit.client.1.vm07.stdout:9/455: dwrite d5/d16/f35 [0,4194304] 0
2026-03-10T12:37:51.785 INFO:tasks.workunit.client.1.vm07.stdout:6/374: truncate d1/d4/d6/d53/f5e 867366 0
2026-03-10T12:37:51.787 INFO:tasks.workunit.client.0.vm00.stdout:5/385: symlink d1f/d26/d2b/d35/d53/d5b/l83 0
2026-03-10T12:37:51.789 INFO:tasks.workunit.client.0.vm00.stdout:4/401: dread df/d1f/d36/d3a/f6e [0,4194304] 0
2026-03-10T12:37:51.791 INFO:tasks.workunit.client.0.vm00.stdout:4/402: dread f3 [0,4194304] 0
2026-03-10T12:37:51.796 INFO:tasks.workunit.client.0.vm00.stdout:9/391: mknod d0/d3d/d43/d80/d1e/d27/c89 0
2026-03-10T12:37:51.810 INFO:tasks.workunit.client.0.vm00.stdout:4/403: dread f8 [0,4194304] 0
2026-03-10T12:37:51.811 INFO:tasks.workunit.client.1.vm07.stdout:3/451: mkdir dc/dd/d43/d76/d95/da0 0
2026-03-10T12:37:51.811 INFO:tasks.workunit.client.1.vm07.stdout:3/452: chown dc/dd/d28/f67 186 1
2026-03-10T12:37:51.811 INFO:tasks.workunit.client.0.vm00.stdout:1/392: rename da/d24/l63 to da/d21/d27/d6a/l84 0
2026-03-10T12:37:51.813 INFO:tasks.workunit.client.0.vm00.stdout:8/302: write d0/d12/f34 [5321555,11957] 0
2026-03-10T12:37:51.816 INFO:tasks.workunit.client.0.vm00.stdout:1/393: dwrite da/d12/f20 [0,4194304] 0
2026-03-10T12:37:51.819 INFO:tasks.workunit.client.0.vm00.stdout:9/392: creat d0/d3d/d43/d53/d57/f8a x:0 0 0
2026-03-10T12:37:51.824 INFO:tasks.workunit.client.1.vm07.stdout:6/375: symlink d1/d4/d6/d46/d4d/l75 0
2026-03-10T12:37:51.825 INFO:tasks.workunit.client.1.vm07.stdout:9/456: creat d5/d13/d57/fa7 x:0 0 0
2026-03-10T12:37:51.825 INFO:tasks.workunit.client.1.vm07.stdout:6/376: chown d1/d4/d6/d4e/l6d 909829 1
2026-03-10T12:37:51.837 INFO:tasks.workunit.client.1.vm07.stdout:0/464: rename d0/c2 to d0/d14/d5f/d76/d2f/d31/c94 0
2026-03-10T12:37:51.843 INFO:tasks.workunit.client.0.vm00.stdout:1/394: symlink da/d24/d28/d44/d59/l85 0
2026-03-10T12:37:51.844 INFO:tasks.workunit.client.0.vm00.stdout:2/356: write d4/dd/f10 [2183055,108997] 0
2026-03-10T12:37:51.844 INFO:tasks.workunit.client.0.vm00.stdout:1/395: readlink da/d21/d27/l60 0
2026-03-10T12:37:51.845 INFO:tasks.workunit.client.0.vm00.stdout:2/357: readlink d4/dd/l29 0
2026-03-10T12:37:51.848 INFO:tasks.workunit.client.0.vm00.stdout:9/393: dwrite d0/d3d/d43/d80/f49 [0,4194304] 0
2026-03-10T12:37:51.852 INFO:tasks.workunit.client.1.vm07.stdout:1/400: creat d9/df/d29/d2b/f7c x:0 0 0
2026-03-10T12:37:51.852 INFO:tasks.workunit.client.0.vm00.stdout:9/394: dwrite d0/d3d/d43/d80/d1e/d2b/f47 [0,4194304] 0
2026-03-10T12:37:51.856 INFO:tasks.workunit.client.0.vm00.stdout:9/395: write d0/d3d/d59/f4a [2133619,75117] 0
2026-03-10T12:37:51.866 INFO:tasks.workunit.client.0.vm00.stdout:1/396: write da/d12/f66 [716702,57252] 0
2026-03-10T12:37:51.867 INFO:tasks.workunit.client.0.vm00.stdout:2/358: rmdir d4/d6/d2d/d3a/d43 39
2026-03-10T12:37:51.870 INFO:tasks.workunit.client.0.vm00.stdout:1/397: dwrite da/d24/d28/d44/f83 [0,4194304] 0
2026-03-10T12:37:51.874 INFO:tasks.workunit.client.0.vm00.stdout:4/404: rename df/d1f/d36/c3e to df/d1f/d22/c88 0
2026-03-10T12:37:51.874 INFO:tasks.workunit.client.0.vm00.stdout:4/405: chown df/d63/c6d 95568 1
2026-03-10T12:37:51.875 INFO:tasks.workunit.client.0.vm00.stdout:4/406: dread - df/d1f/d22/f4c zero size
2026-03-10T12:37:51.875 INFO:tasks.workunit.client.0.vm00.stdout:4/407: write df/f12 [249896,57185] 0
2026-03-10T12:37:51.878 INFO:tasks.workunit.client.0.vm00.stdout:9/396: creat d0/d3d/d43/d53/d57/f8b x:0 0 0
2026-03-10T12:37:51.887 INFO:tasks.workunit.client.0.vm00.stdout:2/359: creat d4/d53/f7d x:0 0 0
2026-03-10T12:37:51.891 INFO:tasks.workunit.client.0.vm00.stdout:8/303: rename d0/d12/d17/l31 to d0/d12/d36/d3e/l59 0
2026-03-10T12:37:51.897 INFO:tasks.workunit.client.0.vm00.stdout:3/427: truncate dd/d3d/f50 725443 0
2026-03-10T12:37:51.898 INFO:tasks.workunit.client.0.vm00.stdout:3/428: dread - dd/d3d/d84/f8c zero size
2026-03-10T12:37:51.898 INFO:tasks.workunit.client.0.vm00.stdout:3/429: dread dd/d27/d2c/d34/d45/f47 [0,4194304] 0
2026-03-10T12:37:51.898 INFO:tasks.workunit.client.0.vm00.stdout:3/430: dread - dd/d3d/d65/f90 zero size
2026-03-10T12:37:51.898 INFO:tasks.workunit.client.1.vm07.stdout:5/470: rename d0/d22/d18/d19/d36/f79 to d0/d22/d18/d19/d21/fa1 0
2026-03-10T12:37:51.900 INFO:tasks.workunit.client.0.vm00.stdout:4/408: creat df/d63/d6b/f89 x:0 0 0
2026-03-10T12:37:51.901 INFO:tasks.workunit.client.1.vm07.stdout:0/465: symlink d0/d14/d5f/d41/d6a/l95 0
2026-03-10T12:37:51.905 INFO:tasks.workunit.client.0.vm00.stdout:2/360: creat d4/d6/d41/d6d/d40/f7e x:0 0 0
2026-03-10T12:37:51.906 INFO:tasks.workunit.client.0.vm00.stdout:3/431: creat dd/d18/d13/d1d/d43/f95 x:0 0 0
2026-03-10T12:37:51.906 INFO:tasks.workunit.client.0.vm00.stdout:9/397: write d0/f17 [3258192,64102] 0
2026-03-10T12:37:51.908 INFO:tasks.workunit.client.1.vm07.stdout:1/401: creat d9/df/d29/d2b/d31/f7d x:0 0 0
2026-03-10T12:37:51.916 INFO:tasks.workunit.client.0.vm00.stdout:9/398: getdents d0/d5 0
2026-03-10T12:37:51.918 INFO:tasks.workunit.client.0.vm00.stdout:3/432: getdents dd/d18/d14/d2b 0
2026-03-10T12:37:51.920 INFO:tasks.workunit.client.1.vm07.stdout:5/471: creat d0/d22/d18/d19/d21/d3a/fa2 x:0 0 0
2026-03-10T12:37:51.930 INFO:tasks.workunit.client.1.vm07.stdout:5/472: chown d0/d22/d18/d19/d2e/d3f 1034786 1
2026-03-10T12:37:51.930 INFO:tasks.workunit.client.1.vm07.stdout:5/473: dread - d0/d22/d18/d19/d21/d3a/fa2 zero size
2026-03-10T12:37:51.930 INFO:tasks.workunit.client.1.vm07.stdout:3/453: creat dc/d18/fa1 x:0 0 0
2026-03-10T12:37:51.930 INFO:tasks.workunit.client.0.vm00.stdout:3/433: dwrite dd/d4e/d5d/f71 [0,4194304] 0
2026-03-10T12:37:51.930 INFO:tasks.workunit.client.0.vm00.stdout:2/361: link d4/c37 d4/c7f 0
2026-03-10T12:37:51.930 INFO:tasks.workunit.client.0.vm00.stdout:3/434: symlink dd/d3d/d65/l96 0
2026-03-10T12:37:51.930 INFO:tasks.workunit.client.0.vm00.stdout:2/362: creat d4/d6/d41/d6d/d40/f80 x:0 0 0
2026-03-10T12:37:51.930 INFO:tasks.workunit.client.0.vm00.stdout:2/363: chown d4/d53/d76 184577407 1
2026-03-10T12:37:51.930 INFO:tasks.workunit.client.0.vm00.stdout:3/435: symlink dd/d18/d13/d1d/d43/d55/l97 0
2026-03-10T12:37:51.930 INFO:tasks.workunit.client.0.vm00.stdout:8/304: sync
2026-03-10T12:37:51.933 INFO:tasks.workunit.client.0.vm00.stdout:2/364: symlink d4/d6/d2d/l81 0
2026-03-10T12:37:51.934 INFO:tasks.workunit.client.0.vm00.stdout:3/436: write dd/d18/d13/f22 [1814841,47071] 0
2026-03-10T12:37:51.934 INFO:tasks.workunit.client.0.vm00.stdout:3/437: chown dd/f25 15 1
2026-03-10T12:37:51.934 INFO:tasks.workunit.client.0.vm00.stdout:3/438: readlink dd/d3d/l62 0
2026-03-10T12:37:51.937 INFO:tasks.workunit.client.0.vm00.stdout:3/439: dwrite dd/d64/f5e [0,4194304] 0
2026-03-10T12:37:51.940 INFO:tasks.workunit.client.0.vm00.stdout:9/399: getdents d0/d3d/d59/d4e 0
2026-03-10T12:37:51.941 INFO:tasks.workunit.client.0.vm00.stdout:2/365: sync
2026-03-10T12:37:51.941 INFO:tasks.workunit.client.1.vm07.stdout:6/377: rename d1/f1e to d1/d4/d6/d43/d65/f76 0
2026-03-10T12:37:51.943 INFO:tasks.workunit.client.1.vm07.stdout:0/466: creat d0/d14/d5f/d41/d86/f96 x:0 0 0
2026-03-10T12:37:51.948 INFO:tasks.workunit.client.0.vm00.stdout:3/440: dread dd/d18/d14/f3c [0,4194304] 0
2026-03-10T12:37:51.951 INFO:tasks.workunit.client.1.vm07.stdout:1/402: creat d9/df/d79/f7e x:0 0 0
2026-03-10T12:37:51.952 INFO:tasks.workunit.client.0.vm00.stdout:8/305: symlink d0/d12/d2d/d49/l5a 0
2026-03-10T12:37:51.956 INFO:tasks.workunit.client.1.vm07.stdout:2/335: write d0/f15 [675702,55092] 0
2026-03-10T12:37:51.957 INFO:tasks.workunit.client.0.vm00.stdout:6/322: dwrite d2/f5e [0,4194304] 0
2026-03-10T12:37:51.958 INFO:tasks.workunit.client.1.vm07.stdout:2/336: chown d0/d29/l34 46 1
2026-03-10T12:37:51.960 INFO:tasks.workunit.client.0.vm00.stdout:2/366: symlink d4/d6/d41/l82 0
2026-03-10T12:37:51.960 INFO:tasks.workunit.client.1.vm07.stdout:4/536: dwrite d0/d4/d7a/d46/d76/fa0 [0,4194304] 0
2026-03-10T12:37:51.960 INFO:tasks.workunit.client.0.vm00.stdout:2/367: chown d4/d6/d2d/d31/f71 220685 1
2026-03-10T12:37:51.961 INFO:tasks.workunit.client.0.vm00.stdout:2/368: chown d4/d53/d68 227107 1
2026-03-10T12:37:51.967 INFO:tasks.workunit.client.0.vm00.stdout:3/441: rename dd/d64/f5e to dd/d64/f98 0
2026-03-10T12:37:51.967 INFO:tasks.workunit.client.0.vm00.stdout:3/442: chown dd/d18/f7c 1134012397 1
2026-03-10T12:37:51.974 INFO:tasks.workunit.client.0.vm00.stdout:8/306: mkdir d0/d12/d36/d5b 0
2026-03-10T12:37:51.985 INFO:tasks.workunit.client.1.vm07.stdout:5/474: rename d0/d22/d18/d19/d21/d54/f8a to d0/d22/d18/d3e/d53/fa3 0
2026-03-10T12:37:51.985 INFO:tasks.workunit.client.1.vm07.stdout:8/433: dwrite d1/d3/f73 [0,4194304] 0
2026-03-10T12:37:51.985 INFO:tasks.workunit.client.0.vm00.stdout:5/386: write d1f/f46 [2802918,86828] 0
2026-03-10T12:37:51.985 INFO:tasks.workunit.client.0.vm00.stdout:2/369: rmdir d4/d6/d2d/d3a 39
2026-03-10T12:37:51.986 INFO:tasks.workunit.client.0.vm00.stdout:3/443: mkdir dd/d18/d13/d99 0
2026-03-10T12:37:51.986 INFO:tasks.workunit.client.0.vm00.stdout:3/444: stat dd/d18/d13/d1d/d43/d55/c85 0
2026-03-10T12:37:51.986 INFO:tasks.workunit.client.0.vm00.stdout:6/323: read d2/f30 [2804182,65567] 0
2026-03-10T12:37:51.986 INFO:tasks.workunit.client.0.vm00.stdout:7/310: rmdir da/d25/d63 0
2026-03-10T12:37:51.988 INFO:tasks.workunit.client.1.vm07.stdout:0/467: readlink d0/d14/d5f/d76/d2f/d31/l3f 0
2026-03-10T12:37:51.989 INFO:tasks.workunit.client.0.vm00.stdout:2/370: dread d4/d6/d41/d6d/d40/f5e [0,4194304] 0
2026-03-10T12:37:51.989 INFO:tasks.workunit.client.0.vm00.stdout:6/324: stat d2/da/dc/d2f/f56 0
2026-03-10T12:37:51.998 INFO:tasks.workunit.client.0.vm00.stdout:9/400: link d0/d3d/d43/d53/d57/f6c d0/d3d/f8c 0
2026-03-10T12:37:52.002 INFO:tasks.workunit.client.0.vm00.stdout:9/401: dwrite d0/f4 [0,4194304] 0
2026-03-10T12:37:52.004 INFO:tasks.workunit.client.0.vm00.stdout:5/387: creat d1f/d6a/f84 x:0 0 0
2026-03-10T12:37:52.006 INFO:tasks.workunit.client.0.vm00.stdout:3/445: mknod dd/c9a 0
2026-03-10T12:37:52.006 INFO:tasks.workunit.client.0.vm00.stdout:2/371: creat d4/dd/d63/f83 x:0 0 0
2026-03-10T12:37:52.008 INFO:tasks.workunit.client.1.vm07.stdout:7/399: dwrite d0/d47/f51 [0,4194304] 0
2026-03-10T12:37:52.009 INFO:tasks.workunit.client.0.vm00.stdout:5/388: creat d1f/d26/d2b/d35/d53/d72/f85 x:0 0 0
2026-03-10T12:37:52.010 INFO:tasks.workunit.client.1.vm07.stdout:7/400: dread d0/d47/f51 [0,4194304] 0
2026-03-10T12:37:52.019 INFO:tasks.workunit.client.1.vm07.stdout:2/337: mkdir d0/d29/d64/d74/d75 0
2026-03-10T12:37:52.028 INFO:tasks.workunit.client.0.vm00.stdout:7/311: read da/d47/f62 [72446,60336] 0
2026-03-10T12:37:52.032 INFO:tasks.workunit.client.0.vm00.stdout:4/409: read df/d1f/d22/d26/d2e/f50 [57661,43662] 0
2026-03-10T12:37:52.033 INFO:tasks.workunit.client.0.vm00.stdout:6/325: dread d2/da/dc/f13 [0,4194304] 0
2026-03-10T12:37:52.043 INFO:tasks.workunit.client.1.vm07.stdout:4/537: mkdir d0/d4/d10/d9a/db9 0
2026-03-10T12:37:52.046 INFO:tasks.workunit.client.1.vm07.stdout:0/468: rename d0/d14/c49 to d0/d14/d5f/d41/d86/c97 0
2026-03-10T12:37:52.051 INFO:tasks.workunit.client.1.vm07.stdout:2/338: dread - d0/d42/d26/d38/d4f/f5c zero size
2026-03-10T12:37:52.053 INFO:tasks.workunit.client.1.vm07.stdout:7/401: link d0/f10 d0/d57/d62/f7e 0
2026-03-10T12:37:52.053 INFO:tasks.workunit.client.1.vm07.stdout:2/339: fsync d0/f40 0
2026-03-10T12:37:52.056 INFO:tasks.workunit.client.1.vm07.stdout:0/469: mknod d0/d14/d5f/d76/d2f/d31/d6b/c98 0
2026-03-10T12:37:52.056 INFO:tasks.workunit.client.1.vm07.stdout:7/402: mknod d0/d52/c7f 0
2026-03-10T12:37:52.065 INFO:tasks.workunit.client.1.vm07.stdout:2/340: fsync d0/d42/d1f/d20/f2b 0
2026-03-10T12:37:52.069 INFO:tasks.workunit.client.1.vm07.stdout:0/470: mknod d0/c99 0
2026-03-10T12:37:52.069 INFO:tasks.workunit.client.1.vm07.stdout:7/403: fdatasync d0/d47/d48/f4b 0
2026-03-10T12:37:52.069 INFO:tasks.workunit.client.1.vm07.stdout:7/404: chown d0/f3f 231370788 1
2026-03-10T12:37:52.070 INFO:tasks.workunit.client.1.vm07.stdout:4/538: getdents d0/d4/d10/d3c/d2b/d2d 0
2026-03-10T12:37:52.071 INFO:tasks.workunit.client.1.vm07.stdout:0/471: mkdir d0/d14/d5f/d41/d6a/d9a 0
2026-03-10T12:37:52.072 INFO:tasks.workunit.client.1.vm07.stdout:7/405: mkdir d0/d67/d6f/d80 0
2026-03-10T12:37:52.072 INFO:tasks.workunit.client.1.vm07.stdout:4/539: unlink d0/d4/d10/d8d/fab 0
2026-03-10T12:37:52.079 INFO:tasks.workunit.client.1.vm07.stdout:4/540: stat d0/d4/d10/d8d/db2/lb7 0
2026-03-10T12:37:52.088 INFO:tasks.workunit.client.1.vm07.stdout:0/472: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/f9b x:0 0 0
2026-03-10T12:37:52.088 INFO:tasks.workunit.client.1.vm07.stdout:4/541: mknod d0/d4/d10/d3c/cba 0
2026-03-10T12:37:52.092 INFO:tasks.workunit.client.1.vm07.stdout:0/473: dwrite d0/d14/d7c/f90 [0,4194304] 0
2026-03-10T12:37:52.095 INFO:tasks.workunit.client.1.vm07.stdout:0/474: fsync d0/d14/d5f/d76/d2f/d31/d4f/d60/f75 0
2026-03-10T12:37:52.114 INFO:tasks.workunit.client.1.vm07.stdout:0/475: mknod d0/d14/d5f/d76/c9c 0
2026-03-10T12:37:52.180 INFO:tasks.workunit.client.1.vm07.stdout:3/454: sync
2026-03-10T12:37:52.187 INFO:tasks.workunit.client.0.vm00.stdout:5/389: dread d1f/f22 [0,4194304] 0
2026-03-10T12:37:52.191 INFO:tasks.workunit.client.0.vm00.stdout:5/390: chown d1f/d26/d2e/d58/d6b/f80 0 1
2026-03-10T12:37:52.191 INFO:tasks.workunit.client.0.vm00.stdout:8/307: rename d0/d12/d36/d3e to d0/d5c 0
2026-03-10T12:37:52.191 INFO:tasks.workunit.client.1.vm07.stdout:5/475: sync
2026-03-10T12:37:52.195 INFO:tasks.workunit.client.1.vm07.stdout:4/542: sync
2026-03-10T12:37:52.197 INFO:tasks.workunit.client.0.vm00.stdout:5/391: mkdir d1f/d26/d2e/d58/d6b/d86 0
2026-03-10T12:37:52.199 INFO:tasks.workunit.client.1.vm07.stdout:3/455: dwrite dc/d18/d2d/d3d/f5a [0,4194304] 0
2026-03-10T12:37:52.199 INFO:tasks.workunit.client.0.vm00.stdout:9/402: rename d0/d3d/d43/d80/d19/f32 to d0/d3d/d43/d80/f8d 0
2026-03-10T12:37:52.202 INFO:tasks.workunit.client.0.vm00.stdout:5/392: creat d1f/d26/d2e/d58/d6b/f87 x:0 0 0
2026-03-10T12:37:52.203 INFO:tasks.workunit.client.0.vm00.stdout:9/403: mknod d0/d3d/d43/d80/d19/c8e 0
2026-03-10T12:37:52.203 INFO:tasks.workunit.client.0.vm00.stdout:8/308: symlink d0/dd/l5d 0
2026-03-10T12:37:52.205 INFO:tasks.workunit.client.0.vm00.stdout:5/393: fsync d1f/d39/f65 0
2026-03-10T12:37:52.207 INFO:tasks.workunit.client.1.vm07.stdout:4/543: link d0/d4/d5/d34/l64 d0/d4/d5/lbb 0
2026-03-10T12:37:52.212 INFO:tasks.workunit.client.1.vm07.stdout:5/476: dwrite d0/d22/d18/d19/d2e/d67/fa0 [0,4194304] 0
2026-03-10T12:37:52.212 INFO:tasks.workunit.client.0.vm00.stdout:5/394: creat d1f/d26/d2b/d35/d53/d5b/d73/f88 x:0 0 0
2026-03-10T12:37:52.212 INFO:tasks.workunit.client.0.vm00.stdout:9/404: creat d0/d3d/f8f x:0 0 0
2026-03-10T12:37:52.212 INFO:tasks.workunit.client.0.vm00.stdout:8/309: dwrite d0/dd/d38/f3d [0,4194304] 0
2026-03-10T12:37:52.213 INFO:tasks.workunit.client.0.vm00.stdout:5/395: write d1f/d26/d2b/f44 [690420,27459] 0
2026-03-10T12:37:52.226 INFO:tasks.workunit.client.0.vm00.stdout:9/405: symlink d0/d7f/l90 0
2026-03-10T12:37:52.228 INFO:tasks.workunit.client.0.vm00.stdout:9/406: dread d0/d3d/d43/d80/f49 [0,4194304] 0
2026-03-10T12:37:52.229 INFO:tasks.workunit.client.0.vm00.stdout:8/310: symlink d0/d12/d36/d51/l5e 0
2026-03-10T12:37:52.229 INFO:tasks.workunit.client.0.vm00.stdout:8/311: chown d0/d12/d2d 1 1
2026-03-10T12:37:52.230 INFO:tasks.workunit.client.0.vm00.stdout:8/312: chown d0/f8 1048 1
2026-03-10T12:37:52.231 INFO:tasks.workunit.client.0.vm00.stdout:8/313: read d0/d12/d2d/f33 [1067898,126853] 0
2026-03-10T12:37:52.235 INFO:tasks.workunit.client.0.vm00.stdout:8/314: rename d0/d12/d43/f45 to d0/d5c/f5f 0
2026-03-10T12:37:52.239 INFO:tasks.workunit.client.0.vm00.stdout:9/407: getdents d0/d3d/d43/d80/d1e/d27 0
2026-03-10T12:37:52.239 INFO:tasks.workunit.client.0.vm00.stdout:5/396: dread d1f/d26/d2b/d37/f81 [0,4194304] 0
2026-03-10T12:37:52.239 INFO:tasks.workunit.client.0.vm00.stdout:9/408: truncate d0/d3d/d43/d80/d19/f7d 940216 0
2026-03-10T12:37:52.241 INFO:tasks.workunit.client.0.vm00.stdout:5/397: dread d1f/d26/d2b/d35/f41 [0,4194304] 0
2026-03-10T12:37:52.243 INFO:tasks.workunit.client.0.vm00.stdout:9/409: dwrite d0/d3d/d43/d80/d1e/d2b/f47 [0,4194304] 0
2026-03-10T12:37:52.248 INFO:tasks.workunit.client.0.vm00.stdout:0/426: creat d3/db/f97 x:0 0 0
2026-03-10T12:37:52.251 INFO:tasks.workunit.client.0.vm00.stdout:0/427: dwrite d3/d7/d4c/d5b/d38/f89 [4194304,4194304] 0
2026-03-10T12:37:52.261 INFO:tasks.workunit.client.1.vm07.stdout:9/457: dwrite d5/d16/d18/f20 [0,4194304] 0
2026-03-10T12:37:52.261 INFO:tasks.workunit.client.1.vm07.stdout:8/434: symlink d1/l89 0
2026-03-10T12:37:52.261 INFO:tasks.workunit.client.0.vm00.stdout:5/398: symlink d1f/d39/l89 0
2026-03-10T12:37:52.261 INFO:tasks.workunit.client.0.vm00.stdout:9/410: symlink d0/d3d/d59/d4e/d84/l91 0
2026-03-10T12:37:52.261 INFO:tasks.workunit.client.0.vm00.stdout:9/411: truncate d0/d3d/d59/f45 4644496 0
2026-03-10T12:37:52.261 INFO:tasks.workunit.client.0.vm00.stdout:9/412: mknod d0/d3d/d59/d4e/c92 0
2026-03-10T12:37:52.261 INFO:tasks.workunit.client.0.vm00.stdout:9/413: fsync d0/d3d/d59/f45 0
2026-03-10T12:37:52.262 INFO:tasks.workunit.client.0.vm00.stdout:9/414: truncate d0/d3d/f83 645742 0
2026-03-10T12:37:52.262 INFO:tasks.workunit.client.0.vm00.stdout:9/415: fsync d0/d3d/d59/f4a 0
2026-03-10T12:37:52.264 INFO:tasks.workunit.client.0.vm00.stdout:5/399: rename d1f/f4a to d1f/d26/d2b/d37/f8a 0
2026-03-10T12:37:52.264 INFO:tasks.workunit.client.0.vm00.stdout:5/400: write d1f/f27 [2861467,76247] 0
2026-03-10T12:37:52.267 INFO:tasks.workunit.client.0.vm00.stdout:0/428: mknod d3/db/d24/d25/c98 0
2026-03-10T12:37:52.268 INFO:tasks.workunit.client.0.vm00.stdout:0/429: truncate d3/d40/d65/f92 750039 0
2026-03-10T12:37:52.268 INFO:tasks.workunit.client.1.vm07.stdout:9/458: rmdir d5/d16/d23/d26 39
2026-03-10T12:37:52.269 INFO:tasks.workunit.client.0.vm00.stdout:9/416: rmdir d0/d3d/d43/d80/d1e/d2b 39
2026-03-10T12:37:52.270 INFO:tasks.workunit.client.0.vm00.stdout:9/417: readlink d0/d3d/d43/d80/d1e/l6e 0
2026-03-10T12:37:52.270 INFO:tasks.workunit.client.1.vm07.stdout:6/378: dwrite d1/d4/d6/d53/d66/f68 [0,4194304] 0
2026-03-10T12:37:52.276 INFO:tasks.workunit.client.1.vm07.stdout:9/459: creat d5/d13/d57/d3e/fa8 x:0 0 0
2026-03-10T12:37:52.276 INFO:tasks.workunit.client.1.vm07.stdout:6/379: write d1/f38 [3610543,71569] 0
2026-03-10T12:37:52.276 INFO:tasks.workunit.client.0.vm00.stdout:9/418: dwrite d0/d3d/d43/d80/d19/f1b [0,4194304] 0
2026-03-10T12:37:52.276 INFO:tasks.workunit.client.0.vm00.stdout:0/430: stat d3/d40/d65/f8f 0
2026-03-10T12:37:52.276 INFO:tasks.workunit.client.0.vm00.stdout:9/419: write d0/d3d/d43/f68 [1108711,11935] 0
2026-03-10T12:37:52.279 INFO:tasks.workunit.client.0.vm00.stdout:9/420: dwrite d0/f5d [0,4194304] 0
2026-03-10T12:37:52.280 INFO:tasks.workunit.client.1.vm07.stdout:9/460: write d5/d13/f14 [1965419,7325] 0
2026-03-10T12:37:52.284 INFO:tasks.workunit.client.1.vm07.stdout:1/403: dwrite d9/df/f21 [0,4194304] 0
2026-03-10T12:37:52.285 INFO:tasks.workunit.client.0.vm00.stdout:9/421: mknod d0/d3d/d43/d53/c93 0
2026-03-10T12:37:52.288 INFO:tasks.workunit.client.0.vm00.stdout:9/422: creat d0/d3d/d59/f94 x:0 0 0
2026-03-10T12:37:52.289 INFO:tasks.workunit.client.0.vm00.stdout:9/423: write d0/f5d [3233135,73249] 0
2026-03-10T12:37:52.290 INFO:tasks.workunit.client.0.vm00.stdout:9/424: chown d0/d3d/d43/d80/d1e/d27/c4b 488 1
2026-03-10T12:37:52.298 INFO:tasks.workunit.client.1.vm07.stdout:8/435: read d1/d3/ff [2885096,99] 0
2026-03-10T12:37:52.303 INFO:tasks.workunit.client.0.vm00.stdout:0/431: creat d3/d7/d3c/f99 x:0 0 0
2026-03-10T12:37:52.303 INFO:tasks.workunit.client.1.vm07.stdout:6/380: mkdir d1/d4/d71/d77 0
2026-03-10T12:37:52.305 INFO:tasks.workunit.client.0.vm00.stdout:8/315: dread d0/f8 [0,4194304] 0
2026-03-10T12:37:52.309 INFO:tasks.workunit.client.1.vm07.stdout:1/404: rename d9/df/f21 to d9/df/d29/d2c/f7f 0
2026-03-10T12:37:52.309 INFO:tasks.workunit.client.0.vm00.stdout:8/316: mkdir d0/d12/d60 0
2026-03-10T12:37:52.313 INFO:tasks.workunit.client.0.vm00.stdout:8/317: creat d0/d12/d36/d51/f61 x:0 0 0
2026-03-10T12:37:52.315 INFO:tasks.workunit.client.1.vm07.stdout:9/461: dwrite d5/d16/f19 [0,4194304] 0
2026-03-10T12:37:52.317 INFO:tasks.workunit.client.1.vm07.stdout:9/462: write d5/d13/d57/d3e/fa8 [891862,110095] 0
2026-03-10T12:37:52.318 INFO:tasks.workunit.client.1.vm07.stdout:9/463: dread - d5/d13/d22/f9e zero size
2026-03-10T12:37:52.318 INFO:tasks.workunit.client.1.vm07.stdout:9/464: chown d5/d1f/d5e 56 1
2026-03-10T12:37:52.319 INFO:tasks.workunit.client.0.vm00.stdout:8/318: symlink d0/d12/d36/l62 0
2026-03-10T12:37:52.324 INFO:tasks.workunit.client.1.vm07.stdout:1/405: truncate d9/fe 1409238 0
2026-03-10T12:37:52.324 INFO:tasks.workunit.client.1.vm07.stdout:6/381: creat d1/d4/d6/d16/d1a/d2c/f78 x:0 0 0
2026-03-10T12:37:52.336 INFO:tasks.workunit.client.1.vm07.stdout:9/465: link d5/f45 d5/d13/d57/d3e/fa9 0
2026-03-10T12:37:52.337 INFO:tasks.workunit.client.1.vm07.stdout:1/406: mkdir d9/d2d/d80 0
2026-03-10T12:37:52.340 INFO:tasks.workunit.client.1.vm07.stdout:9/466: readlink d5/d13/l9a 0
2026-03-10T12:37:52.341 INFO:tasks.workunit.client.1.vm07.stdout:6/382: fdatasync d1/d4/d6/f2a 0
2026-03-10T12:37:52.342 INFO:tasks.workunit.client.0.vm00.stdout:3/446: chown dd/d18/f83 50 1
2026-03-10T12:37:52.344 INFO:tasks.workunit.client.1.vm07.stdout:9/467: symlink d5/d1f/d75/laa 0
2026-03-10T12:37:52.344 INFO:tasks.workunit.client.0.vm00.stdout:3/447: mknod dd/d18/d13/d1d/d43/c9b 0
2026-03-10T12:37:52.348 INFO:tasks.workunit.client.0.vm00.stdout:3/448: dwrite dd/d18/f7c [4194304,4194304] 0
2026-03-10T12:37:52.352 INFO:tasks.workunit.client.1.vm07.stdout:6/383: unlink d1/d4/d6/d53/d66/f68 0
2026-03-10T12:37:52.352 INFO:tasks.workunit.client.0.vm00.stdout:2/372: truncate f1 2775935 0
2026-03-10T12:37:52.352 INFO:tasks.workunit.client.0.vm00.stdout:4/410: truncate df/d57/f7c 1750871 0
2026-03-10T12:37:52.352 INFO:tasks.workunit.client.0.vm00.stdout:3/449: unlink dd/d18/d14/f3c 0
2026-03-10T12:37:52.356 INFO:tasks.workunit.client.0.vm00.stdout:1/398: dwrite da/d24/d5a/f68 [0,4194304] 0
2026-03-10T12:37:52.357 INFO:tasks.workunit.client.0.vm00.stdout:1/399: write da/d12/d26/f57 [3331571,116605] 0
2026-03-10T12:37:52.361 INFO:tasks.workunit.client.0.vm00.stdout:2/373: dwrite d4/f6e [0,4194304] 0
2026-03-10T12:37:52.376 INFO:tasks.workunit.client.0.vm00.stdout:2/374: dwrite d4/d6/d2d/d31/f79 [0,4194304] 0
2026-03-10T12:37:52.376 INFO:tasks.workunit.client.1.vm07.stdout:9/468: rename d5/d1f/l2e to d5/d13/d2c/lab 0
2026-03-10T12:37:52.376 INFO:tasks.workunit.client.1.vm07.stdout:7/406: rmdir d0/d57/d62 39
2026-03-10T12:37:52.377 INFO:tasks.workunit.client.0.vm00.stdout:0/432: rmdir d3/d7/d4c/d5b/d38/d44 39
2026-03-10T12:37:52.377 INFO:tasks.workunit.client.0.vm00.stdout:7/312: write da/d25/f29 [772008,84166] 0
2026-03-10T12:37:52.378 INFO:tasks.workunit.client.0.vm00.stdout:6/326: write d2/f30 [5198417,23845] 0
2026-03-10T12:37:52.378 INFO:tasks.workunit.client.0.vm00.stdout:7/313: dread - da/d1b/d40/f5c zero size
2026-03-10T12:37:52.388 INFO:tasks.workunit.client.0.vm00.stdout:2/375: dwrite d4/d6/f2b [4194304,4194304] 0
2026-03-10T12:37:52.393 INFO:tasks.workunit.client.0.vm00.stdout:7/314: dwrite da/d25/d2c/f30 [0,4194304] 0
2026-03-10T12:37:52.394 INFO:tasks.workunit.client.0.vm00.stdout:2/376: chown d4/d6/f34 7340454 1
2026-03-10T12:37:52.394 INFO:tasks.workunit.client.0.vm00.stdout:6/327: creat d2/d16/f78 x:0 0 0
2026-03-10T12:37:52.400 INFO:tasks.workunit.client.0.vm00.stdout:7/315: dwrite da/fe [0,4194304] 0
2026-03-10T12:37:52.406 INFO:tasks.workunit.client.1.vm07.stdout:6/384: truncate d1/d4/f62 525343 0
2026-03-10T12:37:52.408 INFO:tasks.workunit.client.0.vm00.stdout:5/401: read d1f/f2c [1552557,127265] 0
2026-03-10T12:37:52.409 INFO:tasks.workunit.client.0.vm00.stdout:6/328: symlink d2/d42/l79 0
2026-03-10T12:37:52.410 INFO:tasks.workunit.client.0.vm00.stdout:5/402: truncate d1f/d26/d2e/d58/d6b/f87 912820 0
2026-03-10T12:37:52.410 INFO:tasks.workunit.client.0.vm00.stdout:5/403: chown d1f/d39/l89 667 1
2026-03-10T12:37:52.417 INFO:tasks.workunit.client.1.vm07.stdout:6/385: read d1/d4/d6/d4e/d64/f6f [1391437,77066] 0
2026-03-10T12:37:52.417 INFO:tasks.workunit.client.0.vm00.stdout:2/377: rmdir d4/d6/d2d/d31 39
2026-03-10T12:37:52.423 INFO:tasks.workunit.client.0.vm00.stdout:4/411: mkdir df/d8a 0
2026-03-10T12:37:52.423 INFO:tasks.workunit.client.0.vm00.stdout:4/412: stat df/d1f/d22/d26/d2e 0
2026-03-10T12:37:52.424 INFO:tasks.workunit.client.1.vm07.stdout:6/386: creat d1/d4/d71/f79 x:0 0 0
2026-03-10T12:37:52.429 INFO:tasks.workunit.client.0.vm00.stdout:7/316: mknod da/d26/d50/d73/c78 0
2026-03-10T12:37:52.432 INFO:tasks.workunit.client.0.vm00.stdout:3/450: creat dd/d64/d92/f9c x:0 0 0
2026-03-10T12:37:52.432 INFO:tasks.workunit.client.0.vm00.stdout:3/451: chown dd/d18/d13/f6b 40725582 1
2026-03-10T12:37:52.435 INFO:tasks.workunit.client.0.vm00.stdout:4/413: chown df/d1f/d36/d3a/d41/f2f 899 1
2026-03-10T12:37:52.443 INFO:tasks.workunit.client.1.vm07.stdout:6/387: chown d1/d4/d6/cd 1842 1
2026-03-10T12:37:52.443 INFO:tasks.workunit.client.0.vm00.stdout:4/414: chown df/f19 2 1
2026-03-10T12:37:52.443 INFO:tasks.workunit.client.0.vm00.stdout:4/415: creat df/d63/d6b/d73/f8b x:0 0 0
2026-03-10T12:37:52.443 INFO:tasks.workunit.client.0.vm00.stdout:4/416: stat df/d32/d64 0
2026-03-10T12:37:52.443 INFO:tasks.workunit.client.0.vm00.stdout:4/417: symlink df/d1f/d22/d26/l8c 0
2026-03-10T12:37:52.445 INFO:tasks.workunit.client.0.vm00.stdout:4/418: creat df/d63/d77/f8d x:0 0 0
2026-03-10T12:37:52.445 INFO:tasks.workunit.client.0.vm00.stdout:4/419: stat df/d32/d76/f7e 0
2026-03-10T12:37:52.447 INFO:tasks.workunit.client.1.vm07.stdout:6/388: creat d1/d4/d6/d16/d49/f7a x:0 0 0
2026-03-10T12:37:52.495 INFO:tasks.workunit.client.1.vm07.stdout:6/389: read d1/d4/d6/d43/d65/f76 [1410177,10129] 0
2026-03-10T12:37:52.495 INFO:tasks.workunit.client.1.vm07.stdout:2/341: write d0/f4 [191096,16707] 0
2026-03-10T12:37:52.501 INFO:tasks.workunit.client.0.vm00.stdout:2/378: dread d4/dd/f3e [0,4194304] 0
2026-03-10T12:37:52.501 INFO:tasks.workunit.client.1.vm07.stdout:2/342: creat d0/d5b/f76 x:0 0 0
2026-03-10T12:37:52.502 INFO:tasks.workunit.client.0.vm00.stdout:2/379: symlink d4/d6/d41/d6d/l84 0
2026-03-10T12:37:52.502 INFO:tasks.workunit.client.1.vm07.stdout:2/343: write d0/d42/d1f/f2f [2320590,58489] 0
2026-03-10T12:37:52.503 INFO:tasks.workunit.client.0.vm00.stdout:2/380: write d4/d6/d2d/d3a/f44 [139651,103389] 0
2026-03-10T12:37:52.503 INFO:tasks.workunit.client.1.vm07.stdout:0/476: dwrite d0/d14/d5f/d76/f30 [0,4194304] 0
2026-03-10T12:37:52.504 INFO:tasks.workunit.client.1.vm07.stdout:3/456: dwrite dc/d18/d24/f37 [0,4194304] 0
2026-03-10T12:37:52.505 INFO:tasks.workunit.client.0.vm00.stdout:2/381: mkdir d4/d6/d2d/d3a/d43/d85 0
2026-03-10T12:37:52.505 INFO:tasks.workunit.client.0.vm00.stdout:2/382: dread - d4/d6/d41/d6d/d40/f80 zero size
2026-03-10T12:37:52.507 INFO:tasks.workunit.client.0.vm00.stdout:2/383: write d4/d6/d2d/d3a/d43/d51/f6f [454987,98292] 0
2026-03-10T12:37:52.517 INFO:tasks.workunit.client.1.vm07.stdout:0/477: dwrite d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/f9b [0,4194304] 0
2026-03-10T12:37:52.517 INFO:tasks.workunit.client.0.vm00.stdout:2/384: symlink d4/d78/l86 0
2026-03-10T12:37:52.517 INFO:tasks.workunit.client.0.vm00.stdout:2/385: symlink d4/d53/d76/l87 0
2026-03-10T12:37:52.517 INFO:tasks.workunit.client.0.vm00.stdout:2/386: getdents d4/d6/d2d/d3a 0
2026-03-10T12:37:52.517 INFO:tasks.workunit.client.0.vm00.stdout:2/387: dwrite d4/d6/d2d/d31/f79 [0,4194304] 0
2026-03-10T12:37:52.523 INFO:tasks.workunit.client.0.vm00.stdout:3/452: dread dd/d3d/f53 [0,4194304] 0
2026-03-10T12:37:52.523 INFO:tasks.workunit.client.0.vm00.stdout:2/388: symlink d4/dd/d38/l88 0
2026-03-10T12:37:52.525 INFO:tasks.workunit.client.0.vm00.stdout:2/389: creat d4/d6/f89 x:0 0 0
2026-03-10T12:37:52.527 INFO:tasks.workunit.client.0.vm00.stdout:3/453: dwrite dd/d27/d2c/d34/f60 [0,4194304] 0
2026-03-10T12:37:52.535 INFO:tasks.workunit.client.0.vm00.stdout:2/390: creat d4/d53/d68/f8a x:0 0 0
2026-03-10T12:37:52.535 INFO:tasks.workunit.client.0.vm00.stdout:2/391: stat d4/d6/d2d/d31/f71 0
2026-03-10T12:37:52.536 INFO:tasks.workunit.client.0.vm00.stdout:3/454: write dd/f15 [2009372,93555] 0
2026-03-10T12:37:52.536 INFO:tasks.workunit.client.0.vm00.stdout:4/420: sync
2026-03-10T12:37:52.537 INFO:tasks.workunit.client.0.vm00.stdout:3/455: dwrite dd/d3d/d8a/f8b [0,4194304] 0
2026-03-10T12:37:52.539 INFO:tasks.workunit.client.0.vm00.stdout:3/456: read - dd/d64/d92/f9c zero size
2026-03-10T12:37:52.542 INFO:tasks.workunit.client.0.vm00.stdout:4/421: truncate df/f1b 3288578 0
2026-03-10T12:37:52.543 INFO:tasks.workunit.client.0.vm00.stdout:1/400: dread da/d21/d27/f54 [0,4194304] 0
2026-03-10T12:37:52.543 INFO:tasks.workunit.client.1.vm07.stdout:2/344: rename d0/d42/d4e/d56 to d0/d42/d4e/d77 0
2026-03-10T12:37:52.543 INFO:tasks.workunit.client.0.vm00.stdout:1/401: fsync da/d24/f76 0
2026-03-10T12:37:52.545 INFO:tasks.workunit.client.0.vm00.stdout:4/422: fdatasync df/d1f/d36/d3a/d41/f33 0
2026-03-10T12:37:52.549 INFO:tasks.workunit.client.1.vm07.stdout:2/345: creat d0/d29/d64/f78 x:0 0 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.1.vm07.stdout:0/478: mkdir d0/d14/d5f/d76/d2f/d31/d4f/d9d 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.1.vm07.stdout:2/346: symlink d0/l79 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.1.vm07.stdout:0/479: mkdir d0/d14/d5f/d76/d2f/d31/d79/d9e 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.1.vm07.stdout:4/544: dwrite d0/d4/d5/f43 [0,4194304] 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.1.vm07.stdout:4/545: chown d0/d4/d10/d3c/c5b 1332154 1
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.1.vm07.stdout:4/546: symlink d0/d4/d10/d8d/db2/lbc 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.1.vm07.stdout:0/480: fdatasync d0/d14/d5f/d76/d2f/d31/d4f/f5c 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.1.vm07.stdout:4/547: stat d0/d4/c9 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.0.vm00.stdout:4/423: creat df/d1f/d22/d26/d65/f8e x:0 0 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.0.vm00.stdout:1/402: dwrite da/d12/f20 [0,4194304] 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.0.vm00.stdout:4/424: creat df/d63/d77/f8f x:0 0 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.0.vm00.stdout:1/403: creat da/d4d/d78/f86 x:0 0 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.0.vm00.stdout:1/404: readlink da/d24/l2c 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.0.vm00.stdout:1/405: chown f3 31526 1
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.0.vm00.stdout:4/425: mkdir df/d6c/d90 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.0.vm00.stdout:1/406: rename da/d24/d28/d44/d5d/d80/c82 to da/d4d/d78/c87 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.0.vm00.stdout:1/407: dwrite da/d21/d27/f6e [0,4194304] 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.0.vm00.stdout:4/426: rename df/d1f/d22/d26/d2e to df/d1f/d22/d26/d65/d91 0
2026-03-10T12:37:52.583 INFO:tasks.workunit.client.0.vm00.stdout:8/319: dread d0/d12/d36/f39 [0,4194304] 0
2026-03-10T12:37:52.586 INFO:tasks.workunit.client.1.vm07.stdout:0/481: rmdir d0/d14/d5f/d41/d86 39
2026-03-10T12:37:52.607 INFO:tasks.workunit.client.0.vm00.stdout:8/320: dread d0/d12/d2d/f33 [0,4194304] 0
2026-03-10T12:37:52.607 INFO:tasks.workunit.client.1.vm07.stdout:0/482: symlink d0/d14/d5f/d41/d6a/d9a/l9f 0
2026-03-10T12:37:52.607 INFO:tasks.workunit.client.1.vm07.stdout:4/548: getdents d0/d5c/d7c 0
2026-03-10T12:37:52.607 INFO:tasks.workunit.client.1.vm07.stdout:4/549: fdatasync d0/d4/d10/d5f/fb6 0
2026-03-10T12:37:52.607 INFO:tasks.workunit.client.1.vm07.stdout:4/550: dread - d0/d8e/fb5 zero size
2026-03-10T12:37:52.607 INFO:tasks.workunit.client.1.vm07.stdout:0/483: fdatasync d0/d14/f37 0
2026-03-10T12:37:52.607 INFO:tasks.workunit.client.1.vm07.stdout:4/551: symlink d0/d4/d5/d78/lbd 0
2026-03-10T12:37:52.607 INFO:tasks.workunit.client.1.vm07.stdout:0/484: dwrite d0/f1d [0,4194304] 0
2026-03-10T12:37:52.611 INFO:tasks.workunit.client.1.vm07.stdout:4/552: rename d0/d4/d5/lbb to d0/d4/d5/da/d66/lbe 0
2026-03-10T12:37:52.621 INFO:tasks.workunit.client.1.vm07.stdout:0/485: mkdir d0/d14/d5f/d76/da0 0
2026-03-10T12:37:52.626 INFO:tasks.workunit.client.1.vm07.stdout:4/553: link d0/d4/d5/da/d66/fa8 d0/d4/d10/d3c/d2b/d54/fbf 0
2026-03-10T12:37:52.642 INFO:tasks.workunit.client.1.vm07.stdout:4/554: creat d0/d5c/d7c/fc0 x:0 0 0
2026-03-10T12:37:52.642 INFO:tasks.workunit.client.1.vm07.stdout:4/555: readlink d0/d4/l88 0
2026-03-10T12:37:52.642 INFO:tasks.workunit.client.1.vm07.stdout:4/556: write d0/d4/d7a/d46/d76/fae [173216,34111] 0
2026-03-10T12:37:52.642 INFO:tasks.workunit.client.1.vm07.stdout:4/557: symlink d0/d4/d10/d3c/lc1 0
2026-03-10T12:37:52.642 INFO:tasks.workunit.client.1.vm07.stdout:4/558: dwrite d0/d4/fb8 [0,4194304] 0
2026-03-10T12:37:52.642 INFO:tasks.workunit.client.1.vm07.stdout:4/559: symlink d0/d4/d5/da/d66/lc2 0
2026-03-10T12:37:52.642 INFO:tasks.workunit.client.1.vm07.stdout:4/560: write d0/d4/d7a/d46/d76/fa0 [1320929,97418] 0
2026-03-10T12:37:52.650 INFO:tasks.workunit.client.1.vm07.stdout:4/561: dwrite d0/d4/d5/d34/f94 [0,4194304] 0
2026-03-10T12:37:52.660 INFO:tasks.workunit.client.1.vm07.stdout:4/562: dread d0/d4/d7a/d46/d76/fa0 [0,4194304] 0
2026-03-10T12:37:52.680 INFO:tasks.workunit.client.0.vm00.stdout:8/321: rename d0/d5c/f5f to d0/d12/d17/f63 0
2026-03-10T12:37:52.688 INFO:tasks.workunit.client.1.vm07.stdout:3/457: sync
2026-03-10T12:37:52.689 INFO:tasks.workunit.client.1.vm07.stdout:3/458: chown dc/d18/f36 256972483 1
2026-03-10T12:37:52.693 INFO:tasks.workunit.client.1.vm07.stdout:3/459: creat dc/dd/d43/d76/d95/da0/fa2 x:0 0 0
2026-03-10T12:37:52.703 INFO:tasks.workunit.client.1.vm07.stdout:3/460: mkdir dc/d18/d99/da3 0
2026-03-10T12:37:52.704 INFO:tasks.workunit.client.1.vm07.stdout:3/461: fsync dc/dd/f96 0
2026-03-10T12:37:52.706 INFO:tasks.workunit.client.1.vm07.stdout:3/462: stat dc/dd/d28/d3b 0
2026-03-10T12:37:52.711 INFO:tasks.workunit.client.1.vm07.stdout:3/463: unlink dc/dd/f1d 0
2026-03-10T12:37:52.714 INFO:tasks.workunit.client.1.vm07.stdout:3/464: unlink dc/dd/d1f/d6f/c84 0
2026-03-10T12:37:52.721 INFO:tasks.workunit.client.1.vm07.stdout:0/486: sync
2026-03-10T12:37:52.726 INFO:tasks.workunit.client.1.vm07.stdout:0/487: truncate d0/d14/d5f/d76/f78 2584911 0
2026-03-10T12:37:52.766 INFO:tasks.workunit.client.1.vm07.stdout:4/563: dread d0/d4/d5/da/d66/f8c [4194304,4194304] 0 2026-03-10T12:37:52.767 INFO:tasks.workunit.client.1.vm07.stdout:4/564: getdents d0/d4/d10/d9a/db9 0 2026-03-10T12:37:52.780 INFO:tasks.workunit.client.1.vm07.stdout:4/565: unlink d0/d4/d10/d9a/f3e 0 2026-03-10T12:37:52.781 INFO:tasks.workunit.client.0.vm00.stdout:9/425: dwrite d0/d5/f3b [0,4194304] 0 2026-03-10T12:37:52.782 INFO:tasks.workunit.client.0.vm00.stdout:9/426: dread - d0/d3d/d59/d4e/f7c zero size 2026-03-10T12:37:52.782 INFO:tasks.workunit.client.0.vm00.stdout:9/427: fsync d0/d3d/f8f 0 2026-03-10T12:37:52.785 INFO:tasks.workunit.client.0.vm00.stdout:9/428: creat d0/d3d/d43/d80/d19/f95 x:0 0 0 2026-03-10T12:37:52.786 INFO:tasks.workunit.client.0.vm00.stdout:9/429: write d0/d3d/d43/d80/d1e/d2b/f5f [583511,47043] 0 2026-03-10T12:37:52.840 INFO:tasks.workunit.client.1.vm07.stdout:5/477: write d0/d22/d18/d19/d2e/d3f/f6a [3261387,94800] 0 2026-03-10T12:37:52.843 INFO:tasks.workunit.client.1.vm07.stdout:4/566: sync 2026-03-10T12:37:52.844 INFO:tasks.workunit.client.1.vm07.stdout:8/436: dwrite d1/f68 [0,4194304] 0 2026-03-10T12:37:52.845 INFO:tasks.workunit.client.1.vm07.stdout:5/478: read d0/d22/d18/d19/d2e/f88 [1791287,105856] 0 2026-03-10T12:37:52.848 INFO:tasks.workunit.client.1.vm07.stdout:6/390: getdents d1/d4/d6/d16/d1a/d2c 0 2026-03-10T12:37:52.865 INFO:tasks.workunit.client.0.vm00.stdout:1/408: dread da/d12/f30 [0,4194304] 0 2026-03-10T12:37:52.865 INFO:tasks.workunit.client.1.vm07.stdout:6/391: read d1/d4/d6/d16/f50 [413907,70834] 0 2026-03-10T12:37:52.865 INFO:tasks.workunit.client.1.vm07.stdout:5/479: rmdir d0/d22 39 2026-03-10T12:37:52.865 INFO:tasks.workunit.client.1.vm07.stdout:8/437: creat d1/d3/d6c/f8a x:0 0 0 2026-03-10T12:37:52.865 INFO:tasks.workunit.client.1.vm07.stdout:6/392: write d1/f26 [1242558,63224] 0 2026-03-10T12:37:52.866 INFO:tasks.workunit.client.0.vm00.stdout:1/409: dread f3 [0,4194304] 0 2026-03-10T12:37:52.868 
INFO:tasks.workunit.client.1.vm07.stdout:4/567: getdents d0/d4/d10 0 2026-03-10T12:37:52.868 INFO:tasks.workunit.client.1.vm07.stdout:8/438: mknod d1/d3/d18/c8b 0 2026-03-10T12:37:52.871 INFO:tasks.workunit.client.0.vm00.stdout:1/410: dwrite da/f13 [4194304,4194304] 0 2026-03-10T12:37:52.874 INFO:tasks.workunit.client.0.vm00.stdout:1/411: write da/d21/d27/f54 [4976579,18909] 0 2026-03-10T12:37:52.875 INFO:tasks.workunit.client.0.vm00.stdout:1/412: creat da/d21/f88 x:0 0 0 2026-03-10T12:37:52.880 INFO:tasks.workunit.client.1.vm07.stdout:6/393: creat d1/d4/d6/d16/d1a/d33/f7b x:0 0 0 2026-03-10T12:37:52.880 INFO:tasks.workunit.client.1.vm07.stdout:8/439: fdatasync d1/f2 0 2026-03-10T12:37:52.881 INFO:tasks.workunit.client.1.vm07.stdout:6/394: write d1/d4/d44/f45 [1928963,31869] 0 2026-03-10T12:37:52.886 INFO:tasks.workunit.client.1.vm07.stdout:5/480: rename d0/d22/d18/d19/l8d to d0/d22/la4 0 2026-03-10T12:37:52.886 INFO:tasks.workunit.client.1.vm07.stdout:8/440: creat d1/d3/d40/f8c x:0 0 0 2026-03-10T12:37:52.887 INFO:tasks.workunit.client.1.vm07.stdout:6/395: creat d1/d4/d6/f7c x:0 0 0 2026-03-10T12:37:52.896 INFO:tasks.workunit.client.1.vm07.stdout:4/568: creat d0/d5c/fc3 x:0 0 0 2026-03-10T12:37:52.896 INFO:tasks.workunit.client.1.vm07.stdout:8/441: dread d1/d3/d18/f75 [0,4194304] 0 2026-03-10T12:37:52.896 INFO:tasks.workunit.client.1.vm07.stdout:8/442: readlink d1/d3/d11/l5c 0 2026-03-10T12:37:52.913 INFO:tasks.workunit.client.1.vm07.stdout:6/396: rmdir d1/d4/d6/d4e 39 2026-03-10T12:37:52.929 INFO:tasks.workunit.client.1.vm07.stdout:8/443: symlink d1/d3/d40/l8d 0 2026-03-10T12:37:52.952 INFO:tasks.workunit.client.1.vm07.stdout:8/444: mkdir d1/d3/d18/d8e 0 2026-03-10T12:37:52.958 INFO:tasks.workunit.client.0.vm00.stdout:5/404: write d1f/d26/d2b/d35/f42 [9335,33690] 0 2026-03-10T12:37:52.958 INFO:tasks.workunit.client.0.vm00.stdout:5/405: chown d1f/d26/d2e 981419586 1 2026-03-10T12:37:52.964 INFO:tasks.workunit.client.0.vm00.stdout:5/406: dwrite d1f/d26/d2e/f71 
[0,4194304] 0 2026-03-10T12:37:52.965 INFO:tasks.workunit.client.1.vm07.stdout:1/407: dwrite d9/d2d/d4f/d5a/f65 [0,4194304] 0 2026-03-10T12:37:52.965 INFO:tasks.workunit.client.1.vm07.stdout:1/408: chown d9/df/d29/d2b/d30 2197 1 2026-03-10T12:37:52.970 INFO:tasks.workunit.client.0.vm00.stdout:2/392: dwrite d4/d6/f30 [0,4194304] 0 2026-03-10T12:37:52.973 INFO:tasks.workunit.client.0.vm00.stdout:2/393: write d4/d6/d2d/d31/f79 [5115964,12612] 0 2026-03-10T12:37:52.973 INFO:tasks.workunit.client.0.vm00.stdout:2/394: stat d4/d6/d2d/l5f 0 2026-03-10T12:37:52.980 INFO:tasks.workunit.client.1.vm07.stdout:5/481: getdents d0/d22/d18/d19/d36/d75 0 2026-03-10T12:37:52.983 INFO:tasks.workunit.client.0.vm00.stdout:3/457: dwrite dd/d4e/d5d/f71 [4194304,4194304] 0 2026-03-10T12:37:52.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:52 vm00.local ceph-mon[50686]: pgmap v161: 65 pgs: 65 active+clean; 1.8 GiB data, 6.5 GiB used, 113 GiB / 120 GiB avail; 45 MiB/s rd, 181 MiB/s wr, 332 op/s 2026-03-10T12:37:52.989 INFO:tasks.workunit.client.0.vm00.stdout:2/395: dwrite d4/d6/f75 [0,4194304] 0 2026-03-10T12:37:52.992 INFO:tasks.workunit.client.0.vm00.stdout:1/413: fsync da/d4d/d78/f86 0 2026-03-10T12:37:52.995 INFO:tasks.workunit.client.0.vm00.stdout:5/407: rename d1f/d26/d2b/d35/d53/d5b/d73 to d1f/d26/d2b/d35/d8b 0 2026-03-10T12:37:52.999 INFO:tasks.workunit.client.0.vm00.stdout:2/396: rmdir d4/d78 39 2026-03-10T12:37:53.003 INFO:tasks.workunit.client.0.vm00.stdout:3/458: rename c0 to dd/d18/d14/c9d 0 2026-03-10T12:37:53.003 INFO:tasks.workunit.client.0.vm00.stdout:5/408: dread - d1f/d26/d2b/d37/f61 zero size 2026-03-10T12:37:53.003 INFO:tasks.workunit.client.1.vm07.stdout:1/409: symlink d9/df/d29/d2c/d59/l81 0 2026-03-10T12:37:53.003 INFO:tasks.workunit.client.1.vm07.stdout:8/445: symlink d1/d3/d6/l8f 0 2026-03-10T12:37:53.003 INFO:tasks.workunit.client.1.vm07.stdout:5/482: chown d0/d22/d18/d19/d2e/l49 1128589 1 2026-03-10T12:37:53.009 
INFO:tasks.workunit.client.0.vm00.stdout:7/317: dread da/d25/f29 [0,4194304] 0 2026-03-10T12:37:53.011 INFO:tasks.workunit.client.1.vm07.stdout:7/407: write d0/f13 [1749533,114575] 0 2026-03-10T12:37:53.014 INFO:tasks.workunit.client.0.vm00.stdout:2/397: creat d4/d53/d76/f8b x:0 0 0 2026-03-10T12:37:53.016 INFO:tasks.workunit.client.0.vm00.stdout:9/430: dread d0/d3d/d59/f4a [0,4194304] 0 2026-03-10T12:37:53.017 INFO:tasks.workunit.client.0.vm00.stdout:9/431: chown d0/d3d/d43/d80/d19/c8e 777602 1 2026-03-10T12:37:53.018 INFO:tasks.workunit.client.0.vm00.stdout:4/427: truncate df/d1f/d36/d3a/d41/f47 3703845 0 2026-03-10T12:37:53.019 INFO:tasks.workunit.client.1.vm07.stdout:8/446: truncate d1/d3/f2d 4540459 0 2026-03-10T12:37:53.020 INFO:tasks.workunit.client.1.vm07.stdout:9/469: write d5/d13/d57/d4f/f63 [372850,5335] 0 2026-03-10T12:37:53.021 INFO:tasks.workunit.client.1.vm07.stdout:9/470: chown d5/d13/d57/d4f/f88 3 1 2026-03-10T12:37:53.022 INFO:tasks.workunit.client.0.vm00.stdout:4/428: dwrite df/d63/d77/f8d [0,4194304] 0 2026-03-10T12:37:53.024 INFO:tasks.workunit.client.0.vm00.stdout:4/429: chown df/d1f/d36/d3a/c55 57 1 2026-03-10T12:37:53.030 INFO:tasks.workunit.client.0.vm00.stdout:2/398: dread d4/dd/f3c [0,4194304] 0 2026-03-10T12:37:53.031 INFO:tasks.workunit.client.0.vm00.stdout:1/414: rename da/d24/f47 to da/d21/d39/f89 0 2026-03-10T12:37:53.037 INFO:tasks.workunit.client.0.vm00.stdout:3/459: creat dd/d18/d13/f9e x:0 0 0 2026-03-10T12:37:53.041 INFO:tasks.workunit.client.0.vm00.stdout:5/409: truncate d1f/d26/d2b/d37/f81 233933 0 2026-03-10T12:37:53.053 INFO:tasks.workunit.client.0.vm00.stdout:5/410: write d1f/f59 [1199751,38706] 0 2026-03-10T12:37:53.054 INFO:tasks.workunit.client.0.vm00.stdout:5/411: chown d1f/d6a/c7c 458684 1 2026-03-10T12:37:53.054 INFO:tasks.workunit.client.0.vm00.stdout:8/322: truncate d0/d12/f27 571848 0 2026-03-10T12:37:53.054 INFO:tasks.workunit.client.0.vm00.stdout:9/432: fsync d0/d3d/d43/d80/f8d 0 2026-03-10T12:37:53.054 
INFO:tasks.workunit.client.0.vm00.stdout:9/433: write d0/d3d/d43/d80/d1e/d27/f75 [419909,129306] 0 2026-03-10T12:37:53.054 INFO:tasks.workunit.client.0.vm00.stdout:9/434: chown d0/d3d/d43/d80/f8d 31407 1 2026-03-10T12:37:53.054 INFO:tasks.workunit.client.0.vm00.stdout:2/399: rmdir d4/d6/d2d/d3a 39 2026-03-10T12:37:53.056 INFO:tasks.workunit.client.0.vm00.stdout:1/415: creat da/d24/d28/d44/d5d/d80/f8a x:0 0 0 2026-03-10T12:37:53.063 INFO:tasks.workunit.client.1.vm07.stdout:6/397: link d1/d4/d4a/f55 d1/d4/d6/f7d 0 2026-03-10T12:37:53.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:52 vm07.local ceph-mon[58582]: pgmap v161: 65 pgs: 65 active+clean; 1.8 GiB data, 6.5 GiB used, 113 GiB / 120 GiB avail; 45 MiB/s rd, 181 MiB/s wr, 332 op/s 2026-03-10T12:37:53.070 INFO:tasks.workunit.client.0.vm00.stdout:1/416: mkdir da/d24/d28/d56/d8b 0 2026-03-10T12:37:53.070 INFO:tasks.workunit.client.0.vm00.stdout:0/433: truncate d3/d22/f2e 2589650 0 2026-03-10T12:37:53.071 INFO:tasks.workunit.client.0.vm00.stdout:3/460: creat dd/d2a/f9f x:0 0 0 2026-03-10T12:37:53.075 INFO:tasks.workunit.client.0.vm00.stdout:3/461: dwrite dd/d27/d2c/f7d [0,4194304] 0 2026-03-10T12:37:53.077 INFO:tasks.workunit.client.0.vm00.stdout:5/412: link d1f/d26/d2b/d35/d53/f70 d1f/d26/d2e/f8c 0 2026-03-10T12:37:53.077 INFO:tasks.workunit.client.0.vm00.stdout:8/323: mknod d0/d12/d36/d5b/c64 0 2026-03-10T12:37:53.080 INFO:tasks.workunit.client.0.vm00.stdout:9/435: rename d0/d3d/d43/d53/d57/c37 to d0/d3d/d59/c96 0 2026-03-10T12:37:53.081 INFO:tasks.workunit.client.0.vm00.stdout:9/436: rename d0/d3d/d43 to d0/d3d/d43/d80/d1e/d85/d97 22 2026-03-10T12:37:53.081 INFO:tasks.workunit.client.0.vm00.stdout:4/430: creat df/d1f/d36/f92 x:0 0 0 2026-03-10T12:37:53.081 INFO:tasks.workunit.client.0.vm00.stdout:9/437: readlink d0/d3d/d43/d53/d57/l55 0 2026-03-10T12:37:53.083 INFO:tasks.workunit.client.0.vm00.stdout:3/462: creat dd/d18/d14/fa0 x:0 0 0 2026-03-10T12:37:53.085 
INFO:tasks.workunit.client.0.vm00.stdout:3/463: readlink dd/d18/d13/l70 0 2026-03-10T12:37:53.085 INFO:tasks.workunit.client.0.vm00.stdout:5/413: symlink d1f/d6a/l8d 0 2026-03-10T12:37:53.086 INFO:tasks.workunit.client.1.vm07.stdout:8/447: creat d1/d3/d11/f90 x:0 0 0 2026-03-10T12:37:53.087 INFO:tasks.workunit.client.0.vm00.stdout:2/400: mknod d4/d6/d2d/d3a/d43/c8c 0 2026-03-10T12:37:53.088 INFO:tasks.workunit.client.0.vm00.stdout:2/401: write d4/f6e [319464,55400] 0 2026-03-10T12:37:53.091 INFO:tasks.workunit.client.0.vm00.stdout:1/417: link da/d21/f88 da/d21/d39/f8c 0 2026-03-10T12:37:53.111 INFO:tasks.workunit.client.1.vm07.stdout:2/347: dwrite d0/d42/d26/d38/d4f/f65 [0,4194304] 0 2026-03-10T12:37:53.111 INFO:tasks.workunit.client.1.vm07.stdout:2/348: readlink d0/lb 0 2026-03-10T12:37:53.111 INFO:tasks.workunit.client.1.vm07.stdout:4/569: dread d0/d4/d7a/f87 [0,4194304] 0 2026-03-10T12:37:53.111 INFO:tasks.workunit.client.1.vm07.stdout:2/349: rename d0/d42/d1f/d20 to d0/d42/d1f/d20/d7a 22 2026-03-10T12:37:53.111 INFO:tasks.workunit.client.1.vm07.stdout:4/570: chown d0/d8e/fb5 3112 1 2026-03-10T12:37:53.111 INFO:tasks.workunit.client.1.vm07.stdout:2/350: write d0/f2d [1342464,92114] 0 2026-03-10T12:37:53.111 INFO:tasks.workunit.client.1.vm07.stdout:9/471: dread d5/fb [0,4194304] 0 2026-03-10T12:37:53.111 INFO:tasks.workunit.client.0.vm00.stdout:6/329: dwrite d2/da/dc/d2f/f56 [0,4194304] 0 2026-03-10T12:37:53.112 INFO:tasks.workunit.client.0.vm00.stdout:5/414: dwrite d1f/d26/d2b/d35/f68 [0,4194304] 0 2026-03-10T12:37:53.112 INFO:tasks.workunit.client.0.vm00.stdout:1/418: write da/d21/f74 [1177468,103072] 0 2026-03-10T12:37:53.112 INFO:tasks.workunit.client.0.vm00.stdout:4/431: mkdir df/d93 0 2026-03-10T12:37:53.112 INFO:tasks.workunit.client.0.vm00.stdout:4/432: truncate df/d63/d77/f8d 4838649 0 2026-03-10T12:37:53.112 INFO:tasks.workunit.client.0.vm00.stdout:9/438: mkdir d0/d3d/d43/d80/d1e/d85/d98 0 2026-03-10T12:37:53.112 
INFO:tasks.workunit.client.0.vm00.stdout:4/433: mkdir df/d63/d94 0 2026-03-10T12:37:53.112 INFO:tasks.workunit.client.0.vm00.stdout:9/439: symlink d0/d7f/l99 0 2026-03-10T12:37:53.113 INFO:tasks.workunit.client.0.vm00.stdout:9/440: dread - d0/d3d/d59/d4e/d84/f87 zero size 2026-03-10T12:37:53.116 INFO:tasks.workunit.client.0.vm00.stdout:8/324: link d0/f22 d0/d12/d36/d5b/f65 0 2026-03-10T12:37:53.117 INFO:tasks.workunit.client.0.vm00.stdout:2/402: rename d4/dd/d38/l88 to d4/l8d 0 2026-03-10T12:37:53.117 INFO:tasks.workunit.client.0.vm00.stdout:9/441: dwrite d0/d3d/d43/d53/d57/f8b [0,4194304] 0 2026-03-10T12:37:53.123 INFO:tasks.workunit.client.0.vm00.stdout:1/419: getdents da/d24/d28/d67 0 2026-03-10T12:37:53.123 INFO:tasks.workunit.client.0.vm00.stdout:1/420: read - da/d21/d39/f8c zero size 2026-03-10T12:37:53.128 INFO:tasks.workunit.client.0.vm00.stdout:6/330: mkdir d2/d14/d7a 0 2026-03-10T12:37:53.137 INFO:tasks.workunit.client.0.vm00.stdout:2/403: mkdir d4/d6/d41/d6d/d40/d8e 0 2026-03-10T12:37:53.138 INFO:tasks.workunit.client.0.vm00.stdout:8/325: mknod d0/c66 0 2026-03-10T12:37:53.138 INFO:tasks.workunit.client.0.vm00.stdout:8/326: stat d0/d12/d36/d51/f61 0 2026-03-10T12:37:53.138 INFO:tasks.workunit.client.0.vm00.stdout:9/442: symlink d0/d3d/d43/d53/l9a 0 2026-03-10T12:37:53.138 INFO:tasks.workunit.client.0.vm00.stdout:8/327: creat d0/d12/d17/f67 x:0 0 0 2026-03-10T12:37:53.142 INFO:tasks.workunit.client.0.vm00.stdout:8/328: dwrite d0/d5c/f42 [0,4194304] 0 2026-03-10T12:37:53.145 INFO:tasks.workunit.client.0.vm00.stdout:4/434: rename df/d32/f58 to df/d1f/d22/f95 0 2026-03-10T12:37:53.158 INFO:tasks.workunit.client.1.vm07.stdout:7/408: getdents d0/d67/d6f/d80 0 2026-03-10T12:37:53.158 INFO:tasks.workunit.client.1.vm07.stdout:7/409: chown d0/f13 39 1 2026-03-10T12:37:53.159 INFO:tasks.workunit.client.0.vm00.stdout:2/404: creat d4/d6/d2d/d3a/d43/d85/f8f x:0 0 0 2026-03-10T12:37:53.159 INFO:tasks.workunit.client.0.vm00.stdout:2/405: fdatasync d4/d6/d2d/d3a/f7c 0 
2026-03-10T12:37:53.159 INFO:tasks.workunit.client.0.vm00.stdout:1/421: rename da/d24/d28/c29 to da/d24/d28/d44/d5d/d72/d7e/c8d 0 2026-03-10T12:37:53.159 INFO:tasks.workunit.client.0.vm00.stdout:2/406: chown d4/d6/d2d/d31/l4f 104421 1 2026-03-10T12:37:53.159 INFO:tasks.workunit.client.0.vm00.stdout:8/329: mkdir d0/d58/d68 0 2026-03-10T12:37:53.159 INFO:tasks.workunit.client.0.vm00.stdout:1/422: mknod da/d4d/d78/c8e 0 2026-03-10T12:37:53.161 INFO:tasks.workunit.client.0.vm00.stdout:1/423: symlink da/d24/d28/d56/l8f 0 2026-03-10T12:37:53.165 INFO:tasks.workunit.client.0.vm00.stdout:1/424: mknod da/d24/d28/d44/c90 0 2026-03-10T12:37:53.166 INFO:tasks.workunit.client.0.vm00.stdout:1/425: chown da/d4d 7211 1 2026-03-10T12:37:53.167 INFO:tasks.workunit.client.0.vm00.stdout:2/407: link d4/dd/c20 d4/d53/d68/c90 0 2026-03-10T12:37:53.171 INFO:tasks.workunit.client.0.vm00.stdout:2/408: dwrite d4/d6/d41/d6d/d40/f80 [0,4194304] 0 2026-03-10T12:37:53.177 INFO:tasks.workunit.client.0.vm00.stdout:2/409: mknod d4/d6/d2d/d3a/d43/d85/c91 0 2026-03-10T12:37:53.177 INFO:tasks.workunit.client.0.vm00.stdout:2/410: creat d4/d53/d76/f92 x:0 0 0 2026-03-10T12:37:53.177 INFO:tasks.workunit.client.0.vm00.stdout:2/411: read - d4/d6/d41/d6d/d40/f7e zero size 2026-03-10T12:37:53.177 INFO:tasks.workunit.client.0.vm00.stdout:2/412: stat d4/d6/d41/d6d/l48 0 2026-03-10T12:37:53.177 INFO:tasks.workunit.client.0.vm00.stdout:2/413: stat d4/d6/d2d/d3a/d43/d85/c91 0 2026-03-10T12:37:53.178 INFO:tasks.workunit.client.0.vm00.stdout:2/414: mkdir d4/d6/d93 0 2026-03-10T12:37:53.179 INFO:tasks.workunit.client.0.vm00.stdout:2/415: write d4/dd/f10 [2250344,20215] 0 2026-03-10T12:37:53.184 INFO:tasks.workunit.client.0.vm00.stdout:6/331: rmdir d2/d39 39 2026-03-10T12:37:53.191 INFO:tasks.workunit.client.0.vm00.stdout:1/426: dread da/d24/f81 [0,4194304] 0 2026-03-10T12:37:53.192 INFO:tasks.workunit.client.0.vm00.stdout:1/427: mkdir da/d12/d91 0 2026-03-10T12:37:53.193 
INFO:tasks.workunit.client.0.vm00.stdout:1/428: dread - da/d12/d26/f69 zero size 2026-03-10T12:37:53.193 INFO:tasks.workunit.client.0.vm00.stdout:1/429: fdatasync da/d21/d27/f6e 0 2026-03-10T12:37:53.214 INFO:tasks.workunit.client.1.vm07.stdout:7/410: creat d0/d47/f81 x:0 0 0 2026-03-10T12:37:53.220 INFO:tasks.workunit.client.0.vm00.stdout:6/332: mkdir d2/d51/d7b 0 2026-03-10T12:37:53.220 INFO:tasks.workunit.client.1.vm07.stdout:9/472: getdents d5/d13/d57/d3e 0 2026-03-10T12:37:53.220 INFO:tasks.workunit.client.1.vm07.stdout:7/411: link d0/c3e d0/d47/c82 0 2026-03-10T12:37:53.224 INFO:tasks.workunit.client.0.vm00.stdout:2/416: dread d4/d53/f5d [0,4194304] 0 2026-03-10T12:37:53.226 INFO:tasks.workunit.client.0.vm00.stdout:2/417: unlink d4/dd/l5c 0 2026-03-10T12:37:53.230 INFO:tasks.workunit.client.0.vm00.stdout:2/418: dwrite d4/d6/d41/d6d/d40/f50 [0,4194304] 0 2026-03-10T12:37:53.236 INFO:tasks.workunit.client.0.vm00.stdout:2/419: chown d4/dd/l29 4022608 1 2026-03-10T12:37:53.236 INFO:tasks.workunit.client.0.vm00.stdout:2/420: chown d4/d6/d2d/d3a/d43/d85/f8f 53 1 2026-03-10T12:37:53.236 INFO:tasks.workunit.client.0.vm00.stdout:2/421: readlink d4/d53/d76/l87 0 2026-03-10T12:37:53.237 INFO:tasks.workunit.client.0.vm00.stdout:2/422: mknod d4/d6/d2d/c94 0 2026-03-10T12:37:53.238 INFO:tasks.workunit.client.0.vm00.stdout:2/423: mknod d4/d6/d2d/d31/c95 0 2026-03-10T12:37:53.239 INFO:tasks.workunit.client.0.vm00.stdout:2/424: stat d4/f1d 0 2026-03-10T12:37:53.241 INFO:tasks.workunit.client.0.vm00.stdout:2/425: rename d4/d53/f5d to d4/d6/d41/f96 0 2026-03-10T12:37:53.243 INFO:tasks.workunit.client.0.vm00.stdout:2/426: rename d4/d6/d2d/l81 to d4/dd/l97 0 2026-03-10T12:37:53.260 INFO:tasks.workunit.client.1.vm07.stdout:8/448: sync 2026-03-10T12:37:53.260 INFO:tasks.workunit.client.1.vm07.stdout:4/571: sync 2026-03-10T12:37:53.260 INFO:tasks.workunit.client.1.vm07.stdout:6/398: sync 2026-03-10T12:37:53.266 INFO:tasks.workunit.client.1.vm07.stdout:4/572: creat d0/d8e/fc4 x:0 0 0 
2026-03-10T12:37:53.267 INFO:tasks.workunit.client.1.vm07.stdout:8/449: truncate d1/d3/f59 181523 0 2026-03-10T12:37:53.269 INFO:tasks.workunit.client.1.vm07.stdout:4/573: mkdir d0/d4/d5/d78/dc5 0 2026-03-10T12:37:53.280 INFO:tasks.workunit.client.1.vm07.stdout:8/450: rename d1/d3/d18/l22 to d1/d3/d6/d54/l91 0 2026-03-10T12:37:53.282 INFO:tasks.workunit.client.1.vm07.stdout:9/473: dread d5/f65 [0,4194304] 0 2026-03-10T12:37:53.285 INFO:tasks.workunit.client.1.vm07.stdout:8/451: mkdir d1/d3/d40/d92 0 2026-03-10T12:37:53.288 INFO:tasks.workunit.client.1.vm07.stdout:9/474: chown d5/l12 22915 1 2026-03-10T12:37:53.290 INFO:tasks.workunit.client.1.vm07.stdout:8/452: rmdir d1 39 2026-03-10T12:37:53.300 INFO:tasks.workunit.client.1.vm07.stdout:9/475: mkdir d5/d13/d6c/d89/dac 0 2026-03-10T12:37:53.313 INFO:tasks.workunit.client.1.vm07.stdout:8/453: unlink d1/d3/d6/l8f 0 2026-03-10T12:37:53.313 INFO:tasks.workunit.client.1.vm07.stdout:9/476: creat d5/d1f/d31/fad x:0 0 0 2026-03-10T12:37:53.313 INFO:tasks.workunit.client.1.vm07.stdout:8/454: fdatasync d1/d3/f57 0 2026-03-10T12:37:53.319 INFO:tasks.workunit.client.1.vm07.stdout:9/477: creat d5/d1f/d5e/d6b/fae x:0 0 0 2026-03-10T12:37:53.333 INFO:tasks.workunit.client.1.vm07.stdout:9/478: unlink d5/f45 0 2026-03-10T12:37:53.333 INFO:tasks.workunit.client.1.vm07.stdout:6/399: sync 2026-03-10T12:37:53.333 INFO:tasks.workunit.client.1.vm07.stdout:6/400: stat d1/d4/d6/f7d 0 2026-03-10T12:37:53.333 INFO:tasks.workunit.client.1.vm07.stdout:6/401: write d1/d4/d4a/f56 [1734178,123975] 0 2026-03-10T12:37:53.333 INFO:tasks.workunit.client.1.vm07.stdout:6/402: fsync d1/d4/d44/f45 0 2026-03-10T12:37:53.344 INFO:tasks.workunit.client.0.vm00.stdout:1/430: dread da/d12/f1d [0,4194304] 0 2026-03-10T12:37:53.347 INFO:tasks.workunit.client.0.vm00.stdout:1/431: dread da/d12/f1d [0,4194304] 0 2026-03-10T12:37:53.355 INFO:tasks.workunit.client.0.vm00.stdout:1/432: chown da/d4d/d78/c8e 25 1 2026-03-10T12:37:53.355 
INFO:tasks.workunit.client.0.vm00.stdout:1/433: write da/f13 [1302490,28650] 0 2026-03-10T12:37:53.355 INFO:tasks.workunit.client.0.vm00.stdout:1/434: dread da/d24/d28/d67/f52 [0,4194304] 0 2026-03-10T12:37:53.355 INFO:tasks.workunit.client.0.vm00.stdout:1/435: creat da/d21/d39/f92 x:0 0 0 2026-03-10T12:37:53.368 INFO:tasks.workunit.client.1.vm07.stdout:0/488: write d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/f6e [1395182,79892] 0 2026-03-10T12:37:53.373 INFO:tasks.workunit.client.0.vm00.stdout:1/436: dread da/d21/d39/f89 [0,4194304] 0 2026-03-10T12:37:53.373 INFO:tasks.workunit.client.1.vm07.stdout:3/465: dwrite dc/f17 [0,4194304] 0 2026-03-10T12:37:53.375 INFO:tasks.workunit.client.0.vm00.stdout:1/437: dread da/d12/f1d [0,4194304] 0 2026-03-10T12:37:53.377 INFO:tasks.workunit.client.0.vm00.stdout:1/438: fsync f3 0 2026-03-10T12:37:53.380 INFO:tasks.workunit.client.0.vm00.stdout:1/439: mkdir da/d21/d93 0 2026-03-10T12:37:53.380 INFO:tasks.workunit.client.1.vm07.stdout:1/410: write d9/df/f58 [472792,11022] 0 2026-03-10T12:37:53.381 INFO:tasks.workunit.client.0.vm00.stdout:7/318: write da/d26/d37/f4a [559102,47514] 0 2026-03-10T12:37:53.381 INFO:tasks.workunit.client.1.vm07.stdout:5/483: write d0/d22/d18/d3e/d53/f84 [394986,116666] 0 2026-03-10T12:37:53.382 INFO:tasks.workunit.client.1.vm07.stdout:1/411: readlink d9/df/d29/d2c/l66 0 2026-03-10T12:37:53.385 INFO:tasks.workunit.client.0.vm00.stdout:7/319: dwrite da/fb [0,4194304] 0 2026-03-10T12:37:53.394 INFO:tasks.workunit.client.1.vm07.stdout:0/489: rename d0/d14/d5f/d76/d2f/d31/d6b to d0/d14/d5f/d76/da1 0 2026-03-10T12:37:53.395 INFO:tasks.workunit.client.1.vm07.stdout:0/490: chown d0 607 1 2026-03-10T12:37:53.398 INFO:tasks.workunit.client.0.vm00.stdout:7/320: creat da/d26/d37/f79 x:0 0 0 2026-03-10T12:37:53.399 INFO:tasks.workunit.client.1.vm07.stdout:0/491: chown d0/d14/d5f/d76/d2f/d31/d4f/f92 329097068 1 2026-03-10T12:37:53.407 INFO:tasks.workunit.client.1.vm07.stdout:0/492: rmdir d0/d14 39 2026-03-10T12:37:53.414 
INFO:tasks.workunit.client.1.vm07.stdout:5/484: creat d0/d22/d18/d3e/fa5 x:0 0 0 2026-03-10T12:37:53.425 INFO:tasks.workunit.client.0.vm00.stdout:3/464: write dd/d18/d13/f6b [3283176,53597] 0 2026-03-10T12:37:53.426 INFO:tasks.workunit.client.1.vm07.stdout:1/412: creat d9/df/d29/f82 x:0 0 0 2026-03-10T12:37:53.430 INFO:tasks.workunit.client.0.vm00.stdout:3/465: creat dd/d27/d2c/d34/d38/fa1 x:0 0 0 2026-03-10T12:37:53.432 INFO:tasks.workunit.client.0.vm00.stdout:5/415: truncate d1f/d26/d2b/f52 2766695 0 2026-03-10T12:37:53.432 INFO:tasks.workunit.client.1.vm07.stdout:5/485: symlink d0/d22/d18/d80/la6 0 2026-03-10T12:37:53.435 INFO:tasks.workunit.client.0.vm00.stdout:5/416: mkdir d1f/d26/d2b/d35/d8b/d8e 0 2026-03-10T12:37:53.438 INFO:tasks.workunit.client.0.vm00.stdout:5/417: rename d1f/d6a/c56 to d1f/d26/d2e/c8f 0 2026-03-10T12:37:53.438 INFO:tasks.workunit.client.0.vm00.stdout:5/418: fsync f12 0 2026-03-10T12:37:53.439 INFO:tasks.workunit.client.0.vm00.stdout:5/419: chown d1f/d26/d2b/d35/c64 12 1 2026-03-10T12:37:53.439 INFO:tasks.workunit.client.0.vm00.stdout:5/420: stat d1f/d26/f48 0 2026-03-10T12:37:53.441 INFO:tasks.workunit.client.0.vm00.stdout:4/435: unlink df/d1f/d22/f95 0 2026-03-10T12:37:53.443 INFO:tasks.workunit.client.0.vm00.stdout:7/321: symlink da/d26/d37/l7a 0 2026-03-10T12:37:53.444 INFO:tasks.workunit.client.0.vm00.stdout:7/322: read - da/d25/f5a zero size 2026-03-10T12:37:53.446 INFO:tasks.workunit.client.0.vm00.stdout:5/421: mkdir d1f/d26/d2b/d35/d8b/d90 0 2026-03-10T12:37:53.448 INFO:tasks.workunit.client.0.vm00.stdout:8/330: dread d0/d12/d17/f1d [0,4194304] 0 2026-03-10T12:37:53.450 INFO:tasks.workunit.client.0.vm00.stdout:9/443: dwrite d0/d3d/f8c [0,4194304] 0 2026-03-10T12:37:53.450 INFO:tasks.workunit.client.0.vm00.stdout:9/444: stat d0/d3d/d43/f54 0 2026-03-10T12:37:53.451 INFO:tasks.workunit.client.0.vm00.stdout:5/422: mknod d1f/d39/c91 0 2026-03-10T12:37:53.452 INFO:tasks.workunit.client.1.vm07.stdout:1/413: link d9/df/f26 
d9/d2d/d4f/d75/f83 0 2026-03-10T12:37:53.452 INFO:tasks.workunit.client.0.vm00.stdout:5/423: write d1f/d26/f79 [1013445,8781] 0 2026-03-10T12:37:53.454 INFO:tasks.workunit.client.0.vm00.stdout:7/323: write da/d25/d2c/f4f [1677004,19662] 0 2026-03-10T12:37:53.456 INFO:tasks.workunit.client.0.vm00.stdout:8/331: creat d0/d12/d36/d5b/f69 x:0 0 0 2026-03-10T12:37:53.459 INFO:tasks.workunit.client.0.vm00.stdout:4/436: dread df/f20 [0,4194304] 0 2026-03-10T12:37:53.459 INFO:tasks.workunit.client.0.vm00.stdout:7/324: mkdir da/d41/d7b 0 2026-03-10T12:37:53.467 INFO:tasks.workunit.client.0.vm00.stdout:2/427: rmdir d4/d53/d68 39 2026-03-10T12:37:53.468 INFO:tasks.workunit.client.0.vm00.stdout:2/428: write d4/d6/d2d/d3a/d43/d51/f6f [905968,88627] 0 2026-03-10T12:37:53.468 INFO:tasks.workunit.client.1.vm07.stdout:0/493: link d0/d14/d5f/d41/d86/f96 d0/d14/d5f/d76/da1/fa2 0 2026-03-10T12:37:53.473 INFO:tasks.workunit.client.0.vm00.stdout:4/437: truncate df/d32/d76/f82 4439439 0 2026-03-10T12:37:53.476 INFO:tasks.workunit.client.0.vm00.stdout:2/429: mknod d4/d6/d2d/d31/c98 0 2026-03-10T12:37:53.476 INFO:tasks.workunit.client.1.vm07.stdout:2/351: write d0/d42/d26/d4b/f58 [854866,67142] 0 2026-03-10T12:37:53.476 INFO:tasks.workunit.client.1.vm07.stdout:7/412: write d0/d61/f64 [247751,13544] 0 2026-03-10T12:37:53.476 INFO:tasks.workunit.client.1.vm07.stdout:2/352: fdatasync d0/d42/d1f/d20/f3f 0 2026-03-10T12:37:53.476 INFO:tasks.workunit.client.1.vm07.stdout:2/353: read d0/d42/f2c [1209524,80177] 0 2026-03-10T12:37:53.478 INFO:tasks.workunit.client.0.vm00.stdout:9/445: mkdir d0/d9b 0 2026-03-10T12:37:53.481 INFO:tasks.workunit.client.0.vm00.stdout:5/424: symlink d1f/d26/l92 0 2026-03-10T12:37:53.482 INFO:tasks.workunit.client.1.vm07.stdout:4/574: dwrite d0/d4/d7a/f93 [0,4194304] 0 2026-03-10T12:37:53.486 INFO:tasks.workunit.client.0.vm00.stdout:4/438: readlink df/d1f/l35 0 2026-03-10T12:37:53.487 INFO:tasks.workunit.client.1.vm07.stdout:8/455: write d1/d3/d40/f5b [475756,55104] 0 
2026-03-10T12:37:53.487 INFO:tasks.workunit.client.1.vm07.stdout:9/479: truncate d5/d1f/d31/f82 2918302 0 2026-03-10T12:37:53.488 INFO:tasks.workunit.client.1.vm07.stdout:2/354: dread d0/d42/f1b [0,4194304] 0 2026-03-10T12:37:53.488 INFO:tasks.workunit.client.0.vm00.stdout:3/466: dread dd/d18/f83 [0,4194304] 0 2026-03-10T12:37:53.489 INFO:tasks.workunit.client.0.vm00.stdout:3/467: chown dd/d18/d13/d1d/d43/c9b 69278 1 2026-03-10T12:37:53.492 INFO:tasks.workunit.client.0.vm00.stdout:2/430: mknod d4/d6/d2d/d3a/c99 0 2026-03-10T12:37:53.494 INFO:tasks.workunit.client.0.vm00.stdout:7/325: symlink da/d25/d2e/d4c/l7c 0 2026-03-10T12:37:53.496 INFO:tasks.workunit.client.0.vm00.stdout:9/446: rmdir d0/d3d/d43/d80/d19 39 2026-03-10T12:37:53.502 INFO:tasks.workunit.client.0.vm00.stdout:3/468: rmdir dd/d3d/d65 39 2026-03-10T12:37:53.507 INFO:tasks.workunit.client.0.vm00.stdout:9/447: chown d0/d5/c72 9658 1 2026-03-10T12:37:53.507 INFO:tasks.workunit.client.0.vm00.stdout:5/425: dread f12 [0,4194304] 0 2026-03-10T12:37:53.508 INFO:tasks.workunit.client.0.vm00.stdout:1/440: write da/fc [1603053,54194] 0 2026-03-10T12:37:53.508 INFO:tasks.workunit.client.0.vm00.stdout:5/426: read - d1f/d26/d2b/f7e zero size 2026-03-10T12:37:53.509 INFO:tasks.workunit.client.0.vm00.stdout:9/448: read d0/d3d/d43/d53/d57/f3f [2944388,84036] 0 2026-03-10T12:37:53.511 INFO:tasks.workunit.client.0.vm00.stdout:3/469: mkdir dd/d2a/da2 0 2026-03-10T12:37:53.513 INFO:tasks.workunit.client.0.vm00.stdout:2/431: fsync f1 0 2026-03-10T12:37:53.515 INFO:tasks.workunit.client.1.vm07.stdout:6/403: write d1/d4/f11 [3450442,76087] 0 2026-03-10T12:37:53.517 INFO:tasks.workunit.client.0.vm00.stdout:1/441: mkdir da/d21/d27/d6a/d94 0 2026-03-10T12:37:53.518 INFO:tasks.workunit.client.0.vm00.stdout:5/427: symlink d1f/d26/d2b/d35/d53/d5b/l93 0 2026-03-10T12:37:53.519 INFO:tasks.workunit.client.0.vm00.stdout:9/449: rmdir d0/d7f 39 2026-03-10T12:37:53.520 INFO:tasks.workunit.client.1.vm07.stdout:3/466: dread 
dc/dd/d28/d3b/f4d [0,4194304] 0 2026-03-10T12:37:53.522 INFO:tasks.workunit.client.0.vm00.stdout:9/450: dread d0/d3d/d43/d53/d57/f6c [0,4194304] 0 2026-03-10T12:37:53.526 INFO:tasks.workunit.client.0.vm00.stdout:3/470: rename dd/d2a/l4a to dd/d2a/la3 0 2026-03-10T12:37:53.526 INFO:tasks.workunit.client.0.vm00.stdout:3/471: chown dd/d3d/d73 37826 1 2026-03-10T12:37:53.526 INFO:tasks.workunit.client.0.vm00.stdout:3/472: chown dd/d27/f91 27968728 1 2026-03-10T12:37:53.526 INFO:tasks.workunit.client.0.vm00.stdout:3/473: truncate dd/d64/f87 1561783 0 2026-03-10T12:37:53.527 INFO:tasks.workunit.client.0.vm00.stdout:3/474: write dd/d27/d2c/f7d [1768890,129457] 0 2026-03-10T12:37:53.528 INFO:tasks.workunit.client.0.vm00.stdout:3/475: chown dd/d18/d13/f22 53 1 2026-03-10T12:37:53.528 INFO:tasks.workunit.client.0.vm00.stdout:3/476: write dd/d18/f7c [3641298,37337] 0 2026-03-10T12:37:53.530 INFO:tasks.workunit.client.1.vm07.stdout:4/575: creat d0/d4/d10/d3c/d2b/d2d/da7/fc6 x:0 0 0 2026-03-10T12:37:53.532 INFO:tasks.workunit.client.1.vm07.stdout:4/576: write d0/d4/d7a/f93 [700012,114522] 0 2026-03-10T12:37:53.536 INFO:tasks.workunit.client.1.vm07.stdout:8/456: truncate d1/f3f 825224 0 2026-03-10T12:37:53.536 INFO:tasks.workunit.client.0.vm00.stdout:5/428: unlink d1f/d6a/c7c 0 2026-03-10T12:37:53.538 INFO:tasks.workunit.client.0.vm00.stdout:9/451: rename d0/d3d/d43/d80/d1e/d27/f52 to d0/d5/dc/f9c 0 2026-03-10T12:37:53.540 INFO:tasks.workunit.client.1.vm07.stdout:9/480: mkdir d5/d13/d6c/d7a/daf 0 2026-03-10T12:37:53.542 INFO:tasks.workunit.client.1.vm07.stdout:2/355: symlink d0/d42/d26/d4b/l7b 0 2026-03-10T12:37:53.542 INFO:tasks.workunit.client.0.vm00.stdout:3/477: unlink dd/d27/d2c/d34/d38/f76 0 2026-03-10T12:37:53.543 INFO:tasks.workunit.client.0.vm00.stdout:2/432: mknod d4/d53/c9a 0 2026-03-10T12:37:53.551 INFO:tasks.workunit.client.1.vm07.stdout:5/486: getdents d0/d22 0 2026-03-10T12:37:53.576 INFO:tasks.workunit.client.1.vm07.stdout:0/494: mknod d0/ca3 0 
2026-03-10T12:37:53.576 INFO:tasks.workunit.client.0.vm00.stdout:2/433: fsync d4/d6/f2b 0 2026-03-10T12:37:53.576 INFO:tasks.workunit.client.0.vm00.stdout:1/442: mknod da/d24/c95 0 2026-03-10T12:37:53.577 INFO:tasks.workunit.client.0.vm00.stdout:5/429: mkdir d1f/d6a/d94 0 2026-03-10T12:37:53.577 INFO:tasks.workunit.client.0.vm00.stdout:9/452: mknod d0/d3d/d43/d80/d1e/d85/c9d 0 2026-03-10T12:37:53.577 INFO:tasks.workunit.client.0.vm00.stdout:8/332: truncate d0/d12/f27 348679 0 2026-03-10T12:37:53.577 INFO:tasks.workunit.client.0.vm00.stdout:3/478: dread dd/d64/f98 [0,4194304] 0 2026-03-10T12:37:53.577 INFO:tasks.workunit.client.0.vm00.stdout:5/430: symlink d1f/d26/d2b/d35/d78/l95 0 2026-03-10T12:37:53.577 INFO:tasks.workunit.client.0.vm00.stdout:9/453: rmdir d0/d3d/d43/d80/d1e/d27 39 2026-03-10T12:37:53.577 INFO:tasks.workunit.client.0.vm00.stdout:3/479: creat dd/d64/fa4 x:0 0 0 2026-03-10T12:37:53.577 INFO:tasks.workunit.client.0.vm00.stdout:3/480: chown dd/d2a/f9f 1001774 1 2026-03-10T12:37:53.577 INFO:tasks.workunit.client.0.vm00.stdout:9/454: creat d0/d3d/d43/d80/d1e/d27/f9e x:0 0 0 2026-03-10T12:37:53.577 INFO:tasks.workunit.client.1.vm07.stdout:3/467: mknod dc/d18/d99/ca4 0 2026-03-10T12:37:53.577 INFO:tasks.workunit.client.1.vm07.stdout:7/413: creat d0/d61/d79/f83 x:0 0 0 2026-03-10T12:37:53.577 INFO:tasks.workunit.client.1.vm07.stdout:4/577: creat d0/d4/d10/fc7 x:0 0 0 2026-03-10T12:37:53.577 INFO:tasks.workunit.client.1.vm07.stdout:5/487: rename d0/d22/d18/d19/d21/d54/l4b to d0/d22/d18/d80/la7 0 2026-03-10T12:37:53.577 INFO:tasks.workunit.client.1.vm07.stdout:0/495: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/fa4 x:0 0 0 2026-03-10T12:37:53.577 INFO:tasks.workunit.client.1.vm07.stdout:4/578: unlink d0/d5c/fc3 0 2026-03-10T12:37:53.588 INFO:tasks.workunit.client.1.vm07.stdout:3/468: link dc/dd/f21 dc/dd/d28/d3b/fa5 0 2026-03-10T12:37:53.592 INFO:tasks.workunit.client.1.vm07.stdout:8/457: getdents d1/d3/d5d/d65 0 2026-03-10T12:37:53.594 
INFO:tasks.workunit.client.0.vm00.stdout:5/431: dread d1f/f59 [0,4194304] 0 2026-03-10T12:37:53.597 INFO:tasks.workunit.client.0.vm00.stdout:5/432: mkdir d1f/d96 0 2026-03-10T12:37:53.597 INFO:tasks.workunit.client.1.vm07.stdout:3/469: link dc/dd/d28/l6b dc/d18/la6 0 2026-03-10T12:37:53.599 INFO:tasks.workunit.client.1.vm07.stdout:3/470: truncate dc/dd/d28/d3b/f9f 844132 0 2026-03-10T12:37:53.604 INFO:tasks.workunit.client.1.vm07.stdout:3/471: write dc/d18/f36 [1599584,91523] 0 2026-03-10T12:37:53.609 INFO:tasks.workunit.client.0.vm00.stdout:5/433: dread d1f/d26/d2b/d37/f77 [4194304,4194304] 0 2026-03-10T12:37:53.609 INFO:tasks.workunit.client.1.vm07.stdout:3/472: rename c9 to dc/d18/d2d/ca7 0 2026-03-10T12:37:53.609 INFO:tasks.workunit.client.0.vm00.stdout:5/434: chown c6 7 1 2026-03-10T12:37:53.611 INFO:tasks.workunit.client.0.vm00.stdout:5/435: rmdir d1f/d26/d2e/d7a 0 2026-03-10T12:37:53.625 INFO:tasks.workunit.client.1.vm07.stdout:3/473: dwrite dc/dd/f85 [0,4194304] 0 2026-03-10T12:37:53.625 INFO:tasks.workunit.client.0.vm00.stdout:5/436: creat d1f/f97 x:0 0 0 2026-03-10T12:37:53.625 INFO:tasks.workunit.client.0.vm00.stdout:5/437: symlink d1f/d96/l98 0 2026-03-10T12:37:53.625 INFO:tasks.workunit.client.0.vm00.stdout:5/438: mkdir d1f/d26/d2b/d35/d78/d99 0 2026-03-10T12:37:53.625 INFO:tasks.workunit.client.0.vm00.stdout:5/439: rename d1f/d26/d2e/d58/d6b/f80 to d1f/d26/d2e/d58/d6b/f9a 0 2026-03-10T12:37:53.625 INFO:tasks.workunit.client.0.vm00.stdout:5/440: creat d1f/d26/d6f/f9b x:0 0 0 2026-03-10T12:37:53.625 INFO:tasks.workunit.client.0.vm00.stdout:5/441: truncate d1f/d6a/f84 432496 0 2026-03-10T12:37:53.625 INFO:tasks.workunit.client.0.vm00.stdout:5/442: dread - d1f/d26/d2b/d35/d53/d72/f85 zero size 2026-03-10T12:37:53.626 INFO:tasks.workunit.client.0.vm00.stdout:4/439: sync 2026-03-10T12:37:53.626 INFO:tasks.workunit.client.1.vm07.stdout:6/404: sync 2026-03-10T12:37:53.626 INFO:tasks.workunit.client.0.vm00.stdout:1/443: sync 2026-03-10T12:37:53.627 
INFO:tasks.workunit.client.0.vm00.stdout:5/443: dwrite f11 [4194304,4194304] 0 2026-03-10T12:37:53.628 INFO:tasks.workunit.client.1.vm07.stdout:9/481: sync 2026-03-10T12:37:53.642 INFO:tasks.workunit.client.0.vm00.stdout:1/444: mknod da/d4d/d78/c96 0 2026-03-10T12:37:53.642 INFO:tasks.workunit.client.1.vm07.stdout:9/482: read d5/d13/d22/f36 [91327,127894] 0 2026-03-10T12:37:53.643 INFO:tasks.workunit.client.1.vm07.stdout:5/488: dread d0/d22/d18/f20 [0,4194304] 0 2026-03-10T12:37:53.643 INFO:tasks.workunit.client.0.vm00.stdout:2/434: write d4/d6/d41/d6d/d40/f5e [633232,16890] 0 2026-03-10T12:37:53.647 INFO:tasks.workunit.client.0.vm00.stdout:8/333: write d0/d12/d2d/f33 [989899,63388] 0 2026-03-10T12:37:53.647 INFO:tasks.workunit.client.1.vm07.stdout:9/483: creat d5/d1f/d31/d76/fb0 x:0 0 0 2026-03-10T12:37:53.650 INFO:tasks.workunit.client.0.vm00.stdout:9/455: dwrite d0/d3d/d43/d80/d1e/d2b/f6b [4194304,4194304] 0 2026-03-10T12:37:53.652 INFO:tasks.workunit.client.0.vm00.stdout:5/444: fdatasync d1f/d39/f5f 0 2026-03-10T12:37:53.652 INFO:tasks.workunit.client.1.vm07.stdout:9/484: dread d5/d16/d23/d26/f42 [0,4194304] 0 2026-03-10T12:37:53.655 INFO:tasks.workunit.client.0.vm00.stdout:9/456: dwrite d0/d3d/d43/d80/d1e/d2b/f36 [0,4194304] 0 2026-03-10T12:37:53.660 INFO:tasks.workunit.client.1.vm07.stdout:6/405: sync 2026-03-10T12:37:53.668 INFO:tasks.workunit.client.0.vm00.stdout:5/445: rename d1f/f2c to d1f/d39/f9c 0 2026-03-10T12:37:53.668 INFO:tasks.workunit.client.0.vm00.stdout:5/446: chown d1f/d26/d2b/d35/d53/d5b/l83 30497559 1 2026-03-10T12:37:53.670 INFO:tasks.workunit.client.0.vm00.stdout:2/435: dread d4/dd/ff [0,4194304] 0 2026-03-10T12:37:53.670 INFO:tasks.workunit.client.0.vm00.stdout:2/436: chown d4/dd 93 1 2026-03-10T12:37:53.671 INFO:tasks.workunit.client.0.vm00.stdout:2/437: write d4/dd/f10 [8079787,65610] 0 2026-03-10T12:37:53.678 INFO:tasks.workunit.client.0.vm00.stdout:2/438: fsync d4/d6/f4e 0 2026-03-10T12:37:53.679 
INFO:tasks.workunit.client.1.vm07.stdout:9/485: unlink d5/d13/d57/d4f/c8b 0 2026-03-10T12:37:53.679 INFO:tasks.workunit.client.0.vm00.stdout:9/457: creat d0/f9f x:0 0 0 2026-03-10T12:37:53.680 INFO:tasks.workunit.client.0.vm00.stdout:9/458: read - d0/d3d/d59/f94 zero size 2026-03-10T12:37:53.681 INFO:tasks.workunit.client.0.vm00.stdout:9/459: write d0/d3d/d43/d80/d1e/d27/f9e [199457,2723] 0 2026-03-10T12:37:53.684 INFO:tasks.workunit.client.0.vm00.stdout:5/447: truncate d1f/d26/d2b/d37/f38 7715714 0 2026-03-10T12:37:53.685 INFO:tasks.workunit.client.0.vm00.stdout:1/445: getdents da/d24/d28/d44/d5d/d72 0 2026-03-10T12:37:53.687 INFO:tasks.workunit.client.0.vm00.stdout:2/439: mkdir d4/d53/d76/d9b 0 2026-03-10T12:37:53.688 INFO:tasks.workunit.client.0.vm00.stdout:9/460: fdatasync d0/d3d/d43/d80/d1e/d27/f28 0 2026-03-10T12:37:53.690 INFO:tasks.workunit.client.1.vm07.stdout:5/489: rename d0/d22/d18/d19/d36/d75/f9c to d0/d22/d18/d19/fa8 0 2026-03-10T12:37:53.700 INFO:tasks.workunit.client.1.vm07.stdout:0/496: fsync d0/d14/d5f/d41/d86/f96 0 2026-03-10T12:37:53.701 INFO:tasks.workunit.client.1.vm07.stdout:1/414: write d9/f61 [608777,91929] 0 2026-03-10T12:37:53.701 INFO:tasks.workunit.client.0.vm00.stdout:5/448: rename d1f/d26/d2b/d35/d8b to d1f/d26/d2b/d35/d53/d72/d9d 0 2026-03-10T12:37:53.701 INFO:tasks.workunit.client.0.vm00.stdout:3/481: dread dd/d27/f44 [0,4194304] 0 2026-03-10T12:37:53.701 INFO:tasks.workunit.client.0.vm00.stdout:0/434: truncate d3/d22/d3a/f8c 1065181 0 2026-03-10T12:37:53.701 INFO:tasks.workunit.client.0.vm00.stdout:1/446: truncate da/d24/f81 1473997 0 2026-03-10T12:37:53.701 INFO:tasks.workunit.client.0.vm00.stdout:2/440: dwrite d4/f39 [0,4194304] 0 2026-03-10T12:37:53.701 INFO:tasks.workunit.client.0.vm00.stdout:7/326: write da/d25/d2c/d58/d68/f38 [4850358,57495] 0 2026-03-10T12:37:53.702 INFO:tasks.workunit.client.0.vm00.stdout:2/441: dread d4/d6/d41/f96 [0,4194304] 0 2026-03-10T12:37:53.703 INFO:tasks.workunit.client.1.vm07.stdout:0/497: dwrite 
d0/d14/d5f/d76/f8a [0,4194304] 0 2026-03-10T12:37:53.708 INFO:tasks.workunit.client.0.vm00.stdout:2/442: dwrite d4/f73 [0,4194304] 0 2026-03-10T12:37:53.711 INFO:tasks.workunit.client.0.vm00.stdout:2/443: readlink d4/d53/l59 0 2026-03-10T12:37:53.711 INFO:tasks.workunit.client.0.vm00.stdout:2/444: fsync d4/d53/f61 0 2026-03-10T12:37:53.719 INFO:tasks.workunit.client.0.vm00.stdout:9/461: unlink d0/d3d/d43/d80/f34 0 2026-03-10T12:37:53.721 INFO:tasks.workunit.client.0.vm00.stdout:1/447: fsync da/d12/f62 0 2026-03-10T12:37:53.721 INFO:tasks.workunit.client.0.vm00.stdout:0/435: symlink d3/db/d24/l9a 0 2026-03-10T12:37:53.728 INFO:tasks.workunit.client.0.vm00.stdout:8/334: dwrite d0/f22 [4194304,4194304] 0 2026-03-10T12:37:53.728 INFO:tasks.workunit.client.0.vm00.stdout:4/440: dwrite df/d32/d76/f82 [4194304,4194304] 0 2026-03-10T12:37:53.728 INFO:tasks.workunit.client.0.vm00.stdout:9/462: dwrite d0/d3d/d59/f94 [0,4194304] 0 2026-03-10T12:37:53.729 INFO:tasks.workunit.client.1.vm07.stdout:5/490: mkdir d0/d22/d18/d19/d2e/da9 0 2026-03-10T12:37:53.731 INFO:tasks.workunit.client.0.vm00.stdout:9/463: readlink d0/d3d/d43/d80/l71 0 2026-03-10T12:37:53.732 INFO:tasks.workunit.client.0.vm00.stdout:4/441: write df/d1f/d36/d3a/d41/f33 [723732,95471] 0 2026-03-10T12:37:53.737 INFO:tasks.workunit.client.1.vm07.stdout:1/415: rmdir d9/df/d29 39 2026-03-10T12:37:53.738 INFO:tasks.workunit.client.0.vm00.stdout:4/442: write df/f12 [1784939,73555] 0 2026-03-10T12:37:53.738 INFO:tasks.workunit.client.0.vm00.stdout:4/443: write df/d1f/d36/d3a/d41/f33 [7039492,56364] 0 2026-03-10T12:37:53.739 INFO:tasks.workunit.client.0.vm00.stdout:8/335: dread d0/f22 [4194304,4194304] 0 2026-03-10T12:37:53.740 INFO:tasks.workunit.client.1.vm07.stdout:2/356: dwrite d0/f13 [4194304,4194304] 0 2026-03-10T12:37:53.741 INFO:tasks.workunit.client.1.vm07.stdout:1/416: write d9/df/f58 [648277,74194] 0 2026-03-10T12:37:53.742 INFO:tasks.workunit.client.0.vm00.stdout:8/336: dread - d0/d12/d36/d5b/f69 zero size 
2026-03-10T12:37:53.742 INFO:tasks.workunit.client.1.vm07.stdout:7/414: write d0/d52/f5d [1500884,62619] 0 2026-03-10T12:37:53.743 INFO:tasks.workunit.client.0.vm00.stdout:4/444: fsync df/d1f/d22/d26/d65/f8e 0 2026-03-10T12:37:53.750 INFO:tasks.workunit.client.0.vm00.stdout:0/436: creat d3/d7/d4c/d5b/f9b x:0 0 0 2026-03-10T12:37:53.755 INFO:tasks.workunit.client.0.vm00.stdout:1/448: rmdir da/d24/d28/d44/d5d/d72/d7e 39 2026-03-10T12:37:53.755 INFO:tasks.workunit.client.0.vm00.stdout:6/333: symlink d2/d51/d70/l7c 0 2026-03-10T12:37:53.756 INFO:tasks.workunit.client.0.vm00.stdout:1/449: dread - da/d21/d39/f8c zero size 2026-03-10T12:37:53.757 INFO:tasks.workunit.client.1.vm07.stdout:8/458: dwrite d1/d3/d11/f47 [0,4194304] 0 2026-03-10T12:37:53.761 INFO:tasks.workunit.client.1.vm07.stdout:0/498: readlink d0/d14/d5f/l29 0 2026-03-10T12:37:53.761 INFO:tasks.workunit.client.0.vm00.stdout:1/450: dwrite da/d12/d26/f69 [0,4194304] 0 2026-03-10T12:37:53.776 INFO:tasks.workunit.client.0.vm00.stdout:0/437: fsync d3/d22/f42 0 2026-03-10T12:37:53.776 INFO:tasks.workunit.client.0.vm00.stdout:0/438: dread - d3/d22/f83 zero size 2026-03-10T12:37:53.780 INFO:tasks.workunit.client.1.vm07.stdout:5/491: dread d0/d22/d18/d19/d2e/f88 [0,4194304] 0 2026-03-10T12:37:53.781 INFO:tasks.workunit.client.1.vm07.stdout:3/474: dwrite dc/dd/d28/d3b/f70 [0,4194304] 0 2026-03-10T12:37:53.784 INFO:tasks.workunit.client.1.vm07.stdout:3/475: chown dc/dd/d1f/d6f/c87 704483 1 2026-03-10T12:37:53.786 INFO:tasks.workunit.client.0.vm00.stdout:8/337: rename d0/d12/f34 to d0/d12/d43/f6a 0 2026-03-10T12:37:53.786 INFO:tasks.workunit.client.0.vm00.stdout:7/327: rmdir da/d26 39 2026-03-10T12:37:53.787 INFO:tasks.workunit.client.0.vm00.stdout:5/449: getdents d1f/d6a 0 2026-03-10T12:37:53.787 INFO:tasks.workunit.client.0.vm00.stdout:5/450: readlink d1f/d26/d2b/d35/d78/l95 0 2026-03-10T12:37:53.790 INFO:tasks.workunit.client.0.vm00.stdout:9/464: creat d0/d3d/d43/d80/d1e/d85/d98/fa0 x:0 0 0 2026-03-10T12:37:53.790 
INFO:tasks.workunit.client.0.vm00.stdout:9/465: chown d0/d3d/d43/c4d 2415608 1 2026-03-10T12:37:53.792 INFO:tasks.workunit.client.0.vm00.stdout:4/445: creat df/d63/d94/f96 x:0 0 0 2026-03-10T12:37:53.800 INFO:tasks.workunit.client.0.vm00.stdout:4/446: creat df/d1f/d22/d26/d65/d91/f97 x:0 0 0 2026-03-10T12:37:53.801 INFO:tasks.workunit.client.0.vm00.stdout:4/447: write df/d1f/d36/d3a/f44 [1317086,9120] 0 2026-03-10T12:37:53.803 INFO:tasks.workunit.client.0.vm00.stdout:8/338: unlink d0/lc 0 2026-03-10T12:37:53.805 INFO:tasks.workunit.client.0.vm00.stdout:1/451: rename da/d24/d28/d67/c49 to da/c97 0 2026-03-10T12:37:53.806 INFO:tasks.workunit.client.0.vm00.stdout:8/339: creat d0/d12/d36/d5b/f6b x:0 0 0 2026-03-10T12:37:53.807 INFO:tasks.workunit.client.0.vm00.stdout:5/451: creat d1f/d26/d2b/d37/f9e x:0 0 0 2026-03-10T12:37:53.808 INFO:tasks.workunit.client.0.vm00.stdout:1/452: dread - da/d21/d27/d6a/f6d zero size 2026-03-10T12:37:53.809 INFO:tasks.workunit.client.0.vm00.stdout:5/452: write d1f/d26/d2e/f71 [4771436,52179] 0 2026-03-10T12:37:53.809 INFO:tasks.workunit.client.0.vm00.stdout:1/453: fdatasync da/d24/d28/d67/f52 0 2026-03-10T12:37:53.810 INFO:tasks.workunit.client.0.vm00.stdout:5/453: stat d1f/d6a 0 2026-03-10T12:37:53.814 INFO:tasks.workunit.client.0.vm00.stdout:8/340: dwrite d0/f56 [0,4194304] 0 2026-03-10T12:37:53.814 INFO:tasks.workunit.client.0.vm00.stdout:5/454: dread d1f/f22 [4194304,4194304] 0 2026-03-10T12:37:53.819 INFO:tasks.workunit.client.0.vm00.stdout:8/341: dwrite d0/d12/d36/d5b/f69 [0,4194304] 0 2026-03-10T12:37:53.820 INFO:tasks.workunit.client.1.vm07.stdout:6/406: dread d1/f26 [0,4194304] 0 2026-03-10T12:37:53.820 INFO:tasks.workunit.client.1.vm07.stdout:2/357: unlink d0/f40 0 2026-03-10T12:37:53.828 INFO:tasks.workunit.client.0.vm00.stdout:1/454: mkdir da/d24/d28/d56/d8b/d98 0 2026-03-10T12:37:53.843 INFO:tasks.workunit.client.0.vm00.stdout:8/342: link d0/d12/d2d/f52 d0/d12/d2d/d49/f6c 0 2026-03-10T12:37:53.843 
INFO:tasks.workunit.client.0.vm00.stdout:5/455: link d1f/d26/d2b/d35/f50 d1f/d26/f9f 0 2026-03-10T12:37:53.843 INFO:tasks.workunit.client.0.vm00.stdout:5/456: stat d1f/d26/d2e/d58/d6b/f87 0 2026-03-10T12:37:53.843 INFO:tasks.workunit.client.0.vm00.stdout:5/457: rename d1f/d26/d2e/d58/d6b/f9a to d1f/d26/d2b/d35/d53/d72/fa0 0 2026-03-10T12:37:53.843 INFO:tasks.workunit.client.0.vm00.stdout:1/455: creat da/d12/f99 x:0 0 0 2026-03-10T12:37:53.843 INFO:tasks.workunit.client.0.vm00.stdout:5/458: mknod d1f/d26/d2e/ca1 0 2026-03-10T12:37:53.843 INFO:tasks.workunit.client.0.vm00.stdout:5/459: truncate d1f/d26/d2b/d35/f50 1524464 0 2026-03-10T12:37:53.843 INFO:tasks.workunit.client.0.vm00.stdout:1/456: creat da/d24/d28/d44/d5d/d72/f9a x:0 0 0 2026-03-10T12:37:53.843 INFO:tasks.workunit.client.0.vm00.stdout:1/457: symlink da/d24/d28/d44/d5d/l9b 0 2026-03-10T12:37:53.845 INFO:tasks.workunit.client.1.vm07.stdout:0/499: unlink d0/d14/d5f/d76/d2f/c40 0 2026-03-10T12:37:53.845 INFO:tasks.workunit.client.0.vm00.stdout:1/458: dwrite da/d12/f64 [0,4194304] 0 2026-03-10T12:37:53.849 INFO:tasks.workunit.client.0.vm00.stdout:1/459: creat da/d4d/f9c x:0 0 0 2026-03-10T12:37:53.852 INFO:tasks.workunit.client.0.vm00.stdout:4/448: sync 2026-03-10T12:37:53.853 INFO:tasks.workunit.client.0.vm00.stdout:4/449: write df/d1f/d22/f7d [6326759,44623] 0 2026-03-10T12:37:53.855 INFO:tasks.workunit.client.0.vm00.stdout:4/450: mknod df/d6c/c98 0 2026-03-10T12:37:53.856 INFO:tasks.workunit.client.0.vm00.stdout:4/451: mknod df/d63/d6b/d73/c99 0 2026-03-10T12:37:53.858 INFO:tasks.workunit.client.0.vm00.stdout:4/452: symlink df/d1f/d22/d26/d65/d91/l9a 0 2026-03-10T12:37:53.877 INFO:tasks.workunit.client.0.vm00.stdout:4/453: getdents df/d1f 0 2026-03-10T12:37:53.877 INFO:tasks.workunit.client.0.vm00.stdout:7/328: creat da/d1b/d40/f7d x:0 0 0 2026-03-10T12:37:53.878 INFO:tasks.workunit.client.0.vm00.stdout:6/334: link d2/da/dc/f40 d2/d16/d74/f7d 0 2026-03-10T12:37:53.881 
INFO:tasks.workunit.client.0.vm00.stdout:4/454: link df/c1a df/d32/d76/c9b 0 2026-03-10T12:37:53.886 INFO:tasks.workunit.client.1.vm07.stdout:4/579: symlink d0/d4/d10/d3c/d2b/lc8 0 2026-03-10T12:37:53.891 INFO:tasks.workunit.client.0.vm00.stdout:1/460: dread da/d24/f45 [4194304,4194304] 0 2026-03-10T12:37:53.899 INFO:tasks.workunit.client.0.vm00.stdout:6/335: mknod d2/d16/d74/c7e 0 2026-03-10T12:37:53.904 INFO:tasks.workunit.client.0.vm00.stdout:6/336: dread d2/da/dc/f45 [0,4194304] 0 2026-03-10T12:37:53.918 INFO:tasks.workunit.client.0.vm00.stdout:6/337: sync 2026-03-10T12:37:53.918 INFO:tasks.workunit.client.0.vm00.stdout:6/338: chown d2/d51/d70 1932047080 1 2026-03-10T12:37:53.923 INFO:tasks.workunit.client.0.vm00.stdout:6/339: read d2/da/dc/d2f/f4f [2323094,76683] 0 2026-03-10T12:37:53.942 INFO:tasks.workunit.client.1.vm07.stdout:2/358: dwrite d0/d42/d26/f52 [0,4194304] 0 2026-03-10T12:37:53.958 INFO:tasks.workunit.client.1.vm07.stdout:1/417: creat d9/df/d29/d2c/d59/f84 x:0 0 0 2026-03-10T12:37:53.965 INFO:tasks.workunit.client.1.vm07.stdout:1/418: truncate d9/df/d29/d2c/d59/f84 412448 0 2026-03-10T12:37:53.965 INFO:tasks.workunit.client.1.vm07.stdout:7/415: creat d0/d57/d62/f84 x:0 0 0 2026-03-10T12:37:53.965 INFO:tasks.workunit.client.1.vm07.stdout:0/500: read - d0/d14/d5f/d3b/f5b zero size 2026-03-10T12:37:53.965 INFO:tasks.workunit.client.1.vm07.stdout:9/486: getdents d5/d13/d6c/d7a 0 2026-03-10T12:37:53.973 INFO:tasks.workunit.client.1.vm07.stdout:4/580: fsync d0/d4/d5/da/f44 0 2026-03-10T12:37:53.978 INFO:tasks.workunit.client.1.vm07.stdout:2/359: fdatasync d0/f12 0 2026-03-10T12:37:53.979 INFO:tasks.workunit.client.1.vm07.stdout:2/360: write d0/d42/f53 [1221920,40881] 0 2026-03-10T12:37:53.981 INFO:tasks.workunit.client.1.vm07.stdout:1/419: write d9/df/d29/d2c/d59/f73 [601270,21194] 0 2026-03-10T12:37:53.983 INFO:tasks.workunit.client.1.vm07.stdout:7/416: unlink d0/l43 0 2026-03-10T12:37:53.986 INFO:tasks.workunit.client.1.vm07.stdout:0/501: read d0/f1c 
[3998687,18379] 0 2026-03-10T12:37:53.992 INFO:tasks.workunit.client.0.vm00.stdout:1/461: fsync da/d24/f81 0 2026-03-10T12:37:53.993 INFO:tasks.workunit.client.0.vm00.stdout:9/466: truncate d0/d3d/d43/d80/d1e/d27/f28 668451 0 2026-03-10T12:37:53.996 INFO:tasks.workunit.client.1.vm07.stdout:4/581: chown d0/d4/d10/f36 0 1 2026-03-10T12:37:53.996 INFO:tasks.workunit.client.0.vm00.stdout:1/462: symlink da/d24/d5a/l9d 0 2026-03-10T12:37:53.999 INFO:tasks.workunit.client.1.vm07.stdout:4/582: dread d0/d4/d5/f43 [0,4194304] 0 2026-03-10T12:37:54.001 INFO:tasks.workunit.client.0.vm00.stdout:1/463: dread da/d24/f81 [0,4194304] 0 2026-03-10T12:37:54.001 INFO:tasks.workunit.client.1.vm07.stdout:4/583: read - d0/d4/d10/d3c/d2b/d54/fbf zero size 2026-03-10T12:37:54.003 INFO:tasks.workunit.client.0.vm00.stdout:9/467: creat d0/d3d/d43/d80/fa1 x:0 0 0 2026-03-10T12:37:54.003 INFO:tasks.workunit.client.0.vm00.stdout:1/464: read da/d12/f66 [421586,126468] 0 2026-03-10T12:37:54.006 INFO:tasks.workunit.client.0.vm00.stdout:2/445: write d4/d6/f2e [865771,109834] 0 2026-03-10T12:37:54.006 INFO:tasks.workunit.client.0.vm00.stdout:2/446: stat d4/dd/c25 0 2026-03-10T12:37:54.013 INFO:tasks.workunit.client.1.vm07.stdout:2/361: dread d0/d42/f1b [0,4194304] 0 2026-03-10T12:37:54.013 INFO:tasks.workunit.client.1.vm07.stdout:1/420: creat d9/df/d29/d2b/d3d/f85 x:0 0 0 2026-03-10T12:37:54.013 INFO:tasks.workunit.client.0.vm00.stdout:3/482: truncate dd/d27/d2c/f7d 732046 0 2026-03-10T12:37:54.013 INFO:tasks.workunit.client.0.vm00.stdout:3/483: fdatasync dd/d18/d13/d1d/f86 0 2026-03-10T12:37:54.013 INFO:tasks.workunit.client.0.vm00.stdout:3/484: stat dd/d18/d13/d1d/f5b 0 2026-03-10T12:37:54.013 INFO:tasks.workunit.client.0.vm00.stdout:9/468: symlink d0/d3d/d59/d74/la2 0 2026-03-10T12:37:54.013 INFO:tasks.workunit.client.0.vm00.stdout:3/485: mkdir dd/d18/d13/d99/da5 0 2026-03-10T12:37:54.014 INFO:tasks.workunit.client.0.vm00.stdout:1/465: read da/d21/f74 [45621,127475] 0 2026-03-10T12:37:54.014 
INFO:tasks.workunit.client.0.vm00.stdout:9/469: rename d0/d3d/d59/d4e/d84 to d0/d3d/d59/d4e/da3 0 2026-03-10T12:37:54.017 INFO:tasks.workunit.client.1.vm07.stdout:9/487: creat d5/d16/da3/fb1 x:0 0 0 2026-03-10T12:37:54.017 INFO:tasks.workunit.client.1.vm07.stdout:9/488: chown d5/d16/d23/d26/f5c 8731 1 2026-03-10T12:37:54.024 INFO:tasks.workunit.client.0.vm00.stdout:1/466: link da/fc da/d21/d27/d6a/f9e 0 2026-03-10T12:37:54.027 INFO:tasks.workunit.client.0.vm00.stdout:1/467: dwrite da/d21/d39/f55 [0,4194304] 0 2026-03-10T12:37:54.032 INFO:tasks.workunit.client.1.vm07.stdout:3/476: dread dc/dd/d28/d3b/f9f [0,4194304] 0 2026-03-10T12:37:54.035 INFO:tasks.workunit.client.1.vm07.stdout:6/407: rename d1/d4/f62 to d1/d4/d6/f7e 0 2026-03-10T12:37:54.039 INFO:tasks.workunit.client.0.vm00.stdout:1/468: getdents da/d24/d28/d56/d8b/d98 0 2026-03-10T12:37:54.040 INFO:tasks.workunit.client.1.vm07.stdout:4/584: unlink d0/d4/d10/d3c/d2b/d54/fbf 0 2026-03-10T12:37:54.041 INFO:tasks.workunit.client.0.vm00.stdout:1/469: mknod da/d24/c9f 0 2026-03-10T12:37:54.042 INFO:tasks.workunit.client.0.vm00.stdout:3/486: dread dd/d3d/f50 [0,4194304] 0 2026-03-10T12:37:54.047 INFO:tasks.workunit.client.0.vm00.stdout:1/470: chown da/d4d/d78/c87 1 1 2026-03-10T12:37:54.054 INFO:tasks.workunit.client.0.vm00.stdout:3/487: dread dd/d2a/f78 [0,4194304] 0 2026-03-10T12:37:54.055 INFO:tasks.workunit.client.0.vm00.stdout:7/329: dread da/d41/f4b [0,4194304] 0 2026-03-10T12:37:54.055 INFO:tasks.workunit.client.0.vm00.stdout:3/488: chown dd/d27/d2c/d34/d45/f47 1179057565 1 2026-03-10T12:37:54.055 INFO:tasks.workunit.client.0.vm00.stdout:3/489: dread - dd/d18/d13/f9e zero size 2026-03-10T12:37:54.056 INFO:tasks.workunit.client.1.vm07.stdout:3/477: mknod dc/dd/d28/d7a/d8e/ca8 0 2026-03-10T12:37:54.058 INFO:tasks.workunit.client.0.vm00.stdout:3/490: rename dd/d27/c33 to dd/d27/d2c/d34/d45/ca6 0 2026-03-10T12:37:54.063 INFO:tasks.workunit.client.0.vm00.stdout:3/491: link dd/d2a/f78 dd/d18/d13/d1d/d43/fa7 0 
2026-03-10T12:37:54.064 INFO:tasks.workunit.client.0.vm00.stdout:3/492: truncate dd/d27/d2c/d34/d45/f75 222954 0 2026-03-10T12:37:54.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:53 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:37:54.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:53 vm07.local ceph-mon[58582]: pgmap v162: 65 pgs: 65 active+clean; 1.8 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 37 MiB/s rd, 127 MiB/s wr, 233 op/s 2026-03-10T12:37:54.067 INFO:tasks.workunit.client.1.vm07.stdout:6/408: truncate d1/f3d 515988 0 2026-03-10T12:37:54.069 INFO:tasks.workunit.client.0.vm00.stdout:3/493: fdatasync dd/d18/d13/d1d/f5b 0 2026-03-10T12:37:54.071 INFO:tasks.workunit.client.0.vm00.stdout:7/330: unlink da/d25/f29 0 2026-03-10T12:37:54.073 INFO:tasks.workunit.client.1.vm07.stdout:2/362: symlink d0/d42/d26/d38/d4f/d5d/l7c 0 2026-03-10T12:37:54.073 INFO:tasks.workunit.client.1.vm07.stdout:2/363: chown d0/d45 4072673 1 2026-03-10T12:37:54.077 INFO:tasks.workunit.client.0.vm00.stdout:3/494: symlink dd/la8 0 2026-03-10T12:37:54.080 INFO:tasks.workunit.client.0.vm00.stdout:2/447: dread d4/d6/f2e [0,4194304] 0 2026-03-10T12:37:54.093 INFO:tasks.workunit.client.0.vm00.stdout:2/448: dwrite d4/f6e [0,4194304] 0 2026-03-10T12:37:54.093 INFO:tasks.workunit.client.1.vm07.stdout:8/459: write d1/fc [2791179,92236] 0 2026-03-10T12:37:54.093 INFO:tasks.workunit.client.1.vm07.stdout:5/492: dwrite d0/f70 [0,4194304] 0 2026-03-10T12:37:54.096 INFO:tasks.workunit.client.1.vm07.stdout:5/493: stat d0/d22/d18/d3e/d53/d9e/f76 0 2026-03-10T12:37:54.106 INFO:tasks.workunit.client.0.vm00.stdout:2/449: creat d4/d6/f9c x:0 0 0 2026-03-10T12:37:54.106 INFO:tasks.workunit.client.0.vm00.stdout:8/343: write d0/f9 [8468028,121842] 0 2026-03-10T12:37:54.107 INFO:tasks.workunit.client.0.vm00.stdout:8/344: write d0/f11 [2786960,103235] 0 
2026-03-10T12:37:54.108 INFO:tasks.workunit.client.0.vm00.stdout:8/345: write d0/d5c/f42 [786586,117192] 0 2026-03-10T12:37:54.117 INFO:tasks.workunit.client.0.vm00.stdout:6/340: dwrite d2/d16/f20 [0,4194304] 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.1.vm07.stdout:7/417: link d0/c15 d0/d61/c85 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.1.vm07.stdout:8/460: mknod d1/d3/d6/d50/c93 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.1.vm07.stdout:9/489: creat d5/d16/d23/fb2 x:0 0 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.1.vm07.stdout:0/502: rename d0/d14/d5f/d76/d2f/c3e to d0/d14/d5f/d76/ca5 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.1.vm07.stdout:2/364: mkdir d0/d42/d26/d7d 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.1.vm07.stdout:8/461: readlink d1/d3/l51 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.1.vm07.stdout:9/490: read - d5/d1f/d31/fad zero size 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.1.vm07.stdout:8/462: truncate d1/d3/d6c/f8a 46328 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:0/439: dwrite d3/d22/f2e [0,4194304] 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:3/495: rmdir dd/d18/d13/d1d/d43/d55 39 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:7/331: mknod da/d26/c7e 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:5/460: write d1f/d26/d2b/f5e [401668,41054] 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:2/450: readlink d4/dd/l97 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:7/332: rename da/d25/l53 to da/d26/d50/d73/l7f 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:7/333: fdatasync da/d25/f5a 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:7/334: truncate da/d26/d37/f79 849688 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:5/461: symlink d1f/d26/d2e/d58/d6b/la2 0 2026-03-10T12:37:54.152 
INFO:tasks.workunit.client.0.vm00.stdout:5/462: readlink d1f/d26/d2b/d35/d53/d5b/l83 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:5/463: stat d1f/d26/l92 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:5/464: fsync d1f/d26/d2b/f7e 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:7/335: rename da/d26/c2f to da/d25/c80 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:3/496: mknod dd/d27/d2c/d34/ca9 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:2/451: mknod d4/d6/d2d/c9d 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:5/465: mkdir d1f/d26/d2b/d35/d53/d72/da3 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:9/470: dwrite d0/d3d/d43/d53/d57/f4f [0,4194304] 0 2026-03-10T12:37:54.152 INFO:tasks.workunit.client.0.vm00.stdout:1/471: write da/d24/f45 [3169339,31748] 0 2026-03-10T12:37:54.155 INFO:tasks.workunit.client.0.vm00.stdout:5/466: mkdir d1f/d26/d2b/d37/da4 0 2026-03-10T12:37:54.158 INFO:tasks.workunit.client.0.vm00.stdout:8/346: dread d0/d12/d36/d5b/f65 [0,4194304] 0 2026-03-10T12:37:54.158 INFO:tasks.workunit.client.0.vm00.stdout:2/452: read d4/d6/f75 [4426376,31260] 0 2026-03-10T12:37:54.161 INFO:tasks.workunit.client.0.vm00.stdout:9/471: symlink d0/la4 0 2026-03-10T12:37:54.161 INFO:tasks.workunit.client.0.vm00.stdout:7/336: mkdir da/d41/d48/d81 0 2026-03-10T12:37:54.163 INFO:tasks.workunit.client.1.vm07.stdout:9/491: rmdir d5/d1f/d5e/d6b 39 2026-03-10T12:37:54.164 INFO:tasks.workunit.client.0.vm00.stdout:2/453: truncate d4/d6/d2d/f3d 755973 0 2026-03-10T12:37:54.167 INFO:tasks.workunit.client.0.vm00.stdout:5/467: link d1f/d26/f48 d1f/d26/d2e/fa5 0 2026-03-10T12:37:54.173 INFO:tasks.workunit.client.0.vm00.stdout:9/472: creat d0/d3d/d43/d53/fa5 x:0 0 0 2026-03-10T12:37:54.174 INFO:tasks.workunit.client.1.vm07.stdout:0/503: symlink d0/d14/d5f/d76/d2f/d31/d79/la6 0 2026-03-10T12:37:54.185 
INFO:tasks.workunit.client.0.vm00.stdout:5/468: link c14 d1f/d6a/d94/ca6 0 2026-03-10T12:37:54.185 INFO:tasks.workunit.client.0.vm00.stdout:5/469: readlink d1f/d26/d2b/l3f 0 2026-03-10T12:37:54.186 INFO:tasks.workunit.client.1.vm07.stdout:2/365: symlink d0/d42/d4e/l7e 0 2026-03-10T12:37:54.186 INFO:tasks.workunit.client.1.vm07.stdout:2/366: dwrite d0/d29/d64/d6c/f71 [0,4194304] 0 2026-03-10T12:37:54.186 INFO:tasks.workunit.client.1.vm07.stdout:1/421: rename d9/df/d79/f7e to d9/df/d29/d2b/d31/f86 0 2026-03-10T12:37:54.200 INFO:tasks.workunit.client.1.vm07.stdout:1/422: creat d9/df/d55/f87 x:0 0 0 2026-03-10T12:37:54.202 INFO:tasks.workunit.client.0.vm00.stdout:7/337: rename da/d25/d2c/d58 to da/d25/d2c/d82 0 2026-03-10T12:37:54.202 INFO:tasks.workunit.client.1.vm07.stdout:1/423: mknod d9/df/c88 0 2026-03-10T12:37:54.204 INFO:tasks.workunit.client.0.vm00.stdout:7/338: read da/fb [1472802,121784] 0 2026-03-10T12:37:54.204 INFO:tasks.workunit.client.1.vm07.stdout:1/424: mknod d9/df/d29/d2c/c89 0 2026-03-10T12:37:54.204 INFO:tasks.workunit.client.0.vm00.stdout:0/440: sync 2026-03-10T12:37:54.204 INFO:tasks.workunit.client.0.vm00.stdout:3/497: sync 2026-03-10T12:37:54.205 INFO:tasks.workunit.client.0.vm00.stdout:7/339: chown da/d25/d2e/d4c/l55 76129640 1 2026-03-10T12:37:54.206 INFO:tasks.workunit.client.1.vm07.stdout:1/425: symlink d9/d2d/d4f/d75/l8a 0 2026-03-10T12:37:54.207 INFO:tasks.workunit.client.0.vm00.stdout:0/441: fdatasync d3/d40/d65/f8f 0 2026-03-10T12:37:54.209 INFO:tasks.workunit.client.0.vm00.stdout:3/498: getdents dd/d64 0 2026-03-10T12:37:54.211 INFO:tasks.workunit.client.0.vm00.stdout:3/499: write dd/d27/d2c/d34/d45/f47 [3449588,94089] 0 2026-03-10T12:37:54.211 INFO:tasks.workunit.client.0.vm00.stdout:3/500: dread - dd/d64/fa4 zero size 2026-03-10T12:37:54.212 INFO:tasks.workunit.client.0.vm00.stdout:0/442: getdents d3/d7/d3c/d4b 0 2026-03-10T12:37:54.212 INFO:tasks.workunit.client.0.vm00.stdout:0/443: read - d3/d22/f71 zero size 2026-03-10T12:37:54.213 
INFO:tasks.workunit.client.0.vm00.stdout:0/444: write d3/d7/d4c/d5b/d38/f89 [9154007,108947] 0 2026-03-10T12:37:54.219 INFO:tasks.workunit.client.0.vm00.stdout:0/445: creat d3/f9c x:0 0 0 2026-03-10T12:37:54.233 INFO:tasks.workunit.client.0.vm00.stdout:3/501: dwrite dd/d27/d2c/d34/d45/f75 [0,4194304] 0 2026-03-10T12:37:54.233 INFO:tasks.workunit.client.0.vm00.stdout:0/446: mkdir d3/d7/d4c/d9d 0 2026-03-10T12:37:54.233 INFO:tasks.workunit.client.0.vm00.stdout:0/447: write d3/db/d77/f8a [860798,69936] 0 2026-03-10T12:37:54.233 INFO:tasks.workunit.client.0.vm00.stdout:3/502: creat dd/d4e/faa x:0 0 0 2026-03-10T12:37:54.233 INFO:tasks.workunit.client.0.vm00.stdout:3/503: write dd/d27/d2c/f89 [971331,15437] 0 2026-03-10T12:37:54.233 INFO:tasks.workunit.client.0.vm00.stdout:0/448: creat d3/db/d77/f9e x:0 0 0 2026-03-10T12:37:54.233 INFO:tasks.workunit.client.0.vm00.stdout:3/504: write dd/d64/f98 [1826975,118966] 0 2026-03-10T12:37:54.233 INFO:tasks.workunit.client.0.vm00.stdout:3/505: stat dd/d18/d13/c1c 0 2026-03-10T12:37:54.233 INFO:tasks.workunit.client.0.vm00.stdout:3/506: dread - dd/d18/d13/d1d/d43/f95 zero size 2026-03-10T12:37:54.233 INFO:tasks.workunit.client.0.vm00.stdout:3/507: mknod dd/d64/d93/cab 0 2026-03-10T12:37:54.234 INFO:tasks.workunit.client.0.vm00.stdout:3/508: write dd/d4e/d5d/f71 [3918893,25966] 0 2026-03-10T12:37:54.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:53 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:37:54.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:53 vm00.local ceph-mon[50686]: pgmap v162: 65 pgs: 65 active+clean; 1.8 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 37 MiB/s rd, 127 MiB/s wr, 233 op/s 2026-03-10T12:37:54.234 INFO:tasks.workunit.client.0.vm00.stdout:3/509: stat dd/d18/d13/c26 0 2026-03-10T12:37:54.234 INFO:tasks.workunit.client.0.vm00.stdout:3/510: stat dd/d18/d14/d2b/f31 0 
2026-03-10T12:37:54.236 INFO:tasks.workunit.client.0.vm00.stdout:3/511: mkdir dd/d4e/d6a/dac 0 2026-03-10T12:37:54.238 INFO:tasks.workunit.client.0.vm00.stdout:3/512: write dd/d27/f35 [5507327,8291] 0 2026-03-10T12:37:54.239 INFO:tasks.workunit.client.0.vm00.stdout:0/449: dread d3/d7/d4c/d5b/f2a [0,4194304] 0 2026-03-10T12:37:54.242 INFO:tasks.workunit.client.0.vm00.stdout:0/450: creat d3/d7/f9f x:0 0 0 2026-03-10T12:37:54.243 INFO:tasks.workunit.client.0.vm00.stdout:0/451: write d3/d22/f83 [835141,111945] 0 2026-03-10T12:37:54.245 INFO:tasks.workunit.client.0.vm00.stdout:0/452: creat d3/d7/d58/fa0 x:0 0 0 2026-03-10T12:37:54.246 INFO:tasks.workunit.client.0.vm00.stdout:0/453: fdatasync d3/d7/d4c/d5b/f57 0 2026-03-10T12:37:54.246 INFO:tasks.workunit.client.0.vm00.stdout:0/454: write d3/d7/f11 [8637329,40508] 0 2026-03-10T12:37:54.256 INFO:tasks.workunit.client.0.vm00.stdout:4/455: fsync df/d1f/d22/f7d 0 2026-03-10T12:37:54.257 INFO:tasks.workunit.client.1.vm07.stdout:4/585: sync 2026-03-10T12:37:54.259 INFO:tasks.workunit.client.0.vm00.stdout:4/456: creat df/d1f/d22/d26/f9c x:0 0 0 2026-03-10T12:37:54.265 INFO:tasks.workunit.client.1.vm07.stdout:4/586: unlink d0/d4/d10/d5f/d6d/c6f 0 2026-03-10T12:37:54.265 INFO:tasks.workunit.client.1.vm07.stdout:8/463: sync 2026-03-10T12:37:54.265 INFO:tasks.workunit.client.1.vm07.stdout:9/492: sync 2026-03-10T12:37:54.265 INFO:tasks.workunit.client.1.vm07.stdout:2/367: sync 2026-03-10T12:37:54.265 INFO:tasks.workunit.client.1.vm07.stdout:7/418: sync 2026-03-10T12:37:54.265 INFO:tasks.workunit.client.0.vm00.stdout:4/457: truncate df/d1f/d22/d26/d65/d91/f50 5162477 0 2026-03-10T12:37:54.274 INFO:tasks.workunit.client.1.vm07.stdout:7/419: symlink d0/d67/d6f/l86 0 2026-03-10T12:37:54.275 INFO:tasks.workunit.client.1.vm07.stdout:9/493: dread d5/d16/d23/d26/f5c [0,4194304] 0 2026-03-10T12:37:54.278 INFO:tasks.workunit.client.1.vm07.stdout:8/464: rename d1/d3/d6c/f8a to d1/d3/d40/d92/f94 0 2026-03-10T12:37:54.280 
INFO:tasks.workunit.client.1.vm07.stdout:4/587: symlink d0/d4/d5/d78/dc5/lc9 0 2026-03-10T12:37:54.288 INFO:tasks.workunit.client.1.vm07.stdout:9/494: symlink d5/d13/d6c/da4/lb3 0 2026-03-10T12:37:54.289 INFO:tasks.workunit.client.1.vm07.stdout:9/495: chown d5/d13/d57/f95 13 1 2026-03-10T12:37:54.293 INFO:tasks.workunit.client.1.vm07.stdout:9/496: dread d5/d13/d22/f36 [0,4194304] 0 2026-03-10T12:37:54.299 INFO:tasks.workunit.client.0.vm00.stdout:4/458: sync 2026-03-10T12:37:54.303 INFO:tasks.workunit.client.1.vm07.stdout:4/588: dread d0/d4/d10/d3c/f68 [0,4194304] 0 2026-03-10T12:37:54.313 INFO:tasks.workunit.client.1.vm07.stdout:2/368: rename d0/d42/d26/d4b/f6d to d0/d42/d26/d7d/f7f 0 2026-03-10T12:37:54.324 INFO:tasks.workunit.client.1.vm07.stdout:8/465: truncate d1/d3/d6/f4f 1692313 0 2026-03-10T12:37:54.325 INFO:tasks.workunit.client.1.vm07.stdout:8/466: write d1/d3/d6/d50/d70/f7f [320152,28633] 0 2026-03-10T12:37:54.325 INFO:tasks.workunit.client.1.vm07.stdout:2/369: chown d0/c37 9 1 2026-03-10T12:37:54.333 INFO:tasks.workunit.client.0.vm00.stdout:2/454: dread d4/d6/d2d/d3a/d43/d51/f6f [0,4194304] 0 2026-03-10T12:37:54.333 INFO:tasks.workunit.client.1.vm07.stdout:4/589: mknod d0/d19/cca 0 2026-03-10T12:37:54.338 INFO:tasks.workunit.client.0.vm00.stdout:2/455: rename d4/d6/d2d/d3a/d43/d51 to d4/d53/d9e 0 2026-03-10T12:37:54.338 INFO:tasks.workunit.client.0.vm00.stdout:2/456: readlink d4/d6/d41/l82 0 2026-03-10T12:37:54.351 INFO:tasks.workunit.client.0.vm00.stdout:0/455: fsync d3/d7/d4c/d5b/d38/f89 0 2026-03-10T12:37:54.361 INFO:tasks.workunit.client.1.vm07.stdout:3/478: write dc/dd/d1f/f91 [867638,123360] 0 2026-03-10T12:37:54.362 INFO:tasks.workunit.client.1.vm07.stdout:3/479: write dc/dd/f96 [4288100,2946] 0 2026-03-10T12:37:54.364 INFO:tasks.workunit.client.1.vm07.stdout:8/467: dread d1/f3d [0,4194304] 0 2026-03-10T12:37:54.364 INFO:tasks.workunit.client.1.vm07.stdout:6/409: write d1/d4/d6/d16/d1a/f29 [498809,50585] 0 2026-03-10T12:37:54.364 
INFO:tasks.workunit.client.1.vm07.stdout:8/468: chown d1/d3/d11/f46 125598 1 2026-03-10T12:37:54.367 INFO:tasks.workunit.client.0.vm00.stdout:8/347: dread d0/f11 [0,4194304] 0 2026-03-10T12:37:54.371 INFO:tasks.workunit.client.0.vm00.stdout:9/473: write d0/d3d/d43/d80/d1e/f60 [42485,34322] 0 2026-03-10T12:37:54.371 INFO:tasks.workunit.client.1.vm07.stdout:8/469: dwrite d1/d3/d6/d54/f7d [0,4194304] 0 2026-03-10T12:37:54.372 INFO:tasks.workunit.client.0.vm00.stdout:9/474: write d0/d3d/d43/d80/d1e/d2b/f36 [2371664,99641] 0 2026-03-10T12:37:54.389 INFO:tasks.workunit.client.0.vm00.stdout:9/475: rmdir d0/d3d/d43 39 2026-03-10T12:37:54.391 INFO:tasks.workunit.client.1.vm07.stdout:6/410: unlink d1/f26 0 2026-03-10T12:37:54.392 INFO:tasks.workunit.client.0.vm00.stdout:6/341: dwrite d2/d14/f1b [0,4194304] 0 2026-03-10T12:37:54.394 INFO:tasks.workunit.client.0.vm00.stdout:0/456: getdents d3/d22 0 2026-03-10T12:37:54.394 INFO:tasks.workunit.client.0.vm00.stdout:6/342: chown d2/da/dc/d2f/f56 8 1 2026-03-10T12:37:54.394 INFO:tasks.workunit.client.0.vm00.stdout:0/457: chown d3/d7/d58/fa0 3519183 1 2026-03-10T12:37:54.397 INFO:tasks.workunit.client.1.vm07.stdout:6/411: dwrite d1/d4/d6/d46/d4d/fb [0,4194304] 0 2026-03-10T12:37:54.401 INFO:tasks.workunit.client.0.vm00.stdout:0/458: dwrite d3/d22/f2e [0,4194304] 0 2026-03-10T12:37:54.404 INFO:tasks.workunit.client.0.vm00.stdout:0/459: dwrite d3/d7/d58/fa0 [0,4194304] 0 2026-03-10T12:37:54.409 INFO:tasks.workunit.client.0.vm00.stdout:5/470: link d1f/d26/d2b/d37/f38 d1f/d26/d2b/d35/d53/fa7 0 2026-03-10T12:37:54.413 INFO:tasks.workunit.client.0.vm00.stdout:6/343: symlink d2/da/dc/d2f/l7f 0 2026-03-10T12:37:54.415 INFO:tasks.workunit.client.0.vm00.stdout:3/513: truncate dd/d27/f56 2200629 0 2026-03-10T12:37:54.418 INFO:tasks.workunit.client.1.vm07.stdout:5/494: write d0/d22/d18/d19/d21/f42 [3952375,87433] 0 2026-03-10T12:37:54.425 INFO:tasks.workunit.client.0.vm00.stdout:3/514: dread dd/d27/d2c/d34/d38/f48 [0,4194304] 0 
2026-03-10T12:37:54.427 INFO:tasks.workunit.client.0.vm00.stdout:2/457: write d4/dd/ff [1132128,67928] 0 2026-03-10T12:37:54.439 INFO:tasks.workunit.client.0.vm00.stdout:8/348: unlink d0/d12/d17/c26 0 2026-03-10T12:37:54.443 INFO:tasks.workunit.client.1.vm07.stdout:0/504: write d0/d14/d5f/d76/d2f/d31/f4d [4872268,29461] 0 2026-03-10T12:37:54.443 INFO:tasks.workunit.client.0.vm00.stdout:5/471: symlink d1f/d39/la8 0 2026-03-10T12:37:54.443 INFO:tasks.workunit.client.0.vm00.stdout:9/476: unlink d0/d5/dc/f41 0 2026-03-10T12:37:54.446 INFO:tasks.workunit.client.0.vm00.stdout:8/349: truncate d0/d12/d17/f63 299152 0 2026-03-10T12:37:54.446 INFO:tasks.workunit.client.0.vm00.stdout:8/350: write d0/f56 [2054695,19689] 0 2026-03-10T12:37:54.447 INFO:tasks.workunit.client.0.vm00.stdout:9/477: chown d0/d3d/d43/d80/d1e/d85/c9d 6 1 2026-03-10T12:37:54.448 INFO:tasks.workunit.client.0.vm00.stdout:5/472: rename d1f/d26/d2e/f75 to d1f/d26/d6f/fa9 0 2026-03-10T12:37:54.449 INFO:tasks.workunit.client.0.vm00.stdout:5/473: stat d1f/d39/l47 0 2026-03-10T12:37:54.451 INFO:tasks.workunit.client.0.vm00.stdout:3/515: truncate dd/d4e/d5d/f81 152435 0 2026-03-10T12:37:54.453 INFO:tasks.workunit.client.0.vm00.stdout:8/351: creat d0/dd/d38/f6d x:0 0 0 2026-03-10T12:37:54.453 INFO:tasks.workunit.client.0.vm00.stdout:9/478: chown d0/d3d/c63 5 1 2026-03-10T12:37:54.455 INFO:tasks.workunit.client.0.vm00.stdout:8/352: mkdir d0/d46/d6e 0 2026-03-10T12:37:54.461 INFO:tasks.workunit.client.0.vm00.stdout:7/340: write da/d1b/f22 [397485,50920] 0 2026-03-10T12:37:54.462 INFO:tasks.workunit.client.1.vm07.stdout:5/495: truncate d0/d22/d18/d19/d21/f38 4146305 0 2026-03-10T12:37:54.462 INFO:tasks.workunit.client.1.vm07.stdout:1/426: readlink d9/df/d29/d2b/d31/l45 0 2026-03-10T12:37:54.463 INFO:tasks.workunit.client.0.vm00.stdout:3/516: creat dd/d3d/d65/fad x:0 0 0 2026-03-10T12:37:54.463 INFO:tasks.workunit.client.0.vm00.stdout:3/517: chown dd/d64/l7e 77523012 1 2026-03-10T12:37:54.475 
INFO:tasks.workunit.client.1.vm07.stdout:8/470: link d1/d3/d6/d50/c93 d1/d3/d11/c95 0 2026-03-10T12:37:54.476 INFO:tasks.workunit.client.1.vm07.stdout:3/480: getdents dc/dd/d1f/d45 0 2026-03-10T12:37:54.477 INFO:tasks.workunit.client.1.vm07.stdout:8/471: chown d1/d3/d40/d92 27177 1 2026-03-10T12:37:54.477 INFO:tasks.workunit.client.1.vm07.stdout:5/496: creat d0/d22/d18/d3e/d53/faa x:0 0 0 2026-03-10T12:37:54.479 INFO:tasks.workunit.client.1.vm07.stdout:7/420: dwrite d0/d47/d48/f4b [0,4194304] 0 2026-03-10T12:37:54.496 INFO:tasks.workunit.client.0.vm00.stdout:6/344: mkdir d2/d42/d80 0 2026-03-10T12:37:54.497 INFO:tasks.workunit.client.0.vm00.stdout:7/341: creat da/d41/d7b/f83 x:0 0 0 2026-03-10T12:37:54.497 INFO:tasks.workunit.client.0.vm00.stdout:7/342: chown da/d25/d2c/c3a 2522 1 2026-03-10T12:37:54.498 INFO:tasks.workunit.client.0.vm00.stdout:7/343: write da/d1b/d40/f74 [999711,672] 0 2026-03-10T12:37:54.503 INFO:tasks.workunit.client.0.vm00.stdout:4/459: dread df/f1b [0,4194304] 0 2026-03-10T12:37:54.503 INFO:tasks.workunit.client.1.vm07.stdout:1/427: read d9/fd [3994224,49815] 0 2026-03-10T12:37:54.504 INFO:tasks.workunit.client.0.vm00.stdout:4/460: creat df/d63/d77/f9d x:0 0 0 2026-03-10T12:37:54.505 INFO:tasks.workunit.client.0.vm00.stdout:4/461: mkdir df/d93/d9e 0 2026-03-10T12:37:54.506 INFO:tasks.workunit.client.1.vm07.stdout:9/497: dwrite d5/d1f/f3d [0,4194304] 0 2026-03-10T12:37:54.510 INFO:tasks.workunit.client.0.vm00.stdout:4/462: dwrite f8 [0,4194304] 0 2026-03-10T12:37:54.515 INFO:tasks.workunit.client.0.vm00.stdout:4/463: mknod df/d32/d76/c9f 0 2026-03-10T12:37:54.517 INFO:tasks.workunit.client.1.vm07.stdout:3/481: creat dc/dd/d43/d5c/fa9 x:0 0 0 2026-03-10T12:37:54.517 INFO:tasks.workunit.client.1.vm07.stdout:2/370: write d0/f46 [872528,72781] 0 2026-03-10T12:37:54.520 INFO:tasks.workunit.client.0.vm00.stdout:4/464: truncate df/f29 582712 0 2026-03-10T12:37:54.520 INFO:tasks.workunit.client.1.vm07.stdout:3/482: chown dc/d18/d2d/d3d/c7b 1537711 1 
2026-03-10T12:37:54.520 INFO:tasks.workunit.client.0.vm00.stdout:4/465: chown df/d1f/d36/f92 75548 1 2026-03-10T12:37:54.523 INFO:tasks.workunit.client.1.vm07.stdout:0/505: creat d0/d14/d5f/d76/d2f/d31/d4f/fa7 x:0 0 0 2026-03-10T12:37:54.532 INFO:tasks.workunit.client.0.vm00.stdout:1/472: dread da/d24/f45 [0,4194304] 0 2026-03-10T12:37:54.538 INFO:tasks.workunit.client.0.vm00.stdout:2/458: dwrite d4/d6/d41/f4c [0,4194304] 0 2026-03-10T12:37:54.542 INFO:tasks.workunit.client.0.vm00.stdout:4/466: sync 2026-03-10T12:37:54.544 INFO:tasks.workunit.client.0.vm00.stdout:7/344: link da/d3f/d71/l77 da/d25/d2c/d82/l84 0 2026-03-10T12:37:54.545 INFO:tasks.workunit.client.0.vm00.stdout:2/459: fdatasync f1 0 2026-03-10T12:37:54.548 INFO:tasks.workunit.client.0.vm00.stdout:5/474: write d1f/d26/d2b/f52 [3357937,46274] 0 2026-03-10T12:37:54.548 INFO:tasks.workunit.client.0.vm00.stdout:7/345: creat da/d3f/d60/f85 x:0 0 0 2026-03-10T12:37:54.549 INFO:tasks.workunit.client.1.vm07.stdout:3/483: rmdir dc 39 2026-03-10T12:37:54.550 INFO:tasks.workunit.client.0.vm00.stdout:9/479: dwrite d0/d5/f26 [0,4194304] 0 2026-03-10T12:37:54.551 INFO:tasks.workunit.client.0.vm00.stdout:9/480: chown d0/f17 87706 1 2026-03-10T12:37:54.552 INFO:tasks.workunit.client.0.vm00.stdout:4/467: creat df/d57/fa0 x:0 0 0 2026-03-10T12:37:54.553 INFO:tasks.workunit.client.0.vm00.stdout:9/481: chown d0/d3d/d43/d53/d57/f4f 13513282 1 2026-03-10T12:37:54.553 INFO:tasks.workunit.client.0.vm00.stdout:3/518: write dd/d27/f91 [148361,16806] 0 2026-03-10T12:37:54.554 INFO:tasks.workunit.client.1.vm07.stdout:9/498: creat d5/d1f/d5e/d6b/fb4 x:0 0 0 2026-03-10T12:37:54.554 INFO:tasks.workunit.client.1.vm07.stdout:1/428: creat d9/df/d29/f8b x:0 0 0 2026-03-10T12:37:54.558 INFO:tasks.workunit.client.1.vm07.stdout:1/429: unlink d9/f6c 0 2026-03-10T12:37:54.558 INFO:tasks.workunit.client.0.vm00.stdout:3/519: dwrite dd/d18/d13/f22 [0,4194304] 0 2026-03-10T12:37:54.559 INFO:tasks.workunit.client.1.vm07.stdout:3/484: creat 
dc/dd/d43/d76/d95/da0/faa x:0 0 0 2026-03-10T12:37:54.559 INFO:tasks.workunit.client.1.vm07.stdout:1/430: mknod d9/d2d/c8c 0 2026-03-10T12:37:54.563 INFO:tasks.workunit.client.0.vm00.stdout:5/475: rename d1f/d26/d6f/l7d to d1f/d6a/d94/laa 0 2026-03-10T12:37:54.568 INFO:tasks.workunit.client.1.vm07.stdout:3/485: rename dc/dd/d1f/f91 to dc/dd/d28/d7a/fab 0 2026-03-10T12:37:54.569 INFO:tasks.workunit.client.0.vm00.stdout:7/346: rename da/d41/d48/l76 to da/d25/d2c/l86 0 2026-03-10T12:37:54.570 INFO:tasks.workunit.client.1.vm07.stdout:9/499: dwrite d5/d13/d57/d3e/fa9 [4194304,4194304] 0 2026-03-10T12:37:54.571 INFO:tasks.workunit.client.0.vm00.stdout:2/460: getdents d4/d6/d93 0 2026-03-10T12:37:54.575 INFO:tasks.workunit.client.0.vm00.stdout:2/461: dwrite d4/dd/ff [0,4194304] 0 2026-03-10T12:37:54.581 INFO:tasks.workunit.client.0.vm00.stdout:2/462: dwrite d4/d6/d41/d6d/d40/f80 [0,4194304] 0 2026-03-10T12:37:54.582 INFO:tasks.workunit.client.1.vm07.stdout:3/486: mkdir dc/dd/d1f/dac 0 2026-03-10T12:37:54.584 INFO:tasks.workunit.client.0.vm00.stdout:7/347: mkdir da/d47/d87 0 2026-03-10T12:37:54.585 INFO:tasks.workunit.client.1.vm07.stdout:1/431: creat d9/d2d/d80/f8d x:0 0 0 2026-03-10T12:37:54.585 INFO:tasks.workunit.client.1.vm07.stdout:3/487: fdatasync dc/d18/d24/f2c 0 2026-03-10T12:37:54.588 INFO:tasks.workunit.client.0.vm00.stdout:5/476: truncate d1f/f32 779611 0 2026-03-10T12:37:54.597 INFO:tasks.workunit.client.0.vm00.stdout:4/468: truncate df/d1f/d22/f7d 2882694 0 2026-03-10T12:37:54.597 INFO:tasks.workunit.client.1.vm07.stdout:9/500: symlink d5/d13/d6c/lb5 0 2026-03-10T12:37:54.597 INFO:tasks.workunit.client.1.vm07.stdout:1/432: write d9/df/d29/d2c/f7f [3455700,112000] 0 2026-03-10T12:37:54.598 INFO:tasks.workunit.client.1.vm07.stdout:7/421: dread d0/f40 [0,4194304] 0 2026-03-10T12:37:54.598 INFO:tasks.workunit.client.1.vm07.stdout:1/433: mkdir d9/d2d/d80/d8e 0 2026-03-10T12:37:54.599 INFO:tasks.workunit.client.1.vm07.stdout:1/434: chown d9/df/d55/f6f 374408945 1 
2026-03-10T12:37:54.602 INFO:tasks.workunit.client.1.vm07.stdout:2/371: dread d0/d42/d26/f3e [0,4194304] 0 2026-03-10T12:37:54.604 INFO:tasks.workunit.client.1.vm07.stdout:9/501: unlink d5/d13/f1b 0 2026-03-10T12:37:54.608 INFO:tasks.workunit.client.1.vm07.stdout:7/422: unlink d0/f39 0 2026-03-10T12:37:54.608 INFO:tasks.workunit.client.1.vm07.stdout:2/372: rmdir d0/d42/d1f 39 2026-03-10T12:37:54.609 INFO:tasks.workunit.client.1.vm07.stdout:9/502: unlink d5/d1f/d5e/d6b/c7b 0 2026-03-10T12:37:54.609 INFO:tasks.workunit.client.1.vm07.stdout:2/373: chown d0/d42/d26/d4b 1 1 2026-03-10T12:37:54.609 INFO:tasks.workunit.client.0.vm00.stdout:7/348: sync 2026-03-10T12:37:54.610 INFO:tasks.workunit.client.1.vm07.stdout:2/374: fdatasync d0/d42/d4e/d77/f60 0 2026-03-10T12:37:54.610 INFO:tasks.workunit.client.0.vm00.stdout:7/349: write da/d1b/f22 [1328878,90015] 0 2026-03-10T12:37:54.611 INFO:tasks.workunit.client.0.vm00.stdout:3/520: dwrite dd/f25 [0,4194304] 0 2026-03-10T12:37:54.612 INFO:tasks.workunit.client.1.vm07.stdout:1/435: read d9/df/d29/d2b/f4e [3597476,3189] 0 2026-03-10T12:37:54.613 INFO:tasks.workunit.client.0.vm00.stdout:5/477: write d1f/d26/d2b/d35/d53/d72/fa0 [433321,58700] 0 2026-03-10T12:37:54.613 INFO:tasks.workunit.client.0.vm00.stdout:5/478: stat d1f/d26/d2b/l3f 0 2026-03-10T12:37:54.614 INFO:tasks.workunit.client.0.vm00.stdout:5/479: stat d1f/d26/f48 0 2026-03-10T12:37:54.614 INFO:tasks.workunit.client.1.vm07.stdout:7/423: dwrite d0/d47/f59 [0,4194304] 0 2026-03-10T12:37:54.615 INFO:tasks.workunit.client.1.vm07.stdout:7/424: dread - d0/f7b zero size 2026-03-10T12:37:54.617 INFO:tasks.workunit.client.0.vm00.stdout:4/469: mknod df/d1f/d22/d26/d65/d91/ca1 0 2026-03-10T12:37:54.618 INFO:tasks.workunit.client.0.vm00.stdout:4/470: write f3 [4897877,90862] 0 2026-03-10T12:37:54.618 INFO:tasks.workunit.client.0.vm00.stdout:4/471: chown c7 2875 1 2026-03-10T12:37:54.629 INFO:tasks.workunit.client.1.vm07.stdout:9/503: fdatasync d5/d13/f2b 0 2026-03-10T12:37:54.629 
INFO:tasks.workunit.client.1.vm07.stdout:1/436: rmdir d9/df/d54 39 2026-03-10T12:37:54.629 INFO:tasks.workunit.client.1.vm07.stdout:4/590: dwrite d0/d4/d5/da/f6e [0,4194304] 0 2026-03-10T12:37:54.629 INFO:tasks.workunit.client.0.vm00.stdout:4/472: chown df/f3d 5351 1 2026-03-10T12:37:54.629 INFO:tasks.workunit.client.0.vm00.stdout:9/482: rename d0/c6 to d0/d3d/d59/d74/ca6 0 2026-03-10T12:37:54.629 INFO:tasks.workunit.client.0.vm00.stdout:2/463: symlink d4/d53/d68/l9f 0 2026-03-10T12:37:54.636 INFO:tasks.workunit.client.1.vm07.stdout:2/375: mkdir d0/d80 0 2026-03-10T12:37:54.636 INFO:tasks.workunit.client.0.vm00.stdout:4/473: mkdir df/d1f/d22/d26/d65/d91/da2 0 2026-03-10T12:37:54.637 INFO:tasks.workunit.client.0.vm00.stdout:4/474: write df/f12 [4263021,117218] 0 2026-03-10T12:37:54.637 INFO:tasks.workunit.client.0.vm00.stdout:4/475: readlink df/d1f/l21 0 2026-03-10T12:37:54.640 INFO:tasks.workunit.client.0.vm00.stdout:9/483: creat d0/d3d/d43/d80/d1e/d85/d98/fa7 x:0 0 0 2026-03-10T12:37:54.642 INFO:tasks.workunit.client.1.vm07.stdout:1/437: dread d9/df/f10 [0,4194304] 0 2026-03-10T12:37:54.647 INFO:tasks.workunit.client.0.vm00.stdout:9/484: rmdir d0/d3d/d43/d80/d1e/d85 39 2026-03-10T12:37:54.649 INFO:tasks.workunit.client.1.vm07.stdout:9/504: unlink d5/d13/d22/f36 0 2026-03-10T12:37:54.652 INFO:tasks.workunit.client.0.vm00.stdout:7/350: getdents da/d41 0 2026-03-10T12:37:54.653 INFO:tasks.workunit.client.0.vm00.stdout:7/351: write da/d26/d37/f79 [846647,68298] 0 2026-03-10T12:37:54.656 INFO:tasks.workunit.client.1.vm07.stdout:4/591: creat d0/d4/d5/da/fcb x:0 0 0 2026-03-10T12:37:54.660 INFO:tasks.workunit.client.0.vm00.stdout:7/352: creat da/d3f/d60/f88 x:0 0 0 2026-03-10T12:37:54.662 INFO:tasks.workunit.client.1.vm07.stdout:2/376: truncate d0/f12 4973475 0 2026-03-10T12:37:54.663 INFO:tasks.workunit.client.1.vm07.stdout:1/438: symlink d9/d2d/d80/l8f 0 2026-03-10T12:37:54.663 INFO:tasks.workunit.client.1.vm07.stdout:9/505: creat d5/d13/d6c/fb6 x:0 0 0 
2026-03-10T12:37:54.664 INFO:tasks.workunit.client.0.vm00.stdout:4/476: rename df/d32/d76/c7b to df/d1f/d22/d26/ca3 0 2026-03-10T12:37:54.668 INFO:tasks.workunit.client.0.vm00.stdout:9/485: creat d0/d7f/d88/fa8 x:0 0 0 2026-03-10T12:37:54.676 INFO:tasks.workunit.client.0.vm00.stdout:9/486: dwrite d0/d3d/d59/d4e/f6f [0,4194304] 0 2026-03-10T12:37:54.677 INFO:tasks.workunit.client.0.vm00.stdout:4/477: link df/d1f/l35 df/la4 0 2026-03-10T12:37:54.678 INFO:tasks.workunit.client.0.vm00.stdout:4/478: dread - df/d1f/d22/d26/f9c zero size 2026-03-10T12:37:54.682 INFO:tasks.workunit.client.0.vm00.stdout:1/473: dwrite da/d21/d27/d6a/f9e [0,4194304] 0 2026-03-10T12:37:54.685 INFO:tasks.workunit.client.0.vm00.stdout:4/479: rename df/d1f/d22/d26/c6a to df/d1f/d36/d3a/d41/ca5 0 2026-03-10T12:37:54.685 INFO:tasks.workunit.client.0.vm00.stdout:4/480: chown df/f4f 123313 1 2026-03-10T12:37:54.685 INFO:tasks.workunit.client.0.vm00.stdout:4/481: chown df/d8a 2 1 2026-03-10T12:37:54.688 INFO:tasks.workunit.client.1.vm07.stdout:7/425: getdents d0/d52 0 2026-03-10T12:37:54.697 INFO:tasks.workunit.client.0.vm00.stdout:9/487: symlink d0/d3d/d43/d80/d1e/d85/la9 0 2026-03-10T12:37:54.697 INFO:tasks.workunit.client.1.vm07.stdout:7/426: write d0/d57/d62/f6c [2332956,75776] 0 2026-03-10T12:37:54.697 INFO:tasks.workunit.client.1.vm07.stdout:7/427: rename d0/d47 to d0/d47/d87 22 2026-03-10T12:37:54.697 INFO:tasks.workunit.client.0.vm00.stdout:1/474: dwrite da/d12/f1d [0,4194304] 0 2026-03-10T12:37:54.699 INFO:tasks.workunit.client.0.vm00.stdout:1/475: dread - da/d21/d39/f92 zero size 2026-03-10T12:37:54.702 INFO:tasks.workunit.client.1.vm07.stdout:4/592: creat d0/d4/d10/d3c/d2b/d2d/d9c/fcc x:0 0 0 2026-03-10T12:37:54.703 INFO:tasks.workunit.client.0.vm00.stdout:4/482: symlink df/la6 0 2026-03-10T12:37:54.706 INFO:tasks.workunit.client.0.vm00.stdout:4/483: dwrite f9 [4194304,4194304] 0 2026-03-10T12:37:54.714 INFO:tasks.workunit.client.0.vm00.stdout:4/484: write f3 [5808063,30912] 0 
2026-03-10T12:37:54.714 INFO:tasks.workunit.client.0.vm00.stdout:1/476: creat da/d21/d27/fa0 x:0 0 0 2026-03-10T12:37:54.714 INFO:tasks.workunit.client.0.vm00.stdout:1/477: write da/d24/d28/d44/f7a [630317,90123] 0 2026-03-10T12:37:54.714 INFO:tasks.workunit.client.0.vm00.stdout:1/478: dwrite da/d24/d28/d44/d5d/d72/f9a [0,4194304] 0 2026-03-10T12:37:54.716 INFO:tasks.workunit.client.0.vm00.stdout:1/479: stat da/d21 0 2026-03-10T12:37:54.724 INFO:tasks.workunit.client.1.vm07.stdout:2/377: creat d0/d42/d4e/f81 x:0 0 0 2026-03-10T12:37:54.725 INFO:tasks.workunit.client.0.vm00.stdout:1/480: unlink da/d12/f30 0 2026-03-10T12:37:54.727 INFO:tasks.workunit.client.0.vm00.stdout:1/481: creat da/d24/d28/d56/d8b/fa1 x:0 0 0 2026-03-10T12:37:54.727 INFO:tasks.workunit.client.0.vm00.stdout:1/482: readlink da/l4c 0 2026-03-10T12:37:54.729 INFO:tasks.workunit.client.0.vm00.stdout:1/483: rename da/d4d to da/d24/d28/d67/da2 0 2026-03-10T12:37:54.729 INFO:tasks.workunit.client.0.vm00.stdout:1/484: chown da/f14 831513 1 2026-03-10T12:37:54.732 INFO:tasks.workunit.client.0.vm00.stdout:1/485: dwrite da/d24/d28/d44/f83 [0,4194304] 0 2026-03-10T12:37:54.733 INFO:tasks.workunit.client.1.vm07.stdout:7/428: dread - d0/f5f zero size 2026-03-10T12:37:54.740 INFO:tasks.workunit.client.0.vm00.stdout:5/480: write d1f/d26/d2b/d35/f50 [1405656,127996] 0 2026-03-10T12:37:54.743 INFO:tasks.workunit.client.0.vm00.stdout:5/481: dwrite d1f/d26/d2b/d35/f68 [0,4194304] 0 2026-03-10T12:37:54.744 INFO:tasks.workunit.client.1.vm07.stdout:2/378: mknod d0/d45/d54/c82 0 2026-03-10T12:37:54.744 INFO:tasks.workunit.client.0.vm00.stdout:5/482: stat l17 0 2026-03-10T12:37:54.745 INFO:tasks.workunit.client.0.vm00.stdout:5/483: chown d1f/d26/d2b/d35/d53/d72/d9d/f88 19986 1 2026-03-10T12:37:54.755 INFO:tasks.workunit.client.1.vm07.stdout:7/429: fdatasync d0/f5f 0 2026-03-10T12:37:54.782 INFO:tasks.workunit.client.0.vm00.stdout:3/521: truncate dd/d18/d13/d1d/f42 1972610 0 2026-03-10T12:37:54.782 
INFO:tasks.workunit.client.0.vm00.stdout:5/484: getdents d1f/d26/d2b/d35/d78 0 2026-03-10T12:37:54.783 INFO:tasks.workunit.client.1.vm07.stdout:7/430: readlink d0/d67/d6f/l76 0 2026-03-10T12:37:54.783 INFO:tasks.workunit.client.1.vm07.stdout:7/431: symlink d0/d61/d79/l88 0 2026-03-10T12:37:54.783 INFO:tasks.workunit.client.1.vm07.stdout:4/593: link d0/d4/d7a/f87 d0/d4/d7a/fcd 0 2026-03-10T12:37:54.783 INFO:tasks.workunit.client.1.vm07.stdout:4/594: write d0/d4/d10/d3c/d2b/d2d/d9c/fcc [407450,114349] 0 2026-03-10T12:37:54.783 INFO:tasks.workunit.client.1.vm07.stdout:7/432: creat d0/d67/f89 x:0 0 0 2026-03-10T12:37:54.783 INFO:tasks.workunit.client.1.vm07.stdout:7/433: mkdir d0/d47/d48/d8a 0 2026-03-10T12:37:54.783 INFO:tasks.workunit.client.1.vm07.stdout:7/434: dread d0/d47/d48/f4b [0,4194304] 0 2026-03-10T12:37:54.783 INFO:tasks.workunit.client.1.vm07.stdout:2/379: link d0/d42/d26/d38/d4f/d62/c63 d0/d42/d26/d7d/c83 0 2026-03-10T12:37:54.786 INFO:tasks.workunit.client.1.vm07.stdout:7/435: dwrite d0/f21 [0,4194304] 0 2026-03-10T12:37:54.788 INFO:tasks.workunit.client.0.vm00.stdout:8/353: dread d0/d12/d2d/f52 [0,4194304] 0 2026-03-10T12:37:54.791 INFO:tasks.workunit.client.0.vm00.stdout:8/354: rename d0/f28 to d0/d12/d2d/f6f 0 2026-03-10T12:37:54.793 INFO:tasks.workunit.client.0.vm00.stdout:8/355: dread d0/d5c/f42 [0,4194304] 0 2026-03-10T12:37:54.794 INFO:tasks.workunit.client.0.vm00.stdout:8/356: creat d0/d46/d6e/f70 x:0 0 0 2026-03-10T12:37:54.800 INFO:tasks.workunit.client.1.vm07.stdout:2/380: creat d0/d42/d1f/f84 x:0 0 0 2026-03-10T12:37:54.824 INFO:tasks.workunit.client.1.vm07.stdout:2/381: unlink d0/d42/d4e/d77/f60 0 2026-03-10T12:37:54.824 INFO:tasks.workunit.client.1.vm07.stdout:2/382: rename d0/d5b/f72 to d0/d42/d26/f85 0 2026-03-10T12:37:54.824 INFO:tasks.workunit.client.1.vm07.stdout:2/383: truncate d0/d42/d26/d4b/f51 891102 0 2026-03-10T12:37:54.824 INFO:tasks.workunit.client.1.vm07.stdout:2/384: mkdir d0/d42/d1f/d20/d86 0 2026-03-10T12:37:54.824 
INFO:tasks.workunit.client.1.vm07.stdout:2/385: chown d0/cd 66891007 1 2026-03-10T12:37:54.824 INFO:tasks.workunit.client.1.vm07.stdout:2/386: symlink d0/d45/l87 0 2026-03-10T12:37:54.831 INFO:tasks.workunit.client.1.vm07.stdout:6/412: write d1/d4/d6/f2a [2362410,124865] 0 2026-03-10T12:37:54.852 INFO:tasks.workunit.client.0.vm00.stdout:6/345: dwrite d2/d14/f2b [0,4194304] 0 2026-03-10T12:37:54.852 INFO:tasks.workunit.client.0.vm00.stdout:0/460: dwrite d3/d40/f7a [0,4194304] 0 2026-03-10T12:37:54.853 INFO:tasks.workunit.client.0.vm00.stdout:0/461: mknod d3/d7/d4c/d9d/ca1 0 2026-03-10T12:37:54.853 INFO:tasks.workunit.client.1.vm07.stdout:6/413: creat d1/d4/d6/d43/d65/f7f x:0 0 0 2026-03-10T12:37:54.853 INFO:tasks.workunit.client.1.vm07.stdout:6/414: creat d1/d4/d6/f80 x:0 0 0 2026-03-10T12:37:54.853 INFO:tasks.workunit.client.1.vm07.stdout:6/415: dwrite d1/d4/f11 [4194304,4194304] 0 2026-03-10T12:37:54.853 INFO:tasks.workunit.client.1.vm07.stdout:6/416: write d1/d4/d6/d16/d49/f67 [2840382,43631] 0 2026-03-10T12:37:54.853 INFO:tasks.workunit.client.1.vm07.stdout:6/417: mknod d1/d4/d6/d46/d4d/c81 0 2026-03-10T12:37:54.853 INFO:tasks.workunit.client.1.vm07.stdout:6/418: creat d1/d4/f82 x:0 0 0 2026-03-10T12:37:54.853 INFO:tasks.workunit.client.1.vm07.stdout:6/419: dwrite d1/d4/f82 [0,4194304] 0 2026-03-10T12:37:54.868 INFO:tasks.workunit.client.0.vm00.stdout:8/357: sync 2026-03-10T12:37:54.876 INFO:tasks.workunit.client.0.vm00.stdout:5/485: dread d1f/d39/f65 [0,4194304] 0 2026-03-10T12:37:54.888 INFO:tasks.workunit.client.0.vm00.stdout:5/486: dwrite d1f/d26/d2b/d35/f42 [0,4194304] 0 2026-03-10T12:37:54.888 INFO:tasks.workunit.client.0.vm00.stdout:5/487: dwrite d1f/d26/d2b/d37/f4c [0,4194304] 0 2026-03-10T12:37:54.888 INFO:tasks.workunit.client.0.vm00.stdout:5/488: rename d1f/d26/d2e/l3b to d1f/d26/d2b/d35/d53/d72/d9d/d90/lab 0 2026-03-10T12:37:54.891 INFO:tasks.workunit.client.0.vm00.stdout:5/489: rename d1f/d26/d2b/d35/d53/c60 to d1f/d26/d2b/d35/d78/d7f/cac 0 
2026-03-10T12:37:54.893 INFO:tasks.workunit.client.0.vm00.stdout:5/490: rename d1f/f30 to d1f/d26/d2b/d35/fad 0 2026-03-10T12:37:54.900 INFO:tasks.workunit.client.1.vm07.stdout:9/506: sync 2026-03-10T12:37:54.905 INFO:tasks.workunit.client.1.vm07.stdout:8/472: dread d1/f3f [0,4194304] 0 2026-03-10T12:37:54.911 INFO:tasks.workunit.client.1.vm07.stdout:5/497: write d0/d22/d18/d19/d21/d54/f9b [4434030,69453] 0 2026-03-10T12:37:54.916 INFO:tasks.workunit.client.1.vm07.stdout:0/506: dwrite d0/d14/f37 [4194304,4194304] 0 2026-03-10T12:37:54.929 INFO:tasks.workunit.client.1.vm07.stdout:9/507: creat d5/d1f/d31/d64/fb7 x:0 0 0 2026-03-10T12:37:54.929 INFO:tasks.workunit.client.1.vm07.stdout:3/488: dwrite dc/dd/d28/d3b/fa5 [0,4194304] 0 2026-03-10T12:37:54.933 INFO:tasks.workunit.client.1.vm07.stdout:5/498: mknod d0/d22/d18/d80/cab 0 2026-03-10T12:37:54.936 INFO:tasks.workunit.client.1.vm07.stdout:3/489: dwrite dc/dd/f41 [0,4194304] 0 2026-03-10T12:37:54.947 INFO:tasks.workunit.client.0.vm00.stdout:2/464: dwrite d4/d6/d2d/f3d [0,4194304] 0 2026-03-10T12:37:54.947 INFO:tasks.workunit.client.0.vm00.stdout:2/465: dread - d4/d6/f9c zero size 2026-03-10T12:37:54.949 INFO:tasks.workunit.client.0.vm00.stdout:2/466: dwrite d4/f39 [0,4194304] 0 2026-03-10T12:37:54.954 INFO:tasks.workunit.client.1.vm07.stdout:0/507: mkdir d0/d14/d5f/d76/d2f/d31/d4f/da8 0 2026-03-10T12:37:54.969 INFO:tasks.workunit.client.1.vm07.stdout:8/473: getdents d1/d3/d5d 0 2026-03-10T12:37:54.973 INFO:tasks.workunit.client.1.vm07.stdout:9/508: getdents d5/d13/d6c/d89 0 2026-03-10T12:37:54.976 INFO:tasks.workunit.client.1.vm07.stdout:9/509: dread - d5/d13/d6c/fb6 zero size 2026-03-10T12:37:54.976 INFO:tasks.workunit.client.1.vm07.stdout:3/490: rename dc/dd/l4a to dc/dd/lad 0 2026-03-10T12:37:54.976 INFO:tasks.workunit.client.1.vm07.stdout:9/510: chown d5/d1f/d31/d64/f70 128103396 1 2026-03-10T12:37:54.978 INFO:tasks.workunit.client.1.vm07.stdout:3/491: dread - dc/d18/d2d/f80 zero size 2026-03-10T12:37:54.979 
INFO:tasks.workunit.client.1.vm07.stdout:8/474: getdents d1/d3/d6c 0 2026-03-10T12:37:54.985 INFO:tasks.workunit.client.1.vm07.stdout:3/492: dwrite dc/dd/d43/d76/d95/da0/faa [0,4194304] 0 2026-03-10T12:37:54.991 INFO:tasks.workunit.client.1.vm07.stdout:8/475: mknod d1/d3/d40/c96 0 2026-03-10T12:37:54.991 INFO:tasks.workunit.client.1.vm07.stdout:3/493: unlink dc/l25 0 2026-03-10T12:37:54.991 INFO:tasks.workunit.client.1.vm07.stdout:8/476: mknod d1/d3/d11/c97 0 2026-03-10T12:37:54.996 INFO:tasks.workunit.client.1.vm07.stdout:3/494: dwrite dc/d18/f36 [0,4194304] 0 2026-03-10T12:37:55.004 INFO:tasks.workunit.client.0.vm00.stdout:2/467: sync 2026-03-10T12:37:55.069 INFO:tasks.workunit.client.0.vm00.stdout:3/522: dread dd/d18/d14/d2b/f8d [0,4194304] 0 2026-03-10T12:37:55.070 INFO:tasks.workunit.client.0.vm00.stdout:3/523: readlink dd/d27/l79 0 2026-03-10T12:37:55.071 INFO:tasks.workunit.client.0.vm00.stdout:3/524: dread dd/d18/f83 [0,4194304] 0 2026-03-10T12:37:55.105 INFO:tasks.workunit.client.0.vm00.stdout:9/488: truncate d0/d3d/d43/d53/d57/f4f 2726080 0 2026-03-10T12:37:55.107 INFO:tasks.workunit.client.0.vm00.stdout:9/489: creat d0/d3d/d59/d74/faa x:0 0 0 2026-03-10T12:37:55.108 INFO:tasks.workunit.client.0.vm00.stdout:1/486: write da/d24/d28/f37 [360735,23789] 0 2026-03-10T12:37:55.108 INFO:tasks.workunit.client.0.vm00.stdout:9/490: creat d0/d3d/d43/d80/d1e/d85/d98/fab x:0 0 0 2026-03-10T12:37:55.109 INFO:tasks.workunit.client.0.vm00.stdout:1/487: symlink da/d24/d28/d67/da2/d78/la3 0 2026-03-10T12:37:55.110 INFO:tasks.workunit.client.0.vm00.stdout:1/488: write da/f13 [6272403,71358] 0 2026-03-10T12:37:55.111 INFO:tasks.workunit.client.0.vm00.stdout:9/491: chown d0/d3d/d43/d80/d19/f65 4777 1 2026-03-10T12:37:55.111 INFO:tasks.workunit.client.0.vm00.stdout:9/492: truncate d0/d3d/d43/d80/fa1 839182 0 2026-03-10T12:37:55.112 INFO:tasks.workunit.client.0.vm00.stdout:1/489: mkdir da/d24/d28/d56/da4 0 2026-03-10T12:37:55.124 INFO:tasks.workunit.client.0.vm00.stdout:8/358: 
rmdir d0/d46/d6e 39 2026-03-10T12:37:55.129 INFO:tasks.workunit.client.0.vm00.stdout:8/359: chown d0/d46/d6e 40845483 1 2026-03-10T12:37:55.129 INFO:tasks.workunit.client.0.vm00.stdout:8/360: write d0/f10 [911367,6509] 0 2026-03-10T12:37:55.129 INFO:tasks.workunit.client.0.vm00.stdout:8/361: rename d0/d12/l3a to d0/d46/d6e/l71 0 2026-03-10T12:37:55.129 INFO:tasks.workunit.client.0.vm00.stdout:8/362: readlink d0/d12/d36/d51/l5e 0 2026-03-10T12:37:55.129 INFO:tasks.workunit.client.0.vm00.stdout:8/363: dread - d0/d12/d36/f41 zero size 2026-03-10T12:37:55.130 INFO:tasks.workunit.client.0.vm00.stdout:8/364: creat d0/d12/d36/f72 x:0 0 0 2026-03-10T12:37:55.132 INFO:tasks.workunit.client.0.vm00.stdout:8/365: dread d0/d12/d36/d5b/f65 [0,4194304] 0 2026-03-10T12:37:55.133 INFO:tasks.workunit.client.0.vm00.stdout:8/366: write d0/d12/d2d/f44 [127229,22958] 0 2026-03-10T12:37:55.136 INFO:tasks.workunit.client.0.vm00.stdout:8/367: dwrite d0/f22 [4194304,4194304] 0 2026-03-10T12:37:55.176 INFO:tasks.workunit.client.0.vm00.stdout:1/490: sync 2026-03-10T12:37:55.177 INFO:tasks.workunit.client.0.vm00.stdout:1/491: write da/d24/d28/d44/f83 [5101549,86538] 0 2026-03-10T12:37:55.187 INFO:tasks.workunit.client.0.vm00.stdout:1/492: symlink da/d12/d91/la5 0 2026-03-10T12:37:55.188 INFO:tasks.workunit.client.0.vm00.stdout:1/493: fsync da/d24/d28/d44/d5d/d72/f9a 0 2026-03-10T12:37:55.192 INFO:tasks.workunit.client.0.vm00.stdout:3/525: read dd/d4e/d5d/f71 [7923417,81221] 0 2026-03-10T12:37:55.196 INFO:tasks.workunit.client.0.vm00.stdout:3/526: dwrite dd/d3d/d73/f8f [0,4194304] 0 2026-03-10T12:37:55.240 INFO:tasks.workunit.client.1.vm07.stdout:6/420: dread d1/d4/d6/d16/d1a/d33/f37 [0,4194304] 0 2026-03-10T12:37:55.243 INFO:tasks.workunit.client.0.vm00.stdout:9/493: read d0/d3d/d43/d53/d57/f8b [1743872,63339] 0 2026-03-10T12:37:55.252 INFO:tasks.workunit.client.1.vm07.stdout:6/421: dwrite d1/d4/f3f [4194304,4194304] 0 2026-03-10T12:37:55.252 INFO:tasks.workunit.client.0.vm00.stdout:5/491: 
fsync d1f/d26/d2b/d35/fad 0
2026-03-10T12:37:55.252 INFO:tasks.workunit.client.0.vm00.stdout:9/494: getdents d0/d3d/d43/d80/d19/d50 0
2026-03-10T12:37:55.252 INFO:tasks.workunit.client.0.vm00.stdout:9/495: fdatasync d0/f9f 0
2026-03-10T12:37:55.252 INFO:tasks.workunit.client.0.vm00.stdout:5/492: creat d1f/d26/d2b/d35/d53/d72/d9d/d90/fae x:0 0 0
2026-03-10T12:37:55.252 INFO:tasks.workunit.client.0.vm00.stdout:5/493: fsync d1f/d26/d2e/f8c 0
2026-03-10T12:37:55.252 INFO:tasks.workunit.client.0.vm00.stdout:9/496: dwrite d0/d3d/d43/d53/d57/f8a [0,4194304] 0
2026-03-10T12:37:55.257 INFO:tasks.workunit.client.0.vm00.stdout:9/497: symlink d0/d3d/d59/d4e/da3/lac 0
2026-03-10T12:37:55.262 INFO:tasks.workunit.client.0.vm00.stdout:5/494: mkdir d1f/d26/d2b/d35/d78/d99/daf 0
2026-03-10T12:37:55.262 INFO:tasks.workunit.client.0.vm00.stdout:9/498: creat d0/d3d/d59/fad x:0 0 0
2026-03-10T12:37:55.262 INFO:tasks.workunit.client.0.vm00.stdout:9/499: chown d0/d3d/d43/d80/d1e/d85/la9 2 1
2026-03-10T12:37:55.262 INFO:tasks.workunit.client.0.vm00.stdout:9/500: fdatasync d0/d3d/d43/d80/d19/f1b 0
2026-03-10T12:37:55.265 INFO:tasks.workunit.client.0.vm00.stdout:9/501: creat d0/d3d/d43/d80/d1e/d27/fae x:0 0 0
2026-03-10T12:37:55.266 INFO:tasks.workunit.client.1.vm07.stdout:6/422: rmdir d1/d4/d71/d77 0
2026-03-10T12:37:55.269 INFO:tasks.workunit.client.0.vm00.stdout:9/502: dwrite d0/d3d/d43/d80/d1e/d2b/f36 [4194304,4194304] 0
2026-03-10T12:37:55.273 INFO:tasks.workunit.client.0.vm00.stdout:9/503: link d0/la4 d0/d3d/d43/d80/d19/d50/laf 0
2026-03-10T12:37:55.274 INFO:tasks.workunit.client.0.vm00.stdout:9/504: mkdir d0/d3d/d43/d53/d57/db0 0
2026-03-10T12:37:55.280 INFO:tasks.workunit.client.1.vm07.stdout:6/423: rename d1/d4/d6/d46/c5d to d1/d4/d6/d16/d1a/d6e/c83 0
2026-03-10T12:37:55.280 INFO:tasks.workunit.client.0.vm00.stdout:9/505: chown d0/d3d/d43/d80/d1e/d85/d98/fa0 296630 1
2026-03-10T12:37:55.280 INFO:tasks.workunit.client.0.vm00.stdout:9/506: creat d0/d3d/d43/d80/d19/fb1 x:0 0 0
2026-03-10T12:37:55.281 INFO:tasks.workunit.client.0.vm00.stdout:7/353: dwrite da/f35 [0,4194304] 0
2026-03-10T12:37:55.283 INFO:tasks.workunit.client.0.vm00.stdout:7/354: chown da/d25/l5b 14823941 1
2026-03-10T12:37:55.287 INFO:tasks.workunit.client.0.vm00.stdout:9/507: sync
2026-03-10T12:37:55.288 INFO:tasks.workunit.client.0.vm00.stdout:7/355: mkdir da/d26/d50/d73/d89 0
2026-03-10T12:37:55.289 INFO:tasks.workunit.client.0.vm00.stdout:9/508: mknod d0/d3d/d43/d80/d1e/d85/cb2 0
2026-03-10T12:37:55.292 INFO:tasks.workunit.client.0.vm00.stdout:7/356: dwrite da/d26/d37/f6f [0,4194304] 0
2026-03-10T12:37:55.292 INFO:tasks.workunit.client.1.vm07.stdout:6/424: dwrite d1/d4/d6/d16/d1a/d2c/f78 [0,4194304] 0
2026-03-10T12:37:55.292 INFO:tasks.workunit.client.0.vm00.stdout:9/509: mknod d0/d7f/d88/cb3 0
2026-03-10T12:37:55.292 INFO:tasks.workunit.client.1.vm07.stdout:1/439: dwrite d9/fe [0,4194304] 0
2026-03-10T12:37:55.295 INFO:tasks.workunit.client.0.vm00.stdout:9/510: mknod d0/d9b/cb4 0
2026-03-10T12:37:55.306 INFO:tasks.workunit.client.1.vm07.stdout:1/440: chown d9/df/d55 1637 1
2026-03-10T12:37:55.306 INFO:tasks.workunit.client.1.vm07.stdout:6/425: mknod d1/d4/c84 0
2026-03-10T12:37:55.306 INFO:tasks.workunit.client.1.vm07.stdout:6/426: write d1/d4/d6/f80 [105817,9020] 0
2026-03-10T12:37:55.306 INFO:tasks.workunit.client.1.vm07.stdout:1/441: rename l5 to d9/df/l90 0
2026-03-10T12:37:55.306 INFO:tasks.workunit.client.0.vm00.stdout:9/511: chown d0/d3d/d43/f68 40380637 1
2026-03-10T12:37:55.306 INFO:tasks.workunit.client.0.vm00.stdout:9/512: dwrite d0/f4 [8388608,4194304] 0
2026-03-10T12:37:55.306 INFO:tasks.workunit.client.0.vm00.stdout:9/513: mknod d0/d3d/d43/cb5 0
2026-03-10T12:37:55.306 INFO:tasks.workunit.client.0.vm00.stdout:9/514: chown d0/f9f 3460351 1
2026-03-10T12:37:55.319 INFO:tasks.workunit.client.0.vm00.stdout:9/515: dread d0/d3d/d43/d80/d1e/d2b/f47 [0,4194304] 0
2026-03-10T12:37:55.320 INFO:tasks.workunit.client.0.vm00.stdout:9/516: creat d0/d3d/d43/d80/d19/fb6 x:0 0 0
2026-03-10T12:37:55.321 INFO:tasks.workunit.client.0.vm00.stdout:9/517: write d0/f5d [828435,60644] 0
2026-03-10T12:37:55.322 INFO:tasks.workunit.client.0.vm00.stdout:9/518: write d0/d3d/d43/f68 [657066,72823] 0
2026-03-10T12:37:55.323 INFO:tasks.workunit.client.0.vm00.stdout:9/519: unlink d0/d3d/d43/d53/d57/f3f 0
2026-03-10T12:37:55.329 INFO:tasks.workunit.client.0.vm00.stdout:9/520: write d0/d3d/d43/d53/d57/f67 [1009444,113644] 0
2026-03-10T12:37:55.329 INFO:tasks.workunit.client.0.vm00.stdout:9/521: rmdir d0/d3d/d43/d80/d19/d50 39
2026-03-10T12:37:55.329 INFO:tasks.workunit.client.0.vm00.stdout:9/522: chown d0/d3d/d43/c5b 3651252 1
2026-03-10T12:37:55.329 INFO:tasks.workunit.client.1.vm07.stdout:6/427: symlink d1/d4/d6/d4e/l85 0
2026-03-10T12:37:55.330 INFO:tasks.workunit.client.0.vm00.stdout:9/523: read d0/d5/dc/f9c [10705,26985] 0
2026-03-10T12:37:55.330 INFO:tasks.workunit.client.0.vm00.stdout:9/524: chown d0/f4 30097695 1
2026-03-10T12:37:55.331 INFO:tasks.workunit.client.1.vm07.stdout:1/442: fsync d9/df/d54/f57 0
2026-03-10T12:37:55.332 INFO:tasks.workunit.client.1.vm07.stdout:6/428: dread - d1/d4/d6/d16/d1a/d33/f7b zero size
2026-03-10T12:37:55.345 INFO:tasks.workunit.client.1.vm07.stdout:6/429: creat d1/d4/d6/d43/d65/f86 x:0 0 0
2026-03-10T12:37:55.353 INFO:tasks.workunit.client.1.vm07.stdout:1/443: rename d9/df/d29/d2c to d9/df/d29/d2b/d31/d91 0
2026-03-10T12:37:55.364 INFO:tasks.workunit.client.0.vm00.stdout:2/468: dwrite d4/dd/f3e [0,4194304] 0
2026-03-10T12:37:55.365 INFO:tasks.workunit.client.0.vm00.stdout:2/469: read d4/d53/d9e/f6f [565397,78961] 0
2026-03-10T12:37:55.367 INFO:tasks.workunit.client.0.vm00.stdout:1/494: rename da/d24/d28/d56 to da/d24/d28/d44/d59/da6 0
2026-03-10T12:37:55.374 INFO:tasks.workunit.client.0.vm00.stdout:2/470: symlink d4/dd/la0 0
2026-03-10T12:37:55.374 INFO:tasks.workunit.client.1.vm07.stdout:1/444: mkdir d9/df/d29/d2b/d92 0
2026-03-10T12:37:55.374 INFO:tasks.workunit.client.1.vm07.stdout:7/436: write d0/f4f [666583,71625] 0
2026-03-10T12:37:55.375 INFO:tasks.workunit.client.0.vm00.stdout:8/368: write d0/d12/d2d/f6f [622968,12731] 0
2026-03-10T12:37:55.377 INFO:tasks.workunit.client.0.vm00.stdout:2/471: dwrite d4/f39 [0,4194304] 0
2026-03-10T12:37:55.381 INFO:tasks.workunit.client.1.vm07.stdout:4/595: dwrite d0/d4/d10/f4b [0,4194304] 0
2026-03-10T12:37:55.385 INFO:tasks.workunit.client.0.vm00.stdout:6/346: dwrite d2/d14/f3f [0,4194304] 0
2026-03-10T12:37:55.386 INFO:tasks.workunit.client.0.vm00.stdout:6/347: write d2/d16/f20 [1173309,32862] 0
2026-03-10T12:37:55.388 INFO:tasks.workunit.client.0.vm00.stdout:0/462: dwrite d3/d7/d3c/f72 [0,4194304] 0
2026-03-10T12:37:55.392 INFO:tasks.workunit.client.0.vm00.stdout:6/348: creat d2/d16/d74/f81 x:0 0 0
2026-03-10T12:37:55.394 INFO:tasks.workunit.client.0.vm00.stdout:3/527: rename dd/d4e/d5d/c5f to dd/d3d/cae 0
2026-03-10T12:37:55.394 INFO:tasks.workunit.client.0.vm00.stdout:0/463: write d3/d7/d4c/d5b/d38/d44/f49 [445531,20441] 0
2026-03-10T12:37:55.396 INFO:tasks.workunit.client.1.vm07.stdout:0/508: dwrite d0/f1d [4194304,4194304] 0
2026-03-10T12:37:55.397 INFO:tasks.workunit.client.1.vm07.stdout:0/509: chown d0/d14/d5f/d76/d2f/d31/d4f/d60/c66 1745400973 1
2026-03-10T12:37:55.397 INFO:tasks.workunit.client.1.vm07.stdout:9/511: truncate d5/d1f/d7d/f7f 296010 0
2026-03-10T12:37:55.398 INFO:tasks.workunit.client.1.vm07.stdout:8/477: rmdir d1/d3/d40 39
2026-03-10T12:37:55.398 INFO:tasks.workunit.client.0.vm00.stdout:0/464: dwrite d3/d22/f71 [0,4194304] 0
2026-03-10T12:37:55.403 INFO:tasks.workunit.client.0.vm00.stdout:0/465: write d3/d40/f7a [48772,17536] 0
2026-03-10T12:37:55.406 INFO:tasks.workunit.client.1.vm07.stdout:1/445: link d9/f19 d9/d2d/d4f/d5a/f93 0
2026-03-10T12:37:55.406 INFO:tasks.workunit.client.1.vm07.stdout:7/437: read d0/f42 [199104,60212] 0
2026-03-10T12:37:55.407 INFO:tasks.workunit.client.1.vm07.stdout:7/438: chown d0/f3f 4091431 1
2026-03-10T12:37:55.407 INFO:tasks.workunit.client.0.vm00.stdout:3/528: dread dd/d2a/f78 [0,4194304] 0
2026-03-10T12:37:55.408 INFO:tasks.workunit.client.1.vm07.stdout:8/478: mknod d1/d3/d6/d7b/c98 0
2026-03-10T12:37:55.410 INFO:tasks.workunit.client.0.vm00.stdout:8/369: symlink d0/l73 0
2026-03-10T12:37:55.413 INFO:tasks.workunit.client.1.vm07.stdout:0/510: truncate d0/d14/d5f/d41/d86/f96 302246 0
2026-03-10T12:37:55.414 INFO:tasks.workunit.client.0.vm00.stdout:8/370: dwrite d0/d12/d36/d5b/f6b [0,4194304] 0
2026-03-10T12:37:55.421 INFO:tasks.workunit.client.1.vm07.stdout:7/439: creat d0/d57/d62/f8b x:0 0 0
2026-03-10T12:37:55.421 INFO:tasks.workunit.client.1.vm07.stdout:7/440: chown d0/d47/d48 307140601 1
2026-03-10T12:37:55.425 INFO:tasks.workunit.client.0.vm00.stdout:0/466: fsync d3/d7/d4c/d5b/f37 0
2026-03-10T12:37:55.429 INFO:tasks.workunit.client.1.vm07.stdout:9/512: symlink d5/d16/d23/lb8 0
2026-03-10T12:37:55.429 INFO:tasks.workunit.client.1.vm07.stdout:0/511: creat d0/d14/d5f/d76/d2f/fa9 x:0 0 0
2026-03-10T12:37:55.429 INFO:tasks.workunit.client.1.vm07.stdout:9/513: fsync d5/d1f/d5e/d6b/fb4 0
2026-03-10T12:37:55.434 INFO:tasks.workunit.client.0.vm00.stdout:2/472: creat d4/d6/d2d/d3a/d43/fa1 x:0 0 0
2026-03-10T12:37:55.439 INFO:tasks.workunit.client.1.vm07.stdout:7/441: stat d0/d47/c82 0
2026-03-10T12:37:55.441 INFO:tasks.workunit.client.0.vm00.stdout:2/473: dwrite d4/d53/d76/f8b [0,4194304] 0
2026-03-10T12:37:55.441 INFO:tasks.workunit.client.1.vm07.stdout:7/442: chown d0/d47/l49 6 1
2026-03-10T12:37:55.441 INFO:tasks.workunit.client.0.vm00.stdout:3/529: rename dd/d27/d2c/d34/d38/fa1 to dd/d3d/faf 0
2026-03-10T12:37:55.442 INFO:tasks.workunit.client.0.vm00.stdout:0/467: dwrite d3/d7/d4c/d5b/f2b [0,4194304] 0
2026-03-10T12:37:55.442 INFO:tasks.workunit.client.0.vm00.stdout:2/474: chown d4/d6/d2d/d31/f71 0 1
2026-03-10T12:37:55.446 INFO:tasks.workunit.client.1.vm07.stdout:0/512: symlink d0/d14/d7c/laa 0
2026-03-10T12:37:55.457 INFO:tasks.workunit.client.0.vm00.stdout:3/530: creat dd/d2a/fb0 x:0 0 0
2026-03-10T12:37:55.457 INFO:tasks.workunit.client.1.vm07.stdout:7/443: symlink d0/d67/d6f/l8c 0
2026-03-10T12:37:55.457 INFO:tasks.workunit.client.1.vm07.stdout:1/446: link d9/l28 d9/df/d29/d2b/d31/d91/l94 0
2026-03-10T12:37:55.457 INFO:tasks.workunit.client.1.vm07.stdout:0/513: fdatasync d0/d14/f36 0
2026-03-10T12:37:55.458 INFO:tasks.workunit.client.0.vm00.stdout:4/485: truncate df/d1f/d22/f7d 2159451 0
2026-03-10T12:37:55.463 INFO:tasks.workunit.client.0.vm00.stdout:4/486: mkdir df/d1f/d22/d26/d65/da7 0
2026-03-10T12:37:55.463 INFO:tasks.workunit.client.0.vm00.stdout:0/468: creat d3/d7/d4c/d5b/d38/fa2 x:0 0 0
2026-03-10T12:37:55.464 INFO:tasks.workunit.client.0.vm00.stdout:5/495: write d1f/d26/d2e/f8c [289716,78803] 0
2026-03-10T12:37:55.464 INFO:tasks.workunit.client.1.vm07.stdout:8/479: link d1/d3/d11/l5c d1/d3/l99 0
2026-03-10T12:37:55.465 INFO:tasks.workunit.client.1.vm07.stdout:3/495: write dc/d18/f34 [2952552,114721] 0
2026-03-10T12:37:55.465 INFO:tasks.workunit.client.0.vm00.stdout:3/531: link dd/d2a/f9f dd/d27/d2c/fb1 0
2026-03-10T12:37:55.466 INFO:tasks.workunit.client.0.vm00.stdout:3/532: fdatasync dd/d18/d13/d1d/f86 0
2026-03-10T12:37:55.466 INFO:tasks.workunit.client.1.vm07.stdout:3/496: write dc/d18/d24/f49 [120728,52017] 0
2026-03-10T12:37:55.467 INFO:tasks.workunit.client.1.vm07.stdout:3/497: write dc/d18/f34 [3713614,58398] 0
2026-03-10T12:37:55.471 INFO:tasks.workunit.client.0.vm00.stdout:5/496: rmdir d1f/d26/d2b/d35 39
2026-03-10T12:37:55.471 INFO:tasks.workunit.client.1.vm07.stdout:1/447: creat d9/d2d/d4f/f95 x:0 0 0
2026-03-10T12:37:55.471 INFO:tasks.workunit.client.0.vm00.stdout:5/497: write d1f/d26/d2b/f52 [3981634,79358] 0
2026-03-10T12:37:55.472 INFO:tasks.workunit.client.1.vm07.stdout:3/498: chown dc/dd/d1f/d45/f56 52124 1
2026-03-10T12:37:55.472 INFO:tasks.workunit.client.0.vm00.stdout:5/498: write d1f/d26/d2b/f5e [159085,122014] 0
2026-03-10T12:37:55.473 INFO:tasks.workunit.client.0.vm00.stdout:5/499: stat d1f/d26/d2e/fa5 0
2026-03-10T12:37:55.474 INFO:tasks.workunit.client.0.vm00.stdout:4/487: mknod df/d8a/ca8 0
2026-03-10T12:37:55.479 INFO:tasks.workunit.client.0.vm00.stdout:3/533: rename dd/d4e/d5d/f81 to dd/d64/fb2 0
2026-03-10T12:37:55.489 INFO:tasks.workunit.client.0.vm00.stdout:4/488: rename df/d1f/d36/d3a/f68 to df/d1f/d36/d3a/fa9 0
2026-03-10T12:37:55.492 INFO:tasks.workunit.client.1.vm07.stdout:9/514: getdents d5/d16/d23/d26 0
2026-03-10T12:37:55.492 INFO:tasks.workunit.client.0.vm00.stdout:4/489: write df/f42 [3346850,12933] 0
2026-03-10T12:37:55.496 INFO:tasks.workunit.client.0.vm00.stdout:4/490: rename df/d1f/d22/d26/d65/d91/f97 to df/d1f/d36/faa 0
2026-03-10T12:37:55.503 INFO:tasks.workunit.client.0.vm00.stdout:4/491: read df/d1f/d22/f30 [1207732,118967] 0
2026-03-10T12:37:55.503 INFO:tasks.workunit.client.1.vm07.stdout:3/499: mknod dc/dd/d43/d76/d95/da0/cae 0
2026-03-10T12:37:55.503 INFO:tasks.workunit.client.1.vm07.stdout:7/444: dread d0/d61/f66 [0,4194304] 0
2026-03-10T12:37:55.505 INFO:tasks.workunit.client.1.vm07.stdout:9/515: dread - d5/d13/d6c/d7a/f94 zero size
2026-03-10T12:37:55.506 INFO:tasks.workunit.client.1.vm07.stdout:9/516: fsync d5/d69/d93/d97/fa2 0
2026-03-10T12:37:55.507 INFO:tasks.workunit.client.0.vm00.stdout:5/500: sync
2026-03-10T12:37:55.508 INFO:tasks.workunit.client.1.vm07.stdout:9/517: chown d5/d13/d57/l79 192025 1
2026-03-10T12:37:55.509 INFO:tasks.workunit.client.0.vm00.stdout:5/501: truncate d1f/d26/d2b/d35/d53/d72/d9d/f88 484949 0
2026-03-10T12:37:55.510 INFO:tasks.workunit.client.0.vm00.stdout:0/469: dread d3/d40/d65/f92 [0,4194304] 0
2026-03-10T12:37:55.514 INFO:tasks.workunit.client.1.vm07.stdout:3/500: mkdir dc/dd/d43/d76/daf 0
2026-03-10T12:37:55.514 INFO:tasks.workunit.client.0.vm00.stdout:5/502: creat d1f/d26/d2e/d58/fb0 x:0 0 0
2026-03-10T12:37:55.514 INFO:tasks.workunit.client.0.vm00.stdout:5/503: write d1f/d26/d2b/f44 [643863,33876] 0
2026-03-10T12:37:55.515 INFO:tasks.workunit.client.1.vm07.stdout:6/430: dread d1/d4/f3b [0,4194304] 0
2026-03-10T12:37:55.520 INFO:tasks.workunit.client.1.vm07.stdout:7/445: creat d0/d61/d79/f8d x:0 0 0
2026-03-10T12:37:55.524 INFO:tasks.workunit.client.0.vm00.stdout:0/470: dread - d3/d7/d3c/d74/f78 zero size
2026-03-10T12:37:55.525 INFO:tasks.workunit.client.0.vm00.stdout:5/504: dread d1f/d26/f79 [0,4194304] 0
2026-03-10T12:37:55.532 INFO:tasks.workunit.client.0.vm00.stdout:5/505: symlink d1f/lb1 0
2026-03-10T12:37:55.535 INFO:tasks.workunit.client.0.vm00.stdout:9/525: dwrite d0/d3d/d43/d80/f30 [0,4194304] 0
2026-03-10T12:37:55.536 INFO:tasks.workunit.client.1.vm07.stdout:6/431: unlink d1/d4/d6/d16/d1a/d2c/c35 0
2026-03-10T12:37:55.537 INFO:tasks.workunit.client.1.vm07.stdout:7/446: creat d0/d47/f8e x:0 0 0
2026-03-10T12:37:55.537 INFO:tasks.workunit.client.1.vm07.stdout:3/501: read dc/dd/f20 [2160881,59246] 0
2026-03-10T12:37:55.538 INFO:tasks.workunit.client.0.vm00.stdout:5/506: mkdir d1f/d26/d2b/d37/db2 0
2026-03-10T12:37:55.545 INFO:tasks.workunit.client.1.vm07.stdout:6/432: mknod d1/d4/d6/d16/d1a/c87 0
2026-03-10T12:37:55.550 INFO:tasks.workunit.client.0.vm00.stdout:9/526: mknod d0/d3d/d43/d80/d1e/d85/d98/cb7 0
2026-03-10T12:37:55.550 INFO:tasks.workunit.client.0.vm00.stdout:9/527: rmdir d0/d3d/d43/d80/d1e 39
2026-03-10T12:37:55.550 INFO:tasks.workunit.client.1.vm07.stdout:7/447: creat d0/d61/d79/f8f x:0 0 0
2026-03-10T12:37:55.550 INFO:tasks.workunit.client.1.vm07.stdout:3/502: chown dc/dd/l14 3114623 1
2026-03-10T12:37:55.550 INFO:tasks.workunit.client.1.vm07.stdout:6/433: mkdir d1/d4/d6/d43/d88 0
2026-03-10T12:37:55.550 INFO:tasks.workunit.client.1.vm07.stdout:3/503: creat dc/dd/d28/d7a/d8e/fb0 x:0 0 0
2026-03-10T12:37:55.550 INFO:tasks.workunit.client.1.vm07.stdout:6/434: truncate d1/d4/d6/d43/f73 363482 0
2026-03-10T12:37:55.550 INFO:tasks.workunit.client.0.vm00.stdout:5/507: truncate d1f/d26/d2b/d35/d53/fa7 613438 0
2026-03-10T12:37:55.551 INFO:tasks.workunit.client.0.vm00.stdout:5/508: dread - d1f/d6a/f74 zero size
2026-03-10T12:37:55.551 INFO:tasks.workunit.client.0.vm00.stdout:5/509: readlink l15 0
2026-03-10T12:37:55.555 INFO:tasks.workunit.client.0.vm00.stdout:5/510: dwrite d1f/d26/d2b/f5e [0,4194304] 0
2026-03-10T12:37:55.562 INFO:tasks.workunit.client.0.vm00.stdout:5/511: getdents d1f/d26/d2e/d58/d6b 0
2026-03-10T12:37:55.565 INFO:tasks.workunit.client.1.vm07.stdout:4/596: dread d0/d4/d7a/f50 [0,4194304] 0
2026-03-10T12:37:55.566 INFO:tasks.workunit.client.1.vm07.stdout:6/435: getdents d1 0
2026-03-10T12:37:55.567 INFO:tasks.workunit.client.1.vm07.stdout:6/436: chown d1/d4/d6/d16/d1a/d33/f3c 23663976 1
2026-03-10T12:37:55.573 INFO:tasks.workunit.client.1.vm07.stdout:4/597: mknod d0/d4/cce 0
2026-03-10T12:37:55.584 INFO:tasks.workunit.client.1.vm07.stdout:4/598: rename d0/d4/la5 to d0/d4/lcf 0
2026-03-10T12:37:55.584 INFO:tasks.workunit.client.1.vm07.stdout:4/599: write d0/d4/d7a/d46/d76/fa0 [2467934,18935] 0
2026-03-10T12:37:55.585 INFO:tasks.workunit.client.0.vm00.stdout:9/528: sync
2026-03-10T12:37:55.586 INFO:tasks.workunit.client.0.vm00.stdout:9/529: write d0/d5/f26 [3908304,10216] 0
2026-03-10T12:37:55.588 INFO:tasks.workunit.client.0.vm00.stdout:9/530: mkdir d0/d7f/db8 0
2026-03-10T12:37:55.589 INFO:tasks.workunit.client.0.vm00.stdout:9/531: dread - d0/d3d/d59/d4e/da3/f87 zero size
2026-03-10T12:37:55.591 INFO:tasks.workunit.client.0.vm00.stdout:9/532: truncate d0/d3d/d43/d80/d1e/d85/d98/fab 559752 0
2026-03-10T12:37:55.599 INFO:tasks.workunit.client.0.vm00.stdout:9/533: write d0/d3d/d59/f4a [4292580,105473] 0
2026-03-10T12:37:55.599 INFO:tasks.workunit.client.0.vm00.stdout:9/534: read d0/d3d/d43/d80/d19/f1b [3450643,30318] 0
2026-03-10T12:37:55.599 INFO:tasks.workunit.client.0.vm00.stdout:9/535: link d0/d3d/d59/d4e/f7c d0/d3d/d43/d80/fb9 0
2026-03-10T12:37:55.600 INFO:tasks.workunit.client.0.vm00.stdout:1/495: write da/d12/d26/f2e [1673008,67623] 0
2026-03-10T12:37:55.602 INFO:tasks.workunit.client.0.vm00.stdout:1/496: symlink da/d12/d91/la7 0
2026-03-10T12:37:55.603 INFO:tasks.workunit.client.0.vm00.stdout:1/497: chown da/d12/d91/la7 1502809 1
2026-03-10T12:37:55.604 INFO:tasks.workunit.client.0.vm00.stdout:1/498: write da/d21/f74 [1594904,72499] 0
2026-03-10T12:37:55.606 INFO:tasks.workunit.client.0.vm00.stdout:1/499: mkdir da/d12/da8 0
2026-03-10T12:37:55.608 INFO:tasks.workunit.client.0.vm00.stdout:1/500: dwrite da/f13 [0,4194304] 0
2026-03-10T12:37:55.637 INFO:tasks.workunit.client.1.vm07.stdout:2/387: read d0/d42/f53 [628285,92944] 0
2026-03-10T12:37:55.687 INFO:tasks.workunit.client.1.vm07.stdout:3/504: dread dc/dd/d1f/d45/f50 [0,4194304] 0
2026-03-10T12:37:55.693 INFO:tasks.workunit.client.1.vm07.stdout:3/505: dwrite dc/dd/d28/d7a/d8e/fb0 [0,4194304] 0
2026-03-10T12:37:55.694 INFO:tasks.workunit.client.1.vm07.stdout:3/506: dread - dc/dd/d43/d5c/fa9 zero size
2026-03-10T12:37:55.701 INFO:tasks.workunit.client.1.vm07.stdout:5/499: read d0/d22/d18/f4c [7167037,75109] 0
2026-03-10T12:37:55.705 INFO:tasks.workunit.client.1.vm07.stdout:5/500: rename d0/d22/d18/c3c to d0/d22/d18/d19/d2e/d67/cac 0
2026-03-10T12:37:55.714 INFO:tasks.workunit.client.1.vm07.stdout:5/501: dwrite d0/d22/d18/d19/d21/d54/f9b [0,4194304] 0
2026-03-10T12:37:55.719 INFO:tasks.workunit.client.1.vm07.stdout:5/502: symlink d0/d22/d18/d19/d36/lad 0
2026-03-10T12:37:55.720 INFO:tasks.workunit.client.1.vm07.stdout:5/503: chown d0/d22/d18/d19/d36/d75/d77 70245842 1
2026-03-10T12:37:55.724 INFO:tasks.workunit.client.1.vm07.stdout:5/504: write d0/d22/f93 [752644,54930] 0
2026-03-10T12:37:55.730 INFO:tasks.workunit.client.1.vm07.stdout:5/505: mknod d0/d22/d18/d19/d21/d54/cae 0
2026-03-10T12:37:55.731 INFO:tasks.workunit.client.1.vm07.stdout:5/506: stat d0/d22/d18/d19/d21/d54 0
2026-03-10T12:37:55.736 INFO:tasks.workunit.client.1.vm07.stdout:5/507: unlink d0/f70 0
2026-03-10T12:37:55.739 INFO:tasks.workunit.client.1.vm07.stdout:5/508: creat d0/d22/d18/d19/d21/d54/faf x:0 0 0
2026-03-10T12:37:55.805 INFO:tasks.workunit.client.1.vm07.stdout:3/507: read dc/dd/f41 [4402332,81328] 0
2026-03-10T12:37:55.807 INFO:tasks.workunit.client.0.vm00.stdout:8/371: dwrite d0/f8 [0,4194304] 0
2026-03-10T12:37:55.816 INFO:tasks.workunit.client.0.vm00.stdout:2/475: truncate d4/d6/d41/d6d/d40/f80 268864 0
2026-03-10T12:37:55.819 INFO:tasks.workunit.client.0.vm00.stdout:2/476: mknod d4/dd/ca2 0
2026-03-10T12:37:55.842 INFO:tasks.workunit.client.1.vm07.stdout:1/448: dread d9/df/f11 [0,4194304] 0
2026-03-10T12:37:55.849 INFO:tasks.workunit.client.1.vm07.stdout:1/449: fsync d9/df/f4a 0
2026-03-10T12:37:55.860 INFO:tasks.workunit.client.1.vm07.stdout:1/450: fdatasync d9/df/f26 0
2026-03-10T12:37:55.883 INFO:tasks.workunit.client.0.vm00.stdout:2/477: sync
2026-03-10T12:37:55.884 INFO:tasks.workunit.client.0.vm00.stdout:2/478: creat d4/d6/d2d/d3a/d43/d85/fa3 x:0 0 0
2026-03-10T12:37:55.885 INFO:tasks.workunit.client.0.vm00.stdout:2/479: fdatasync d4/d6/d2d/d3a/f44 0
2026-03-10T12:37:55.907 INFO:tasks.workunit.client.0.vm00.stdout:7/357: dwrite da/d1b/f1e [0,4194304] 0
2026-03-10T12:37:55.912 INFO:tasks.workunit.client.1.vm07.stdout:0/514: dread d0/d14/d5f/d76/da1/fa2 [0,4194304] 0
2026-03-10T12:37:55.913 INFO:tasks.workunit.client.0.vm00.stdout:3/534: write dd/d18/d13/d1d/f42 [768226,34354] 0
2026-03-10T12:37:55.913 INFO:tasks.workunit.client.0.vm00.stdout:6/349: dwrite d2/da/dc/d2f/f4f [4194304,4194304] 0
2026-03-10T12:37:55.914 INFO:tasks.workunit.client.1.vm07.stdout:0/515: chown d0/d14/d5f/d76/d2f/d31/d4f/f92 62 1
2026-03-10T12:37:55.918 INFO:tasks.workunit.client.0.vm00.stdout:3/535: dwrite dd/d18/d13/d1d/d43/f95 [0,4194304] 0
2026-03-10T12:37:55.923 INFO:tasks.workunit.client.1.vm07.stdout:0/516: symlink d0/d14/d5f/d3b/lab 0
2026-03-10T12:37:55.954 INFO:tasks.workunit.client.1.vm07.stdout:0/517: write d0/d14/d5f/d76/f78 [2071868,115206] 0
2026-03-10T12:37:55.954 INFO:tasks.workunit.client.1.vm07.stdout:9/518: dwrite d5/d13/d22/f39 [0,4194304] 0
2026-03-10T12:37:55.954 INFO:tasks.workunit.client.1.vm07.stdout:8/480: dwrite d1/d3/f59 [0,4194304] 0
2026-03-10T12:37:55.954 INFO:tasks.workunit.client.1.vm07.stdout:6/437: write d1/d4/d6/d16/d1a/d2c/f59 [1576179,68748] 0
2026-03-10T12:37:55.954 INFO:tasks.workunit.client.1.vm07.stdout:6/438: read d1/d4/f11 [3102824,114714] 0
2026-03-10T12:37:55.954 INFO:tasks.workunit.client.1.vm07.stdout:0/518: truncate d0/d14/d5f/d76/f27 3244740 0
2026-03-10T12:37:55.955 INFO:tasks.workunit.client.0.vm00.stdout:3/536: symlink dd/d64/d92/lb3 0
2026-03-10T12:37:55.955 INFO:tasks.workunit.client.0.vm00.stdout:3/537: stat dd/d27/f56 0
2026-03-10T12:37:55.955 INFO:tasks.workunit.client.0.vm00.stdout:3/538: chown f7 223754 1
2026-03-10T12:37:55.955 INFO:tasks.workunit.client.0.vm00.stdout:3/539: fsync dd/d3d/f53 0
2026-03-10T12:37:55.955 INFO:tasks.workunit.client.0.vm00.stdout:3/540: read dd/d18/d13/d1d/d43/f95 [79825,49536] 0
2026-03-10T12:37:55.955 INFO:tasks.workunit.client.0.vm00.stdout:3/541: chown dd/d18/d13/d1d/d43/f95 161 1
2026-03-10T12:37:55.955 INFO:tasks.workunit.client.0.vm00.stdout:3/542: chown dd/d18/d13/d1d/f5b 390901 1
2026-03-10T12:37:55.955 INFO:tasks.workunit.client.0.vm00.stdout:0/471: dwrite d3/d33/f4d [0,4194304] 0
2026-03-10T12:37:55.955 INFO:tasks.workunit.client.0.vm00.stdout:0/472: chown d3/d7/d3c/f72 45 1
2026-03-10T12:37:55.955 INFO:tasks.workunit.client.0.vm00.stdout:3/543: dwrite dd/d18/d14/fa0 [0,4194304] 0
2026-03-10T12:37:55.955 INFO:tasks.workunit.client.0.vm00.stdout:3/544: truncate dd/d27/f91 209790 0
2026-03-10T12:37:55.955 INFO:tasks.workunit.client.0.vm00.stdout:6/350: creat d2/da/f82 x:0 0 0
2026-03-10T12:37:55.955 INFO:tasks.workunit.client.0.vm00.stdout:3/545: dwrite dd/d18/f83 [0,4194304] 0
2026-03-10T12:37:55.955 INFO:tasks.workunit.client.0.vm00.stdout:3/546: write dd/d18/d13/f22 [3667802,26753] 0
2026-03-10T12:37:55.955 INFO:tasks.workunit.client.0.vm00.stdout:3/547: read - dd/d2a/f9f zero size
2026-03-10T12:37:55.958 INFO:tasks.workunit.client.0.vm00.stdout:5/512: write d1f/d26/d2b/f5e [4471267,57022] 0
2026-03-10T12:37:55.963 INFO:tasks.workunit.client.1.vm07.stdout:7/448: read d0/f37 [1020916,22989] 0
2026-03-10T12:37:55.964 INFO:tasks.workunit.client.0.vm00.stdout:5/513: getdents d1f/d6a/d94 0
2026-03-10T12:37:55.971 INFO:tasks.workunit.client.1.vm07.stdout:9/519: creat d5/d1f/fb9 x:0 0 0
2026-03-10T12:37:55.977 INFO:tasks.workunit.client.1.vm07.stdout:8/481: symlink d1/d3/d6/d54/l9a 0
2026-03-10T12:37:55.986 INFO:tasks.workunit.client.0.vm00.stdout:6/351: mkdir d2/da/dc/d83 0
2026-03-10T12:37:55.986 INFO:tasks.workunit.client.1.vm07.stdout:9/520: creat d5/d13/d57/d4f/d6a/fba x:0 0 0
2026-03-10T12:37:55.988 INFO:tasks.workunit.client.0.vm00.stdout:6/352: creat d2/d16/d29/f84 x:0 0 0
2026-03-10T12:37:55.990 INFO:tasks.workunit.client.0.vm00.stdout:0/473: getdents d3 0
2026-03-10T12:37:55.998 INFO:tasks.workunit.client.1.vm07.stdout:6/439: getdents d1/d4/d44 0
2026-03-10T12:37:55.999 INFO:tasks.workunit.client.0.vm00.stdout:6/353: link d2/d16/d74/f6e d2/d39/f85 0
2026-03-10T12:37:56.006 INFO:tasks.workunit.client.1.vm07.stdout:6/440: link d1/d4/d6/d16/d1a/l4f d1/d4/d4a/l89 0
2026-03-10T12:37:56.009 INFO:tasks.workunit.client.0.vm00.stdout:6/354: link d2/d16/d74/c7e d2/da/dc/d83/c86 0
2026-03-10T12:37:56.009 INFO:tasks.workunit.client.1.vm07.stdout:6/441: chown d1/d4/d6/l23 1965 1
2026-03-10T12:37:56.020 INFO:tasks.workunit.client.1.vm07.stdout:6/442: dread d1/d4/f5a [0,4194304] 0
2026-03-10T12:37:56.046 INFO:tasks.workunit.client.1.vm07.stdout:6/443: rename d1/d4/d6/l1b to d1/d4/l8a 0
2026-03-10T12:37:56.046 INFO:tasks.workunit.client.1.vm07.stdout:6/444: fsync d1/d4/d6/d16/d1a/d33/f61 0
2026-03-10T12:37:56.046 INFO:tasks.workunit.client.1.vm07.stdout:6/445: rename d1/d4/d4a/f55 to d1/d4/d6/d4e/f8b 0
2026-03-10T12:37:56.046 INFO:tasks.workunit.client.1.vm07.stdout:6/446: symlink d1/d4/d44/l8c 0
2026-03-10T12:37:56.046 INFO:tasks.workunit.client.1.vm07.stdout:6/447: rename d1/d4/f3f to d1/d4/d6/f8d 0
2026-03-10T12:37:56.069 INFO:tasks.workunit.client.0.vm00.stdout:6/355: sync
2026-03-10T12:37:56.070 INFO:tasks.workunit.client.0.vm00.stdout:6/356: readlink d2/d16/d29/l5f 0
2026-03-10T12:37:56.087 INFO:tasks.workunit.client.0.vm00.stdout:6/357: dread d2/d16/d29/f64 [0,4194304] 0
2026-03-10T12:37:56.087 INFO:tasks.workunit.client.0.vm00.stdout:6/358: chown d2/da/f11 65778259 1
2026-03-10T12:37:56.091 INFO:tasks.workunit.client.0.vm00.stdout:6/359: getdents d2/d14/d7a 0
2026-03-10T12:37:56.095 INFO:tasks.workunit.client.0.vm00.stdout:6/360: symlink d2/da/dc/l87 0
2026-03-10T12:37:56.096 INFO:tasks.workunit.client.1.vm07.stdout:4/600: write d0/d4/d10/d3c/d2b/f60 [1961391,11655] 0
2026-03-10T12:37:56.099 INFO:tasks.workunit.client.0.vm00.stdout:6/361: mkdir d2/d16/d29/d31/d88 0
2026-03-10T12:37:56.107 INFO:tasks.workunit.client.0.vm00.stdout:9/536: rename d0/d3d/d43/d80 to d0/d3d/d59/d4e/dba 0
2026-03-10T12:37:56.107 INFO:tasks.workunit.client.0.vm00.stdout:1/501: rename da/d12/d91/la5 to da/d24/d5a/la9 0
2026-03-10T12:37:56.107 INFO:tasks.workunit.client.0.vm00.stdout:1/502: write da/d24/d28/d67/da2/f9c [618724,94770] 0
2026-03-10T12:37:56.107 INFO:tasks.workunit.client.0.vm00.stdout:1/503: dread - da/d21/d27/d6a/f6b zero size
2026-03-10T12:37:56.108 INFO:tasks.workunit.client.0.vm00.stdout:1/504: write da/d12/f99 [625338,2205] 0
2026-03-10T12:37:56.110 INFO:tasks.workunit.client.0.vm00.stdout:7/358: rename da/d1b/l21 to da/d1b/d40/l8a 0
2026-03-10T12:37:56.112 INFO:tasks.workunit.client.1.vm07.stdout:4/601: link d0/d4/lcf d0/d4/d7a/d46/d76/ld0 0
2026-03-10T12:37:56.114 INFO:tasks.workunit.client.1.vm07.stdout:4/602: chown d0/d4/d10/d3c/d2b/d54/laa 3892 1
2026-03-10T12:37:56.114 INFO:tasks.workunit.client.0.vm00.stdout:7/359: dwrite da/d25/d2e/d4c/f6e [0,4194304] 0
2026-03-10T12:37:56.117 INFO:tasks.workunit.client.0.vm00.stdout:5/514: rename d1f/d26/d2b/d37/f61 to d1f/d6a/d94/fb3 0
2026-03-10T12:37:56.124 INFO:tasks.workunit.client.0.vm00.stdout:5/515: mknod d1f/d26/cb4 0
2026-03-10T12:37:56.125 INFO:tasks.workunit.client.0.vm00.stdout:5/516: link d1f/l24 d1f/d26/d2b/d35/d53/d72/d9d/d8e/lb5 0
2026-03-10T12:37:56.125 INFO:tasks.workunit.client.0.vm00.stdout:5/517: write d1f/d26/d2b/f52 [926276,80491] 0
2026-03-10T12:37:56.125 INFO:tasks.workunit.client.0.vm00.stdout:7/360: mknod da/d26/c8b 0
2026-03-10T12:37:56.125 INFO:tasks.workunit.client.0.vm00.stdout:5/518: write d1f/d26/d2b/f7e [984207,72864] 0
2026-03-10T12:37:56.125 INFO:tasks.workunit.client.0.vm00.stdout:5/519: dread - d1f/d26/d6f/f9b zero size
2026-03-10T12:37:56.128 INFO:tasks.workunit.client.0.vm00.stdout:7/361: readlink da/l19 0
2026-03-10T12:37:56.128 INFO:tasks.workunit.client.0.vm00.stdout:5/520: creat d1f/d26/d2b/d35/d53/d72/d9d/d8e/fb6 x:0 0 0
2026-03-10T12:37:56.131 INFO:tasks.workunit.client.0.vm00.stdout:1/505: dread da/d21/d27/f6e [0,4194304] 0
2026-03-10T12:37:56.133 INFO:tasks.workunit.client.0.vm00.stdout:1/506: getdents da/d21/d39/d77 0
2026-03-10T12:37:56.134 INFO:tasks.workunit.client.0.vm00.stdout:3/548: rmdir dd/d3d/d84 39
2026-03-10T12:37:56.135 INFO:tasks.workunit.client.0.vm00.stdout:7/362: creat da/d3f/d71/f8c x:0 0 0
2026-03-10T12:37:56.137 INFO:tasks.workunit.client.0.vm00.stdout:3/549: chown dd/l1f 1 1
2026-03-10T12:37:56.138 INFO:tasks.workunit.client.0.vm00.stdout:1/507: creat da/d24/d28/faa x:0 0 0
2026-03-10T12:37:56.138 INFO:tasks.workunit.client.1.vm07.stdout:4/603: getdents d0/d5c/d7c 0
2026-03-10T12:37:56.139 INFO:tasks.workunit.client.0.vm00.stdout:3/550: mkdir dd/d2a/da2/db4 0
2026-03-10T12:37:56.141 INFO:tasks.workunit.client.0.vm00.stdout:7/363: symlink da/d26/d50/l8d 0
2026-03-10T12:37:56.142 INFO:tasks.workunit.client.0.vm00.stdout:3/551: link dd/d27/d2c/d34/d38/f48 dd/d64/fb5 0
2026-03-10T12:37:56.144 INFO:tasks.workunit.client.0.vm00.stdout:8/372: write d0/d12/d2d/f55 [481666,62694] 0
2026-03-10T12:37:56.145 INFO:tasks.workunit.client.0.vm00.stdout:3/552: readlink dd/d18/d13/d1d/l23 0
2026-03-10T12:37:56.146 INFO:tasks.workunit.client.0.vm00.stdout:7/364: write da/f17 [342135,67564] 0
2026-03-10T12:37:56.146 INFO:tasks.workunit.client.0.vm00.stdout:3/553: symlink dd/d64/d93/lb6 0
2026-03-10T12:37:56.147 INFO:tasks.workunit.client.0.vm00.stdout:8/373: link d0/dd/d38/f3d d0/d58/d68/f74 0
2026-03-10T12:37:56.148 INFO:tasks.workunit.client.1.vm07.stdout:4/604: dread d0/d4/d5/da/f15 [8388608,4194304] 0
2026-03-10T12:37:56.149 INFO:tasks.workunit.client.0.vm00.stdout:3/554: symlink dd/d27/d2c/lb7 0
2026-03-10T12:37:56.150 INFO:tasks.workunit.client.0.vm00.stdout:7/365: creat da/d41/f8e x:0 0 0
2026-03-10T12:37:56.150 INFO:tasks.workunit.client.0.vm00.stdout:3/555: creat dd/d64/d92/fb8 x:0 0 0
2026-03-10T12:37:56.151 INFO:tasks.workunit.client.0.vm00.stdout:3/556: creat dd/d64/fb9 x:0 0 0
2026-03-10T12:37:56.164 INFO:tasks.workunit.client.0.vm00.stdout:7/366: dread da/d26/f27 [0,4194304] 0
2026-03-10T12:37:56.164 INFO:tasks.workunit.client.0.vm00.stdout:7/367: write da/d41/f72 [690584,3503] 0
2026-03-10T12:37:56.165 INFO:tasks.workunit.client.0.vm00.stdout:7/368: chown da/d25/d2c/d82 878698815 1
2026-03-10T12:37:56.171 INFO:tasks.workunit.client.0.vm00.stdout:7/369: symlink da/d3f/l8f 0
2026-03-10T12:37:56.172 INFO:tasks.workunit.client.0.vm00.stdout:7/370: fdatasync da/d25/d2c/f4f 0
2026-03-10T12:37:56.178 INFO:tasks.workunit.client.1.vm07.stdout:0/519: dread d0/d14/f36 [0,4194304] 0
2026-03-10T12:37:56.194 INFO:tasks.workunit.client.0.vm00.stdout:7/371: symlink da/d25/d2c/l90 0
2026-03-10T12:37:56.194 INFO:tasks.workunit.client.0.vm00.stdout:7/372: rename da/l11 to da/d25/d2c/d82/l91 0
2026-03-10T12:37:56.194 INFO:tasks.workunit.client.1.vm07.stdout:9/521: dread d5/d16/f19 [0,4194304] 0
2026-03-10T12:37:56.194 INFO:tasks.workunit.client.1.vm07.stdout:0/520: symlink d0/d14/d5f/d76/d2f/d31/d79/d9e/lac 0
2026-03-10T12:37:56.194 INFO:tasks.workunit.client.1.vm07.stdout:9/522: symlink d5/d1f/d31/d64/lbb 0
2026-03-10T12:37:56.194 INFO:tasks.workunit.client.1.vm07.stdout:0/521: creat d0/d14/d7c/fad x:0 0 0
2026-03-10T12:37:56.194 INFO:tasks.workunit.client.1.vm07.stdout:9/523: write d5/d16/d23/fb2 [412987,65359] 0
2026-03-10T12:37:56.197 INFO:tasks.workunit.client.0.vm00.stdout:6/362: dread d2/d16/d74/f59 [0,4194304] 0
2026-03-10T12:37:56.202 INFO:tasks.workunit.client.1.vm07.stdout:2/388: sync
2026-03-10T12:37:56.202 INFO:tasks.workunit.client.1.vm07.stdout:3/508: sync
2026-03-10T12:37:56.207 INFO:tasks.workunit.client.1.vm07.stdout:9/524: creat d5/d1f/d75/fbc x:0 0 0
2026-03-10T12:37:56.207 INFO:tasks.workunit.client.1.vm07.stdout:0/522: fdatasync d0/d14/d5f/d3b/f5b 0
2026-03-10T12:37:56.208 INFO:tasks.workunit.client.0.vm00.stdout:6/363: truncate d2/d16/f41 641181 0
2026-03-10T12:37:56.219 INFO:tasks.workunit.client.0.vm00.stdout:2/480: dwrite f1 [0,4194304] 0
2026-03-10T12:37:56.219 INFO:tasks.workunit.client.1.vm07.stdout:0/523: mkdir d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dae 0
2026-03-10T12:37:56.221 INFO:tasks.workunit.client.0.vm00.stdout:2/481: symlink d4/d78/la4 0
2026-03-10T12:37:56.230 INFO:tasks.workunit.client.1.vm07.stdout:3/509: dread dc/dd/d43/f61 [0,4194304] 0
2026-03-10T12:37:56.253 INFO:tasks.workunit.client.1.vm07.stdout:3/510: creat dc/d18/d99/da3/fb1 x:0 0 0
2026-03-10T12:37:56.253 INFO:tasks.workunit.client.1.vm07.stdout:3/511: getdents dc/d18/d2d 0
2026-03-10T12:37:56.253 INFO:tasks.workunit.client.1.vm07.stdout:3/512: symlink dc/d18/lb2 0
2026-03-10T12:37:56.253 INFO:tasks.workunit.client.1.vm07.stdout:3/513: chown dc/dd/d43/d76/d95/da0/cae 1 1
2026-03-10T12:37:56.253 INFO:tasks.workunit.client.1.vm07.stdout:3/514: rename dc/dd/d1f/c3c to dc/dd/d43/d76/d95/cb3 0
2026-03-10T12:37:56.253 INFO:tasks.workunit.client.1.vm07.stdout:3/515: chown f1 488 1
2026-03-10T12:37:56.265 INFO:tasks.workunit.client.1.vm07.stdout:5/509: dwrite d0/d22/d18/d80/f8b [0,4194304] 0
2026-03-10T12:37:56.269 INFO:tasks.workunit.client.1.vm07.stdout:5/510: readlink d0/d22/d18/d19/d21/d54/l9d 0
2026-03-10T12:37:56.273 INFO:tasks.workunit.client.1.vm07.stdout:1/451: write d9/df/f11 [2770057,55282] 0
2026-03-10T12:37:56.282 INFO:tasks.workunit.client.1.vm07.stdout:1/452: dwrite d9/df/d29/f8b [0,4194304] 0
2026-03-10T12:37:56.292 INFO:tasks.workunit.client.0.vm00.stdout:9/537: dread d0/d3d/d59/d4e/dba/f24 [0,4194304] 0
2026-03-10T12:37:56.295 INFO:tasks.workunit.client.0.vm00.stdout:9/538: rename d0/d3d/d43/d53/d57/f8a to d0/d7f/db8/fbb 0
2026-03-10T12:37:56.306 INFO:tasks.workunit.client.1.vm07.stdout:5/511: truncate d0/d22/f27 1722096 0
2026-03-10T12:37:56.306 INFO:tasks.workunit.client.1.vm07.stdout:1/453: creat d9/df/f96 x:0 0 0
2026-03-10T12:37:56.307 INFO:tasks.workunit.client.1.vm07.stdout:1/454: creat d9/df/f97 x:0 0 0
2026-03-10T12:37:56.307 INFO:tasks.workunit.client.1.vm07.stdout:1/455: chown d9/df/d54 464 1
2026-03-10T12:37:56.307 INFO:tasks.workunit.client.0.vm00.stdout:9/539: dread d0/d3d/d59/d4e/dba/f24 [0,4194304] 0
2026-03-10T12:37:56.307 INFO:tasks.workunit.client.0.vm00.stdout:9/540: symlink d0/d5/dc/lbc 0
2026-03-10T12:37:56.307 INFO:tasks.workunit.client.0.vm00.stdout:9/541: link d0/d3d/d59/d4e/dba/f30 d0/d3d/d59/d4e/dba/d19/d50/fbd 0
2026-03-10T12:37:56.307 INFO:tasks.workunit.client.0.vm00.stdout:9/542: symlink d0/d3d/d43/lbe 0
2026-03-10T12:37:56.307 INFO:tasks.workunit.client.0.vm00.stdout:9/543: creat d0/d3d/d43/d53/d57/db0/fbf x:0 0 0
2026-03-10T12:37:56.307 INFO:tasks.workunit.client.0.vm00.stdout:9/544: dwrite d0/d3d/d59/d4e/dba/d1e/d2b/f5f [0,4194304] 0
2026-03-10T12:37:56.310 INFO:tasks.workunit.client.0.vm00.stdout:4/492: dread df/f3d [0,4194304] 0
2026-03-10T12:37:56.325 INFO:tasks.workunit.client.1.vm07.stdout:7/449: sync
2026-03-10T12:37:56.325 INFO:tasks.workunit.client.1.vm07.stdout:8/482: sync
2026-03-10T12:37:56.325 INFO:tasks.workunit.client.1.vm07.stdout:7/450: chown d0/c7d 1653516 1
2026-03-10T12:37:56.330 INFO:tasks.workunit.client.1.vm07.stdout:9/525: sync
2026-03-10T12:37:56.330 INFO:tasks.workunit.client.1.vm07.stdout:3/516: sync
2026-03-10T12:37:56.330 INFO:tasks.workunit.client.1.vm07.stdout:7/451: mkdir d0/d57/d62/d90 0
2026-03-10T12:37:56.338 INFO:tasks.workunit.client.1.vm07.stdout:7/452: dwrite d0/f13 [0,4194304] 0
2026-03-10T12:37:56.347 INFO:tasks.workunit.client.1.vm07.stdout:8/483: link d1/d3/f59 d1/d3/d6c/f9b 0
2026-03-10T12:37:56.354 INFO:tasks.workunit.client.0.vm00.stdout:0/474: dwrite d3/d7/f15 [0,4194304] 0
2026-03-10T12:37:56.369 INFO:tasks.workunit.client.0.vm00.stdout:0/475: dread d3/d22/f42 [0,4194304] 0
2026-03-10T12:37:56.372 INFO:tasks.workunit.client.0.vm00.stdout:0/476: dwrite d3/db/d77/f9e [0,4194304] 0
2026-03-10T12:37:56.384 INFO:tasks.workunit.client.1.vm07.stdout:3/517: rename dc/d18/la6 to dc/dd/d1f/dac/lb4 0
2026-03-10T12:37:56.402 INFO:tasks.workunit.client.1.vm07.stdout:6/448: write d1/d4/f3b [282687,18525] 0
2026-03-10T12:37:56.414 INFO:tasks.workunit.client.1.vm07.stdout:7/453: mkdir d0/d57/d62/d90/d91 0
2026-03-10T12:37:56.439 INFO:tasks.workunit.client.1.vm07.stdout:7/454: chown d0/d61 270176155 1
2026-03-10T12:37:56.439 INFO:tasks.workunit.client.1.vm07.stdout:4/605: write d0/d8e/fb5 [69620,61475] 0
2026-03-10T12:37:56.442 INFO:tasks.workunit.client.0.vm00.stdout:0/477: rename d3/d7/d3c/c6b to d3/d7/d3c/ca3 0
2026-03-10T12:37:56.443 INFO:tasks.workunit.client.1.vm07.stdout:3/518: rename dc/d18/d2d/d3d to dc/dd/db5 0
2026-03-10T12:37:56.444 INFO:tasks.workunit.client.1.vm07.stdout:7/455: mknod d0/d67/d6f/d80/c92 0
2026-03-10T12:37:56.445 INFO:tasks.workunit.client.1.vm07.stdout:3/519: read - dc/dd/f9a zero size
2026-03-10T12:37:56.446 INFO:tasks.workunit.client.1.vm07.stdout:7/456: creat d0/d61/f93 x:0 0 0
2026-03-10T12:37:56.475 INFO:tasks.workunit.client.1.vm07.stdout:7/457: symlink d0/d61/l94 0
2026-03-10T12:37:56.475 INFO:tasks.workunit.client.1.vm07.stdout:3/520: link dc/dd/d28/d7a/f88 dc/dd/d43/d76/d95/fb6 0
2026-03-10T12:37:56.475 INFO:tasks.workunit.client.1.vm07.stdout:7/458: creat d0/d61/d79/f95 x:0 0 0
2026-03-10T12:37:56.475 INFO:tasks.workunit.client.1.vm07.stdout:3/521: readlink dc/dd/lad 0
2026-03-10T12:37:56.475 INFO:tasks.workunit.client.1.vm07.stdout:7/459: rename d0/d61/d79/l88 to d0/d47/d48/d8a/l96 0
2026-03-10T12:37:56.476 INFO:tasks.workunit.client.1.vm07.stdout:3/522: creat dc/dd/fb7 x:0 0 0
2026-03-10T12:37:56.476 INFO:tasks.workunit.client.1.vm07.stdout:3/523: dwrite dc/dd/d43/d76/d95/da0/fa2 [0,4194304] 0
2026-03-10T12:37:56.527 INFO:tasks.workunit.client.0.vm00.stdout:5/521: rmdir d1f/d26/d2b/d35/d53 39
2026-03-10T12:37:56.531 INFO:tasks.workunit.client.0.vm00.stdout:3/557: symlink dd/d18/lba 0
2026-03-10T12:37:56.533 INFO:tasks.workunit.client.0.vm00.stdout:5/522: mkdir d1f/d26/d2b/d35/d53/d72/da3/db7 0
2026-03-10T12:37:56.536 INFO:tasks.workunit.client.0.vm00.stdout:3/558: mknod dd/d64/cbb 0
2026-03-10T12:37:56.537 INFO:tasks.workunit.client.0.vm00.stdout:5/523: creat d1f/d26/d2e/fb8 x:0 0 0
2026-03-10T12:37:56.542 INFO:tasks.workunit.client.0.vm00.stdout:5/524: dwrite d1f/d26/d2b/d35/d53/d72/d9d/d90/fae [0,4194304] 0
2026-03-10T12:37:56.549 INFO:tasks.workunit.client.0.vm00.stdout:6/364: dwrite d2/d16/d74/f5a [0,4194304] 0
2026-03-10T12:37:56.550 INFO:tasks.workunit.client.1.vm07.stdout:2/389: truncate d0/d42/d26/d38/d4f/f65 1957907 0
2026-03-10T12:37:56.550 INFO:tasks.workunit.client.1.vm07.stdout:0/524: write d0/d14/d5f/d76/d2f/d31/d4f/f5c [2144290,15700] 0
2026-03-10T12:37:56.550 INFO:tasks.workunit.client.1.vm07.stdout:0/525: chown d0/d83 16771014 1
2026-03-10T12:37:56.552 INFO:tasks.workunit.client.1.vm07.stdout:0/526: chown d0/d14/d5f/d41/c71 12 1
2026-03-10T12:37:56.557 INFO:tasks.workunit.client.0.vm00.stdout:1/508: write da/d12/f66 [1194353,7971] 0
2026-03-10T12:37:56.562 INFO:tasks.workunit.client.0.vm00.stdout:1/509: chown da/d21/d27/d6a/l84 52975 1
2026-03-10T12:37:56.562 INFO:tasks.workunit.client.1.vm07.stdout:2/390: fdatasync d0/f44 0
2026-03-10T12:37:56.563 INFO:tasks.workunit.client.1.vm07.stdout:0/527: truncate d0/d14/d5f/d76/d2f/d31/d4f/d60/f89 319073 0
2026-03-10T12:37:56.569 INFO:tasks.workunit.client.0.vm00.stdout:8/374: truncate d0/d12/d2d/f44 3295584 0
2026-03-10T12:37:56.572 INFO:tasks.workunit.client.1.vm07.stdout:0/528: readlink d0/d14/d5f/d76/d2f/d31/d4f/l50 0
2026-03-10T12:37:56.592 INFO:tasks.workunit.client.0.vm00.stdout:3/559: creat dd/d2a/fbc x:0 0 0
2026-03-10T12:37:56.603 INFO:tasks.workunit.client.1.vm07.stdout:0/529: dread d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/f6e [0,4194304] 0
2026-03-10T12:37:56.603 INFO:tasks.workunit.client.1.vm07.stdout:0/530: dread - d0/d14/d5f/d3b/f5b zero size
2026-03-10T12:37:56.605 INFO:tasks.workunit.client.1.vm07.stdout:0/531: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/faf x:0 0 0
2026-03-10T12:37:56.635 INFO:tasks.workunit.client.1.vm07.stdout:0/532: rmdir d0/d14/d5f/d76/da0 0
2026-03-10T12:37:56.640 INFO:tasks.workunit.client.0.vm00.stdout:2/482: truncate d4/d6/d2d/d3a/f74 977489 0
2026-03-10T12:37:56.642 INFO:tasks.workunit.client.0.vm00.stdout:2/483: truncate d4/d6/d2d/d3a/f44 221822 0
2026-03-10T12:37:56.644 INFO:tasks.workunit.client.0.vm00.stdout:2/484: symlink d4/d78/la5 0
2026-03-10T12:37:56.648 INFO:tasks.workunit.client.0.vm00.stdout:2/485: dwrite d4/d53/d9e/f6f [0,4194304] 0
2026-03-10T12:37:56.649 INFO:tasks.workunit.client.0.vm00.stdout:2/486: readlink d4/d6/d41/l54 0
2026-03-10T12:37:56.650 INFO:tasks.workunit.client.0.vm00.stdout:2/487: fsync d4/d53/d68/f69 0
2026-03-10T12:37:56.651 INFO:tasks.workunit.client.0.vm00.stdout:2/488: readlink d4/d6/d41/l82 0
2026-03-10T12:37:56.653 INFO:tasks.workunit.client.0.vm00.stdout:2/489: creat d4/d6/d2d/d3a/fa6 x:0 0 0
2026-03-10T12:37:56.653 INFO:tasks.workunit.client.0.vm00.stdout:2/490: dread - d4/d6/d2d/d3a/fa6 zero size
2026-03-10T12:37:56.657 INFO:tasks.workunit.client.0.vm00.stdout:2/491: dwrite d4/d6/d2d/d3a/f7c [0,4194304] 0
2026-03-10T12:37:56.660 INFO:tasks.workunit.client.0.vm00.stdout:2/492: mkdir d4/dd/da7 0
2026-03-10T12:37:56.663
INFO:tasks.workunit.client.0.vm00.stdout:1/510: dread da/f13 [4194304,4194304] 0 2026-03-10T12:37:56.665 INFO:tasks.workunit.client.0.vm00.stdout:1/511: mkdir da/d24/d28/d44/d5d/dab 0 2026-03-10T12:37:56.669 INFO:tasks.workunit.client.1.vm07.stdout:0/533: dread d0/d14/d5f/d76/d2f/d31/d4f/d60/f75 [0,4194304] 0 2026-03-10T12:37:56.671 INFO:tasks.workunit.client.1.vm07.stdout:0/534: unlink d0/d14/d5f/d3b/f46 0 2026-03-10T12:37:56.672 INFO:tasks.workunit.client.1.vm07.stdout:0/535: write d0/d14/d7c/fad [570835,42555] 0 2026-03-10T12:37:56.674 INFO:tasks.workunit.client.1.vm07.stdout:0/536: fdatasync d0/f1c 0 2026-03-10T12:37:56.675 INFO:tasks.workunit.client.1.vm07.stdout:0/537: chown d0/d14/d5f/d41 12444543 1 2026-03-10T12:37:56.677 INFO:tasks.workunit.client.1.vm07.stdout:0/538: symlink d0/d14/d5f/d76/d2f/d31/d4f/da8/lb0 0 2026-03-10T12:37:56.681 INFO:tasks.workunit.client.1.vm07.stdout:0/539: dwrite d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/fa4 [0,4194304] 0 2026-03-10T12:37:56.722 INFO:tasks.workunit.client.1.vm07.stdout:0/540: chown d0/d14/d5f/d76/f8a 549764 1 2026-03-10T12:37:56.722 INFO:tasks.workunit.client.1.vm07.stdout:0/541: getdents d0/d14/d5f/d76/d2f/d31/d79/d85 0 2026-03-10T12:37:56.722 INFO:tasks.workunit.client.1.vm07.stdout:0/542: chown d0/d14/d5f/d41/d6a/l95 2 1 2026-03-10T12:37:56.722 INFO:tasks.workunit.client.1.vm07.stdout:0/543: write d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/f9b [3477867,51689] 0 2026-03-10T12:37:56.722 INFO:tasks.workunit.client.1.vm07.stdout:0/544: creat d0/d14/d5f/d76/d2f/d31/d79/d9e/fb1 x:0 0 0 2026-03-10T12:37:56.722 INFO:tasks.workunit.client.1.vm07.stdout:5/512: write d0/d22/d18/d19/d21/d54/f7d [1669230,63179] 0 2026-03-10T12:37:56.722 INFO:tasks.workunit.client.1.vm07.stdout:1/456: write d9/f1f [1095117,20420] 0 2026-03-10T12:37:56.722 INFO:tasks.workunit.client.1.vm07.stdout:5/513: dwrite d0/d22/d18/d19/d21/d54/faf [0,4194304] 0 2026-03-10T12:37:56.724 INFO:tasks.workunit.client.1.vm07.stdout:5/514: mknod 
d0/d22/d18/d19/d21/d3a/cb0 0 2026-03-10T12:37:56.731 INFO:tasks.workunit.client.1.vm07.stdout:5/515: link d0/d22/d18/d19/d21/d3a/l43 d0/d22/d18/d19/d72/lb1 0 2026-03-10T12:37:56.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:56 vm00.local ceph-mon[50686]: pgmap v163: 65 pgs: 65 active+clean; 2.0 GiB data, 7.1 GiB used, 113 GiB / 120 GiB avail; 43 MiB/s rd, 143 MiB/s wr, 259 op/s 2026-03-10T12:37:56.734 INFO:tasks.workunit.client.1.vm07.stdout:5/516: symlink d0/d22/d18/d19/d36/d75/d77/lb2 0 2026-03-10T12:37:56.734 INFO:tasks.workunit.client.1.vm07.stdout:5/517: readlink d0/d22/d18/d3e/l58 0 2026-03-10T12:37:56.737 INFO:tasks.workunit.client.1.vm07.stdout:5/518: creat d0/d22/d18/d19/d2e/d3f/fb3 x:0 0 0 2026-03-10T12:37:56.738 INFO:tasks.workunit.client.1.vm07.stdout:5/519: chown d0/d22/d18/d19/d36/l51 53 1 2026-03-10T12:37:56.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:56 vm07.local ceph-mon[58582]: pgmap v163: 65 pgs: 65 active+clean; 2.0 GiB data, 7.1 GiB used, 113 GiB / 120 GiB avail; 43 MiB/s rd, 143 MiB/s wr, 259 op/s 2026-03-10T12:37:56.840 INFO:tasks.workunit.client.0.vm00.stdout:4/493: truncate df/d1f/d36/d3a/f6e 7021862 0 2026-03-10T12:37:56.842 INFO:tasks.workunit.client.0.vm00.stdout:3/560: truncate dd/d18/d13/d1d/f69 2476131 0 2026-03-10T12:37:56.868 INFO:tasks.workunit.client.0.vm00.stdout:9/545: rename d0/d3d/d59/d4e/dba/c51 to d0/d3d/d59/d4e/dba/d19/d50/cc0 0 2026-03-10T12:37:56.868 INFO:tasks.workunit.client.0.vm00.stdout:9/546: unlink d0/d3d/d59/d4e/dba/d1e/f60 0 2026-03-10T12:37:56.868 INFO:tasks.workunit.client.0.vm00.stdout:9/547: chown d0/d3d/d59/d4e/dba/d1e/d27/f9e 0 1 2026-03-10T12:37:56.868 INFO:tasks.workunit.client.0.vm00.stdout:5/525: rename d1f/f32 to d1f/d26/d2b/d35/d78/d7f/fb9 0 2026-03-10T12:37:56.868 INFO:tasks.workunit.client.0.vm00.stdout:9/548: symlink d0/d9b/lc1 0 2026-03-10T12:37:56.868 INFO:tasks.workunit.client.0.vm00.stdout:1/512: rename da/d24/d28/d44/d59/da6/d8b/fa1 to da/d24/d28/d44/d5d/d72/d7e/fac 
0 2026-03-10T12:37:56.868 INFO:tasks.workunit.client.0.vm00.stdout:5/526: creat d1f/d26/d2e/fba x:0 0 0 2026-03-10T12:37:56.868 INFO:tasks.workunit.client.0.vm00.stdout:4/494: rename df/d63/d6b to df/d1f/d22/d26/dab 0 2026-03-10T12:37:56.868 INFO:tasks.workunit.client.0.vm00.stdout:4/495: link df/f4e df/fac 0 2026-03-10T12:37:56.868 INFO:tasks.workunit.client.0.vm00.stdout:4/496: creat df/d1f/d22/d26/d65/d91/fad x:0 0 0 2026-03-10T12:37:56.869 INFO:tasks.workunit.client.0.vm00.stdout:9/549: dread d0/d3d/d59/d4e/dba/d1e/d27/f75 [0,4194304] 0 2026-03-10T12:37:56.872 INFO:tasks.workunit.client.0.vm00.stdout:9/550: rename d0/d3d/d59/d4e/da3 to d0/dc2 0 2026-03-10T12:37:56.873 INFO:tasks.workunit.client.0.vm00.stdout:9/551: symlink d0/d3d/d59/d4e/dba/d1e/d85/lc3 0 2026-03-10T12:37:56.876 INFO:tasks.workunit.client.0.vm00.stdout:3/561: dread dd/d18/d14/d2b/f31 [0,4194304] 0 2026-03-10T12:37:56.877 INFO:tasks.workunit.client.0.vm00.stdout:9/552: rename d0/d3d/d43/d53/d57 to d0/d7f/db8/dc4 0 2026-03-10T12:37:56.882 INFO:tasks.workunit.client.0.vm00.stdout:3/562: symlink dd/d27/lbd 0 2026-03-10T12:37:56.902 INFO:tasks.workunit.client.0.vm00.stdout:3/563: creat dd/d18/d14/fbe x:0 0 0 2026-03-10T12:37:56.902 INFO:tasks.workunit.client.0.vm00.stdout:3/564: fdatasync dd/d27/d2c/fb1 0 2026-03-10T12:37:56.934 INFO:tasks.workunit.client.1.vm07.stdout:9/526: write d5/d13/f14 [2524760,39473] 0 2026-03-10T12:37:56.939 INFO:tasks.workunit.client.0.vm00.stdout:8/375: write d0/d58/d68/f74 [691979,50077] 0 2026-03-10T12:37:56.942 INFO:tasks.workunit.client.1.vm07.stdout:7/460: getdents d0/d57/d62/d90 0 2026-03-10T12:37:56.944 INFO:tasks.workunit.client.1.vm07.stdout:8/484: write d1/d3/d40/f4c [31944,18071] 0 2026-03-10T12:37:56.946 INFO:tasks.workunit.client.1.vm07.stdout:6/449: write d1/d4/d6/d46/d4d/f22 [56661,35184] 0 2026-03-10T12:37:56.948 INFO:tasks.workunit.client.1.vm07.stdout:4/606: write d0/d4/d7a/f27 [4798978,47534] 0 2026-03-10T12:37:56.954 
INFO:tasks.workunit.client.0.vm00.stdout:0/478: write d3/d40/d65/f92 [334312,68878] 0 2026-03-10T12:37:56.956 INFO:tasks.workunit.client.0.vm00.stdout:8/376: dread d0/f11 [0,4194304] 0 2026-03-10T12:37:56.959 INFO:tasks.workunit.client.0.vm00.stdout:8/377: dwrite d0/dd/d38/f3d [0,4194304] 0 2026-03-10T12:37:56.968 INFO:tasks.workunit.client.0.vm00.stdout:8/378: rename d0/d12/d36/f72 to d0/d12/d2d/f75 0 2026-03-10T12:37:56.977 INFO:tasks.workunit.client.0.vm00.stdout:8/379: link d0/d12/d2d/d49/l5a d0/d12/d36/d5b/l76 0 2026-03-10T12:37:56.978 INFO:tasks.workunit.client.1.vm07.stdout:3/524: dwrite dc/dd/f16 [0,4194304] 0 2026-03-10T12:37:56.978 INFO:tasks.workunit.client.0.vm00.stdout:5/527: dread f19 [0,4194304] 0 2026-03-10T12:37:56.979 INFO:tasks.workunit.client.0.vm00.stdout:2/493: write d4/d53/d9e/f60 [1263530,130077] 0 2026-03-10T12:37:56.982 INFO:tasks.workunit.client.0.vm00.stdout:8/380: mknod d0/d12/d2d/d49/c77 0 2026-03-10T12:37:56.991 INFO:tasks.workunit.client.1.vm07.stdout:6/450: link d1/d4/f3b d1/d4/d6/d16/d1a/f8e 0 2026-03-10T12:37:56.991 INFO:tasks.workunit.client.1.vm07.stdout:3/525: rmdir dc/dd/d28/d7a 39 2026-03-10T12:37:56.993 INFO:tasks.workunit.client.0.vm00.stdout:2/494: unlink d4/d6/f75 0 2026-03-10T12:37:56.995 INFO:tasks.workunit.client.0.vm00.stdout:8/381: write d0/d12/d2d/f52 [299803,9192] 0 2026-03-10T12:37:56.995 INFO:tasks.workunit.client.1.vm07.stdout:3/526: dwrite dc/d18/d99/da3/fb1 [0,4194304] 0 2026-03-10T12:37:56.998 INFO:tasks.workunit.client.0.vm00.stdout:2/495: read d4/d6/f2e [482927,3408] 0 2026-03-10T12:37:56.998 INFO:tasks.workunit.client.0.vm00.stdout:7/373: link da/f13 da/d25/d2e/d4c/f92 0 2026-03-10T12:37:57.000 INFO:tasks.workunit.client.0.vm00.stdout:5/528: truncate d1f/d26/d2e/f3c 2604791 0 2026-03-10T12:37:57.009 INFO:tasks.workunit.client.0.vm00.stdout:6/365: write d2/d16/d74/f62 [4372149,68535] 0 2026-03-10T12:37:57.009 INFO:tasks.workunit.client.1.vm07.stdout:6/451: mknod d1/d4/d6/d16/d49/c8f 0 
2026-03-10T12:37:57.016 INFO:tasks.workunit.client.0.vm00.stdout:5/529: mkdir d1f/d26/d2b/d35/d53/d5b/dbb 0 2026-03-10T12:37:57.020 INFO:tasks.workunit.client.0.vm00.stdout:5/530: dwrite d1f/f97 [0,4194304] 0 2026-03-10T12:37:57.023 INFO:tasks.workunit.client.1.vm07.stdout:2/391: truncate d0/f13 2329820 0 2026-03-10T12:37:57.023 INFO:tasks.workunit.client.0.vm00.stdout:1/513: truncate da/d24/d28/d44/f83 2883738 0 2026-03-10T12:37:57.024 INFO:tasks.workunit.client.0.vm00.stdout:1/514: write f3 [659446,127605] 0 2026-03-10T12:37:57.025 INFO:tasks.workunit.client.0.vm00.stdout:1/515: dread - da/d21/d39/f8c zero size 2026-03-10T12:37:57.028 INFO:tasks.workunit.client.0.vm00.stdout:5/531: symlink d1f/d26/d2b/d35/d53/d72/lbc 0 2026-03-10T12:37:57.032 INFO:tasks.workunit.client.1.vm07.stdout:3/527: mkdir dc/dd/d43/d76/d95/db8 0 2026-03-10T12:37:57.032 INFO:tasks.workunit.client.0.vm00.stdout:2/496: getdents d4/d6/d41/d6d/d40 0 2026-03-10T12:37:57.034 INFO:tasks.workunit.client.0.vm00.stdout:1/516: creat da/d24/d28/d44/fad x:0 0 0 2026-03-10T12:37:57.035 INFO:tasks.workunit.client.0.vm00.stdout:1/517: chown da/d24/d28/d67/da2/d78/f86 1311 1 2026-03-10T12:37:57.038 INFO:tasks.workunit.client.0.vm00.stdout:4/497: write fb [3050609,43705] 0 2026-03-10T12:37:57.039 INFO:tasks.workunit.client.0.vm00.stdout:7/374: dread da/d41/f72 [0,4194304] 0 2026-03-10T12:37:57.040 INFO:tasks.workunit.client.0.vm00.stdout:2/497: mknod d4/d53/d9e/ca8 0 2026-03-10T12:37:57.043 INFO:tasks.workunit.client.0.vm00.stdout:9/553: rmdir d0 39 2026-03-10T12:37:57.046 INFO:tasks.workunit.client.0.vm00.stdout:2/498: write d4/dd/f17 [2642238,29804] 0 2026-03-10T12:37:57.048 INFO:tasks.workunit.client.0.vm00.stdout:1/518: rename da/d12/c1c to da/d21/d27/d6a/d94/cae 0 2026-03-10T12:37:57.056 INFO:tasks.workunit.client.1.vm07.stdout:6/452: dread - d1/d4/d6/d16/d1a/d33/f74 zero size 2026-03-10T12:37:57.056 INFO:tasks.workunit.client.0.vm00.stdout:3/565: dwrite dd/d18/d13/d1d/d43/fa7 [4194304,4194304] 0 
2026-03-10T12:37:57.064 INFO:tasks.workunit.client.0.vm00.stdout:3/566: fsync dd/d18/d13/d1d/f5b 0 2026-03-10T12:37:57.069 INFO:tasks.workunit.client.0.vm00.stdout:1/519: symlink da/d24/d28/d44/d59/da6/da4/laf 0 2026-03-10T12:37:57.073 INFO:tasks.workunit.client.0.vm00.stdout:9/554: dread d0/d3d/d59/d4e/dba/f39 [0,4194304] 0 2026-03-10T12:37:57.078 INFO:tasks.workunit.client.0.vm00.stdout:3/567: unlink dd/d2a/f9f 0 2026-03-10T12:37:57.078 INFO:tasks.workunit.client.0.vm00.stdout:3/568: chown dd/d3d/d65/fad 792305805 1 2026-03-10T12:37:57.078 INFO:tasks.workunit.client.0.vm00.stdout:3/569: dread dd/d18/d13/d1d/d43/f95 [0,4194304] 0 2026-03-10T12:37:57.079 INFO:tasks.workunit.client.0.vm00.stdout:3/570: fdatasync dd/d27/d2c/d34/d45/f75 0 2026-03-10T12:37:57.079 INFO:tasks.workunit.client.0.vm00.stdout:3/571: chown dd/d64/l3f 51604 1 2026-03-10T12:37:57.080 INFO:tasks.workunit.client.0.vm00.stdout:4/498: getdents df/d63/d94 0 2026-03-10T12:37:57.082 INFO:tasks.workunit.client.0.vm00.stdout:1/520: readlink da/l16 0 2026-03-10T12:37:57.082 INFO:tasks.workunit.client.0.vm00.stdout:8/382: write d0/dd/f4d [862424,101959] 0 2026-03-10T12:37:57.085 INFO:tasks.workunit.client.0.vm00.stdout:9/555: symlink d0/d3d/d59/d4e/dba/d1e/d27/lc5 0 2026-03-10T12:37:57.092 INFO:tasks.workunit.client.0.vm00.stdout:4/499: unlink df/c48 0 2026-03-10T12:37:57.093 INFO:tasks.workunit.client.0.vm00.stdout:1/521: mkdir da/d24/d28/d67/db0 0 2026-03-10T12:37:57.096 INFO:tasks.workunit.client.0.vm00.stdout:4/500: dread - df/d1f/d22/d26/dab/d73/f7a zero size 2026-03-10T12:37:57.106 INFO:tasks.workunit.client.1.vm07.stdout:2/392: getdents d0/d42/d26/d38/d4f/d5d 0 2026-03-10T12:37:57.111 INFO:tasks.workunit.client.0.vm00.stdout:7/375: getdents da/d26/d37 0 2026-03-10T12:37:57.112 INFO:tasks.workunit.client.0.vm00.stdout:9/556: truncate d0/d3d/d59/d4e/dba/d19/f1b 3011856 0 2026-03-10T12:37:57.113 INFO:tasks.workunit.client.0.vm00.stdout:9/557: write d0/d3d/d59/d4e/dba/d1e/d85/d98/fab [397891,95533] 0 
2026-03-10T12:37:57.116 INFO:tasks.workunit.client.0.vm00.stdout:1/522: creat da/d24/d28/fb1 x:0 0 0 2026-03-10T12:37:57.119 INFO:tasks.workunit.client.0.vm00.stdout:5/532: truncate d1f/d26/d2b/d35/d53/d72/d9d/d90/fae 1001184 0 2026-03-10T12:37:57.119 INFO:tasks.workunit.client.0.vm00.stdout:5/533: chown d1f/d26/f28 14 1 2026-03-10T12:37:57.127 INFO:tasks.workunit.client.1.vm07.stdout:1/457: write d9/df/d29/f70 [249086,91021] 0 2026-03-10T12:37:57.132 INFO:tasks.workunit.client.1.vm07.stdout:1/458: creat d9/d2d/d4f/f98 x:0 0 0 2026-03-10T12:37:57.134 INFO:tasks.workunit.client.1.vm07.stdout:1/459: stat d9/df/d29 0 2026-03-10T12:37:57.140 INFO:tasks.workunit.client.0.vm00.stdout:5/534: mkdir d1f/d96/dbd 0 2026-03-10T12:37:57.142 INFO:tasks.workunit.client.0.vm00.stdout:2/499: dread d4/d53/f61 [0,4194304] 0 2026-03-10T12:37:57.143 INFO:tasks.workunit.client.1.vm07.stdout:1/460: dwrite d9/df/d29/d2b/d3d/f85 [0,4194304] 0 2026-03-10T12:37:57.145 INFO:tasks.workunit.client.0.vm00.stdout:2/500: dwrite d4/d53/d68/f8a [0,4194304] 0 2026-03-10T12:37:57.146 INFO:tasks.workunit.client.0.vm00.stdout:2/501: write d4/d6/d2d/d3a/d43/fa1 [360396,27881] 0 2026-03-10T12:37:57.150 INFO:tasks.workunit.client.0.vm00.stdout:2/502: dread d4/d53/d76/f8b [0,4194304] 0 2026-03-10T12:37:57.151 INFO:tasks.workunit.client.1.vm07.stdout:1/461: dread d9/df/d29/f70 [0,4194304] 0 2026-03-10T12:37:57.152 INFO:tasks.workunit.client.0.vm00.stdout:9/558: link d0/d3d/d59/d4e/dba/d19/fb6 d0/d7f/db8/fc6 0 2026-03-10T12:37:57.156 INFO:tasks.workunit.client.1.vm07.stdout:1/462: read - d9/d2d/d4f/f5e zero size 2026-03-10T12:37:57.157 INFO:tasks.workunit.client.1.vm07.stdout:1/463: chown d9/df/d29/d6b 36806 1 2026-03-10T12:37:57.162 INFO:tasks.workunit.client.1.vm07.stdout:1/464: mknod d9/df/d29/d2b/d31/c99 0 2026-03-10T12:37:57.164 INFO:tasks.workunit.client.1.vm07.stdout:1/465: fdatasync d9/df/d29/d2b/d31/d91/d59/f84 0 2026-03-10T12:37:57.164 INFO:tasks.workunit.client.0.vm00.stdout:9/559: creat 
d0/d3d/d59/d4e/dba/d1e/d2b/fc7 x:0 0 0 2026-03-10T12:37:57.165 INFO:tasks.workunit.client.0.vm00.stdout:9/560: dread - d0/d3d/d59/d4e/dba/d1e/d85/d98/fa7 zero size 2026-03-10T12:37:57.166 INFO:tasks.workunit.client.0.vm00.stdout:9/561: truncate d0/d3d/d59/d4e/dba/fa1 887533 0 2026-03-10T12:37:57.168 INFO:tasks.workunit.client.1.vm07.stdout:1/466: mknod d9/df/d29/d2b/d92/c9a 0 2026-03-10T12:37:57.173 INFO:tasks.workunit.client.1.vm07.stdout:1/467: dread - d9/df/d29/f82 zero size 2026-03-10T12:37:57.173 INFO:tasks.workunit.client.1.vm07.stdout:1/468: write d9/df/f11 [4113636,42592] 0 2026-03-10T12:37:57.173 INFO:tasks.workunit.client.0.vm00.stdout:9/562: dwrite d0/d3d/d59/fad [0,4194304] 0 2026-03-10T12:37:57.173 INFO:tasks.workunit.client.0.vm00.stdout:9/563: fdatasync d0/d7f/db8/dc4/f67 0 2026-03-10T12:37:57.173 INFO:tasks.workunit.client.0.vm00.stdout:9/564: readlink d0/d9b/lc1 0 2026-03-10T12:37:57.177 INFO:tasks.workunit.client.1.vm07.stdout:1/469: dread d9/df/d29/f8b [0,4194304] 0 2026-03-10T12:37:57.181 INFO:tasks.workunit.client.0.vm00.stdout:5/535: mknod d1f/d26/d2e/d58/d6b/d86/cbe 0 2026-03-10T12:37:57.185 INFO:tasks.workunit.client.0.vm00.stdout:7/376: getdents da/d25/d2c 0 2026-03-10T12:37:57.190 INFO:tasks.workunit.client.0.vm00.stdout:2/503: unlink d4/dd/c26 0 2026-03-10T12:37:57.194 INFO:tasks.workunit.client.0.vm00.stdout:2/504: read d4/d6/d41/f4c [797990,13120] 0 2026-03-10T12:37:57.195 INFO:tasks.workunit.client.0.vm00.stdout:2/505: stat d4/d6/c2a 0 2026-03-10T12:37:57.195 INFO:tasks.workunit.client.0.vm00.stdout:2/506: fsync d4/d6/d2d/d3a/d43/d85/fa3 0 2026-03-10T12:37:57.197 INFO:tasks.workunit.client.1.vm07.stdout:1/470: link d9/l69 d9/df/d29/d6b/l9b 0 2026-03-10T12:37:57.198 INFO:tasks.workunit.client.1.vm07.stdout:9/527: write d5/d16/d23/d26/f42 [1734148,99774] 0 2026-03-10T12:37:57.201 INFO:tasks.workunit.client.0.vm00.stdout:8/383: write d0/d12/d17/d48/f4c [966727,54751] 0 2026-03-10T12:37:57.206 
INFO:tasks.workunit.client.0.vm00.stdout:8/384: dwrite d0/d58/d68/f74 [0,4194304] 0 2026-03-10T12:37:57.206 INFO:tasks.workunit.client.1.vm07.stdout:1/471: fsync d9/df/d29/f49 0 2026-03-10T12:37:57.206 INFO:tasks.workunit.client.1.vm07.stdout:1/472: truncate d9/df/d29/d2b/d31/d91/f7f 5142149 0 2026-03-10T12:37:57.206 INFO:tasks.workunit.client.1.vm07.stdout:1/473: write d9/f61 [1568496,17811] 0 2026-03-10T12:37:57.209 INFO:tasks.workunit.client.0.vm00.stdout:9/565: symlink d0/d3d/d59/d4e/dba/d1e/d27/lc8 0 2026-03-10T12:37:57.211 INFO:tasks.workunit.client.0.vm00.stdout:7/377: creat da/d3f/f93 x:0 0 0 2026-03-10T12:37:57.212 INFO:tasks.workunit.client.1.vm07.stdout:9/528: dread d5/d16/d23/fb2 [0,4194304] 0 2026-03-10T12:37:57.219 INFO:tasks.workunit.client.0.vm00.stdout:7/378: dwrite da/fe [0,4194304] 0 2026-03-10T12:37:57.222 INFO:tasks.workunit.client.0.vm00.stdout:5/536: unlink d1f/d6a/f57 0 2026-03-10T12:37:57.234 INFO:tasks.workunit.client.0.vm00.stdout:7/379: rename da/d25/c28 to da/d26/d50/d73/d89/c94 0 2026-03-10T12:37:57.235 INFO:tasks.workunit.client.0.vm00.stdout:5/537: dread f12 [0,4194304] 0 2026-03-10T12:37:57.235 INFO:tasks.workunit.client.0.vm00.stdout:8/385: fsync d0/d12/d17/d48/f4c 0 2026-03-10T12:37:57.236 INFO:tasks.workunit.client.0.vm00.stdout:5/538: chown d1f/d6a/f84 2948476 1 2026-03-10T12:37:57.236 INFO:tasks.workunit.client.0.vm00.stdout:7/380: write da/d1b/d40/f5c [334082,92576] 0 2026-03-10T12:37:57.237 INFO:tasks.workunit.client.1.vm07.stdout:7/461: write d0/f2b [4267642,120079] 0 2026-03-10T12:37:57.243 INFO:tasks.workunit.client.0.vm00.stdout:1/523: dwrite da/d21/f88 [0,4194304] 0 2026-03-10T12:37:57.247 INFO:tasks.workunit.client.1.vm07.stdout:7/462: creat d0/d52/f97 x:0 0 0 2026-03-10T12:37:57.254 INFO:tasks.workunit.client.1.vm07.stdout:4/607: dwrite d0/d4/d7a/d46/f56 [0,4194304] 0 2026-03-10T12:37:57.259 INFO:tasks.workunit.client.0.vm00.stdout:2/507: creat d4/d6/d2d/fa9 x:0 0 0 2026-03-10T12:37:57.259 
INFO:tasks.workunit.client.0.vm00.stdout:2/508: chown d4/dd/la0 243996 1 2026-03-10T12:37:57.260 INFO:tasks.workunit.client.0.vm00.stdout:2/509: write d4/dd/ff [620810,57261] 0 2026-03-10T12:37:57.260 INFO:tasks.workunit.client.0.vm00.stdout:2/510: readlink d4/dd/l97 0 2026-03-10T12:37:57.264 INFO:tasks.workunit.client.1.vm07.stdout:7/463: creat d0/d52/f98 x:0 0 0 2026-03-10T12:37:57.264 INFO:tasks.workunit.client.0.vm00.stdout:0/479: write d3/d7/d4c/f96 [4006379,106301] 0 2026-03-10T12:37:57.266 INFO:tasks.workunit.client.1.vm07.stdout:4/608: symlink d0/d4/d5/d78/ld1 0 2026-03-10T12:37:57.269 INFO:tasks.workunit.client.0.vm00.stdout:9/566: creat d0/fc9 x:0 0 0 2026-03-10T12:37:57.272 INFO:tasks.workunit.client.0.vm00.stdout:5/539: mkdir d1f/d26/d2b/d37/dbf 0 2026-03-10T12:37:57.276 INFO:tasks.workunit.client.0.vm00.stdout:8/386: rename d0/c30 to d0/d12/d60/c78 0 2026-03-10T12:37:57.277 INFO:tasks.workunit.client.1.vm07.stdout:7/464: mknod d0/d47/d48/c99 0 2026-03-10T12:37:57.277 INFO:tasks.workunit.client.0.vm00.stdout:8/387: chown d0/d12/d36/d5b/f65 3 1 2026-03-10T12:37:57.281 INFO:tasks.workunit.client.0.vm00.stdout:2/511: creat d4/d6/d41/d6d/faa x:0 0 0 2026-03-10T12:37:57.281 INFO:tasks.workunit.client.0.vm00.stdout:2/512: fdatasync f1 0 2026-03-10T12:37:57.286 INFO:tasks.workunit.client.0.vm00.stdout:3/572: dread dd/d3d/d8a/f8b [0,4194304] 0 2026-03-10T12:37:57.287 INFO:tasks.workunit.client.1.vm07.stdout:9/529: dread d5/d16/d23/d26/f86 [0,4194304] 0 2026-03-10T12:37:57.289 INFO:tasks.workunit.client.0.vm00.stdout:9/567: creat d0/d7f/db8/dc4/fca x:0 0 0 2026-03-10T12:37:57.292 INFO:tasks.workunit.client.0.vm00.stdout:2/513: dread d4/d6/d41/d6d/d40/f5e [0,4194304] 0 2026-03-10T12:37:57.293 INFO:tasks.workunit.client.0.vm00.stdout:0/480: mkdir d3/db/da4 0 2026-03-10T12:37:57.295 INFO:tasks.workunit.client.0.vm00.stdout:2/514: dread d4/dd/f3e [0,4194304] 0 2026-03-10T12:37:57.296 INFO:tasks.workunit.client.0.vm00.stdout:2/515: write d4/d53/d9e/f60 
[211601,110810] 0 2026-03-10T12:37:57.296 INFO:tasks.workunit.client.1.vm07.stdout:4/609: mknod d0/d4/d10/d3c/d2b/d2d/da7/cd2 0 2026-03-10T12:37:57.298 INFO:tasks.workunit.client.0.vm00.stdout:8/388: mkdir d0/d12/d36/d51/d79 0 2026-03-10T12:37:57.298 INFO:tasks.workunit.client.0.vm00.stdout:8/389: readlink d0/dd/l57 0 2026-03-10T12:37:57.301 INFO:tasks.workunit.client.0.vm00.stdout:0/481: mkdir d3/d22/da5 0 2026-03-10T12:37:57.311 INFO:tasks.workunit.client.0.vm00.stdout:0/482: rmdir d3/d40/d65 39 2026-03-10T12:37:57.315 INFO:tasks.workunit.client.1.vm07.stdout:9/530: fsync d5/d13/d57/d4f/f88 0 2026-03-10T12:37:57.316 INFO:tasks.workunit.client.0.vm00.stdout:3/573: rename dd/d27/f91 to dd/d3d/d65/fbf 0 2026-03-10T12:37:57.330 INFO:tasks.workunit.client.0.vm00.stdout:8/390: mknod d0/d12/d60/c7a 0 2026-03-10T12:37:57.331 INFO:tasks.workunit.client.0.vm00.stdout:8/391: truncate d0/d46/d6e/f70 686494 0 2026-03-10T12:37:57.337 INFO:tasks.workunit.client.0.vm00.stdout:1/524: link da/d24/d28/c79 da/d24/d28/d44/d59/da6/cb2 0 2026-03-10T12:37:57.339 INFO:tasks.workunit.client.0.vm00.stdout:1/525: chown da/d24/d28/d44/d5d/d80 1717296 1 2026-03-10T12:37:57.343 INFO:tasks.workunit.client.0.vm00.stdout:1/526: dwrite da/d12/f99 [0,4194304] 0 2026-03-10T12:37:57.347 INFO:tasks.workunit.client.0.vm00.stdout:9/568: mkdir d0/d3d/d59/d4e/dba/d1e/dcb 0 2026-03-10T12:37:57.349 INFO:tasks.workunit.client.0.vm00.stdout:9/569: write d0/d3d/d59/d4e/dba/d19/f7d [1002808,16497] 0 2026-03-10T12:37:57.349 INFO:tasks.workunit.client.0.vm00.stdout:7/381: dread f0 [0,4194304] 0 2026-03-10T12:37:57.349 INFO:tasks.workunit.client.0.vm00.stdout:6/366: truncate d2/d16/d74/f59 647910 0 2026-03-10T12:37:57.356 INFO:tasks.workunit.client.0.vm00.stdout:7/382: creat da/d3f/d71/f95 x:0 0 0 2026-03-10T12:37:57.365 INFO:tasks.workunit.client.0.vm00.stdout:5/540: rmdir d1f/d26/d2b/d35/d53/d5b/dbb 0 2026-03-10T12:37:57.366 INFO:tasks.workunit.client.0.vm00.stdout:5/541: stat d1f/d26/d2b/f5e 0 
2026-03-10T12:37:57.366 INFO:tasks.workunit.client.0.vm00.stdout:5/542: chown d1f/d26/d2b/d37/f9e 13 1 2026-03-10T12:37:57.366 INFO:tasks.workunit.client.0.vm00.stdout:5/543: write d1f/d26/d2e/f71 [2257159,92605] 0 2026-03-10T12:37:57.366 INFO:tasks.workunit.client.0.vm00.stdout:5/544: dwrite d1f/d6a/f84 [0,4194304] 0 2026-03-10T12:37:57.367 INFO:tasks.workunit.client.0.vm00.stdout:6/367: mkdir d2/d42/d80/d89 0 2026-03-10T12:37:57.379 INFO:tasks.workunit.client.0.vm00.stdout:2/516: symlink d4/d53/lab 0 2026-03-10T12:37:57.394 INFO:tasks.workunit.client.1.vm07.stdout:4/610: creat d0/d4/d5/fd3 x:0 0 0 2026-03-10T12:37:57.394 INFO:tasks.workunit.client.1.vm07.stdout:4/611: truncate d0/d4/d5/da/fcb 574785 0 2026-03-10T12:37:57.394 INFO:tasks.workunit.client.0.vm00.stdout:7/383: dread da/d1b/d40/f5c [0,4194304] 0 2026-03-10T12:37:57.394 INFO:tasks.workunit.client.0.vm00.stdout:8/392: creat d0/d46/d6e/f7b x:0 0 0 2026-03-10T12:37:57.394 INFO:tasks.workunit.client.0.vm00.stdout:7/384: creat da/d26/d37/f96 x:0 0 0 2026-03-10T12:37:57.394 INFO:tasks.workunit.client.0.vm00.stdout:9/570: fdatasync d0/d3d/d59/d4e/dba/fb9 0 2026-03-10T12:37:57.394 INFO:tasks.workunit.client.0.vm00.stdout:9/571: dread - d0/d3d/d59/d4e/dba/d1e/d2b/fc7 zero size 2026-03-10T12:37:57.408 INFO:tasks.workunit.client.1.vm07.stdout:3/528: write dc/dd/d28/d3b/f4c [3747523,36103] 0 2026-03-10T12:37:57.423 INFO:tasks.workunit.client.1.vm07.stdout:3/529: mknod dc/dd/d1f/dac/cb9 0 2026-03-10T12:37:57.423 INFO:tasks.workunit.client.0.vm00.stdout:5/545: readlink d1f/d26/d2b/d35/d53/d72/d9d/d90/lab 0 2026-03-10T12:37:57.424 INFO:tasks.workunit.client.0.vm00.stdout:5/546: dwrite d1f/f97 [0,4194304] 0 2026-03-10T12:37:57.424 INFO:tasks.workunit.client.0.vm00.stdout:2/517: creat d4/d53/d76/fac x:0 0 0 2026-03-10T12:37:57.432 INFO:tasks.workunit.client.0.vm00.stdout:6/368: dread d2/f30 [0,4194304] 0 2026-03-10T12:37:57.441 INFO:tasks.workunit.client.1.vm07.stdout:3/530: creat dc/dd/d28/d7a/fba x:0 0 0 
2026-03-10T12:37:57.441 INFO:tasks.workunit.client.0.vm00.stdout:5/547: symlink d1f/d96/lc0 0 2026-03-10T12:37:57.442 INFO:tasks.workunit.client.0.vm00.stdout:5/548: chown d1f/d26/d2b/d37/f9e 20385203 1 2026-03-10T12:37:57.466 INFO:tasks.workunit.client.1.vm07.stdout:3/531: fdatasync dc/dd/d28/f67 0 2026-03-10T12:37:57.466 INFO:tasks.workunit.client.1.vm07.stdout:3/532: chown dc/dd/d1f/f30 12236150 1 2026-03-10T12:37:57.466 INFO:tasks.workunit.client.0.vm00.stdout:8/393: symlink d0/d46/l7c 0 2026-03-10T12:37:57.466 INFO:tasks.workunit.client.0.vm00.stdout:5/549: dwrite d1f/f27 [0,4194304] 0 2026-03-10T12:37:57.469 INFO:tasks.workunit.client.0.vm00.stdout:6/369: dread d2/d16/f23 [0,4194304] 0 2026-03-10T12:37:57.469 INFO:tasks.workunit.client.0.vm00.stdout:6/370: readlink d2/d42/l66 0 2026-03-10T12:37:57.471 INFO:tasks.workunit.client.0.vm00.stdout:6/371: mknod d2/d39/c8a 0 2026-03-10T12:37:57.482 INFO:tasks.workunit.client.0.vm00.stdout:6/372: truncate d2/d51/f5c 560903 0 2026-03-10T12:37:57.482 INFO:tasks.workunit.client.0.vm00.stdout:6/373: chown d2/d16/d29/c67 35133968 1 2026-03-10T12:37:57.488 INFO:tasks.workunit.client.0.vm00.stdout:4/501: dread df/d1f/d36/d3a/d41/f33 [0,4194304] 0 2026-03-10T12:37:57.489 INFO:tasks.workunit.client.0.vm00.stdout:4/502: write df/d1f/d22/d26/d65/f8e [419762,47731] 0 2026-03-10T12:37:57.491 INFO:tasks.workunit.client.0.vm00.stdout:4/503: symlink df/d1f/d22/d26/dab/d73/lae 0 2026-03-10T12:37:57.502 INFO:tasks.workunit.client.0.vm00.stdout:7/385: dread da/d25/d2c/f4f [0,4194304] 0 2026-03-10T12:37:57.506 INFO:tasks.workunit.client.0.vm00.stdout:7/386: truncate da/f10 8128123 0 2026-03-10T12:37:57.512 INFO:tasks.workunit.client.0.vm00.stdout:3/574: sync 2026-03-10T12:37:57.512 INFO:tasks.workunit.client.0.vm00.stdout:9/572: sync 2026-03-10T12:37:57.512 INFO:tasks.workunit.client.1.vm07.stdout:9/531: sync 2026-03-10T12:37:57.513 INFO:tasks.workunit.client.0.vm00.stdout:9/573: stat d0/d3d/d59/d4e/dba/d19/f20 0 2026-03-10T12:37:57.513 
INFO:tasks.workunit.client.1.vm07.stdout:9/532: dread - d5/d13/d57/d4f/d6a/fba zero size
2026-03-10T12:37:57.515 INFO:tasks.workunit.client.0.vm00.stdout:9/574: mkdir d0/d7f/db8/dc4/db0/dcc 0
2026-03-10T12:37:57.516 INFO:tasks.workunit.client.1.vm07.stdout:9/533: dread d5/d13/d22/f39 [0,4194304] 0
2026-03-10T12:37:57.517 INFO:tasks.workunit.client.0.vm00.stdout:9/575: rmdir d0/d3d/d59/d4e/dba/d1e/d85 39
2026-03-10T12:37:57.517 INFO:tasks.workunit.client.0.vm00.stdout:9/576: write d0/dc2/f87 [906027,83469] 0
2026-03-10T12:37:57.524 INFO:tasks.workunit.client.1.vm07.stdout:9/534: mknod d5/d13/d6c/d7a/daf/cbd 0
2026-03-10T12:37:57.552 INFO:tasks.workunit.client.1.vm07.stdout:9/535: dread d5/d1f/d31/f56 [0,4194304] 0
2026-03-10T12:37:57.554 INFO:tasks.workunit.client.0.vm00.stdout:1/527: rename da/d24/d28/d44 to da/d21/db3 0
2026-03-10T12:37:57.554 INFO:tasks.workunit.client.1.vm07.stdout:9/536: truncate d5/d13/f67 1636675 0
2026-03-10T12:37:57.556 INFO:tasks.workunit.client.0.vm00.stdout:2/518: rename d4/d6/d41/d6d/d40 to d4/d53/d76/d9b/dad 0
2026-03-10T12:37:57.556 INFO:tasks.workunit.client.1.vm07.stdout:9/537: read d5/d13/d57/d4f/f88 [1548395,32157] 0
2026-03-10T12:37:57.556 INFO:tasks.workunit.client.0.vm00.stdout:2/519: fsync d4/d6/d2d/d3a/d43/d85/fa3 0
2026-03-10T12:37:57.557 INFO:tasks.workunit.client.1.vm07.stdout:9/538: fdatasync d5/f65 0
2026-03-10T12:37:57.557 INFO:tasks.workunit.client.0.vm00.stdout:9/577: sync
2026-03-10T12:37:57.560 INFO:tasks.workunit.client.0.vm00.stdout:2/520: dwrite d4/d53/d9e/f60 [0,4194304] 0
2026-03-10T12:37:57.565 INFO:tasks.workunit.client.0.vm00.stdout:8/394: rename d0/d12/d36/d51/d79 to d0/d12/d36/d7d 0
2026-03-10T12:37:57.565 INFO:tasks.workunit.client.0.vm00.stdout:8/395: chown d0/c66 284163471 1
2026-03-10T12:37:57.566 INFO:tasks.workunit.client.1.vm07.stdout:9/539: fsync d5/d16/d23/d26/f46 0
2026-03-10T12:37:57.574 INFO:tasks.workunit.client.0.vm00.stdout:5/550: rename d1f/d6a/f74 to d1f/d26/d2b/d35/d53/d72/d9d/d8e/fc1 0
2026-03-10T12:37:57.589 INFO:tasks.workunit.client.1.vm07.stdout:9/540: sync
2026-03-10T12:37:57.590 INFO:tasks.workunit.client.1.vm07.stdout:9/541: chown d5/d13/d6c/d89/dac 13990732 1
2026-03-10T12:37:57.590 INFO:tasks.workunit.client.0.vm00.stdout:1/528: fsync da/d12/f99 0
2026-03-10T12:37:57.593 INFO:tasks.workunit.client.0.vm00.stdout:5/551: symlink d1f/d26/d2b/d35/d53/lc2 0
2026-03-10T12:37:57.594 INFO:tasks.workunit.client.0.vm00.stdout:5/552: readlink d1f/d26/d2b/d35/d53/d5b/l93 0
2026-03-10T12:37:57.595 INFO:tasks.workunit.client.0.vm00.stdout:8/396: mkdir d0/d46/d7e 0
2026-03-10T12:37:57.595 INFO:tasks.workunit.client.1.vm07.stdout:6/453: truncate d1/d4/f5a 524915 0
2026-03-10T12:37:57.596 INFO:tasks.workunit.client.1.vm07.stdout:1/474: symlink d9/df/d29/l9c 0
2026-03-10T12:37:57.596 INFO:tasks.workunit.client.1.vm07.stdout:4/612: creat d0/d4/d5/da/fd4 x:0 0 0
2026-03-10T12:37:57.597 INFO:tasks.workunit.client.0.vm00.stdout:2/521: link d4/dd/c21 d4/d53/d9e/cae 0
2026-03-10T12:37:57.608 INFO:tasks.workunit.client.0.vm00.stdout:5/553: unlink d1f/d26/d2b/f5e 0
2026-03-10T12:37:57.608 INFO:tasks.workunit.client.0.vm00.stdout:5/554: write d1f/d26/d2e/fba [85593,4461] 0
2026-03-10T12:37:57.609 INFO:tasks.workunit.client.0.vm00.stdout:5/555: truncate d1f/d26/d6f/fa9 41785 0
2026-03-10T12:37:57.613 INFO:tasks.workunit.client.0.vm00.stdout:8/397: creat d0/d12/d60/f7f x:0 0 0
2026-03-10T12:37:57.614 INFO:tasks.workunit.client.0.vm00.stdout:9/578: getdents d0/d3d/d59/d4e/dba/d19/d50 0
2026-03-10T12:37:57.615 INFO:tasks.workunit.client.1.vm07.stdout:0/545: mkdir d0/d14/d5f/d76/d2f/db2 0
2026-03-10T12:37:57.615 INFO:tasks.workunit.client.0.vm00.stdout:2/522: creat d4/dd/d38/faf x:0 0 0
2026-03-10T12:37:57.616 INFO:tasks.workunit.client.1.vm07.stdout:0/546: write d0/d14/d5f/d76/f78 [1118394,120639] 0
2026-03-10T12:37:57.618 INFO:tasks.workunit.client.0.vm00.stdout:4/504: rename df/d1f/d22/d26/dab/d73/c99 to df/d1f/d22/d26/caf 0
2026-03-10T12:37:57.621 INFO:tasks.workunit.client.0.vm00.stdout:3/575: write dd/d64/fb5 [1086311,85307] 0
2026-03-10T12:37:57.621 INFO:tasks.workunit.client.0.vm00.stdout:3/576: stat dd/d64/fb9 0
2026-03-10T12:37:57.631 INFO:tasks.workunit.client.0.vm00.stdout:1/529: mkdir da/d12/db4 0
2026-03-10T12:37:57.631 INFO:tasks.workunit.client.0.vm00.stdout:1/530: chown da/d21/db3/l7d 7 1
2026-03-10T12:37:57.635 INFO:tasks.workunit.client.0.vm00.stdout:5/556: chown d1f/d26/d2b/d35/d78/d7f/cac 1716419005 1
2026-03-10T12:37:57.636 INFO:tasks.workunit.client.0.vm00.stdout:8/398: readlink d0/d12/d2d/d49/l5a 0
2026-03-10T12:37:57.636 INFO:tasks.workunit.client.0.vm00.stdout:8/399: write d0/d58/d68/f74 [3100125,47805] 0
2026-03-10T12:37:57.639 INFO:tasks.workunit.client.0.vm00.stdout:9/579: truncate d0/d3d/d59/d4e/dba/d1e/d85/d98/fa0 723021 0
2026-03-10T12:37:57.643 INFO:tasks.workunit.client.0.vm00.stdout:4/505: unlink df/d1f/d36/d3a/d41/c54 0
2026-03-10T12:37:57.646 INFO:tasks.workunit.client.1.vm07.stdout:1/475: chown d9/df/d29/d2b/d31/l45 117484 1
2026-03-10T12:37:57.650 INFO:tasks.workunit.client.1.vm07.stdout:4/613: mkdir d0/d4/d10/d8d/db2/dd5 0
2026-03-10T12:37:57.651 INFO:tasks.workunit.client.0.vm00.stdout:2/523: symlink d4/d6/d93/lb0 0
2026-03-10T12:37:57.652 INFO:tasks.workunit.client.0.vm00.stdout:2/524: chown d4/d6/f30 1081918 1
2026-03-10T12:37:57.652 INFO:tasks.workunit.client.1.vm07.stdout:3/533: symlink dc/d18/lbb 0
2026-03-10T12:37:57.656 INFO:tasks.workunit.client.1.vm07.stdout:7/465: creat d0/d47/f9a x:0 0 0
2026-03-10T12:37:57.657 INFO:tasks.workunit.client.0.vm00.stdout:0/483: dwrite d3/d7/d4c/f76 [0,4194304] 0
2026-03-10T12:37:57.663 INFO:tasks.workunit.client.0.vm00.stdout:0/484: dwrite d3/db/d24/d25/f3f [4194304,4194304] 0
2026-03-10T12:37:57.663 INFO:tasks.workunit.client.1.vm07.stdout:5/520: rename d0/d22/d18/d19/d21/f38 to d0/d22/d18/fb4 0
2026-03-10T12:37:57.664 INFO:tasks.workunit.client.1.vm07.stdout:8/485: rename d1/d3/d40/f41 to d1/d3/d6/d54/f9c 0
2026-03-10T12:37:57.668 INFO:tasks.workunit.client.0.vm00.stdout:0/485: symlink d3/d7/d4c/d5b/la6 0
2026-03-10T12:37:57.669 INFO:tasks.workunit.client.1.vm07.stdout:2/393: rename d0/d42/d26/d4b to d0/d29/d64/d74/d88 0
2026-03-10T12:37:57.671 INFO:tasks.workunit.client.1.vm07.stdout:6/454: dread d1/d4/d6/d16/d49/f67 [0,4194304] 0
2026-03-10T12:37:57.671 INFO:tasks.workunit.client.1.vm07.stdout:6/455: stat d1/d4/d6/lf 0
2026-03-10T12:37:57.675 INFO:tasks.workunit.client.1.vm07.stdout:4/614: rename d0/d4/d5/f43 to d0/d4/d10/d3c/d2b/d54/fd6 0
2026-03-10T12:37:57.676 INFO:tasks.workunit.client.1.vm07.stdout:2/394: dread - d0/d42/d4e/d77/f6f zero size
2026-03-10T12:37:57.679 INFO:tasks.workunit.client.0.vm00.stdout:3/577: rmdir dd/d27/d2c 39
2026-03-10T12:37:57.683 INFO:tasks.workunit.client.1.vm07.stdout:7/466: rename d0/d61/f66 to d0/f9b 0
2026-03-10T12:37:57.684 INFO:tasks.workunit.client.1.vm07.stdout:7/467: mknod d0/d47/c9c 0
2026-03-10T12:37:57.686 INFO:tasks.workunit.client.0.vm00.stdout:2/525: creat d4/d53/d68/fb1 x:0 0 0
2026-03-10T12:37:57.691 INFO:tasks.workunit.client.0.vm00.stdout:0/486: dread d3/d22/f83 [0,4194304] 0
2026-03-10T12:37:57.695 INFO:tasks.workunit.client.1.vm07.stdout:7/468: unlink d0/d67/f89 0
2026-03-10T12:37:57.695 INFO:tasks.workunit.client.1.vm07.stdout:5/521: dread d0/d22/d18/d30/f35 [0,4194304] 0
2026-03-10T12:37:57.695 INFO:tasks.workunit.client.0.vm00.stdout:3/578: write dd/d18/d13/f6b [1165163,18676] 0
2026-03-10T12:37:57.695 INFO:tasks.workunit.client.0.vm00.stdout:6/374: write d2/d16/f17 [2702578,56815] 0
2026-03-10T12:37:57.696 INFO:tasks.workunit.client.0.vm00.stdout:9/580: rename d0/d3d/d59/d4e/dba/d1e/d85/d98/cb7 to d0/d3d/d43/d53/ccd 0
2026-03-10T12:37:57.696 INFO:tasks.workunit.client.0.vm00.stdout:9/581: dread - d0/d3d/d59/d4e/f7c zero size
2026-03-10T12:37:57.696 INFO:tasks.workunit.client.0.vm00.stdout:7/387: truncate da/d41/f4b 2287739 0
2026-03-10T12:37:57.696 INFO:tasks.workunit.client.0.vm00.stdout:7/388: fdatasync da/d25/f5a 0
2026-03-10T12:37:57.696 INFO:tasks.workunit.client.1.vm07.stdout:7/469: mkdir d0/d47/d48/d8a/d9d 0
2026-03-10T12:37:57.696 INFO:tasks.workunit.client.1.vm07.stdout:7/470: fdatasync d0/d61/f64 0
2026-03-10T12:37:57.700 INFO:tasks.workunit.client.0.vm00.stdout:7/389: dread da/d26/d37/f79 [0,4194304] 0
2026-03-10T12:37:57.700 INFO:tasks.workunit.client.0.vm00.stdout:3/579: rmdir dd/d18/d13/d1d/d43 39
2026-03-10T12:37:57.703 INFO:tasks.workunit.client.0.vm00.stdout:8/400: getdents d0/d12/d17/d48 0
2026-03-10T12:37:57.703 INFO:tasks.workunit.client.0.vm00.stdout:0/487: truncate d3/d7/d4c/d5b/f56 2808848 0
2026-03-10T12:37:57.704 INFO:tasks.workunit.client.0.vm00.stdout:8/401: readlink d0/d12/d36/l40 0
2026-03-10T12:37:57.704 INFO:tasks.workunit.client.0.vm00.stdout:3/580: dwrite dd/d64/d92/fb8 [0,4194304] 0
2026-03-10T12:37:57.705 INFO:tasks.workunit.client.1.vm07.stdout:4/615: sync
2026-03-10T12:37:57.706 INFO:tasks.workunit.client.0.vm00.stdout:9/582: mknod d0/d3d/d59/d4e/cce 0
2026-03-10T12:37:57.718 INFO:tasks.workunit.client.0.vm00.stdout:6/375: readlink d2/d51/l53 0
2026-03-10T12:37:57.718 INFO:tasks.workunit.client.0.vm00.stdout:8/402: dwrite d0/d12/d17/f63 [0,4194304] 0
2026-03-10T12:37:57.718 INFO:tasks.workunit.client.1.vm07.stdout:7/471: stat d0/l46 0
2026-03-10T12:37:57.718 INFO:tasks.workunit.client.1.vm07.stdout:4/616: symlink d0/d5c/d7c/ld7 0
2026-03-10T12:37:57.718 INFO:tasks.workunit.client.1.vm07.stdout:5/522: getdents d0/d22/d18/d19 0
2026-03-10T12:37:57.718 INFO:tasks.workunit.client.1.vm07.stdout:9/542: dwrite d5/d1f/d31/f43 [0,4194304] 0
2026-03-10T12:37:57.719 INFO:tasks.workunit.client.0.vm00.stdout:8/403: write d0/d12/d17/f63 [795401,12693] 0
2026-03-10T12:37:57.719 INFO:tasks.workunit.client.0.vm00.stdout:8/404: chown d0/f56 2322 1
2026-03-10T12:37:57.723 INFO:tasks.workunit.client.1.vm07.stdout:9/543: chown d5/d13/d2c/f44 119065 1
2026-03-10T12:37:57.730 INFO:tasks.workunit.client.0.vm00.stdout:8/405: dwrite d0/dd/f4d [0,4194304] 0
2026-03-10T12:37:57.730 INFO:tasks.workunit.client.1.vm07.stdout:9/544: readlink d5/d16/d18/l71 0
2026-03-10T12:37:57.731 INFO:tasks.workunit.client.1.vm07.stdout:9/545: mknod d5/d13/d57/d4f/d6a/cbe 0
2026-03-10T12:37:57.735 INFO:tasks.workunit.client.0.vm00.stdout:3/581: fsync dd/d3d/d84/f8c 0
2026-03-10T12:37:57.735 INFO:tasks.workunit.client.0.vm00.stdout:8/406: readlink d0/l3f 0
2026-03-10T12:37:57.735 INFO:tasks.workunit.client.1.vm07.stdout:9/546: symlink d5/d16/d18/lbf 0
2026-03-10T12:37:57.736 INFO:tasks.workunit.client.0.vm00.stdout:8/407: write d0/f8 [4893796,81553] 0
2026-03-10T12:37:57.736 INFO:tasks.workunit.client.1.vm07.stdout:9/547: chown d5/d16/d23/d26/c92 0 1
2026-03-10T12:37:57.738 INFO:tasks.workunit.client.0.vm00.stdout:6/376: fdatasync d2/da/dc/f27 0
2026-03-10T12:37:57.738 INFO:tasks.workunit.client.0.vm00.stdout:6/377: readlink d2/d42/l66 0
2026-03-10T12:37:57.739 INFO:tasks.workunit.client.0.vm00.stdout:9/583: dread d0/d3d/d59/f94 [0,4194304] 0
2026-03-10T12:37:57.739 INFO:tasks.workunit.client.1.vm07.stdout:7/472: sync
2026-03-10T12:37:57.740 INFO:tasks.workunit.client.1.vm07.stdout:7/473: dread - d0/d52/f97 zero size
2026-03-10T12:37:57.741 INFO:tasks.workunit.client.1.vm07.stdout:9/548: read d5/d13/d57/d3e/fa8 [410233,38374] 0
2026-03-10T12:37:57.744 INFO:tasks.workunit.client.0.vm00.stdout:3/582: rename dd/d18/d13/d1d/f42 to dd/d18/d14/fc0 0
2026-03-10T12:37:57.745 INFO:tasks.workunit.client.1.vm07.stdout:9/549: dread - d5/d1f/d31/d76/fb0 zero size
2026-03-10T12:37:57.745 INFO:tasks.workunit.client.0.vm00.stdout:0/488: sync
2026-03-10T12:37:57.747 INFO:tasks.workunit.client.0.vm00.stdout:3/583: dwrite dd/d64/f87 [0,4194304] 0
2026-03-10T12:37:57.749 INFO:tasks.workunit.client.1.vm07.stdout:5/523: dread d0/f9 [4194304,4194304] 0
2026-03-10T12:37:57.750 INFO:tasks.workunit.client.1.vm07.stdout:5/524: chown d0/d22/d18/d19/d2e/d67/fa0 328575 1
2026-03-10T12:37:57.755 INFO:tasks.workunit.client.0.vm00.stdout:3/584: mknod dd/d4e/cc1 0
2026-03-10T12:37:57.756 INFO:tasks.workunit.client.0.vm00.stdout:0/489: sync
2026-03-10T12:37:57.758 INFO:tasks.workunit.client.0.vm00.stdout:9/584: symlink d0/d3d/lcf 0
2026-03-10T12:37:57.761 INFO:tasks.workunit.client.0.vm00.stdout:8/408: dread d0/d12/d2d/f55 [0,4194304] 0
2026-03-10T12:37:57.762 INFO:tasks.workunit.client.0.vm00.stdout:9/585: write d0/d5/dc/f9c [313114,361] 0
2026-03-10T12:37:57.764 INFO:tasks.workunit.client.0.vm00.stdout:8/409: mknod d0/d12/d17/d48/c80 0
2026-03-10T12:37:57.765 INFO:tasks.workunit.client.0.vm00.stdout:9/586: fdatasync d0/d3d/d59/d4e/dba/d1e/d27/f75 0
2026-03-10T12:37:57.767 INFO:tasks.workunit.client.0.vm00.stdout:3/585: rmdir dd/d27/d2c/d34/d45 39
2026-03-10T12:37:57.770 INFO:tasks.workunit.client.1.vm07.stdout:4/617: dread d0/d4/d10/d3c/d2b/d2d/f99 [0,4194304] 0
2026-03-10T12:37:57.770 INFO:tasks.workunit.client.1.vm07.stdout:4/618: stat d0/d19/cca 0
2026-03-10T12:37:57.773 INFO:tasks.workunit.client.0.vm00.stdout:9/587: creat d0/d3d/d59/d4e/dba/d1e/d85/d98/fd0 x:0 0 0
2026-03-10T12:37:57.777 INFO:tasks.workunit.client.0.vm00.stdout:8/410: dwrite d0/d12/f2a [0,4194304] 0
2026-03-10T12:37:57.787 INFO:tasks.workunit.client.1.vm07.stdout:0/547: dread d0/d14/d5f/d76/d2f/d31/f6f [0,4194304] 0
2026-03-10T12:37:57.788 INFO:tasks.workunit.client.0.vm00.stdout:8/411: getdents d0/d46/d7e 0
2026-03-10T12:37:57.790 INFO:tasks.workunit.client.1.vm07.stdout:7/474: rename d0/f3c to d0/d47/d48/f9e 0
2026-03-10T12:37:57.790 INFO:tasks.workunit.client.0.vm00.stdout:9/588: link d0/d3d/d59/d74/faa d0/d3d/d43/d53/fd1 0
2026-03-10T12:37:57.794 INFO:tasks.workunit.client.0.vm00.stdout:8/412: mkdir d0/dd/d38/d81 0
2026-03-10T12:37:57.803 INFO:tasks.workunit.client.0.vm00.stdout:8/413: mknod d0/c82 0
2026-03-10T12:37:57.803 INFO:tasks.workunit.client.0.vm00.stdout:8/414: mknod d0/d12/d2d/c83 0
2026-03-10T12:37:57.803 INFO:tasks.workunit.client.0.vm00.stdout:8/415: dread d0/d46/d6e/f70 [0,4194304] 0
2026-03-10T12:37:57.803 INFO:tasks.workunit.client.0.vm00.stdout:8/416: fsync d0/d12/d2d/f6f 0
2026-03-10T12:37:57.806 INFO:tasks.workunit.client.1.vm07.stdout:1/476: write d9/f52 [920124,112593] 0
2026-03-10T12:37:57.807 INFO:tasks.workunit.client.0.vm00.stdout:5/557: write f19 [525660,80016] 0
2026-03-10T12:37:57.807 INFO:tasks.workunit.client.0.vm00.stdout:1/531: write da/f22 [4344898,121653] 0
2026-03-10T12:37:57.807 INFO:tasks.workunit.client.0.vm00.stdout:4/506: write df/d1f/d36/d3a/d41/f33 [2390751,105823] 0
2026-03-10T12:37:57.808 INFO:tasks.workunit.client.0.vm00.stdout:4/507: chown df/d8a 158963 1
2026-03-10T12:37:57.809 INFO:tasks.workunit.client.0.vm00.stdout:7/390: link da/f17 da/d26/f97 0
2026-03-10T12:37:57.815 INFO:tasks.workunit.client.0.vm00.stdout:8/417: creat d0/d12/d2d/d49/f84 x:0 0 0
2026-03-10T12:37:57.817 INFO:tasks.workunit.client.1.vm07.stdout:3/534: dwrite dc/dd/f9a [0,4194304] 0
2026-03-10T12:37:57.818 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:57 vm07.local ceph-mon[58582]: pgmap v164: 65 pgs: 65 active+clean; 2.2 GiB data, 7.8 GiB used, 112 GiB / 120 GiB avail; 49 MiB/s rd, 139 MiB/s wr, 300 op/s
2026-03-10T12:37:57.818 INFO:tasks.workunit.client.1.vm07.stdout:2/395: dread d0/d42/d1f/d20/f3f [0,4194304] 0
2026-03-10T12:37:57.818 INFO:tasks.workunit.client.1.vm07.stdout:3/535: write dc/dd/db5/f73 [1551618,78022] 0
2026-03-10T12:37:57.819 INFO:tasks.workunit.client.0.vm00.stdout:8/418: write d0/d12/d2d/f75 [960786,9355] 0
2026-03-10T12:37:57.822 INFO:tasks.workunit.client.1.vm07.stdout:6/456: write d1/d4/d6/f13 [3288837,25443] 0
2026-03-10T12:37:57.826 INFO:tasks.workunit.client.0.vm00.stdout:4/508: dread df/d1f/d36/f51 [0,4194304] 0
2026-03-10T12:37:57.827 INFO:tasks.workunit.client.0.vm00.stdout:8/419: sync
2026-03-10T12:37:57.827 INFO:tasks.workunit.client.0.vm00.stdout:0/490: unlink d3/d7/d4c/d5b/d38/d44/f49 0
2026-03-10T12:37:57.829 INFO:tasks.workunit.client.1.vm07.stdout:8/486: dwrite d1/d3/d6/f24 [0,4194304] 0
2026-03-10T12:37:57.829 INFO:tasks.workunit.client.0.vm00.stdout:2/526: write d4/dd/f45 [1549149,127725] 0
2026-03-10T12:37:57.831 INFO:tasks.workunit.client.0.vm00.stdout:8/420: symlink d0/d12/d36/d51/l85 0
2026-03-10T12:37:57.832 INFO:tasks.workunit.client.0.vm00.stdout:8/421: truncate d0/d12/d2d/d49/f84 601596 0
2026-03-10T12:37:57.834 INFO:tasks.workunit.client.0.vm00.stdout:4/509: symlink df/d93/d9e/lb0 0
2026-03-10T12:37:57.838 INFO:tasks.workunit.client.0.vm00.stdout:4/510: dwrite df/d63/d77/f8d [4194304,4194304] 0
2026-03-10T12:37:57.839 INFO:tasks.workunit.client.0.vm00.stdout:8/422: mkdir d0/d5c/d86 0
2026-03-10T12:37:57.842 INFO:tasks.workunit.client.0.vm00.stdout:8/423: rename d0/d12/f2a to d0/d12/d17/d48/f87 0
2026-03-10T12:37:57.848 INFO:tasks.workunit.client.0.vm00.stdout:4/511: mknod df/d1f/d36/cb1 0
2026-03-10T12:37:57.865 INFO:tasks.workunit.client.0.vm00.stdout:0/491: truncate d3/d7/d4c/d5b/d38/f93 4623815 0
2026-03-10T12:37:57.868 INFO:tasks.workunit.client.1.vm07.stdout:1/477: dread - d9/df/d55/f6f zero size
2026-03-10T12:37:57.869 INFO:tasks.workunit.client.0.vm00.stdout:4/512: dread df/f42 [0,4194304] 0
2026-03-10T12:37:57.869 INFO:tasks.workunit.client.1.vm07.stdout:5/525: dread d0/d22/d18/f95 [0,4194304] 0
2026-03-10T12:37:57.870 INFO:tasks.workunit.client.1.vm07.stdout:5/526: fdatasync d0/d22/d18/d19/d2e/d3f/f6a 0
2026-03-10T12:37:57.874 INFO:tasks.workunit.client.0.vm00.stdout:6/378: link d2/d16/d29/d31/d34/l3d d2/d39/l8b 0
2026-03-10T12:37:57.874 INFO:tasks.workunit.client.0.vm00.stdout:6/379: fdatasync d2/da/dc/d2f/f56 0
2026-03-10T12:37:57.887 INFO:tasks.workunit.client.0.vm00.stdout:4/513: sync
2026-03-10T12:37:57.892 INFO:tasks.workunit.client.0.vm00.stdout:4/514: dwrite df/d1f/d22/d26/dab/d73/f8b [0,4194304] 0
2026-03-10T12:37:57.898 INFO:tasks.workunit.client.0.vm00.stdout:4/515: rename df/d1f/d22/d26/l8c to df/d1f/d22/d26/d65/d91/lb2 0
2026-03-10T12:37:57.899 INFO:tasks.workunit.client.0.vm00.stdout:4/516: chown df/f4e 915 1
2026-03-10T12:37:57.911 INFO:tasks.workunit.client.0.vm00.stdout:3/586: write f7 [13132115,88528] 0
2026-03-10T12:37:57.915 INFO:tasks.workunit.client.0.vm00.stdout:9/589: dwrite d0/d3d/d43/f54 [4194304,4194304] 0
2026-03-10T12:37:57.919 INFO:tasks.workunit.client.0.vm00.stdout:5/558: write d1f/d26/d2b/d35/d53/d72/d9d/d90/fae [1678014,58721] 0
2026-03-10T12:37:57.923 INFO:tasks.workunit.client.0.vm00.stdout:3/587: truncate dd/d64/f7b 777264 0
2026-03-10T12:37:57.925 INFO:tasks.workunit.client.1.vm07.stdout:9/550: fsync d5/d1f/d31/f43 0
2026-03-10T12:37:57.926 INFO:tasks.workunit.client.0.vm00.stdout:3/588: dwrite dd/d64/d92/fb8 [0,4194304] 0
2026-03-10T12:37:57.930 INFO:tasks.workunit.client.0.vm00.stdout:3/589: dwrite dd/d27/d2c/d34/d38/f48 [4194304,4194304] 0
2026-03-10T12:37:57.933 INFO:tasks.workunit.client.0.vm00.stdout:5/559: mkdir d1f/d6a/d94/dc3 0
2026-03-10T12:37:57.937 INFO:tasks.workunit.client.0.vm00.stdout:2/527: dwrite d4/d53/d76/d9b/dad/f5e [0,4194304] 0
2026-03-10T12:37:57.945 INFO:tasks.workunit.client.0.vm00.stdout:3/590: creat dd/d64/fc2 x:0 0 0
2026-03-10T12:37:57.950 INFO:tasks.workunit.client.0.vm00.stdout:3/591: dwrite dd/d18/d13/f6b [0,4194304] 0
2026-03-10T12:37:57.958 INFO:tasks.workunit.client.0.vm00.stdout:5/560: mkdir d1f/d26/d2b/d37/dc4 0
2026-03-10T12:37:57.966 INFO:tasks.workunit.client.0.vm00.stdout:3/592: mknod dd/d18/d13/cc3 0
2026-03-10T12:37:57.966 INFO:tasks.workunit.client.0.vm00.stdout:9/590: link d0/d3d/d59/d74/ca6 d0/d3d/d59/d4e/dba/d1e/d85/d98/cd2 0
2026-03-10T12:37:57.966 INFO:tasks.workunit.client.0.vm00.stdout:0/492: creat d3/db/da4/fa7 x:0 0 0
2026-03-10T12:37:57.966 INFO:tasks.workunit.client.0.vm00.stdout:3/593: symlink dd/d3d/d84/lc4 0
2026-03-10T12:37:57.968 INFO:tasks.workunit.client.0.vm00.stdout:9/591: stat d0/d3d/d59/d4e/dba/d19/d50/cc0 0
2026-03-10T12:37:57.973 INFO:tasks.workunit.client.0.vm00.stdout:9/592: mknod d0/d3d/d59/cd3 0
2026-03-10T12:37:57.976 INFO:tasks.workunit.client.0.vm00.stdout:6/380: mknod d2/da/c8c 0
2026-03-10T12:37:57.991 INFO:tasks.workunit.client.0.vm00.stdout:6/381: write d2/d16/f20 [4972130,61204] 0
2026-03-10T12:37:57.992 INFO:tasks.workunit.client.0.vm00.stdout:0/493: rmdir d3/d7/d4c/d5b/d38 39
2026-03-10T12:37:57.992 INFO:tasks.workunit.client.0.vm00.stdout:0/494: readlink d3/db/d24/l9a 0
2026-03-10T12:37:57.992 INFO:tasks.workunit.client.0.vm00.stdout:0/495: dread - d3/d7/d3c/d4b/f79 zero size
2026-03-10T12:37:57.992 INFO:tasks.workunit.client.0.vm00.stdout:6/382: dwrite d2/da/dc/d2f/f56 [0,4194304] 0
2026-03-10T12:37:57.992 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:57 vm00.local ceph-mon[50686]: pgmap v164: 65 pgs: 65 active+clean; 2.2 GiB data, 7.8 GiB used, 112 GiB / 120 GiB avail; 49 MiB/s rd, 139 MiB/s wr, 300 op/s
2026-03-10T12:37:57.992 INFO:tasks.workunit.client.0.vm00.stdout:9/593: getdents d0/d3d 0
2026-03-10T12:37:57.992 INFO:tasks.workunit.client.0.vm00.stdout:3/594: getdents dd/d18/d13/d1d/d43/d55 0
2026-03-10T12:37:57.992 INFO:tasks.workunit.client.0.vm00.stdout:6/383: getdents d2/d51/d7b 0
2026-03-10T12:37:57.992 INFO:tasks.workunit.client.0.vm00.stdout:0/496: read d3/d22/f2e [1233819,22150] 0
2026-03-10T12:37:57.993 INFO:tasks.workunit.client.0.vm00.stdout:3/595: read dd/d18/d13/f22 [2701123,120623] 0
2026-03-10T12:37:57.997 INFO:tasks.workunit.client.0.vm00.stdout:3/596: dwrite dd/d64/f87 [0,4194304] 0
2026-03-10T12:37:58.002 INFO:tasks.workunit.client.0.vm00.stdout:6/384: creat d2/d14/d7a/f8d x:0 0 0
2026-03-10T12:37:58.004 INFO:tasks.workunit.client.0.vm00.stdout:3/597: dread dd/d64/f87 [0,4194304] 0
2026-03-10T12:37:58.009 INFO:tasks.workunit.client.0.vm00.stdout:9/594: getdents d0/d7f/db8 0
2026-03-10T12:37:58.017 INFO:tasks.workunit.client.0.vm00.stdout:3/598: creat dd/d4e/d6a/fc5 x:0 0 0
2026-03-10T12:37:58.018 INFO:tasks.workunit.client.0.vm00.stdout:3/599: mkdir dd/d18/d13/d1d/dc6 0
2026-03-10T12:37:58.018 INFO:tasks.workunit.client.0.vm00.stdout:4/517: fsync df/d1f/d22/d26/d65/d91/f50 0
2026-03-10T12:37:58.018 INFO:tasks.workunit.client.0.vm00.stdout:4/518: stat df/d57 0
2026-03-10T12:37:58.018 INFO:tasks.workunit.client.0.vm00.stdout:4/519: readlink df/d1f/l35 0
2026-03-10T12:37:58.018 INFO:tasks.workunit.client.0.vm00.stdout:3/600: dwrite dd/d64/f98 [0,4194304] 0
2026-03-10T12:37:58.018 INFO:tasks.workunit.client.0.vm00.stdout:4/520: chown df/d63/d94/f96 1593 1
2026-03-10T12:37:58.019 INFO:tasks.workunit.client.0.vm00.stdout:0/497: sync
2026-03-10T12:37:58.019 INFO:tasks.workunit.client.0.vm00.stdout:3/601: unlink dd/d18/d13/d1d/c4f 0
2026-03-10T12:37:58.020 INFO:tasks.workunit.client.0.vm00.stdout:4/521: truncate df/d1f/d22/d26/d65/d91/f50 3411709 0
2026-03-10T12:37:58.022 INFO:tasks.workunit.client.0.vm00.stdout:4/522: write df/f20 [134208,27694] 0
2026-03-10T12:37:58.028 INFO:tasks.workunit.client.0.vm00.stdout:4/523: sync
2026-03-10T12:37:58.030 INFO:tasks.workunit.client.0.vm00.stdout:4/524: dread - df/d1f/d36/d3a/d41/f5e zero size
2026-03-10T12:37:58.032 INFO:tasks.workunit.client.0.vm00.stdout:4/525: mknod df/d1f/d22/d26/d65/d91/da2/cb3 0
2026-03-10T12:37:58.040 INFO:tasks.workunit.client.0.vm00.stdout:6/385: dread d2/da/f11 [0,4194304] 0
2026-03-10T12:37:58.043 INFO:tasks.workunit.client.0.vm00.stdout:0/498: creat d3/d40/d65/fa8 x:0 0 0
2026-03-10T12:37:58.059 INFO:tasks.workunit.client.0.vm00.stdout:8/424: truncate d0/d12/d36/d5b/f69 68374 0
2026-03-10T12:37:58.059 INFO:tasks.workunit.client.0.vm00.stdout:0/499: dwrite d3/f4 [0,4194304] 0
2026-03-10T12:37:58.059 INFO:tasks.workunit.client.0.vm00.stdout:8/425: creat d0/dd/d38/d81/f88 x:0 0 0
2026-03-10T12:37:58.059 INFO:tasks.workunit.client.0.vm00.stdout:8/426: dread d0/d12/d2d/f55 [0,4194304] 0
2026-03-10T12:37:58.059 INFO:tasks.workunit.client.0.vm00.stdout:6/386: dwrite d2/da/dc/d2f/f56 [0,4194304] 0
2026-03-10T12:37:58.059 INFO:tasks.workunit.client.0.vm00.stdout:8/427: readlink d0/d46/d6e/l71 0
2026-03-10T12:37:58.059 INFO:tasks.workunit.client.0.vm00.stdout:6/387: chown d2/d16/d29/d31 6864834 1
2026-03-10T12:37:58.059 INFO:tasks.workunit.client.0.vm00.stdout:0/500: unlink d3/d7/d4c/d5b/f5f 0
2026-03-10T12:37:58.065 INFO:tasks.workunit.client.0.vm00.stdout:8/428: dwrite d0/d58/d68/f74 [0,4194304] 0
2026-03-10T12:37:58.071 INFO:tasks.workunit.client.0.vm00.stdout:8/429: write d0/f8 [4846141,77691] 0
2026-03-10T12:37:58.071 INFO:tasks.workunit.client.1.vm07.stdout:6/457: creat d1/d4/d6/d43/f90 x:0 0 0
2026-03-10T12:37:58.071 INFO:tasks.workunit.client.1.vm07.stdout:6/458: write d1/d4/f19 [380486,35776] 0
2026-03-10T12:37:58.074 INFO:tasks.workunit.client.0.vm00.stdout:6/388: dread d2/d16/d29/f4c [0,4194304] 0
2026-03-10T12:37:58.076 INFO:tasks.workunit.client.0.vm00.stdout:6/389: stat d2/da/dc/d2f 0
2026-03-10T12:37:58.092 INFO:tasks.workunit.client.1.vm07.stdout:4/619: creat d0/d4/d10/d8d/db2/dd5/fd8 x:0 0 0
2026-03-10T12:37:58.103 INFO:tasks.workunit.client.0.vm00.stdout:6/390: symlink d2/d16/d29/d31/d88/l8e 0
2026-03-10T12:37:58.112 INFO:tasks.workunit.client.1.vm07.stdout:2/396: write d0/f4a [1665427,20716] 0
2026-03-10T12:37:58.119 INFO:tasks.workunit.client.1.vm07.stdout:1/478: mkdir d9/df/d29/d2b/d92/d9d 0
2026-03-10T12:37:58.132 INFO:tasks.workunit.client.0.vm00.stdout:6/391: sync
2026-03-10T12:37:58.133 INFO:tasks.workunit.client.1.vm07.stdout:4/620: creat d0/d5c/d7c/fd9 x:0 0 0
2026-03-10T12:37:58.140 INFO:tasks.workunit.client.1.vm07.stdout:0/548: creat d0/d14/d5f/fb3 x:0 0 0
2026-03-10T12:37:58.140 INFO:tasks.workunit.client.1.vm07.stdout:0/549: chown d0/d14/d5f/d76/d2f/d31/d4f/d9d 3 1
2026-03-10T12:37:58.142 INFO:tasks.workunit.client.1.vm07.stdout:7/475: rename d0/f37 to d0/d57/f9f 0
2026-03-10T12:37:58.146 INFO:tasks.workunit.client.0.vm00.stdout:6/392: mknod d2/d16/d74/c8f 0
2026-03-10T12:37:58.148 INFO:tasks.workunit.client.1.vm07.stdout:2/397: dread - d0/f73 zero size
2026-03-10T12:37:58.151 INFO:tasks.workunit.client.1.vm07.stdout:5/527: creat d0/d22/d18/d19/d2e/da9/fb5 x:0 0 0
2026-03-10T12:37:58.157 INFO:tasks.workunit.client.1.vm07.stdout:1/479: dread - d9/df/d29/d2b/d31/d91/d59/f68 zero size
2026-03-10T12:37:58.160 INFO:tasks.workunit.client.0.vm00.stdout:6/393: symlink d2/da/dc/d83/l90 0
2026-03-10T12:37:58.160 INFO:tasks.workunit.client.0.vm00.stdout:6/394: chown d2/d39/f4a 2 1
2026-03-10T12:37:58.162 INFO:tasks.workunit.client.1.vm07.stdout:1/480: dwrite d9/d2d/d80/f8d [0,4194304] 0
2026-03-10T12:37:58.164 INFO:tasks.workunit.client.1.vm07.stdout:3/536: creat dc/dd/fbc x:0 0 0
2026-03-10T12:37:58.174 INFO:tasks.workunit.client.0.vm00.stdout:4/526: readlink df/d1f/d22/d26/d65/d91/lb2 0
2026-03-10T12:37:58.175 INFO:tasks.workunit.client.1.vm07.stdout:3/537: dwrite dc/dd/d43/d5c/fa9 [0,4194304] 0
2026-03-10T12:37:58.179 INFO:tasks.workunit.client.0.vm00.stdout:4/527: rename df/d6c/f71 to df/d1f/d22/d26/d70/fb4 0
2026-03-10T12:37:58.179 INFO:tasks.workunit.client.0.vm00.stdout:4/528: readlink df/d1f/d36/d3a/d41/l34 0
2026-03-10T12:37:58.188 INFO:tasks.workunit.client.0.vm00.stdout:4/529: dread df/d1f/d22/f7d [0,4194304] 0
2026-03-10T12:37:58.190 INFO:tasks.workunit.client.0.vm00.stdout:4/530: truncate df/d1f/d22/d26/d65/f8e 825750 0
2026-03-10T12:37:58.191 INFO:tasks.workunit.client.0.vm00.stdout:7/391: dread f1 [4194304,4194304] 0
2026-03-10T12:37:58.192 INFO:tasks.workunit.client.0.vm00.stdout:7/392: readlink da/d26/d37/d61/l70 0
2026-03-10T12:37:58.192 INFO:tasks.workunit.client.0.vm00.stdout:4/531: rename df/l18 to df/d8a/lb5 0
2026-03-10T12:37:58.196 INFO:tasks.workunit.client.1.vm07.stdout:0/550: fdatasync d0/d14/d5f/d41/f77 0
2026-03-10T12:37:58.200 INFO:tasks.workunit.client.1.vm07.stdout:7/476: mkdir d0/d47/da0 0
2026-03-10T12:37:58.200 INFO:tasks.workunit.client.1.vm07.stdout:7/477: dread - d0/d61/d79/f8d zero size
2026-03-10T12:37:58.206 INFO:tasks.workunit.client.1.vm07.stdout:2/398: chown d0/f18 59950897 1
2026-03-10T12:37:58.212 INFO:tasks.workunit.client.0.vm00.stdout:4/532: sync
2026-03-10T12:37:58.222 INFO:tasks.workunit.client.0.vm00.stdout:2/528: dwrite d4/dd/d38/f5a [0,4194304] 0
2026-03-10T12:37:58.225 INFO:tasks.workunit.client.0.vm00.stdout:2/529: fsync d4/d53/d68/fb1 0
2026-03-10T12:37:58.225 INFO:tasks.workunit.client.0.vm00.stdout:5/561: write d1f/d26/d2b/d37/f77 [3580073,116501] 0
2026-03-10T12:37:58.226 INFO:tasks.workunit.client.0.vm00.stdout:2/530: chown d4/dd/c25 471 1
2026-03-10T12:37:58.227 INFO:tasks.workunit.client.0.vm00.stdout:2/531: fsync d4/d6/d2d/f3d 0
2026-03-10T12:37:58.228 INFO:tasks.workunit.client.0.vm00.stdout:9/595: dwrite d0/d3d/d59/d4e/dba/fb9 [0,4194304] 0
2026-03-10T12:37:58.232 INFO:tasks.workunit.client.0.vm00.stdout:2/532: chown d4/d53/d76/d9b/dad/f50 688852 1
2026-03-10T12:37:58.235 INFO:tasks.workunit.client.0.vm00.stdout:3/602: dwrite fb [0,4194304] 0
2026-03-10T12:37:58.243 INFO:tasks.workunit.client.1.vm07.stdout:8/487: write d1/d3/d11/f35 [504865,90941] 0
2026-03-10T12:37:58.248 INFO:tasks.workunit.client.1.vm07.stdout:9/551: truncate d5/d13/d2c/f41 986076 0
2026-03-10T12:37:58.248 INFO:tasks.workunit.client.0.vm00.stdout:5/562: creat d1f/d96/dbd/fc5 x:0 0 0
2026-03-10T12:37:58.249 INFO:tasks.workunit.client.0.vm00.stdout:7/393: fdatasync da/d41/f4b 0
2026-03-10T12:37:58.249 INFO:tasks.workunit.client.0.vm00.stdout:7/394: readlink da/d3f/l8f 0
2026-03-10T12:37:58.250 INFO:tasks.workunit.client.1.vm07.stdout:9/552: chown d5/d1f/d7d 1636507 1
2026-03-10T12:37:58.255 INFO:tasks.workunit.client.0.vm00.stdout:5/563: creat d1f/d26/d2b/d35/d53/d72/d9d/d90/fc6 x:0 0 0
2026-03-10T12:37:58.256 INFO:tasks.workunit.client.1.vm07.stdout:6/459: creat d1/d4/d6/f91 x:0 0 0
2026-03-10T12:37:58.259 INFO:tasks.workunit.client.0.vm00.stdout:2/533: link d4/d6/d41/d6d/l84 d4/d53/lb2 0
2026-03-10T12:37:58.272 INFO:tasks.workunit.client.1.vm07.stdout:6/460: dwrite d1/d4/d6/f80 [0,4194304] 0
2026-03-10T12:37:58.272 INFO:tasks.workunit.client.0.vm00.stdout:2/534: write d4/d6/f89 [522948,122679] 0
2026-03-10T12:37:58.272 INFO:tasks.workunit.client.0.vm00.stdout:2/535: symlink d4/dd/lb3 0
2026-03-10T12:37:58.275 INFO:tasks.workunit.client.0.vm00.stdout:7/395: sync
2026-03-10T12:37:58.276 INFO:tasks.workunit.client.1.vm07.stdout:6/461: dwrite d1/d4/d6/d43/d65/f7f [0,4194304] 0
2026-03-10T12:37:58.280 INFO:tasks.workunit.client.1.vm07.stdout:6/462: dread - d1/d4/d6/d16/d1a/d33/f74 zero size
2026-03-10T12:37:58.289 INFO:tasks.workunit.client.1.vm07.stdout:0/551: mknod d0/d83/cb4 0
2026-03-10T12:37:58.294 INFO:tasks.workunit.client.1.vm07.stdout:2/399: creat d0/d42/d4e/d77/f89 x:0 0 0
2026-03-10T12:37:58.300 INFO:tasks.workunit.client.0.vm00.stdout:5/564: dread d1f/d26/d2e/f8c [0,4194304] 0
2026-03-10T12:37:58.302 INFO:tasks.workunit.client.1.vm07.stdout:5/528: mkdir d0/d22/d18/d3e/d5d/db6 0
2026-03-10T12:37:58.304 INFO:tasks.workunit.client.0.vm00.stdout:5/565: link d1f/d26/d6f/f9b d1f/d26/d2b/d35/d78/fc7 0
2026-03-10T12:37:58.304 INFO:tasks.workunit.client.1.vm07.stdout:8/488: truncate d1/f2 8213324 0
2026-03-10T12:37:58.305 INFO:tasks.workunit.client.0.vm00.stdout:5/566: creat d1f/d26/d2b/d35/d53/d72/d9d/d90/fc8 x:0 0 0
2026-03-10T12:37:58.306 INFO:tasks.workunit.client.0.vm00.stdout:5/567: read - d1f/d39/f5f zero size
2026-03-10T12:37:58.307 INFO:tasks.workunit.client.0.vm00.stdout:6/395: write d2/d39/f4a [2847126,122880] 0
2026-03-10T12:37:58.312 INFO:tasks.workunit.client.1.vm07.stdout:1/481: symlink d9/df/d29/l9e 0
2026-03-10T12:37:58.312 INFO:tasks.workunit.client.0.vm00.stdout:6/396: dwrite d2/d14/d7a/f8d [0,4194304] 0
2026-03-10T12:37:58.317 INFO:tasks.workunit.client.0.vm00.stdout:8/430: dwrite d0/d12/d36/f41 [0,4194304] 0
2026-03-10T12:37:58.320 INFO:tasks.workunit.client.0.vm00.stdout:6/397: unlink d2/d14/f2e 0
2026-03-10T12:37:58.321 INFO:tasks.workunit.client.0.vm00.stdout:8/431: rmdir d0/d12/d2d 39
2026-03-10T12:37:58.321 INFO:tasks.workunit.client.1.vm07.stdout:3/538: mknod dc/dd/d43/d76/daf/cbd 0
2026-03-10T12:37:58.322 INFO:tasks.workunit.client.1.vm07.stdout:3/539: readlink dc/dd/lad 0
2026-03-10T12:37:58.323 INFO:tasks.workunit.client.1.vm07.stdout:3/540: write dc/d18/d99/da3/fb1 [3829205,83182] 0
2026-03-10T12:37:58.330 INFO:tasks.workunit.client.1.vm07.stdout:4/621: link d0/d4/d10/d5f/l8b d0/d4/d5/d78/lda 0
2026-03-10T12:37:58.347 INFO:tasks.workunit.client.0.vm00.stdout:6/398: unlink d2/d16/f19 0
2026-03-10T12:37:58.348 INFO:tasks.workunit.client.0.vm00.stdout:6/399: stat d2/d16/l65 0
2026-03-10T12:37:58.350 INFO:tasks.workunit.client.1.vm07.stdout:6/463: creat d1/d4/d6/d16/d1a/d33/f92 x:0 0 0
2026-03-10T12:37:58.352 INFO:tasks.workunit.client.0.vm00.stdout:4/533: truncate df/f19 156683 0
2026-03-10T12:37:58.355 INFO:tasks.workunit.client.1.vm07.stdout:0/552: chown d0/d14/d5f/d76/ca5 580312 1
2026-03-10T12:37:58.355 INFO:tasks.workunit.client.0.vm00.stdout:9/596: dwrite d0/f21 [4194304,4194304] 0
2026-03-10T12:37:58.358 INFO:tasks.workunit.client.1.vm07.stdout:9/553: dread d5/d13/d57/d4f/f63 [0,4194304] 0
2026-03-10T12:37:58.358 INFO:tasks.workunit.client.0.vm00.stdout:3/603: truncate dd/d64/fb5 1040546 0
2026-03-10T12:37:58.361 INFO:tasks.workunit.client.0.vm00.stdout:9/597: fsync d0/d3d/d59/d74/faa 0
2026-03-10T12:37:58.361 INFO:tasks.workunit.client.1.vm07.stdout:0/553: dwrite d0/d14/d5f/d76/d2f/d31/d79/d9e/fb1 [0,4194304] 0
2026-03-10T12:37:58.362 INFO:tasks.workunit.client.0.vm00.stdout:3/604: mknod dd/d4e/cc7 0
2026-03-10T12:37:58.365 INFO:tasks.workunit.client.0.vm00.stdout:9/598: rmdir d0/d5 39
2026-03-10T12:37:58.376 INFO:tasks.workunit.client.0.vm00.stdout:9/599: dread d0/d3d/d59/d4e/dba/d1e/d27/f75 [0,4194304] 0
2026-03-10T12:37:58.376 INFO:tasks.workunit.client.0.vm00.stdout:0/501: write d3/d7/d4c/d5b/f56 [2367298,129614] 0
2026-03-10T12:37:58.376 INFO:tasks.workunit.client.1.vm07.stdout:5/529: symlink d0/d22/d18/lb7 0
2026-03-10T12:37:58.383 INFO:tasks.workunit.client.1.vm07.stdout:8/489: mknod d1/d3/d6/d50/d70/c9d 0
2026-03-10T12:37:58.385 INFO:tasks.workunit.client.1.vm07.stdout:8/490: rename d1/d3/d6/d50/d70 to d1/d3/d6/d50/d70/d9e 22
2026-03-10T12:37:58.386 INFO:tasks.workunit.client.1.vm07.stdout:0/554: dread d0/f1d [0,4194304] 0
2026-03-10T12:37:58.393 INFO:tasks.workunit.client.0.vm00.stdout:8/432: dread d0/f56 [0,4194304] 0
2026-03-10T12:37:58.398 INFO:tasks.workunit.client.1.vm07.stdout:4/622: sync
2026-03-10T12:37:58.399 INFO:tasks.workunit.client.0.vm00.stdout:8/433: write d0/dd/d38/f3d [848760,130151] 0
2026-03-10T12:37:58.399 INFO:tasks.workunit.client.0.vm00.stdout:8/434: mkdir d0/d46/d89 0
2026-03-10T12:37:58.399 INFO:tasks.workunit.client.0.vm00.stdout:8/435: creat d0/d46/d7e/f8a x:0 0 0
2026-03-10T12:37:58.399 INFO:tasks.workunit.client.0.vm00.stdout:8/436: symlink d0/d12/d43/l8b 0
2026-03-10T12:37:58.400 INFO:tasks.workunit.client.1.vm07.stdout:1/482: rmdir d9/df/d29/d6b 39
2026-03-10T12:37:58.400 INFO:tasks.workunit.client.1.vm07.stdout:3/541: mknod dc/d18/d24/cbe 0
2026-03-10T12:37:58.401 INFO:tasks.workunit.client.1.vm07.stdout:2/400: creat d0/d42/d4e/d77/d70/f8a x:0 0 0
2026-03-10T12:37:58.402 INFO:tasks.workunit.client.1.vm07.stdout:3/542: dread - dc/dd/d28/d7a/fba zero size
2026-03-10T12:37:58.403 INFO:tasks.workunit.client.1.vm07.stdout:8/491: creat d1/d3/d11/f9f x:0 0 0
2026-03-10T12:37:58.403 INFO:tasks.workunit.client.0.vm00.stdout:8/437: dwrite d0/d12/d60/f7f [0,4194304] 0
2026-03-10T12:37:58.404 INFO:tasks.workunit.client.1.vm07.stdout:6/464: dwrite d1/d4/d6/d16/d49/f67 [4194304,4194304] 0
2026-03-10T12:37:58.407 INFO:tasks.workunit.client.1.vm07.stdout:6/465: truncate d1/d4/d6/d16/d1a/d33/f7b 456359 0
2026-03-10T12:37:58.408 INFO:tasks.workunit.client.1.vm07.stdout:4/623: creat d0/d4/d10/d3c/d2b/d2d/da7/fdb x:0 0 0
2026-03-10T12:37:58.409 INFO:tasks.workunit.client.1.vm07.stdout:0/555: read d0/d14/d5f/d76/d2f/d31/d4f/f70 [4091607,53195] 0
2026-03-10T12:37:58.410 INFO:tasks.workunit.client.0.vm00.stdout:8/438: write d0/d12/d2d/f6f [2687918,99135] 0
2026-03-10T12:37:58.415 INFO:tasks.workunit.client.1.vm07.stdout:1/483: fsync d9/df/d29/d2b/d3d/f47 0
2026-03-10T12:37:58.421 INFO:tasks.workunit.client.0.vm00.stdout:2/536: rename d4/d6/d41 to d4/d6/d2d/db4 0
2026-03-10T12:37:58.421 INFO:tasks.workunit.client.1.vm07.stdout:9/554: getdents d5/d13/d9d 0
2026-03-10T12:37:58.428 INFO:tasks.workunit.client.1.vm07.stdout:8/492: sync
2026-03-10T12:37:58.430 INFO:tasks.workunit.client.1.vm07.stdout:3/543: mkdir dc/dd/d1f/d45/dbf 0
2026-03-10T12:37:58.434 INFO:tasks.workunit.client.0.vm00.stdout:2/537: symlink d4/d6/d2d/lb5 0
2026-03-10T12:37:58.439 INFO:tasks.workunit.client.0.vm00.stdout:5/568: rmdir d1f/d26/d6f 39
2026-03-10T12:37:58.441 INFO:tasks.workunit.client.0.vm00.stdout:7/396: rename da/d41/f8e to da/d25/d2c/f98 0
2026-03-10T12:37:58.443 INFO:tasks.workunit.client.1.vm07.stdout:6/466: mknod d1/d4/d6/d43/c93 0
2026-03-10T12:37:58.446 INFO:tasks.workunit.client.0.vm00.stdout:3/605: dwrite dd/d27/d2c/d34/d38/f48 [0,4194304] 0
2026-03-10T12:37:58.449 INFO:tasks.workunit.client.1.vm07.stdout:4/624: symlink d0/d4/d10/d3c/d2b/d54/ldc 0
2026-03-10T12:37:58.452 INFO:tasks.workunit.client.1.vm07.stdout:3/544: dread dc/dd/d28/f67 [0,4194304] 0
2026-03-10T12:37:58.458 INFO:tasks.workunit.client.0.vm00.stdout:4/534: truncate f8 2645600 0
2026-03-10T12:37:58.458 INFO:tasks.workunit.client.0.vm00.stdout:3/606: symlink dd/d18/d13/d99/lc8 0
2026-03-10T12:37:58.458 INFO:tasks.workunit.client.0.vm00.stdout:6/400: rename d2/da/l52 to d2/d39/l91 0
2026-03-10T12:37:58.458 INFO:tasks.workunit.client.0.vm00.stdout:9/600: rename d0/d7f to d0/d7f/d88/dd4 22
2026-03-10T12:37:58.459 INFO:tasks.workunit.client.1.vm07.stdout:0/556: rmdir d0/d14/d5f/d41/d6a/d74 39
2026-03-10T12:37:58.459 INFO:tasks.workunit.client.1.vm07.stdout:0/557: chown d0/d14/d5f/d76/da1 383902332 1
2026-03-10T12:37:58.459 INFO:tasks.workunit.client.1.vm07.stdout:0/558: fdatasync d0/d14/d5f/d76/f8a 0
2026-03-10T12:37:58.459 INFO:tasks.workunit.client.1.vm07.stdout:0/559: fdatasync d0/d14/d5f/d76/d2f/d31/f4d 0
2026-03-10T12:37:58.459 INFO:tasks.workunit.client.1.vm07.stdout:1/484: mkdir d9/df/d55/d9f 0
2026-03-10T12:37:58.459 INFO:tasks.workunit.client.1.vm07.stdout:1/485: write d9/df/d29/d2b/d31/d91/d59/f73 [617977,15478] 0
2026-03-10T12:37:58.459 INFO:tasks.workunit.client.1.vm07.stdout:1/486: chown d9/df/d54/f57 3558 1
2026-03-10T12:37:58.459 INFO:tasks.workunit.client.0.vm00.stdout:4/535: rmdir df/d32/d76 39
2026-03-10T12:37:58.465 INFO:tasks.workunit.client.0.vm00.stdout:8/439: rename d0/d12/d36/d51/f61 to d0/d58/f8c 0
2026-03-10T12:37:58.469 INFO:tasks.workunit.client.0.vm00.stdout:7/397: dread da/d1b/f1e [0,4194304] 0
2026-03-10T12:37:58.469 INFO:tasks.workunit.client.0.vm00.stdout:9/601: unlink d0/d7f/db8/dc4/l55 0
2026-03-10T12:37:58.469 INFO:tasks.workunit.client.1.vm07.stdout:8/493: fdatasync d1/f79 0
2026-03-10T12:37:58.471 INFO:tasks.workunit.client.0.vm00.stdout:4/536: write df/d63/d77/f8f [581244,87450] 0
2026-03-10T12:37:58.489 INFO:tasks.workunit.client.1.vm07.stdout:6/467: rename d1/d4/d6/d43/f73 to d1/d4/d4a/f94 0
2026-03-10T12:37:58.489 INFO:tasks.workunit.client.1.vm07.stdout:6/468: dwrite d1/d4/d6/f13 [0,4194304] 0
2026-03-10T12:37:58.489 INFO:tasks.workunit.client.1.vm07.stdout:7/478: truncate d0/d47/d48/f9e 1513024 0
2026-03-10T12:37:58.489 INFO:tasks.workunit.client.1.vm07.stdout:3/545: creat dc/fc0 x:0 0 0
2026-03-10T12:37:58.489 INFO:tasks.workunit.client.0.vm00.stdout:3/607: creat dd/d18/d13/d1d/fc9 x:0 0 0
2026-03-10T12:37:58.489 INFO:tasks.workunit.client.0.vm00.stdout:3/608: dwrite dd/d18/d14/fbe [0,4194304] 0
2026-03-10T12:37:58.489 INFO:tasks.workunit.client.0.vm00.stdout:7/398: mknod da/d26/d37/c99 0
2026-03-10T12:37:58.489 INFO:tasks.workunit.client.0.vm00.stdout:2/538: rename d4/d53/d68/c90 to d4/d6/cb6 0
2026-03-10T12:37:58.489 INFO:tasks.workunit.client.0.vm00.stdout:4/537: truncate df/d1f/d22/f4c 711912 0
2026-03-10T12:37:58.489 INFO:tasks.workunit.client.0.vm00.stdout:4/538: write df/d1f/d36/f6f [3398411,67365] 0
2026-03-10T12:37:58.489 INFO:tasks.workunit.client.0.vm00.stdout:3/609: getdents dd/d18/d13/d1d/d43 0
2026-03-10T12:37:58.489 INFO:tasks.workunit.client.0.vm00.stdout:4/539: mkdir df/d63/d94/db6 0
2026-03-10T12:37:58.489 INFO:tasks.workunit.client.0.vm00.stdout:4/540: chown df/d1f/d22 17536 1
2026-03-10T12:37:58.489 INFO:tasks.workunit.client.0.vm00.stdout:0/502: dwrite d3/d7/f70 [4194304,4194304] 0
2026-03-10T12:37:58.490 INFO:tasks.workunit.client.0.vm00.stdout:4/541: write df/d1f/d36/f92 [182572,98798] 0
2026-03-10T12:37:58.507 INFO:tasks.workunit.client.1.vm07.stdout:8/494: unlink d1/d3/d5d/l76 0
2026-03-10T12:37:58.508 INFO:tasks.workunit.client.1.vm07.stdout:1/487: fdatasync d9/df/d54/f7a 0
2026-03-10T12:37:58.508 INFO:tasks.workunit.client.1.vm07.stdout:1/488: dread - d9/df/f96 zero size
2026-03-10T12:37:58.512 INFO:tasks.workunit.client.0.vm00.stdout:4/542: fsync df/d32/d76/f7e 0
2026-03-10T12:37:58.513 INFO:tasks.workunit.client.1.vm07.stdout:6/469: rename d1/d4/d6/d16/d49/l5c to d1/d4/d6/d43/d65/l95 0
2026-03-10T12:37:58.515 INFO:tasks.workunit.client.1.vm07.stdout:6/470: write d1/d4/d6/d16/d1a/d2c/f78 [2549804,82113] 0
2026-03-10T12:37:58.523 INFO:tasks.workunit.client.1.vm07.stdout:7/479: mkdir d0/d57/d62/d90/da1 0
2026-03-10T12:37:58.534 INFO:tasks.workunit.client.0.vm00.stdout:9/602: sync
2026-03-10T12:37:58.556 INFO:tasks.workunit.client.1.vm07.stdout:2/401: getdents d0 0
2026-03-10T12:37:58.566 INFO:tasks.workunit.client.1.vm07.stdout:6/471: dread d1/d4/d44/f45 [0,4194304] 0
2026-03-10T12:37:58.570 INFO:tasks.workunit.client.1.vm07.stdout:5/530: dwrite d0/d22/f16 [0,4194304] 0
2026-03-10T12:37:58.582 INFO:tasks.workunit.client.1.vm07.stdout:3/546: rename dc/d18/d24/f2c to dc/dd/d28/d3b/fc1 0
2026-03-10T12:37:58.583 INFO:tasks.workunit.client.1.vm07.stdout:3/547: chown dc/dd/d43/d76 3913867 1
2026-03-10T12:37:58.589 INFO:tasks.workunit.client.1.vm07.stdout:3/548: fsync dc/dd/d43/d76/d95/da0/fa2 0
2026-03-10T12:37:58.637 INFO:tasks.workunit.client.1.vm07.stdout:2/402:
symlink d0/d42/d4e/l8b 0 2026-03-10T12:37:58.661 INFO:tasks.workunit.client.1.vm07.stdout:5/531: rmdir d0 39 2026-03-10T12:37:58.666 INFO:tasks.workunit.client.1.vm07.stdout:1/489: creat d9/d2d/d80/d8e/fa0 x:0 0 0 2026-03-10T12:37:58.666 INFO:tasks.workunit.client.1.vm07.stdout:1/490: truncate d9/df/d29/f70 1292274 0 2026-03-10T12:37:58.676 INFO:tasks.workunit.client.0.vm00.stdout:1/532: dread da/d12/f20 [0,4194304] 0 2026-03-10T12:37:58.677 INFO:tasks.workunit.client.0.vm00.stdout:1/533: creat da/d12/d91/fb5 x:0 0 0 2026-03-10T12:37:58.678 INFO:tasks.workunit.client.0.vm00.stdout:1/534: read - da/d21/d27/d6a/f6b zero size 2026-03-10T12:37:58.686 INFO:tasks.workunit.client.0.vm00.stdout:5/569: write d1f/d26/f28 [981950,17026] 0 2026-03-10T12:37:58.687 INFO:tasks.workunit.client.0.vm00.stdout:5/570: fdatasync d1f/d26/d2b/d35/d53/d72/d9d/d8e/fb6 0 2026-03-10T12:37:58.691 INFO:tasks.workunit.client.0.vm00.stdout:8/440: read - d0/d58/f8c zero size 2026-03-10T12:37:58.692 INFO:tasks.workunit.client.0.vm00.stdout:6/401: dwrite d2/d51/f63 [0,4194304] 0 2026-03-10T12:37:58.697 INFO:tasks.workunit.client.0.vm00.stdout:5/571: rename d1f/d26/d2b/d35/d53/d72/d9d/d90 to d1f/d6a/d94/dc9 0 2026-03-10T12:37:58.702 INFO:tasks.workunit.client.0.vm00.stdout:2/539: write d4/d53/f7d [173507,82871] 0 2026-03-10T12:37:58.703 INFO:tasks.workunit.client.0.vm00.stdout:2/540: read - d4/d53/d68/f69 zero size 2026-03-10T12:37:58.703 INFO:tasks.workunit.client.1.vm07.stdout:9/555: dwrite d5/f91 [0,4194304] 0 2026-03-10T12:37:58.704 INFO:tasks.workunit.client.1.vm07.stdout:9/556: readlink d5/d13/d6c/d7a/l9c 0 2026-03-10T12:37:58.704 INFO:tasks.workunit.client.0.vm00.stdout:2/541: stat d4/dd/d63/l6b 0 2026-03-10T12:37:58.708 INFO:tasks.workunit.client.0.vm00.stdout:3/610: dwrite dd/d27/d2c/fb1 [0,4194304] 0 2026-03-10T12:37:58.711 INFO:tasks.workunit.client.0.vm00.stdout:1/535: dread da/d12/f1d [0,4194304] 0 2026-03-10T12:37:58.713 INFO:tasks.workunit.client.0.vm00.stdout:1/536: dread - 
da/d21/d39/f92 zero size 2026-03-10T12:37:58.714 INFO:tasks.workunit.client.0.vm00.stdout:4/543: dwrite df/d1f/d36/d3a/d41/f47 [0,4194304] 0 2026-03-10T12:37:58.714 INFO:tasks.workunit.client.0.vm00.stdout:1/537: chown da/d24/d28/d67/c5f 13 1 2026-03-10T12:37:58.716 INFO:tasks.workunit.client.0.vm00.stdout:4/544: chown df/d63 1385 1 2026-03-10T12:37:58.717 INFO:tasks.workunit.client.1.vm07.stdout:2/403: fsync d0/d42/d26/d7d/f7f 0 2026-03-10T12:37:58.719 INFO:tasks.workunit.client.0.vm00.stdout:1/538: dread da/d12/f99 [0,4194304] 0 2026-03-10T12:37:58.724 INFO:tasks.workunit.client.0.vm00.stdout:7/399: dwrite da/f13 [0,4194304] 0 2026-03-10T12:37:58.725 INFO:tasks.workunit.client.0.vm00.stdout:8/441: rename d0/d12/d36/c3c to d0/dd/d38/d81/c8d 0 2026-03-10T12:37:58.726 INFO:tasks.workunit.client.0.vm00.stdout:8/442: write d0/d12/d2d/f6f [3770967,114384] 0 2026-03-10T12:37:58.726 INFO:tasks.workunit.client.0.vm00.stdout:5/572: symlink d1f/d26/d2e/lca 0 2026-03-10T12:37:58.727 INFO:tasks.workunit.client.0.vm00.stdout:5/573: readlink d1f/d39/l40 0 2026-03-10T12:37:58.728 INFO:tasks.workunit.client.1.vm07.stdout:0/560: write d0/f15 [2978755,48847] 0 2026-03-10T12:37:58.728 INFO:tasks.workunit.client.1.vm07.stdout:0/561: fdatasync d0/d14/d5f/f54 0 2026-03-10T12:37:58.732 INFO:tasks.workunit.client.0.vm00.stdout:2/542: mknod d4/d6/cb7 0 2026-03-10T12:37:58.733 INFO:tasks.workunit.client.1.vm07.stdout:5/532: fsync d0/d22/d18/d19/d36/f3d 0 2026-03-10T12:37:58.735 INFO:tasks.workunit.client.0.vm00.stdout:7/400: creat da/d26/d37/d56/f9a x:0 0 0 2026-03-10T12:37:58.738 INFO:tasks.workunit.client.0.vm00.stdout:7/401: dwrite da/d1b/d40/f7d [0,4194304] 0 2026-03-10T12:37:58.744 INFO:tasks.workunit.client.1.vm07.stdout:4/625: dwrite d0/d4/d5/d34/fa3 [0,4194304] 0 2026-03-10T12:37:58.744 INFO:tasks.workunit.client.0.vm00.stdout:4/545: mkdir df/d57/db7 0 2026-03-10T12:37:58.744 INFO:tasks.workunit.client.0.vm00.stdout:4/546: readlink df/d1f/d36/d3a/d41/l46 0 2026-03-10T12:37:58.744 
INFO:tasks.workunit.client.0.vm00.stdout:4/547: stat df/d1f/d22/d26/d65/da7 0 2026-03-10T12:37:58.746 INFO:tasks.workunit.client.1.vm07.stdout:2/404: dwrite d0/d42/d26/f2e [4194304,4194304] 0 2026-03-10T12:37:58.747 INFO:tasks.workunit.client.1.vm07.stdout:2/405: chown d0/d42/d26/f3e 803 1 2026-03-10T12:37:58.748 INFO:tasks.workunit.client.0.vm00.stdout:3/611: dwrite dd/d18/d13/f22 [0,4194304] 0 2026-03-10T12:37:58.749 INFO:tasks.workunit.client.0.vm00.stdout:5/574: mkdir d1f/d26/d2b/d35/d53/d72/d9d/dcb 0 2026-03-10T12:37:58.749 INFO:tasks.workunit.client.0.vm00.stdout:8/443: chown d0/c2f 114 1 2026-03-10T12:37:58.750 INFO:tasks.workunit.client.0.vm00.stdout:0/503: write d3/d7/d4c/d5b/f37 [8151956,101113] 0 2026-03-10T12:37:58.750 INFO:tasks.workunit.client.0.vm00.stdout:5/575: stat d1f/d26/d2e/d58 0 2026-03-10T12:37:58.753 INFO:tasks.workunit.client.0.vm00.stdout:8/444: truncate d0/d12/d2d/d49/f84 1473255 0 2026-03-10T12:37:58.754 INFO:tasks.workunit.client.0.vm00.stdout:6/402: dread d2/d14/f3f [0,4194304] 0 2026-03-10T12:37:58.754 INFO:tasks.workunit.client.1.vm07.stdout:2/406: dwrite d0/f4a [0,4194304] 0 2026-03-10T12:37:58.756 INFO:tasks.workunit.client.1.vm07.stdout:2/407: chown d0/d42/d4e/d77/d70 32644 1 2026-03-10T12:37:58.760 INFO:tasks.workunit.client.0.vm00.stdout:7/402: symlink da/d25/d2e/l9b 0 2026-03-10T12:37:58.760 INFO:tasks.workunit.client.0.vm00.stdout:7/403: write da/d3f/d60/f85 [47281,44678] 0 2026-03-10T12:37:58.761 INFO:tasks.workunit.client.0.vm00.stdout:7/404: chown da/d1b 24020 1 2026-03-10T12:37:58.761 INFO:tasks.workunit.client.0.vm00.stdout:7/405: read da/d25/d2e/d4c/f92 [3693637,33184] 0 2026-03-10T12:37:58.762 INFO:tasks.workunit.client.0.vm00.stdout:7/406: chown da/d25/d2c/d82/d68/f38 2367 1 2026-03-10T12:37:58.762 INFO:tasks.workunit.client.0.vm00.stdout:7/407: stat da/d41/f72 0 2026-03-10T12:37:58.762 INFO:tasks.workunit.client.0.vm00.stdout:2/543: symlink d4/d6/lb8 0 2026-03-10T12:37:58.767 
INFO:tasks.workunit.client.0.vm00.stdout:0/504: stat d3/d7/d3c/l20 0 2026-03-10T12:37:58.767 INFO:tasks.workunit.client.1.vm07.stdout:6/472: write d1/d4/d6/d16/d1a/d33/f74 [541420,67414] 0 2026-03-10T12:37:58.768 INFO:tasks.workunit.client.0.vm00.stdout:7/408: dwrite da/d26/d37/f96 [0,4194304] 0 2026-03-10T12:37:58.771 INFO:tasks.workunit.client.1.vm07.stdout:8/495: dwrite d1/d3/f1f [4194304,4194304] 0 2026-03-10T12:37:58.781 INFO:tasks.workunit.client.1.vm07.stdout:0/562: creat d0/d14/d5f/d76/d2f/d31/d79/d85/fb5 x:0 0 0 2026-03-10T12:37:58.781 INFO:tasks.workunit.client.0.vm00.stdout:0/505: dread d3/d7/d4c/f76 [0,4194304] 0 2026-03-10T12:37:58.781 INFO:tasks.workunit.client.0.vm00.stdout:3/612: rename dd/d2a/la3 to dd/d18/d14/d2b/lca 0 2026-03-10T12:37:58.781 INFO:tasks.workunit.client.0.vm00.stdout:6/403: truncate d2/da/f11 3352945 0 2026-03-10T12:37:58.793 INFO:tasks.workunit.client.0.vm00.stdout:4/548: mknod df/d6c/d90/cb8 0 2026-03-10T12:37:58.806 INFO:tasks.workunit.client.0.vm00.stdout:4/549: mkdir df/d1f/d22/d26/d65/d91/db9 0 2026-03-10T12:37:58.810 INFO:tasks.workunit.client.0.vm00.stdout:3/613: mknod dd/d3d/ccb 0 2026-03-10T12:37:58.812 INFO:tasks.workunit.client.0.vm00.stdout:4/550: creat df/d1f/d22/d26/d65/fba x:0 0 0 2026-03-10T12:37:58.817 INFO:tasks.workunit.client.0.vm00.stdout:2/544: rename d4/d6/d2d/db4 to d4/dd/db9 0 2026-03-10T12:37:58.820 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:58 vm07.local ceph-mon[58582]: Upgrade: Updating mgr.vm07.kfawlb 2026-03-10T12:37:58.821 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:58 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:37:58.821 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:58 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.kfawlb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 
2026-03-10T12:37:58.821 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:58 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T12:37:58.821 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:58 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:37:58.821 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:58 vm07.local ceph-mon[58582]: Deploying daemon mgr.vm07.kfawlb on vm07 2026-03-10T12:37:58.826 INFO:tasks.workunit.client.0.vm00.stdout:3/614: read dd/d64/f7b [635876,40568] 0 2026-03-10T12:37:58.839 INFO:tasks.workunit.client.0.vm00.stdout:3/615: read - dd/d4e/d6a/fc5 zero size 2026-03-10T12:37:58.839 INFO:tasks.workunit.client.0.vm00.stdout:4/551: unlink df/d1f/d22/d26/f84 0 2026-03-10T12:37:58.839 INFO:tasks.workunit.client.0.vm00.stdout:5/576: dread d1f/d26/d2e/f3c [0,4194304] 0 2026-03-10T12:37:58.839 INFO:tasks.workunit.client.0.vm00.stdout:7/409: rmdir da/d26/d50 39 2026-03-10T12:37:58.839 INFO:tasks.workunit.client.0.vm00.stdout:7/410: dwrite da/d3f/d71/f8c [0,4194304] 0 2026-03-10T12:37:58.844 INFO:tasks.workunit.client.0.vm00.stdout:4/552: symlink df/d1f/lbb 0 2026-03-10T12:37:58.845 INFO:tasks.workunit.client.0.vm00.stdout:3/616: creat dd/d18/d13/d99/da5/fcc x:0 0 0 2026-03-10T12:37:58.848 INFO:tasks.workunit.client.0.vm00.stdout:3/617: rmdir dd/d27 39 2026-03-10T12:37:58.851 INFO:tasks.workunit.client.0.vm00.stdout:2/545: getdents d4/d53/d76 0 2026-03-10T12:37:58.852 INFO:tasks.workunit.client.0.vm00.stdout:2/546: chown d4/d6/f4e 208 1 2026-03-10T12:37:58.852 INFO:tasks.workunit.client.0.vm00.stdout:2/547: write d4/dd/ff [2425772,85524] 0 2026-03-10T12:37:58.856 INFO:tasks.workunit.client.0.vm00.stdout:2/548: mkdir d4/d53/d76/dba 0 2026-03-10T12:37:58.856 INFO:tasks.workunit.client.0.vm00.stdout:2/549: chown d4/f73 29852442 1 
2026-03-10T12:37:58.857 INFO:tasks.workunit.client.0.vm00.stdout:5/577: getdents d1f/d26/d6f 0 2026-03-10T12:37:58.857 INFO:tasks.workunit.client.0.vm00.stdout:5/578: readlink d1f/lb1 0 2026-03-10T12:37:58.858 INFO:tasks.workunit.client.1.vm07.stdout:6/473: chown d1/d4/l8a 161086972 1 2026-03-10T12:37:58.858 INFO:tasks.workunit.client.0.vm00.stdout:2/550: mknod d4/dd/db9/d6d/cbb 0 2026-03-10T12:37:58.859 INFO:tasks.workunit.client.1.vm07.stdout:6/474: chown d1/d4/d6/d16/d1a/f6a 1867177 1 2026-03-10T12:37:58.861 INFO:tasks.workunit.client.1.vm07.stdout:6/475: dread d1/d4/d6/d16/d49/f67 [0,4194304] 0 2026-03-10T12:37:58.863 INFO:tasks.workunit.client.0.vm00.stdout:2/551: dread - d4/dd/db9/f7a zero size 2026-03-10T12:37:58.868 INFO:tasks.workunit.client.1.vm07.stdout:8/496: fdatasync d1/f3d 0 2026-03-10T12:37:58.868 INFO:tasks.workunit.client.0.vm00.stdout:3/618: creat dd/d27/d2c/d34/fcd x:0 0 0 2026-03-10T12:37:58.868 INFO:tasks.workunit.client.0.vm00.stdout:5/579: mkdir d1f/d26/d2b/d37/dcc 0 2026-03-10T12:37:58.868 INFO:tasks.workunit.client.0.vm00.stdout:5/580: dread - d1f/d26/d2b/d35/d53/d72/d9d/d8e/fb6 zero size 2026-03-10T12:37:58.868 INFO:tasks.workunit.client.0.vm00.stdout:2/552: mkdir d4/d53/d76/d9b/dad/dbc 0 2026-03-10T12:37:58.868 INFO:tasks.workunit.client.0.vm00.stdout:7/411: creat da/d25/d2e/f9c x:0 0 0 2026-03-10T12:37:58.870 INFO:tasks.workunit.client.0.vm00.stdout:3/619: creat dd/d64/d93/fce x:0 0 0 2026-03-10T12:37:58.875 INFO:tasks.workunit.client.0.vm00.stdout:2/553: creat d4/d6/d2d/d3a/fbd x:0 0 0 2026-03-10T12:37:58.876 INFO:tasks.workunit.client.0.vm00.stdout:7/412: mkdir da/d41/d7b/d9d 0 2026-03-10T12:37:58.877 INFO:tasks.workunit.client.0.vm00.stdout:2/554: write d4/f6e [4495064,84397] 0 2026-03-10T12:37:58.877 INFO:tasks.workunit.client.0.vm00.stdout:2/555: stat d4/d6/cb7 0 2026-03-10T12:37:58.884 INFO:tasks.workunit.client.0.vm00.stdout:2/556: creat d4/d53/d76/d9b/dad/dbc/fbe x:0 0 0 2026-03-10T12:37:58.885 
INFO:tasks.workunit.client.1.vm07.stdout:3/549: write dc/dd/d28/d7a/f88 [4174553,49825] 0 2026-03-10T12:37:58.888 INFO:tasks.workunit.client.0.vm00.stdout:2/557: truncate d4/d53/d76/d9b/dad/f80 210522 0 2026-03-10T12:37:58.889 INFO:tasks.workunit.client.0.vm00.stdout:2/558: creat d4/d6/d93/fbf x:0 0 0 2026-03-10T12:37:58.891 INFO:tasks.workunit.client.0.vm00.stdout:2/559: mknod d4/d53/d76/cc0 0 2026-03-10T12:37:58.891 INFO:tasks.workunit.client.1.vm07.stdout:1/491: dwrite d9/df/d29/d2b/f7c [0,4194304] 0 2026-03-10T12:37:58.891 INFO:tasks.workunit.client.1.vm07.stdout:7/480: dwrite d0/d47/d48/f9e [0,4194304] 0 2026-03-10T12:37:58.895 INFO:tasks.workunit.client.1.vm07.stdout:7/481: truncate d0/d61/f93 918547 0 2026-03-10T12:37:58.895 INFO:tasks.workunit.client.1.vm07.stdout:9/557: dwrite d5/d13/f2b [0,4194304] 0 2026-03-10T12:37:58.905 INFO:tasks.workunit.client.1.vm07.stdout:7/482: dwrite d0/f2b [0,4194304] 0 2026-03-10T12:37:58.915 INFO:tasks.workunit.client.0.vm00.stdout:7/413: symlink da/d26/d37/l9e 0 2026-03-10T12:37:58.938 INFO:tasks.workunit.client.1.vm07.stdout:5/533: dread d0/d22/f27 [0,4194304] 0 2026-03-10T12:37:58.944 INFO:tasks.workunit.client.0.vm00.stdout:9/603: write d0/d3d/d43/d53/f66 [1239119,96352] 0 2026-03-10T12:37:58.950 INFO:tasks.workunit.client.0.vm00.stdout:9/604: creat d0/d3d/d59/d4e/dba/fd5 x:0 0 0 2026-03-10T12:37:58.950 INFO:tasks.workunit.client.0.vm00.stdout:9/605: creat d0/d3d/d43/d53/fd6 x:0 0 0 2026-03-10T12:37:58.954 INFO:tasks.workunit.client.1.vm07.stdout:2/408: fdatasync d0/d42/d26/d38/d4f/f65 0 2026-03-10T12:37:58.958 INFO:tasks.workunit.client.1.vm07.stdout:0/563: truncate d0/d14/d5f/d76/d2f/d31/d4f/d60/f89 176151 0 2026-03-10T12:37:58.960 INFO:tasks.workunit.client.1.vm07.stdout:3/550: unlink dc/dd/d1f/d6f/f8c 0 2026-03-10T12:37:58.963 INFO:tasks.workunit.client.1.vm07.stdout:7/483: dread - d0/d57/d62/f75 zero size 2026-03-10T12:37:58.964 INFO:tasks.workunit.client.1.vm07.stdout:7/484: read - d0/f5f zero size 
2026-03-10T12:37:58.964 INFO:tasks.workunit.client.1.vm07.stdout:7/485: write d0/d57/d62/f6c [2887478,125551] 0 2026-03-10T12:37:58.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:58 vm00.local ceph-mon[50686]: Upgrade: Updating mgr.vm07.kfawlb 2026-03-10T12:37:58.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:58 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:37:58.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:58 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.kfawlb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T12:37:58.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:58 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T12:37:58.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:58 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:37:58.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:58 vm00.local ceph-mon[50686]: Deploying daemon mgr.vm07.kfawlb on vm07 2026-03-10T12:37:58.984 INFO:tasks.workunit.client.1.vm07.stdout:6/476: mkdir d1/d4/d6/d96 0 2026-03-10T12:37:58.997 INFO:tasks.workunit.client.0.vm00.stdout:0/506: dread d3/d7/d4c/f73 [0,4194304] 0 2026-03-10T12:37:59.001 INFO:tasks.workunit.client.1.vm07.stdout:7/486: creat d0/d67/d6f/fa2 x:0 0 0 2026-03-10T12:37:59.001 INFO:tasks.workunit.client.1.vm07.stdout:7/487: chown d0/f4f 402465622 1 2026-03-10T12:37:59.009 INFO:tasks.workunit.client.1.vm07.stdout:6/477: rmdir d1/d4/d6/d16/d49 39 2026-03-10T12:37:59.011 INFO:tasks.workunit.client.1.vm07.stdout:9/558: link d5/d13/d6c/d7a/l9c d5/d16/d18/lc0 0 2026-03-10T12:37:59.013 
INFO:tasks.workunit.client.1.vm07.stdout:6/478: dwrite d1/d4/d6/d16/d1a/f29 [4194304,4194304] 0 2026-03-10T12:37:59.027 INFO:tasks.workunit.client.1.vm07.stdout:8/497: getdents d1/d3/d40 0 2026-03-10T12:37:59.028 INFO:tasks.workunit.client.1.vm07.stdout:1/492: getdents d9/df/d79 0 2026-03-10T12:37:59.029 INFO:tasks.workunit.client.1.vm07.stdout:9/559: mknod d5/d13/d57/d4f/d6a/cc1 0 2026-03-10T12:37:59.030 INFO:tasks.workunit.client.0.vm00.stdout:0/507: getdents d3/d7/d4c/d5b/d38/d44/d5a 0 2026-03-10T12:37:59.032 INFO:tasks.workunit.client.1.vm07.stdout:6/479: unlink d1/d4/d6/d16/d1a/d33/f74 0 2026-03-10T12:37:59.037 INFO:tasks.workunit.client.1.vm07.stdout:2/409: getdents d0/d42 0 2026-03-10T12:37:59.037 INFO:tasks.workunit.client.1.vm07.stdout:8/498: rmdir d1/d3/d40/d92 39 2026-03-10T12:37:59.040 INFO:tasks.workunit.client.1.vm07.stdout:8/499: dwrite d1/fc [0,4194304] 0 2026-03-10T12:37:59.054 INFO:tasks.workunit.client.1.vm07.stdout:3/551: getdents dc/dd 0 2026-03-10T12:37:59.054 INFO:tasks.workunit.client.0.vm00.stdout:0/508: truncate d3/d7/d4c/d5b/d38/f89 5780857 0 2026-03-10T12:37:59.055 INFO:tasks.workunit.client.1.vm07.stdout:9/560: creat d5/d69/fc2 x:0 0 0 2026-03-10T12:37:59.061 INFO:tasks.workunit.client.0.vm00.stdout:5/581: dread d1f/d26/d2b/d35/d53/d72/fa0 [0,4194304] 0 2026-03-10T12:37:59.062 INFO:tasks.workunit.client.1.vm07.stdout:1/493: rename d9/df/d29/d2b/d31/d91/d59/f68 to d9/df/d29/d6b/fa1 0 2026-03-10T12:37:59.062 INFO:tasks.workunit.client.0.vm00.stdout:5/582: mkdir d1f/d26/d2b/d35/d78/d99/dcd 0 2026-03-10T12:37:59.064 INFO:tasks.workunit.client.0.vm00.stdout:5/583: creat d1f/d26/d2b/fce x:0 0 0 2026-03-10T12:37:59.065 INFO:tasks.workunit.client.0.vm00.stdout:5/584: chown d1f/d96/lc0 226153 1 2026-03-10T12:37:59.065 INFO:tasks.workunit.client.0.vm00.stdout:5/585: fsync d1f/d26/d2e/f3c 0 2026-03-10T12:37:59.066 INFO:tasks.workunit.client.0.vm00.stdout:5/586: read - d1f/d26/d2b/d35/d53/d72/f85 zero size 2026-03-10T12:37:59.067 
INFO:tasks.workunit.client.0.vm00.stdout:5/587: write d1f/d6a/f84 [844943,33920] 0 2026-03-10T12:37:59.068 INFO:tasks.workunit.client.0.vm00.stdout:5/588: symlink d1f/d26/d2e/d58/d6b/lcf 0 2026-03-10T12:37:59.069 INFO:tasks.workunit.client.1.vm07.stdout:7/488: link d0/f3 d0/d47/d48/d8a/d9d/fa3 0 2026-03-10T12:37:59.070 INFO:tasks.workunit.client.0.vm00.stdout:5/589: creat d1f/d26/d2b/fd0 x:0 0 0 2026-03-10T12:37:59.074 INFO:tasks.workunit.client.1.vm07.stdout:7/489: dread d0/f4f [0,4194304] 0 2026-03-10T12:37:59.075 INFO:tasks.workunit.client.0.vm00.stdout:5/590: dread f19 [0,4194304] 0 2026-03-10T12:37:59.076 INFO:tasks.workunit.client.0.vm00.stdout:5/591: readlink d1f/d39/l47 0 2026-03-10T12:37:59.077 INFO:tasks.workunit.client.0.vm00.stdout:5/592: write d1f/d39/f9c [3876774,47357] 0 2026-03-10T12:37:59.079 INFO:tasks.workunit.client.0.vm00.stdout:5/593: mkdir d1f/d26/d2b/d35/d53/d5b/dd1 0 2026-03-10T12:37:59.080 INFO:tasks.workunit.client.0.vm00.stdout:0/509: mknod d3/d7/ca9 0 2026-03-10T12:37:59.080 INFO:tasks.workunit.client.0.vm00.stdout:5/594: mknod d1f/d26/d2b/d35/d53/cd2 0 2026-03-10T12:37:59.081 INFO:tasks.workunit.client.1.vm07.stdout:2/410: symlink d0/d42/d26/d38/d4f/d62/l8c 0 2026-03-10T12:37:59.085 INFO:tasks.workunit.client.0.vm00.stdout:5/595: dread d1f/d26/d2b/d35/d78/d7f/fb9 [0,4194304] 0 2026-03-10T12:37:59.087 INFO:tasks.workunit.client.1.vm07.stdout:8/500: unlink d1/f36 0 2026-03-10T12:37:59.089 INFO:tasks.workunit.client.0.vm00.stdout:5/596: mknod d1f/d26/d2b/d35/cd3 0 2026-03-10T12:37:59.099 INFO:tasks.workunit.client.0.vm00.stdout:0/510: creat d3/db/d77/faa x:0 0 0 2026-03-10T12:37:59.099 INFO:tasks.workunit.client.1.vm07.stdout:3/552: creat dc/d18/d24/d72/fc2 x:0 0 0 2026-03-10T12:37:59.100 INFO:tasks.workunit.client.0.vm00.stdout:5/597: dread d1f/d26/d2b/d35/d53/d5b/f6e [0,4194304] 0 2026-03-10T12:37:59.100 INFO:tasks.workunit.client.1.vm07.stdout:9/561: creat d5/d69/d93/d97/fc3 x:0 0 0 2026-03-10T12:37:59.101 
INFO:tasks.workunit.client.0.vm00.stdout:5/598: chown d1f/d26/d2b/d35/d53/d72/d9d/d8e/fc1 48714 1 2026-03-10T12:37:59.105 INFO:tasks.workunit.client.0.vm00.stdout:0/511: symlink d3/db/d24/d25/lab 0 2026-03-10T12:37:59.105 INFO:tasks.workunit.client.1.vm07.stdout:6/480: mkdir d1/d4/d6/d43/d88/d97 0 2026-03-10T12:37:59.112 INFO:tasks.workunit.client.0.vm00.stdout:1/539: dwrite da/d21/db3/d5d/d80/f8a [0,4194304] 0 2026-03-10T12:37:59.113 INFO:tasks.workunit.client.0.vm00.stdout:0/512: write d3/d22/f2e [4629059,11852] 0 2026-03-10T12:37:59.114 INFO:tasks.workunit.client.0.vm00.stdout:8/445: truncate d0/d12/d36/d5b/f65 4269868 0 2026-03-10T12:37:59.115 INFO:tasks.workunit.client.1.vm07.stdout:2/411: fdatasync d0/f4 0 2026-03-10T12:37:59.115 INFO:tasks.workunit.client.1.vm07.stdout:2/412: dread - d0/d42/d4e/f81 zero size 2026-03-10T12:37:59.117 INFO:tasks.workunit.client.0.vm00.stdout:1/540: rmdir da/d21/d27/d6a 39 2026-03-10T12:37:59.122 INFO:tasks.workunit.client.0.vm00.stdout:8/446: dwrite d0/d12/d17/d48/f87 [0,4194304] 0 2026-03-10T12:37:59.124 INFO:tasks.workunit.client.1.vm07.stdout:4/626: dwrite d0/d4/d7a/fcd [0,4194304] 0 2026-03-10T12:37:59.125 INFO:tasks.workunit.client.0.vm00.stdout:0/513: read f2 [3560605,80856] 0 2026-03-10T12:37:59.130 INFO:tasks.workunit.client.0.vm00.stdout:8/447: dread d0/d12/d2d/f75 [0,4194304] 0 2026-03-10T12:37:59.130 INFO:tasks.workunit.client.0.vm00.stdout:1/541: creat da/d24/d73/fb6 x:0 0 0 2026-03-10T12:37:59.132 INFO:tasks.workunit.client.0.vm00.stdout:4/553: dwrite df/f1c [0,4194304] 0 2026-03-10T12:37:59.135 INFO:tasks.workunit.client.0.vm00.stdout:6/404: truncate d2/da/dc/d2f/f4f 1450671 0 2026-03-10T12:37:59.137 INFO:tasks.workunit.client.0.vm00.stdout:6/405: write d2/da/dc/d2f/f3a [4467412,5024] 0 2026-03-10T12:37:59.141 INFO:tasks.workunit.client.0.vm00.stdout:2/560: write d4/d6/f16 [1616245,80849] 0 2026-03-10T12:37:59.148 INFO:tasks.workunit.client.1.vm07.stdout:1/494: unlink d9/df/c63 0 2026-03-10T12:37:59.148 
INFO:tasks.workunit.client.0.vm00.stdout:9/606: dwrite d0/d3d/f8f [0,4194304] 0 2026-03-10T12:37:59.153 INFO:tasks.workunit.client.1.vm07.stdout:6/481: mkdir d1/d4/d44/d98 0 2026-03-10T12:37:59.155 INFO:tasks.workunit.client.0.vm00.stdout:2/561: rename d4/d53/d76/d9b/dad/f7e to d4/fc1 0 2026-03-10T12:37:59.156 INFO:tasks.workunit.client.0.vm00.stdout:8/448: getdents d0/dd/d38/d81 0 2026-03-10T12:37:59.158 INFO:tasks.workunit.client.0.vm00.stdout:0/514: dread d3/d7/d3c/f19 [0,4194304] 0 2026-03-10T12:37:59.160 INFO:tasks.workunit.client.0.vm00.stdout:7/414: dwrite f1 [8388608,4194304] 0 2026-03-10T12:37:59.167 INFO:tasks.workunit.client.0.vm00.stdout:9/607: getdents d0 0 2026-03-10T12:37:59.177 INFO:tasks.workunit.client.0.vm00.stdout:9/608: read d0/d3d/d59/d4e/dba/d1e/d27/f9e [176552,55693] 0 2026-03-10T12:37:59.177 INFO:tasks.workunit.client.0.vm00.stdout:9/609: chown d0/d3d/d43/f54 1785613222 1 2026-03-10T12:37:59.178 INFO:tasks.workunit.client.0.vm00.stdout:1/542: dread da/d12/d26/f57 [0,4194304] 0 2026-03-10T12:37:59.179 INFO:tasks.workunit.client.0.vm00.stdout:1/543: read - da/d21/d39/f92 zero size 2026-03-10T12:37:59.180 INFO:tasks.workunit.client.0.vm00.stdout:6/406: dread d2/f5e [0,4194304] 0 2026-03-10T12:37:59.183 INFO:tasks.workunit.client.0.vm00.stdout:6/407: dwrite d2/da/dc/d2f/f56 [0,4194304] 0 2026-03-10T12:37:59.188 INFO:tasks.workunit.client.0.vm00.stdout:6/408: stat d2/d16/d74/c49 0 2026-03-10T12:37:59.192 INFO:tasks.workunit.client.0.vm00.stdout:8/449: dread d0/f8 [0,4194304] 0 2026-03-10T12:37:59.194 INFO:tasks.workunit.client.0.vm00.stdout:9/610: rename d0/d3d/d59/d4e/dba/d1e/d85/lc3 to d0/d3d/d43/ld7 0 2026-03-10T12:37:59.197 INFO:tasks.workunit.client.0.vm00.stdout:0/515: creat d3/d7/d4c/d5b/d38/d44/d5a/fac x:0 0 0 2026-03-10T12:37:59.198 INFO:tasks.workunit.client.0.vm00.stdout:8/450: readlink d0/dd/lf 0 2026-03-10T12:37:59.207 INFO:tasks.workunit.client.0.vm00.stdout:1/544: link da/l4c da/d12/db4/lb7 0 2026-03-10T12:37:59.207 
INFO:tasks.workunit.client.0.vm00.stdout:1/545: fsync f3 0 2026-03-10T12:37:59.212 INFO:tasks.workunit.client.0.vm00.stdout:9/611: truncate d0/d3d/d59/d4e/dba/d19/f1b 3582888 0 2026-03-10T12:37:59.213 INFO:tasks.workunit.client.1.vm07.stdout:1/495: chown d9/df/l4d 4373 1 2026-03-10T12:37:59.214 INFO:tasks.workunit.client.0.vm00.stdout:1/546: creat da/d12/d91/fb8 x:0 0 0 2026-03-10T12:37:59.216 INFO:tasks.workunit.client.0.vm00.stdout:9/612: symlink d0/d3d/d59/d4e/dba/d1e/d27/ld8 0 2026-03-10T12:37:59.220 INFO:tasks.workunit.client.0.vm00.stdout:1/547: mkdir da/d21/d27/d6a/d94/db9 0 2026-03-10T12:37:59.227 INFO:tasks.workunit.client.1.vm07.stdout:6/482: mkdir d1/d4/d6/d16/d1a/d99 0 2026-03-10T12:37:59.227 INFO:tasks.workunit.client.1.vm07.stdout:2/413: rename d0/f13 to d0/f8d 0 2026-03-10T12:37:59.227 INFO:tasks.workunit.client.0.vm00.stdout:0/516: chown d3/l28 5216 1 2026-03-10T12:37:59.227 INFO:tasks.workunit.client.0.vm00.stdout:3/620: dread dd/d18/f7c [4194304,4194304] 0 2026-03-10T12:37:59.227 INFO:tasks.workunit.client.0.vm00.stdout:7/415: symlink da/d47/d87/l9f 0 2026-03-10T12:37:59.228 INFO:tasks.workunit.client.0.vm00.stdout:3/621: dread dd/d64/d92/fb8 [0,4194304] 0 2026-03-10T12:37:59.231 INFO:tasks.workunit.client.0.vm00.stdout:0/517: creat d3/d7/d58/fad x:0 0 0 2026-03-10T12:37:59.232 INFO:tasks.workunit.client.1.vm07.stdout:4/627: creat d0/d4/d5/d8f/fdd x:0 0 0 2026-03-10T12:37:59.233 INFO:tasks.workunit.client.0.vm00.stdout:7/416: creat da/d41/fa0 x:0 0 0 2026-03-10T12:37:59.234 INFO:tasks.workunit.client.1.vm07.stdout:1/496: creat d9/df/d29/d2b/d31/d91/d59/fa2 x:0 0 0 2026-03-10T12:37:59.238 INFO:tasks.workunit.client.0.vm00.stdout:0/518: rmdir d3/db/da4 39 2026-03-10T12:37:59.242 INFO:tasks.workunit.client.0.vm00.stdout:0/519: read d3/d22/d3a/f8c [2064163,67784] 0 2026-03-10T12:37:59.244 INFO:tasks.workunit.client.1.vm07.stdout:6/483: symlink d1/d4/d71/l9a 0 2026-03-10T12:37:59.246 INFO:tasks.workunit.client.1.vm07.stdout:3/553: dread dc/dd/d1f/f6d 
[0,4194304] 0
2026-03-10T12:37:59.248 INFO:tasks.workunit.client.1.vm07.stdout:3/554: read dc/dd/d1f/d45/f68 [423276,67962] 0
2026-03-10T12:37:59.248 INFO:tasks.workunit.client.1.vm07.stdout:3/555: chown dc/dd/d1f/d45/f5e 113582296 1
2026-03-10T12:37:59.250 INFO:tasks.workunit.client.0.vm00.stdout:5/599: write d1f/d26/f48 [2087374,1290] 0
2026-03-10T12:37:59.253 INFO:tasks.workunit.client.0.vm00.stdout:5/600: symlink d1f/d26/d2b/d35/d53/d72/d9d/dcb/ld4 0
2026-03-10T12:37:59.255 INFO:tasks.workunit.client.0.vm00.stdout:5/601: mknod d1f/d26/d2b/cd5 0
2026-03-10T12:37:59.259 INFO:tasks.workunit.client.0.vm00.stdout:7/417: dread da/f16 [0,4194304] 0
2026-03-10T12:37:59.261 INFO:tasks.workunit.client.1.vm07.stdout:0/564: write d0/d14/d5f/d76/d2f/d31/f5a [238465,24100] 0
2026-03-10T12:37:59.261 INFO:tasks.workunit.client.1.vm07.stdout:5/534: dwrite d0/d22/f50 [0,4194304] 0
2026-03-10T12:37:59.262 INFO:tasks.workunit.client.1.vm07.stdout:8/501: link d1/d3/d11/l84 d1/la0 0
2026-03-10T12:37:59.263 INFO:tasks.workunit.client.1.vm07.stdout:0/565: chown d0/d14/d5f/d3b 109 1
2026-03-10T12:37:59.263 INFO:tasks.workunit.client.1.vm07.stdout:8/502: stat d1/d3/l4a 0
2026-03-10T12:37:59.263 INFO:tasks.workunit.client.1.vm07.stdout:5/535: stat d0/d22/d18/d3e/d5d/f6d 0
2026-03-10T12:37:59.269 INFO:tasks.workunit.client.1.vm07.stdout:4/628: symlink d0/d4/d5/d34/lde 0
2026-03-10T12:37:59.284 INFO:tasks.workunit.client.1.vm07.stdout:7/490: getdents d0/d47/d48/d8a/d9d 0
2026-03-10T12:37:59.285 INFO:tasks.workunit.client.0.vm00.stdout:7/418: rmdir da/d41/d7b 39
2026-03-10T12:37:59.287 INFO:tasks.workunit.client.1.vm07.stdout:1/497: symlink d9/df/d29/d2b/d31/d91/d59/la3 0
2026-03-10T12:37:59.302 INFO:tasks.workunit.client.0.vm00.stdout:7/419: mknod da/d3f/ca1 0
2026-03-10T12:37:59.305 INFO:tasks.workunit.client.0.vm00.stdout:7/420: dread da/d25/d2e/d4c/f92 [0,4194304] 0
2026-03-10T12:37:59.308 INFO:tasks.workunit.client.1.vm07.stdout:4/629: dread - d0/d4/d10/d8d/fb0 zero size
2026-03-10T12:37:59.309 INFO:tasks.workunit.client.1.vm07.stdout:1/498: creat d9/df/d29/d2b/d31/d91/d59/fa4 x:0 0 0
2026-03-10T12:37:59.310 INFO:tasks.workunit.client.0.vm00.stdout:7/421: symlink da/d41/d48/d81/la2 0
2026-03-10T12:37:59.313 INFO:tasks.workunit.client.1.vm07.stdout:3/556: symlink dc/dd/d1f/d45/dbf/lc3 0
2026-03-10T12:37:59.314 INFO:tasks.workunit.client.1.vm07.stdout:9/562: dwrite d5/d13/d6c/da4/fa6 [0,4194304] 0
2026-03-10T12:37:59.321 INFO:tasks.workunit.client.0.vm00.stdout:4/554: truncate df/d1f/d22/d26/d65/f8e 715040 0
2026-03-10T12:37:59.341 INFO:tasks.workunit.client.1.vm07.stdout:5/536: mkdir d0/d22/d18/d19/d2e/d3f/db8 0
2026-03-10T12:37:59.341 INFO:tasks.workunit.client.1.vm07.stdout:1/499: unlink d9/d2d/d4f/f5e 0
2026-03-10T12:37:59.341 INFO:tasks.workunit.client.0.vm00.stdout:4/555: mkdir df/d93/dbc 0
2026-03-10T12:37:59.341 INFO:tasks.workunit.client.0.vm00.stdout:7/422: write da/d1b/d40/f5c [629875,17146] 0
2026-03-10T12:37:59.341 INFO:tasks.workunit.client.0.vm00.stdout:4/556: rmdir df/d63/d77 39
2026-03-10T12:37:59.341 INFO:tasks.workunit.client.0.vm00.stdout:7/423: mknod da/d1b/ca3 0
2026-03-10T12:37:59.341 INFO:tasks.workunit.client.0.vm00.stdout:7/424: truncate da/f10 13485949 0
2026-03-10T12:37:59.341 INFO:tasks.workunit.client.0.vm00.stdout:6/409: write d2/f5e [1543503,72662] 0
2026-03-10T12:37:59.342 INFO:tasks.workunit.client.1.vm07.stdout:5/537: dread d0/f9 [0,4194304] 0
2026-03-10T12:37:59.342 INFO:tasks.workunit.client.1.vm07.stdout:5/538: stat d0/d22/d18/d19/d2e/da9/fb5 0
2026-03-10T12:37:59.356 INFO:tasks.workunit.client.0.vm00.stdout:6/410: dread d2/d14/f1b [0,4194304] 0
2026-03-10T12:37:59.356 INFO:tasks.workunit.client.0.vm00.stdout:7/425: mknod da/d1b/ca4 0
2026-03-10T12:37:59.356 INFO:tasks.workunit.client.1.vm07.stdout:9/563: unlink d5/d13/d22/l5d 0
2026-03-10T12:37:59.356 INFO:tasks.workunit.client.1.vm07.stdout:9/564: chown d5/d1f 393 1
2026-03-10T12:37:59.356 INFO:tasks.workunit.client.1.vm07.stdout:3/557: dread f2 [0,4194304] 0
2026-03-10T12:37:59.356 INFO:tasks.workunit.client.1.vm07.stdout:8/503: rename d1/d3/d18/f32 to d1/d3/d6/d54/fa1 0
2026-03-10T12:37:59.356 INFO:tasks.workunit.client.1.vm07.stdout:9/565: rmdir d5 39
2026-03-10T12:37:59.357 INFO:tasks.workunit.client.0.vm00.stdout:6/411: dwrite d2/f30 [0,4194304] 0
2026-03-10T12:37:59.359 INFO:tasks.workunit.client.0.vm00.stdout:6/412: stat d2/da/dc/l12 0
2026-03-10T12:37:59.361 INFO:tasks.workunit.client.0.vm00.stdout:2/562: write d4/f7b [73330,78730] 0
2026-03-10T12:37:59.363 INFO:tasks.workunit.client.0.vm00.stdout:2/563: mkdir d4/d53/d68/dc2 0
2026-03-10T12:37:59.363 INFO:tasks.workunit.client.0.vm00.stdout:2/564: stat d4/d53/d9e/f60 0
2026-03-10T12:37:59.363 INFO:tasks.workunit.client.1.vm07.stdout:0/566: getdents d0/d14/d5f/d41/d6a/d9a 0
2026-03-10T12:37:59.365 INFO:tasks.workunit.client.0.vm00.stdout:2/565: mkdir d4/d6/d2d/dc3 0
2026-03-10T12:37:59.367 INFO:tasks.workunit.client.1.vm07.stdout:9/566: dread d5/f91 [0,4194304] 0
2026-03-10T12:37:59.369 INFO:tasks.workunit.client.0.vm00.stdout:2/566: rename d4/d6/d2d/d3a/fa6 to d4/d53/d76/d9b/dad/dbc/fc4 0
2026-03-10T12:37:59.373 INFO:tasks.workunit.client.1.vm07.stdout:9/567: creat d5/d13/d57/d4f/d6a/fc4 x:0 0
2026-03-10T12:37:59.376 INFO:tasks.workunit.client.0.vm00.stdout:6/413: getdents d2/d16/d74 0
2026-03-10T12:37:59.380 INFO:tasks.workunit.client.0.vm00.stdout:6/414: readlink d2/d16/d29/l5f 0
2026-03-10T12:37:59.380 INFO:tasks.workunit.client.0.vm00.stdout:2/567: creat d4/d53/d76/dba/fc5 x:0 0 0
2026-03-10T12:37:59.381 INFO:tasks.workunit.client.0.vm00.stdout:2/568: read d4/f1d [2826778,25693] 0
2026-03-10T12:37:59.381 INFO:tasks.workunit.client.1.vm07.stdout:9/568: symlink d5/d13/d6c/da4/lc5 0
2026-03-10T12:37:59.382 INFO:tasks.workunit.client.0.vm00.stdout:2/569: dread - d4/d53/d76/d9b/dad/dbc/fc4 zero size
2026-03-10T12:37:59.382 INFO:tasks.workunit.client.1.vm07.stdout:3/558: getdents dc/dd 0
2026-03-10T12:37:59.385 INFO:tasks.workunit.client.1.vm07.stdout:2/414: dwrite d0/d42/f1b [0,4194304] 0
2026-03-10T12:37:59.386 INFO:tasks.workunit.client.0.vm00.stdout:8/451: dwrite d0/d5c/f4a [0,4194304] 0
2026-03-10T12:37:59.387 INFO:tasks.workunit.client.1.vm07.stdout:5/539: read d0/d22/d18/f20 [3859884,88562] 0
2026-03-10T12:37:59.388 INFO:tasks.workunit.client.1.vm07.stdout:7/491: sync
2026-03-10T12:37:59.388 INFO:tasks.workunit.client.1.vm07.stdout:0/567: sync
2026-03-10T12:37:59.389 INFO:tasks.workunit.client.1.vm07.stdout:5/540: dread - d0/d22/d18/d19/d2e/d3f/f87 zero size
2026-03-10T12:37:59.389 INFO:tasks.workunit.client.1.vm07.stdout:0/568: dread - d0/d14/d5f/fb3 zero size
2026-03-10T12:37:59.391 INFO:tasks.workunit.client.1.vm07.stdout:0/569: chown d0/d14/d7c/c82 39373 1
2026-03-10T12:37:59.397 INFO:tasks.workunit.client.0.vm00.stdout:2/570: truncate d4/d53/d76/f8b 3736804 0
2026-03-10T12:37:59.433 INFO:tasks.workunit.client.0.vm00.stdout:4/557: dread df/d1f/d36/f92 [0,4194304] 0
2026-03-10T12:37:59.433 INFO:tasks.workunit.client.0.vm00.stdout:8/452: rmdir d0 39
2026-03-10T12:37:59.433 INFO:tasks.workunit.client.0.vm00.stdout:9/613: truncate d0/d3d/d59/d4e/dba/d1e/d2b/f36 6097833 0
2026-03-10T12:37:59.433 INFO:tasks.workunit.client.0.vm00.stdout:4/558: creat df/d1f/d22/d26/d70/fbd x:0 0 0
2026-03-10T12:37:59.433 INFO:tasks.workunit.client.0.vm00.stdout:1/548: truncate da/fc 1722643 0
2026-03-10T12:37:59.433 INFO:tasks.workunit.client.0.vm00.stdout:4/559: dread - df/d32/d76/f7e zero size
2026-03-10T12:37:59.433 INFO:tasks.workunit.client.0.vm00.stdout:3/622: truncate dd/d4e/d5d/f71 1222890 0
2026-03-10T12:37:59.433 INFO:tasks.workunit.client.0.vm00.stdout:3/623: dread - dd/d2a/fbc zero size
2026-03-10T12:37:59.433 INFO:tasks.workunit.client.1.vm07.stdout:3/559: symlink dc/dd/d1f/d6f/lc4 0
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:5/541: symlink d0/d22/d18/d19/d36/d75/d77/lb9 0
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:0/570: creat d0/d83/fb6 x:0 0 0
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:9/569: mknod d5/cc6 0
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:9/570: stat d5/d13/d2c 0
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:2/415: link d0/f46 d0/d29/d64/d74/f8e 0
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:2/416: mknod d0/d5b/c8f 0
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:7/492: rmdir d0/d57/d62/d90/d91 0
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:7/493: chown d0/l24 592054020 1
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:0/571: mknod d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dae/cb7 0
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:0/572: readlink d0/d14/d5f/d76/d2f/l63 0
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:7/494: creat d0/d52/fa4 x:0 0 0
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:7/495: chown d0/l2e 76 1
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:7/496: write d0/d57/d62/f6c [653752,48011] 0
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:0/573: creat d0/d14/d5f/d76/d8e/fb8 x:0 0 0
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:7/497: mknod d0/d57/d62/ca5 0
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:7/498: truncate d0/f9b 2099404 0
2026-03-10T12:37:59.434 INFO:tasks.workunit.client.1.vm07.stdout:7/499: dwrite d0/d52/f98 [0,4194304] 0
2026-03-10T12:37:59.436 INFO:tasks.workunit.client.0.vm00.stdout:9/614: dread d0/dc2/f87 [0,4194304] 0
2026-03-10T12:37:59.437 INFO:tasks.workunit.client.0.vm00.stdout:1/549: mknod da/d12/d91/cba 0
2026-03-10T12:37:59.437 INFO:tasks.workunit.client.0.vm00.stdout:9/615: chown d0/d3d/d59/fad 8 1
2026-03-10T12:37:59.443 INFO:tasks.workunit.client.0.vm00.stdout:9/616: creat d0/d3d/d59/d4e/dba/d1e/d27/fd9 x:0 0 0
2026-03-10T12:37:59.446 INFO:tasks.workunit.client.0.vm00.stdout:1/550: creat da/d21/d39/d77/fbb x:0 0 0
2026-03-10T12:37:59.446 INFO:tasks.workunit.client.0.vm00.stdout:1/551: write da/d24/d28/faa [144164,103888] 0
2026-03-10T12:37:59.447 INFO:tasks.workunit.client.1.vm07.stdout:7/500: fsync d0/f27 0
2026-03-10T12:37:59.447 INFO:tasks.workunit.client.1.vm07.stdout:7/501: chown d0/d57 12 1
2026-03-10T12:37:59.452 INFO:tasks.workunit.client.1.vm07.stdout:7/502: mknod d0/ca6 0
2026-03-10T12:37:59.457 INFO:tasks.workunit.client.0.vm00.stdout:9/617: dread d0/d3d/d59/d4e/dba/f24 [4194304,4194304] 0
2026-03-10T12:37:59.459 INFO:tasks.workunit.client.1.vm07.stdout:2/417: sync
2026-03-10T12:37:59.461 INFO:tasks.workunit.client.1.vm07.stdout:9/571: sync
2026-03-10T12:37:59.461 INFO:tasks.workunit.client.1.vm07.stdout:9/572: read - d5/d16/da3/fb1 zero size
2026-03-10T12:37:59.461 INFO:tasks.workunit.client.0.vm00.stdout:1/552: mknod da/d12/d91/cbc 0
2026-03-10T12:37:59.461 INFO:tasks.workunit.client.0.vm00.stdout:1/553: symlink da/d21/db3/d5d/d80/lbd 0
2026-03-10T12:37:59.462 INFO:tasks.workunit.client.0.vm00.stdout:0/520: truncate d3/d7/d4c/d5b/f37 58915 0
2026-03-10T12:37:59.464 INFO:tasks.workunit.client.1.vm07.stdout:9/573: mknod d5/d16/d18/cc7 0
2026-03-10T12:37:59.464 INFO:tasks.workunit.client.0.vm00.stdout:1/554: dread da/d12/f99 [0,4194304] 0
2026-03-10T12:37:59.464 INFO:tasks.workunit.client.1.vm07.stdout:9/574: stat d5/d13/d9d 0
2026-03-10T12:37:59.465 INFO:tasks.workunit.client.0.vm00.stdout:9/618: rename d0/d3d/d59/d4e/dba/d19/f1b to d0/d7f/db8/dc4/db0/fda 0
2026-03-10T12:37:59.465 INFO:tasks.workunit.client.1.vm07.stdout:7/503: link d0/d52/l55 d0/d52/la7 0
2026-03-10T12:37:59.466 INFO:tasks.workunit.client.0.vm00.stdout:9/619: write d0/d7f/db8/dc4/f67 [556012,26499] 0
2026-03-10T12:37:59.466 INFO:tasks.workunit.client.1.vm07.stdout:3/560: dread dc/dd/f29 [0,4194304] 0
2026-03-10T12:37:59.466 INFO:tasks.workunit.client.0.vm00.stdout:2/571: dread d4/dd/d38/f3f [0,4194304] 0
2026-03-10T12:37:59.467 INFO:tasks.workunit.client.0.vm00.stdout:2/572: chown d4/d53/d76/d9b/dad/f50 2922 1
2026-03-10T12:37:59.473 INFO:tasks.workunit.client.0.vm00.stdout:9/620: mknod d0/d7f/db8/dc4/cdb 0
2026-03-10T12:37:59.475 INFO:tasks.workunit.client.1.vm07.stdout:9/575: dread d5/fb [8388608,4194304] 0
2026-03-10T12:37:59.475 INFO:tasks.workunit.client.1.vm07.stdout:9/576: chown d5/d13/d57/f95 304805434 1
2026-03-10T12:37:59.476 INFO:tasks.workunit.client.0.vm00.stdout:1/555: unlink da/d24/d28/f3c 0
2026-03-10T12:37:59.477 INFO:tasks.workunit.client.0.vm00.stdout:2/573: unlink d4/d53/lb2 0
2026-03-10T12:37:59.478 INFO:tasks.workunit.client.0.vm00.stdout:1/556: mkdir da/d24/d28/d67/da2/d78/dbe 0
2026-03-10T12:37:59.480 INFO:tasks.workunit.client.1.vm07.stdout:7/504: dread - d0/d47/d48/f7a zero size
2026-03-10T12:37:59.480 INFO:tasks.workunit.client.0.vm00.stdout:9/621: link d0/d3d/d59/d4e/dba/f39 d0/fdc 0
2026-03-10T12:37:59.486 INFO:tasks.workunit.client.0.vm00.stdout:0/521: getdents d3/d7/d3c/d74 0
2026-03-10T12:37:59.490 INFO:tasks.workunit.client.0.vm00.stdout:0/522: mkdir d3/d7/d4c/d9d/dae 0
2026-03-10T12:37:59.491 INFO:tasks.workunit.client.1.vm07.stdout:7/505: mkdir d0/d47/d48/d8a/da8 0
2026-03-10T12:37:59.492 INFO:tasks.workunit.client.0.vm00.stdout:6/415: sync
2026-03-10T12:37:59.493 INFO:tasks.workunit.client.0.vm00.stdout:6/416: chown d2/da/dc/d2f 3879592 1
2026-03-10T12:37:59.496 INFO:tasks.workunit.client.0.vm00.stdout:6/417: mkdir d2/d16/d29/d31/d88/d92 0
2026-03-10T12:37:59.497 INFO:tasks.workunit.client.0.vm00.stdout:6/418: symlink d2/d42/d80/d89/l93 0
2026-03-10T12:37:59.499 INFO:tasks.workunit.client.0.vm00.stdout:6/419: fdatasync d2/da/dc/f13 0
2026-03-10T12:37:59.505 INFO:tasks.workunit.client.0.vm00.stdout:0/523: dread d3/d7/d3c/f30 [0,4194304] 0
2026-03-10T12:37:59.505 INFO:tasks.workunit.client.0.vm00.stdout:0/524: chown d3/d7/f31 46488682 1
2026-03-10T12:37:59.505 INFO:tasks.workunit.client.0.vm00.stdout:1/557: sync
2026-03-10T12:37:59.506 INFO:tasks.workunit.client.0.vm00.stdout:8/453: sync
2026-03-10T12:37:59.506 INFO:tasks.workunit.client.1.vm07.stdout:6/484: dwrite d1/d4/d4a/f56 [0,4194304] 0
2026-03-10T12:37:59.507 INFO:tasks.workunit.client.1.vm07.stdout:6/485: readlink d1/l21 0
2026-03-10T12:37:59.508 INFO:tasks.workunit.client.0.vm00.stdout:1/558: dread - da/d24/f76 zero size
2026-03-10T12:37:59.512 INFO:tasks.workunit.client.0.vm00.stdout:5/602: dwrite d1f/d26/d2b/d37/f38 [0,4194304] 0
2026-03-10T12:37:59.516 INFO:tasks.workunit.client.1.vm07.stdout:4/630: dwrite d0/d4/d5/da/fb3 [0,4194304] 0
2026-03-10T12:37:59.518 INFO:tasks.workunit.client.0.vm00.stdout:7/426: dwrite da/fb [4194304,4194304] 0
2026-03-10T12:37:59.519 INFO:tasks.workunit.client.1.vm07.stdout:8/504: dwrite d1/f3d [0,4194304] 0
2026-03-10T12:37:59.522 INFO:tasks.workunit.client.1.vm07.stdout:8/505: fdatasync d1/fc 0
2026-03-10T12:37:59.523 INFO:tasks.workunit.client.1.vm07.stdout:4/631: dread d0/d4/d10/d3c/f68 [0,4194304] 0
2026-03-10T12:37:59.526 INFO:tasks.workunit.client.0.vm00.stdout:1/559: mkdir da/d21/db3/d5d/d72/d7e/dbf 0
2026-03-10T12:37:59.539 INFO:tasks.workunit.client.0.vm00.stdout:1/560: dwrite da/d21/d39/f55 [0,4194304] 0
2026-03-10T12:37:59.542 INFO:tasks.workunit.client.0.vm00.stdout:8/454: symlink d0/l8e 0
2026-03-10T12:37:59.546 INFO:tasks.workunit.client.0.vm00.stdout:1/561: dwrite da/d24/d28/fb1 [0,4194304] 0
2026-03-10T12:37:59.548 INFO:tasks.workunit.client.0.vm00.stdout:1/562: chown da/d21/db3/d59/da6 5258 1
2026-03-10T12:37:59.550 INFO:tasks.workunit.client.0.vm00.stdout:0/525: symlink d3/d7/d3c/laf 0
2026-03-10T12:37:59.554 INFO:tasks.workunit.client.0.vm00.stdout:0/526: write d3/f9c [523424,21366] 0
2026-03-10T12:37:59.555 INFO:tasks.workunit.client.1.vm07.stdout:8/506: chown d1/d3/d40/d92/f94 507639282 1
2026-03-10T12:37:59.573 INFO:tasks.workunit.client.1.vm07.stdout:6/486: mkdir d1/d4/d9b 0
2026-03-10T12:37:59.587 INFO:tasks.workunit.client.1.vm07.stdout:6/487: dread d1/d4/d6/d16/d1a/d2c/f59 [0,4194304] 0
2026-03-10T12:37:59.589 INFO:tasks.workunit.client.1.vm07.stdout:6/488: write d1/d4/d6/f13 [2659372,10967] 0
2026-03-10T12:37:59.590 INFO:tasks.workunit.client.1.vm07.stdout:5/542: write d0/d22/d18/d3e/d53/d9e/f8c [60895,111459] 0
2026-03-10T12:37:59.591 INFO:tasks.workunit.client.1.vm07.stdout:5/543: chown d0/d22/d18/d30 73 1
2026-03-10T12:37:59.592 INFO:tasks.workunit.client.1.vm07.stdout:5/544: chown d0/d22/d18/d19/d2e/da9/fb5 13609 1
2026-03-10T12:37:59.594 INFO:tasks.workunit.client.1.vm07.stdout:6/489: dwrite d1/d4/d6/d16/d1a/d2c/f59 [0,4194304] 0
2026-03-10T12:37:59.598 INFO:tasks.workunit.client.0.vm00.stdout:7/427: dread da/d1b/d40/f5c [0,4194304] 0
2026-03-10T12:37:59.619 INFO:tasks.workunit.client.0.vm00.stdout:7/428: dread - da/d47/f49 zero size
2026-03-10T12:37:59.620 INFO:tasks.workunit.client.1.vm07.stdout:8/507: read - d1/f6b zero size
2026-03-10T12:37:59.623 INFO:tasks.workunit.client.1.vm07.stdout:0/574: dwrite d0/d14/d5f/d3b/f4b [0,4194304] 0
2026-03-10T12:37:59.624 INFO:tasks.workunit.client.1.vm07.stdout:0/575: fdatasync d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/fa4 0
2026-03-10T12:37:59.638 INFO:tasks.workunit.client.0.vm00.stdout:0/527: mkdir d3/d7/db0 0
2026-03-10T12:37:59.644 INFO:tasks.workunit.client.0.vm00.stdout:0/528: creat d3/db/d24/fb1 x:0 0 0
2026-03-10T12:37:59.645 INFO:tasks.workunit.client.0.vm00.stdout:7/429: read da/d25/d2e/f43 [2086379,12157] 0
2026-03-10T12:37:59.646 INFO:tasks.workunit.client.0.vm00.stdout:7/430: stat da/d25/d2c/f4f 0
2026-03-10T12:37:59.650 INFO:tasks.workunit.client.1.vm07.stdout:5/545: mknod d0/d22/d18/d19/d21/d3a/cba 0
2026-03-10T12:37:59.651 INFO:tasks.workunit.client.1.vm07.stdout:2/418: write d0/f73 [643,56246] 0
2026-03-10T12:37:59.651 INFO:tasks.workunit.client.1.vm07.stdout:5/546: readlink d0/d22/d18/d19/d2e/d67/l6b 0
2026-03-10T12:37:59.652 INFO:tasks.workunit.client.0.vm00.stdout:0/529: mknod d3/d22/d3a/cb2 0
2026-03-10T12:37:59.653 INFO:tasks.workunit.client.1.vm07.stdout:6/490: creat d1/d4/d6/d43/d65/f9c x:0 0 0
2026-03-10T12:37:59.655 INFO:tasks.workunit.client.1.vm07.stdout:3/561: write dc/dd/d43/d5c/f9d [423656,67788] 0
2026-03-10T12:37:59.661 INFO:tasks.workunit.client.1.vm07.stdout:9/577: dwrite d5/d16/d18/fa1 [0,4194304] 0
2026-03-10T12:37:59.672 INFO:tasks.workunit.client.1.vm07.stdout:7/506: write d0/f1e [1594766,56411] 0
2026-03-10T12:37:59.672 INFO:tasks.workunit.client.0.vm00.stdout:6/420: truncate d2/f5e 1315145 0
2026-03-10T12:37:59.674 INFO:tasks.workunit.client.0.vm00.stdout:0/530: rmdir d3/db/da4 39
2026-03-10T12:37:59.675 INFO:tasks.workunit.client.1.vm07.stdout:5/547: mknod d0/d22/d18/d19/cbb 0
2026-03-10T12:37:59.677 INFO:tasks.workunit.client.1.vm07.stdout:6/491: mkdir d1/d4/d6/d16/d1a/d9d 0
2026-03-10T12:37:59.684 INFO:tasks.workunit.client.0.vm00.stdout:4/560: write df/d1f/d36/d3a/d41/f2f [1214678,80119] 0
2026-03-10T12:37:59.690 INFO:tasks.workunit.client.1.vm07.stdout:3/562: creat dc/dd/fc5 x:0 0 0
2026-03-10T12:37:59.694 INFO:tasks.workunit.client.1.vm07.stdout:8/508: creat d1/d3/d11/d87/fa2 x:0 0 0
2026-03-10T12:37:59.694 INFO:tasks.workunit.client.0.vm00.stdout:3/624: write dd/d64/d92/f9c [334802,9452] 0
2026-03-10T12:37:59.697 INFO:tasks.workunit.client.0.vm00.stdout:3/625: write dd/d18/f7c [56739,77230] 0
2026-03-10T12:37:59.699 INFO:tasks.workunit.client.1.vm07.stdout:9/578: unlink d5/d13/d57/c3b 0
2026-03-10T12:37:59.707 INFO:tasks.workunit.client.0.vm00.stdout:3/626: rename dd/d4e/d6a/c6d to dd/d3d/d84/ccf 0
2026-03-10T12:37:59.707 INFO:tasks.workunit.client.0.vm00.stdout:3/627: mkdir dd/d18/d13/d99/da5/dd0 0
2026-03-10T12:37:59.707 INFO:tasks.workunit.client.0.vm00.stdout:3/628: rename dd/ce to dd/d18/d13/d99/da5/dd0/cd1 0
2026-03-10T12:37:59.707 INFO:tasks.workunit.client.1.vm07.stdout:7/507: creat d0/d57/d62/fa9 x:0 0 0
2026-03-10T12:37:59.707 INFO:tasks.workunit.client.1.vm07.stdout:9/579: dread d5/d16/d18/fa1 [0,4194304] 0
2026-03-10T12:37:59.707 INFO:tasks.workunit.client.1.vm07.stdout:4/632: rename d0/d4/d5/d78/lda to d0/d4/d5/ldf 0
2026-03-10T12:37:59.707 INFO:tasks.workunit.client.1.vm07.stdout:0/576: creat d0/d14/d5f/d41/d6a/d74/fb9 x:0 0 0
2026-03-10T12:37:59.707 INFO:tasks.workunit.client.1.vm07.stdout:6/492: mknod d1/c9e 0
2026-03-10T12:37:59.708 INFO:tasks.workunit.client.0.vm00.stdout:3/629: symlink dd/d18/d13/d99/da5/ld2 0
2026-03-10T12:37:59.710 INFO:tasks.workunit.client.0.vm00.stdout:6/421: rmdir d2/d51/d7b 0
2026-03-10T12:37:59.715 INFO:tasks.workunit.client.1.vm07.stdout:2/419: rename d0/d45/d54 to d0/d42/d1f/d90 0
2026-03-10T12:37:59.716 INFO:tasks.workunit.client.1.vm07.stdout:4/633: symlink d0/d4/d5/da/d95/le0 0
2026-03-10T12:37:59.716 INFO:tasks.workunit.client.0.vm00.stdout:6/422: chown d2/c5 13285010 1
2026-03-10T12:37:59.716 INFO:tasks.workunit.client.1.vm07.stdout:4/634: dread - d0/d19/f91 zero size
2026-03-10T12:37:59.717 INFO:tasks.workunit.client.1.vm07.stdout:4/635: dread - d0/d4/d5/d8f/fdd zero size
2026-03-10T12:37:59.717 INFO:tasks.workunit.client.1.vm07.stdout:0/577: creat d0/d14/d7c/fba x:0 0 0
2026-03-10T12:37:59.717 INFO:tasks.workunit.client.0.vm00.stdout:2/574: write d4/d6/d2d/d3a/d43/d85/f8f [797729,39134] 0
2026-03-10T12:37:59.719 INFO:tasks.workunit.client.0.vm00.stdout:2/575: fsync d4/d53/d76/d9b/dad/f65 0
2026-03-10T12:37:59.728 INFO:tasks.workunit.client.0.vm00.stdout:2/576: chown d4/dd/l1c 91 1
2026-03-10T12:37:59.729 INFO:tasks.workunit.client.0.vm00.stdout:9/622: rename d0/d7f/db8/fbb to d0/dc2/fdd 0
2026-03-10T12:37:59.729 INFO:tasks.workunit.client.0.vm00.stdout:2/577: stat d4/l8d 0
2026-03-10T12:37:59.729 INFO:tasks.workunit.client.0.vm00.stdout:2/578: dwrite f1 [0,4194304] 0
2026-03-10T12:37:59.729 INFO:tasks.workunit.client.1.vm07.stdout:8/509: symlink d1/d3/d6/d50/d70/la3 0
2026-03-10T12:37:59.735 INFO:tasks.workunit.client.0.vm00.stdout:9/623: truncate d0/d3d/f83 94012 0
2026-03-10T12:37:59.737 INFO:tasks.workunit.client.0.vm00.stdout:2/579: getdents d4/dd/da7 0
2026-03-10T12:37:59.738 INFO:tasks.workunit.client.0.vm00.stdout:2/580: chown d4/d53/d76/d9b/dad/f5e 0 1
2026-03-10T12:37:59.738 INFO:tasks.workunit.client.0.vm00.stdout:9/624: read d0/d7f/db8/dc4/f8b [2508549,21715] 0
2026-03-10T12:37:59.743 INFO:tasks.workunit.client.0.vm00.stdout:9/625: unlink d0/d3d/d59/d4e/dba/d1e/c5e 0
2026-03-10T12:37:59.745 INFO:tasks.workunit.client.0.vm00.stdout:2/581: dread d4/d6/f4e [0,4194304] 0
2026-03-10T12:37:59.746 INFO:tasks.workunit.client.0.vm00.stdout:4/561: dread df/d1f/d22/d26/f56 [0,4194304] 0
2026-03-10T12:37:59.747 INFO:tasks.workunit.client.0.vm00.stdout:4/562: chown df/d1f/d22/d26/d65/d91/db9 18765 1
2026-03-10T12:37:59.750 INFO:tasks.workunit.client.1.vm07.stdout:3/563: sync
2026-03-10T12:37:59.750 INFO:tasks.workunit.client.1.vm07.stdout:1/500: rename d9/df/d29/d2b/d30/l50 to d9/df/d29/d2b/d92/d9d/la5 0
2026-03-10T12:37:59.750 INFO:tasks.workunit.client.1.vm07.stdout:4/636: sync
2026-03-10T12:37:59.751 INFO:tasks.workunit.client.0.vm00.stdout:9/626: rename d0/d3d/f8f to d0/d7f/db8/dc4/fde 0
2026-03-10T12:37:59.753 INFO:tasks.workunit.client.0.vm00.stdout:2/582: mkdir d4/d6/d93/dc6 0
2026-03-10T12:37:59.759 INFO:tasks.workunit.client.0.vm00.stdout:9/627: mknod d0/d7f/d88/cdf 0
2026-03-10T12:37:59.759 INFO:tasks.workunit.client.1.vm07.stdout:0/578: mknod d0/d14/cbb 0
2026-03-10T12:37:59.759 INFO:tasks.workunit.client.0.vm00.stdout:9/628: chown d0/fdc 0 1
2026-03-10T12:37:59.760 INFO:tasks.workunit.client.0.vm00.stdout:9/629: write d0/d3d/d59/d4e/dba/fd5 [769048,104072] 0
2026-03-10T12:37:59.763 INFO:tasks.workunit.client.0.vm00.stdout:4/563: fdatasync f9 0
2026-03-10T12:37:59.764 INFO:tasks.workunit.client.1.vm07.stdout:1/501: dread d9/df/d29/f49 [0,4194304] 0
2026-03-10T12:37:59.766 INFO:tasks.workunit.client.0.vm00.stdout:2/583: mknod d4/d6/d93/dc6/cc7 0
2026-03-10T12:37:59.766 INFO:tasks.workunit.client.1.vm07.stdout:2/420: dwrite d0/d42/f53 [0,4194304] 0
2026-03-10T12:37:59.766 INFO:tasks.workunit.client.0.vm00.stdout:2/584: dread - d4/d53/d76/fac zero size
2026-03-10T12:37:59.767 INFO:tasks.workunit.client.0.vm00.stdout:2/585: write d4/d53/d9e/f60 [200070,89683] 0
2026-03-10T12:37:59.770 INFO:tasks.workunit.client.0.vm00.stdout:9/630: fdatasync d0/d3d/d59/d4e/dba/f39 0
2026-03-10T12:37:59.777 INFO:tasks.workunit.client.0.vm00.stdout:2/586: rename d4/d6/lb8 to d4/d6/d2d/dc3/lc8 0
2026-03-10T12:37:59.777 INFO:tasks.workunit.client.1.vm07.stdout:3/564: symlink dc/dd/d43/d76/d95/da0/lc6 0
2026-03-10T12:37:59.777 INFO:tasks.workunit.client.1.vm07.stdout:3/565: chown dc/d18/d24/f49 38105 1
2026-03-10T12:37:59.778 INFO:tasks.workunit.client.0.vm00.stdout:6/423: write d2/d16/d74/f62 [1406419,102816] 0
2026-03-10T12:37:59.787 INFO:tasks.workunit.client.0.vm00.stdout:5/603: dwrite d1f/d26/d2b/d35/d78/d7f/fb9 [0,4194304] 0
2026-03-10T12:37:59.789 INFO:tasks.workunit.client.0.vm00.stdout:6/424: dread d2/d16/f23 [0,4194304] 0
2026-03-10T12:37:59.793 INFO:tasks.workunit.client.0.vm00.stdout:8/455: truncate d0/d5c/f42 706958 0
2026-03-10T12:37:59.795 INFO:tasks.workunit.client.1.vm07.stdout:4/637: dwrite d0/d4/d7a/d46/f85 [0,4194304] 0
2026-03-10T12:37:59.796 INFO:tasks.workunit.client.0.vm00.stdout:1/563: dwrite da/d24/d5a/f75 [0,4194304] 0
2026-03-10T12:37:59.797 INFO:tasks.workunit.client.0.vm00.stdout:2/587: symlink d4/d6/d2d/d31/lc9 0
2026-03-10T12:37:59.802 INFO:tasks.workunit.client.0.vm00.stdout:5/604: dread d1f/d26/d2b/f44 [0,4194304] 0
2026-03-10T12:37:59.802 INFO:tasks.workunit.client.0.vm00.stdout:9/631: link d0/d3d/d59/d4e/dba/d1e/d85/d98/fd0 d0/d3d/d59/d4e/dba/d19/d50/fe0 0
2026-03-10T12:37:59.806 INFO:tasks.workunit.client.1.vm07.stdout:8/510: creat d1/d3/d6/d7b/fa4 x:0 0 0
2026-03-10T12:37:59.806 INFO:tasks.workunit.client.1.vm07.stdout:8/511: chown d1/d3/f73 377 1
2026-03-10T12:37:59.810 INFO:tasks.workunit.client.0.vm00.stdout:1/564: mkdir da/d21/d39/d77/dc0 0
2026-03-10T12:37:59.811 INFO:tasks.workunit.client.0.vm00.stdout:1/565: truncate da/d21/db3/fad 316631 0
2026-03-10T12:37:59.811 INFO:tasks.workunit.client.0.vm00.stdout:1/566: chown da/d24/d28/d67/da2/d78/dbe 403370 1
2026-03-10T12:37:59.812 INFO:tasks.workunit.client.0.vm00.stdout:1/567: chown da/d21/d39/f8c 888393 1
2026-03-10T12:37:59.814 INFO:tasks.workunit.client.0.vm00.stdout:5/605: write d1f/d26/d2b/d35/d53/d72/d9d/d8e/fc1 [141155,101733] 0
2026-03-10T12:37:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:37:59 vm07.local ceph-mon[58582]: pgmap v165: 65 pgs: 65 active+clean; 2.2 GiB data, 7.8 GiB used, 112 GiB / 120 GiB avail; 44 MiB/s rd, 117 MiB/s wr, 257 op/s
2026-03-10T12:37:59.818 INFO:tasks.workunit.client.0.vm00.stdout:9/632: creat d0/d7f/db8/dc4/db0/fe1 x:0 0 0
2026-03-10T12:37:59.825 INFO:tasks.workunit.client.0.vm00.stdout:9/633: read d0/d7f/db8/dc4/db0/fda [3405813,36882] 0
2026-03-10T12:37:59.826 INFO:tasks.workunit.client.0.vm00.stdout:9/634: read d0/dc2/f87 [833507,89005] 0
2026-03-10T12:37:59.826 INFO:tasks.workunit.client.0.vm00.stdout:5/606: dread f12 [0,4194304] 0
2026-03-10T12:37:59.829 INFO:tasks.workunit.client.0.vm00.stdout:9/635: mknod d0/d3d/d59/d4e/ce2 0
2026-03-10T12:37:59.831 INFO:tasks.workunit.client.0.vm00.stdout:5/607: mkdir d1f/d26/d2b/d35/d53/dd6 0
2026-03-10T12:37:59.835 INFO:tasks.workunit.client.0.vm00.stdout:5/608: dwrite d1f/d26/f9f [0,4194304] 0
2026-03-10T12:37:59.837 INFO:tasks.workunit.client.0.vm00.stdout:1/568: rmdir da/d21/d93 0
2026-03-10T12:37:59.841 INFO:tasks.workunit.client.0.vm00.stdout:7/431: dwrite da/f17 [0,4194304] 0
2026-03-10T12:37:59.844 INFO:tasks.workunit.client.0.vm00.stdout:9/636: chown d0/f21 45830841 1
2026-03-10T12:37:59.850 INFO:tasks.workunit.client.0.vm00.stdout:0/531: dwrite d3/d7/d3c/f30 [0,4194304] 0
2026-03-10T12:37:59.853 INFO:tasks.workunit.client.0.vm00.stdout:0/532: dwrite d3/d40/f7a [0,4194304] 0
2026-03-10T12:37:59.865 INFO:tasks.workunit.client.0.vm00.stdout:1/569: sync
2026-03-10T12:37:59.866 INFO:tasks.workunit.client.0.vm00.stdout:4/564: dread df/f3d [0,4194304] 0
2026-03-10T12:37:59.868 INFO:tasks.workunit.client.1.vm07.stdout:2/421: symlink d0/d42/d26/d38/l91 0
2026-03-10T12:37:59.870 INFO:tasks.workunit.client.0.vm00.stdout:7/432: dwrite da/d1b/d40/f5c [0,4194304] 0
2026-03-10T12:37:59.882 INFO:tasks.workunit.client.1.vm07.stdout:1/502: truncate d9/d2d/d4f/d5a/f6e 4015047 0
2026-03-10T12:37:59.882 INFO:tasks.workunit.client.0.vm00.stdout:5/609: rename d1f/d26/d2b/d37/f77 to d1f/d26/d2b/d35/d53/dd6/fd7 0
2026-03-10T12:37:59.882 INFO:tasks.workunit.client.0.vm00.stdout:0/533: truncate d3/d7/f10 1873644 0
2026-03-10T12:37:59.890 INFO:tasks.workunit.client.0.vm00.stdout:1/570: dread da/d21/d39/f89 [0,4194304] 0
2026-03-10T12:37:59.891 INFO:tasks.workunit.client.0.vm00.stdout:0/534: sync
2026-03-10T12:37:59.902 INFO:tasks.workunit.client.0.vm00.stdout:4/565: rmdir df/d1f/d22/d26/d65/d91 39
2026-03-10T12:37:59.902 INFO:tasks.workunit.client.0.vm00.stdout:7/433: write da/d25/d2c/f98 [42563,92830] 0
2026-03-10T12:37:59.902 INFO:tasks.workunit.client.0.vm00.stdout:1/571: dread da/d21/d39/f55 [0,4194304] 0
2026-03-10T12:37:59.902 INFO:tasks.workunit.client.0.vm00.stdout:5/610: symlink d1f/d26/d2e/d58/d6b/ld8 0
2026-03-10T12:37:59.908 INFO:tasks.workunit.client.0.vm00.stdout:9/637: rename d0/d3d/d43/ld7 to d0/d7f/db8/le3 0
2026-03-10T12:37:59.913 INFO:tasks.workunit.client.0.vm00.stdout:1/572: symlink da/d12/db4/lc1 0
2026-03-10T12:37:59.915 INFO:tasks.workunit.client.0.vm00.stdout:1/573: dread da/d21/d39/f89 [0,4194304] 0
2026-03-10T12:37:59.922 INFO:tasks.workunit.client.0.vm00.stdout:5/611: link d1f/d6a/d94/dc9/fae d1f/d6a/d94/dc3/fd9 0
2026-03-10T12:37:59.926 INFO:tasks.workunit.client.0.vm00.stdout:5/612: symlink d1f/d6a/d94/dc9/lda 0
2026-03-10T12:37:59.927 INFO:tasks.workunit.client.0.vm00.stdout:5/613: chown d1f/d39/l47 143 1
2026-03-10T12:37:59.930 INFO:tasks.workunit.client.0.vm00.stdout:5/614: dwrite d1f/d26/d2b/d35/f42 [0,4194304] 0
2026-03-10T12:37:59.931 INFO:tasks.workunit.client.0.vm00.stdout:5/615: read - d1f/d96/dbd/fc5 zero size
2026-03-10T12:37:59.936 INFO:tasks.workunit.client.0.vm00.stdout:4/566: getdents df/d1f/d22 0
2026-03-10T12:37:59.939 INFO:tasks.workunit.client.0.vm00.stdout:3/630: write dd/d27/d2c/f89 [1464867,129128] 0
2026-03-10T12:37:59.946 INFO:tasks.workunit.client.1.vm07.stdout:1/503: read d9/df/d54/f57 [615496,38511] 0
2026-03-10T12:37:59.951 INFO:tasks.workunit.client.0.vm00.stdout:5/616: fdatasync d1f/d26/d2e/f3c 0
2026-03-10T12:37:59.954 INFO:tasks.workunit.client.0.vm00.stdout:5/617: dread d1f/f46 [0,4194304] 0
2026-03-10T12:37:59.955 INFO:tasks.workunit.client.0.vm00.stdout:5/618: chown d1f/d26/d2e/d58/c76 3591 1
2026-03-10T12:37:59.961 INFO:tasks.workunit.client.0.vm00.stdout:3/631: mkdir dd/d18/d13/d1d/d43/d55/dd3 0
2026-03-10T12:37:59.971 INFO:tasks.workunit.client.0.vm00.stdout:0/535: rename d3/d7/d4c/d9d/dae to d3/d7/d4c/d5b/d38/db3 0
2026-03-10T12:37:59.971 INFO:tasks.workunit.client.0.vm00.stdout:2/588: dwrite d4/d6/d2d/d31/f79 [0,4194304] 0
2026-03-10T12:37:59.971 INFO:tasks.workunit.client.0.vm00.stdout:1/574: getdents da/d24/d28/d67 0
2026-03-10T12:37:59.972 INFO:tasks.workunit.client.0.vm00.stdout:5/619: creat d1f/d26/d2b/d35/d78/d99/daf/fdb x:0 0 0
2026-03-10T12:37:59.975 INFO:tasks.workunit.client.1.vm07.stdout:7/508: write d0/d47/d48/f53 [675244,56869] 0
2026-03-10T12:37:59.976 INFO:tasks.workunit.client.0.vm00.stdout:5/620: mknod d1f/d26/d2b/d35/d78/d99/daf/cdc 0
2026-03-10T12:37:59.977 INFO:tasks.workunit.client.0.vm00.stdout:1/575: symlink da/d21/d27/d6a/d94/db9/lc2 0
2026-03-10T12:37:59.979 INFO:tasks.workunit.client.0.vm00.stdout:5/621: mknod d1f/d26/d2b/d35/d78/d99/dcd/cdd 0
2026-03-10T12:37:59.979 INFO:tasks.workunit.client.0.vm00.stdout:1/576: mkdir da/d21/d39/d77/dc0/dc3 0
2026-03-10T12:37:59.980 INFO:tasks.workunit.client.0.vm00.stdout:1/577: truncate da/d21/db3/fad 1269301 0
2026-03-10T12:37:59.980 INFO:tasks.workunit.client.0.vm00.stdout:5/622: creat d1f/d26/d2b/d37/da4/fde x:0 0 0
2026-03-10T12:37:59.983 INFO:tasks.workunit.client.0.vm00.stdout:5/623: creat d1f/d96/fdf x:0 0 0
2026-03-10T12:37:59.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:37:59 vm00.local ceph-mon[50686]: pgmap v165: 65 pgs: 65 active+clean; 2.2 GiB data, 7.8 GiB used, 112 GiB / 120 GiB avail; 44 MiB/s rd, 117 MiB/s wr, 257 op/s
2026-03-10T12:37:59.987 INFO:tasks.workunit.client.0.vm00.stdout:5/624: mkdir d1f/d26/d2e/d58/d6b/de0 0
2026-03-10T12:37:59.987 INFO:tasks.workunit.client.0.vm00.stdout:1/578: dwrite da/d21/d27/f6e [0,4194304] 0
2026-03-10T12:37:59.989 INFO:tasks.workunit.client.0.vm00.stdout:1/579: read da/d21/d39/f8c [1968359,39500] 0
2026-03-10T12:37:59.991 INFO:tasks.workunit.client.0.vm00.stdout:1/580: symlink da/d21/db3/d59/da6/d8b/lc4 0
2026-03-10T12:37:59.992 INFO:tasks.workunit.client.0.vm00.stdout:1/581: fdatasync da/d24/d73/fb6 0
2026-03-10T12:38:00.001 INFO:tasks.workunit.client.0.vm00.stdout:0/536: mknod d3/d7/d3c/d74/cb4 0
2026-03-10T12:38:00.001 INFO:tasks.workunit.client.0.vm00.stdout:7/434: getdents da/d26/d50/d73/d89 0
2026-03-10T12:38:00.002 INFO:tasks.workunit.client.0.vm00.stdout:0/537: symlink d3/db/d77/d82/lb5 0
2026-03-10T12:38:00.009 INFO:tasks.workunit.client.1.vm07.stdout:2/422: fsync d0/d42/d1f/f2f 0
2026-03-10T12:38:00.009 INFO:tasks.workunit.client.1.vm07.stdout:9/580: rename d5/d1f/d7d/f7f to d5/d16/d23/fc8 0
2026-03-10T12:38:00.021 INFO:tasks.workunit.client.1.vm07.stdout:5/548: write d0/d22/d18/d19/d21/f42 [7654119,674] 0
2026-03-10T12:38:00.021 INFO:tasks.workunit.client.1.vm07.stdout:9/581: creat d5/d13/d57/fc9 x:0 0 0
2026-03-10T12:38:00.028 INFO:tasks.workunit.client.1.vm07.stdout:6/493: creat d1/d4/d6/d16/d1a/f9f x:0 0 0
2026-03-10T12:38:00.032 INFO:tasks.workunit.client.1.vm07.stdout:4/638: getdents d0/d4/d10/d3c/d2b/d2d/d9c 0
2026-03-10T12:38:00.033 INFO:tasks.workunit.client.1.vm07.stdout:1/504: rename d9/df/d29/d2b/f7c to d9/df/d54/fa6 0
2026-03-10T12:38:00.037 INFO:tasks.workunit.client.0.vm00.stdout:0/538: dread d3/d7/d4c/d5b/f37 [0,4194304] 0
2026-03-10T12:38:00.038 INFO:tasks.workunit.client.0.vm00.stdout:0/539: write d3/d7/d3c/f30 [4474806,108062] 0
2026-03-10T12:38:00.039 INFO:tasks.workunit.client.1.vm07.stdout:6/494: read d1/d4/d6/d16/f50 [504788,64756] 0
2026-03-10T12:38:00.041 INFO:tasks.workunit.client.1.vm07.stdout:1/505: dwrite d9/df/d29/d2b/d31/d91/d59/fa2 [0,4194304] 0
2026-03-10T12:38:00.045 INFO:tasks.workunit.client.1.vm07.stdout:5/549: rename d0/d22/d18/d80 to d0/d22/dbc 0
2026-03-10T12:38:00.046 INFO:tasks.workunit.client.1.vm07.stdout:2/423: link d0/d29/l34 d0/d29/d64/d74/d88/l92 0
2026-03-10T12:38:00.052 INFO:tasks.workunit.client.1.vm07.stdout:5/550: creat d0/d22/d18/d19/d21/fbd x:0 0 0
2026-03-10T12:38:00.052 INFO:tasks.workunit.client.1.vm07.stdout:5/551: chown d0/d22/d18/f4c 3672 1
2026-03-10T12:38:00.053 INFO:tasks.workunit.client.1.vm07.stdout:4/639: sync
2026-03-10T12:38:00.054 INFO:tasks.workunit.client.1.vm07.stdout:3/566: dread dc/d18/f36 [0,4194304] 0
2026-03-10T12:38:00.057 INFO:tasks.workunit.client.0.vm00.stdout:8/456: write d0/d12/d36/d5b/f65 [4146726,101828] 0
2026-03-10T12:38:00.059 INFO:tasks.workunit.client.1.vm07.stdout:9/582: rename d5/c17 to d5/d16/da3/cca 0
2026-03-10T12:38:00.070 INFO:tasks.workunit.client.0.vm00.stdout:8/457: write d0/d12/d36/d5b/f65 [3910431,106180] 0
2026-03-10T12:38:00.070 INFO:tasks.workunit.client.0.vm00.stdout:8/458: creat d0/d12/f8f x:0 0 0
2026-03-10T12:38:00.071 INFO:tasks.workunit.client.0.vm00.stdout:8/459: symlink d0/d58/d68/l90 0
2026-03-10T12:38:00.071 INFO:tasks.workunit.client.1.vm07.stdout:2/424: mkdir d0/d80/d93 0
2026-03-10T12:38:00.071 INFO:tasks.workunit.client.1.vm07.stdout:3/567: rmdir dc/dd/d43 39
2026-03-10T12:38:00.071 INFO:tasks.workunit.client.1.vm07.stdout:6/495: rename d1/d4/d6/lf to d1/d4/d71/la0 0
2026-03-10T12:38:00.071 INFO:tasks.workunit.client.1.vm07.stdout:6/496: read - d1/d4/d6/d16/d1a/d33/f92 zero size
2026-03-10T12:38:00.071 INFO:tasks.workunit.client.1.vm07.stdout:6/497: stat d1/d4/d6/d16/d1a/c87 0
2026-03-10T12:38:00.072 INFO:tasks.workunit.client.1.vm07.stdout:6/498: chown d1/d4/d6/d46/d4d/c81 1 1
2026-03-10T12:38:00.075 INFO:tasks.workunit.client.0.vm00.stdout:9/638: dwrite d0/d3d/d43/d53/fa5 [0,4194304] 0
2026-03-10T12:38:00.078 INFO:tasks.workunit.client.1.vm07.stdout:9/583: creat d5/d13/d6c/da4/fcb x:0 0 0
2026-03-10T12:38:00.080 INFO:tasks.workunit.client.0.vm00.stdout:4/567: write df/d1f/d36/f51 [2411856,87899] 0
2026-03-10T12:38:00.081 INFO:tasks.workunit.client.1.vm07.stdout:9/584: read d5/d1f/d31/f56 [342381,45802] 0
2026-03-10T12:38:00.081 INFO:tasks.workunit.client.0.vm00.stdout:4/568: dread - df/f4e zero size
2026-03-10T12:38:00.085 INFO:tasks.workunit.client.1.vm07.stdout:1/506: dread d9/df/f26 [0,4194304] 0
2026-03-10T12:38:00.093 INFO:tasks.workunit.client.0.vm00.stdout:2/589: dwrite d4/d6/f30 [4194304,4194304] 0
2026-03-10T12:38:00.093 INFO:tasks.workunit.client.0.vm00.stdout:3/632: dwrite dd/d27/d2c/d34/d38/f63 [0,4194304] 0
2026-03-10T12:38:00.093 INFO:tasks.workunit.client.1.vm07.stdout:2/425: mkdir d0/d29/d64/d6c/d94 0
2026-03-10T12:38:00.093 INFO:tasks.workunit.client.1.vm07.stdout:3/568: mkdir dc/dd/d1f/dc7 0
2026-03-10T12:38:00.103 INFO:tasks.workunit.client.0.vm00.stdout:9/639: symlink d0/d7f/le4 0
2026-03-10T12:38:00.107 INFO:tasks.workunit.client.0.vm00.stdout:5/625: rmdir d1f/d26/d2e/d58 39
2026-03-10T12:38:00.109 INFO:tasks.workunit.client.0.vm00.stdout:4/569: rename df/d8a/ca8 to df/d1f/d22/d26/d70/cbe 0
2026-03-10T12:38:00.109 INFO:tasks.workunit.client.1.vm07.stdout:9/585: creat d5/d1f/d7d/fcc x:0 0 0
2026-03-10T12:38:00.109 INFO:tasks.workunit.client.0.vm00.stdout:4/570: chown df/d32/d76/c9f 184162 1
2026-03-10T12:38:00.112 INFO:tasks.workunit.client.0.vm00.stdout:9/640: mkdir d0/d3d/d59/d4e/dba/d1e/d85/de5 0
2026-03-10T12:38:00.113 INFO:tasks.workunit.client.0.vm00.stdout:9/641: fdatasync d0/d7f/db8/dc4/fca 0
2026-03-10T12:38:00.113 INFO:tasks.workunit.client.0.vm00.stdout:8/460: sync
2026-03-10T12:38:00.114 INFO:tasks.workunit.client.1.vm07.stdout:2/426: readlink d0/d29/d64/d74/d88/l92 0
2026-03-10T12:38:00.116 INFO:tasks.workunit.client.0.vm00.stdout:2/590: rename d4/dd/d38 to d4/d6/dca 0
2026-03-10T12:38:00.116 INFO:tasks.workunit.client.1.vm07.stdout:2/427: dread d0/d42/f53 [0,4194304] 0
2026-03-10T12:38:00.119 INFO:tasks.workunit.client.0.vm00.stdout:5/626: mknod d1f/d6a/d94/dc3/ce1 0
2026-03-10T12:38:00.122 INFO:tasks.workunit.client.0.vm00.stdout:3/633: rename dd/d3d/f3e to dd/d18/d13/d99/da5/fd4 0
2026-03-10T12:38:00.125 INFO:tasks.workunit.client.0.vm00.stdout:4/571: symlink df/lbf 0
2026-03-10T12:38:00.125 INFO:tasks.workunit.client.1.vm07.stdout:1/507: dread d9/f19 [0,4194304] 0
2026-03-10T12:38:00.126 INFO:tasks.workunit.client.1.vm07.stdout:1/508: write d9/df/f97 [393529,15407] 0
2026-03-10T12:38:00.126 INFO:tasks.workunit.client.1.vm07.stdout:1/509: dread - d9/d2d/d80/d8e/fa0 zero size
2026-03-10T12:38:00.129 INFO:tasks.workunit.client.0.vm00.stdout:4/572: chown df/d32/d76/c9b 872993 1
2026-03-10T12:38:00.129 INFO:tasks.workunit.client.1.vm07.stdout:9/586: unlink d5/d16/f19 0
2026-03-10T12:38:00.130 INFO:tasks.workunit.client.1.vm07.stdout:9/587: stat d5/d69/d93/d97/fc3 0
2026-03-10T12:38:00.130 INFO:tasks.workunit.client.0.vm00.stdout:4/573: creat df/d93/fc0 x:0 0 0
2026-03-10T12:38:00.132 INFO:tasks.workunit.client.0.vm00.stdout:9/642: getdents d0/d3d/d59 0
2026-03-10T12:38:00.144 INFO:tasks.workunit.client.1.vm07.stdout:2/428: mknod d0/d42/d26/d38/d4f/d5d/c95 0
2026-03-10T12:38:00.145 INFO:tasks.workunit.client.1.vm07.stdout:2/429: truncate d0/d42/d4e/d77/d70/f8a 225323 0
2026-03-10T12:38:00.148 INFO:tasks.workunit.client.1.vm07.stdout:5/552: rename d0/d22/d18/d19/cbb to d0/cbe 0
2026-03-10T12:38:00.149 INFO:tasks.workunit.client.0.vm00.stdout:2/591: dread d4/dd/f10 [4194304,4194304] 0
2026-03-10T12:38:00.150 INFO:tasks.workunit.client.0.vm00.stdout:8/461: sync
2026-03-10T12:38:00.151 INFO:tasks.workunit.client.0.vm00.stdout:5/627: sync
2026-03-10T12:38:00.151 INFO:tasks.workunit.client.0.vm00.stdout:5/628: readlink d1f/d26/l92 0
2026-03-10T12:38:00.151 INFO:tasks.workunit.client.0.vm00.stdout:5/629: fsync d1f/d26/d2b/d35/f68 0
2026-03-10T12:38:00.152 INFO:tasks.workunit.client.0.vm00.stdout:2/592: creat d4/d6/d2d/d3a/fcb x:0 0 0
2026-03-10T12:38:00.153 INFO:tasks.workunit.client.1.vm07.stdout:2/430: truncate d0/d42/d26/d38/f3a 4297326 0
2026-03-10T12:38:00.153 INFO:tasks.workunit.client.0.vm00.stdout:2/593: chown d4/d6/f34 177 1
2026-03-10T12:38:00.153 INFO:tasks.workunit.client.1.vm07.stdout:2/431: readlink d0/d42/d26/d38/l91 0
2026-03-10T12:38:00.153 INFO:tasks.workunit.client.0.vm00.stdout:2/594: readlink d4/d53/lab 0
2026-03-10T12:38:00.154 INFO:tasks.workunit.client.1.vm07.stdout:2/432: chown d0/d42/d1f/d20/f39 187767493 1
2026-03-10T12:38:00.154 INFO:tasks.workunit.client.0.vm00.stdout:8/462: unlink d0/f56 0
2026-03-10T12:38:00.160 INFO:tasks.workunit.client.1.vm07.stdout:5/553: rmdir d0/d22/d18/d3e 39
2026-03-10T12:38:00.163 INFO:tasks.workunit.client.1.vm07.stdout:2/433: fdatasync d0/d42/d26/f48 0
2026-03-10T12:38:00.166 INFO:tasks.workunit.client.0.vm00.stdout:1/582: getdents da/d21/db3/d59/da6/d8b 0
2026-03-10T12:38:00.174 INFO:tasks.workunit.client.1.vm07.stdout:0/579: write d0/d14/d5f/d41/f55 [3434967,113856] 0
2026-03-10T12:38:00.175 INFO:tasks.workunit.client.0.vm00.stdout:6/425: dwrite d2/da/dc/f27 [0,4194304] 0
2026-03-10T12:38:00.178 INFO:tasks.workunit.client.0.vm00.stdout:1/583: dread da/f22 [0,4194304] 0
2026-03-10T12:38:00.182 INFO:tasks.workunit.client.1.vm07.stdout:8/512: dwrite d1/f68 [0,4194304] 0
2026-03-10T12:38:00.183 INFO:tasks.workunit.client.0.vm00.stdout:9/643: write d0/d7f/db8/fc6 [560519,58094] 0
2026-03-10T12:38:00.186 INFO:tasks.workunit.client.0.vm00.stdout:3/634: dwrite dd/d27/f72 [0,4194304] 0
2026-03-10T12:38:00.187
INFO:tasks.workunit.client.0.vm00.stdout:4/574: dwrite df/d1f/d36/d3a/d41/f33 [0,4194304] 0 2026-03-10T12:38:00.190 INFO:tasks.workunit.client.0.vm00.stdout:2/595: write d4/f1d [4091822,19777] 0 2026-03-10T12:38:00.193 INFO:tasks.workunit.client.0.vm00.stdout:5/630: dwrite d1f/f22 [4194304,4194304] 0 2026-03-10T12:38:00.194 INFO:tasks.workunit.client.0.vm00.stdout:5/631: chown d1f/d26/d2b/d35/d78/d99 58 1 2026-03-10T12:38:00.194 INFO:tasks.workunit.client.0.vm00.stdout:5/632: dread f19 [0,4194304] 0 2026-03-10T12:38:00.195 INFO:tasks.workunit.client.0.vm00.stdout:5/633: truncate d1f/d96/dbd/fc5 503737 0 2026-03-10T12:38:00.203 INFO:tasks.workunit.client.0.vm00.stdout:8/463: dread d0/d12/d2d/f44 [0,4194304] 0 2026-03-10T12:38:00.205 INFO:tasks.workunit.client.0.vm00.stdout:1/584: chown da/c97 7789398 1 2026-03-10T12:38:00.205 INFO:tasks.workunit.client.1.vm07.stdout:9/588: rename d5/d13/d57/f73 to d5/fcd 0 2026-03-10T12:38:00.205 INFO:tasks.workunit.client.0.vm00.stdout:9/644: mkdir d0/d7f/db8/dc4/de6 0 2026-03-10T12:38:00.210 INFO:tasks.workunit.client.1.vm07.stdout:5/554: mkdir d0/dbf 0 2026-03-10T12:38:00.220 INFO:tasks.workunit.client.0.vm00.stdout:8/464: fsync d0/f9 0 2026-03-10T12:38:00.220 INFO:tasks.workunit.client.0.vm00.stdout:2/596: fdatasync d4/d6/d2d/d3a/f44 0 2026-03-10T12:38:00.224 INFO:tasks.workunit.client.1.vm07.stdout:9/589: mkdir d5/d1f/d31/dce 0 2026-03-10T12:38:00.229 INFO:tasks.workunit.client.0.vm00.stdout:1/585: dread da/f13 [0,4194304] 0 2026-03-10T12:38:00.229 INFO:tasks.workunit.client.0.vm00.stdout:1/586: write da/d24/d73/fb6 [838570,107047] 0 2026-03-10T12:38:00.229 INFO:tasks.workunit.client.0.vm00.stdout:6/426: rmdir d2/d39 39 2026-03-10T12:38:00.231 INFO:tasks.workunit.client.0.vm00.stdout:1/587: dwrite da/d21/d27/f6e [4194304,4194304] 0 2026-03-10T12:38:00.234 INFO:tasks.workunit.client.1.vm07.stdout:8/513: symlink d1/d3/la5 0 2026-03-10T12:38:00.238 INFO:tasks.workunit.client.0.vm00.stdout:5/634: creat d1f/d26/d2b/fe2 x:0 0 0 
2026-03-10T12:38:00.239 INFO:tasks.workunit.client.0.vm00.stdout:5/635: write d1f/d26/d2b/d37/f38 [5027458,58255] 0 2026-03-10T12:38:00.239 INFO:tasks.workunit.client.0.vm00.stdout:5/636: dread - d1f/d26/d6f/f9b zero size 2026-03-10T12:38:00.240 INFO:tasks.workunit.client.1.vm07.stdout:8/514: chown d1/d3/d40/l8d 596 1 2026-03-10T12:38:00.241 INFO:tasks.workunit.client.1.vm07.stdout:9/590: dwrite d5/d16/d23/fb2 [0,4194304] 0 2026-03-10T12:38:00.243 INFO:tasks.workunit.client.0.vm00.stdout:8/465: creat d0/d46/d89/f91 x:0 0 0 2026-03-10T12:38:00.246 INFO:tasks.workunit.client.0.vm00.stdout:8/466: truncate d0/d12/f23 410282 0 2026-03-10T12:38:00.247 INFO:tasks.workunit.client.0.vm00.stdout:7/435: truncate da/d26/f97 1188623 0 2026-03-10T12:38:00.250 INFO:tasks.workunit.client.0.vm00.stdout:8/467: link d0/d12/d2d/d49/f6c d0/d46/f92 0 2026-03-10T12:38:00.250 INFO:tasks.workunit.client.0.vm00.stdout:8/468: fdatasync d0/dd/f4d 0 2026-03-10T12:38:00.250 INFO:tasks.workunit.client.0.vm00.stdout:8/469: readlink d0/l8e 0 2026-03-10T12:38:00.257 INFO:tasks.workunit.client.0.vm00.stdout:1/588: sync 2026-03-10T12:38:00.259 INFO:tasks.workunit.client.0.vm00.stdout:1/589: creat da/d12/fc5 x:0 0 0 2026-03-10T12:38:00.270 INFO:tasks.workunit.client.1.vm07.stdout:9/591: dread - d5/d13/d22/f9e zero size 2026-03-10T12:38:00.270 INFO:tasks.workunit.client.1.vm07.stdout:7/509: dwrite d0/d47/d48/d8a/d9d/fa3 [4194304,4194304] 0 2026-03-10T12:38:00.270 INFO:tasks.workunit.client.0.vm00.stdout:1/590: mkdir da/d24/d73/dc6 0 2026-03-10T12:38:00.270 INFO:tasks.workunit.client.0.vm00.stdout:5/637: dread d1f/d26/d2b/d35/fad [0,4194304] 0 2026-03-10T12:38:00.270 INFO:tasks.workunit.client.0.vm00.stdout:1/591: dwrite da/d21/db3/fad [0,4194304] 0 2026-03-10T12:38:00.270 INFO:tasks.workunit.client.0.vm00.stdout:5/638: getdents d1f/d96/dbd 0 2026-03-10T12:38:00.274 INFO:tasks.workunit.client.1.vm07.stdout:9/592: truncate d5/d13/d57/d3e/fa9 3614295 0 2026-03-10T12:38:00.274 
INFO:tasks.workunit.client.1.vm07.stdout:9/593: stat d5/d69 0 2026-03-10T12:38:00.275 INFO:tasks.workunit.client.0.vm00.stdout:1/592: rename da/d12/c3a to da/d21/db3/d59/da6/d8b/cc7 0 2026-03-10T12:38:00.276 INFO:tasks.workunit.client.0.vm00.stdout:5/639: unlink d1f/d26/l92 0 2026-03-10T12:38:00.280 INFO:tasks.workunit.client.0.vm00.stdout:5/640: rename d1f/d26/d2b/d35/d53/d72/da3 to d1f/d26/de3 0 2026-03-10T12:38:00.284 INFO:tasks.workunit.client.0.vm00.stdout:5/641: mkdir d1f/d26/d2b/de4 0 2026-03-10T12:38:00.284 INFO:tasks.workunit.client.0.vm00.stdout:5/642: read - d1f/d96/fdf zero size 2026-03-10T12:38:00.284 INFO:tasks.workunit.client.0.vm00.stdout:5/643: unlink d1f/l31 0 2026-03-10T12:38:00.286 INFO:tasks.workunit.client.0.vm00.stdout:5/644: getdents d1f/d26/d2b/d35/d53 0 2026-03-10T12:38:00.287 INFO:tasks.workunit.client.0.vm00.stdout:6/427: read d2/da/dc/d2f/f4f [57288,105792] 0 2026-03-10T12:38:00.289 INFO:tasks.workunit.client.0.vm00.stdout:7/436: mknod da/ca5 0 2026-03-10T12:38:00.289 INFO:tasks.workunit.client.0.vm00.stdout:5/645: link d1f/d26/cb4 d1f/d26/d2e/d58/d6b/d86/ce5 0 2026-03-10T12:38:00.291 INFO:tasks.workunit.client.0.vm00.stdout:5/646: creat d1f/d26/d2b/fe6 x:0 0 0 2026-03-10T12:38:00.291 INFO:tasks.workunit.client.0.vm00.stdout:5/647: write d1f/d96/fdf [341457,69260] 0 2026-03-10T12:38:00.292 INFO:tasks.workunit.client.0.vm00.stdout:5/648: mkdir d1f/d6a/d94/dc3/de7 0 2026-03-10T12:38:00.297 INFO:tasks.workunit.client.0.vm00.stdout:0/540: dwrite d3/d7/f11 [0,4194304] 0 2026-03-10T12:38:00.298 INFO:tasks.workunit.client.1.vm07.stdout:7/510: symlink d0/d57/laa 0 2026-03-10T12:38:00.301 INFO:tasks.workunit.client.0.vm00.stdout:6/428: mkdir d2/da/dc/d94 0 2026-03-10T12:38:00.305 INFO:tasks.workunit.client.1.vm07.stdout:7/511: mkdir d0/d47/dab 0 2026-03-10T12:38:00.306 INFO:tasks.workunit.client.0.vm00.stdout:0/541: rmdir d3/d7/d4c/d5b/d38 39 2026-03-10T12:38:00.307 INFO:tasks.workunit.client.1.vm07.stdout:4/640: dwrite d0/d4/d5/d34/f94 
[0,4194304] 0 2026-03-10T12:38:00.315 INFO:tasks.workunit.client.1.vm07.stdout:7/512: fsync d0/f56 0 2026-03-10T12:38:00.318 INFO:tasks.workunit.client.0.vm00.stdout:4/575: dread df/d57/f7c [0,4194304] 0 2026-03-10T12:38:00.322 INFO:tasks.workunit.client.0.vm00.stdout:0/542: unlink d3/d33/l61 0 2026-03-10T12:38:00.325 INFO:tasks.workunit.client.0.vm00.stdout:4/576: dread df/d1f/d22/d26/f39 [0,4194304] 0 2026-03-10T12:38:00.327 INFO:tasks.workunit.client.0.vm00.stdout:4/577: getdents df/d1f 0 2026-03-10T12:38:00.327 INFO:tasks.workunit.client.0.vm00.stdout:4/578: dread - df/d1f/d22/f5a zero size 2026-03-10T12:38:00.329 INFO:tasks.workunit.client.1.vm07.stdout:6/499: write d1/d4/d6/d16/d1a/f8e [2418708,115248] 0 2026-03-10T12:38:00.330 INFO:tasks.workunit.client.0.vm00.stdout:4/579: rename df/d63/c83 to df/d6c/d90/cc1 0 2026-03-10T12:38:00.331 INFO:tasks.workunit.client.0.vm00.stdout:4/580: creat df/d32/d76/fc2 x:0 0 0 2026-03-10T12:38:00.337 INFO:tasks.workunit.client.0.vm00.stdout:4/581: truncate df/d32/d64/f67 526053 0 2026-03-10T12:38:00.340 INFO:tasks.workunit.client.0.vm00.stdout:4/582: creat df/d93/dbc/fc3 x:0 0 0 2026-03-10T12:38:00.341 INFO:tasks.workunit.client.0.vm00.stdout:4/583: fsync df/d57/fa0 0 2026-03-10T12:38:00.342 INFO:tasks.workunit.client.0.vm00.stdout:6/429: symlink d2/d14/l95 0 2026-03-10T12:38:00.343 INFO:tasks.workunit.client.0.vm00.stdout:4/584: symlink df/d1f/d22/d26/d65/da7/lc4 0 2026-03-10T12:38:00.349 INFO:tasks.workunit.client.0.vm00.stdout:4/585: dwrite df/d1f/d36/d3a/d41/f47 [0,4194304] 0 2026-03-10T12:38:00.364 INFO:tasks.workunit.client.1.vm07.stdout:3/569: truncate dc/dd/d28/d7a/f88 1410584 0 2026-03-10T12:38:00.364 INFO:tasks.workunit.client.1.vm07.stdout:3/570: chown dc/dd/d1f/dc7 83493 1 2026-03-10T12:38:00.364 INFO:tasks.workunit.client.1.vm07.stdout:6/500: fdatasync d1/d4/d6/d16/d1a/d33/f3c 0 2026-03-10T12:38:00.364 INFO:tasks.workunit.client.0.vm00.stdout:4/586: rmdir df/d57 39 2026-03-10T12:38:00.364 
INFO:tasks.workunit.client.0.vm00.stdout:4/587: creat df/d1f/d22/d26/dab/fc5 x:0 0 0 2026-03-10T12:38:00.364 INFO:tasks.workunit.client.0.vm00.stdout:4/588: chown df/d1f/d22/d26/f31 190 1 2026-03-10T12:38:00.364 INFO:tasks.workunit.client.0.vm00.stdout:4/589: readlink df/d1f/d22/d26/dab/d73/lae 0 2026-03-10T12:38:00.365 INFO:tasks.workunit.client.0.vm00.stdout:4/590: readlink df/d8a/lb5 0 2026-03-10T12:38:00.368 INFO:tasks.workunit.client.0.vm00.stdout:0/543: symlink d3/d40/lb6 0 2026-03-10T12:38:00.368 INFO:tasks.workunit.client.1.vm07.stdout:1/510: write d9/df/d29/f8b [1245540,40872] 0 2026-03-10T12:38:00.371 INFO:tasks.workunit.client.0.vm00.stdout:4/591: dwrite df/f20 [0,4194304] 0 2026-03-10T12:38:00.374 INFO:tasks.workunit.client.0.vm00.stdout:6/430: symlink d2/d42/d80/d89/l96 0 2026-03-10T12:38:00.378 INFO:tasks.workunit.client.1.vm07.stdout:2/434: rename d0/d42/d26/d38/f3a to d0/d42/d4e/d77/d70/f96 0 2026-03-10T12:38:00.381 INFO:tasks.workunit.client.1.vm07.stdout:6/501: creat d1/d4/d6/d4e/fa1 x:0 0 0 2026-03-10T12:38:00.383 INFO:tasks.workunit.client.0.vm00.stdout:6/431: rename d2/d14/f3f to d2/da/dc/d83/f97 0 2026-03-10T12:38:00.386 INFO:tasks.workunit.client.0.vm00.stdout:0/544: symlink d3/d22/da5/lb7 0 2026-03-10T12:38:00.392 INFO:tasks.workunit.client.1.vm07.stdout:3/571: rmdir dc/dd/d43 39 2026-03-10T12:38:00.394 INFO:tasks.workunit.client.1.vm07.stdout:3/572: read dc/d18/d99/da3/fb1 [3970773,120186] 0 2026-03-10T12:38:00.415 INFO:tasks.workunit.client.1.vm07.stdout:4/641: rename d0/d19 to d0/d4/d10/d3c/d2b/d54/de1 0 2026-03-10T12:38:00.418 INFO:tasks.workunit.client.1.vm07.stdout:0/580: write d0/d14/f19 [2801566,2121] 0 2026-03-10T12:38:00.418 INFO:tasks.workunit.client.1.vm07.stdout:6/502: truncate d1/d4/d6/f60 488846 0 2026-03-10T12:38:00.419 INFO:tasks.workunit.client.1.vm07.stdout:6/503: fsync d1/d4/d6/d16/d1a/d33/f61 0 2026-03-10T12:38:00.421 INFO:tasks.workunit.client.0.vm00.stdout:7/437: write da/f35 [1418927,64258] 0 2026-03-10T12:38:00.423 
INFO:tasks.workunit.client.1.vm07.stdout:8/515: write d1/d3/d6/d50/f80 [39126,103844] 0 2026-03-10T12:38:00.424 INFO:tasks.workunit.client.1.vm07.stdout:8/516: write d1/f7 [1048173,68991] 0 2026-03-10T12:38:00.433 INFO:tasks.workunit.client.1.vm07.stdout:1/511: mkdir d9/d2d/d4f/d75/d77/da7 0 2026-03-10T12:38:00.439 INFO:tasks.workunit.client.1.vm07.stdout:9/594: dwrite d5/d1f/d31/fad [0,4194304] 0 2026-03-10T12:38:00.441 INFO:tasks.workunit.client.1.vm07.stdout:9/595: chown d5/d1f/d5e/d6b/fae 120 1 2026-03-10T12:38:00.445 INFO:tasks.workunit.client.1.vm07.stdout:3/573: fsync dc/dd/d1f/d45/f56 0 2026-03-10T12:38:00.452 INFO:tasks.workunit.client.1.vm07.stdout:5/555: rename d0/d22/d18/d19/d21/c29 to d0/d22/d18/d3e/d53/cc0 0 2026-03-10T12:38:00.469 INFO:tasks.workunit.client.1.vm07.stdout:8/517: rmdir d1/d3/d6c 39 2026-03-10T12:38:00.469 INFO:tasks.workunit.client.0.vm00.stdout:0/545: rename d3/d7/d4c/d5b/d38/f89 to d3/db/d24/d25/fb8 0 2026-03-10T12:38:00.469 INFO:tasks.workunit.client.0.vm00.stdout:0/546: fsync d3/db/d77/faa 0 2026-03-10T12:38:00.469 INFO:tasks.workunit.client.0.vm00.stdout:7/438: symlink da/d47/d87/la6 0 2026-03-10T12:38:00.473 INFO:tasks.workunit.client.0.vm00.stdout:0/547: unlink d3/db/d77/d82/c8d 0 2026-03-10T12:38:00.473 INFO:tasks.workunit.client.1.vm07.stdout:1/512: rename d9/df/f10 to d9/df/d29/d2b/d30/fa8 0 2026-03-10T12:38:00.475 INFO:tasks.workunit.client.1.vm07.stdout:8/518: unlink d1/d3/d11/c3b 0 2026-03-10T12:38:00.482 INFO:tasks.workunit.client.0.vm00.stdout:0/548: fdatasync d3/d7/d3c/d74/f78 0 2026-03-10T12:38:00.485 INFO:tasks.workunit.client.1.vm07.stdout:4/642: dread d0/d4/d10/f4b [0,4194304] 0 2026-03-10T12:38:00.485 INFO:tasks.workunit.client.1.vm07.stdout:9/596: mknod d5/d16/ccf 0 2026-03-10T12:38:00.485 INFO:tasks.workunit.client.0.vm00.stdout:0/549: mknod d3/d7/d4c/d5b/d38/d44/d5a/cb9 0 2026-03-10T12:38:00.485 INFO:tasks.workunit.client.0.vm00.stdout:0/550: rmdir d3 39 2026-03-10T12:38:00.486 
INFO:tasks.workunit.client.1.vm07.stdout:9/597: creat d5/d13/d6c/da4/fd0 x:0 0 0 2026-03-10T12:38:00.490 INFO:tasks.workunit.client.1.vm07.stdout:9/598: rename d5/d1f/d31/d76/fb0 to d5/d69/d93/fd1 0 2026-03-10T12:38:00.491 INFO:tasks.workunit.client.1.vm07.stdout:4/643: creat d0/d5c/fe2 x:0 0 0 2026-03-10T12:38:00.513 INFO:tasks.workunit.client.0.vm00.stdout:7/439: sync 2026-03-10T12:38:00.516 INFO:tasks.workunit.client.0.vm00.stdout:7/440: stat da/d41/d48/l5d 0 2026-03-10T12:38:00.520 INFO:tasks.workunit.client.0.vm00.stdout:3/635: dwrite dd/d4e/faa [0,4194304] 0 2026-03-10T12:38:00.539 INFO:tasks.workunit.client.1.vm07.stdout:7/513: dwrite d0/f28 [0,4194304] 0 2026-03-10T12:38:00.539 INFO:tasks.workunit.client.1.vm07.stdout:2/435: write d0/f12 [2414380,45160] 0 2026-03-10T12:38:00.539 INFO:tasks.workunit.client.0.vm00.stdout:2/597: write d4/d53/d76/d9b/dad/f80 [1209747,85479] 0 2026-03-10T12:38:00.539 INFO:tasks.workunit.client.0.vm00.stdout:2/598: chown d4/d6/f2b 142894 1 2026-03-10T12:38:00.539 INFO:tasks.workunit.client.0.vm00.stdout:9/645: dwrite d0/d3d/d59/d4e/dba/d1e/d2b/f6b [4194304,4194304] 0 2026-03-10T12:38:00.539 INFO:tasks.workunit.client.0.vm00.stdout:9/646: chown d0/d3d/d59/d4e/dba/d1e/d2b/f5f 654 1 2026-03-10T12:38:00.539 INFO:tasks.workunit.client.0.vm00.stdout:8/470: rename d0/d12 to d0/d93 0 2026-03-10T12:38:00.539 INFO:tasks.workunit.client.0.vm00.stdout:2/599: creat d4/d6/d2d/d31/fcc x:0 0 0 2026-03-10T12:38:00.539 INFO:tasks.workunit.client.0.vm00.stdout:2/600: readlink d4/d6/l1e 0 2026-03-10T12:38:00.539 INFO:tasks.workunit.client.0.vm00.stdout:2/601: write d4/d6/d2d/d31/f79 [3792161,29955] 0 2026-03-10T12:38:00.540 INFO:tasks.workunit.client.0.vm00.stdout:3/636: write dd/d18/d13/d1d/d43/f95 [577458,70075] 0 2026-03-10T12:38:00.542 INFO:tasks.workunit.client.1.vm07.stdout:2/436: fdatasync d0/d42/d4e/d77/f6f 0 2026-03-10T12:38:00.545 INFO:tasks.workunit.client.0.vm00.stdout:2/602: mknod d4/d53/d76/ccd 0 2026-03-10T12:38:00.545 
INFO:tasks.workunit.client.1.vm07.stdout:7/514: read d0/d61/f93 [602519,59814] 0 2026-03-10T12:38:00.546 INFO:tasks.workunit.client.0.vm00.stdout:7/441: symlink da/d26/d50/la7 0 2026-03-10T12:38:00.550 INFO:tasks.workunit.client.0.vm00.stdout:7/442: creat da/d41/d7b/d9d/fa8 x:0 0 0 2026-03-10T12:38:00.553 INFO:tasks.workunit.client.1.vm07.stdout:4/644: sync 2026-03-10T12:38:00.554 INFO:tasks.workunit.client.1.vm07.stdout:4/645: symlink d0/d4/d5/d78/dc5/le3 0 2026-03-10T12:38:00.558 INFO:tasks.workunit.client.1.vm07.stdout:4/646: mknod d0/d4/d5/d8f/ce4 0 2026-03-10T12:38:00.558 INFO:tasks.workunit.client.0.vm00.stdout:7/443: dwrite da/f17 [0,4194304] 0 2026-03-10T12:38:00.558 INFO:tasks.workunit.client.0.vm00.stdout:1/593: write da/d24/f81 [2157500,5466] 0 2026-03-10T12:38:00.561 INFO:tasks.workunit.client.0.vm00.stdout:3/637: unlink fb 0 2026-03-10T12:38:00.565 INFO:tasks.workunit.client.0.vm00.stdout:5/649: dwrite d1f/d26/f9f [4194304,4194304] 0 2026-03-10T12:38:00.570 INFO:tasks.workunit.client.0.vm00.stdout:4/592: rmdir df/d93/dbc 39 2026-03-10T12:38:00.571 INFO:tasks.workunit.client.0.vm00.stdout:1/594: creat da/d24/d73/fc8 x:0 0 0 2026-03-10T12:38:00.582 INFO:tasks.workunit.client.0.vm00.stdout:8/471: creat d0/d46/f94 x:0 0 0 2026-03-10T12:38:00.586 INFO:tasks.workunit.client.0.vm00.stdout:3/638: dwrite dd/d18/d14/fc0 [0,4194304] 0 2026-03-10T12:38:00.587 INFO:tasks.workunit.client.0.vm00.stdout:3/639: write dd/d2a/fbc [549391,129673] 0 2026-03-10T12:38:00.593 INFO:tasks.workunit.client.1.vm07.stdout:4/647: dread d0/d4/d7a/f4f [0,4194304] 0 2026-03-10T12:38:00.596 INFO:tasks.workunit.client.0.vm00.stdout:8/472: truncate d0/d93/d2d/f75 1099302 0 2026-03-10T12:38:00.598 INFO:tasks.workunit.client.1.vm07.stdout:4/648: dwrite d0/d4/d10/d3c/d2b/f60 [0,4194304] 0 2026-03-10T12:38:00.599 INFO:tasks.workunit.client.0.vm00.stdout:4/593: sync 2026-03-10T12:38:00.604 INFO:tasks.workunit.client.1.vm07.stdout:2/437: dread d0/f44 [0,4194304] 0 2026-03-10T12:38:00.606 
INFO:tasks.workunit.client.1.vm07.stdout:4/649: link d0/d4/d5/d34/f94 d0/d4/d10/d3c/fe5 0 2026-03-10T12:38:00.608 INFO:tasks.workunit.client.0.vm00.stdout:5/650: creat d1f/d26/d2b/d35/fe8 x:0 0 0 2026-03-10T12:38:00.613 INFO:tasks.workunit.client.0.vm00.stdout:5/651: chown d1f/d26/d2b/d35/d53/d72/d9d 15949099 1 2026-03-10T12:38:00.613 INFO:tasks.workunit.client.1.vm07.stdout:2/438: symlink d0/d80/d93/l97 0 2026-03-10T12:38:00.613 INFO:tasks.workunit.client.1.vm07.stdout:2/439: dread d0/d42/d26/f2e [4194304,4194304] 0 2026-03-10T12:38:00.616 INFO:tasks.workunit.client.1.vm07.stdout:2/440: dread - d0/d29/d64/f78 zero size 2026-03-10T12:38:00.619 INFO:tasks.workunit.client.0.vm00.stdout:9/647: dread d0/d3d/d43/f54 [4194304,4194304] 0 2026-03-10T12:38:00.622 INFO:tasks.workunit.client.1.vm07.stdout:2/441: unlink d0/d42/c55 0 2026-03-10T12:38:00.631 INFO:tasks.workunit.client.0.vm00.stdout:4/594: chown df/d63/d77/f9d 242224 1 2026-03-10T12:38:00.637 INFO:tasks.workunit.client.0.vm00.stdout:9/648: creat d0/d3d/d59/d4e/dba/d1e/d85/fe7 x:0 0 0 2026-03-10T12:38:00.642 INFO:tasks.workunit.client.0.vm00.stdout:2/603: dread d4/dd/f3e [0,4194304] 0 2026-03-10T12:38:00.647 INFO:tasks.workunit.client.0.vm00.stdout:6/432: dwrite d2/d16/f47 [0,4194304] 0 2026-03-10T12:38:00.650 INFO:tasks.workunit.client.0.vm00.stdout:2/604: creat d4/d6/d2d/dc3/fce x:0 0 0 2026-03-10T12:38:00.656 INFO:tasks.workunit.client.0.vm00.stdout:9/649: dwrite d0/d3d/d43/f68 [0,4194304] 0 2026-03-10T12:38:00.656 INFO:tasks.workunit.client.0.vm00.stdout:4/595: rename df/d1f/d22/d26/d65/d91/da2 to df/d1f/d36/dc6 0 2026-03-10T12:38:00.656 INFO:tasks.workunit.client.0.vm00.stdout:6/433: mkdir d2/d42/d80/d98 0 2026-03-10T12:38:00.657 INFO:tasks.workunit.client.0.vm00.stdout:6/434: write d2/da/f2c [437061,96239] 0 2026-03-10T12:38:00.659 INFO:tasks.workunit.client.0.vm00.stdout:4/596: creat df/d1f/d36/d3a/d41/fc7 x:0 0 0 2026-03-10T12:38:00.660 INFO:tasks.workunit.client.0.vm00.stdout:4/597: dread - 
df/d1f/d22/d26/f31 zero size 2026-03-10T12:38:00.660 INFO:tasks.workunit.client.0.vm00.stdout:9/650: dwrite d0/d3d/d59/d4e/dba/d1e/d27/f75 [0,4194304] 0 2026-03-10T12:38:00.666 INFO:tasks.workunit.client.0.vm00.stdout:4/598: symlink df/d63/d77/lc8 0 2026-03-10T12:38:00.666 INFO:tasks.workunit.client.0.vm00.stdout:9/651: creat d0/d3d/d59/fe8 x:0 0 0 2026-03-10T12:38:00.667 INFO:tasks.workunit.client.0.vm00.stdout:6/435: rename d2/d16/d29/c67 to d2/d16/d74/c99 0 2026-03-10T12:38:00.668 INFO:tasks.workunit.client.0.vm00.stdout:9/652: read d0/d3d/f83 [71713,1641] 0 2026-03-10T12:38:00.670 INFO:tasks.workunit.client.0.vm00.stdout:6/436: dread - d2/d16/d74/f6e zero size 2026-03-10T12:38:00.671 INFO:tasks.workunit.client.0.vm00.stdout:4/599: symlink df/d1f/d36/d3a/d41/lc9 0 2026-03-10T12:38:00.673 INFO:tasks.workunit.client.0.vm00.stdout:6/437: dread d2/f30 [0,4194304] 0 2026-03-10T12:38:00.678 INFO:tasks.workunit.client.0.vm00.stdout:7/444: read f0 [10440585,61702] 0 2026-03-10T12:38:00.683 INFO:tasks.workunit.client.0.vm00.stdout:7/445: chown da/d1b/d40/f44 4447729 1 2026-03-10T12:38:00.684 INFO:tasks.workunit.client.0.vm00.stdout:9/653: mknod d0/d3d/d59/d4e/dba/d1e/d27/ce9 0 2026-03-10T12:38:00.687 INFO:tasks.workunit.client.0.vm00.stdout:4/600: mkdir df/d6c/dca 0 2026-03-10T12:38:00.689 INFO:tasks.workunit.client.0.vm00.stdout:4/601: truncate df/d1f/d22/f52 12164 0 2026-03-10T12:38:00.690 INFO:tasks.workunit.client.0.vm00.stdout:4/602: read df/d1f/d22/d26/d65/d91/f50 [3104823,26317] 0 2026-03-10T12:38:00.694 INFO:tasks.workunit.client.0.vm00.stdout:4/603: unlink df/d1f/l21 0 2026-03-10T12:38:00.703 INFO:tasks.workunit.client.0.vm00.stdout:7/446: creat da/d26/d50/d73/d89/fa9 x:0 0 0 2026-03-10T12:38:00.703 INFO:tasks.workunit.client.0.vm00.stdout:4/604: mkdir df/d1f/d22/dcb 0 2026-03-10T12:38:00.703 INFO:tasks.workunit.client.0.vm00.stdout:7/447: fdatasync da/d25/d2c/f4f 0 2026-03-10T12:38:00.705 INFO:tasks.workunit.client.0.vm00.stdout:7/448: dwrite da/d3f/d60/f85 
[0,4194304] 0 2026-03-10T12:38:00.706 INFO:tasks.workunit.client.0.vm00.stdout:7/449: chown da/d26/d37/l7a 158 1 2026-03-10T12:38:00.711 INFO:tasks.workunit.client.0.vm00.stdout:7/450: write da/d26/d37/f79 [1406128,45670] 0 2026-03-10T12:38:00.735 INFO:tasks.workunit.client.0.vm00.stdout:9/654: dread d0/d3d/d59/f94 [0,4194304] 0 2026-03-10T12:38:00.738 INFO:tasks.workunit.client.0.vm00.stdout:4/605: dread df/f42 [0,4194304] 0 2026-03-10T12:38:00.738 INFO:tasks.workunit.client.0.vm00.stdout:4/606: chown df/d93/fc0 0 1 2026-03-10T12:38:00.762 INFO:tasks.workunit.client.0.vm00.stdout:5/652: dread d1f/f59 [0,4194304] 0 2026-03-10T12:38:00.763 INFO:tasks.workunit.client.0.vm00.stdout:5/653: truncate d1f/d26/d6f/fa9 636228 0 2026-03-10T12:38:00.763 INFO:tasks.workunit.client.0.vm00.stdout:5/654: fdatasync d1f/d26/f48 0 2026-03-10T12:38:00.770 INFO:tasks.workunit.client.0.vm00.stdout:1/595: write da/d21/d27/d6a/f9e [394996,87167] 0 2026-03-10T12:38:00.771 INFO:tasks.workunit.client.0.vm00.stdout:1/596: dread - da/d21/d27/d6a/f6b zero size 2026-03-10T12:38:00.774 INFO:tasks.workunit.client.0.vm00.stdout:1/597: creat da/d12/da8/fc9 x:0 0 0 2026-03-10T12:38:00.774 INFO:tasks.workunit.client.0.vm00.stdout:1/598: read - da/d21/d39/d77/fbb zero size 2026-03-10T12:38:00.795 INFO:tasks.workunit.client.1.vm07.stdout:0/581: write d0/f1c [12807727,3878] 0 2026-03-10T12:38:00.797 INFO:tasks.workunit.client.1.vm07.stdout:3/574: write dc/d18/d24/f3e [4710613,90360] 0 2026-03-10T12:38:00.798 INFO:tasks.workunit.client.1.vm07.stdout:3/575: dread dc/d18/d2d/f71 [0,4194304] 0 2026-03-10T12:38:00.800 INFO:tasks.workunit.client.1.vm07.stdout:5/556: write d0/d22/f89 [149129,129992] 0 2026-03-10T12:38:00.802 INFO:tasks.workunit.client.0.vm00.stdout:3/640: truncate dd/d18/d14/fbe 3814401 0 2026-03-10T12:38:00.803 INFO:tasks.workunit.client.0.vm00.stdout:3/641: chown dd/d3d/d84 12 1 2026-03-10T12:38:00.803 INFO:tasks.workunit.client.1.vm07.stdout:1/513: write d9/df/d29/d2b/d30/f38 [868761,29197] 
0 2026-03-10T12:38:00.805 INFO:tasks.workunit.client.0.vm00.stdout:3/642: mkdir dd/d18/d13/d1d/dc6/dd5 0 2026-03-10T12:38:00.811 INFO:tasks.workunit.client.1.vm07.stdout:8/519: dwrite d1/d3/d6/d50/f5e [0,4194304] 0 2026-03-10T12:38:00.815 INFO:tasks.workunit.client.1.vm07.stdout:6/504: dread d1/d4/d6/f60 [0,4194304] 0 2026-03-10T12:38:00.820 INFO:tasks.workunit.client.1.vm07.stdout:1/514: rename d9/f19 to d9/df/d29/d2b/d31/d91/fa9 0 2026-03-10T12:38:00.822 INFO:tasks.workunit.client.0.vm00.stdout:0/551: write d3/d7/d4c/d5b/d38/d44/d5a/f86 [229314,764] 0 2026-03-10T12:38:00.823 INFO:tasks.workunit.client.0.vm00.stdout:0/552: dread - d3/d40/d65/fa8 zero size 2026-03-10T12:38:00.823 INFO:tasks.workunit.client.1.vm07.stdout:9/599: write d5/d13/f67 [565008,49451] 0 2026-03-10T12:38:00.835 INFO:tasks.workunit.client.1.vm07.stdout:7/515: write d0/d47/f81 [766870,78194] 0 2026-03-10T12:38:00.845 INFO:tasks.workunit.client.1.vm07.stdout:9/600: dread d5/d1f/d31/f82 [0,4194304] 0 2026-03-10T12:38:00.846 INFO:tasks.workunit.client.1.vm07.stdout:6/505: dread d1/d4/d6/d16/d1a/d33/f3c [0,4194304] 0 2026-03-10T12:38:00.856 INFO:tasks.workunit.client.1.vm07.stdout:6/506: dwrite d1/d4/d6/d16/d1a/d2c/f78 [0,4194304] 0 2026-03-10T12:38:00.862 INFO:tasks.workunit.client.1.vm07.stdout:3/576: mknod dc/dd/d43/d76/d95/da0/cc8 0 2026-03-10T12:38:00.867 INFO:tasks.workunit.client.0.vm00.stdout:2/605: dread d4/d6/d2d/d3a/f7c [0,4194304] 0 2026-03-10T12:38:00.870 INFO:tasks.workunit.client.1.vm07.stdout:4/650: dwrite d0/d4/d10/d3c/d2b/d54/de1/f25 [0,4194304] 0 2026-03-10T12:38:00.872 INFO:tasks.workunit.client.1.vm07.stdout:5/557: creat d0/d22/d18/d19/d36/fc1 x:0 0 0 2026-03-10T12:38:00.873 INFO:tasks.workunit.client.1.vm07.stdout:6/507: dwrite d1/d4/d6/d16/d1a/d33/f92 [0,4194304] 0 2026-03-10T12:38:00.888 INFO:tasks.workunit.client.0.vm00.stdout:3/643: rename dd/d64/l7e to dd/d64/d93/ld6 0 2026-03-10T12:38:00.889 INFO:tasks.workunit.client.0.vm00.stdout:3/644: dread - dd/d18/d13/d1d/fc9 zero 
size 2026-03-10T12:38:00.892 INFO:tasks.workunit.client.0.vm00.stdout:9/655: write d0/d3d/d59/d4e/dba/d19/f20 [5667604,115966] 0 2026-03-10T12:38:00.895 INFO:tasks.workunit.client.0.vm00.stdout:4/607: dwrite df/f1e [0,4194304] 0 2026-03-10T12:38:00.901 INFO:tasks.workunit.client.0.vm00.stdout:6/438: write d2/f5e [1682710,43151] 0 2026-03-10T12:38:00.908 INFO:tasks.workunit.client.0.vm00.stdout:0/553: creat d3/d7/d3c/fba x:0 0 0 2026-03-10T12:38:00.911 INFO:tasks.workunit.client.0.vm00.stdout:0/554: stat d3/d7/d3c/f72 0 2026-03-10T12:38:00.916 INFO:tasks.workunit.client.0.vm00.stdout:7/451: dwrite da/d1b/d40/f44 [0,4194304] 0 2026-03-10T12:38:00.918 INFO:tasks.workunit.client.0.vm00.stdout:7/452: chown da/d1b 6 1 2026-03-10T12:38:00.923 INFO:tasks.workunit.client.0.vm00.stdout:9/656: mknod d0/d7f/cea 0 2026-03-10T12:38:00.932 INFO:tasks.workunit.client.0.vm00.stdout:6/439: sync 2026-03-10T12:38:00.932 INFO:tasks.workunit.client.0.vm00.stdout:3/645: creat dd/d18/d13/d1d/dc6/dd5/fd7 x:0 0 0 2026-03-10T12:38:00.933 INFO:tasks.workunit.client.0.vm00.stdout:9/657: read d0/d3d/d59/d4e/f7c [2765341,32639] 0 2026-03-10T12:38:00.936 INFO:tasks.workunit.client.0.vm00.stdout:2/606: link d4/d6/d2d/dc3/lc8 d4/d6/d93/lcf 0 2026-03-10T12:38:00.943 INFO:tasks.workunit.client.0.vm00.stdout:9/658: mknod d0/d7f/db8/dc4/db0/dcc/ceb 0 2026-03-10T12:38:00.947 INFO:tasks.workunit.client.0.vm00.stdout:9/659: symlink d0/d3d/d59/d4e/dba/d1e/d85/d98/lec 0 2026-03-10T12:38:00.947 INFO:tasks.workunit.client.0.vm00.stdout:3/646: creat dd/d27/d2c/d34/fd8 x:0 0 0 2026-03-10T12:38:00.949 INFO:tasks.workunit.client.0.vm00.stdout:9/660: read d0/d3d/d43/f68 [2893892,121736] 0 2026-03-10T12:38:00.954 INFO:tasks.workunit.client.0.vm00.stdout:7/453: unlink da/d25/d2e/f5e 0 2026-03-10T12:38:00.967 INFO:tasks.workunit.client.0.vm00.stdout:1/599: dwrite da/d24/d5a/f68 [0,4194304] 0 2026-03-10T12:38:00.981 INFO:tasks.workunit.client.0.vm00.stdout:2/607: read d4/d53/d68/f8a [1082201,99565] 0 
2026-03-10T12:38:00.984 INFO:tasks.workunit.client.0.vm00.stdout:1/600: mknod da/d21/d39/cca 0 2026-03-10T12:38:00.986 INFO:tasks.workunit.client.0.vm00.stdout:1/601: read da/d21/db3/d5d/d80/f8a [2466923,68607] 0 2026-03-10T12:38:00.998 INFO:tasks.workunit.client.0.vm00.stdout:6/440: mknod d2/d39/c9a 0 2026-03-10T12:38:00.998 INFO:tasks.workunit.client.0.vm00.stdout:2/608: truncate d4/f73 3601849 0 2026-03-10T12:38:00.999 INFO:tasks.workunit.client.0.vm00.stdout:1/602: write da/d24/d73/fb6 [1291569,108063] 0 2026-03-10T12:38:00.999 INFO:tasks.workunit.client.0.vm00.stdout:1/603: stat da/d12/d26/f2e 0 2026-03-10T12:38:00.999 INFO:tasks.workunit.client.0.vm00.stdout:1/604: stat da/d24/d28 0 2026-03-10T12:38:00.999 INFO:tasks.workunit.client.0.vm00.stdout:1/605: mkdir da/d12/d91/dcb 0 2026-03-10T12:38:00.999 INFO:tasks.workunit.client.0.vm00.stdout:2/609: dread d4/dd/f3e [0,4194304] 0 2026-03-10T12:38:00.999 INFO:tasks.workunit.client.0.vm00.stdout:2/610: stat d4/d6/f22 0 2026-03-10T12:38:00.999 INFO:tasks.workunit.client.0.vm00.stdout:2/611: rmdir d4/d53/d76/d9b/dad 39 2026-03-10T12:38:01.003 INFO:tasks.workunit.client.0.vm00.stdout:1/606: rmdir da/d24/d4a 0 2026-03-10T12:38:01.008 INFO:tasks.workunit.client.0.vm00.stdout:2/612: creat d4/d53/d76/d9b/dad/d8e/fd0 x:0 0 0 2026-03-10T12:38:01.008 INFO:tasks.workunit.client.0.vm00.stdout:6/441: creat d2/d39/f9b x:0 0 0 2026-03-10T12:38:01.009 INFO:tasks.workunit.client.0.vm00.stdout:6/442: fdatasync d2/da/dc/f27 0 2026-03-10T12:38:01.019 INFO:tasks.workunit.client.0.vm00.stdout:2/613: sync 2026-03-10T12:38:01.022 INFO:tasks.workunit.client.0.vm00.stdout:8/473: dread d0/d93/d17/f1d [0,4194304] 0 2026-03-10T12:38:01.023 INFO:tasks.workunit.client.0.vm00.stdout:2/614: creat d4/d6/d93/dc6/fd1 x:0 0 0 2026-03-10T12:38:01.024 INFO:tasks.workunit.client.0.vm00.stdout:2/615: chown d4/d6/dca/l72 106 1 2026-03-10T12:38:01.028 INFO:tasks.workunit.client.0.vm00.stdout:8/474: creat d0/d93/d36/d5b/f95 x:0 0 0 2026-03-10T12:38:01.032 
INFO:tasks.workunit.client.0.vm00.stdout:8/475: mknod d0/d93/d36/c96 0 2026-03-10T12:38:01.044 INFO:tasks.workunit.client.1.vm07.stdout:1/515: write d9/d2d/d4f/d75/f83 [2073100,37822] 0 2026-03-10T12:38:01.063 INFO:tasks.workunit.client.0.vm00.stdout:1/607: unlink da/d12/f20 0 2026-03-10T12:38:01.065 INFO:tasks.workunit.client.0.vm00.stdout:5/655: creat d1f/d26/fe9 x:0 0 0 2026-03-10T12:38:01.065 INFO:tasks.workunit.client.1.vm07.stdout:7/516: write d0/f70 [2512157,37667] 0 2026-03-10T12:38:01.069 INFO:tasks.workunit.client.0.vm00.stdout:0/555: dwrite d3/d40/f4e [0,4194304] 0 2026-03-10T12:38:01.069 INFO:tasks.workunit.client.0.vm00.stdout:1/608: dwrite da/d21/f74 [0,4194304] 0 2026-03-10T12:38:01.071 INFO:tasks.workunit.client.0.vm00.stdout:0/556: write d3/db/d24/f2f [8632646,8688] 0 2026-03-10T12:38:01.077 INFO:tasks.workunit.client.0.vm00.stdout:0/557: fsync d3/d7/d3c/d4b/f79 0 2026-03-10T12:38:01.083 INFO:tasks.workunit.client.0.vm00.stdout:1/609: unlink da/f14 0 2026-03-10T12:38:01.089 INFO:tasks.workunit.client.0.vm00.stdout:1/610: creat da/d21/db3/d5d/d80/fcc x:0 0 0 2026-03-10T12:38:01.094 INFO:tasks.workunit.client.1.vm07.stdout:5/558: dread - d0/d22/d18/f97 zero size 2026-03-10T12:38:01.097 INFO:tasks.workunit.client.0.vm00.stdout:7/454: dwrite da/d1b/f39 [0,4194304] 0 2026-03-10T12:38:01.106 INFO:tasks.workunit.client.0.vm00.stdout:0/558: rename d3/d7/d4c/d5b/d38/d44/d5a/fac to d3/d7/d4c/d5b/d38/db3/fbb 0 2026-03-10T12:38:01.127 INFO:tasks.workunit.client.0.vm00.stdout:5/656: link d1f/d26/cb4 d1f/d26/d2b/d37/cea 0 2026-03-10T12:38:01.127 INFO:tasks.workunit.client.0.vm00.stdout:1/611: link da/d21/db3/l7d da/d21/db3/d5d/d72/d7e/dbf/lcd 0 2026-03-10T12:38:01.127 INFO:tasks.workunit.client.0.vm00.stdout:1/612: fsync da/d24/d73/fc8 0 2026-03-10T12:38:01.127 INFO:tasks.workunit.client.0.vm00.stdout:1/613: write f3 [772715,13974] 0 2026-03-10T12:38:01.128 INFO:tasks.workunit.client.0.vm00.stdout:7/455: write da/d41/f72 [795456,111112] 0 2026-03-10T12:38:01.131 
INFO:tasks.workunit.client.0.vm00.stdout:5/657: stat d1f/d26/d2e/c8f 0 2026-03-10T12:38:01.152 INFO:tasks.workunit.client.1.vm07.stdout:1/516: symlink d9/df/d29/d2b/d31/d91/d59/laa 0 2026-03-10T12:38:01.152 INFO:tasks.workunit.client.0.vm00.stdout:6/443: write d2/f68 [1003728,73190] 0 2026-03-10T12:38:01.159 INFO:tasks.workunit.client.0.vm00.stdout:0/559: creat d3/db/fbc x:0 0 0 2026-03-10T12:38:01.161 INFO:tasks.workunit.client.1.vm07.stdout:7/517: dread d0/d61/f93 [0,4194304] 0 2026-03-10T12:38:01.167 INFO:tasks.workunit.client.1.vm07.stdout:7/518: dwrite d0/d61/f64 [0,4194304] 0 2026-03-10T12:38:01.167 INFO:tasks.workunit.client.0.vm00.stdout:6/444: truncate d2/d14/f3b 661473 0 2026-03-10T12:38:01.178 INFO:tasks.workunit.client.0.vm00.stdout:0/560: creat d3/db/d24/d25/fbd x:0 0 0 2026-03-10T12:38:01.182 INFO:tasks.workunit.client.1.vm07.stdout:0/582: getdents d0/d14/d5f/d76/d2f 0 2026-03-10T12:38:01.184 INFO:tasks.workunit.client.0.vm00.stdout:6/445: mkdir d2/d42/d9c 0 2026-03-10T12:38:01.190 INFO:tasks.workunit.client.1.vm07.stdout:3/577: mkdir dc/dd/d1f/dc7/dc9 0 2026-03-10T12:38:01.197 INFO:tasks.workunit.client.0.vm00.stdout:0/561: chown d3/d7/d3c/ca3 1066217 1 2026-03-10T12:38:01.197 INFO:tasks.workunit.client.0.vm00.stdout:0/562: chown d3/l28 5675929 1 2026-03-10T12:38:01.198 INFO:tasks.workunit.client.0.vm00.stdout:0/563: write d3/db/d24/d25/fbd [26925,100922] 0 2026-03-10T12:38:01.201 INFO:tasks.workunit.client.0.vm00.stdout:6/446: mkdir d2/d42/d80/d9d 0 2026-03-10T12:38:01.201 INFO:tasks.workunit.client.0.vm00.stdout:7/456: dwrite da/d25/d2c/d82/d68/f38 [4194304,4194304] 0 2026-03-10T12:38:01.203 INFO:tasks.workunit.client.1.vm07.stdout:5/559: dread d0/d22/d18/d19/d2e/f59 [4194304,4194304] 0 2026-03-10T12:38:01.206 INFO:tasks.workunit.client.1.vm07.stdout:3/578: dread dc/dd/f29 [0,4194304] 0 2026-03-10T12:38:01.224 INFO:tasks.workunit.client.0.vm00.stdout:1/614: dread da/d12/d26/f2e [0,4194304] 0 2026-03-10T12:38:01.225 
INFO:tasks.workunit.client.0.vm00.stdout:0/564: creat d3/d7/d4c/d5b/d38/db3/fbe x:0 0 0 2026-03-10T12:38:01.226 INFO:tasks.workunit.client.0.vm00.stdout:6/447: write d2/d16/d74/f5a [1496633,48005] 0 2026-03-10T12:38:01.227 INFO:tasks.workunit.client.0.vm00.stdout:0/565: write d3/d40/d65/fa8 [202686,114128] 0 2026-03-10T12:38:01.227 INFO:tasks.workunit.client.1.vm07.stdout:4/651: unlink d0/d4/d10/l9f 0 2026-03-10T12:38:01.228 INFO:tasks.workunit.client.1.vm07.stdout:6/508: creat d1/d4/d6/d43/d88/d97/fa2 x:0 0 0 2026-03-10T12:38:01.229 INFO:tasks.workunit.client.1.vm07.stdout:8/520: rmdir d1/d3/d5d/d65 39 2026-03-10T12:38:01.229 INFO:tasks.workunit.client.1.vm07.stdout:6/509: readlink d1/d4/d71/l9a 0 2026-03-10T12:38:01.230 INFO:tasks.workunit.client.1.vm07.stdout:4/652: fdatasync d0/d4/d7a/d46/d76/fa2 0 2026-03-10T12:38:01.232 INFO:tasks.workunit.client.0.vm00.stdout:1/615: dwrite da/d12/d26/f31 [0,4194304] 0 2026-03-10T12:38:01.234 INFO:tasks.workunit.client.0.vm00.stdout:1/616: chown da/d21/db3/d59/l85 8772 1 2026-03-10T12:38:01.252 INFO:tasks.workunit.client.0.vm00.stdout:7/457: rename da/d25/d2c/l86 to da/d26/laa 0 2026-03-10T12:38:01.252 INFO:tasks.workunit.client.0.vm00.stdout:6/448: fsync d2/da/dc/d2f/f4f 0 2026-03-10T12:38:01.255 INFO:tasks.workunit.client.0.vm00.stdout:7/458: dread da/d3f/d60/f85 [0,4194304] 0 2026-03-10T12:38:01.265 INFO:tasks.workunit.client.0.vm00.stdout:7/459: creat da/d41/d48/d81/fab x:0 0 0 2026-03-10T12:38:01.265 INFO:tasks.workunit.client.1.vm07.stdout:9/601: rmdir d5/d13/d57/d3e/d85 0 2026-03-10T12:38:01.265 INFO:tasks.workunit.client.1.vm07.stdout:0/583: rename d0/d83 to d0/d14/d5f/d3b/dbc 0 2026-03-10T12:38:01.265 INFO:tasks.workunit.client.1.vm07.stdout:9/602: chown d5/d13/d57/d3e/c72 946980 1 2026-03-10T12:38:01.265 INFO:tasks.workunit.client.1.vm07.stdout:5/560: mkdir d0/d22/d18/d19/d21/dc2 0 2026-03-10T12:38:01.265 INFO:tasks.workunit.client.1.vm07.stdout:5/561: chown d0/d22 267601816 1 2026-03-10T12:38:01.265 
INFO:tasks.workunit.client.1.vm07.stdout:8/521: creat d1/d3/d40/d92/fa6 x:0 0 0 2026-03-10T12:38:01.267 INFO:tasks.workunit.client.0.vm00.stdout:6/449: mkdir d2/d42/d80/d9d/d9e 0 2026-03-10T12:38:01.269 INFO:tasks.workunit.client.0.vm00.stdout:7/460: creat da/d26/d50/d73/d89/fac x:0 0 0 2026-03-10T12:38:01.270 INFO:tasks.workunit.client.0.vm00.stdout:7/461: chown da/d1b/f39 752746 1 2026-03-10T12:38:01.270 INFO:tasks.workunit.client.0.vm00.stdout:7/462: chown da/d26/d37/f79 1904450950 1 2026-03-10T12:38:01.274 INFO:tasks.workunit.client.1.vm07.stdout:1/517: unlink d9/l69 0 2026-03-10T12:38:01.274 INFO:tasks.workunit.client.0.vm00.stdout:7/463: chown da/d3f/l8f 376363 1 2026-03-10T12:38:01.274 INFO:tasks.workunit.client.0.vm00.stdout:7/464: readlink da/d25/d2c/l31 0 2026-03-10T12:38:01.275 INFO:tasks.workunit.client.1.vm07.stdout:0/584: symlink d0/d14/d5f/d76/d2f/d31/d79/d85/lbd 0 2026-03-10T12:38:01.275 INFO:tasks.workunit.client.1.vm07.stdout:5/562: mknod d0/d22/d18/d19/d21/d3a/cc3 0 2026-03-10T12:38:01.279 INFO:tasks.workunit.client.0.vm00.stdout:7/465: symlink da/d41/d48/d81/lad 0 2026-03-10T12:38:01.279 INFO:tasks.workunit.client.0.vm00.stdout:7/466: chown da/d41/d7b 499 1 2026-03-10T12:38:01.280 INFO:tasks.workunit.client.1.vm07.stdout:1/518: creat d9/d2d/d4f/d75/fab x:0 0 0 2026-03-10T12:38:01.282 INFO:tasks.workunit.client.1.vm07.stdout:0/585: creat d0/d14/d5f/d3b/dbc/fbe x:0 0 0 2026-03-10T12:38:01.283 INFO:tasks.workunit.client.0.vm00.stdout:1/617: sync 2026-03-10T12:38:01.283 INFO:tasks.workunit.client.0.vm00.stdout:1/618: symlink da/d21/d27/d6a/lce 2 2026-03-10T12:38:01.284 INFO:tasks.workunit.client.1.vm07.stdout:6/510: getdents d1/d4/d6/d46/d4d 0 2026-03-10T12:38:01.292 INFO:tasks.workunit.client.1.vm07.stdout:4/653: sync 2026-03-10T12:38:01.296 INFO:tasks.workunit.client.1.vm07.stdout:9/603: sync 2026-03-10T12:38:01.296 INFO:tasks.workunit.client.1.vm07.stdout:8/522: sync 2026-03-10T12:38:01.305 INFO:tasks.workunit.client.1.vm07.stdout:1/519: rename 
d9/df/d29/c71 to d9/df/d29/d2b/d92/d9d/cac 0 2026-03-10T12:38:01.305 INFO:tasks.workunit.client.1.vm07.stdout:0/586: fsync d0/d14/d5f/d41/d86/f96 0 2026-03-10T12:38:01.306 INFO:tasks.workunit.client.1.vm07.stdout:5/563: creat d0/d22/d18/d3e/d5d/db6/fc4 x:0 0 0 2026-03-10T12:38:01.313 INFO:tasks.workunit.client.1.vm07.stdout:6/511: mkdir d1/d4/d6/d53/da3 0 2026-03-10T12:38:01.318 INFO:tasks.workunit.client.1.vm07.stdout:8/523: unlink d1/c49 0 2026-03-10T12:38:01.320 INFO:tasks.workunit.client.1.vm07.stdout:4/654: dwrite d0/d4/d5/da/f15 [4194304,4194304] 0 2026-03-10T12:38:01.324 INFO:tasks.workunit.client.0.vm00.stdout:2/616: unlink d4/d6/cb6 0 2026-03-10T12:38:01.325 INFO:tasks.workunit.client.0.vm00.stdout:2/617: fdatasync d4/d6/f30 0 2026-03-10T12:38:01.326 INFO:tasks.workunit.client.1.vm07.stdout:1/520: fdatasync d9/f36 0 2026-03-10T12:38:01.328 INFO:tasks.workunit.client.0.vm00.stdout:2/618: creat d4/dd/da7/fd2 x:0 0 0 2026-03-10T12:38:01.334 INFO:tasks.workunit.client.1.vm07.stdout:0/587: truncate d0/d14/d5f/d76/d2f/d31/d4f/f70 770134 0 2026-03-10T12:38:01.334 INFO:tasks.workunit.client.0.vm00.stdout:9/661: rmdir d0/d3d/d59/d4e/dba/d1e/d85/d98 39 2026-03-10T12:38:01.336 INFO:tasks.workunit.client.0.vm00.stdout:2/619: fsync d4/dd/db9/f96 0 2026-03-10T12:38:01.337 INFO:tasks.workunit.client.1.vm07.stdout:5/564: symlink d0/d22/d18/d19/d21/d54/lc5 0 2026-03-10T12:38:01.339 INFO:tasks.workunit.client.0.vm00.stdout:9/662: fsync d0/d3d/d43/d53/fd1 0 2026-03-10T12:38:01.343 INFO:tasks.workunit.client.1.vm07.stdout:6/512: chown d1/d4/d6/d16/d49/f67 2434403 1 2026-03-10T12:38:01.348 INFO:tasks.workunit.client.1.vm07.stdout:9/604: symlink d5/d13/d9d/ld2 0 2026-03-10T12:38:01.349 INFO:tasks.workunit.client.0.vm00.stdout:8/476: write d0/f9 [3413208,56532] 0 2026-03-10T12:38:01.352 INFO:tasks.workunit.client.0.vm00.stdout:9/663: sync 2026-03-10T12:38:01.357 INFO:tasks.workunit.client.0.vm00.stdout:9/664: fdatasync d0/d3d/d59/d4e/dba/d1e/d85/d98/fa7 0 2026-03-10T12:38:01.368 
INFO:tasks.workunit.client.1.vm07.stdout:2/442: dwrite d0/d42/f53 [0,4194304] 0 2026-03-10T12:38:01.369 INFO:tasks.workunit.client.1.vm07.stdout:2/443: write d0/d42/f53 [3277187,105607] 0 2026-03-10T12:38:01.369 INFO:tasks.workunit.client.0.vm00.stdout:4/608: rmdir df/d8a 39 2026-03-10T12:38:01.371 INFO:tasks.workunit.client.0.vm00.stdout:9/665: getdents d0/d3d/d43 0 2026-03-10T12:38:01.372 INFO:tasks.workunit.client.0.vm00.stdout:9/666: write d0/d7f/db8/dc4/db0/fbf [27595,89151] 0 2026-03-10T12:38:01.375 INFO:tasks.workunit.client.0.vm00.stdout:3/647: unlink dd/c9a 0 2026-03-10T12:38:01.376 INFO:tasks.workunit.client.0.vm00.stdout:4/609: rmdir df/d1f/d36/d3a/d41 39 2026-03-10T12:38:01.377 INFO:tasks.workunit.client.1.vm07.stdout:8/524: rename d1/d3/d18/f75 to d1/d3/d6c/fa7 0 2026-03-10T12:38:01.377 INFO:tasks.workunit.client.0.vm00.stdout:3/648: truncate dd/d64/fa4 778900 0 2026-03-10T12:38:01.379 INFO:tasks.workunit.client.0.vm00.stdout:4/610: read df/f1b [1350852,5077] 0 2026-03-10T12:38:01.380 INFO:tasks.workunit.client.0.vm00.stdout:9/667: symlink d0/d3d/d59/d4e/dba/d1e/dcb/led 0 2026-03-10T12:38:01.381 INFO:tasks.workunit.client.0.vm00.stdout:3/649: mkdir dd/d18/d13/d99/dd9 0 2026-03-10T12:38:01.391 INFO:tasks.workunit.client.0.vm00.stdout:5/658: write d1f/d26/d2b/d35/d78/fc7 [256945,98551] 0 2026-03-10T12:38:01.395 INFO:tasks.workunit.client.1.vm07.stdout:9/605: fsync d5/f8 0 2026-03-10T12:38:01.397 INFO:tasks.workunit.client.1.vm07.stdout:1/521: symlink d9/d2d/d4f/d75/d77/da7/lad 0 2026-03-10T12:38:01.403 INFO:tasks.workunit.client.0.vm00.stdout:4/611: rmdir df/d6c/dca 0 2026-03-10T12:38:01.412 INFO:tasks.workunit.client.0.vm00.stdout:1/619: write da/d21/d27/d6a/f6d [1036887,15838] 0 2026-03-10T12:38:01.412 INFO:tasks.workunit.client.1.vm07.stdout:8/525: creat d1/d3/d6/d54/fa8 x:0 0 0 2026-03-10T12:38:01.418 INFO:tasks.workunit.client.0.vm00.stdout:8/477: dread d0/f9 [0,4194304] 0 2026-03-10T12:38:01.419 INFO:tasks.workunit.client.0.vm00.stdout:8/478: 
readlink d0/d93/d2d/d49/l5a 0 2026-03-10T12:38:01.422 INFO:tasks.workunit.client.1.vm07.stdout:7/519: write d0/d47/f8e [736285,27868] 0 2026-03-10T12:38:01.425 INFO:tasks.workunit.client.0.vm00.stdout:4/612: unlink df/d1f/d36/d3a/f6e 0 2026-03-10T12:38:01.425 INFO:tasks.workunit.client.0.vm00.stdout:4/613: chown df/d32/l4a 112802 1 2026-03-10T12:38:01.431 INFO:tasks.workunit.client.0.vm00.stdout:5/659: dread d1f/d26/d2b/d35/d53/d72/d9d/d8e/fc1 [0,4194304] 0 2026-03-10T12:38:01.433 INFO:tasks.workunit.client.0.vm00.stdout:0/566: truncate d3/d40/d65/f8f 4115913 0 2026-03-10T12:38:01.433 INFO:tasks.workunit.client.1.vm07.stdout:9/606: fdatasync d5/f91 0 2026-03-10T12:38:01.437 INFO:tasks.workunit.client.1.vm07.stdout:5/565: dread d0/d22/d18/d19/d2e/f52 [0,4194304] 0 2026-03-10T12:38:01.440 INFO:tasks.workunit.client.0.vm00.stdout:5/660: dread d1f/d26/d2b/d37/f38 [0,4194304] 0 2026-03-10T12:38:01.447 INFO:tasks.workunit.client.0.vm00.stdout:5/661: dwrite d1f/d26/d2b/d35/f68 [0,4194304] 0 2026-03-10T12:38:01.450 INFO:tasks.workunit.client.0.vm00.stdout:5/662: truncate d1f/d26/d2b/f44 973402 0 2026-03-10T12:38:01.455 INFO:tasks.workunit.client.1.vm07.stdout:8/526: symlink d1/d3/d6/d7b/la9 0 2026-03-10T12:38:01.458 INFO:tasks.workunit.client.1.vm07.stdout:7/520: creat d0/d67/d6f/d80/fac x:0 0 0 2026-03-10T12:38:01.460 INFO:tasks.workunit.client.1.vm07.stdout:3/579: write dc/d18/f36 [1145702,6436] 0 2026-03-10T12:38:01.460 INFO:tasks.workunit.client.1.vm07.stdout:8/527: stat d1/d3/d6/d50/d70 0 2026-03-10T12:38:01.463 INFO:tasks.workunit.client.1.vm07.stdout:0/588: dread d0/d14/d5f/f54 [0,4194304] 0 2026-03-10T12:38:01.466 INFO:tasks.workunit.client.0.vm00.stdout:2/620: dwrite d4/d6/dca/f3f [0,4194304] 0 2026-03-10T12:38:01.475 INFO:tasks.workunit.client.0.vm00.stdout:3/650: write dd/d3d/f50 [97537,36176] 0 2026-03-10T12:38:01.475 INFO:tasks.workunit.client.0.vm00.stdout:9/668: write d0/d7f/db8/dc4/f4f [3296455,85856] 0 2026-03-10T12:38:01.476 
INFO:tasks.workunit.client.0.vm00.stdout:3/651: chown dd/d64/cbb 1 1 2026-03-10T12:38:01.476 INFO:tasks.workunit.client.0.vm00.stdout:1/620: write da/d21/d39/f89 [37992,39576] 0 2026-03-10T12:38:01.477 INFO:tasks.workunit.client.0.vm00.stdout:3/652: read dd/d4e/faa [1390915,30269] 0 2026-03-10T12:38:01.482 INFO:tasks.workunit.client.1.vm07.stdout:9/607: creat d5/d1f/fd3 x:0 0 0 2026-03-10T12:38:01.484 INFO:tasks.workunit.client.0.vm00.stdout:6/450: chown d2/d51 1422 1 2026-03-10T12:38:01.489 INFO:tasks.workunit.client.1.vm07.stdout:9/608: dread - d5/d1f/d7d/fcc zero size 2026-03-10T12:38:01.490 INFO:tasks.workunit.client.1.vm07.stdout:5/566: mknod d0/d22/d18/d3e/d5d/cc6 0 2026-03-10T12:38:01.491 INFO:tasks.workunit.client.0.vm00.stdout:8/479: mknod d0/d93/d36/d5b/c97 0 2026-03-10T12:38:01.495 INFO:tasks.workunit.client.0.vm00.stdout:4/614: mknod df/d1f/ccc 0 2026-03-10T12:38:01.497 INFO:tasks.workunit.client.1.vm07.stdout:6/513: dread d1/d4/d6/f30 [0,4194304] 0 2026-03-10T12:38:01.499 INFO:tasks.workunit.client.0.vm00.stdout:7/467: write da/d1b/f22 [2380478,10896] 0 2026-03-10T12:38:01.505 INFO:tasks.workunit.client.0.vm00.stdout:7/468: dwrite da/f35 [0,4194304] 0 2026-03-10T12:38:01.507 INFO:tasks.workunit.client.0.vm00.stdout:7/469: dread - da/d25/d2e/f9c zero size 2026-03-10T12:38:01.510 INFO:tasks.workunit.client.0.vm00.stdout:5/663: mkdir d1f/d26/d2e/d58/d6b/deb 0 2026-03-10T12:38:01.515 INFO:tasks.workunit.client.1.vm07.stdout:7/521: creat d0/d47/d48/fad x:0 0 0 2026-03-10T12:38:01.516 INFO:tasks.workunit.client.1.vm07.stdout:6/514: dwrite d1/d4/d6/d16/d1a/d2c/f78 [0,4194304] 0 2026-03-10T12:38:01.533 INFO:tasks.workunit.client.1.vm07.stdout:4/655: write d0/d4/d10/d5f/fb6 [600416,42051] 0 2026-03-10T12:38:01.534 INFO:tasks.workunit.client.1.vm07.stdout:8/528: rmdir d1/d3/d6/d54 39 2026-03-10T12:38:01.534 INFO:tasks.workunit.client.1.vm07.stdout:8/529: chown d1/d3/f1f 1 1 2026-03-10T12:38:01.538 INFO:tasks.workunit.client.0.vm00.stdout:1/621: creat 
da/d21/d27/d6a/d94/fcf x:0 0 0 2026-03-10T12:38:01.539 INFO:tasks.workunit.client.0.vm00.stdout:1/622: write da/d21/d27/f6e [3048397,66471] 0 2026-03-10T12:38:01.552 INFO:tasks.workunit.client.0.vm00.stdout:9/669: truncate d0/d3d/d59/d4e/dba/d19/fb1 1002683 0 2026-03-10T12:38:01.554 INFO:tasks.workunit.client.0.vm00.stdout:9/670: write d0/d7f/db8/fc6 [666186,21736] 0 2026-03-10T12:38:01.554 INFO:tasks.workunit.client.0.vm00.stdout:0/567: dread d3/f4 [0,4194304] 0 2026-03-10T12:38:01.555 INFO:tasks.workunit.client.0.vm00.stdout:8/480: creat d0/d93/d60/f98 x:0 0 0 2026-03-10T12:38:01.557 INFO:tasks.workunit.client.0.vm00.stdout:4/615: chown df/d1f/d22/d26/d65/f8e 46707 1 2026-03-10T12:38:01.563 INFO:tasks.workunit.client.0.vm00.stdout:5/664: mknod d1f/d26/d2b/d35/d53/dd6/cec 0 2026-03-10T12:38:01.574 INFO:tasks.workunit.client.0.vm00.stdout:9/671: rmdir d0/d7f/db8/dc4 39 2026-03-10T12:38:01.577 INFO:tasks.workunit.client.0.vm00.stdout:9/672: dwrite d0/d3d/d59/d4e/dba/d19/f7d [0,4194304] 0 2026-03-10T12:38:01.579 INFO:tasks.workunit.client.0.vm00.stdout:8/481: symlink d0/d93/d36/d5b/l99 0 2026-03-10T12:38:01.580 INFO:tasks.workunit.client.0.vm00.stdout:4/616: truncate df/d1f/d22/d26/dab/f89 122883 0 2026-03-10T12:38:01.587 INFO:tasks.workunit.client.0.vm00.stdout:2/621: rename d4/d53/d76/d9b/dad/dbc to d4/d6/d2d/d3a/dd3 0 2026-03-10T12:38:01.590 INFO:tasks.workunit.client.0.vm00.stdout:5/665: dread d1f/d26/d2b/d37/f4c [0,4194304] 0 2026-03-10T12:38:01.591 INFO:tasks.workunit.client.0.vm00.stdout:5/666: write d1f/d26/d2e/fa5 [73382,19982] 0 2026-03-10T12:38:01.593 INFO:tasks.workunit.client.0.vm00.stdout:9/673: creat d0/d3d/d59/d4e/dba/d1e/d2b/fee x:0 0 0 2026-03-10T12:38:01.594 INFO:tasks.workunit.client.1.vm07.stdout:0/589: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/fbf x:0 0 0 2026-03-10T12:38:01.597 INFO:tasks.workunit.client.0.vm00.stdout:2/622: creat d4/dd/d63/fd4 x:0 0 0 2026-03-10T12:38:01.598 INFO:tasks.workunit.client.0.vm00.stdout:9/674: read - 
d0/d3d/d59/d4e/f70 zero size 2026-03-10T12:38:01.599 INFO:tasks.workunit.client.0.vm00.stdout:9/675: chown d0/d3d/d59/d4e/dba/d1e/d27/c7a 55 1 2026-03-10T12:38:01.601 INFO:tasks.workunit.client.0.vm00.stdout:1/623: link da/d12/d26/f2e da/d12/d26/fd0 0 2026-03-10T12:38:01.602 INFO:tasks.workunit.client.0.vm00.stdout:6/451: mkdir d2/d9f 0 2026-03-10T12:38:01.603 INFO:tasks.workunit.client.0.vm00.stdout:5/667: creat d1f/d26/d2b/d37/dcc/fed x:0 0 0 2026-03-10T12:38:01.604 INFO:tasks.workunit.client.0.vm00.stdout:8/482: creat d0/dd/f9a x:0 0 0 2026-03-10T12:38:01.605 INFO:tasks.workunit.client.0.vm00.stdout:2/623: mkdir d4/d6/d2d/d3a/d43/dd5 0 2026-03-10T12:38:01.609 INFO:tasks.workunit.client.0.vm00.stdout:2/624: read d4/d53/f7d [227287,37529] 0 2026-03-10T12:38:01.610 INFO:tasks.workunit.client.0.vm00.stdout:5/668: creat d1f/d26/d2e/d58/d6b/d86/fee x:0 0 0 2026-03-10T12:38:01.614 INFO:tasks.workunit.client.0.vm00.stdout:8/483: mkdir d0/d46/d6e/d9b 0 2026-03-10T12:38:01.615 INFO:tasks.workunit.client.1.vm07.stdout:5/567: dread d0/d22/d18/d19/d2e/f88 [0,4194304] 0 2026-03-10T12:38:01.615 INFO:tasks.workunit.client.0.vm00.stdout:2/625: mknod d4/d78/cd6 0 2026-03-10T12:38:01.616 INFO:tasks.workunit.client.0.vm00.stdout:8/484: stat d0/d46/l54 0 2026-03-10T12:38:01.619 INFO:tasks.workunit.client.0.vm00.stdout:5/669: creat d1f/d26/d2e/d58/d6b/deb/fef x:0 0 0 2026-03-10T12:38:01.620 INFO:tasks.workunit.client.0.vm00.stdout:2/626: fsync d4/d6/d2d/d3a/f44 0 2026-03-10T12:38:01.621 INFO:tasks.workunit.client.0.vm00.stdout:2/627: write d4/d6/d93/fbf [1027607,74741] 0 2026-03-10T12:38:01.621 INFO:tasks.workunit.client.0.vm00.stdout:8/485: dwrite d0/d93/d17/d48/f4c [0,4194304] 0 2026-03-10T12:38:01.629 INFO:tasks.workunit.client.0.vm00.stdout:3/653: dread dd/d27/f44 [0,4194304] 0 2026-03-10T12:38:01.630 INFO:tasks.workunit.client.0.vm00.stdout:3/654: stat l6 0 2026-03-10T12:38:01.631 INFO:tasks.workunit.client.0.vm00.stdout:3/655: fsync dd/d27/d2c/d34/d38/f48 0 
2026-03-10T12:38:01.631 INFO:tasks.workunit.client.0.vm00.stdout:5/670: mkdir d1f/d96/dbd/df0 0 2026-03-10T12:38:01.637 INFO:tasks.workunit.client.0.vm00.stdout:8/486: dwrite d0/f8 [0,4194304] 0 2026-03-10T12:38:01.637 INFO:tasks.workunit.client.0.vm00.stdout:3/656: dwrite dd/d18/d14/d2b/f8d [0,4194304] 0 2026-03-10T12:38:01.641 INFO:tasks.workunit.client.0.vm00.stdout:3/657: fdatasync dd/d27/d2c/d34/fd8 0 2026-03-10T12:38:01.645 INFO:tasks.workunit.client.0.vm00.stdout:2/628: sync 2026-03-10T12:38:01.653 INFO:tasks.workunit.client.0.vm00.stdout:3/658: creat dd/d18/d13/d1d/d43/d55/fda x:0 0 0 2026-03-10T12:38:01.656 INFO:tasks.workunit.client.0.vm00.stdout:3/659: dwrite dd/d27/d2c/fb1 [0,4194304] 0 2026-03-10T12:38:01.659 INFO:tasks.workunit.client.0.vm00.stdout:8/487: rename d0/l6 to d0/d93/d17/d48/l9c 0 2026-03-10T12:38:01.659 INFO:tasks.workunit.client.0.vm00.stdout:5/671: creat d1f/d26/d2e/ff1 x:0 0 0 2026-03-10T12:38:01.666 INFO:tasks.workunit.client.0.vm00.stdout:2/629: rename d4/d53/d68/f8a to d4/d6/d93/dc6/fd7 0 2026-03-10T12:38:01.673 INFO:tasks.workunit.client.0.vm00.stdout:3/660: creat dd/d2a/da2/db4/fdb x:0 0 0 2026-03-10T12:38:01.674 INFO:tasks.workunit.client.0.vm00.stdout:3/661: readlink dd/d4e/d6a/l82 0 2026-03-10T12:38:01.675 INFO:tasks.workunit.client.0.vm00.stdout:2/630: rename d4/dd/lb3 to d4/d6/d93/ld8 0 2026-03-10T12:38:01.676 INFO:tasks.workunit.client.0.vm00.stdout:2/631: read d4/d6/d2d/d3a/d43/fa1 [41022,85037] 0 2026-03-10T12:38:01.680 INFO:tasks.workunit.client.0.vm00.stdout:8/488: link d0/dd/d38/f3d d0/f9d 0 2026-03-10T12:38:01.685 INFO:tasks.workunit.client.0.vm00.stdout:2/632: mkdir d4/d53/d68/dc2/dd9 0 2026-03-10T12:38:01.686 INFO:tasks.workunit.client.0.vm00.stdout:2/633: stat d4/d6/c2a 0 2026-03-10T12:38:01.688 INFO:tasks.workunit.client.0.vm00.stdout:2/634: symlink d4/dd/lda 0 2026-03-10T12:38:01.689 INFO:tasks.workunit.client.0.vm00.stdout:2/635: unlink d4/d53/lab 0 2026-03-10T12:38:01.690 
INFO:tasks.workunit.client.0.vm00.stdout:2/636: chown d4/c7f 8919 1 2026-03-10T12:38:01.694 INFO:tasks.workunit.client.0.vm00.stdout:2/637: dwrite d4/d6/d2d/d3a/dd3/fbe [0,4194304] 0 2026-03-10T12:38:01.698 INFO:tasks.workunit.client.0.vm00.stdout:8/489: sync 2026-03-10T12:38:01.700 INFO:tasks.workunit.client.0.vm00.stdout:8/490: creat d0/dd/f9e x:0 0 0 2026-03-10T12:38:01.701 INFO:tasks.workunit.client.0.vm00.stdout:8/491: write d0/dd/f4d [4294611,112779] 0 2026-03-10T12:38:01.707 INFO:tasks.workunit.client.0.vm00.stdout:2/638: getdents d4/d53/d76/d9b/dad 0 2026-03-10T12:38:01.708 INFO:tasks.workunit.client.0.vm00.stdout:4/617: write df/f16 [1108752,14132] 0 2026-03-10T12:38:01.709 INFO:tasks.workunit.client.0.vm00.stdout:4/618: write df/d1f/d36/f51 [1623701,123288] 0 2026-03-10T12:38:01.710 INFO:tasks.workunit.client.0.vm00.stdout:1/624: write da/d21/d39/f8c [1229204,77852] 0 2026-03-10T12:38:01.711 INFO:tasks.workunit.client.0.vm00.stdout:1/625: write da/d24/d5a/f75 [3343014,31240] 0 2026-03-10T12:38:01.712 INFO:tasks.workunit.client.0.vm00.stdout:1/626: read - da/d21/db3/d5d/d80/fcc zero size 2026-03-10T12:38:01.717 INFO:tasks.workunit.client.0.vm00.stdout:2/639: creat d4/d6/d93/fdb x:0 0 0 2026-03-10T12:38:01.724 INFO:tasks.workunit.client.0.vm00.stdout:4/619: dwrite df/d93/dbc/fc3 [0,4194304] 0 2026-03-10T12:38:01.731 INFO:tasks.workunit.client.0.vm00.stdout:8/492: creat d0/dd/f9f x:0 0 0 2026-03-10T12:38:01.731 INFO:tasks.workunit.client.0.vm00.stdout:8/493: readlink d0/dd/l57 0 2026-03-10T12:38:01.732 INFO:tasks.workunit.client.0.vm00.stdout:8/494: chown d0/d93/d43/f6a 608678 1 2026-03-10T12:38:01.732 INFO:tasks.workunit.client.0.vm00.stdout:2/640: rmdir d4/d6/d2d/d3a/d43 39 2026-03-10T12:38:01.735 INFO:tasks.workunit.client.0.vm00.stdout:1/627: mknod da/d21/d27/cd1 0 2026-03-10T12:38:01.736 INFO:tasks.workunit.client.0.vm00.stdout:1/628: dread - da/d21/db3/d5d/d72/d7e/fac zero size 2026-03-10T12:38:01.736 INFO:tasks.workunit.client.0.vm00.stdout:2/641: 
write d4/d6/d2d/d3a/d43/d85/f8f [926370,34091] 0 2026-03-10T12:38:01.738 INFO:tasks.workunit.client.0.vm00.stdout:4/620: sync 2026-03-10T12:38:01.738 INFO:tasks.workunit.client.0.vm00.stdout:1/629: truncate da/d12/f1d 4729963 0 2026-03-10T12:38:01.738 INFO:tasks.workunit.client.0.vm00.stdout:1/630: readlink da/d24/l2f 0 2026-03-10T12:38:01.741 INFO:tasks.workunit.client.0.vm00.stdout:4/621: write df/d1f/d36/f6f [252857,74183] 0 2026-03-10T12:38:01.742 INFO:tasks.workunit.client.0.vm00.stdout:7/470: creat da/d41/d48/fae x:0 0 0 2026-03-10T12:38:01.743 INFO:tasks.workunit.client.0.vm00.stdout:4/622: dread - df/d1f/d22/d26/f31 zero size 2026-03-10T12:38:01.743 INFO:tasks.workunit.client.0.vm00.stdout:7/471: readlink da/d1b/l6d 0 2026-03-10T12:38:01.748 INFO:tasks.workunit.client.1.vm07.stdout:2/444: dwrite d0/d42/d26/f5a [0,4194304] 0 2026-03-10T12:38:01.748 INFO:tasks.workunit.client.0.vm00.stdout:7/472: dwrite f9 [0,4194304] 0 2026-03-10T12:38:01.750 INFO:tasks.workunit.client.0.vm00.stdout:7/473: chown da/d41 28 1 2026-03-10T12:38:01.751 INFO:tasks.workunit.client.0.vm00.stdout:7/474: dread - da/d41/d7b/d9d/fa8 zero size 2026-03-10T12:38:01.762 INFO:tasks.workunit.client.0.vm00.stdout:8/495: dread d0/d93/d43/f6a [0,4194304] 0 2026-03-10T12:38:01.767 INFO:tasks.workunit.client.0.vm00.stdout:1/631: chown da/d24/l3f 547 1 2026-03-10T12:38:01.767 INFO:tasks.workunit.client.0.vm00.stdout:4/623: creat df/d1f/d22/d26/d70/fcd x:0 0 0 2026-03-10T12:38:01.770 INFO:tasks.workunit.client.0.vm00.stdout:4/624: unlink df/d1f/d22/d26/d70/fcd 0 2026-03-10T12:38:01.773 INFO:tasks.workunit.client.0.vm00.stdout:4/625: dwrite df/d32/d76/fc2 [0,4194304] 0 2026-03-10T12:38:01.775 INFO:tasks.workunit.client.1.vm07.stdout:7/522: truncate d0/d47/f58 141388 0 2026-03-10T12:38:01.782 INFO:tasks.workunit.client.0.vm00.stdout:1/632: sync 2026-03-10T12:38:01.786 INFO:tasks.workunit.client.1.vm07.stdout:6/515: truncate d1/d4/d6/d16/f50 429550 0 2026-03-10T12:38:01.788 
INFO:tasks.workunit.client.0.vm00.stdout:4/626: rmdir df/d1f/d36/dc6 39 2026-03-10T12:38:01.801 INFO:tasks.workunit.client.0.vm00.stdout:4/627: symlink df/d1f/d22/d26/d65/d91/lce 0 2026-03-10T12:38:01.805 INFO:tasks.workunit.client.0.vm00.stdout:4/628: dread - df/d1f/d22/d26/d70/fb4 zero size 2026-03-10T12:38:01.828 INFO:tasks.workunit.client.0.vm00.stdout:8/496: dread d0/dd/d38/f3d [0,4194304] 0 2026-03-10T12:38:01.831 INFO:tasks.workunit.client.0.vm00.stdout:7/475: mknod da/d41/d48/caf 0 2026-03-10T12:38:01.831 INFO:tasks.workunit.client.0.vm00.stdout:7/476: write da/d3f/f93 [61458,94028] 0 2026-03-10T12:38:01.831 INFO:tasks.workunit.client.0.vm00.stdout:8/497: dwrite d0/dd/f9a [0,4194304] 0 2026-03-10T12:38:01.852 INFO:tasks.workunit.client.1.vm07.stdout:0/590: dread d0/d14/d7c/f90 [0,4194304] 0 2026-03-10T12:38:01.852 INFO:tasks.workunit.client.1.vm07.stdout:0/591: chown d0/d14/d5f/d76/d2f/d31/d79 123287 1 2026-03-10T12:38:01.856 INFO:tasks.workunit.client.1.vm07.stdout:0/592: dwrite d0/d14/d5f/d76/d2f/d31/d79/d9e/fb1 [0,4194304] 0 2026-03-10T12:38:01.859 INFO:tasks.workunit.client.1.vm07.stdout:3/580: dread dc/d18/d24/f3f [0,4194304] 0 2026-03-10T12:38:01.866 INFO:tasks.workunit.client.0.vm00.stdout:9/676: dwrite d0/d3d/d59/d4e/dba/d19/fb1 [0,4194304] 0 2026-03-10T12:38:01.870 INFO:tasks.workunit.client.1.vm07.stdout:5/568: fdatasync d0/d22/d18/f95 0 2026-03-10T12:38:01.878 INFO:tasks.workunit.client.0.vm00.stdout:7/477: rename da/d47/f49 to da/d41/d7b/fb0 0 2026-03-10T12:38:01.892 INFO:tasks.workunit.client.1.vm07.stdout:1/522: link c4 d9/df/d29/d2b/d3d/cae 0 2026-03-10T12:38:01.894 INFO:tasks.workunit.client.1.vm07.stdout:9/609: rename d5/d13/d57/d3e/fa8 to d5/fd4 0 2026-03-10T12:38:01.917 INFO:tasks.workunit.client.1.vm07.stdout:3/581: truncate dc/d18/fa1 880954 0 2026-03-10T12:38:01.920 INFO:tasks.workunit.client.0.vm00.stdout:8/498: dread d0/d93/d2d/f6f [0,4194304] 0 2026-03-10T12:38:01.924 INFO:tasks.workunit.client.1.vm07.stdout:5/569: write 
d0/d22/d18/d19/d2e/d3f/fb3 [755682,78430] 0 2026-03-10T12:38:01.927 INFO:tasks.workunit.client.1.vm07.stdout:6/516: creat d1/d4/d6/d4e/d64/fa4 x:0 0 0 2026-03-10T12:38:01.927 INFO:tasks.workunit.client.1.vm07.stdout:1/523: creat d9/df/d29/d2b/d31/d91/faf x:0 0 0 2026-03-10T12:38:01.927 INFO:tasks.workunit.client.0.vm00.stdout:8/499: dwrite d0/d93/d43/f6a [0,4194304] 0 2026-03-10T12:38:01.927 INFO:tasks.workunit.client.0.vm00.stdout:8/500: readlink d0/d93/d17/d48/l4f 0 2026-03-10T12:38:01.929 INFO:tasks.workunit.client.0.vm00.stdout:8/501: dwrite d0/d46/f94 [0,4194304] 0 2026-03-10T12:38:01.933 INFO:tasks.workunit.client.0.vm00.stdout:8/502: write d0/dd/d38/d81/f88 [469295,28493] 0 2026-03-10T12:38:01.934 INFO:tasks.workunit.client.0.vm00.stdout:8/503: chown d0/dd 95372 1 2026-03-10T12:38:01.936 INFO:tasks.workunit.client.0.vm00.stdout:8/504: unlink d0/l3f 0 2026-03-10T12:38:01.940 INFO:tasks.workunit.client.1.vm07.stdout:0/593: mknod d0/d14/d5f/d76/d2f/d31/d4f/cc0 0 2026-03-10T12:38:01.943 INFO:tasks.workunit.client.1.vm07.stdout:0/594: fdatasync d0/d14/d7c/f90 0 2026-03-10T12:38:01.944 INFO:tasks.workunit.client.1.vm07.stdout:5/570: mkdir d0/d22/d18/dc7 0 2026-03-10T12:38:01.951 INFO:tasks.workunit.client.1.vm07.stdout:4/656: link d0/d4/d10/d9a/c38 d0/d8e/ce6 0 2026-03-10T12:38:01.957 INFO:tasks.workunit.client.1.vm07.stdout:2/445: getdents d0/d42/d4e/d77 0 2026-03-10T12:38:01.958 INFO:tasks.workunit.client.1.vm07.stdout:4/657: mknod d0/d4/d10/d3c/d2b/d54/ce7 0 2026-03-10T12:38:01.958 INFO:tasks.workunit.client.1.vm07.stdout:4/658: readlink d0/d4/d5/l20 0 2026-03-10T12:38:01.959 INFO:tasks.workunit.client.1.vm07.stdout:4/659: chown d0/d4/d5/da/d66/lc2 428852 1 2026-03-10T12:38:01.988 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:01 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:38:01.988 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:01 vm07.local ceph-mon[58582]: from='mgr.14223 
192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:38:01.988 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:01 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:01.988 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:01 vm07.local ceph-mon[58582]: pgmap v166: 65 pgs: 65 active+clean; 2.4 GiB data, 8.4 GiB used, 112 GiB / 120 GiB avail; 68 MiB/s rd, 168 MiB/s wr, 386 op/s 2026-03-10T12:38:01.989 INFO:tasks.workunit.client.1.vm07.stdout:0/595: sync 2026-03-10T12:38:01.989 INFO:tasks.workunit.client.1.vm07.stdout:6/517: sync 2026-03-10T12:38:01.991 INFO:tasks.workunit.client.1.vm07.stdout:6/518: rename d1/d4/d71/l9a to d1/d4/d6/d16/d1a/d33/la5 0 2026-03-10T12:38:01.992 INFO:tasks.workunit.client.1.vm07.stdout:6/519: chown d1/d4/d6/d16/d1a/f29 2033 1 2026-03-10T12:38:01.994 INFO:tasks.workunit.client.1.vm07.stdout:0/596: symlink d0/d14/d5f/d76/d93/lc1 0 2026-03-10T12:38:02.000 INFO:tasks.workunit.client.1.vm07.stdout:6/520: symlink d1/d4/d6/d16/d49/la6 0 2026-03-10T12:38:02.016 INFO:tasks.workunit.client.1.vm07.stdout:0/597: unlink d0/d14/d5f/d76/d2f/f5d 0 2026-03-10T12:38:02.023 INFO:tasks.workunit.client.0.vm00.stdout:8/505: dread d0/d93/d2d/f33 [0,4194304] 0 2026-03-10T12:38:02.026 INFO:tasks.workunit.client.0.vm00.stdout:5/672: write d1f/d26/d2b/f5c [1039495,79448] 0 2026-03-10T12:38:02.027 INFO:tasks.workunit.client.0.vm00.stdout:8/506: dwrite d0/dd/d38/d81/f88 [0,4194304] 0 2026-03-10T12:38:02.028 INFO:tasks.workunit.client.1.vm07.stdout:0/598: creat d0/d14/d5f/d41/d6a/d74/fc2 x:0 0 0 2026-03-10T12:38:02.036 INFO:tasks.workunit.client.0.vm00.stdout:5/673: dwrite d1f/d26/d2b/d35/d53/d72/fa0 [0,4194304] 0 2026-03-10T12:38:02.037 INFO:tasks.workunit.client.1.vm07.stdout:0/599: rename d0/d14/d5f/d76/d2f to d0/d14/d5f/d76/d2f/d31/d79/d85/dc3 22 2026-03-10T12:38:02.039 INFO:tasks.workunit.client.0.vm00.stdout:5/674: link 
d1f/d26/d2b/d35/d53/d72/lbc d1f/d26/d2b/d35/d78/d7f/lf2 0 2026-03-10T12:38:02.040 INFO:tasks.workunit.client.0.vm00.stdout:5/675: write d1f/d26/d2b/d35/d53/d72/d9d/d8e/fb6 [668232,62011] 0 2026-03-10T12:38:02.042 INFO:tasks.workunit.client.0.vm00.stdout:5/676: symlink d1f/d26/d2b/d35/d78/d99/lf3 0 2026-03-10T12:38:02.043 INFO:tasks.workunit.client.0.vm00.stdout:5/677: readlink d1f/d26/d2b/d35/d53/d72/lbc 0 2026-03-10T12:38:02.047 INFO:tasks.workunit.client.0.vm00.stdout:5/678: truncate d1f/f46 141866 0 2026-03-10T12:38:02.052 INFO:tasks.workunit.client.0.vm00.stdout:5/679: creat d1f/d26/d2b/d37/dc4/ff4 x:0 0 0 2026-03-10T12:38:02.053 INFO:tasks.workunit.client.0.vm00.stdout:5/680: truncate d1f/d26/d2b/d35/fe8 860976 0 2026-03-10T12:38:02.053 INFO:tasks.workunit.client.0.vm00.stdout:5/681: chown d1f/f46 1071495 1 2026-03-10T12:38:02.054 INFO:tasks.workunit.client.0.vm00.stdout:5/682: chown d1f/d26/d2e/f8c 0 1 2026-03-10T12:38:02.063 INFO:tasks.workunit.client.0.vm00.stdout:5/683: link d1f/d26/f79 d1f/d26/d2b/d35/d78/d99/daf/ff5 0 2026-03-10T12:38:02.064 INFO:tasks.workunit.client.0.vm00.stdout:5/684: fdatasync d1f/d26/d2b/d35/f50 0 2026-03-10T12:38:02.068 INFO:tasks.workunit.client.0.vm00.stdout:5/685: dwrite f19 [0,4194304] 0 2026-03-10T12:38:02.074 INFO:tasks.workunit.client.0.vm00.stdout:5/686: creat d1f/d26/d2e/d58/ff6 x:0 0 0 2026-03-10T12:38:02.074 INFO:tasks.workunit.client.0.vm00.stdout:5/687: chown d1f/d6a/d94 76 1 2026-03-10T12:38:02.074 INFO:tasks.workunit.client.0.vm00.stdout:5/688: fdatasync d1f/d6a/f84 0 2026-03-10T12:38:02.078 INFO:tasks.workunit.client.0.vm00.stdout:7/478: dread da/d26/f27 [0,4194304] 0 2026-03-10T12:38:02.084 INFO:tasks.workunit.client.0.vm00.stdout:7/479: write da/d1b/d40/f7d [4106348,43669] 0 2026-03-10T12:38:02.084 INFO:tasks.workunit.client.0.vm00.stdout:5/689: creat d1f/d26/de3/db7/ff7 x:0 0 0 2026-03-10T12:38:02.084 INFO:tasks.workunit.client.0.vm00.stdout:5/690: chown d1f/d26/d2b/d35/d53/lc2 5373 1 2026-03-10T12:38:02.084 
INFO:tasks.workunit.client.0.vm00.stdout:7/480: creat da/d3f/d60/fb1 x:0 0 0 2026-03-10T12:38:02.084 INFO:tasks.workunit.client.0.vm00.stdout:5/691: fdatasync d1f/d26/d2e/d58/ff6 0 2026-03-10T12:38:02.087 INFO:tasks.workunit.client.0.vm00.stdout:5/692: mknod d1f/d26/d2b/d37/cf8 0 2026-03-10T12:38:02.088 INFO:tasks.workunit.client.0.vm00.stdout:5/693: rename d1f/d26/d2e/f71 to d1f/d26/d2b/d35/d53/d72/ff9 0 2026-03-10T12:38:02.089 INFO:tasks.workunit.client.0.vm00.stdout:5/694: truncate d1f/d6a/d94/dc3/fd9 1820758 0 2026-03-10T12:38:02.093 INFO:tasks.workunit.client.0.vm00.stdout:5/695: dwrite d1f/d26/d2b/d35/d53/d72/ff9 [0,4194304] 0 2026-03-10T12:38:02.097 INFO:tasks.workunit.client.0.vm00.stdout:5/696: read d1f/d26/d2b/d37/f8a [2392095,110231] 0 2026-03-10T12:38:02.149 INFO:tasks.workunit.client.0.vm00.stdout:5/697: dread d1f/d6a/d94/dc3/fd9 [0,4194304] 0 2026-03-10T12:38:02.184 INFO:tasks.workunit.client.0.vm00.stdout:3/662: dwrite dd/d27/d2c/d34/f60 [0,4194304] 0 2026-03-10T12:38:02.185 INFO:tasks.workunit.client.0.vm00.stdout:3/663: stat dd/d27/d2c/d34/f60 0 2026-03-10T12:38:02.185 INFO:tasks.workunit.client.0.vm00.stdout:2/642: getdents d4/dd 0 2026-03-10T12:38:02.186 INFO:tasks.workunit.client.0.vm00.stdout:2/643: fsync d4/d6/f16 0 2026-03-10T12:38:02.187 INFO:tasks.workunit.client.0.vm00.stdout:3/664: truncate dd/d3d/d65/f90 551168 0 2026-03-10T12:38:02.189 INFO:tasks.workunit.client.0.vm00.stdout:2/644: symlink d4/d6/d2d/d3a/d43/d85/ldc 0 2026-03-10T12:38:02.190 INFO:tasks.workunit.client.0.vm00.stdout:3/665: creat dd/d18/d13/d1d/d43/fdc x:0 0 0 2026-03-10T12:38:02.193 INFO:tasks.workunit.client.0.vm00.stdout:2/645: mknod d4/d6/d93/dc6/cdd 0 2026-03-10T12:38:02.193 INFO:tasks.workunit.client.0.vm00.stdout:2/646: chown d4/d53 18259055 1 2026-03-10T12:38:02.197 INFO:tasks.workunit.client.0.vm00.stdout:2/647: dwrite d4/d6/d2d/d3a/dd3/fbe [4194304,4194304] 0 2026-03-10T12:38:02.203 INFO:tasks.workunit.client.0.vm00.stdout:3/666: rename dd/d18/d13/l1e to 
dd/d18/d13/d1d/dc6/ldd 0 2026-03-10T12:38:02.204 INFO:tasks.workunit.client.0.vm00.stdout:2/648: dread d4/d6/dca/f3f [0,4194304] 0 2026-03-10T12:38:02.205 INFO:tasks.workunit.client.0.vm00.stdout:2/649: write d4/d6/f16 [7394588,124154] 0 2026-03-10T12:38:02.212 INFO:tasks.workunit.client.0.vm00.stdout:3/667: symlink dd/d18/d13/d1d/dc6/dd5/lde 0 2026-03-10T12:38:02.216 INFO:tasks.workunit.client.0.vm00.stdout:2/650: mkdir d4/d53/d68/dc2/dd9/dde 0 2026-03-10T12:38:02.218 INFO:tasks.workunit.client.0.vm00.stdout:2/651: creat d4/d6/d2d/dc3/fdf x:0 0 0 2026-03-10T12:38:02.221 INFO:tasks.workunit.client.0.vm00.stdout:3/668: getdents dd/d27 0 2026-03-10T12:38:02.221 INFO:tasks.workunit.client.0.vm00.stdout:3/669: write dd/d27/d2c/d34/fcd [239296,93518] 0 2026-03-10T12:38:02.223 INFO:tasks.workunit.client.0.vm00.stdout:2/652: symlink d4/d6/d2d/d3a/d43/le0 0 2026-03-10T12:38:02.233 INFO:tasks.workunit.client.0.vm00.stdout:3/670: rename dd/d27/d2c/d34/fd8 to dd/d18/d13/d99/da5/fdf 0 2026-03-10T12:38:02.233 INFO:tasks.workunit.client.0.vm00.stdout:2/653: mkdir d4/d6/d2d/dc3/de1 0 2026-03-10T12:38:02.233 INFO:tasks.workunit.client.0.vm00.stdout:3/671: dread dd/d64/fb2 [0,4194304] 0 2026-03-10T12:38:02.233 INFO:tasks.workunit.client.0.vm00.stdout:3/672: read dd/d64/fa4 [197206,22539] 0 2026-03-10T12:38:02.233 INFO:tasks.workunit.client.0.vm00.stdout:2/654: getdents d4/d78 0 2026-03-10T12:38:02.233 INFO:tasks.workunit.client.0.vm00.stdout:2/655: chown d4/dd/l1c 1453159101 1 2026-03-10T12:38:02.233 INFO:tasks.workunit.client.0.vm00.stdout:2/656: write d4/d53/d76/d9b/dad/f80 [993058,115860] 0 2026-03-10T12:38:02.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:01 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:38:02.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:01 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:38:02.234 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:01 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:02.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:01 vm00.local ceph-mon[50686]: pgmap v166: 65 pgs: 65 active+clean; 2.4 GiB data, 8.4 GiB used, 112 GiB / 120 GiB avail; 68 MiB/s rd, 168 MiB/s wr, 386 op/s 2026-03-10T12:38:02.236 INFO:tasks.workunit.client.0.vm00.stdout:2/657: fdatasync d4/d53/d76/d9b/dad/f65 0 2026-03-10T12:38:02.236 INFO:tasks.workunit.client.0.vm00.stdout:2/658: stat d4/d6/d2d/lb5 0 2026-03-10T12:38:02.237 INFO:tasks.workunit.client.0.vm00.stdout:1/633: truncate da/d12/f64 728570 0 2026-03-10T12:38:02.241 INFO:tasks.workunit.client.0.vm00.stdout:2/659: creat d4/dd/fe2 x:0 0 0 2026-03-10T12:38:02.241 INFO:tasks.workunit.client.0.vm00.stdout:2/660: chown d4/d53/d76/fac 13967 1 2026-03-10T12:38:02.244 INFO:tasks.workunit.client.0.vm00.stdout:1/634: mkdir da/d12/d26/dd2 0 2026-03-10T12:38:02.244 INFO:tasks.workunit.client.0.vm00.stdout:4/629: dwrite df/d1f/d22/f7d [0,4194304] 0 2026-03-10T12:38:02.245 INFO:tasks.workunit.client.0.vm00.stdout:4/630: dread - df/fac zero size 2026-03-10T12:38:02.246 INFO:tasks.workunit.client.0.vm00.stdout:4/631: chown df/d1f/d22/d26/d65/d91/lce 0 1 2026-03-10T12:38:02.247 INFO:tasks.workunit.client.0.vm00.stdout:2/661: truncate d4/f39 3540051 0 2026-03-10T12:38:02.249 INFO:tasks.workunit.client.0.vm00.stdout:1/635: creat da/d21/db3/d59/da6/fd3 x:0 0 0 2026-03-10T12:38:02.250 INFO:tasks.workunit.client.0.vm00.stdout:4/632: creat df/d63/d94/fcf x:0 0 0 2026-03-10T12:38:02.251 INFO:tasks.workunit.client.0.vm00.stdout:4/633: chown df/d1f/lbb 10145660 1 2026-03-10T12:38:02.252 INFO:tasks.workunit.client.0.vm00.stdout:2/662: write d4/d6/d2d/d3a/dd3/fc4 [878866,24656] 0 2026-03-10T12:38:02.252 INFO:tasks.workunit.client.0.vm00.stdout:2/663: fdatasync d4/f1d 0 2026-03-10T12:38:02.254 
INFO:tasks.workunit.client.0.vm00.stdout:1/636: chown da/d12/c25 7299 1 2026-03-10T12:38:02.257 INFO:tasks.workunit.client.0.vm00.stdout:4/634: creat df/d1f/d22/d26/d65/d91/db9/fd0 x:0 0 0 2026-03-10T12:38:02.259 INFO:tasks.workunit.client.0.vm00.stdout:4/635: fsync df/d1f/d22/d26/f31 0 2026-03-10T12:38:02.261 INFO:tasks.workunit.client.0.vm00.stdout:4/636: dread - df/d63/d77/f9d zero size 2026-03-10T12:38:02.276 INFO:tasks.workunit.client.0.vm00.stdout:9/677: write d0/d3d/d59/d4e/dba/d19/f65 [473998,54289] 0 2026-03-10T12:38:02.278 INFO:tasks.workunit.client.0.vm00.stdout:8/507: truncate d0/dd/d38/d81/f88 2100073 0 2026-03-10T12:38:02.280 INFO:tasks.workunit.client.0.vm00.stdout:8/508: creat d0/d5c/fa0 x:0 0 0 2026-03-10T12:38:02.282 INFO:tasks.workunit.client.0.vm00.stdout:9/678: symlink d0/lef 0 2026-03-10T12:38:02.283 INFO:tasks.workunit.client.0.vm00.stdout:8/509: dread d0/d46/f94 [0,4194304] 0 2026-03-10T12:38:02.287 INFO:tasks.workunit.client.0.vm00.stdout:9/679: unlink d0/d3d/d43/c5b 0 2026-03-10T12:38:02.289 INFO:tasks.workunit.client.0.vm00.stdout:9/680: creat d0/d3d/d59/d4e/dba/d1e/d85/de5/ff0 x:0 0 0 2026-03-10T12:38:02.290 INFO:tasks.workunit.client.0.vm00.stdout:9/681: creat d0/d3d/d59/d4e/dba/d1e/d2b/ff1 x:0 0 0 2026-03-10T12:38:02.292 INFO:tasks.workunit.client.0.vm00.stdout:9/682: mkdir d0/d3d/df2 0 2026-03-10T12:38:02.292 INFO:tasks.workunit.client.0.vm00.stdout:4/637: sync 2026-03-10T12:38:02.296 INFO:tasks.workunit.client.0.vm00.stdout:4/638: creat df/d1f/d22/d26/fd1 x:0 0 0 2026-03-10T12:38:02.296 INFO:tasks.workunit.client.0.vm00.stdout:4/639: chown df/d1f/d22/f3c 503793 1 2026-03-10T12:38:02.297 INFO:tasks.workunit.client.0.vm00.stdout:4/640: stat df/d1f/d22/c88 0 2026-03-10T12:38:02.298 INFO:tasks.workunit.client.0.vm00.stdout:9/683: getdents d0/d7f/db8/dc4/db0 0 2026-03-10T12:38:02.298 INFO:tasks.workunit.client.0.vm00.stdout:4/641: chown df/d1f/d22/d26/dab/d73/lae 12423910 1 2026-03-10T12:38:02.299 
INFO:tasks.workunit.client.0.vm00.stdout:4/642: mknod df/d63/d77/cd2 0 2026-03-10T12:38:02.301 INFO:tasks.workunit.client.0.vm00.stdout:9/684: rename d0/d7f/db8/dc4/f67 to d0/d3d/d43/ff3 0 2026-03-10T12:38:02.304 INFO:tasks.workunit.client.0.vm00.stdout:4/643: creat df/d1f/fd3 x:0 0 0 2026-03-10T12:38:02.306 INFO:tasks.workunit.client.0.vm00.stdout:9/685: fdatasync d0/d7f/db8/dc4/f6c 0 2026-03-10T12:38:02.309 INFO:tasks.workunit.client.0.vm00.stdout:4/644: rename df/d1f/lbb to df/d93/dbc/ld4 0 2026-03-10T12:38:02.310 INFO:tasks.workunit.client.0.vm00.stdout:9/686: symlink d0/d3d/d59/d4e/dba/d19/d50/lf4 0 2026-03-10T12:38:02.311 INFO:tasks.workunit.client.0.vm00.stdout:4/645: rmdir df/d93/dbc 39 2026-03-10T12:38:02.311 INFO:tasks.workunit.client.0.vm00.stdout:9/687: chown d0/d3d/d43/d53/f66 325858286 1 2026-03-10T12:38:02.311 INFO:tasks.workunit.client.0.vm00.stdout:9/688: chown d0/d7f/d88 37831070 1 2026-03-10T12:38:02.314 INFO:tasks.workunit.client.0.vm00.stdout:9/689: truncate d0/d7f/d88/fa8 335907 0 2026-03-10T12:38:02.315 INFO:tasks.workunit.client.0.vm00.stdout:9/690: dread - d0/d3d/d59/d4e/dba/d1e/d85/fe7 zero size 2026-03-10T12:38:02.316 INFO:tasks.workunit.client.0.vm00.stdout:4/646: getdents df/d8a 0 2026-03-10T12:38:02.320 INFO:tasks.workunit.client.0.vm00.stdout:4/647: dwrite df/f16 [0,4194304] 0 2026-03-10T12:38:02.328 INFO:tasks.workunit.client.0.vm00.stdout:4/648: dwrite df/f42 [4194304,4194304] 0 2026-03-10T12:38:02.338 INFO:tasks.workunit.client.0.vm00.stdout:4/649: creat df/d1f/d36/d3a/fd5 x:0 0 0 2026-03-10T12:38:02.342 INFO:tasks.workunit.client.0.vm00.stdout:4/650: dwrite df/d63/d77/f8f [0,4194304] 0 2026-03-10T12:38:02.343 INFO:tasks.workunit.client.0.vm00.stdout:4/651: write df/f42 [8429388,101333] 0 2026-03-10T12:38:02.346 INFO:tasks.workunit.client.0.vm00.stdout:5/698: truncate d1f/d6a/d94/dc3/fd9 1947784 0 2026-03-10T12:38:02.357 INFO:tasks.workunit.client.0.vm00.stdout:5/699: creat d1f/d26/d2b/d35/d53/d72/ffa x:0 0 0 
2026-03-10T12:38:02.357 INFO:tasks.workunit.client.0.vm00.stdout:5/700: truncate d1f/d26/d2b/d35/d78/fc7 814279 0 2026-03-10T12:38:02.357 INFO:tasks.workunit.client.0.vm00.stdout:3/673: chown dd/d18/d13/d1d/d43/c9b 3821 1 2026-03-10T12:38:02.357 INFO:tasks.workunit.client.0.vm00.stdout:5/701: creat d1f/d96/dbd/ffb x:0 0 0 2026-03-10T12:38:02.365 INFO:tasks.workunit.client.0.vm00.stdout:4/652: sync 2026-03-10T12:38:02.366 INFO:tasks.workunit.client.0.vm00.stdout:9/691: dread d0/d3d/d59/d4e/dba/d1e/d2b/f6b [4194304,4194304] 0 2026-03-10T12:38:02.366 INFO:tasks.workunit.client.0.vm00.stdout:4/653: write df/d1f/d36/f51 [1482012,14554] 0 2026-03-10T12:38:02.366 INFO:tasks.workunit.client.0.vm00.stdout:4/654: chown df/d1f/d22/d26/d65/d91/l86 1 1 2026-03-10T12:38:02.370 INFO:tasks.workunit.client.0.vm00.stdout:5/702: creat d1f/d26/d2b/d35/d53/d72/d9d/d8e/ffc x:0 0 0 2026-03-10T12:38:02.376 INFO:tasks.workunit.client.0.vm00.stdout:4/655: creat df/d32/d64/fd6 x:0 0 0 2026-03-10T12:38:02.377 INFO:tasks.workunit.client.0.vm00.stdout:2/664: truncate d4/f1d 4119751 0 2026-03-10T12:38:02.381 INFO:tasks.workunit.client.0.vm00.stdout:4/656: creat df/d1f/d22/d26/dab/fd7 x:0 0 0 2026-03-10T12:38:02.385 INFO:tasks.workunit.client.0.vm00.stdout:5/703: symlink d1f/d26/d2e/lfd 0 2026-03-10T12:38:02.385 INFO:tasks.workunit.client.0.vm00.stdout:3/674: rmdir dd/d18/d13/d1d/d43/d55/d66 0 2026-03-10T12:38:02.385 INFO:tasks.workunit.client.0.vm00.stdout:3/675: chown dd/d3d/d8a/f8b 23987453 1 2026-03-10T12:38:02.390 INFO:tasks.workunit.client.0.vm00.stdout:5/704: fdatasync d1f/d26/d2b/d35/d53/dd6/fd7 0 2026-03-10T12:38:02.392 INFO:tasks.workunit.client.0.vm00.stdout:5/705: link d1f/d26/d2b/d35/d78/l95 d1f/d96/lfe 0 2026-03-10T12:38:02.396 INFO:tasks.workunit.client.0.vm00.stdout:5/706: dwrite d1f/d26/d2b/f5c [0,4194304] 0 2026-03-10T12:38:02.399 INFO:tasks.workunit.client.1.vm07.stdout:8/530: creat d1/d3/d6/d50/faa x:0 0 0 2026-03-10T12:38:02.405 INFO:tasks.workunit.client.1.vm07.stdout:8/531: 
chown d1/d3/d6/l2b 14 1 2026-03-10T12:38:02.412 INFO:tasks.workunit.client.1.vm07.stdout:8/532: dread d1/f19 [0,4194304] 0 2026-03-10T12:38:02.429 INFO:tasks.workunit.client.0.vm00.stdout:0/568: write d3/db/d24/d25/fb8 [967100,102565] 0 2026-03-10T12:38:02.431 INFO:tasks.workunit.client.0.vm00.stdout:0/569: unlink d3/d40/lb6 0 2026-03-10T12:38:02.436 INFO:tasks.workunit.client.0.vm00.stdout:4/657: rmdir df/d1f/d22/d26/d65/d91/db9 39 2026-03-10T12:38:02.437 INFO:tasks.workunit.client.0.vm00.stdout:1/637: truncate da/d24/d5a/f68 3989960 0 2026-03-10T12:38:02.437 INFO:tasks.workunit.client.0.vm00.stdout:1/638: chown da/d24/d28/d67/db0 7242990 1 2026-03-10T12:38:02.441 INFO:tasks.workunit.client.0.vm00.stdout:4/658: symlink df/d1f/d22/dcb/ld8 0 2026-03-10T12:38:02.441 INFO:tasks.workunit.client.0.vm00.stdout:4/659: dread - df/d32/d76/f7e zero size 2026-03-10T12:38:02.442 INFO:tasks.workunit.client.0.vm00.stdout:4/660: truncate df/d1f/d36/d3a/fa9 728060 0 2026-03-10T12:38:02.443 INFO:tasks.workunit.client.0.vm00.stdout:4/661: chown df/d1f/d22/d26/dab/d73/f7a 69226529 1 2026-03-10T12:38:02.445 INFO:tasks.workunit.client.1.vm07.stdout:8/533: dread d1/d3/d6/d54/f7d [0,4194304] 0 2026-03-10T12:38:02.446 INFO:tasks.workunit.client.0.vm00.stdout:1/639: mkdir da/d24/d5a/d71/dd4 0 2026-03-10T12:38:02.453 INFO:tasks.workunit.client.1.vm07.stdout:8/534: truncate d1/d3/d40/f7e 862632 0 2026-03-10T12:38:02.454 INFO:tasks.workunit.client.0.vm00.stdout:1/640: creat da/fd5 x:0 0 0 2026-03-10T12:38:02.460 INFO:tasks.workunit.client.0.vm00.stdout:1/641: creat da/d24/d28/fd6 x:0 0 0 2026-03-10T12:38:02.460 INFO:tasks.workunit.client.1.vm07.stdout:8/535: truncate d1/f6b 240655 0 2026-03-10T12:38:02.461 INFO:tasks.workunit.client.0.vm00.stdout:1/642: stat da/d24/l50 0 2026-03-10T12:38:02.461 INFO:tasks.workunit.client.1.vm07.stdout:8/536: chown d1/d3/d6/l78 6 1 2026-03-10T12:38:02.463 INFO:tasks.workunit.client.1.vm07.stdout:8/537: symlink d1/d3/d6/d7b/lab 0 2026-03-10T12:38:02.466 
INFO:tasks.workunit.client.0.vm00.stdout:1/643: dwrite da/d12/da8/fc9 [0,4194304] 0 2026-03-10T12:38:02.468 INFO:tasks.workunit.client.1.vm07.stdout:8/538: unlink d1/d3/d40/c60 0 2026-03-10T12:38:02.474 INFO:tasks.workunit.client.0.vm00.stdout:8/510: dwrite d0/f9d [0,4194304] 0 2026-03-10T12:38:02.480 INFO:tasks.workunit.client.1.vm07.stdout:8/539: mkdir d1/d3/d40/dac 0 2026-03-10T12:38:02.481 INFO:tasks.workunit.client.1.vm07.stdout:8/540: chown d1/d3/f59 443634 1 2026-03-10T12:38:02.481 INFO:tasks.workunit.client.0.vm00.stdout:1/644: fdatasync da/d24/d28/d67/f5b 0 2026-03-10T12:38:02.483 INFO:tasks.workunit.client.1.vm07.stdout:8/541: truncate d1/d3/d6c/f9b 2446217 0 2026-03-10T12:38:02.505 INFO:tasks.workunit.client.0.vm00.stdout:1/645: dread da/d12/f1d [0,4194304] 0 2026-03-10T12:38:02.508 INFO:tasks.workunit.client.0.vm00.stdout:6/452: truncate d2/d14/f1b 1048359 0 2026-03-10T12:38:02.510 INFO:tasks.workunit.client.0.vm00.stdout:1/646: mknod da/d21/db3/d5d/d80/cd7 0 2026-03-10T12:38:02.511 INFO:tasks.workunit.client.0.vm00.stdout:7/481: dread - da/d41/d7b/fb0 zero size 2026-03-10T12:38:02.513 INFO:tasks.workunit.client.0.vm00.stdout:8/511: getdents d0/d58/d68 0 2026-03-10T12:38:02.514 INFO:tasks.workunit.client.0.vm00.stdout:8/512: write d0/d93/d36/d5b/f65 [51616,98984] 0 2026-03-10T12:38:02.514 INFO:tasks.workunit.client.0.vm00.stdout:8/513: stat d0/d93/d36/l53 0 2026-03-10T12:38:02.525 INFO:tasks.workunit.client.0.vm00.stdout:1/647: rmdir da/d24/d5a/d71 39 2026-03-10T12:38:02.525 INFO:tasks.workunit.client.0.vm00.stdout:1/648: mkdir da/d21/db3/d5d/d80/dd8 0 2026-03-10T12:38:02.526 INFO:tasks.workunit.client.1.vm07.stdout:7/523: dwrite d0/d57/d62/f84 [0,4194304] 0 2026-03-10T12:38:02.526 INFO:tasks.workunit.client.1.vm07.stdout:7/524: dwrite d0/d47/d48/f9e [0,4194304] 0 2026-03-10T12:38:02.526 INFO:tasks.workunit.client.1.vm07.stdout:7/525: mkdir d0/d47/dab/dae 0 2026-03-10T12:38:02.526 INFO:tasks.workunit.client.1.vm07.stdout:7/526: dread - 
d0/d67/d6f/d80/fac zero size 2026-03-10T12:38:02.526 INFO:tasks.workunit.client.0.vm00.stdout:8/514: symlink d0/d46/d6e/d9b/la1 0 2026-03-10T12:38:02.530 INFO:tasks.workunit.client.1.vm07.stdout:7/527: truncate d0/d61/d79/f95 1011043 0 2026-03-10T12:38:02.530 INFO:tasks.workunit.client.0.vm00.stdout:8/515: dwrite d0/d93/d43/f6a [0,4194304] 0 2026-03-10T12:38:02.534 INFO:tasks.workunit.client.0.vm00.stdout:8/516: fsync d0/d5c/f4a 0 2026-03-10T12:38:02.539 INFO:tasks.workunit.client.1.vm07.stdout:9/610: dwrite d5/d1f/d31/d64/f70 [0,4194304] 0 2026-03-10T12:38:02.540 INFO:tasks.workunit.client.0.vm00.stdout:1/649: mkdir da/d24/d5a/dd9 0 2026-03-10T12:38:02.542 INFO:tasks.workunit.client.1.vm07.stdout:9/611: creat d5/d13/d6c/fd5 x:0 0 0 2026-03-10T12:38:02.542 INFO:tasks.workunit.client.1.vm07.stdout:9/612: dread - d5/d16/da3/fb1 zero size 2026-03-10T12:38:02.565 INFO:tasks.workunit.client.0.vm00.stdout:1/650: dwrite da/d21/d39/f8c [4194304,4194304] 0 2026-03-10T12:38:02.565 INFO:tasks.workunit.client.0.vm00.stdout:8/517: mkdir d0/d93/d17/da2 0 2026-03-10T12:38:02.567 INFO:tasks.workunit.client.1.vm07.stdout:7/528: sync 2026-03-10T12:38:02.569 INFO:tasks.workunit.client.0.vm00.stdout:8/518: mknod d0/d46/ca3 0 2026-03-10T12:38:02.574 INFO:tasks.workunit.client.0.vm00.stdout:8/519: mknod d0/d93/d17/da2/ca4 0 2026-03-10T12:38:02.574 INFO:tasks.workunit.client.0.vm00.stdout:1/651: dread da/d12/d26/f57 [0,4194304] 0 2026-03-10T12:38:02.579 INFO:tasks.workunit.client.0.vm00.stdout:8/520: link d0/d93/d17/d48/f87 d0/d93/fa5 0 2026-03-10T12:38:02.582 INFO:tasks.workunit.client.0.vm00.stdout:8/521: creat d0/d93/d17/fa6 x:0 0 0 2026-03-10T12:38:02.583 INFO:tasks.workunit.client.0.vm00.stdout:8/522: write d0/d46/d7e/f8a [247931,6549] 0 2026-03-10T12:38:02.586 INFO:tasks.workunit.client.0.vm00.stdout:8/523: mknod d0/ca7 0 2026-03-10T12:38:02.602 INFO:tasks.workunit.client.0.vm00.stdout:9/692: write d0/d5/f26 [5170310,3397] 0 2026-03-10T12:38:02.604 
INFO:tasks.workunit.client.0.vm00.stdout:3/676: rename dd/d18/d13/d1d/d43 to dd/d3d/d8a/de0 0 2026-03-10T12:38:02.608 INFO:tasks.workunit.client.0.vm00.stdout:5/707: rename d1f/d26/d2b/d35/d53/d72/d9d/f88 to d1f/d6a/d94/dc9/fff 0 2026-03-10T12:38:02.611 INFO:tasks.workunit.client.0.vm00.stdout:5/708: dread d1f/d26/d6f/f9b [0,4194304] 0 2026-03-10T12:38:02.613 INFO:tasks.workunit.client.0.vm00.stdout:5/709: stat f19 0 2026-03-10T12:38:02.613 INFO:tasks.workunit.client.0.vm00.stdout:1/652: rename da/d21/d39/d77 to da/d21/db3/d59/da6/da4/dda 0 2026-03-10T12:38:02.614 INFO:tasks.workunit.client.0.vm00.stdout:3/677: rename dd/d27/d2c/d34 to dd/d2a/da2/de1 0 2026-03-10T12:38:02.615 INFO:tasks.workunit.client.0.vm00.stdout:1/653: symlink da/d24/d5a/ldb 0 2026-03-10T12:38:02.617 INFO:tasks.workunit.client.0.vm00.stdout:9/693: rename d0/d5/dc/c29 to d0/d9b/cf5 0 2026-03-10T12:38:02.618 INFO:tasks.workunit.client.0.vm00.stdout:3/678: creat dd/d4e/d5d/fe2 x:0 0 0 2026-03-10T12:38:02.618 INFO:tasks.workunit.client.0.vm00.stdout:1/654: write da/d21/db3/d5d/d72/d7e/fac [898434,120726] 0 2026-03-10T12:38:02.618 INFO:tasks.workunit.client.1.vm07.stdout:7/529: sync 2026-03-10T12:38:02.620 INFO:tasks.workunit.client.1.vm07.stdout:7/530: dread - d0/d57/d62/f8b zero size 2026-03-10T12:38:02.621 INFO:tasks.workunit.client.0.vm00.stdout:3/679: creat dd/d3d/fe3 x:0 0 0 2026-03-10T12:38:02.623 INFO:tasks.workunit.client.0.vm00.stdout:3/680: rename dd/d4e/d6a to dd/d3d/d8a/de0/de4 0 2026-03-10T12:38:02.624 INFO:tasks.workunit.client.0.vm00.stdout:3/681: mknod dd/d64/d93/ce5 0 2026-03-10T12:38:02.625 INFO:tasks.workunit.client.0.vm00.stdout:3/682: dread - dd/d3d/d8a/de0/d55/fda zero size 2026-03-10T12:38:02.626 INFO:tasks.workunit.client.0.vm00.stdout:3/683: creat dd/d3d/d8a/de0/d55/fe6 x:0 0 0 2026-03-10T12:38:02.627 INFO:tasks.workunit.client.0.vm00.stdout:3/684: symlink dd/d18/d14/d2b/le7 0 2026-03-10T12:38:02.629 INFO:tasks.workunit.client.0.vm00.stdout:3/685: getdents dd/d18 0 
2026-03-10T12:38:02.630 INFO:tasks.workunit.client.0.vm00.stdout:3/686: creat dd/d2a/da2/db4/fe8 x:0 0 0 2026-03-10T12:38:02.631 INFO:tasks.workunit.client.0.vm00.stdout:3/687: write dd/d18/d13/d99/da5/fcc [999468,57721] 0 2026-03-10T12:38:02.633 INFO:tasks.workunit.client.0.vm00.stdout:3/688: link dd/d64/d92/lb3 dd/d18/d13/d1d/le9 0 2026-03-10T12:38:02.634 INFO:tasks.workunit.client.0.vm00.stdout:3/689: truncate dd/d18/d14/fa0 838758 0 2026-03-10T12:38:02.647 INFO:tasks.workunit.client.0.vm00.stdout:3/690: dwrite dd/d27/d2c/fb1 [0,4194304] 0 2026-03-10T12:38:02.672 INFO:tasks.workunit.client.0.vm00.stdout:9/694: sync 2026-03-10T12:38:02.701 INFO:tasks.workunit.client.0.vm00.stdout:2/665: write d4/dd/f62 [797590,60040] 0 2026-03-10T12:38:02.702 INFO:tasks.workunit.client.0.vm00.stdout:2/666: symlink d4/d6/d2d/d31/le3 0 2026-03-10T12:38:02.727 INFO:tasks.workunit.client.1.vm07.stdout:6/521: rmdir d1/d4/d71 39 2026-03-10T12:38:02.728 INFO:tasks.workunit.client.1.vm07.stdout:2/446: write d0/d29/d64/f78 [519172,2397] 0 2026-03-10T12:38:02.728 INFO:tasks.workunit.client.1.vm07.stdout:5/571: write d0/d22/d18/d3e/d53/fa3 [2392060,119460] 0 2026-03-10T12:38:02.731 INFO:tasks.workunit.client.1.vm07.stdout:4/660: dwrite d0/d4/d5/d34/f94 [0,4194304] 0 2026-03-10T12:38:02.731 INFO:tasks.workunit.client.1.vm07.stdout:5/572: read - d0/d22/d18/d19/d21/fbd zero size 2026-03-10T12:38:02.736 INFO:tasks.workunit.client.1.vm07.stdout:6/522: creat d1/d4/d6/d43/d88/d97/fa7 x:0 0 0 2026-03-10T12:38:02.736 INFO:tasks.workunit.client.1.vm07.stdout:2/447: mkdir d0/d5b/d98 0 2026-03-10T12:38:02.736 INFO:tasks.workunit.client.1.vm07.stdout:1/524: rmdir d9/df/d29/d2b/d3d 39 2026-03-10T12:38:02.738 INFO:tasks.workunit.client.1.vm07.stdout:6/523: chown d1/d4/d6/d16/d49/f7a 54755424 1 2026-03-10T12:38:02.740 INFO:tasks.workunit.client.1.vm07.stdout:5/573: link d0/d22/f89 d0/d22/d18/d19/d2e/d67/fc8 0 2026-03-10T12:38:02.742 INFO:tasks.workunit.client.1.vm07.stdout:1/525: symlink d9/df/d55/lb0 0 
2026-03-10T12:38:02.743 INFO:tasks.workunit.client.1.vm07.stdout:4/661: creat d0/d4/d5/fe8 x:0 0 0 2026-03-10T12:38:02.753 INFO:tasks.workunit.client.1.vm07.stdout:1/526: dread d9/df/d29/d2b/d30/f38 [0,4194304] 0 2026-03-10T12:38:02.754 INFO:tasks.workunit.client.1.vm07.stdout:2/448: link d0/d29/c3b d0/d5b/c99 0 2026-03-10T12:38:02.758 INFO:tasks.workunit.client.0.vm00.stdout:2/667: dread d4/d6/f16 [8388608,4194304] 0 2026-03-10T12:38:02.758 INFO:tasks.workunit.client.1.vm07.stdout:5/574: rename d0/d22/d18/d19/d21/d3a/l43 to d0/d22/d18/d19/d2e/d67/lc9 0 2026-03-10T12:38:02.767 INFO:tasks.workunit.client.1.vm07.stdout:4/662: dread d0/d4/d7a/f27 [0,4194304] 0 2026-03-10T12:38:02.781 INFO:tasks.workunit.client.0.vm00.stdout:4/662: dwrite df/f85 [0,4194304] 0 2026-03-10T12:38:02.782 INFO:tasks.workunit.client.1.vm07.stdout:0/600: write d0/d14/f37 [2831989,113674] 0 2026-03-10T12:38:02.783 INFO:tasks.workunit.client.0.vm00.stdout:4/663: dread - df/d1f/d36/d3a/fd5 zero size 2026-03-10T12:38:02.783 INFO:tasks.workunit.client.1.vm07.stdout:0/601: readlink d0/d14/d5f/d76/d2f/l4a 0 2026-03-10T12:38:02.786 INFO:tasks.workunit.client.0.vm00.stdout:4/664: creat df/d1f/d22/dcb/fd9 x:0 0 0 2026-03-10T12:38:02.787 INFO:tasks.workunit.client.0.vm00.stdout:4/665: mkdir df/d1f/d22/d26/dab/d73/dda 0 2026-03-10T12:38:02.788 INFO:tasks.workunit.client.0.vm00.stdout:4/666: unlink df/d32/c43 0 2026-03-10T12:38:02.790 INFO:tasks.workunit.client.0.vm00.stdout:4/667: write df/d93/dbc/fc3 [2195884,95064] 0 2026-03-10T12:38:02.791 INFO:tasks.workunit.client.1.vm07.stdout:4/663: mknod d0/d4/d10/d5f/ce9 0 2026-03-10T12:38:02.793 INFO:tasks.workunit.client.0.vm00.stdout:4/668: mkdir df/d63/ddb 0 2026-03-10T12:38:02.797 INFO:tasks.workunit.client.0.vm00.stdout:4/669: dwrite df/f42 [0,4194304] 0 2026-03-10T12:38:02.797 INFO:tasks.workunit.client.0.vm00.stdout:4/670: stat df/d1f/d36/f92 0 2026-03-10T12:38:02.797 INFO:tasks.workunit.client.0.vm00.stdout:4/671: read - df/d93/fc0 zero size 
2026-03-10T12:38:02.803 INFO:tasks.workunit.client.1.vm07.stdout:5/575: creat d0/d22/d18/d19/d2e/d3f/db8/fca x:0 0 0 2026-03-10T12:38:02.804 INFO:tasks.workunit.client.0.vm00.stdout:8/524: truncate d0/d93/d17/d48/f87 1367038 0 2026-03-10T12:38:02.805 INFO:tasks.workunit.client.0.vm00.stdout:8/525: chown d0/d93/d17/c1b 172 1 2026-03-10T12:38:02.805 INFO:tasks.workunit.client.0.vm00.stdout:8/526: write d0/dd/f9a [932733,120065] 0 2026-03-10T12:38:02.805 INFO:tasks.workunit.client.0.vm00.stdout:8/527: chown d0/c2f 5102953 1 2026-03-10T12:38:02.806 INFO:tasks.workunit.client.0.vm00.stdout:8/528: readlink d0/d93/d36/l53 0 2026-03-10T12:38:02.806 INFO:tasks.workunit.client.0.vm00.stdout:8/529: chown d0/d5c/f4a 4 1 2026-03-10T12:38:02.808 INFO:tasks.workunit.client.0.vm00.stdout:8/530: rename d0/d93/d36/l40 to d0/dd/d38/d81/la8 0 2026-03-10T12:38:02.809 INFO:tasks.workunit.client.0.vm00.stdout:3/691: getdents dd/d3d/d8a 0 2026-03-10T12:38:02.810 INFO:tasks.workunit.client.0.vm00.stdout:3/692: mkdir dd/dea 0 2026-03-10T12:38:02.811 INFO:tasks.workunit.client.0.vm00.stdout:3/693: creat dd/d18/d14/feb x:0 0 0 2026-03-10T12:38:02.812 INFO:tasks.workunit.client.0.vm00.stdout:3/694: truncate dd/d64/d92/f9c 552567 0 2026-03-10T12:38:02.813 INFO:tasks.workunit.client.0.vm00.stdout:3/695: chown dd/d18/d13/d1d/l2d 41197 1 2026-03-10T12:38:02.814 INFO:tasks.workunit.client.0.vm00.stdout:5/710: chown d1f/d6a/d94/dc9/fff 2 1 2026-03-10T12:38:02.819 INFO:tasks.workunit.client.0.vm00.stdout:3/696: rename dd/d3d/d84/lc4 to dd/d64/d93/lec 0 2026-03-10T12:38:02.830 INFO:tasks.workunit.client.0.vm00.stdout:5/711: mknod d1f/d26/d2b/d35/d53/dd6/c100 0 2026-03-10T12:38:02.830 INFO:tasks.workunit.client.0.vm00.stdout:5/712: fsync d1f/d26/d2b/d37/da4/fde 0 2026-03-10T12:38:02.830 INFO:tasks.workunit.client.0.vm00.stdout:5/713: mkdir d1f/d26/d101 0 2026-03-10T12:38:02.830 INFO:tasks.workunit.client.0.vm00.stdout:5/714: getdents d1f 0 2026-03-10T12:38:02.830 
INFO:tasks.workunit.client.0.vm00.stdout:1/655: dwrite da/d21/db3/f83 [0,4194304] 0 2026-03-10T12:38:02.830 INFO:tasks.workunit.client.0.vm00.stdout:5/715: dwrite d1f/d26/d2b/d35/d78/d7f/fb9 [0,4194304] 0 2026-03-10T12:38:02.831 INFO:tasks.workunit.client.0.vm00.stdout:5/716: readlink d1f/d6a/d94/dc9/lab 0 2026-03-10T12:38:02.834 INFO:tasks.workunit.client.1.vm07.stdout:1/527: getdents d9/d2d/d4f/d5a 0 2026-03-10T12:38:02.834 INFO:tasks.workunit.client.0.vm00.stdout:8/531: sync 2026-03-10T12:38:02.835 INFO:tasks.workunit.client.0.vm00.stdout:3/697: sync 2026-03-10T12:38:02.842 INFO:tasks.workunit.client.0.vm00.stdout:1/656: creat da/d21/db3/d5d/fdc x:0 0 0 2026-03-10T12:38:02.842 INFO:tasks.workunit.client.0.vm00.stdout:1/657: write da/d21/d39/f89 [767562,87236] 0 2026-03-10T12:38:02.847 INFO:tasks.workunit.client.0.vm00.stdout:5/717: creat d1f/d26/d2b/d35/d53/f102 x:0 0 0 2026-03-10T12:38:02.848 INFO:tasks.workunit.client.0.vm00.stdout:8/532: symlink d0/d46/d6e/d9b/la9 0 2026-03-10T12:38:02.851 INFO:tasks.workunit.client.0.vm00.stdout:1/658: rmdir da/d21/db3/d59/da6/da4 39 2026-03-10T12:38:02.852 INFO:tasks.workunit.client.0.vm00.stdout:5/718: truncate d1f/d26/d2b/d37/f38 4291445 0 2026-03-10T12:38:02.852 INFO:tasks.workunit.client.0.vm00.stdout:9/695: write d0/d5/dc/f2d [156834,125014] 0 2026-03-10T12:38:02.853 INFO:tasks.workunit.client.0.vm00.stdout:5/719: chown d1f/d26/d2b/d35/d78/d99/dcd/cdd 12504 1 2026-03-10T12:38:02.854 INFO:tasks.workunit.client.0.vm00.stdout:5/720: fdatasync d1f/d26/d2b/d37/dcc/fed 0 2026-03-10T12:38:02.855 INFO:tasks.workunit.client.0.vm00.stdout:5/721: write d1f/d26/d2b/d37/dc4/ff4 [899032,125960] 0 2026-03-10T12:38:02.856 INFO:tasks.workunit.client.0.vm00.stdout:1/659: truncate da/d21/db3/d59/da6/da4/dda/fbb 64350 0 2026-03-10T12:38:02.856 INFO:tasks.workunit.client.1.vm07.stdout:3/582: rmdir dc/dd/d28/d3b 39 2026-03-10T12:38:02.860 INFO:tasks.workunit.client.0.vm00.stdout:8/533: dwrite d0/f9 [0,4194304] 0 2026-03-10T12:38:02.863 
INFO:tasks.workunit.client.0.vm00.stdout:8/534: dwrite d0/d5c/f4a [0,4194304] 0 2026-03-10T12:38:02.874 INFO:tasks.workunit.client.0.vm00.stdout:5/722: mkdir d1f/d26/d2b/de4/d103 0 2026-03-10T12:38:02.874 INFO:tasks.workunit.client.0.vm00.stdout:9/696: rename d0/d7f/db8/dc4/cdb to d0/d3d/d59/d4e/dba/d19/cf6 0 2026-03-10T12:38:02.875 INFO:tasks.workunit.client.0.vm00.stdout:9/697: write d0/d3d/d59/fe8 [445283,33907] 0 2026-03-10T12:38:02.875 INFO:tasks.workunit.client.0.vm00.stdout:1/660: creat da/d24/d28/fdd x:0 0 0 2026-03-10T12:38:02.878 INFO:tasks.workunit.client.0.vm00.stdout:5/723: stat d1f/d26/d2b/d35/d78/d7f/lf2 0 2026-03-10T12:38:02.883 INFO:tasks.workunit.client.0.vm00.stdout:5/724: read d1f/d26/d2b/d35/d53/d72/ff9 [4685675,85346] 0 2026-03-10T12:38:02.884 INFO:tasks.workunit.client.0.vm00.stdout:8/535: dread d0/d93/f23 [0,4194304] 0 2026-03-10T12:38:02.884 INFO:tasks.workunit.client.0.vm00.stdout:5/725: chown d1f/d26/f9f 7 1 2026-03-10T12:38:02.885 INFO:tasks.workunit.client.0.vm00.stdout:2/668: dwrite d4/d53/d68/f69 [0,4194304] 0 2026-03-10T12:38:02.894 INFO:tasks.workunit.client.0.vm00.stdout:9/698: dread d0/d3d/d59/d4e/dba/d1e/d2b/f36 [0,4194304] 0 2026-03-10T12:38:02.901 INFO:tasks.workunit.client.0.vm00.stdout:1/661: dread da/d21/db3/f7a [0,4194304] 0 2026-03-10T12:38:02.902 INFO:tasks.workunit.client.0.vm00.stdout:1/662: dread - da/d24/f76 zero size 2026-03-10T12:38:02.909 INFO:tasks.workunit.client.0.vm00.stdout:4/672: dwrite df/d1f/d22/d26/f56 [0,4194304] 0 2026-03-10T12:38:02.914 INFO:tasks.workunit.client.0.vm00.stdout:3/698: readlink dd/d64/d93/lec 0 2026-03-10T12:38:02.915 INFO:tasks.workunit.client.0.vm00.stdout:4/673: dread df/d1f/d36/f92 [0,4194304] 0 2026-03-10T12:38:02.915 INFO:tasks.workunit.client.0.vm00.stdout:4/674: readlink df/d1f/d22/d26/d65/d91/l86 0 2026-03-10T12:38:02.920 INFO:tasks.workunit.client.0.vm00.stdout:5/726: mkdir d1f/d26/de3/d104 0 2026-03-10T12:38:02.925 INFO:tasks.workunit.client.0.vm00.stdout:9/699: creat 
d0/d3d/d43/ff7 x:0 0 0 2026-03-10T12:38:02.929 INFO:tasks.workunit.client.0.vm00.stdout:3/699: chown dd/d2a/da2/de1/d45 828741312 1 2026-03-10T12:38:02.932 INFO:tasks.workunit.client.0.vm00.stdout:4/675: rename df/d1f/d22/d26/c2c to df/d1f/d22/d26/dab/d73/cdc 0 2026-03-10T12:38:02.937 INFO:tasks.workunit.client.0.vm00.stdout:2/669: link d4/d6/d93/fbf d4/d6/d93/dc6/fe4 0 2026-03-10T12:38:02.938 INFO:tasks.workunit.client.0.vm00.stdout:2/670: chown d4/f7b 6980 1 2026-03-10T12:38:02.940 INFO:tasks.workunit.client.0.vm00.stdout:1/663: mknod da/d24/d28/d67/db0/cde 0 2026-03-10T12:38:02.952 INFO:tasks.workunit.client.0.vm00.stdout:3/700: creat dd/d18/d13/d99/da5/dd0/fed x:0 0 0 2026-03-10T12:38:02.952 INFO:tasks.workunit.client.0.vm00.stdout:3/701: stat dd/d3d/d65/f90 0 2026-03-10T12:38:02.952 INFO:tasks.workunit.client.0.vm00.stdout:4/676: dwrite df/d1f/d36/d3a/d41/f47 [0,4194304] 0 2026-03-10T12:38:02.952 INFO:tasks.workunit.client.0.vm00.stdout:3/702: dwrite dd/d3d/f50 [0,4194304] 0 2026-03-10T12:38:02.953 INFO:tasks.workunit.client.1.vm07.stdout:0/602: link d0/d14/d5f/fb3 d0/d14/d5f/d76/d2f/d31/d4f/fc4 0 2026-03-10T12:38:02.953 INFO:tasks.workunit.client.1.vm07.stdout:1/528: rmdir d9/df/d29/d2b/d31 39 2026-03-10T12:38:02.957 INFO:tasks.workunit.client.1.vm07.stdout:4/664: rename d0/d4/d10/d9a/f1a to d0/d4/d5/fea 0 2026-03-10T12:38:02.957 INFO:tasks.workunit.client.1.vm07.stdout:3/583: mknod dc/dd/d28/d7a/cca 0 2026-03-10T12:38:02.960 INFO:tasks.workunit.client.0.vm00.stdout:4/677: mknod df/d6c/cdd 0 2026-03-10T12:38:02.961 INFO:tasks.workunit.client.0.vm00.stdout:4/678: write df/d1f/d36/d3a/fd5 [301333,18212] 0 2026-03-10T12:38:02.963 INFO:tasks.workunit.client.0.vm00.stdout:9/700: link d0/d3d/d43/d53/f79 d0/d7f/d88/ff8 0 2026-03-10T12:38:02.963 INFO:tasks.workunit.client.0.vm00.stdout:9/701: stat d0/d9b/lc1 0 2026-03-10T12:38:02.964 INFO:tasks.workunit.client.0.vm00.stdout:9/702: dread - d0/d3d/d59/d4e/dba/d1e/d85/d98/fa7 zero size 2026-03-10T12:38:02.965 
INFO:tasks.workunit.client.0.vm00.stdout:1/664: mkdir da/d12/d26/dd2/ddf 0 2026-03-10T12:38:02.965 INFO:tasks.workunit.client.0.vm00.stdout:2/671: getdents d4/d53/d76/d9b/dad 0 2026-03-10T12:38:02.966 INFO:tasks.workunit.client.0.vm00.stdout:2/672: write d4/f7b [834426,76896] 0 2026-03-10T12:38:02.966 INFO:tasks.workunit.client.1.vm07.stdout:1/529: dwrite d9/df/f96 [0,4194304] 0 2026-03-10T12:38:02.967 INFO:tasks.workunit.client.1.vm07.stdout:1/530: stat d9/df/d29/l9c 0 2026-03-10T12:38:02.968 INFO:tasks.workunit.client.0.vm00.stdout:4/679: creat df/d1f/d22/d26/d65/da7/fde x:0 0 0 2026-03-10T12:38:02.970 INFO:tasks.workunit.client.0.vm00.stdout:2/673: dwrite d4/f7b [0,4194304] 0 2026-03-10T12:38:02.983 INFO:tasks.workunit.client.0.vm00.stdout:9/703: mkdir d0/d7f/db8/df9 0 2026-03-10T12:38:02.985 INFO:tasks.workunit.client.1.vm07.stdout:0/603: symlink d0/lc5 0 2026-03-10T12:38:02.985 INFO:tasks.workunit.client.0.vm00.stdout:2/674: stat d4/fc1 0 2026-03-10T12:38:02.988 INFO:tasks.workunit.client.0.vm00.stdout:1/665: creat da/fe0 x:0 0 0 2026-03-10T12:38:02.988 INFO:tasks.workunit.client.0.vm00.stdout:2/675: fsync d4/d6/dca/f3f 0 2026-03-10T12:38:02.992 INFO:tasks.workunit.client.0.vm00.stdout:4/680: rename df/d93/fc0 to df/d1f/d36/d3a/fdf 0 2026-03-10T12:38:02.992 INFO:tasks.workunit.client.1.vm07.stdout:3/584: fsync dc/dd/d28/d7a/f7f 0 2026-03-10T12:38:02.996 INFO:tasks.workunit.client.1.vm07.stdout:4/665: symlink d0/d4/d5/leb 0 2026-03-10T12:38:02.996 INFO:tasks.workunit.client.0.vm00.stdout:9/704: rename d0/d3d/d59/d4e/dba/d1e/l6e to d0/d3d/d59/d4e/dba/d1e/d27/lfa 0 2026-03-10T12:38:02.997 INFO:tasks.workunit.client.0.vm00.stdout:9/705: chown d0/d3d/d43/f54 4625 1 2026-03-10T12:38:02.999 INFO:tasks.workunit.client.0.vm00.stdout:2/676: mkdir d4/d6/d2d/de5 0 2026-03-10T12:38:03.001 INFO:tasks.workunit.client.0.vm00.stdout:4/681: fsync df/d1f/d22/f52 0 2026-03-10T12:38:03.002 INFO:tasks.workunit.client.0.vm00.stdout:2/677: fsync d4/d6/f4e 0 2026-03-10T12:38:03.002 
INFO:tasks.workunit.client.0.vm00.stdout:9/706: mknod d0/d7f/db8/cfb 0 2026-03-10T12:38:03.013 INFO:tasks.workunit.client.1.vm07.stdout:1/531: mkdir d9/df/d29/d2b/db1 0 2026-03-10T12:38:03.013 INFO:tasks.workunit.client.0.vm00.stdout:4/682: creat df/d1f/d36/d3a/d41/fe0 x:0 0 0 2026-03-10T12:38:03.014 INFO:tasks.workunit.client.0.vm00.stdout:2/678: creat d4/dd/fe6 x:0 0 0 2026-03-10T12:38:03.016 INFO:tasks.workunit.client.0.vm00.stdout:9/707: rename d0/d5/dc/l18 to d0/d7f/db8/lfc 0 2026-03-10T12:38:03.019 INFO:tasks.workunit.client.0.vm00.stdout:4/683: link df/d1f/d36/d3a/d41/f47 df/d1f/d36/d3a/fe1 0 2026-03-10T12:38:03.043 INFO:tasks.workunit.client.1.vm07.stdout:1/532: link d9/df/d29/d2b/d3d/c46 d9/df/d29/cb2 0 2026-03-10T12:38:03.043 INFO:tasks.workunit.client.0.vm00.stdout:4/684: stat df/d1f/d22/d26/d65/da7/lc4 0 2026-03-10T12:38:03.043 INFO:tasks.workunit.client.0.vm00.stdout:4/685: dread - df/d1f/d22/d26/fd1 zero size 2026-03-10T12:38:03.044 INFO:tasks.workunit.client.0.vm00.stdout:4/686: fsync df/d63/d77/f8d 0 2026-03-10T12:38:03.044 INFO:tasks.workunit.client.0.vm00.stdout:2/679: dread - d4/dd/db9/d6d/faa zero size 2026-03-10T12:38:03.044 INFO:tasks.workunit.client.0.vm00.stdout:2/680: rmdir d4/d53/d76/d9b/dad/d8e 39 2026-03-10T12:38:03.044 INFO:tasks.workunit.client.0.vm00.stdout:9/708: rename d0/d3d/d59/d4e/dba/d19/c22 to d0/d3d/d59/d4e/dba/d1e/d85/cfd 0 2026-03-10T12:38:03.044 INFO:tasks.workunit.client.0.vm00.stdout:4/687: symlink df/d57/db7/le2 0 2026-03-10T12:38:03.044 INFO:tasks.workunit.client.0.vm00.stdout:9/709: rename d0/d3d/d59/d4e/dba/d19/l1f to d0/d7f/lfe 0 2026-03-10T12:38:03.044 INFO:tasks.workunit.client.0.vm00.stdout:9/710: truncate d0/d3d/f8c 2118860 0 2026-03-10T12:38:03.044 INFO:tasks.workunit.client.0.vm00.stdout:9/711: mknod d0/d3d/d59/d4e/dba/d1e/dcb/cff 0 2026-03-10T12:38:03.044 INFO:tasks.workunit.client.0.vm00.stdout:9/712: chown d0/d3d/d59/d4e/c92 5172501 1 2026-03-10T12:38:03.044 INFO:tasks.workunit.client.0.vm00.stdout:9/713: 
creat d0/dc2/f100 x:0 0 0 2026-03-10T12:38:03.044 INFO:tasks.workunit.client.0.vm00.stdout:9/714: creat d0/d3d/d59/d4e/dba/d1e/d2b/f101 x:0 0 0 2026-03-10T12:38:03.044 INFO:tasks.workunit.client.0.vm00.stdout:9/715: creat d0/d3d/d59/d74/f102 x:0 0 0 2026-03-10T12:38:03.047 INFO:tasks.workunit.client.0.vm00.stdout:9/716: mknod d0/dc2/c103 0 2026-03-10T12:38:03.051 INFO:tasks.workunit.client.0.vm00.stdout:9/717: mkdir d0/d3d/d59/d4e/d104 0 2026-03-10T12:38:03.051 INFO:tasks.workunit.client.0.vm00.stdout:9/718: mknod d0/d3d/d59/d4e/dba/d1e/d85/d98/c105 0 2026-03-10T12:38:03.051 INFO:tasks.workunit.client.0.vm00.stdout:9/719: getdents d0/d7f/db8/dc4/db0 0 2026-03-10T12:38:03.051 INFO:tasks.workunit.client.1.vm07.stdout:3/585: sync 2026-03-10T12:38:03.054 INFO:tasks.workunit.client.0.vm00.stdout:9/720: mkdir d0/d7f/db8/dc4/d106 0 2026-03-10T12:38:03.056 INFO:tasks.workunit.client.0.vm00.stdout:1/666: dread da/d21/d27/f54 [0,4194304] 0 2026-03-10T12:38:03.059 INFO:tasks.workunit.client.0.vm00.stdout:2/681: dread d4/dd/db9/f4c [0,4194304] 0 2026-03-10T12:38:03.066 INFO:tasks.workunit.client.0.vm00.stdout:9/721: symlink d0/d3d/d59/d4e/dba/d1e/d85/de5/l107 0 2026-03-10T12:38:03.069 INFO:tasks.workunit.client.0.vm00.stdout:1/667: symlink da/d24/d28/d67/db0/le1 0 2026-03-10T12:38:03.069 INFO:tasks.workunit.client.1.vm07.stdout:3/586: dread dc/dd/f41 [4194304,4194304] 0 2026-03-10T12:38:03.072 INFO:tasks.workunit.client.0.vm00.stdout:1/668: symlink da/d24/d5a/dd9/le2 0 2026-03-10T12:38:03.082 INFO:tasks.workunit.client.0.vm00.stdout:1/669: chown da/d12/d91/cba 411209895 1 2026-03-10T12:38:03.082 INFO:tasks.workunit.client.0.vm00.stdout:1/670: getdents da/d21/d27/d6a 0 2026-03-10T12:38:03.082 INFO:tasks.workunit.client.0.vm00.stdout:1/671: symlink da/d12/d91/le3 0 2026-03-10T12:38:03.082 INFO:tasks.workunit.client.0.vm00.stdout:1/672: write da/d21/db3/d5d/d80/fcc [681158,95518] 0 2026-03-10T12:38:03.082 INFO:tasks.workunit.client.0.vm00.stdout:1/673: stat da/d21/db3/d5d/dab 0 
2026-03-10T12:38:03.082 INFO:tasks.workunit.client.0.vm00.stdout:1/674: chown da/d12/d91/cba 569 1 2026-03-10T12:38:03.088 INFO:tasks.workunit.client.0.vm00.stdout:1/675: dread da/d21/db3/fad [0,4194304] 0 2026-03-10T12:38:03.099 INFO:tasks.workunit.client.0.vm00.stdout:1/676: symlink da/d21/db3/d59/da6/da4/dda/dc0/le4 0 2026-03-10T12:38:03.099 INFO:tasks.workunit.client.0.vm00.stdout:1/677: dread - da/d24/d73/fc8 zero size 2026-03-10T12:38:03.099 INFO:tasks.workunit.client.0.vm00.stdout:1/678: symlink da/d21/db3/le5 0 2026-03-10T12:38:03.099 INFO:tasks.workunit.client.0.vm00.stdout:1/679: read - da/d21/d27/d6a/f6b zero size 2026-03-10T12:38:03.099 INFO:tasks.workunit.client.0.vm00.stdout:0/570: dwrite d3/db/f97 [0,4194304] 0 2026-03-10T12:38:03.099 INFO:tasks.workunit.client.0.vm00.stdout:1/680: fdatasync da/fd5 0 2026-03-10T12:38:03.099 INFO:tasks.workunit.client.0.vm00.stdout:1/681: chown da/d24/d28/d67/da2/d78/dbe 36978538 1 2026-03-10T12:38:03.099 INFO:tasks.workunit.client.0.vm00.stdout:1/682: chown da/d12/f64 7264 1 2026-03-10T12:38:03.100 INFO:tasks.workunit.client.0.vm00.stdout:1/683: symlink da/d24/d73/le6 0 2026-03-10T12:38:03.100 INFO:tasks.workunit.client.0.vm00.stdout:1/684: rmdir da/d21/db3/d59 39 2026-03-10T12:38:03.101 INFO:tasks.workunit.client.1.vm07.stdout:8/542: dwrite d1/f79 [4194304,4194304] 0 2026-03-10T12:38:03.101 INFO:tasks.workunit.client.0.vm00.stdout:1/685: fsync da/f13 0 2026-03-10T12:38:03.105 INFO:tasks.workunit.client.0.vm00.stdout:1/686: write da/d12/f99 [387736,121042] 0 2026-03-10T12:38:03.110 INFO:tasks.workunit.client.0.vm00.stdout:1/687: truncate da/d12/d26/f57 2638571 0 2026-03-10T12:38:03.111 INFO:tasks.workunit.client.0.vm00.stdout:2/682: sync 2026-03-10T12:38:03.111 INFO:tasks.workunit.client.0.vm00.stdout:0/571: sync 2026-03-10T12:38:03.112 INFO:tasks.workunit.client.1.vm07.stdout:8/543: unlink d1/f7 0 2026-03-10T12:38:03.114 INFO:tasks.workunit.client.0.vm00.stdout:0/572: readlink d3/d7/d4c/d5b/l32 0 
2026-03-10T12:38:03.114 INFO:tasks.workunit.client.0.vm00.stdout:0/573: dread - d3/d7/d4c/d5b/d38/d44/d5a/f7e zero size 2026-03-10T12:38:03.120 INFO:tasks.workunit.client.0.vm00.stdout:1/688: dread da/d21/db3/d5d/d80/f8a [0,4194304] 0 2026-03-10T12:38:03.124 INFO:tasks.workunit.client.0.vm00.stdout:1/689: dread da/d21/db3/f83 [0,4194304] 0 2026-03-10T12:38:03.130 INFO:tasks.workunit.client.1.vm07.stdout:8/544: creat d1/d3/d5d/d65/fad x:0 0 0 2026-03-10T12:38:03.133 INFO:tasks.workunit.client.0.vm00.stdout:1/690: symlink da/d24/d28/d67/da2/d78/dbe/le7 0 2026-03-10T12:38:03.137 INFO:tasks.workunit.client.0.vm00.stdout:0/574: creat d3/d7/d4c/d5b/d38/fbf x:0 0 0 2026-03-10T12:38:03.141 INFO:tasks.workunit.client.0.vm00.stdout:0/575: dwrite d3/d7/d4c/d5b/d38/db3/fbe [0,4194304] 0 2026-03-10T12:38:03.184 INFO:tasks.workunit.client.0.vm00.stdout:9/722: dread d0/d5/dc/f2a [0,4194304] 0 2026-03-10T12:38:03.194 INFO:tasks.workunit.client.0.vm00.stdout:8/536: write d0/d46/d6e/f7b [951009,71471] 0 2026-03-10T12:38:03.194 INFO:tasks.workunit.client.0.vm00.stdout:8/537: dread - d0/d93/f8f zero size 2026-03-10T12:38:03.198 INFO:tasks.workunit.client.0.vm00.stdout:8/538: symlink d0/d93/d36/d5b/laa 0 2026-03-10T12:38:03.199 INFO:tasks.workunit.client.0.vm00.stdout:9/723: sync 2026-03-10T12:38:03.199 INFO:tasks.workunit.client.0.vm00.stdout:9/724: readlink d0/d5/l69 0 2026-03-10T12:38:03.207 INFO:tasks.workunit.client.0.vm00.stdout:8/539: dread d0/d93/d36/f39 [0,4194304] 0 2026-03-10T12:38:03.209 INFO:tasks.workunit.client.0.vm00.stdout:8/540: dwrite d0/d93/d17/f63 [0,4194304] 0 2026-03-10T12:38:03.215 INFO:tasks.workunit.client.0.vm00.stdout:5/727: dwrite d1f/d26/d2e/f3c [0,4194304] 0 2026-03-10T12:38:03.218 INFO:tasks.workunit.client.0.vm00.stdout:3/703: write dd/d64/f7b [1261846,4267] 0 2026-03-10T12:38:03.224 INFO:tasks.workunit.client.0.vm00.stdout:8/541: symlink d0/d93/d17/d48/lab 0 2026-03-10T12:38:03.228 INFO:tasks.workunit.client.0.vm00.stdout:5/728: dwrite 
d1f/d26/d2b/d35/d53/d5b/f6e [0,4194304] 0 2026-03-10T12:38:03.229 INFO:tasks.workunit.client.0.vm00.stdout:3/704: fdatasync dd/d27/f44 0 2026-03-10T12:38:03.232 INFO:tasks.workunit.client.0.vm00.stdout:0/576: dread d3/d7/d4c/d5b/d38/f81 [0,4194304] 0 2026-03-10T12:38:03.232 INFO:tasks.workunit.client.0.vm00.stdout:0/577: readlink d3/d7/d4c/d5b/d38/l51 0 2026-03-10T12:38:03.233 INFO:tasks.workunit.client.0.vm00.stdout:0/578: chown d3/d22/f55 28431 1 2026-03-10T12:38:03.236 INFO:tasks.workunit.client.0.vm00.stdout:3/705: dwrite dd/d2a/da2/de1/fcd [0,4194304] 0 2026-03-10T12:38:03.237 INFO:tasks.workunit.client.0.vm00.stdout:0/579: link d3/d7/d4c/f96 d3/d40/d65/fc0 0 2026-03-10T12:38:03.240 INFO:tasks.workunit.client.0.vm00.stdout:0/580: unlink d3/d7/f31 0 2026-03-10T12:38:03.241 INFO:tasks.workunit.client.0.vm00.stdout:0/581: stat d3/d7/d4c/d9d 0 2026-03-10T12:38:03.245 INFO:tasks.workunit.client.0.vm00.stdout:8/542: link d0/d93/d60/c7a d0/d58/cac 0 2026-03-10T12:38:03.247 INFO:tasks.workunit.client.0.vm00.stdout:5/729: dread d1f/d6a/f84 [0,4194304] 0 2026-03-10T12:38:03.249 INFO:tasks.workunit.client.0.vm00.stdout:8/543: symlink d0/d58/lad 0 2026-03-10T12:38:03.252 INFO:tasks.workunit.client.0.vm00.stdout:8/544: creat d0/d93/d2d/d49/fae x:0 0 0 2026-03-10T12:38:03.257 INFO:tasks.workunit.client.0.vm00.stdout:8/545: dwrite d0/f8 [0,4194304] 0 2026-03-10T12:38:03.257 INFO:tasks.workunit.client.0.vm00.stdout:7/482: dwrite da/f10 [4194304,4194304] 0 2026-03-10T12:38:03.261 INFO:tasks.workunit.client.0.vm00.stdout:7/483: dread - da/d3f/d60/fb1 zero size 2026-03-10T12:38:03.263 INFO:tasks.workunit.client.0.vm00.stdout:7/484: chown da/d41/d7b 13707 1 2026-03-10T12:38:03.263 INFO:tasks.workunit.client.0.vm00.stdout:7/485: truncate da/d26/d37/d56/f9a 855122 0 2026-03-10T12:38:03.265 INFO:tasks.workunit.client.0.vm00.stdout:5/730: mkdir d1f/d26/d2b/d37/d105 0 2026-03-10T12:38:03.265 INFO:tasks.workunit.client.0.vm00.stdout:5/731: chown d1f/f27 0 1 2026-03-10T12:38:03.266 
INFO:tasks.workunit.client.0.vm00.stdout:7/486: truncate da/d26/d50/d73/d89/fa9 426727 0 2026-03-10T12:38:03.288 INFO:tasks.workunit.client.0.vm00.stdout:8/546: unlink d0/d93/d2d/f33 0 2026-03-10T12:38:03.289 INFO:tasks.workunit.client.0.vm00.stdout:7/487: dread da/d26/f97 [0,4194304] 0 2026-03-10T12:38:03.289 INFO:tasks.workunit.client.0.vm00.stdout:7/488: write da/d41/d48/fae [1009466,82027] 0 2026-03-10T12:38:03.294 INFO:tasks.workunit.client.0.vm00.stdout:4/688: truncate df/f3d 452662 0 2026-03-10T12:38:03.296 INFO:tasks.workunit.client.0.vm00.stdout:7/489: dread da/d26/d37/f79 [0,4194304] 0 2026-03-10T12:38:03.299 INFO:tasks.workunit.client.0.vm00.stdout:3/706: dread dd/d3d/d65/f90 [0,4194304] 0 2026-03-10T12:38:03.299 INFO:tasks.workunit.client.0.vm00.stdout:3/707: dread - dd/d2a/da2/db4/fe8 zero size 2026-03-10T12:38:03.303 INFO:tasks.workunit.client.0.vm00.stdout:8/547: creat d0/d46/d6e/d9b/faf x:0 0 0 2026-03-10T12:38:03.306 INFO:tasks.workunit.client.0.vm00.stdout:8/548: dwrite d0/d93/d36/f41 [0,4194304] 0 2026-03-10T12:38:03.310 INFO:tasks.workunit.client.0.vm00.stdout:5/732: truncate d1f/d26/d2b/d35/f50 3523813 0 2026-03-10T12:38:03.313 INFO:tasks.workunit.client.0.vm00.stdout:5/733: read f11 [1641818,113890] 0 2026-03-10T12:38:03.320 INFO:tasks.workunit.client.0.vm00.stdout:8/549: unlink d0/dd/f9f 0 2026-03-10T12:38:03.323 INFO:tasks.workunit.client.1.vm07.stdout:7/531: dwrite d0/f40 [0,4194304] 0 2026-03-10T12:38:03.324 INFO:tasks.workunit.client.0.vm00.stdout:7/490: dread da/d1b/f22 [0,4194304] 0 2026-03-10T12:38:03.327 INFO:tasks.workunit.client.0.vm00.stdout:3/708: fsync dd/d4e/d5d/f71 0 2026-03-10T12:38:03.327 INFO:tasks.workunit.client.1.vm07.stdout:9/613: dwrite d5/d16/d23/d26/f5c [0,4194304] 0 2026-03-10T12:38:03.336 INFO:tasks.workunit.client.0.vm00.stdout:4/689: rename df/d1f/d22/d26/d65/d91/lb2 to df/d1f/d36/d3a/le3 0 2026-03-10T12:38:03.346 INFO:tasks.workunit.client.1.vm07.stdout:9/614: symlink d5/d13/d6c/ld6 0 2026-03-10T12:38:03.346 
INFO:tasks.workunit.client.0.vm00.stdout:4/690: chown df/d6c/c98 6 1 2026-03-10T12:38:03.346 INFO:tasks.workunit.client.0.vm00.stdout:5/734: mkdir d1f/d6a/d94/dc9/d106 0 2026-03-10T12:38:03.346 INFO:tasks.workunit.client.0.vm00.stdout:3/709: mknod dd/d3d/d8a/de0/d55/cee 0 2026-03-10T12:38:03.347 INFO:tasks.workunit.client.0.vm00.stdout:8/550: rename d0/d93/d2d/d49/f84 to d0/d93/d36/d7d/fb0 0 2026-03-10T12:38:03.349 INFO:tasks.workunit.client.0.vm00.stdout:7/491: dread da/d1b/d40/f74 [0,4194304] 0 2026-03-10T12:38:03.356 INFO:tasks.workunit.client.1.vm07.stdout:9/615: dread d5/fd4 [0,4194304] 0 2026-03-10T12:38:03.360 INFO:tasks.workunit.client.1.vm07.stdout:9/616: fsync d5/d13/d57/d3e/fa9 0 2026-03-10T12:38:03.367 INFO:tasks.workunit.client.0.vm00.stdout:5/735: rename d1f/d26/d2b/d35/d53/dd6/cec to d1f/d26/de3/db7/c107 0 2026-03-10T12:38:03.372 INFO:tasks.workunit.client.0.vm00.stdout:7/492: rename da/d1b/l45 to da/d3f/d60/lb2 0 2026-03-10T12:38:03.372 INFO:tasks.workunit.client.0.vm00.stdout:7/493: truncate da/f16 1858687 0 2026-03-10T12:38:03.373 INFO:tasks.workunit.client.0.vm00.stdout:7/494: dread da/d26/f97 [0,4194304] 0 2026-03-10T12:38:03.375 INFO:tasks.workunit.client.0.vm00.stdout:7/495: creat da/d47/d87/fb3 x:0 0 0 2026-03-10T12:38:03.380 INFO:tasks.workunit.client.0.vm00.stdout:2/683: dwrite d4/dd/f3c [0,4194304] 0 2026-03-10T12:38:03.381 INFO:tasks.workunit.client.0.vm00.stdout:3/710: mkdir dd/d27/d2c/def 0 2026-03-10T12:38:03.382 INFO:tasks.workunit.client.0.vm00.stdout:2/684: write d4/dd/f62 [1414323,44169] 0 2026-03-10T12:38:03.382 INFO:tasks.workunit.client.0.vm00.stdout:3/711: fdatasync dd/d2a/da2/de1/d38/f48 0 2026-03-10T12:38:03.382 INFO:tasks.workunit.client.0.vm00.stdout:3/712: write f7 [8609491,21710] 0 2026-03-10T12:38:03.394 INFO:tasks.workunit.client.0.vm00.stdout:5/736: sync 2026-03-10T12:38:03.396 INFO:tasks.workunit.client.0.vm00.stdout:1/691: write da/d24/d28/d67/f5b [701560,13301] 0 2026-03-10T12:38:03.397 
INFO:tasks.workunit.client.0.vm00.stdout:5/737: dwrite d1f/d26/d2b/d35/f68 [0,4194304] 0 2026-03-10T12:38:03.397 INFO:tasks.workunit.client.0.vm00.stdout:1/692: write da/d21/d27/d6a/f6d [252256,2304] 0 2026-03-10T12:38:03.398 INFO:tasks.workunit.client.0.vm00.stdout:1/693: readlink da/d21/d39/l3b 0 2026-03-10T12:38:03.402 INFO:tasks.workunit.client.0.vm00.stdout:1/694: dread da/d21/d27/f54 [0,4194304] 0 2026-03-10T12:38:03.406 INFO:tasks.workunit.client.0.vm00.stdout:1/695: stat da/d12/d91/le3 0 2026-03-10T12:38:03.421 INFO:tasks.workunit.client.0.vm00.stdout:9/725: write d0/d5/dc/f2a [3846920,35147] 0 2026-03-10T12:38:03.423 INFO:tasks.workunit.client.0.vm00.stdout:4/691: getdents df/d93/d9e 0 2026-03-10T12:38:03.441 INFO:tasks.workunit.client.0.vm00.stdout:5/738: rename d1f/d26/d2b/d35/d78/d99/lf3 to d1f/d26/d2b/d37/da4/l108 0 2026-03-10T12:38:03.441 INFO:tasks.workunit.client.0.vm00.stdout:5/739: dwrite d1f/d26/d2b/d35/d53/d72/fa0 [0,4194304] 0 2026-03-10T12:38:03.452 INFO:tasks.workunit.client.0.vm00.stdout:7/496: dread da/d1b/d40/f5c [0,4194304] 0 2026-03-10T12:38:03.457 INFO:tasks.workunit.client.0.vm00.stdout:3/713: symlink dd/d2a/da2/de1/lf0 0 2026-03-10T12:38:03.466 INFO:tasks.workunit.client.0.vm00.stdout:3/714: chown dd/d3d/d8a/de0/d55/l97 5826 1 2026-03-10T12:38:03.467 INFO:tasks.workunit.client.0.vm00.stdout:4/692: mkdir df/d1f/d36/d3a/d41/de4 0 2026-03-10T12:38:03.467 INFO:tasks.workunit.client.0.vm00.stdout:4/693: chown df/d1f/d36/d3a/d41/l27 18129 1 2026-03-10T12:38:03.467 INFO:tasks.workunit.client.0.vm00.stdout:5/740: dwrite d1f/f97 [4194304,4194304] 0 2026-03-10T12:38:03.467 INFO:tasks.workunit.client.0.vm00.stdout:5/741: dread d1f/d26/d2b/d35/d53/d72/fa0 [0,4194304] 0 2026-03-10T12:38:03.473 INFO:tasks.workunit.client.0.vm00.stdout:3/715: unlink dd/d18/l94 0 2026-03-10T12:38:03.477 INFO:tasks.workunit.client.0.vm00.stdout:4/694: dread - df/d1f/d22/d26/d65/d91/db9/fd0 zero size 2026-03-10T12:38:03.477 
INFO:tasks.workunit.client.0.vm00.stdout:3/716: chown dd/c1b 22467849 1 2026-03-10T12:38:03.477 INFO:tasks.workunit.client.0.vm00.stdout:3/717: fdatasync dd/d2a/da2/de1/d38/f48 0 2026-03-10T12:38:03.477 INFO:tasks.workunit.client.0.vm00.stdout:2/685: getdents d4/dd/da7 0 2026-03-10T12:38:03.478 INFO:tasks.workunit.client.0.vm00.stdout:5/742: creat d1f/d6a/d94/dc9/d106/f109 x:0 0 0 2026-03-10T12:38:03.479 INFO:tasks.workunit.client.0.vm00.stdout:7/497: sync 2026-03-10T12:38:03.482 INFO:tasks.workunit.client.0.vm00.stdout:7/498: symlink da/d1b/d40/lb4 0 2026-03-10T12:38:03.483 INFO:tasks.workunit.client.0.vm00.stdout:4/695: getdents df/d1f/d22/d26/d65/d91/db9 0 2026-03-10T12:38:03.485 INFO:tasks.workunit.client.0.vm00.stdout:4/696: rename df/d1f/ccc to df/d93/dbc/ce5 0 2026-03-10T12:38:03.486 INFO:tasks.workunit.client.0.vm00.stdout:4/697: creat df/d57/fe6 x:0 0 0 2026-03-10T12:38:03.486 INFO:tasks.workunit.client.0.vm00.stdout:4/698: dread - df/d1f/d22/f5a zero size 2026-03-10T12:38:03.492 INFO:tasks.workunit.client.0.vm00.stdout:4/699: stat df/f42 0 2026-03-10T12:38:03.492 INFO:tasks.workunit.client.0.vm00.stdout:4/700: creat df/d32/d76/fe7 x:0 0 0 2026-03-10T12:38:03.492 INFO:tasks.workunit.client.0.vm00.stdout:4/701: dread - df/d1f/d22/d26/d65/da7/fde zero size 2026-03-10T12:38:03.492 INFO:tasks.workunit.client.0.vm00.stdout:4/702: rename f8 to df/d63/d77/fe8 0 2026-03-10T12:38:03.492 INFO:tasks.workunit.client.0.vm00.stdout:4/703: write df/d1f/d36/faa [405812,21297] 0 2026-03-10T12:38:03.495 INFO:tasks.workunit.client.1.vm07.stdout:6/524: write d1/d4/d6/f7c [880881,11990] 0 2026-03-10T12:38:03.496 INFO:tasks.workunit.client.0.vm00.stdout:9/726: read d0/dc2/f87 [706169,32715] 0 2026-03-10T12:38:03.500 INFO:tasks.workunit.client.1.vm07.stdout:2/449: dwrite d0/d42/d26/d38/f3d [0,4194304] 0 2026-03-10T12:38:03.511 INFO:tasks.workunit.client.0.vm00.stdout:9/727: mknod d0/d3d/d59/d4e/dba/d19/c108 0 2026-03-10T12:38:03.524 
INFO:tasks.workunit.client.1.vm07.stdout:5/576: rename d0/d22/d18/d19/d2e/d3f to d0/d22/d18/d19/d21/d54/dcb 0 2026-03-10T12:38:03.528 INFO:tasks.workunit.client.1.vm07.stdout:0/604: rename d0/f15 to d0/d14/d5f/d76/d2f/d31/d79/d85/fc6 0 2026-03-10T12:38:03.528 INFO:tasks.workunit.client.1.vm07.stdout:0/605: chown d0/d14/d5f/d3b/f6c 0 1 2026-03-10T12:38:03.529 INFO:tasks.workunit.client.1.vm07.stdout:5/577: rmdir d0/d22/d18/d3e/d5d 39 2026-03-10T12:38:03.529 INFO:tasks.workunit.client.1.vm07.stdout:5/578: readlink d0/d22/d18/d3e/l40 0 2026-03-10T12:38:03.531 INFO:tasks.workunit.client.1.vm07.stdout:1/533: rename d9/df/d29/d2b/d3d/f43 to d9/df/d55/d9f/fb3 0 2026-03-10T12:38:03.531 INFO:tasks.workunit.client.1.vm07.stdout:2/450: read d0/d42/f2c [2272337,22229] 0 2026-03-10T12:38:03.531 INFO:tasks.workunit.client.0.vm00.stdout:7/499: read da/d25/f2b [7500517,89992] 0 2026-03-10T12:38:03.537 INFO:tasks.workunit.client.0.vm00.stdout:7/500: dwrite da/f17 [0,4194304] 0 2026-03-10T12:38:03.545 INFO:tasks.workunit.client.1.vm07.stdout:0/606: dread d0/d14/d5f/d76/f3d [4194304,4194304] 0 2026-03-10T12:38:03.547 INFO:tasks.workunit.client.1.vm07.stdout:1/534: symlink d9/d2d/d4f/d75/d77/lb4 0 2026-03-10T12:38:03.548 INFO:tasks.workunit.client.1.vm07.stdout:1/535: write d9/d2d/d4f/d75/f83 [5072557,7092] 0 2026-03-10T12:38:03.562 INFO:tasks.workunit.client.1.vm07.stdout:5/579: mkdir d0/d22/d18/d19/d72/dcc 0 2026-03-10T12:38:03.563 INFO:tasks.workunit.client.1.vm07.stdout:5/580: readlink d0/l32 0 2026-03-10T12:38:03.571 INFO:tasks.workunit.client.1.vm07.stdout:0/607: creat d0/d14/d5f/d76/d8e/fc7 x:0 0 0 2026-03-10T12:38:03.571 INFO:tasks.workunit.client.0.vm00.stdout:8/551: write d0/d93/d17/d48/f87 [2371219,120806] 0 2026-03-10T12:38:03.572 INFO:tasks.workunit.client.0.vm00.stdout:8/552: readlink d0/l73 0 2026-03-10T12:38:03.572 INFO:tasks.workunit.client.1.vm07.stdout:0/608: write d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/fbf [554711,32203] 0 2026-03-10T12:38:03.573 
INFO:tasks.workunit.client.1.vm07.stdout:0/609: chown d0/d14/d5f/d76/f78 0 1 2026-03-10T12:38:03.576 INFO:tasks.workunit.client.0.vm00.stdout:1/696: dwrite da/d24/d28/f37 [0,4194304] 0 2026-03-10T12:38:03.580 INFO:tasks.workunit.client.0.vm00.stdout:1/697: dwrite da/fe0 [0,4194304] 0 2026-03-10T12:38:03.596 INFO:tasks.workunit.client.0.vm00.stdout:8/553: dread d0/d93/d2d/f55 [0,4194304] 0 2026-03-10T12:38:03.601 INFO:tasks.workunit.client.0.vm00.stdout:8/554: mkdir d0/d93/d17/db1 0 2026-03-10T12:38:03.603 INFO:tasks.workunit.client.0.vm00.stdout:8/555: dread d0/f8 [0,4194304] 0 2026-03-10T12:38:03.603 INFO:tasks.workunit.client.0.vm00.stdout:1/698: creat da/d21/d27/fe8 x:0 0 0 2026-03-10T12:38:03.605 INFO:tasks.workunit.client.1.vm07.stdout:1/536: unlink d9/df/d29/l9e 0 2026-03-10T12:38:03.609 INFO:tasks.workunit.client.0.vm00.stdout:8/556: creat d0/d93/d17/fb2 x:0 0 0 2026-03-10T12:38:03.615 INFO:tasks.workunit.client.0.vm00.stdout:2/686: write d4/f67 [954273,101140] 0 2026-03-10T12:38:03.623 INFO:tasks.workunit.client.1.vm07.stdout:9/617: truncate d5/d13/d57/d4f/d6a/fba 832671 0 2026-03-10T12:38:03.623 INFO:tasks.workunit.client.1.vm07.stdout:9/618: chown d5/d13/d57/d4f/d6a/f8a 15 1 2026-03-10T12:38:03.623 INFO:tasks.workunit.client.0.vm00.stdout:3/718: dwrite dd/d3d/d84/f8c [0,4194304] 0 2026-03-10T12:38:03.623 INFO:tasks.workunit.client.0.vm00.stdout:5/743: dwrite d1f/d6a/d94/dc3/fd9 [0,4194304] 0 2026-03-10T12:38:03.627 INFO:tasks.workunit.client.0.vm00.stdout:1/699: sync 2026-03-10T12:38:03.627 INFO:tasks.workunit.client.0.vm00.stdout:2/687: sync 2026-03-10T12:38:03.631 INFO:tasks.workunit.client.0.vm00.stdout:1/700: dwrite da/d21/db3/d5d/fdc [0,4194304] 0 2026-03-10T12:38:03.642 INFO:tasks.workunit.client.0.vm00.stdout:5/744: dread d1f/d26/d2e/fba [0,4194304] 0 2026-03-10T12:38:03.645 INFO:tasks.workunit.client.0.vm00.stdout:2/688: mkdir d4/d6/de7 0 2026-03-10T12:38:03.650 INFO:tasks.workunit.client.1.vm07.stdout:4/666: write d0/d4/d10/d5f/f63 [10788,45105] 
0 2026-03-10T12:38:03.663 INFO:tasks.workunit.client.0.vm00.stdout:3/719: dread f7 [0,4194304] 0 2026-03-10T12:38:03.663 INFO:tasks.workunit.client.0.vm00.stdout:5/745: fdatasync d1f/d26/d2e/f8c 0 2026-03-10T12:38:03.663 INFO:tasks.workunit.client.0.vm00.stdout:9/728: write d0/d3d/d59/d4e/dba/d1e/d27/f9e [785313,68333] 0 2026-03-10T12:38:03.663 INFO:tasks.workunit.client.0.vm00.stdout:9/729: write d0/d3d/d43/d53/fa5 [1341539,42716] 0 2026-03-10T12:38:03.663 INFO:tasks.workunit.client.0.vm00.stdout:6/453: link d2/d51/l53 d2/d14/d7a/la0 0 2026-03-10T12:38:03.663 INFO:tasks.workunit.client.0.vm00.stdout:6/454: readlink d2/d14/d7a/la0 0 2026-03-10T12:38:03.663 INFO:tasks.workunit.client.0.vm00.stdout:7/501: link da/d3f/l57 da/d26/lb5 0 2026-03-10T12:38:03.663 INFO:tasks.workunit.client.0.vm00.stdout:9/730: creat d0/d3d/d59/d4e/dba/d19/f109 x:0 0 0 2026-03-10T12:38:03.665 INFO:tasks.workunit.client.1.vm07.stdout:3/587: dwrite dc/dd/d1f/d45/f54 [0,4194304] 0 2026-03-10T12:38:03.673 INFO:tasks.workunit.client.0.vm00.stdout:5/746: dread d1f/d26/f9f [0,4194304] 0 2026-03-10T12:38:03.673 INFO:tasks.workunit.client.0.vm00.stdout:5/747: chown d1f/d26/d2b/d35/d53/d5b/f6e 501 1 2026-03-10T12:38:03.675 INFO:tasks.workunit.client.0.vm00.stdout:5/748: link c10 d1f/d26/d2e/d58/d6b/c10a 0 2026-03-10T12:38:03.681 INFO:tasks.workunit.client.0.vm00.stdout:5/749: link d1f/d26/f48 d1f/d26/d2e/f10b 0 2026-03-10T12:38:03.681 INFO:tasks.workunit.client.1.vm07.stdout:1/537: truncate d9/df/d29/d2b/d3d/f85 3487848 0 2026-03-10T12:38:03.681 INFO:tasks.workunit.client.0.vm00.stdout:3/720: sync 2026-03-10T12:38:03.681 INFO:tasks.workunit.client.0.vm00.stdout:9/731: sync 2026-03-10T12:38:03.682 INFO:tasks.workunit.client.1.vm07.stdout:8/545: write d1/d3/d11/f43 [1786248,35690] 0 2026-03-10T12:38:03.682 INFO:tasks.workunit.client.0.vm00.stdout:3/721: dread - dd/d2a/da2/db4/fdb zero size 2026-03-10T12:38:03.684 INFO:tasks.workunit.client.0.vm00.stdout:7/502: mkdir da/d1b/d40/db6 0 
2026-03-10T12:38:03.689 INFO:tasks.workunit.client.0.vm00.stdout:9/732: rmdir d0/d3d/d59/d4e/dba/d19/d50 39 2026-03-10T12:38:03.693 INFO:tasks.workunit.client.0.vm00.stdout:7/503: dread da/d41/d48/fae [0,4194304] 0 2026-03-10T12:38:03.695 INFO:tasks.workunit.client.0.vm00.stdout:9/733: creat d0/dc2/f10a x:0 0 0 2026-03-10T12:38:03.698 INFO:tasks.workunit.client.0.vm00.stdout:4/704: dwrite df/f3d [0,4194304] 0 2026-03-10T12:38:03.700 INFO:tasks.workunit.client.0.vm00.stdout:4/705: read df/d1f/d36/f51 [2535423,98214] 0 2026-03-10T12:38:03.701 INFO:tasks.workunit.client.1.vm07.stdout:4/667: creat d0/d8e/fec x:0 0 0 2026-03-10T12:38:03.706 INFO:tasks.workunit.client.0.vm00.stdout:9/734: creat d0/d3d/d59/d4e/dba/d19/d50/f10b x:0 0 0 2026-03-10T12:38:03.706 INFO:tasks.workunit.client.0.vm00.stdout:0/582: truncate d3/f4 3489446 0 2026-03-10T12:38:03.707 INFO:tasks.workunit.client.0.vm00.stdout:9/735: truncate d0/d3d/d59/d74/f102 580712 0 2026-03-10T12:38:03.709 INFO:tasks.workunit.client.0.vm00.stdout:4/706: dread df/d1f/d22/d26/f39 [0,4194304] 0 2026-03-10T12:38:03.715 INFO:tasks.workunit.client.0.vm00.stdout:4/707: dwrite df/d1f/d36/d3a/d41/fc7 [0,4194304] 0 2026-03-10T12:38:03.722 INFO:tasks.workunit.client.0.vm00.stdout:4/708: rmdir df/d1f/d22/d26/d65/d91/db9 39 2026-03-10T12:38:03.731 INFO:tasks.workunit.client.1.vm07.stdout:5/581: mknod d0/d22/d18/d30/ccd 0 2026-03-10T12:38:03.737 INFO:tasks.workunit.client.1.vm07.stdout:7/532: dwrite d0/f7b [0,4194304] 0 2026-03-10T12:38:03.744 INFO:tasks.workunit.client.1.vm07.stdout:3/588: rmdir dc/dd/d43/d76/d95 39 2026-03-10T12:38:03.754 INFO:tasks.workunit.client.0.vm00.stdout:4/709: dread df/d1f/d22/d26/dab/d73/f8b [0,4194304] 0 2026-03-10T12:38:03.755 INFO:tasks.workunit.client.0.vm00.stdout:4/710: readlink df/lbf 0 2026-03-10T12:38:03.756 INFO:tasks.workunit.client.1.vm07.stdout:3/589: dwrite dc/dd/d43/d5c/fa9 [0,4194304] 0 2026-03-10T12:38:03.780 INFO:tasks.workunit.client.0.vm00.stdout:1/701: dwrite 
da/d24/d28/d67/da2/d78/f86 [0,4194304] 0 2026-03-10T12:38:03.781 INFO:tasks.workunit.client.0.vm00.stdout:2/689: dwrite d4/d53/d68/fb1 [0,4194304] 0 2026-03-10T12:38:03.793 INFO:tasks.workunit.client.0.vm00.stdout:8/557: dread d0/dd/f9a [0,4194304] 0 2026-03-10T12:38:03.796 INFO:tasks.workunit.client.0.vm00.stdout:8/558: dwrite d0/d5c/fa0 [0,4194304] 0 2026-03-10T12:38:03.796 INFO:tasks.workunit.client.0.vm00.stdout:8/559: chown d0/c66 3 1 2026-03-10T12:38:03.800 INFO:tasks.workunit.client.0.vm00.stdout:8/560: dwrite d0/d93/fa5 [0,4194304] 0 2026-03-10T12:38:03.816 INFO:tasks.workunit.client.0.vm00.stdout:5/750: write d1f/d26/d2b/d35/d78/fc7 [501594,20611] 0 2026-03-10T12:38:03.817 INFO:tasks.workunit.client.0.vm00.stdout:8/561: symlink d0/d93/d17/d48/lb3 0 2026-03-10T12:38:03.819 INFO:tasks.workunit.client.1.vm07.stdout:2/451: write d0/f46 [1709085,56400] 0 2026-03-10T12:38:03.819 INFO:tasks.workunit.client.0.vm00.stdout:5/751: dread d1f/d26/d2b/d35/d78/d7f/fb9 [0,4194304] 0 2026-03-10T12:38:03.823 INFO:tasks.workunit.client.0.vm00.stdout:8/562: dread d0/d46/d6e/f7b [0,4194304] 0 2026-03-10T12:38:03.823 INFO:tasks.workunit.client.1.vm07.stdout:4/668: mknod d0/d4/d5/da/ced 0 2026-03-10T12:38:03.824 INFO:tasks.workunit.client.1.vm07.stdout:4/669: chown d0/d4/d7a/fcd 61 1 2026-03-10T12:38:03.831 INFO:tasks.workunit.client.0.vm00.stdout:2/690: rmdir d4/d53/d68/dc2/dd9/dde 0 2026-03-10T12:38:03.835 INFO:tasks.workunit.client.1.vm07.stdout:5/582: creat d0/d22/d18/d3e/d53/fce x:0 0 0 2026-03-10T12:38:03.835 INFO:tasks.workunit.client.0.vm00.stdout:2/691: dwrite d4/d6/d2d/d3a/fbd [0,4194304] 0 2026-03-10T12:38:03.836 INFO:tasks.workunit.client.1.vm07.stdout:7/533: fsync d0/f5f 0 2026-03-10T12:38:03.837 INFO:tasks.workunit.client.0.vm00.stdout:3/722: dwrite dd/d2a/da2/de1/d45/f47 [0,4194304] 0 2026-03-10T12:38:03.843 INFO:tasks.workunit.client.1.vm07.stdout:3/590: dread dc/dd/d28/d3b/fc1 [0,4194304] 0 2026-03-10T12:38:03.843 INFO:tasks.workunit.client.1.vm07.stdout:3/591: 
chown dc/d18/d24/l63 22463338 1 2026-03-10T12:38:03.844 INFO:tasks.workunit.client.1.vm07.stdout:3/592: stat dc/dd/d1f/f6d 0 2026-03-10T12:38:03.846 INFO:tasks.workunit.client.1.vm07.stdout:4/670: creat d0/d4/d5/da/fee x:0 0 0 2026-03-10T12:38:03.846 INFO:tasks.workunit.client.1.vm07.stdout:4/671: write d0/d4/d5/da/f15 [481836,16756] 0 2026-03-10T12:38:03.852 INFO:tasks.workunit.client.0.vm00.stdout:1/702: rename da/d21/db3/d5d/d72/d7e/dbf/lcd to da/d24/le9 0 2026-03-10T12:38:03.852 INFO:tasks.workunit.client.0.vm00.stdout:8/563: mknod d0/d93/d36/cb4 0 2026-03-10T12:38:03.853 INFO:tasks.workunit.client.0.vm00.stdout:1/703: creat da/d24/d28/d67/da2/fea x:0 0 0 2026-03-10T12:38:03.854 INFO:tasks.workunit.client.0.vm00.stdout:8/564: unlink d0/l1 0 2026-03-10T12:38:03.856 INFO:tasks.workunit.client.0.vm00.stdout:8/565: symlink d0/d93/d60/lb5 0 2026-03-10T12:38:03.857 INFO:tasks.workunit.client.0.vm00.stdout:3/723: creat dd/ff1 x:0 0 0 2026-03-10T12:38:03.858 INFO:tasks.workunit.client.0.vm00.stdout:1/704: creat da/d12/d26/dd2/ddf/feb x:0 0 0 2026-03-10T12:38:03.859 INFO:tasks.workunit.client.0.vm00.stdout:1/705: creat da/d12/d26/fec x:0 0 0 2026-03-10T12:38:03.880 INFO:tasks.workunit.client.1.vm07.stdout:8/546: rmdir d1/d3/d6/d50/d70 39 2026-03-10T12:38:03.880 INFO:tasks.workunit.client.0.vm00.stdout:1/706: chown da/l16 377757433 1 2026-03-10T12:38:03.880 INFO:tasks.workunit.client.0.vm00.stdout:8/566: link d0/d93/d17/d48/f4c d0/d46/d89/fb6 0 2026-03-10T12:38:03.880 INFO:tasks.workunit.client.0.vm00.stdout:1/707: creat da/d24/d28/d67/fed x:0 0 0 2026-03-10T12:38:03.880 INFO:tasks.workunit.client.0.vm00.stdout:1/708: dread - da/d24/d73/fc8 zero size 2026-03-10T12:38:03.880 INFO:tasks.workunit.client.0.vm00.stdout:8/567: symlink d0/d93/d17/db1/lb7 0 2026-03-10T12:38:03.880 INFO:tasks.workunit.client.0.vm00.stdout:8/568: read - d0/dd/f9e zero size 2026-03-10T12:38:03.880 INFO:tasks.workunit.client.0.vm00.stdout:3/724: dwrite dd/d18/f7c [0,4194304] 0 
2026-03-10T12:38:03.880 INFO:tasks.workunit.client.0.vm00.stdout:1/709: mknod da/d12/db4/cee 0 2026-03-10T12:38:03.880 INFO:tasks.workunit.client.0.vm00.stdout:8/569: mkdir d0/d93/d36/db8 0 2026-03-10T12:38:03.880 INFO:tasks.workunit.client.0.vm00.stdout:3/725: fdatasync f7 0 2026-03-10T12:38:03.880 INFO:tasks.workunit.client.0.vm00.stdout:8/570: write d0/d93/d17/d48/f4c [2504516,92709] 0 2026-03-10T12:38:03.880 INFO:tasks.workunit.client.0.vm00.stdout:3/726: dread - dd/d18/d13/d99/da5/fdf zero size 2026-03-10T12:38:03.880 INFO:tasks.workunit.client.0.vm00.stdout:8/571: symlink d0/dd/d38/lb9 0 2026-03-10T12:38:03.881 INFO:tasks.workunit.client.0.vm00.stdout:1/710: link da/d24/d28/d67/da2/d78/c96 da/d24/d28/d67/da2/d78/cef 0 2026-03-10T12:38:03.882 INFO:tasks.workunit.client.0.vm00.stdout:1/711: dread da/d21/db3/f7a [0,4194304] 0 2026-03-10T12:38:03.884 INFO:tasks.workunit.client.0.vm00.stdout:1/712: readlink da/d24/d5a/la9 0 2026-03-10T12:38:03.886 INFO:tasks.workunit.client.0.vm00.stdout:6/455: dread d2/f68 [0,4194304] 0 2026-03-10T12:38:03.887 INFO:tasks.workunit.client.1.vm07.stdout:2/452: dread - d0/d42/d1f/f84 zero size 2026-03-10T12:38:03.887 INFO:tasks.workunit.client.1.vm07.stdout:2/453: stat d0/d42/d4e/d77/d70/f8a 0 2026-03-10T12:38:03.888 INFO:tasks.workunit.client.1.vm07.stdout:5/583: mkdir d0/d22/d18/d3e/d5d/dcf 0 2026-03-10T12:38:03.890 INFO:tasks.workunit.client.0.vm00.stdout:8/572: dread d0/d93/d2d/f44 [0,4194304] 0 2026-03-10T12:38:03.890 INFO:tasks.workunit.client.0.vm00.stdout:8/573: fsync d0/d5c/f4a 0 2026-03-10T12:38:03.891 INFO:tasks.workunit.client.1.vm07.stdout:8/547: symlink d1/d3/d6c/lae 0 2026-03-10T12:38:03.893 INFO:tasks.workunit.client.0.vm00.stdout:6/456: fdatasync d2/da/f6a 0 2026-03-10T12:38:03.893 INFO:tasks.workunit.client.0.vm00.stdout:3/727: getdents dd/d18/d13/d1d 0 2026-03-10T12:38:03.895 INFO:tasks.workunit.client.0.vm00.stdout:6/457: chown d2/d16/f1e 317252 1 2026-03-10T12:38:03.895 
INFO:tasks.workunit.client.1.vm07.stdout:8/548: readlink d1/d3/d6/d50/d70/la3 0 2026-03-10T12:38:03.897 INFO:tasks.workunit.client.0.vm00.stdout:6/458: fdatasync d2/da/dc/d83/f97 0 2026-03-10T12:38:03.898 INFO:tasks.workunit.client.0.vm00.stdout:6/459: symlink d2/d16/d29/d31/d34/la1 0 2026-03-10T12:38:03.901 INFO:tasks.workunit.client.0.vm00.stdout:6/460: symlink d2/d16/d29/d31/d34/la2 0 2026-03-10T12:38:03.901 INFO:tasks.workunit.client.0.vm00.stdout:1/713: creat da/d21/db3/d59/ff0 x:0 0 0 2026-03-10T12:38:03.902 INFO:tasks.workunit.client.0.vm00.stdout:1/714: chown da/d24/d28/d67/da2/d78/c8e 20 1 2026-03-10T12:38:03.905 INFO:tasks.workunit.client.0.vm00.stdout:6/461: symlink d2/d16/d29/d31/d88/d92/la3 0 2026-03-10T12:38:03.911 INFO:tasks.workunit.client.0.vm00.stdout:6/462: dwrite d2/da/dc/d2f/f3a [0,4194304] 0 2026-03-10T12:38:03.915 INFO:tasks.workunit.client.1.vm07.stdout:8/549: dread d1/d3/f1f [4194304,4194304] 0 2026-03-10T12:38:03.915 INFO:tasks.workunit.client.0.vm00.stdout:1/715: symlink da/d12/da8/lf1 0 2026-03-10T12:38:03.915 INFO:tasks.workunit.client.0.vm00.stdout:8/574: link d0/f8 d0/d93/d2d/fba 0 2026-03-10T12:38:03.915 INFO:tasks.workunit.client.0.vm00.stdout:3/728: truncate dd/d3d/d8a/de0/fa7 4876209 0 2026-03-10T12:38:03.923 INFO:tasks.workunit.client.0.vm00.stdout:1/716: mknod da/d12/d91/dcb/cf2 0 2026-03-10T12:38:03.926 INFO:tasks.workunit.client.0.vm00.stdout:1/717: rmdir da/d21/db3/d5d/d72/d7e 39 2026-03-10T12:38:03.927 INFO:tasks.workunit.client.0.vm00.stdout:4/711: dread df/d1f/d22/d26/d65/d91/f50 [0,4194304] 0 2026-03-10T12:38:03.938 INFO:tasks.workunit.client.0.vm00.stdout:8/575: dread d0/f22 [0,4194304] 0 2026-03-10T12:38:03.938 INFO:tasks.workunit.client.0.vm00.stdout:8/576: rmdir d0/d46 39 2026-03-10T12:38:03.960 INFO:tasks.workunit.client.1.vm07.stdout:0/610: rename d0/d14/d5f/d76/d8e to d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dc8 0 2026-03-10T12:38:03.963 INFO:tasks.workunit.client.1.vm07.stdout:4/672: rename d0/d4/d5/da/f4d to 
d0/d4/d10/d9a/db9/fef 0 2026-03-10T12:38:03.973 INFO:tasks.workunit.client.1.vm07.stdout:5/584: rename d0/d22/d18/d19/d21/d54/l7a to d0/d22/d18/d19/d21/d3a/ld0 0 2026-03-10T12:38:03.977 INFO:tasks.workunit.client.1.vm07.stdout:5/585: dwrite d0/d22/d18/fb4 [0,4194304] 0 2026-03-10T12:38:03.982 INFO:tasks.workunit.client.0.vm00.stdout:4/712: sync 2026-03-10T12:38:03.982 INFO:tasks.workunit.client.0.vm00.stdout:4/713: chown df/d32 1819656272 1 2026-03-10T12:38:03.984 INFO:tasks.workunit.client.0.vm00.stdout:6/463: dread d2/d16/f2a [0,4194304] 0 2026-03-10T12:38:03.984 INFO:tasks.workunit.client.1.vm07.stdout:5/586: mkdir d0/d22/d18/d19/d21/d54/dd1 0 2026-03-10T12:38:03.988 INFO:tasks.workunit.client.0.vm00.stdout:4/714: unlink df/d57/fa0 0 2026-03-10T12:38:03.990 INFO:tasks.workunit.client.0.vm00.stdout:4/715: rmdir df/d1f/d22/d26/dab 39 2026-03-10T12:38:03.991 INFO:tasks.workunit.client.0.vm00.stdout:4/716: fdatasync df/d1f/d22/d26/f31 0 2026-03-10T12:38:03.993 INFO:tasks.workunit.client.0.vm00.stdout:4/717: fsync df/d1f/d22/d26/d65/d91/fad 0 2026-03-10T12:38:03.993 INFO:tasks.workunit.client.1.vm07.stdout:5/587: creat d0/d22/d18/d3e/d5d/dcf/fd2 x:0 0 0 2026-03-10T12:38:03.994 INFO:tasks.workunit.client.0.vm00.stdout:4/718: write df/d1f/d36/f92 [969521,48920] 0 2026-03-10T12:38:04.002 INFO:tasks.workunit.client.0.vm00.stdout:4/719: symlink df/d1f/d36/d3a/d41/de4/le9 0 2026-03-10T12:38:04.002 INFO:tasks.workunit.client.0.vm00.stdout:4/720: rmdir df/d57/db7 39 2026-03-10T12:38:04.002 INFO:tasks.workunit.client.0.vm00.stdout:6/464: dread - d2/d39/f6c zero size 2026-03-10T12:38:04.008 INFO:tasks.workunit.client.0.vm00.stdout:6/465: mkdir d2/da4 0 2026-03-10T12:38:04.008 INFO:tasks.workunit.client.0.vm00.stdout:6/466: read - d2/d39/f6c zero size 2026-03-10T12:38:04.015 INFO:tasks.workunit.client.0.vm00.stdout:6/467: creat d2/d42/d80/d89/fa5 x:0 0 0 2026-03-10T12:38:04.017 INFO:tasks.workunit.client.0.vm00.stdout:6/468: creat d2/d16/d29/fa6 x:0 0 0 2026-03-10T12:38:04.020 
INFO:tasks.workunit.client.1.vm07.stdout:4/673: dread d0/d4/d5/d34/f5d [0,4194304] 0 2026-03-10T12:38:04.022 INFO:tasks.workunit.client.1.vm07.stdout:4/674: getdents d0 0 2026-03-10T12:38:04.026 INFO:tasks.workunit.client.1.vm07.stdout:4/675: write d0/d4/d5/d34/fa3 [4049927,110875] 0 2026-03-10T12:38:04.028 INFO:tasks.workunit.client.1.vm07.stdout:4/676: mkdir d0/d8e/df0 0 2026-03-10T12:38:04.029 INFO:tasks.workunit.client.0.vm00.stdout:5/752: dwrite d1f/d26/d2b/d35/d53/d72/ff9 [0,4194304] 0 2026-03-10T12:38:04.032 INFO:tasks.workunit.client.0.vm00.stdout:9/736: dread d0/d3d/d59/fad [0,4194304] 0 2026-03-10T12:38:04.032 INFO:tasks.workunit.client.1.vm07.stdout:4/677: dread - d0/d4/d10/d8d/fb0 zero size 2026-03-10T12:38:04.032 INFO:tasks.workunit.client.0.vm00.stdout:2/692: write d4/d6/dca/f3f [2387263,116324] 0 2026-03-10T12:38:04.034 INFO:tasks.workunit.client.1.vm07.stdout:4/678: rmdir d0/d8e/df0 0 2026-03-10T12:38:04.034 INFO:tasks.workunit.client.0.vm00.stdout:8/577: getdents d0/d46/d89 0 2026-03-10T12:38:04.034 INFO:tasks.workunit.client.1.vm07.stdout:4/679: stat d0/d4/d10/d5f/f63 0 2026-03-10T12:38:04.037 INFO:tasks.workunit.client.0.vm00.stdout:9/737: symlink d0/d3d/l10c 0 2026-03-10T12:38:04.042 INFO:tasks.workunit.client.0.vm00.stdout:9/738: dwrite d0/d3d/d59/d4e/dba/d19/f7d [0,4194304] 0 2026-03-10T12:38:04.044 INFO:tasks.workunit.client.1.vm07.stdout:9/619: write d5/d16/d23/d26/f42 [4341121,42130] 0 2026-03-10T12:38:04.044 INFO:tasks.workunit.client.1.vm07.stdout:6/525: write d1/d4/d6/d16/f50 [806197,50392] 0 2026-03-10T12:38:04.046 INFO:tasks.workunit.client.0.vm00.stdout:2/693: fsync d4/fc1 0 2026-03-10T12:38:04.051 INFO:tasks.workunit.client.0.vm00.stdout:8/578: symlink d0/d93/d2d/lbb 0 2026-03-10T12:38:04.055 INFO:tasks.workunit.client.1.vm07.stdout:9/620: dread - d5/d1f/f9f zero size 2026-03-10T12:38:04.082 INFO:tasks.workunit.client.0.vm00.stdout:5/753: dread d1f/d26/d2b/d37/f38 [0,4194304] 0 2026-03-10T12:38:04.082 
INFO:tasks.workunit.client.0.vm00.stdout:9/739: creat d0/d5/f10d x:0 0 0 2026-03-10T12:38:04.082 INFO:tasks.workunit.client.0.vm00.stdout:3/729: dwrite dd/d27/f44 [4194304,4194304] 0 2026-03-10T12:38:04.082 INFO:tasks.workunit.client.0.vm00.stdout:2/694: mkdir d4/d53/d76/dba/de8 0 2026-03-10T12:38:04.082 INFO:tasks.workunit.client.0.vm00.stdout:5/754: truncate f11 3134005 0 2026-03-10T12:38:04.082 INFO:tasks.workunit.client.0.vm00.stdout:3/730: readlink dd/d4e/l59 0 2026-03-10T12:38:04.082 INFO:tasks.workunit.client.1.vm07.stdout:6/526: creat d1/d4/d6/d16/d1a/d99/fa8 x:0 0 0 2026-03-10T12:38:04.082 INFO:tasks.workunit.client.1.vm07.stdout:9/621: rename d5/d13/d6c/d7a/daf to d5/d16/dd7 0 2026-03-10T12:38:04.082 INFO:tasks.workunit.client.1.vm07.stdout:9/622: chown d5/d13/d2c/f44 167464 1 2026-03-10T12:38:04.082 INFO:tasks.workunit.client.1.vm07.stdout:9/623: mknod d5/d13/d6c/cd8 0 2026-03-10T12:38:04.082 INFO:tasks.workunit.client.1.vm07.stdout:8/550: chown d1/d3/d11/d87 987 1 2026-03-10T12:38:04.082 INFO:tasks.workunit.client.1.vm07.stdout:8/551: chown d1/d3/d6/d54/l6a 18774035 1 2026-03-10T12:38:04.083 INFO:tasks.workunit.client.0.vm00.stdout:3/731: symlink dd/d3d/d8a/de0/de4/lf2 0 2026-03-10T12:38:04.085 INFO:tasks.workunit.client.1.vm07.stdout:4/680: sync 2026-03-10T12:38:04.086 INFO:tasks.workunit.client.1.vm07.stdout:9/624: rmdir d5/d1f/d31/d76 39 2026-03-10T12:38:04.086 INFO:tasks.workunit.client.0.vm00.stdout:2/695: unlink d4/d6/d2d/d3a/d43/c70 0 2026-03-10T12:38:04.089 INFO:tasks.workunit.client.0.vm00.stdout:3/732: fdatasync dd/d3d/f53 0 2026-03-10T12:38:04.090 INFO:tasks.workunit.client.1.vm07.stdout:8/552: readlink d1/d3/d6/d7b/l83 0 2026-03-10T12:38:04.093 INFO:tasks.workunit.client.0.vm00.stdout:3/733: truncate dd/d3d/d73/f8f 1248198 0 2026-03-10T12:38:04.096 INFO:tasks.workunit.client.0.vm00.stdout:1/718: write da/d24/d5a/f7c [677487,67231] 0 2026-03-10T12:38:04.098 INFO:tasks.workunit.client.0.vm00.stdout:3/734: getdents dd/d27/d2c/def 0 
2026-03-10T12:38:04.103 INFO:tasks.workunit.client.0.vm00.stdout:3/735: write dd/ff1 [633000,7872] 0 2026-03-10T12:38:04.103 INFO:tasks.workunit.client.0.vm00.stdout:3/736: chown dd/d64/d93/fce 0 1 2026-03-10T12:38:04.103 INFO:tasks.workunit.client.0.vm00.stdout:1/719: mkdir da/d21/db3/d59/da6/d8b/df3 0 2026-03-10T12:38:04.103 INFO:tasks.workunit.client.0.vm00.stdout:2/696: getdents d4/d6/d2d 0 2026-03-10T12:38:04.105 INFO:tasks.workunit.client.1.vm07.stdout:9/625: link d5/d16/d23/d26/f5c d5/d69/d93/d97/fd9 0 2026-03-10T12:38:04.107 INFO:tasks.workunit.client.0.vm00.stdout:0/583: dwrite d3/d22/f46 [0,4194304] 0 2026-03-10T12:38:04.112 INFO:tasks.workunit.client.0.vm00.stdout:1/720: rename da/f22 to da/d21/db3/d5d/dab/ff4 0 2026-03-10T12:38:04.112 INFO:tasks.workunit.client.1.vm07.stdout:9/626: unlink d5/d16/f35 0 2026-03-10T12:38:04.112 INFO:tasks.workunit.client.1.vm07.stdout:9/627: write d5/d69/fc2 [916862,78782] 0 2026-03-10T12:38:04.112 INFO:tasks.workunit.client.1.vm07.stdout:9/628: chown d5/d13/l5b 11240675 1 2026-03-10T12:38:04.116 INFO:tasks.workunit.client.1.vm07.stdout:8/553: creat d1/d3/d6/faf x:0 0 0 2026-03-10T12:38:04.122 INFO:tasks.workunit.client.1.vm07.stdout:8/554: write d1/d3/d5d/d65/fad [536196,93490] 0 2026-03-10T12:38:04.133 INFO:tasks.workunit.client.0.vm00.stdout:2/697: sync 2026-03-10T12:38:04.135 INFO:tasks.workunit.client.0.vm00.stdout:2/698: unlink d4/dd/db9/d6d/cbb 0 2026-03-10T12:38:04.136 INFO:tasks.workunit.client.1.vm07.stdout:9/629: link d5/d13/d2c/f44 d5/fda 0 2026-03-10T12:38:04.141 INFO:tasks.workunit.client.1.vm07.stdout:8/555: creat d1/d3/d40/fb0 x:0 0 0 2026-03-10T12:38:04.142 INFO:tasks.workunit.client.1.vm07.stdout:9/630: dread - d5/d16/d23/d26/d68/fa0 zero size 2026-03-10T12:38:04.148 INFO:tasks.workunit.client.1.vm07.stdout:8/556: symlink d1/d3/d6c/lb1 0 2026-03-10T12:38:04.150 INFO:tasks.workunit.client.1.vm07.stdout:8/557: mkdir d1/d3/db2 0 2026-03-10T12:38:04.152 INFO:tasks.workunit.client.1.vm07.stdout:9/631: sync 
2026-03-10T12:38:04.153 INFO:tasks.workunit.client.1.vm07.stdout:9/632: chown d5/d16/d23/lb8 3 1 2026-03-10T12:38:04.157 INFO:tasks.workunit.client.1.vm07.stdout:9/633: dwrite d5/d69/fc2 [0,4194304] 0 2026-03-10T12:38:04.158 INFO:tasks.workunit.client.1.vm07.stdout:9/634: dread - d5/d69/d93/fd1 zero size 2026-03-10T12:38:04.159 INFO:tasks.workunit.client.0.vm00.stdout:7/504: truncate da/f10 9442040 0 2026-03-10T12:38:04.160 INFO:tasks.workunit.client.1.vm07.stdout:8/558: rmdir d1/d3 39 2026-03-10T12:38:04.169 INFO:tasks.workunit.client.1.vm07.stdout:1/538: dwrite d9/df/f11 [0,4194304] 0 2026-03-10T12:38:04.174 INFO:tasks.workunit.client.1.vm07.stdout:9/635: mknod d5/d13/d57/d4f/d6a/cdb 0 2026-03-10T12:38:04.189 INFO:tasks.workunit.client.1.vm07.stdout:8/559: dwrite d1/d3/f1f [4194304,4194304] 0 2026-03-10T12:38:04.192 INFO:tasks.workunit.client.1.vm07.stdout:8/560: stat d1/d3/d40/d92/f94 0 2026-03-10T12:38:04.195 INFO:tasks.workunit.client.1.vm07.stdout:8/561: readlink d1/d3/d11/l5c 0 2026-03-10T12:38:04.202 INFO:tasks.workunit.client.0.vm00.stdout:7/505: dread da/d41/f72 [0,4194304] 0 2026-03-10T12:38:04.203 INFO:tasks.workunit.client.1.vm07.stdout:8/562: symlink d1/d3/d6/d54/lb3 0 2026-03-10T12:38:04.215 INFO:tasks.workunit.client.1.vm07.stdout:1/539: getdents d9/d2d/d80/d8e 0 2026-03-10T12:38:04.216 INFO:tasks.workunit.client.1.vm07.stdout:8/563: chown d1/d3/l51 1508 1 2026-03-10T12:38:04.217 INFO:tasks.workunit.client.0.vm00.stdout:7/506: creat da/d47/fb7 x:0 0 0 2026-03-10T12:38:04.224 INFO:tasks.workunit.client.0.vm00.stdout:7/507: stat da/fb 0 2026-03-10T12:38:04.227 INFO:tasks.workunit.client.0.vm00.stdout:7/508: rename da/d25/d2c/d82/d68/c33 to da/d41/d7b/d9d/cb8 0 2026-03-10T12:38:04.228 INFO:tasks.workunit.client.0.vm00.stdout:7/509: stat da/d26/d50/d73/d89 0 2026-03-10T12:38:04.231 INFO:tasks.workunit.client.0.vm00.stdout:7/510: mknod da/d26/d37/d56/cb9 0 2026-03-10T12:38:04.231 INFO:tasks.workunit.client.0.vm00.stdout:7/511: chown da/d1b/d40/f5c 190 1 
2026-03-10T12:38:04.235 INFO:tasks.workunit.client.0.vm00.stdout:4/721: dwrite df/f4e [0,4194304] 0 2026-03-10T12:38:04.241 INFO:tasks.workunit.client.0.vm00.stdout:7/512: dread da/d25/d2c/f98 [0,4194304] 0 2026-03-10T12:38:04.246 INFO:tasks.workunit.client.1.vm07.stdout:6/527: dread d1/d4/d6/f7e [0,4194304] 0 2026-03-10T12:38:04.249 INFO:tasks.workunit.client.1.vm07.stdout:6/528: creat d1/d4/d6/d53/fa9 x:0 0 0 2026-03-10T12:38:04.250 INFO:tasks.workunit.client.1.vm07.stdout:6/529: chown d1/d4/d6/d4e/d64/fa4 2 1 2026-03-10T12:38:04.256 INFO:tasks.workunit.client.1.vm07.stdout:6/530: mknod d1/d4/caa 0 2026-03-10T12:38:04.259 INFO:tasks.workunit.client.0.vm00.stdout:2/699: write d4/fc1 [662634,130426] 0 2026-03-10T12:38:04.260 INFO:tasks.workunit.client.0.vm00.stdout:2/700: write d4/dd/f3c [2945482,22983] 0 2026-03-10T12:38:04.264 INFO:tasks.workunit.client.1.vm07.stdout:6/531: creat d1/d4/d6/d53/d66/fab x:0 0 0 2026-03-10T12:38:04.268 INFO:tasks.workunit.client.0.vm00.stdout:3/737: fsync dd/d3d/d8a/de0/fa7 0 2026-03-10T12:38:04.268 INFO:tasks.workunit.client.0.vm00.stdout:3/738: chown dd/d3d/d84/f8c 38072872 1 2026-03-10T12:38:04.269 INFO:tasks.workunit.client.1.vm07.stdout:6/532: symlink d1/d4/d6/d16/d1a/d99/lac 0 2026-03-10T12:38:04.269 INFO:tasks.workunit.client.0.vm00.stdout:3/739: readlink dd/d3d/d8a/de0/l4d 0 2026-03-10T12:38:04.272 INFO:tasks.workunit.client.0.vm00.stdout:3/740: link dd/d18/d13/c26 dd/d64/d93/cf3 0 2026-03-10T12:38:04.280 INFO:tasks.workunit.client.0.vm00.stdout:3/741: dread dd/d3d/d8a/de0/fa7 [0,4194304] 0 2026-03-10T12:38:04.283 INFO:tasks.workunit.client.0.vm00.stdout:3/742: unlink dd/d27/f35 0 2026-03-10T12:38:04.283 INFO:tasks.workunit.client.0.vm00.stdout:5/755: dread d1f/d26/d2e/fa5 [0,4194304] 0 2026-03-10T12:38:04.286 INFO:tasks.workunit.client.0.vm00.stdout:5/756: dwrite d1f/d26/d2b/d35/f42 [0,4194304] 0 2026-03-10T12:38:04.291 INFO:tasks.workunit.client.1.vm07.stdout:6/533: dread d1/d4/d6/d16/d1a/d33/f37 [0,4194304] 0 
2026-03-10T12:38:04.293 INFO:tasks.workunit.client.0.vm00.stdout:3/743: mkdir dd/d3d/d8a/de0/d55/dd3/df4 0 2026-03-10T12:38:04.294 INFO:tasks.workunit.client.1.vm07.stdout:6/534: dread d1/d4/d6/d53/f5e [0,4194304] 0 2026-03-10T12:38:04.299 INFO:tasks.workunit.client.0.vm00.stdout:5/757: stat f11 0 2026-03-10T12:38:04.299 INFO:tasks.workunit.client.0.vm00.stdout:5/758: rmdir d1f/d26/de3/db7 39 2026-03-10T12:38:04.299 INFO:tasks.workunit.client.1.vm07.stdout:6/535: symlink d1/d4/d6/d16/d1a/lad 0 2026-03-10T12:38:04.299 INFO:tasks.workunit.client.1.vm07.stdout:6/536: chown d1/d4/d6/d4e 4 1 2026-03-10T12:38:04.301 INFO:tasks.workunit.client.0.vm00.stdout:5/759: mkdir d1f/d26/d2e/d58/d10c 0 2026-03-10T12:38:04.309 INFO:tasks.workunit.client.0.vm00.stdout:8/579: write d0/f8 [5630328,27855] 0 2026-03-10T12:38:04.311 INFO:tasks.workunit.client.1.vm07.stdout:3/593: write dc/f94 [628740,85161] 0 2026-03-10T12:38:04.311 INFO:tasks.workunit.client.1.vm07.stdout:3/594: chown dc/dd/d1f/d6f/l83 4 1 2026-03-10T12:38:04.315 INFO:tasks.workunit.client.1.vm07.stdout:7/534: truncate d0/d52/f98 3593465 0 2026-03-10T12:38:04.316 INFO:tasks.workunit.client.1.vm07.stdout:7/535: chown d0/d61/d79/f95 316813384 1 2026-03-10T12:38:04.318 INFO:tasks.workunit.client.0.vm00.stdout:8/580: creat d0/dd/fbc x:0 0 0 2026-03-10T12:38:04.319 INFO:tasks.workunit.client.1.vm07.stdout:7/536: mknod d0/d67/caf 0 2026-03-10T12:38:04.320 INFO:tasks.workunit.client.1.vm07.stdout:7/537: creat d0/d57/d62/fb0 x:0 0 0 2026-03-10T12:38:04.322 INFO:tasks.workunit.client.0.vm00.stdout:8/581: fdatasync d0/dd/d38/d81/f88 0 2026-03-10T12:38:04.334 INFO:tasks.workunit.client.0.vm00.stdout:8/582: chown d0/d93/d17/d48 480554 1 2026-03-10T12:38:04.338 INFO:tasks.workunit.client.0.vm00.stdout:8/583: dwrite d0/dd/fbc [0,4194304] 0 2026-03-10T12:38:04.339 INFO:tasks.workunit.client.1.vm07.stdout:2/454: write d0/d42/d26/d7d/f7f [4721293,130012] 0 2026-03-10T12:38:04.340 INFO:tasks.workunit.client.0.vm00.stdout:8/584: fdatasync 
d0/dd/f9a 0 2026-03-10T12:38:04.342 INFO:tasks.workunit.client.0.vm00.stdout:8/585: creat d0/dd/d38/d81/fbd x:0 0 0 2026-03-10T12:38:04.343 INFO:tasks.workunit.client.1.vm07.stdout:2/455: unlink d0/d42/d26/d7d/f7f 0 2026-03-10T12:38:04.343 INFO:tasks.workunit.client.0.vm00.stdout:8/586: stat d0/d46/d7e 0 2026-03-10T12:38:04.351 INFO:tasks.workunit.client.0.vm00.stdout:8/587: rename d0/d93/d2d/d49/l5a to d0/d93/d17/lbe 0 2026-03-10T12:38:04.351 INFO:tasks.workunit.client.0.vm00.stdout:8/588: chown d0/c66 0 1 2026-03-10T12:38:04.355 INFO:tasks.workunit.client.0.vm00.stdout:8/589: creat d0/d58/fbf x:0 0 0 2026-03-10T12:38:04.358 INFO:tasks.workunit.client.0.vm00.stdout:8/590: write d0/d46/d6e/f7b [411788,43876] 0 2026-03-10T12:38:04.359 INFO:tasks.workunit.client.0.vm00.stdout:8/591: chown d0/dd/d38/d81 46 1 2026-03-10T12:38:04.360 INFO:tasks.workunit.client.1.vm07.stdout:6/537: dread d1/d4/d6/d43/d65/f76 [0,4194304] 0 2026-03-10T12:38:04.360 INFO:tasks.workunit.client.1.vm07.stdout:2/456: link d0/d29/f32 d0/d42/d26/d7d/f9a 0 2026-03-10T12:38:04.365 INFO:tasks.workunit.client.1.vm07.stdout:0/611: dwrite d0/d14/d5f/d76/d2f/d31/f6f [0,4194304] 0 2026-03-10T12:38:04.369 INFO:tasks.workunit.client.1.vm07.stdout:0/612: readlink d0/d14/d5f/d76/d2f/d31/d79/d85/l88 0 2026-03-10T12:38:04.373 INFO:tasks.workunit.client.1.vm07.stdout:1/540: rename d9/df/d29/d2b/d3d/f85 to d9/d2d/d4f/fb5 0 2026-03-10T12:38:04.385 INFO:tasks.workunit.client.1.vm07.stdout:6/538: mknod d1/d4/d71/cae 0 2026-03-10T12:38:04.386 INFO:tasks.workunit.client.1.vm07.stdout:6/539: chown d1/d4/d6/d43/c93 50775 1 2026-03-10T12:38:04.390 INFO:tasks.workunit.client.1.vm07.stdout:5/588: dwrite d0/d22/d18/d19/d21/f61 [0,4194304] 0 2026-03-10T12:38:04.403 INFO:tasks.workunit.client.1.vm07.stdout:0/613: getdents d0/d14/d5f/d3b/dbc/d8d 0 2026-03-10T12:38:04.407 INFO:tasks.workunit.client.0.vm00.stdout:6/469: dwrite d2/d16/f6d [0,4194304] 0 2026-03-10T12:38:04.408 INFO:tasks.workunit.client.1.vm07.stdout:5/589: rename 
d0/d22/d18/d19/d21/d3a/c57 to d0/d22/d18/d19/d21/d3a/cd3 0 2026-03-10T12:38:04.424 INFO:tasks.workunit.client.1.vm07.stdout:8/564: rename d1/d3/d6/d50/d70/la3 to d1/d3/d40/lb4 0 2026-03-10T12:38:04.433 INFO:tasks.workunit.client.1.vm07.stdout:0/614: mkdir d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9 0 2026-03-10T12:38:04.434 INFO:tasks.workunit.client.1.vm07.stdout:0/615: write d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dc8/fc7 [437512,95051] 0 2026-03-10T12:38:04.453 INFO:tasks.workunit.client.1.vm07.stdout:3/595: dread dc/dd/d43/f61 [0,4194304] 0 2026-03-10T12:38:04.455 INFO:tasks.workunit.client.0.vm00.stdout:1/721: write da/d21/d27/f54 [2857088,115572] 0 2026-03-10T12:38:04.460 INFO:tasks.workunit.client.0.vm00.stdout:1/722: mknod da/d12/d26/dd2/ddf/cf5 0 2026-03-10T12:38:04.460 INFO:tasks.workunit.client.1.vm07.stdout:2/457: rename d0/d42/d26/c69 to d0/d5b/d98/c9b 0 2026-03-10T12:38:04.460 INFO:tasks.workunit.client.0.vm00.stdout:1/723: fsync da/d21/f74 0 2026-03-10T12:38:04.461 INFO:tasks.workunit.client.0.vm00.stdout:1/724: dread - da/d24/d28/d67/f6c zero size 2026-03-10T12:38:04.468 INFO:tasks.workunit.client.0.vm00.stdout:1/725: fsync da/d12/d26/f57 0 2026-03-10T12:38:04.468 INFO:tasks.workunit.client.1.vm07.stdout:4/681: write d0/d4/d10/d3c/f68 [304124,122899] 0 2026-03-10T12:38:04.468 INFO:tasks.workunit.client.0.vm00.stdout:9/740: dread d0/d3d/d59/d4e/dba/d1e/d27/f28 [0,4194304] 0 2026-03-10T12:38:04.469 INFO:tasks.workunit.client.0.vm00.stdout:1/726: dread - da/d21/d27/fe8 zero size 2026-03-10T12:38:04.471 INFO:tasks.workunit.client.1.vm07.stdout:6/540: creat d1/d4/d6/d16/faf x:0 0 0 2026-03-10T12:38:04.475 INFO:tasks.workunit.client.0.vm00.stdout:4/722: write df/d1f/d22/f3c [2013051,73808] 0 2026-03-10T12:38:04.479 INFO:tasks.workunit.client.0.vm00.stdout:1/727: write da/d12/d91/fb8 [430532,35453] 0 2026-03-10T12:38:04.480 INFO:tasks.workunit.client.0.vm00.stdout:1/728: truncate da/d21/db3/d59/da6/fd3 585589 0 2026-03-10T12:38:04.481 
INFO:tasks.workunit.client.0.vm00.stdout:6/470: creat d2/d9f/fa7 x:0 0 0 2026-03-10T12:38:04.486 INFO:tasks.workunit.client.0.vm00.stdout:0/584: write d3/d7/d4c/d5b/f57 [3664536,57039] 0 2026-03-10T12:38:04.487 INFO:tasks.workunit.client.0.vm00.stdout:0/585: chown d3/d22/d3a/f8c 4166 1 2026-03-10T12:38:04.489 INFO:tasks.workunit.client.0.vm00.stdout:6/471: dread - d2/da/dc/f40 zero size 2026-03-10T12:38:04.490 INFO:tasks.workunit.client.0.vm00.stdout:0/586: dread d3/d7/d4c/d5b/d38/d44/d5a/f86 [0,4194304] 0 2026-03-10T12:38:04.496 INFO:tasks.workunit.client.0.vm00.stdout:2/701: dwrite d4/d6/d93/dc6/fe4 [0,4194304] 0 2026-03-10T12:38:04.497 INFO:tasks.workunit.client.1.vm07.stdout:2/458: rename d0/d42/d26/d38/d4f/f5c to d0/f9c 0 2026-03-10T12:38:04.499 INFO:tasks.workunit.client.0.vm00.stdout:2/702: dread d4/d6/d93/dc6/fe4 [0,4194304] 0 2026-03-10T12:38:04.504 INFO:tasks.workunit.client.0.vm00.stdout:3/744: write dd/d4e/faa [1881928,93126] 0 2026-03-10T12:38:04.510 INFO:tasks.workunit.client.0.vm00.stdout:5/760: write d1f/d26/d2b/d35/fad [7453780,80859] 0 2026-03-10T12:38:04.511 INFO:tasks.workunit.client.0.vm00.stdout:5/761: stat d1f/d26/d2b/d35/d78/d7f/cac 0 2026-03-10T12:38:04.512 INFO:tasks.workunit.client.0.vm00.stdout:8/592: write d0/d93/d2d/f75 [579088,101690] 0 2026-03-10T12:38:04.518 INFO:tasks.workunit.client.1.vm07.stdout:4/682: unlink d0/d4/d5/da/fb3 0 2026-03-10T12:38:04.520 INFO:tasks.workunit.client.0.vm00.stdout:1/729: symlink da/d12/db4/lf6 0 2026-03-10T12:38:04.521 INFO:tasks.workunit.client.0.vm00.stdout:1/730: chown da/d24/d28/c2d 45922714 1 2026-03-10T12:38:04.524 INFO:tasks.workunit.client.0.vm00.stdout:4/723: creat df/d1f/d22/d26/d65/d91/db9/fea x:0 0 0 2026-03-10T12:38:04.532 INFO:tasks.workunit.client.1.vm07.stdout:8/565: creat d1/d3/fb5 x:0 0 0 2026-03-10T12:38:04.532 INFO:tasks.workunit.client.0.vm00.stdout:5/762: creat d1f/d39/f10d x:0 0 0 2026-03-10T12:38:04.532 INFO:tasks.workunit.client.0.vm00.stdout:5/763: stat d1f/d39/la8 0 
2026-03-10T12:38:04.532 INFO:tasks.workunit.client.0.vm00.stdout:5/764: fdatasync d1f/d26/d2b/fe6 0 2026-03-10T12:38:04.532 INFO:tasks.workunit.client.0.vm00.stdout:8/593: mknod d0/d5c/cc0 0 2026-03-10T12:38:04.532 INFO:tasks.workunit.client.0.vm00.stdout:8/594: chown d0/dd/d38/f3d 1 1 2026-03-10T12:38:04.532 INFO:tasks.workunit.client.0.vm00.stdout:4/724: truncate df/d1f/d22/d26/d70/fb4 393993 0 2026-03-10T12:38:04.534 INFO:tasks.workunit.client.0.vm00.stdout:2/703: rename d4/l9 to d4/dd/da7/le9 0 2026-03-10T12:38:04.535 INFO:tasks.workunit.client.0.vm00.stdout:3/745: write dd/d27/f44 [7286305,23008] 0 2026-03-10T12:38:04.536 INFO:tasks.workunit.client.1.vm07.stdout:3/596: mkdir dc/d18/dcb 0 2026-03-10T12:38:04.536 INFO:tasks.workunit.client.0.vm00.stdout:5/765: mknod d1f/d26/d6f/c10e 0 2026-03-10T12:38:04.538 INFO:tasks.workunit.client.0.vm00.stdout:8/595: chown d0/d93/c35 1651921 1 2026-03-10T12:38:04.539 INFO:tasks.workunit.client.0.vm00.stdout:3/746: dwrite dd/d18/d13/f6b [0,4194304] 0 2026-03-10T12:38:04.539 INFO:tasks.workunit.client.1.vm07.stdout:9/636: dwrite d5/d16/d23/fc8 [0,4194304] 0 2026-03-10T12:38:04.540 INFO:tasks.workunit.client.0.vm00.stdout:8/596: stat d0/dd/d38/lb9 0 2026-03-10T12:38:04.540 INFO:tasks.workunit.client.0.vm00.stdout:8/597: truncate d0/d46/d7e/f8a 851385 0 2026-03-10T12:38:04.541 INFO:tasks.workunit.client.0.vm00.stdout:4/725: symlink df/d1f/d22/leb 0 2026-03-10T12:38:04.543 INFO:tasks.workunit.client.0.vm00.stdout:2/704: dread d4/d6/f4e [0,4194304] 0 2026-03-10T12:38:04.547 INFO:tasks.workunit.client.0.vm00.stdout:2/705: dwrite d4/d6/d2d/d31/f79 [0,4194304] 0 2026-03-10T12:38:04.555 INFO:tasks.workunit.client.0.vm00.stdout:1/731: creat da/d24/d5a/d71/dd4/ff7 x:0 0 0 2026-03-10T12:38:04.555 INFO:tasks.workunit.client.0.vm00.stdout:1/732: readlink da/d24/l2c 0 2026-03-10T12:38:04.556 INFO:tasks.workunit.client.1.vm07.stdout:2/459: rmdir d0/d42 39 2026-03-10T12:38:04.557 INFO:tasks.workunit.client.0.vm00.stdout:4/726: dread df/f29 
[0,4194304] 0 2026-03-10T12:38:04.559 INFO:tasks.workunit.client.0.vm00.stdout:2/706: truncate d4/f6e 1502505 0 2026-03-10T12:38:04.560 INFO:tasks.workunit.client.0.vm00.stdout:5/766: creat d1f/d26/d2b/d35/d53/d5b/dd1/f10f x:0 0 0 2026-03-10T12:38:04.563 INFO:tasks.workunit.client.0.vm00.stdout:5/767: dwrite d1f/d26/d6f/f9b [0,4194304] 0 2026-03-10T12:38:04.565 INFO:tasks.workunit.client.1.vm07.stdout:4/683: mknod d0/d4/d10/d8d/cf1 0 2026-03-10T12:38:04.565 INFO:tasks.workunit.client.1.vm07.stdout:4/684: chown d0 242 1 2026-03-10T12:38:04.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:04 vm07.local ceph-mon[58582]: pgmap v167: 65 pgs: 65 active+clean; 2.4 GiB data, 8.5 GiB used, 111 GiB / 120 GiB avail; 53 MiB/s rd, 126 MiB/s wr, 285 op/s 2026-03-10T12:38:04.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:04 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:38:04.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:04 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:38:04.568 INFO:tasks.workunit.client.0.vm00.stdout:1/733: dread da/d24/d73/fb6 [0,4194304] 0 2026-03-10T12:38:04.569 INFO:tasks.workunit.client.1.vm07.stdout:6/541: symlink d1/d4/d9b/lb0 0 2026-03-10T12:38:04.570 INFO:tasks.workunit.client.0.vm00.stdout:3/747: symlink dd/d27/d2c/def/lf5 0 2026-03-10T12:38:04.571 INFO:tasks.workunit.client.0.vm00.stdout:1/734: rmdir da/d12/d91/dcb 39 2026-03-10T12:38:04.573 INFO:tasks.workunit.client.0.vm00.stdout:5/768: creat d1f/d26/d2e/d58/d10c/f110 x:0 0 0 2026-03-10T12:38:04.575 INFO:tasks.workunit.client.1.vm07.stdout:8/566: truncate d1/d3/d11/f43 1898801 0 2026-03-10T12:38:04.575 INFO:tasks.workunit.client.0.vm00.stdout:3/748: truncate dd/d18/f83 968477 0 2026-03-10T12:38:04.575 INFO:tasks.workunit.client.0.vm00.stdout:7/513: truncate da/f35 2244396 0 2026-03-10T12:38:04.577 INFO:tasks.workunit.client.0.vm00.stdout:1/735: link 
da/d21/d27/f54 da/d21/db3/d5d/ff8 0 2026-03-10T12:38:04.577 INFO:tasks.workunit.client.0.vm00.stdout:5/769: getdents d1f/d96/dbd/df0 0 2026-03-10T12:38:04.578 INFO:tasks.workunit.client.0.vm00.stdout:5/770: stat d1f/d26/d2b/d37/d105 0 2026-03-10T12:38:04.579 INFO:tasks.workunit.client.1.vm07.stdout:9/637: dread - d5/d1f/d5e/d6b/fb4 zero size 2026-03-10T12:38:04.579 INFO:tasks.workunit.client.1.vm07.stdout:8/567: dwrite d1/d3/d5d/d65/fad [0,4194304] 0 2026-03-10T12:38:04.581 INFO:tasks.workunit.client.0.vm00.stdout:5/771: dread d1f/d26/d2e/f10b [0,4194304] 0 2026-03-10T12:38:04.581 INFO:tasks.workunit.client.0.vm00.stdout:3/749: creat dd/d18/d13/d1d/ff6 x:0 0 0 2026-03-10T12:38:04.583 INFO:tasks.workunit.client.0.vm00.stdout:1/736: dwrite da/d24/d28/faa [0,4194304] 0 2026-03-10T12:38:04.584 INFO:tasks.workunit.client.0.vm00.stdout:1/737: write da/d21/db3/d59/da6/da4/dda/fbb [409700,65874] 0 2026-03-10T12:38:04.587 INFO:tasks.workunit.client.0.vm00.stdout:3/750: creat dd/d64/d93/ff7 x:0 0 0 2026-03-10T12:38:04.589 INFO:tasks.workunit.client.0.vm00.stdout:5/772: creat d1f/d26/d2b/f111 x:0 0 0 2026-03-10T12:38:04.591 INFO:tasks.workunit.client.0.vm00.stdout:5/773: symlink d1f/l112 0 2026-03-10T12:38:04.595 INFO:tasks.workunit.client.1.vm07.stdout:8/568: dread - d1/d3/d40/f5a zero size 2026-03-10T12:38:04.595 INFO:tasks.workunit.client.1.vm07.stdout:9/638: chown d5/d13/d6c/d7a/l9c 1 1 2026-03-10T12:38:04.607 INFO:tasks.workunit.client.1.vm07.stdout:4/685: mkdir d0/d4/df2 0 2026-03-10T12:38:04.608 INFO:tasks.workunit.client.1.vm07.stdout:2/460: dread - d0/d42/f5f zero size 2026-03-10T12:38:04.615 INFO:tasks.workunit.client.1.vm07.stdout:6/542: dread d1/d4/f82 [0,4194304] 0 2026-03-10T12:38:04.628 INFO:tasks.workunit.client.0.vm00.stdout:9/741: dwrite d0/d7f/d88/ff8 [0,4194304] 0 2026-03-10T12:38:04.630 INFO:tasks.workunit.client.0.vm00.stdout:6/472: write d2/d51/f63 [3429130,28687] 0 2026-03-10T12:38:04.630 INFO:tasks.workunit.client.0.vm00.stdout:6/473: chown d2/d16/c4b 
3943 1 2026-03-10T12:38:04.644 INFO:tasks.workunit.client.0.vm00.stdout:6/474: dwrite d2/da/dc/d2f/f56 [0,4194304] 0 2026-03-10T12:38:04.663 INFO:tasks.workunit.client.0.vm00.stdout:6/475: mknod d2/d9f/ca8 0 2026-03-10T12:38:04.666 INFO:tasks.workunit.client.1.vm07.stdout:6/543: rename d1/d4/d6/d16/d1a/d33/f61 to d1/d4/d6/d4e/d64/fb1 0 2026-03-10T12:38:04.671 INFO:tasks.workunit.client.1.vm07.stdout:6/544: fsync d1/d4/f11 0 2026-03-10T12:38:04.672 INFO:tasks.workunit.client.1.vm07.stdout:2/461: link d0/c16 d0/d29/d64/d6c/d94/c9d 0 2026-03-10T12:38:04.672 INFO:tasks.workunit.client.0.vm00.stdout:6/476: rmdir d2/d51/d70 39 2026-03-10T12:38:04.681 INFO:tasks.workunit.client.0.vm00.stdout:6/477: rename d2/d39/c8a to d2/d16/d29/ca9 0 2026-03-10T12:38:04.684 INFO:tasks.workunit.client.1.vm07.stdout:6/545: fsync d1/d4/d6/f7e 0 2026-03-10T12:38:04.689 INFO:tasks.workunit.client.0.vm00.stdout:6/478: mkdir d2/d16/d29/d31/d88/d92/daa 0 2026-03-10T12:38:04.689 INFO:tasks.workunit.client.1.vm07.stdout:6/546: chown d1/d4/d6/d16/f50 205461 1 2026-03-10T12:38:04.710 INFO:tasks.workunit.client.0.vm00.stdout:4/727: dread df/d93/dbc/fc3 [0,4194304] 0 2026-03-10T12:38:04.712 INFO:tasks.workunit.client.0.vm00.stdout:4/728: rename df/d1f/d22/d26/f56 to df/d63/d94/fec 0 2026-03-10T12:38:04.725 INFO:tasks.workunit.client.0.vm00.stdout:4/729: creat df/fed x:0 0 0 2026-03-10T12:38:04.730 INFO:tasks.workunit.client.0.vm00.stdout:4/730: dread df/d63/d77/fe8 [0,4194304] 0 2026-03-10T12:38:04.733 INFO:tasks.workunit.client.0.vm00.stdout:4/731: creat df/d93/fee x:0 0 0 2026-03-10T12:38:04.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:04 vm00.local ceph-mon[50686]: pgmap v167: 65 pgs: 65 active+clean; 2.4 GiB data, 8.5 GiB used, 111 GiB / 120 GiB avail; 53 MiB/s rd, 126 MiB/s wr, 285 op/s 2026-03-10T12:38:04.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:04 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:38:04.737 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:04 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:38:04.740 INFO:tasks.workunit.client.0.vm00.stdout:4/732: dwrite df/d1f/d22/dcb/fd9 [0,4194304] 0 2026-03-10T12:38:04.741 INFO:tasks.workunit.client.1.vm07.stdout:6/547: dread d1/d4/d6/d16/d49/f67 [4194304,4194304] 0 2026-03-10T12:38:04.745 INFO:tasks.workunit.client.0.vm00.stdout:6/479: sync 2026-03-10T12:38:04.745 INFO:tasks.workunit.client.0.vm00.stdout:7/514: dread da/d1b/f39 [0,4194304] 0 2026-03-10T12:38:04.745 INFO:tasks.workunit.client.0.vm00.stdout:6/480: readlink d2/d16/d29/d31/d34/la2 0 2026-03-10T12:38:04.746 INFO:tasks.workunit.client.0.vm00.stdout:4/733: truncate df/f1e 5185080 0 2026-03-10T12:38:04.746 INFO:tasks.workunit.client.1.vm07.stdout:2/462: read d0/d29/f32 [1725476,67694] 0 2026-03-10T12:38:04.746 INFO:tasks.workunit.client.1.vm07.stdout:7/538: write d0/d57/f9f [1408641,119267] 0 2026-03-10T12:38:04.759 INFO:tasks.workunit.client.0.vm00.stdout:6/481: dread - d2/d16/d29/f54 zero size 2026-03-10T12:38:04.767 INFO:tasks.workunit.client.1.vm07.stdout:1/541: dwrite d9/df/d29/d6b/fa1 [0,4194304] 0 2026-03-10T12:38:04.768 INFO:tasks.workunit.client.0.vm00.stdout:7/515: truncate da/d3f/d60/f85 344053 0 2026-03-10T12:38:04.772 INFO:tasks.workunit.client.1.vm07.stdout:2/463: truncate d0/d42/d4e/d77/f6f 190271 0 2026-03-10T12:38:04.778 INFO:tasks.workunit.client.0.vm00.stdout:6/482: creat d2/d51/d70/fab x:0 0 0 2026-03-10T12:38:04.779 INFO:tasks.workunit.client.0.vm00.stdout:6/483: chown d2/da 83 1 2026-03-10T12:38:04.779 INFO:tasks.workunit.client.0.vm00.stdout:6/484: readlink d2/d42/d80/d89/l96 0 2026-03-10T12:38:04.784 INFO:tasks.workunit.client.1.vm07.stdout:2/464: rename d0/d42/f53 to d0/d29/d64/d74/f9e 0 2026-03-10T12:38:04.788 INFO:tasks.workunit.client.1.vm07.stdout:1/542: getdents d9/df/d29/d6b 0 2026-03-10T12:38:04.798 INFO:tasks.workunit.client.1.vm07.stdout:2/465: rename 
d0/d42/d1f/d20/c68 to d0/d42/d26/d38/d4f/d5d/c9f 0 2026-03-10T12:38:04.798 INFO:tasks.workunit.client.0.vm00.stdout:8/598: getdents d0/d5c 0 2026-03-10T12:38:04.799 INFO:tasks.workunit.client.1.vm07.stdout:2/466: creat d0/d42/d1f/d20/fa0 x:0 0 0 2026-03-10T12:38:04.800 INFO:tasks.workunit.client.0.vm00.stdout:8/599: dwrite d0/f9 [0,4194304] 0 2026-03-10T12:38:04.804 INFO:tasks.workunit.client.1.vm07.stdout:1/543: getdents d9/d2d/d4f/d75 0 2026-03-10T12:38:04.812 INFO:tasks.workunit.client.0.vm00.stdout:8/600: dwrite d0/d93/d17/fa6 [0,4194304] 0 2026-03-10T12:38:04.817 INFO:tasks.workunit.client.0.vm00.stdout:2/707: dwrite d4/d6/d2d/d3a/d43/fa1 [0,4194304] 0 2026-03-10T12:38:04.820 INFO:tasks.workunit.client.0.vm00.stdout:2/708: dread - d4/d6/f9c zero size 2026-03-10T12:38:04.823 INFO:tasks.workunit.client.0.vm00.stdout:2/709: dwrite f1 [0,4194304] 0 2026-03-10T12:38:04.830 INFO:tasks.workunit.client.1.vm07.stdout:1/544: fdatasync d9/df/d29/d2b/d31/d91/d59/fa4 0 2026-03-10T12:38:04.832 INFO:tasks.workunit.client.1.vm07.stdout:1/545: chown d9/d2d/d80/f8d 573845223 1 2026-03-10T12:38:04.832 INFO:tasks.workunit.client.1.vm07.stdout:5/590: dwrite d0/d22/d18/d3e/d5d/f6d [0,4194304] 0 2026-03-10T12:38:04.834 INFO:tasks.workunit.client.0.vm00.stdout:2/710: creat d4/d6/de7/fea x:0 0 0 2026-03-10T12:38:04.838 INFO:tasks.workunit.client.0.vm00.stdout:3/751: creat dd/d18/d13/d1d/dc6/ff8 x:0 0 0 2026-03-10T12:38:04.840 INFO:tasks.workunit.client.0.vm00.stdout:9/742: write d0/d3d/d43/d53/fd1 [145157,130181] 0 2026-03-10T12:38:04.843 INFO:tasks.workunit.client.0.vm00.stdout:9/743: dwrite d0/d5/dc/f2a [0,4194304] 0 2026-03-10T12:38:04.846 INFO:tasks.workunit.client.0.vm00.stdout:6/485: link d2/c6 d2/d16/d29/d31/d88/d92/daa/cac 0 2026-03-10T12:38:04.847 INFO:tasks.workunit.client.0.vm00.stdout:5/774: readlink d1f/d26/d2e/d58/d6b/ld8 0 2026-03-10T12:38:04.848 INFO:tasks.workunit.client.0.vm00.stdout:5/775: write d1f/d26/d2b/d35/d53/d5b/dd1/f10f [75694,33218] 0 
2026-03-10T12:38:04.850 INFO:tasks.workunit.client.0.vm00.stdout:5/776: dread d1f/d26/f9f [0,4194304] 0
2026-03-10T12:38:04.854 INFO:tasks.workunit.client.0.vm00.stdout:2/711: rmdir d4/d6 39
2026-03-10T12:38:04.859 INFO:tasks.workunit.client.0.vm00.stdout:3/752: rmdir dd/d3d/d8a 39
2026-03-10T12:38:04.864 INFO:tasks.workunit.client.1.vm07.stdout:1/546: dwrite d9/df/d29/d2b/d31/d91/faf [0,4194304] 0
2026-03-10T12:38:04.867 INFO:tasks.workunit.client.0.vm00.stdout:9/744: creat d0/d7f/d88/f10e x:0 0 0
2026-03-10T12:38:04.871 INFO:tasks.workunit.client.0.vm00.stdout:9/745: stat d0/d3d/d59/d4e/dba/d1e/d85/fe7 0
2026-03-10T12:38:04.872 INFO:tasks.workunit.client.0.vm00.stdout:5/777: mkdir d1f/d26/d2e/d58/d6b/d113 0
2026-03-10T12:38:04.872 INFO:tasks.workunit.client.0.vm00.stdout:6/486: symlink d2/da/dc/d94/lad 0
2026-03-10T12:38:04.872 INFO:tasks.workunit.client.0.vm00.stdout:6/487: chown d2/da4 2024886 1
2026-03-10T12:38:04.872 INFO:tasks.workunit.client.0.vm00.stdout:2/712: mkdir d4/d53/d76/dba/deb 0
2026-03-10T12:38:04.875 INFO:tasks.workunit.client.0.vm00.stdout:9/746: mknod d0/d7f/db8/dc4/c10f 0
2026-03-10T12:38:04.877 INFO:tasks.workunit.client.0.vm00.stdout:9/747: dread d0/d3d/d59/fad [0,4194304] 0
2026-03-10T12:38:04.880 INFO:tasks.workunit.client.0.vm00.stdout:6/488: rename d2/da4 to d2/d42/dae 0
2026-03-10T12:38:04.881 INFO:tasks.workunit.client.0.vm00.stdout:6/489: chown d2/d16 210674 1
2026-03-10T12:38:04.883 INFO:tasks.workunit.client.0.vm00.stdout:2/713: creat d4/d53/d68/dc2/fec x:0 0 0
2026-03-10T12:38:04.892 INFO:tasks.workunit.client.0.vm00.stdout:2/714: chown d4/d6/d93/lb0 1388332009 1
2026-03-10T12:38:04.892 INFO:tasks.workunit.client.1.vm07.stdout:5/591: creat d0/d22/d18/d19/d21/fd4 x:0 0 0
2026-03-10T12:38:04.892 INFO:tasks.workunit.client.0.vm00.stdout:2/715: read - d4/d53/d76/f92 zero size
2026-03-10T12:38:04.892 INFO:tasks.workunit.client.1.vm07.stdout:1/547: mkdir d9/df/d29/d2b/d92/db6 0
2026-03-10T12:38:04.892 INFO:tasks.workunit.client.0.vm00.stdout:2/716: dread - d4/d6/d2d/dc3/fce zero size
2026-03-10T12:38:04.893 INFO:tasks.workunit.client.0.vm00.stdout:3/753: mknod dd/d3d/d8a/de0/de4/dac/cf9 0
2026-03-10T12:38:04.895 INFO:tasks.workunit.client.0.vm00.stdout:9/748: truncate d0/f4 6297935 0
2026-03-10T12:38:04.896 INFO:tasks.workunit.client.0.vm00.stdout:6/490: mknod d2/d42/d80/d9d/caf 0
2026-03-10T12:38:04.897 INFO:tasks.workunit.client.0.vm00.stdout:9/749: read d0/d3d/d59/f45 [340618,124701] 0
2026-03-10T12:38:04.900 INFO:tasks.workunit.client.0.vm00.stdout:3/754: rmdir dd/d3d/d8a/de0/de4 39
2026-03-10T12:38:04.902 INFO:tasks.workunit.client.1.vm07.stdout:1/548: getdents d9/df/d29/d2b/d31 0
2026-03-10T12:38:04.908 INFO:tasks.workunit.client.0.vm00.stdout:3/755: dwrite dd/d18/d13/d1d/dc6/ff8 [0,4194304] 0
2026-03-10T12:38:04.908 INFO:tasks.workunit.client.0.vm00.stdout:6/491: creat d2/da/dc/d83/fb0 x:0 0 0
2026-03-10T12:38:04.909 INFO:tasks.workunit.client.0.vm00.stdout:3/756: write dd/d2a/fbc [1176852,72200] 0
2026-03-10T12:38:04.909 INFO:tasks.workunit.client.1.vm07.stdout:1/549: rename d9/df/d29/d2b/d92/d9d/cac to d9/d2d/d4f/d5a/cb7 0
2026-03-10T12:38:04.911 INFO:tasks.workunit.client.1.vm07.stdout:1/550: mknod d9/df/cb8 0
2026-03-10T12:38:04.915 INFO:tasks.workunit.client.0.vm00.stdout:6/492: dread d2/d16/f41 [0,4194304] 0
2026-03-10T12:38:04.920 INFO:tasks.workunit.client.0.vm00.stdout:6/493: mknod d2/d14/d7a/cb1 0
2026-03-10T12:38:04.923 INFO:tasks.workunit.client.1.vm07.stdout:1/551: mkdir d9/df/d29/d2b/db9 0
2026-03-10T12:38:04.923 INFO:tasks.workunit.client.1.vm07.stdout:1/552: fsync d9/df/f24 0
2026-03-10T12:38:04.924 INFO:tasks.workunit.client.1.vm07.stdout:1/553: fdatasync d9/df/f4a 0
2026-03-10T12:38:04.936 INFO:tasks.workunit.client.0.vm00.stdout:9/750: dread d0/d3d/d59/d4e/dba/d1e/d2b/f5f [0,4194304] 0
2026-03-10T12:38:04.942 INFO:tasks.workunit.client.1.vm07.stdout:1/554: mknod d9/df/d29/d2b/d30/cba 0
2026-03-10T12:38:04.943 INFO:tasks.workunit.client.1.vm07.stdout:1/555: chown d9/df/d29/d2b/d30/f38 493642756 1
2026-03-10T12:38:04.943 INFO:tasks.workunit.client.0.vm00.stdout:9/751: dread - d0/d7f/db8/dc4/fca zero size
2026-03-10T12:38:04.943 INFO:tasks.workunit.client.0.vm00.stdout:9/752: rename d0/d3d/d59/d4e/dba/fa1 to d0/d3d/f110 0
2026-03-10T12:38:04.943 INFO:tasks.workunit.client.0.vm00.stdout:9/753: write d0/d3d/d59/d4e/dba/d19/f7d [3731808,57385] 0
2026-03-10T12:38:04.943 INFO:tasks.workunit.client.0.vm00.stdout:9/754: stat d0/dc2 0
2026-03-10T12:38:04.943 INFO:tasks.workunit.client.0.vm00.stdout:9/755: truncate d0/d3d/d59/d4e/dba/d1e/d85/d98/fa7 198696 0
2026-03-10T12:38:04.947 INFO:tasks.workunit.client.0.vm00.stdout:0/587: dread d3/d40/d65/fc0 [0,4194304] 0
2026-03-10T12:38:04.949 INFO:tasks.workunit.client.0.vm00.stdout:0/588: creat d3/db/d77/d82/fc1 x:0 0 0
2026-03-10T12:38:04.952 INFO:tasks.workunit.client.0.vm00.stdout:0/589: rename d3/d7/d4c/d5b/d38/d44/d5a/f86 to d3/db/d77/d82/fc2 0
2026-03-10T12:38:04.953 INFO:tasks.workunit.client.0.vm00.stdout:0/590: chown d3/d7/d4c/d5b/d38/fa2 9196052 1
2026-03-10T12:38:04.955 INFO:tasks.workunit.client.0.vm00.stdout:0/591: truncate d3/db/d24/d25/f3f 2844486 0
2026-03-10T12:38:04.955 INFO:tasks.workunit.client.0.vm00.stdout:0/592: truncate d3/d7/d4c/d5b/d38/fbf 943659 0
2026-03-10T12:38:04.958 INFO:tasks.workunit.client.1.vm07.stdout:0/616: dwrite d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/faf [0,4194304] 0
2026-03-10T12:38:04.959 INFO:tasks.workunit.client.0.vm00.stdout:1/738: dread da/d21/db3/d5d/d80/fcc [0,4194304] 0
2026-03-10T12:38:04.960 INFO:tasks.workunit.client.0.vm00.stdout:0/593: symlink d3/d7/d4c/lc3 0
2026-03-10T12:38:04.961 INFO:tasks.workunit.client.0.vm00.stdout:1/739: truncate da/d21/d27/d6a/f6d 1469225 0
2026-03-10T12:38:04.968 INFO:tasks.workunit.client.1.vm07.stdout:2/467: getdents d0 0
2026-03-10T12:38:04.971 INFO:tasks.workunit.client.0.vm00.stdout:0/594: mkdir d3/d7/db0/dc4 0
2026-03-10T12:38:04.979 INFO:tasks.workunit.client.0.vm00.stdout:1/740: dread da/d12/d26/f69 [0,4194304] 0
2026-03-10T12:38:04.979 INFO:tasks.workunit.client.0.vm00.stdout:1/741: dread - da/d21/d27/fe8 zero size
2026-03-10T12:38:04.980 INFO:tasks.workunit.client.0.vm00.stdout:1/742: chown da/d21/db3/d5d/d80/f8a 850 1
2026-03-10T12:38:04.980 INFO:tasks.workunit.client.0.vm00.stdout:0/595: mkdir d3/d7/d4c/d5b/dc5 0
2026-03-10T12:38:04.980 INFO:tasks.workunit.client.0.vm00.stdout:1/743: write da/d12/d26/dd2/ddf/feb [424981,80573] 0
2026-03-10T12:38:04.982 INFO:tasks.workunit.client.0.vm00.stdout:4/734: write df/d1f/d22/d26/dab/f75 [1009085,26195] 0
2026-03-10T12:38:04.984 INFO:tasks.workunit.client.1.vm07.stdout:3/597: dwrite dc/d18/d24/f55 [0,4194304] 0
2026-03-10T12:38:04.986 INFO:tasks.workunit.client.1.vm07.stdout:3/598: chown dc/dd/d1f/dac 96635429 1
2026-03-10T12:38:04.986 INFO:tasks.workunit.client.0.vm00.stdout:1/744: truncate da/d24/d5a/f68 3697127 0
2026-03-10T12:38:04.993 INFO:tasks.workunit.client.1.vm07.stdout:9/639: write d5/f91 [2859669,45761] 0
2026-03-10T12:38:04.993 INFO:tasks.workunit.client.1.vm07.stdout:9/640: chown d5/d13/d6c/c78 7657 1
2026-03-10T12:38:04.993 INFO:tasks.workunit.client.1.vm07.stdout:9/641: rename d5/d13/d57/d4f/d6a/fc4 to d5/d16/d23/d26/d68/fdc 0
2026-03-10T12:38:04.993 INFO:tasks.workunit.client.1.vm07.stdout:9/642: mkdir d5/d13/d6c/da4/ddd 0
2026-03-10T12:38:04.997 INFO:tasks.workunit.client.0.vm00.stdout:4/735: read df/d1f/d22/d26/d65/f8e [625356,18360] 0
2026-03-10T12:38:05.000 INFO:tasks.workunit.client.0.vm00.stdout:0/596: dread f2 [0,4194304] 0
2026-03-10T12:38:05.000 INFO:tasks.workunit.client.0.vm00.stdout:0/597: dread - d3/d7/d4c/d5b/d38/db3/fbb zero size
2026-03-10T12:38:05.013 INFO:tasks.workunit.client.1.vm07.stdout:3/599: sync
2026-03-10T12:38:05.013 INFO:tasks.workunit.client.1.vm07.stdout:1/556: sync
2026-03-10T12:38:05.019 INFO:tasks.workunit.client.0.vm00.stdout:0/598: dread d3/db/d77/f9e [0,4194304] 0
2026-03-10T12:38:05.019 INFO:tasks.workunit.client.1.vm07.stdout:3/600: unlink dc/dd/d1f/f6d 0
2026-03-10T12:38:05.022 INFO:tasks.workunit.client.1.vm07.stdout:1/557: rename d9/df/d55/lb0 to d9/d2d/d80/d8e/lbb 0
2026-03-10T12:38:05.022 INFO:tasks.workunit.client.1.vm07.stdout:1/558: stat d9/f1a 0
2026-03-10T12:38:05.022 INFO:tasks.workunit.client.0.vm00.stdout:0/599: dwrite d3/d7/d4c/d5b/d38/db3/fbe [0,4194304] 0
2026-03-10T12:38:05.027 INFO:tasks.workunit.client.1.vm07.stdout:3/601: creat dc/dd/db5/fcc x:0 0 0
2026-03-10T12:38:05.028 INFO:tasks.workunit.client.1.vm07.stdout:3/602: chown dc/dd/d28/d7a/fab 0 1
2026-03-10T12:38:05.028 INFO:tasks.workunit.client.1.vm07.stdout:1/559: symlink d9/df/d29/d2b/db9/lbc 0
2026-03-10T12:38:05.041 INFO:tasks.workunit.client.1.vm07.stdout:2/468: read d0/d42/d4e/d77/d70/f8a [176588,8260] 0
2026-03-10T12:38:05.055 INFO:tasks.workunit.client.1.vm07.stdout:3/603: symlink dc/dd/d28/d3b/lcd 0
2026-03-10T12:38:05.056 INFO:tasks.workunit.client.1.vm07.stdout:4/686: write d0/d4/d10/fc7 [906784,13761] 0
2026-03-10T12:38:05.064 INFO:tasks.workunit.client.0.vm00.stdout:8/601: write d0/d93/d2d/f44 [1541621,8186] 0
2026-03-10T12:38:05.065 INFO:tasks.workunit.client.0.vm00.stdout:8/602: chown d0/d93/c24 2355224 1
2026-03-10T12:38:05.066 INFO:tasks.workunit.client.0.vm00.stdout:8/603: write d0/d5c/f4a [2134655,42773] 0
2026-03-10T12:38:05.068 INFO:tasks.workunit.client.0.vm00.stdout:8/604: rmdir d0/d5c/d86 0
2026-03-10T12:38:05.070 INFO:tasks.workunit.client.0.vm00.stdout:0/600: creat d3/d7/d58/fc6 x:0 0 0
2026-03-10T12:38:05.071 INFO:tasks.workunit.client.0.vm00.stdout:0/601: dread - d3/db/d77/faa zero size
2026-03-10T12:38:05.076 INFO:tasks.workunit.client.1.vm07.stdout:8/569: truncate d1/d3/d6/d50/f5e 1561260 0
2026-03-10T12:38:05.077 INFO:tasks.workunit.client.1.vm07.stdout:2/469: truncate d0/d29/d64/f67 506308 0
2026-03-10T12:38:05.084 INFO:tasks.workunit.client.0.vm00.stdout:0/602: mknod d3/d22/d3a/cc7 0
2026-03-10T12:38:05.085 INFO:tasks.workunit.client.0.vm00.stdout:0/603: write d3/db/d77/faa [333233,46537] 0
2026-03-10T12:38:05.093 INFO:tasks.workunit.client.1.vm07.stdout:4/687: fsync d0/d4/d7a/d46/f56 0
2026-03-10T12:38:05.099 INFO:tasks.workunit.client.1.vm07.stdout:6/548: dwrite d1/d4/d6/f2a [0,4194304] 0
2026-03-10T12:38:05.106 INFO:tasks.workunit.client.1.vm07.stdout:8/570: rename d1/d3/d5d/d65 to d1/d3/d40/d92/db6 0
2026-03-10T12:38:05.107 INFO:tasks.workunit.client.1.vm07.stdout:0/617: dread d0/d14/d5f/d76/d2f/d31/f5a [0,4194304] 0
2026-03-10T12:38:05.127 INFO:tasks.workunit.client.1.vm07.stdout:6/549: sync
2026-03-10T12:38:05.129 INFO:tasks.workunit.client.1.vm07.stdout:6/550: write d1/d4/d6/d16/faf [661583,7631] 0
2026-03-10T12:38:05.129 INFO:tasks.workunit.client.1.vm07.stdout:7/539: write d0/d47/f73 [583642,113993] 0
2026-03-10T12:38:05.133 INFO:tasks.workunit.client.1.vm07.stdout:6/551: dwrite d1/d4/d6/f7c [0,4194304] 0
2026-03-10T12:38:05.147 INFO:tasks.workunit.client.1.vm07.stdout:0/618: mknod d0/d14/d5f/d3b/cca 0
2026-03-10T12:38:05.154 INFO:tasks.workunit.client.1.vm07.stdout:2/470: unlink d0/d29/d64/d6c/d94/c9d 0
2026-03-10T12:38:05.158 INFO:tasks.workunit.client.0.vm00.stdout:0/604: fsync d3/db/d77/faa 0
2026-03-10T12:38:05.168 INFO:tasks.workunit.client.0.vm00.stdout:7/516: write da/d3f/d60/f88 [138797,37376] 0
2026-03-10T12:38:05.172 INFO:tasks.workunit.client.0.vm00.stdout:7/517: dwrite da/d26/d37/d56/f9a [0,4194304] 0
2026-03-10T12:38:05.182 INFO:tasks.workunit.client.0.vm00.stdout:5/778: dwrite d1f/d26/d2b/fce [0,4194304] 0
2026-03-10T12:38:05.187 INFO:tasks.workunit.client.0.vm00.stdout:2/717: dwrite d4/d6/d2d/d3a/d43/d85/fa3 [0,4194304] 0
2026-03-10T12:38:05.192 INFO:tasks.workunit.client.0.vm00.stdout:2/718: dwrite d4/fc1 [0,4194304] 0
2026-03-10T12:38:05.194 INFO:tasks.workunit.client.0.vm00.stdout:0/605: truncate d3/d22/f71 1486347 0
2026-03-10T12:38:05.197 INFO:tasks.workunit.client.0.vm00.stdout:0/606: truncate d3/db/fbc 218193 0
2026-03-10T12:38:05.205 INFO:tasks.workunit.client.0.vm00.stdout:3/757: dwrite dd/d3d/d65/fad [0,4194304] 0
2026-03-10T12:38:05.206 INFO:tasks.workunit.client.0.vm00.stdout:9/756: write d0/d3d/d43/f68 [4572305,103932] 0
2026-03-10T12:38:05.213 INFO:tasks.workunit.client.0.vm00.stdout:1/745: truncate da/d24/d73/fb6 846729 0
2026-03-10T12:38:05.216 INFO:tasks.workunit.client.0.vm00.stdout:4/736: dwrite df/d1f/d22/d26/d65/fba [0,4194304] 0
2026-03-10T12:38:05.218 INFO:tasks.workunit.client.0.vm00.stdout:4/737: readlink df/d1f/d22/l3f 0
2026-03-10T12:38:05.220 INFO:tasks.workunit.client.0.vm00.stdout:9/757: creat d0/d7f/db8/dc4/f111 x:0 0 0
2026-03-10T12:38:05.222 INFO:tasks.workunit.client.0.vm00.stdout:4/738: dwrite df/d1f/d22/f3c [0,4194304] 0
2026-03-10T12:38:05.228 INFO:tasks.workunit.client.0.vm00.stdout:3/758: mknod dd/d27/cfa 0
2026-03-10T12:38:05.230 INFO:tasks.workunit.client.1.vm07.stdout:6/552: stat d1/d4/d6/d16/l2f 0
2026-03-10T12:38:05.231 INFO:tasks.workunit.client.1.vm07.stdout:6/553: dread - d1/d4/d6/d43/d88/d97/fa2 zero size
2026-03-10T12:38:05.233 INFO:tasks.workunit.client.0.vm00.stdout:2/719: getdents d4/dd/db9 0
2026-03-10T12:38:05.236 INFO:tasks.workunit.client.1.vm07.stdout:0/619: creat d0/d14/d5f/d3b/fcb x:0 0 0
2026-03-10T12:38:05.237 INFO:tasks.workunit.client.1.vm07.stdout:0/620: dread - d0/d14/d5f/d76/d2f/d31/d4f/fc4 zero size
2026-03-10T12:38:05.240 INFO:tasks.workunit.client.0.vm00.stdout:4/739: mkdir df/d1f/d22/dcb/def 0
2026-03-10T12:38:05.250 INFO:tasks.workunit.client.0.vm00.stdout:0/607: creat d3/d7/d4c/d9d/fc8 x:0 0 0
2026-03-10T12:38:05.260 INFO:tasks.workunit.client.0.vm00.stdout:4/740: rename df/d1f/d22/d26/d65/f8e to df/d1f/d22/d26/ff0 0
2026-03-10T12:38:05.268 INFO:tasks.workunit.client.1.vm07.stdout:0/621: sync
2026-03-10T12:38:05.268 INFO:tasks.workunit.client.1.vm07.stdout:8/571: link d1/f19 d1/d3/d6/fb7 0
2026-03-10T12:38:05.268 INFO:tasks.workunit.client.0.vm00.stdout:4/741: mkdir df/d1f/d36/dc6/df1 0
2026-03-10T12:38:05.268 INFO:tasks.workunit.client.0.vm00.stdout:4/742: symlink df/d1f/d22/d26/d65/d91/db9/lf2 0
2026-03-10T12:38:05.268 INFO:tasks.workunit.client.0.vm00.stdout:4/743: truncate df/d1f/d22/f72 478906 0
2026-03-10T12:38:05.271 INFO:tasks.workunit.client.0.vm00.stdout:4/744: dwrite df/d1f/d22/d26/dab/fd7 [0,4194304] 0
2026-03-10T12:38:05.273 INFO:tasks.workunit.client.1.vm07.stdout:5/592: dwrite d0/d22/d18/d19/d21/d3a/fa2 [0,4194304] 0
2026-03-10T12:38:05.275 INFO:tasks.workunit.client.0.vm00.stdout:4/745: symlink df/d1f/d22/d26/d65/da7/lf3 0
2026-03-10T12:38:05.280 INFO:tasks.workunit.client.0.vm00.stdout:4/746: mknod df/d1f/d22/d26/dab/d73/dda/cf4 0
2026-03-10T12:38:05.280 INFO:tasks.workunit.client.0.vm00.stdout:4/747: rename df/d1f/d36/d3a/d41/f5e to df/d93/dbc/ff5 0
2026-03-10T12:38:05.281 INFO:tasks.workunit.client.0.vm00.stdout:4/748: link df/d63/d77/lc8 df/d1f/d22/d26/d65/lf6 0
2026-03-10T12:38:05.282 INFO:tasks.workunit.client.0.vm00.stdout:4/749: mkdir df/d1f/d36/d3a/d41/df7 0
2026-03-10T12:38:05.283 INFO:tasks.workunit.client.0.vm00.stdout:4/750: write df/d1f/d22/d26/dab/fd7 [2044576,40461] 0
2026-03-10T12:38:05.285 INFO:tasks.workunit.client.0.vm00.stdout:4/751: unlink df/d1f/d22/d26/caf 0
2026-03-10T12:38:05.285 INFO:tasks.workunit.client.0.vm00.stdout:4/752: stat df/d6c 0
2026-03-10T12:38:05.286 INFO:tasks.workunit.client.0.vm00.stdout:6/494: rmdir d2/d14 39
2026-03-10T12:38:05.288 INFO:tasks.workunit.client.1.vm07.stdout:2/471: fdatasync d0/f9c 0
2026-03-10T12:38:05.323 INFO:tasks.workunit.client.1.vm07.stdout:8/572: read d1/d3/d6/f4f [92258,24336] 0
2026-03-10T12:38:05.324 INFO:tasks.workunit.client.0.vm00.stdout:5/779: write d1f/d26/d2e/f8c [1297220,123917] 0
2026-03-10T12:38:05.325 INFO:tasks.workunit.client.0.vm00.stdout:5/780: dread - d1f/d26/d2b/d37/dcc/fed zero size
2026-03-10T12:38:05.326 INFO:tasks.workunit.client.0.vm00.stdout:5/781: dread - d1f/d26/d2e/d58/d10c/f110 zero size
2026-03-10T12:38:05.326 INFO:tasks.workunit.client.0.vm00.stdout:8/605: dwrite d0/d93/f27 [0,4194304] 0
2026-03-10T12:38:05.334 INFO:tasks.workunit.client.0.vm00.stdout:8/606: creat d0/d93/d17/da2/fc1 x:0 0 0
2026-03-10T12:38:05.342 INFO:tasks.workunit.client.0.vm00.stdout:8/607: unlink d0/d93/d36/d5b/laa 0
2026-03-10T12:38:05.344 INFO:tasks.workunit.client.0.vm00.stdout:6/495: rmdir d2/d16/d29/d31/d88 39
2026-03-10T12:38:05.349 INFO:tasks.workunit.client.0.vm00.stdout:3/759: write dd/d64/f98 [538805,39722] 0
2026-03-10T12:38:05.350 INFO:tasks.workunit.client.0.vm00.stdout:1/746: dwrite da/d12/d26/fd0 [0,4194304] 0
2026-03-10T12:38:05.351 INFO:tasks.workunit.client.0.vm00.stdout:2/720: write d4/d6/f4e [920817,93931] 0
2026-03-10T12:38:05.353 INFO:tasks.workunit.client.0.vm00.stdout:9/758: write d0/f21 [4818320,95684] 0
2026-03-10T12:38:05.354 INFO:tasks.workunit.client.0.vm00.stdout:3/760: dwrite dd/d3d/fe3 [0,4194304] 0
2026-03-10T12:38:05.356 INFO:tasks.workunit.client.0.vm00.stdout:9/759: symlink d0/d7f/l112 0
2026-03-10T12:38:05.359 INFO:tasks.workunit.client.0.vm00.stdout:9/760: chown d0/d3d/d59/d4e/dba/d19/f95 0 1
2026-03-10T12:38:05.361 INFO:tasks.workunit.client.0.vm00.stdout:3/761: creat dd/d3d/d8a/ffb x:0 0 0
2026-03-10T12:38:05.361 INFO:tasks.workunit.client.0.vm00.stdout:3/762: stat dd/ff1 0
2026-03-10T12:38:05.363 INFO:tasks.workunit.client.0.vm00.stdout:2/721: symlink d4/led 0
2026-03-10T12:38:05.364 INFO:tasks.workunit.client.0.vm00.stdout:9/761: creat d0/d7f/d88/f113 x:0 0 0
2026-03-10T12:38:05.375 INFO:tasks.workunit.client.0.vm00.stdout:5/782: sync
2026-03-10T12:38:05.375 INFO:tasks.workunit.client.0.vm00.stdout:8/608: sync
2026-03-10T12:38:05.378 INFO:tasks.workunit.client.0.vm00.stdout:8/609: dwrite d0/d93/d2d/f75 [0,4194304] 0
2026-03-10T12:38:05.381 INFO:tasks.workunit.client.0.vm00.stdout:1/747: getdents da/d21/db3/d5d 0
2026-03-10T12:38:05.383 INFO:tasks.workunit.client.0.vm00.stdout:2/722: fsync d4/f39 0
2026-03-10T12:38:05.384 INFO:tasks.workunit.client.1.vm07.stdout:7/540: getdents d0/d47/d48 0
2026-03-10T12:38:05.385 INFO:tasks.workunit.client.0.vm00.stdout:8/610: truncate d0/d93/d2d/f55 1318901 0
2026-03-10T12:38:05.386 INFO:tasks.workunit.client.0.vm00.stdout:1/748: creat da/d12/d26/dd2/ff9 x:0 0 0
2026-03-10T12:38:05.391 INFO:tasks.workunit.client.0.vm00.stdout:5/783: creat d1f/d6a/d94/dc9/f114 x:0 0 0
2026-03-10T12:38:05.391 INFO:tasks.workunit.client.0.vm00.stdout:1/749: chown da/d21/db3/d59/l5e 0 1
2026-03-10T12:38:05.392 INFO:tasks.workunit.client.0.vm00.stdout:5/784: read d1f/d26/d2b/d35/d53/d72/ff9 [2500064,7315] 0
2026-03-10T12:38:05.392 INFO:tasks.workunit.client.0.vm00.stdout:8/611: link d0/dd/d38/lb9 d0/d93/d36/d7d/lc2 0
2026-03-10T12:38:05.394 INFO:tasks.workunit.client.0.vm00.stdout:8/612: dread d0/d46/d6e/f7b [0,4194304] 0
2026-03-10T12:38:05.398 INFO:tasks.workunit.client.1.vm07.stdout:5/593: symlink d0/d22/d18/d19/d36/d75/ld5 0
2026-03-10T12:38:05.403 INFO:tasks.workunit.client.0.vm00.stdout:1/750: creat da/d21/db3/d59/da6/d8b/ffa x:0 0 0
2026-03-10T12:38:05.404 INFO:tasks.workunit.client.0.vm00.stdout:5/785: sync
2026-03-10T12:38:05.405 INFO:tasks.workunit.client.0.vm00.stdout:2/723: dread d4/dd/db9/f4c [0,4194304] 0
2026-03-10T12:38:05.405 INFO:tasks.workunit.client.0.vm00.stdout:5/786: sync
2026-03-10T12:38:05.411 INFO:tasks.workunit.client.0.vm00.stdout:2/724: dwrite d4/d6/d2d/dc3/fdf [0,4194304] 0
2026-03-10T12:38:05.414 INFO:tasks.workunit.client.0.vm00.stdout:6/496: mkdir d2/d42/d80/d98/db2 0
2026-03-10T12:38:05.416 INFO:tasks.workunit.client.0.vm00.stdout:1/751: mkdir da/d24/d28/d67/dfb 0
2026-03-10T12:38:05.419 INFO:tasks.workunit.client.0.vm00.stdout:1/752: write da/d21/db3/d59/da6/da4/dda/fbb [507335,100764] 0
2026-03-10T12:38:05.422 INFO:tasks.workunit.client.0.vm00.stdout:1/753: fsync da/d24/d28/f37 0
2026-03-10T12:38:05.434 INFO:tasks.workunit.client.0.vm00.stdout:1/754: symlink da/d21/db3/d59/da6/da4/lfc 0
2026-03-10T12:38:05.437 INFO:tasks.workunit.client.0.vm00.stdout:1/755: mknod da/d24/d28/d67/da2/cfd 0
2026-03-10T12:38:05.459 INFO:tasks.workunit.client.1.vm07.stdout:9/643: write d5/d13/d2c/f41 [1757219,56893] 0
2026-03-10T12:38:05.459 INFO:tasks.workunit.client.1.vm07.stdout:8/573: readlink d1/d3/d6/l17 0
2026-03-10T12:38:05.495 INFO:tasks.workunit.client.0.vm00.stdout:4/753: write df/d1f/d22/d26/dab/d73/f7a [229244,97931] 0
2026-03-10T12:38:05.497 INFO:tasks.workunit.client.0.vm00.stdout:4/754: creat df/d63/ddb/ff8 x:0 0 0
2026-03-10T12:38:05.497 INFO:tasks.workunit.client.1.vm07.stdout:5/594: truncate d0/d22/d18/f4c 9196546 0
2026-03-10T12:38:05.498 INFO:tasks.workunit.client.0.vm00.stdout:4/755: creat df/d1f/ff9 x:0 0 0
2026-03-10T12:38:05.502 INFO:tasks.workunit.client.0.vm00.stdout:4/756: dwrite df/fac [4194304,4194304] 0
2026-03-10T12:38:05.504 INFO:tasks.workunit.client.0.vm00.stdout:4/757: chown df/d1f/d36/d3a/d41/fe0 4616998 1
2026-03-10T12:38:05.506 INFO:tasks.workunit.client.0.vm00.stdout:4/758: read df/d1f/d22/d26/dab/f89 [38429,94946] 0
2026-03-10T12:38:05.507 INFO:tasks.workunit.client.0.vm00.stdout:4/759: readlink df/d1f/d22/d26/d65/da7/lc4 0
2026-03-10T12:38:05.508 INFO:tasks.workunit.client.0.vm00.stdout:4/760: fsync df/f12 0
2026-03-10T12:38:05.511 INFO:tasks.workunit.client.1.vm07.stdout:1/560: write d9/fd [5156540,39763] 0
2026-03-10T12:38:05.511 INFO:tasks.workunit.client.0.vm00.stdout:4/761: rmdir df/d6c 39
2026-03-10T12:38:05.512 INFO:tasks.workunit.client.1.vm07.stdout:1/561: write d9/df/f26 [2489414,117889] 0
2026-03-10T12:38:05.519 INFO:tasks.workunit.client.0.vm00.stdout:4/762: symlink df/d63/d94/lfa 0
2026-03-10T12:38:05.519 INFO:tasks.workunit.client.1.vm07.stdout:2/472: dread d0/d29/d64/f67 [0,4194304] 0
2026-03-10T12:38:05.520 INFO:tasks.workunit.client.1.vm07.stdout:9/644: dwrite d5/d13/d57/d4f/d6a/f8e [0,4194304] 0
2026-03-10T12:38:05.534 INFO:tasks.workunit.client.1.vm07.stdout:3/604: dwrite dc/dd/d28/d3b/fa5 [8388608,4194304] 0
2026-03-10T12:38:05.540 INFO:tasks.workunit.client.1.vm07.stdout:8/574: unlink d1/d3/d18/c8b 0
2026-03-10T12:38:05.546 INFO:tasks.workunit.client.0.vm00.stdout:4/763: dread df/d1f/d36/d3a/d41/fc7 [0,4194304] 0
2026-03-10T12:38:05.551 INFO:tasks.workunit.client.0.vm00.stdout:4/764: dread df/d1f/d22/d26/dab/fd7 [0,4194304] 0
2026-03-10T12:38:05.561 INFO:tasks.workunit.client.1.vm07.stdout:4/688: write d0/d8e/fc4 [299469,69434] 0
2026-03-10T12:38:05.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:05 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:38:05.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:05 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:38:05.576 INFO:tasks.workunit.client.1.vm07.stdout:5/595: rename d0/d22/d18/d19/d36/d75/ld5 to d0/d22/d18/d19/d72/dcc/ld6 0
2026-03-10T12:38:05.577 INFO:tasks.workunit.client.1.vm07.stdout:5/596: write d0/d22/d18/d19/d21/d54/dcb/db8/fca [553515,87459] 0
2026-03-10T12:38:05.620 INFO:tasks.workunit.client.0.vm00.stdout:7/518: truncate da/d26/d37/f79 726274 0
2026-03-10T12:38:05.663 INFO:tasks.workunit.client.0.vm00.stdout:0/608: write d3/d22/f83 [1095862,123580] 0
2026-03-10T12:38:05.664 INFO:tasks.workunit.client.1.vm07.stdout:2/473: creat d0/d45/fa1 x:0 0 0
2026-03-10T12:38:05.692 INFO:tasks.workunit.client.0.vm00.stdout:9/762: dwrite d0/d3d/d59/d4e/dba/d1e/d2b/fc7 [0,4194304] 0
2026-03-10T12:38:05.697 INFO:tasks.workunit.client.0.vm00.stdout:3/763: dwrite dd/d18/d13/d1d/f69 [0,4194304] 0
2026-03-10T12:38:05.703 INFO:tasks.workunit.client.0.vm00.stdout:3/764: readlink dd/d2a/l57 0
2026-03-10T12:38:05.706 INFO:tasks.workunit.client.1.vm07.stdout:3/605: unlink dc/dd/f41 0
2026-03-10T12:38:05.706 INFO:tasks.workunit.client.1.vm07.stdout:3/606: chown dc/dd/db5/f73 0 1
2026-03-10T12:38:05.719 INFO:tasks.workunit.client.0.vm00.stdout:4/765: dread df/d32/d64/f67 [0,4194304] 0
2026-03-10T12:38:05.723 INFO:tasks.workunit.client.1.vm07.stdout:6/554: dwrite d1/d4/d6/f7d [0,4194304] 0
2026-03-10T12:38:05.728 INFO:tasks.workunit.client.1.vm07.stdout:8/575: mkdir d1/d3/d6/d7b/db8 0
2026-03-10T12:38:05.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:05 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:38:05.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:05 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:38:05.737 INFO:tasks.workunit.client.0.vm00.stdout:7/519: fdatasync da/f10 0
2026-03-10T12:38:05.742 INFO:tasks.workunit.client.1.vm07.stdout:7/541: link d0/d67/d6f/fa2 d0/d47/d48/d8a/d9d/fb1 0
2026-03-10T12:38:05.749 INFO:tasks.workunit.client.1.vm07.stdout:4/689: rename d0/d4/d5/da/d66/lc2 to d0/lf3 0
2026-03-10T12:38:05.777 INFO:tasks.workunit.client.1.vm07.stdout:2/474: symlink d0/la2 0
2026-03-10T12:38:05.778 INFO:tasks.workunit.client.1.vm07.stdout:3/607: creat dc/dd/d43/fce x:0 0 0
2026-03-10T12:38:05.779 INFO:tasks.workunit.client.1.vm07.stdout:8/576: chown d1/d3/d40/c96 5 1
2026-03-10T12:38:05.780 INFO:tasks.workunit.client.1.vm07.stdout:8/577: truncate d1/d3/d6/d54/fa8 135556 0
2026-03-10T12:38:05.782 INFO:tasks.workunit.client.0.vm00.stdout:7/520: mkdir da/d41/d7b/d9d/dba 0
2026-03-10T12:38:05.783 INFO:tasks.workunit.client.1.vm07.stdout:7/542: rmdir d0 39
2026-03-10T12:38:05.787 INFO:tasks.workunit.client.1.vm07.stdout:4/690: creat d0/d4/d10/d8d/db2/ff4 x:0 0 0
2026-03-10T12:38:05.788 INFO:tasks.workunit.client.1.vm07.stdout:4/691: truncate d0/d4/d10/d3c/d2b/d2d/d9c/fcc 1324162 0
2026-03-10T12:38:05.791 INFO:tasks.workunit.client.1.vm07.stdout:3/608: mkdir dc/dd/d1f/d6f/dcf 0
2026-03-10T12:38:05.806 INFO:tasks.workunit.client.0.vm00.stdout:8/613: dwrite d0/d93/d36/d5b/f69 [0,4194304] 0
2026-03-10T12:38:05.811 INFO:tasks.workunit.client.0.vm00.stdout:8/614: symlink d0/d93/d36/d5b/lc3 0
2026-03-10T12:38:05.812 INFO:tasks.workunit.client.0.vm00.stdout:5/787: dwrite d1f/d6a/d94/dc9/fc8 [0,4194304] 0
2026-03-10T12:38:05.812 INFO:tasks.workunit.client.0.vm00.stdout:8/615: readlink d0/d93/d2d/lbb 0
2026-03-10T12:38:05.812 INFO:tasks.workunit.client.1.vm07.stdout:8/578: symlink d1/d3/d6/d50/d70/lb9 0
2026-03-10T12:38:05.827 INFO:tasks.workunit.client.0.vm00.stdout:5/788: symlink d1f/d26/d2b/d35/d53/d72/d9d/d8e/l115 0
2026-03-10T12:38:05.828 INFO:tasks.workunit.client.0.vm00.stdout:5/789: readlink d1f/d26/d2b/d35/d53/d72/d9d/dcb/ld4 0
2026-03-10T12:38:05.829 INFO:tasks.workunit.client.0.vm00.stdout:5/790: chown d1f/d26/d2b/d35/d53/d72/d9d/dcb/ld4 251075291 1
2026-03-10T12:38:05.839 INFO:tasks.workunit.client.0.vm00.stdout:9/763: truncate d0/d7f/db8/dc4/db0/fe1 18962 0
2026-03-10T12:38:05.839 INFO:tasks.workunit.client.0.vm00.stdout:9/764: dread - d0/d5/f10d zero size
2026-03-10T12:38:05.842 INFO:tasks.workunit.client.1.vm07.stdout:1/562: write d9/df/d29/d2b/d31/d91/d59/f73 [1499988,57598] 0
2026-03-10T12:38:05.848 INFO:tasks.workunit.client.0.vm00.stdout:3/765: truncate dd/d27/d2c/f89 413457 0
2026-03-10T12:38:05.848 INFO:tasks.workunit.client.1.vm07.stdout:9/645: write d5/d69/d93/d97/fd9 [898763,94466] 0
2026-03-10T12:38:05.857 INFO:tasks.workunit.client.1.vm07.stdout:9/646: dread d5/f91 [0,4194304] 0
2026-03-10T12:38:05.859 INFO:tasks.workunit.client.0.vm00.stdout:9/765: dread d0/d7f/db8/dc4/db0/fbf [0,4194304] 0
2026-03-10T12:38:05.860 INFO:tasks.workunit.client.0.vm00.stdout:9/766: mkdir d0/d3d/d43/d114 0
2026-03-10T12:38:05.866 INFO:tasks.workunit.client.0.vm00.stdout:2/725: dwrite d4/dd/f3e [0,4194304] 0
2026-03-10T12:38:05.868 INFO:tasks.workunit.client.0.vm00.stdout:1/756: truncate da/d12/f1d 2683272 0
2026-03-10T12:38:05.869 INFO:tasks.workunit.client.1.vm07.stdout:4/692: unlink d0/d4/d5/da/l17 0
2026-03-10T12:38:05.869 INFO:tasks.workunit.client.0.vm00.stdout:1/757: stat da/d24/d28/d67/c46 0
2026-03-10T12:38:05.869 INFO:tasks.workunit.client.0.vm00.stdout:1/758: chown da/d24/d73/le6 214 1
2026-03-10T12:38:05.871 INFO:tasks.workunit.client.0.vm00.stdout:2/726: symlink d4/d53/d68/dc2/dd9/lee 0
2026-03-10T12:38:05.875 INFO:tasks.workunit.client.1.vm07.stdout:2/475: symlink d0/d29/d64/la3 0
2026-03-10T12:38:05.875 INFO:tasks.workunit.client.0.vm00.stdout:1/759: mkdir da/d21/db3/d59/da6/da4/dda/dc0/dfe 0
2026-03-10T12:38:05.882 INFO:tasks.workunit.client.0.vm00.stdout:0/609: write d3/d22/f42 [2979133,51226] 0
2026-03-10T12:38:05.884 INFO:tasks.workunit.client.0.vm00.stdout:3/766: dread dd/d18/d13/d99/da5/fcc [0,4194304] 0
2026-03-10T12:38:05.888 INFO:tasks.workunit.client.0.vm00.stdout:3/767: symlink dd/d3d/d84/lfc 0
2026-03-10T12:38:05.889 INFO:tasks.workunit.client.0.vm00.stdout:3/768: mkdir dd/d3d/d8a/de0/d55/dfd 0
2026-03-10T12:38:05.891 INFO:tasks.workunit.client.0.vm00.stdout:3/769: creat dd/d18/d13/d1d/dc6/ffe x:0 0 0
2026-03-10T12:38:05.894 INFO:tasks.workunit.client.1.vm07.stdout:8/579: rmdir d1/d3/d40/d92/db6 39
2026-03-10T12:38:05.896 INFO:tasks.workunit.client.0.vm00.stdout:3/770: dread dd/d18/f83 [0,4194304] 0
2026-03-10T12:38:05.896 INFO:tasks.workunit.client.0.vm00.stdout:0/610: mknod d3/d7/d4c/d5b/d38/db3/cc9 0
2026-03-10T12:38:05.900 INFO:tasks.workunit.client.0.vm00.stdout:0/611: dwrite d3/d7/d4c/d5b/f57 [0,4194304] 0
2026-03-10T12:38:05.901 INFO:tasks.workunit.client.0.vm00.stdout:0/612: read d3/d7/d4c/f96 [919690,45533] 0
2026-03-10T12:38:05.901 INFO:tasks.workunit.client.0.vm00.stdout:9/767: sync
2026-03-10T12:38:05.912 INFO:tasks.workunit.client.0.vm00.stdout:0/613: creat d3/d7/d4c/d5b/d38/db3/fca x:0 0 0
2026-03-10T12:38:05.951 INFO:tasks.workunit.client.0.vm00.stdout:4/766: creat df/d6c/ffb x:0 0 0
2026-03-10T12:38:05.962 INFO:tasks.workunit.client.0.vm00.stdout:0/614: symlink d3/d7/d3c/lcb 0
2026-03-10T12:38:05.962 INFO:tasks.workunit.client.0.vm00.stdout:0/615: chown d3/d33 5 1
2026-03-10T12:38:05.963 INFO:tasks.workunit.client.0.vm00.stdout:0/616: write d3/db/d24/d25/fbd [251116,20688] 0
2026-03-10T12:38:05.964 INFO:tasks.workunit.client.0.vm00.stdout:0/617: truncate d3/d7/d4c/d9d/fc8 168123 0
2026-03-10T12:38:05.964 INFO:tasks.workunit.client.0.vm00.stdout:0/618: readlink d3/d7/d4c/d5b/d38/l51 0
2026-03-10T12:38:05.965 INFO:tasks.workunit.client.0.vm00.stdout:0/619: chown d3/db/d77/d82/lb5 748394149 1
2026-03-10T12:38:05.965 INFO:tasks.workunit.client.0.vm00.stdout:0/620: chown d3/d22/f2e 2016 1
2026-03-10T12:38:05.989 INFO:tasks.workunit.client.1.vm07.stdout:5/597: dwrite d0/d22/d18/f20 [0,4194304] 0
2026-03-10T12:38:05.990 INFO:tasks.workunit.client.1.vm07.stdout:5/598: chown d0/d22/d18/d19/d72/dcc 1 1
2026-03-10T12:38:05.991 INFO:tasks.workunit.client.0.vm00.stdout:5/791: write d1f/d26/d2b/d35/f50 [1270901,66173] 0
2026-03-10T12:38:06.004 INFO:tasks.workunit.client.0.vm00.stdout:7/521: truncate da/d25/d2c/d82/d68/f38 639318 0
2026-03-10T12:38:06.015 INFO:tasks.workunit.client.0.vm00.stdout:0/621: mkdir d3/d7/d4c/dcc 0
2026-03-10T12:38:06.018 INFO:tasks.workunit.client.0.vm00.stdout:2/727: dwrite d4/d53/d76/d9b/dad/f50 [4194304,4194304] 0
2026-03-10T12:38:06.021 INFO:tasks.workunit.client.0.vm00.stdout:2/728: mkdir d4/dd/def 0
2026-03-10T12:38:06.026 INFO:tasks.workunit.client.0.vm00.stdout:7/522: creat da/d26/d37/d56/fbb x:0 0 0
2026-03-10T12:38:06.029 INFO:tasks.workunit.client.0.vm00.stdout:2/729: fsync d4/f1d 0
2026-03-10T12:38:06.038 INFO:tasks.workunit.client.0.vm00.stdout:1/760: truncate da/d24/d5a/f7c 441680 0
2026-03-10T12:38:06.045 INFO:tasks.workunit.client.0.vm00.stdout:2/730: dread d4/d6/f30 [0,4194304] 0
2026-03-10T12:38:06.047 INFO:tasks.workunit.client.0.vm00.stdout:2/731: truncate d4/d6/f89 303783 0
2026-03-10T12:38:06.052 INFO:tasks.workunit.client.0.vm00.stdout:1/761: dread da/d24/d28/d67/da2/d78/f86 [0,4194304] 0
2026-03-10T12:38:06.052 INFO:tasks.workunit.client.0.vm00.stdout:1/762: chown da/d24/d28/d67/da2/cfd 4361726 1
2026-03-10T12:38:06.053 INFO:tasks.workunit.client.0.vm00.stdout:2/732: symlink d4/d6/d2d/d3a/d43/lf0 0
2026-03-10T12:38:06.059 INFO:tasks.workunit.client.0.vm00.stdout:4/767: dwrite f3 [0,4194304] 0
2026-03-10T12:38:06.060 INFO:tasks.workunit.client.0.vm00.stdout:4/768: chown df/d1f/d22/leb 578029048 1
2026-03-10T12:38:06.061 INFO:tasks.workunit.client.0.vm00.stdout:0/622: creat d3/d7/d4c/d5b/d38/db3/fcd x:0 0 0
2026-03-10T12:38:06.071 INFO:tasks.workunit.client.0.vm00.stdout:4/769: link df/d6c/d90/cc1 df/d32/d76/cfc 0
2026-03-10T12:38:06.072 INFO:tasks.workunit.client.0.vm00.stdout:4/770: creat df/d93/dbc/ffd x:0 0 0
2026-03-10T12:38:06.074 INFO:tasks.workunit.client.0.vm00.stdout:4/771: symlink df/d93/dbc/lfe 0
2026-03-10T12:38:06.076 INFO:tasks.workunit.client.0.vm00.stdout:3/771: mknod dd/d18/d13/d99/cff 0
2026-03-10T12:38:06.076 INFO:tasks.workunit.client.0.vm00.stdout:3/772: readlink dd/l1f 0
2026-03-10T12:38:06.077 INFO:tasks.workunit.client.0.vm00.stdout:3/773: chown dd/d2a/da2/de1/f60 201 1
2026-03-10T12:38:06.081 INFO:tasks.workunit.client.0.vm00.stdout:0/623: truncate d3/d7/d3c/d74/f78 826641 0
2026-03-10T12:38:06.083 INFO:tasks.workunit.client.0.vm00.stdout:0/624: readlink d3/d7/d3c/l20 0
2026-03-10T12:38:06.084 INFO:tasks.workunit.client.0.vm00.stdout:2/733: dread d4/d6/dca/f3f [0,4194304] 0
2026-03-10T12:38:06.085 INFO:tasks.workunit.client.0.vm00.stdout:2/734: fsync d4/dd/fe2 0
2026-03-10T12:38:06.086 INFO:tasks.workunit.client.0.vm00.stdout:0/625: creat d3/db/d77/d82/fce x:0 0 0
2026-03-10T12:38:06.087 INFO:tasks.workunit.client.0.vm00.stdout:2/735: creat d4/d6/d2d/dc3/de1/ff1 x:0 0 0
2026-03-10T12:38:06.087 INFO:tasks.workunit.client.0.vm00.stdout:2/736: fdatasync d4/dd/da7/fd2 0
2026-03-10T12:38:06.088 INFO:tasks.workunit.client.0.vm00.stdout:0/626: symlink d3/db/d77/lcf 0
2026-03-10T12:38:06.088 INFO:tasks.workunit.client.0.vm00.stdout:0/627: stat d3/db/d77/f8a 0
2026-03-10T12:38:06.089 INFO:tasks.workunit.client.0.vm00.stdout:0/628: dread - d3/db/d24/d25/f7d zero size
2026-03-10T12:38:06.090 INFO:tasks.workunit.client.0.vm00.stdout:2/737: creat d4/d6/ff2 x:0 0 0
2026-03-10T12:38:06.092 INFO:tasks.workunit.client.0.vm00.stdout:2/738: symlink d4/d53/d68/dc2/dd9/lf3 0
2026-03-10T12:38:06.092 INFO:tasks.workunit.client.0.vm00.stdout:2/739: dread - d4/d6/d2d/dc3/fce zero size
2026-03-10T12:38:06.093 INFO:tasks.workunit.client.0.vm00.stdout:2/740: symlink d4/d6/dca/lf4 0
2026-03-10T12:38:06.095 INFO:tasks.workunit.client.0.vm00.stdout:2/741: dread - d4/d53/d76/d9b/dad/f65 zero size
2026-03-10T12:38:06.095 INFO:tasks.workunit.client.0.vm00.stdout:2/742: chown d4/d6/d2d/d3a/fbd 102896 1
2026-03-10T12:38:06.096 INFO:tasks.workunit.client.0.vm00.stdout:2/743: stat d4/d6/d2d/c9d 0
2026-03-10T12:38:06.097 INFO:tasks.workunit.client.0.vm00.stdout:2/744: write d4/d6/d2d/d3a/fbd [2918273,12435] 0
2026-03-10T12:38:06.098 INFO:tasks.workunit.client.1.vm07.stdout:7/543: fsync d0/f14 0
2026-03-10T12:38:06.102 INFO:tasks.workunit.client.0.vm00.stdout:7/523: getdents da/d25/d2e/d4c 0
2026-03-10T12:38:06.106 INFO:tasks.workunit.client.1.vm07.stdout:9/647: rename d5/d13/d6c/da4/ddd to d5/d13/d9b/dde 0
2026-03-10T12:38:06.115 INFO:tasks.workunit.client.1.vm07.stdout:2/476: chown d0/d42/f5f 12 1
2026-03-10T12:38:06.116 INFO:tasks.workunit.client.0.vm00.stdout:0/629: unlink d3/d40/d65/f8f 0
2026-03-10T12:38:06.116 INFO:tasks.workunit.client.1.vm07.stdout:3/609: mkdir dc/dd/d28/dd0 0
2026-03-10T12:38:06.117 INFO:tasks.workunit.client.1.vm07.stdout:3/610: write dc/d18/f36 [816888,39002] 0
2026-03-10T12:38:06.120 INFO:tasks.workunit.client.0.vm00.stdout:1/763: read da/d21/d39/f55 [3069701,8748] 0
2026-03-10T12:38:06.129 INFO:tasks.workunit.client.1.vm07.stdout:6/555: getdents d1/d4/d6/d16/d1a/d2c 0
2026-03-10T12:38:06.131 INFO:tasks.workunit.client.0.vm00.stdout:8/616: symlink d0/lc4 0
2026-03-10T12:38:06.132 INFO:tasks.workunit.client.0.vm00.stdout:8/617: fsync d0/d93/d36/f41 0
2026-03-10T12:38:06.132 INFO:tasks.workunit.client.0.vm00.stdout:8/618: stat d0/d93/d36/d5b/c97 0
2026-03-10T12:38:06.134 INFO:tasks.workunit.client.0.vm00.stdout:8/619: creat d0/d58/d68/fc5 x:0 0 0
2026-03-10T12:38:06.135 INFO:tasks.workunit.client.0.vm00.stdout:8/620: creat d0/d46/fc6 x:0 0 0
2026-03-10T12:38:06.138 INFO:tasks.workunit.client.1.vm07.stdout:0/622: dread d0/d14/d5f/d3b/f4b [0,4194304] 0
2026-03-10T12:38:06.138 INFO:tasks.workunit.client.0.vm00.stdout:8/621: dwrite d0/d93/d36/f41 [0,4194304] 0
2026-03-10T12:38:06.139 INFO:tasks.workunit.client.0.vm00.stdout:8/622: write d0/d5c/fa0 [631110,67831] 0
2026-03-10T12:38:06.144 INFO:tasks.workunit.client.1.vm07.stdout:5/599: creat d0/d22/d18/d19/d36/d75/d77/fd7 x:0 0 0
2026-03-10T12:38:06.162 INFO:tasks.workunit.client.0.vm00.stdout:7/524: creat da/d41/d48/fbc x:0 0 0
2026-03-10T12:38:06.162 INFO:tasks.workunit.client.1.vm07.stdout:1/563: truncate d9/d2d/d4f/d5a/f93 1511325 0
2026-03-10T12:38:06.165 INFO:tasks.workunit.client.1.vm07.stdout:9/648: chown d5/c3a 1206765807 1
2026-03-10T12:38:06.187 INFO:tasks.workunit.client.1.vm07.stdout:5/600: read - d0/d22/d18/d3e/d53/faa zero size
2026-03-10T12:38:06.190 INFO:tasks.workunit.client.1.vm07.stdout:9/649: creat d5/d13/d6c/fdf x:0 0 0
2026-03-10T12:38:06.191 INFO:tasks.workunit.client.1.vm07.stdout:9/650: write d5/d16/d23/fc8 [1023648,60367] 0
2026-03-10T12:38:06.194 INFO:tasks.workunit.client.1.vm07.stdout:5/601: creat d0/d22/d18/d19/d72/fd8 x:0 0 0
2026-03-10T12:38:06.195 INFO:tasks.workunit.client.1.vm07.stdout:5/602: chown d0/d22/c44 13 1
2026-03-10T12:38:06.200 INFO:tasks.workunit.client.1.vm07.stdout:5/603: mkdir d0/d22/d18/d19/d2e/d67/dd9 0
2026-03-10T12:38:06.201 INFO:tasks.workunit.client.1.vm07.stdout:5/604: stat d0/d22/d18/d19/d21/d3a/c7f 0
2026-03-10T12:38:06.203 INFO:tasks.workunit.client.1.vm07.stdout:9/651: unlink d5/d16/da3/cca 0
2026-03-10T12:38:06.205 INFO:tasks.workunit.client.1.vm07.stdout:5/605: rmdir d0/d22/d18/d19/d21/d3a 39
2026-03-10T12:38:06.212 INFO:tasks.workunit.client.1.vm07.stdout:5/606: rmdir d0/d22/d18/d19/d21/d54/dcb/db8 39
2026-03-10T12:38:06.216 INFO:tasks.workunit.client.1.vm07.stdout:5/607: mknod d0/d22/d18/d19/d36/d75/d77/cda 0
2026-03-10T12:38:06.225 INFO:tasks.workunit.client.0.vm00.stdout:8/623: dread d0/d93/d36/d5b/f6b [0,4194304] 0
2026-03-10T12:38:06.225 INFO:tasks.workunit.client.0.vm00.stdout:4/772: write df/d32/d76/f7e [874174,55387] 0
2026-03-10T12:38:06.225 INFO:tasks.workunit.client.0.vm00.stdout:8/624: truncate d0/d93/d17/fb2 163546 0
2026-03-10T12:38:06.226 INFO:tasks.workunit.client.1.vm07.stdout:5/608: chown d0/d22/d18/d19/d21/d54 0 1
2026-03-10T12:38:06.226 INFO:tasks.workunit.client.1.vm07.stdout:5/609: rename d0/d22/d18/f97 to d0/d22/d18/d19/d36/d75/fdb 0
2026-03-10T12:38:06.226 INFO:tasks.workunit.client.1.vm07.stdout:5/610: fsync d0/d22/f27 0
2026-03-10T12:38:06.226 INFO:tasks.workunit.client.1.vm07.stdout:1/564: sync
2026-03-10T12:38:06.237 INFO:tasks.workunit.client.1.vm07.stdout:2/477: write d0/d29/d64/d6c/f71 [4050851,51793] 0
2026-03-10T12:38:06.238 INFO:tasks.workunit.client.0.vm00.stdout:8/625: stat d0/d93/d17/l3b 0
2026-03-10T12:38:06.238 INFO:tasks.workunit.client.0.vm00.stdout:0/630: write d3/d7/d3c/f72 [4138913,104851] 0
2026-03-10T12:38:06.240 INFO:tasks.workunit.client.1.vm07.stdout:4/693: dwrite d0/d4/d10/f36 [4194304,4194304] 0
2026-03-10T12:38:06.241 INFO:tasks.workunit.client.0.vm00.stdout:4/773: creat df/d1f/d22/d26/d65/d91/fff x:0 0 0
2026-03-10T12:38:06.250 INFO:tasks.workunit.client.1.vm07.stdout:8/580: write d1/d3/d6/f24 [2327909,97094] 0
2026-03-10T12:38:06.250 INFO:tasks.workunit.client.1.vm07.stdout:3/611: write f2 [800195,34120] 0
2026-03-10T12:38:06.250 INFO:tasks.workunit.client.1.vm07.stdout:0/623: write d0/d14/d5f/d76/d2f/d31/d79/f7b [3309082,957] 0
2026-03-10T12:38:06.251 INFO:tasks.workunit.client.1.vm07.stdout:8/581: chown d1/d3/l99 17 1
2026-03-10T12:38:06.252 INFO:tasks.workunit.client.1.vm07.stdout:0/624: write d0/d14/d5f/d76/d2f/d31/d79/f7b [4447739,111366] 0
2026-03-10T12:38:06.253 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.252+0000 7fb463003700  1 -- 192.168.123.100:0/1634774535 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb45c071a60 msgr2=0x7fb45c071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:38:06.253 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.252+0000 7fb463003700  1 --2- 192.168.123.100:0/1634774535 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb45c071a60 0x7fb45c071e70 secure :-1 s=READY pgs=337 cs=0 l=1 rev1=1 crypto rx=0x7fb458009b00 tx=0x7fb458009e10 comp rx=0 tx=0).stop
2026-03-10T12:38:06.253 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.252+0000 7fb463003700  1 -- 192.168.123.100:0/1634774535 shutdown_connections
2026-03-10T12:38:06.253 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.252+0000 7fb463003700  1 --2- 192.168.123.100:0/1634774535 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb45c072440 0x7fb45c10be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:38:06.253 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.252+0000 7fb463003700  1 --2- 192.168.123.100:0/1634774535 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb45c071a60 0x7fb45c071e70 unknown :-1 s=CLOSED pgs=337 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:38:06.253 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.252+0000 7fb463003700  1 -- 192.168.123.100:0/1634774535 >> 192.168.123.100:0/1634774535 conn(0x7fb45c06d1a0 msgr2=0x7fb45c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:38:06.253 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.253+0000 7fb463003700  1 -- 192.168.123.100:0/1634774535 shutdown_connections
2026-03-10T12:38:06.253 INFO:tasks.workunit.client.0.vm00.stdout:7/525: write da/d3f/d60/f85 [1212706,30502] 0
2026-03-10T12:38:06.253 INFO:tasks.workunit.client.1.vm07.stdout:6/556: dwrite d1/d4/d6/f41 [0,4194304] 0
2026-03-10T12:38:06.254
INFO:tasks.workunit.client.1.vm07.stdout:6/557: fdatasync d1/d4/d6/d16/f50 0 2026-03-10T12:38:06.261 INFO:tasks.workunit.client.1.vm07.stdout:7/544: dwrite d0/f9b [0,4194304] 0 2026-03-10T12:38:06.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.253+0000 7fb463003700 1 -- 192.168.123.100:0/1634774535 wait complete. 2026-03-10T12:38:06.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.255+0000 7fb463003700 1 Processor -- start 2026-03-10T12:38:06.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.256+0000 7fb463003700 1 -- start start 2026-03-10T12:38:06.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.256+0000 7fb463003700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb45c071a60 0x7fb45c116a10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:06.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.256+0000 7fb463003700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb45c072440 0x7fb45c116f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:06.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.256+0000 7fb463003700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb45c117570 con 0x7fb45c072440 2026-03-10T12:38:06.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.256+0000 7fb463003700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb45c1b2790 con 0x7fb45c071a60 2026-03-10T12:38:06.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.256+0000 7fb461800700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb45c072440 0x7fb45c116f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T12:38:06.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.257+0000 7fb461800700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb45c072440 0x7fb45c116f50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:49842/0 (socket says 192.168.123.100:49842) 2026-03-10T12:38:06.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.257+0000 7fb461800700 1 -- 192.168.123.100:0/4082003049 learned_addr learned my addr 192.168.123.100:0/4082003049 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:38:06.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.257+0000 7fb461800700 1 -- 192.168.123.100:0/4082003049 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb45c071a60 msgr2=0x7fb45c116a10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:06.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.257+0000 7fb461800700 1 --2- 192.168.123.100:0/4082003049 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb45c071a60 0x7fb45c116a10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:06.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.257+0000 7fb461800700 1 -- 192.168.123.100:0/4082003049 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb4580097e0 con 0x7fb45c072440 2026-03-10T12:38:06.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.257+0000 7fb461800700 1 --2- 192.168.123.100:0/4082003049 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb45c072440 0x7fb45c116f50 secure :-1 s=READY pgs=338 cs=0 l=1 rev1=1 crypto rx=0x7fb44c00b700 tx=0x7fb44c00bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:06.262 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.257+0000 7fb452ffd700 1 -- 192.168.123.100:0/4082003049 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb44c010820 con 0x7fb45c072440 2026-03-10T12:38:06.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.258+0000 7fb463003700 1 -- 192.168.123.100:0/4082003049 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb45c1b2990 con 0x7fb45c072440 2026-03-10T12:38:06.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.258+0000 7fb463003700 1 -- 192.168.123.100:0/4082003049 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb45c1b2e90 con 0x7fb45c072440 2026-03-10T12:38:06.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.258+0000 7fb452ffd700 1 -- 192.168.123.100:0/4082003049 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb44c010e60 con 0x7fb45c072440 2026-03-10T12:38:06.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.258+0000 7fb452ffd700 1 -- 192.168.123.100:0/4082003049 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb44c017570 con 0x7fb45c072440 2026-03-10T12:38:06.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.258+0000 7fb463003700 1 -- 192.168.123.100:0/4082003049 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb45c110c20 con 0x7fb45c072440 2026-03-10T12:38:06.263 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.261+0000 7fb452ffd700 1 -- 192.168.123.100:0/4082003049 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 19) v1 ==== 89910+0+0 (secure 0 0 0) 0x7fb44c00f830 con 0x7fb45c072440 2026-03-10T12:38:06.263 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.261+0000 7fb452ffd700 1 --2- 
192.168.123.100:0/4082003049 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb44806c680 0x7fb44806eb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:06.263 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.261+0000 7fb462001700 1 --2- 192.168.123.100:0/4082003049 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb44806c680 0x7fb44806eb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:06.263 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.262+0000 7fb462001700 1 --2- 192.168.123.100:0/4082003049 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb44806c680 0x7fb44806eb30 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7fb458009ad0 tx=0x7fb458009f90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:06.263 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.262+0000 7fb452ffd700 1 -- 192.168.123.100:0/4082003049 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb44c059a70 con 0x7fb45c072440 2026-03-10T12:38:06.268 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.267+0000 7fb452ffd700 1 -- 192.168.123.100:0/4082003049 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fb44c0595b0 con 0x7fb45c072440 2026-03-10T12:38:06.272 INFO:tasks.workunit.client.1.vm07.stdout:9/652: write d5/d13/f14 [3301341,18959] 0 2026-03-10T12:38:06.275 INFO:tasks.workunit.client.0.vm00.stdout:0/631: truncate d3/d7/d4c/d5b/f2a 2830172 0 2026-03-10T12:38:06.275 INFO:tasks.workunit.client.0.vm00.stdout:0/632: stat d3/db/d24/d25/l67 0 2026-03-10T12:38:06.276 INFO:tasks.workunit.client.0.vm00.stdout:0/633: readlink d3/d22/da5/lb7 0 
2026-03-10T12:38:06.279 INFO:tasks.workunit.client.1.vm07.stdout:5/611: unlink d0/l3 0 2026-03-10T12:38:06.284 INFO:tasks.workunit.client.0.vm00.stdout:7/526: mknod da/d47/d87/cbd 0 2026-03-10T12:38:06.296 INFO:tasks.workunit.client.0.vm00.stdout:6/497: rename d2/da/dc/d83/c86 to d2/d51/cb3 0 2026-03-10T12:38:06.301 INFO:tasks.workunit.client.0.vm00.stdout:7/527: rmdir da/d3f/d71 39 2026-03-10T12:38:06.306 INFO:tasks.workunit.client.0.vm00.stdout:9/768: rename d0/dc2 to d0/d3d/d59/d4e/dba/d1e/d27/d115 0 2026-03-10T12:38:06.328 INFO:tasks.workunit.client.0.vm00.stdout:9/769: unlink d0/d3d/d59/d4e/l7e 0 2026-03-10T12:38:06.339 INFO:tasks.workunit.client.1.vm07.stdout:0/625: mkdir d0/d14/d5f/d76/d2f/d31/d79/dcc 0 2026-03-10T12:38:06.340 INFO:tasks.workunit.client.1.vm07.stdout:0/626: write d0/d14/d5f/d3b/fcb [975977,74117] 0 2026-03-10T12:38:06.341 INFO:tasks.workunit.client.1.vm07.stdout:7/545: rename d0/l41 to d0/d57/d62/d90/lb2 0 2026-03-10T12:38:06.343 INFO:tasks.workunit.client.0.vm00.stdout:5/792: rename d1f/d26/d2b/d35/c36 to d1f/d26/d2b/d37/dbf/c116 0 2026-03-10T12:38:06.349 INFO:tasks.workunit.client.1.vm07.stdout:9/653: mkdir d5/d1f/d5e/d6b/de0 0 2026-03-10T12:38:06.354 INFO:tasks.workunit.client.0.vm00.stdout:0/634: dwrite d3/d7/d4c/d5b/f88 [0,4194304] 0 2026-03-10T12:38:06.367 INFO:tasks.workunit.client.1.vm07.stdout:1/565: dwrite d9/df/d29/d2b/f4e [4194304,4194304] 0 2026-03-10T12:38:06.371 INFO:tasks.workunit.client.0.vm00.stdout:7/528: dwrite da/d26/d37/f6f [0,4194304] 0 2026-03-10T12:38:06.371 INFO:tasks.workunit.client.1.vm07.stdout:2/478: symlink d0/d45/la4 0 2026-03-10T12:38:06.373 INFO:tasks.workunit.client.0.vm00.stdout:5/793: dwrite d1f/d26/d6f/fa9 [0,4194304] 0 2026-03-10T12:38:06.373 INFO:tasks.workunit.client.0.vm00.stdout:3/774: rename dd/d3d/d84 to dd/d2a/da2/de1/d100 0 2026-03-10T12:38:06.378 INFO:tasks.workunit.client.1.vm07.stdout:8/582: mkdir d1/d3/d40/d92/dba 0 2026-03-10T12:38:06.378 INFO:tasks.workunit.client.1.vm07.stdout:6/558: 
mkdir d1/d4/d6/d16/d1a/d9d/db2 0 2026-03-10T12:38:06.383 INFO:tasks.workunit.client.0.vm00.stdout:2/745: rename d4/d53/d76/ccd to d4/d6/cf5 0 2026-03-10T12:38:06.386 INFO:tasks.workunit.client.1.vm07.stdout:3/612: rename dc/dd/d43/d76/d95/da0/cc8 to dc/dd/d43/d76/d95/cd1 0 2026-03-10T12:38:06.387 INFO:tasks.workunit.client.0.vm00.stdout:7/529: symlink da/d26/d50/d73/lbe 0 2026-03-10T12:38:06.389 INFO:tasks.workunit.client.0.vm00.stdout:0/635: link d3/d40/l7f d3/d7/ld0 0 2026-03-10T12:38:06.389 INFO:tasks.workunit.client.0.vm00.stdout:0/636: write d3/db/d24/d25/fb8 [4200637,385] 0 2026-03-10T12:38:06.395 INFO:tasks.workunit.client.0.vm00.stdout:5/794: symlink d1f/d39/l117 0 2026-03-10T12:38:06.397 INFO:tasks.workunit.client.1.vm07.stdout:1/566: symlink d9/d2d/d80/d8e/lbd 0 2026-03-10T12:38:06.399 INFO:tasks.workunit.client.1.vm07.stdout:5/612: getdents d0/dbf 0 2026-03-10T12:38:06.401 INFO:tasks.workunit.client.1.vm07.stdout:1/567: dread d9/df/d29/d2b/f4e [4194304,4194304] 0 2026-03-10T12:38:06.402 INFO:tasks.workunit.client.0.vm00.stdout:7/530: creat da/d26/d50/fbf x:0 0 0 2026-03-10T12:38:06.405 INFO:tasks.workunit.client.1.vm07.stdout:6/559: sync 2026-03-10T12:38:06.413 INFO:tasks.workunit.client.1.vm07.stdout:0/627: getdents d0/d14/d5f/d76/d2f/d31/d79/dcc 0 2026-03-10T12:38:06.413 INFO:tasks.workunit.client.1.vm07.stdout:6/560: sync 2026-03-10T12:38:06.419 INFO:tasks.workunit.client.0.vm00.stdout:1/764: rename da/d24/d28/d67/da2/d78/c7f to da/d12/d26/cff 0 2026-03-10T12:38:06.420 INFO:tasks.workunit.client.1.vm07.stdout:1/568: symlink d9/df/d55/lbe 0 2026-03-10T12:38:06.421 INFO:tasks.workunit.client.1.vm07.stdout:2/479: creat d0/d29/d64/d74/d75/fa5 x:0 0 0 2026-03-10T12:38:06.421 INFO:tasks.workunit.client.1.vm07.stdout:3/613: dread dc/dd/d28/f46 [0,4194304] 0 2026-03-10T12:38:06.422 INFO:tasks.workunit.client.1.vm07.stdout:3/614: chown dc/dd/d43/c64 5501 1 2026-03-10T12:38:06.432 INFO:tasks.workunit.client.0.vm00.stdout:3/775: mkdir dd/d2a/da2/de1/d101 0 
2026-03-10T12:38:06.432 INFO:tasks.workunit.client.0.vm00.stdout:3/776: fdatasync dd/d2a/da2/db4/fdb 0 2026-03-10T12:38:06.434 INFO:tasks.workunit.client.0.vm00.stdout:5/795: fsync d1f/d6a/d94/fb3 0 2026-03-10T12:38:06.434 INFO:tasks.workunit.client.0.vm00.stdout:5/796: stat d1f/d26/d2b/l3f 0 2026-03-10T12:38:06.435 INFO:tasks.workunit.client.1.vm07.stdout:6/561: mknod d1/d4/d6/d16/d49/cb3 0 2026-03-10T12:38:06.439 INFO:tasks.workunit.client.1.vm07.stdout:2/480: creat d0/d29/d64/fa6 x:0 0 0 2026-03-10T12:38:06.441 INFO:tasks.workunit.client.1.vm07.stdout:1/569: rename d9/df/d29/d2b/d3d/f47 to d9/df/d29/d2b/d3d/fbf 0 2026-03-10T12:38:06.448 INFO:tasks.workunit.client.1.vm07.stdout:3/615: rename dc/dd/f21 to dc/d18/d99/da3/fd2 0 2026-03-10T12:38:06.450 INFO:tasks.workunit.client.1.vm07.stdout:4/694: write d0/d4/d10/d9a/db9/fef [238305,119537] 0 2026-03-10T12:38:06.453 INFO:tasks.workunit.client.1.vm07.stdout:0/628: symlink d0/d14/d5f/d76/lcd 0 2026-03-10T12:38:06.457 INFO:tasks.workunit.client.1.vm07.stdout:9/654: getdents d5/d69 0 2026-03-10T12:38:06.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.455+0000 7fb463003700 1 -- 192.168.123.100:0/4082003049 --> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fb45c061190 con 0x7fb44806c680 2026-03-10T12:38:06.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.458+0000 7fb452ffd700 1 -- 192.168.123.100:0/4082003049 <== mgr.14223 v2:192.168.123.100:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+349 (secure 0 0 0) 0x7fb45c061190 con 0x7fb44806c680 2026-03-10T12:38:06.462 INFO:tasks.workunit.client.1.vm07.stdout:9/655: dwrite d5/d16/d23/fc8 [0,4194304] 0 2026-03-10T12:38:06.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.461+0000 7fb450ff9700 1 -- 192.168.123.100:0/4082003049 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb44806c680 
msgr2=0x7fb44806eb30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:06.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.461+0000 7fb450ff9700 1 --2- 192.168.123.100:0/4082003049 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb44806c680 0x7fb44806eb30 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7fb458009ad0 tx=0x7fb458009f90 comp rx=0 tx=0).stop 2026-03-10T12:38:06.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.462+0000 7fb450ff9700 1 -- 192.168.123.100:0/4082003049 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb45c072440 msgr2=0x7fb45c116f50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:06.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.462+0000 7fb450ff9700 1 --2- 192.168.123.100:0/4082003049 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb45c072440 0x7fb45c116f50 secure :-1 s=READY pgs=338 cs=0 l=1 rev1=1 crypto rx=0x7fb44c00b700 tx=0x7fb44c00bac0 comp rx=0 tx=0).stop 2026-03-10T12:38:06.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.462+0000 7fb450ff9700 1 -- 192.168.123.100:0/4082003049 shutdown_connections 2026-03-10T12:38:06.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.462+0000 7fb450ff9700 1 --2- 192.168.123.100:0/4082003049 >> [v2:192.168.123.100:6800/2,v1:192.168.123.100:6801/2] conn(0x7fb44806c680 0x7fb44806eb30 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:06.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.462+0000 7fb450ff9700 1 --2- 192.168.123.100:0/4082003049 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb45c071a60 0x7fb45c116a10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:06.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.462+0000 7fb450ff9700 1 --2- 
192.168.123.100:0/4082003049 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb45c072440 0x7fb45c116f50 unknown :-1 s=CLOSED pgs=338 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:06.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.462+0000 7fb450ff9700 1 -- 192.168.123.100:0/4082003049 >> 192.168.123.100:0/4082003049 conn(0x7fb45c06d1a0 msgr2=0x7fb45c10b4a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:06.463 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.462+0000 7fb450ff9700 1 -- 192.168.123.100:0/4082003049 shutdown_connections 2026-03-10T12:38:06.463 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.462+0000 7fb450ff9700 1 -- 192.168.123.100:0/4082003049 wait complete. 2026-03-10T12:38:06.467 INFO:tasks.workunit.client.1.vm07.stdout:4/695: sync 2026-03-10T12:38:06.476 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:38:06.480 INFO:tasks.workunit.client.1.vm07.stdout:1/570: truncate d9/df/d54/f7a 646233 0 2026-03-10T12:38:06.484 INFO:tasks.workunit.client.0.vm00.stdout:6/498: rename d2/f9 to d2/da/dc/d2f/fb4 0 2026-03-10T12:38:06.485 INFO:tasks.workunit.client.1.vm07.stdout:0/629: symlink d0/d14/d7c/lce 0 2026-03-10T12:38:06.487 INFO:tasks.workunit.client.1.vm07.stdout:6/562: rename d1/d4/d44/l8c to d1/d4/d6/d53/lb4 0 2026-03-10T12:38:06.487 INFO:tasks.workunit.client.1.vm07.stdout:6/563: fdatasync d1/d4/d6/f41 0 2026-03-10T12:38:06.496 INFO:tasks.workunit.client.0.vm00.stdout:6/499: write d2/d39/f9b [299576,95286] 0 2026-03-10T12:38:06.497 INFO:tasks.workunit.client.0.vm00.stdout:7/531: rename da/d1b/d40/l8a to da/d47/lc0 0 2026-03-10T12:38:06.505 INFO:tasks.workunit.client.1.vm07.stdout:4/696: symlink d0/d4/d10/d5f/lf5 0 2026-03-10T12:38:06.512 INFO:tasks.workunit.client.0.vm00.stdout:7/532: mknod da/d1b/d40/cc1 0 2026-03-10T12:38:06.521 INFO:tasks.workunit.client.1.vm07.stdout:5/613: dwrite d0/d22/d18/d19/d2e/d67/fa0 [4194304,4194304] 0 
2026-03-10T12:38:06.521 INFO:tasks.workunit.client.0.vm00.stdout:0/637: dwrite d3/d7/d4c/d5b/f9b [0,4194304] 0 2026-03-10T12:38:06.524 INFO:tasks.workunit.client.0.vm00.stdout:0/638: read d3/d7/d4c/d5b/d38/f81 [749413,101568] 0 2026-03-10T12:38:06.524 INFO:tasks.workunit.client.0.vm00.stdout:0/639: readlink d3/db/d24/d25/lab 0 2026-03-10T12:38:06.526 INFO:tasks.workunit.client.0.vm00.stdout:6/500: mknod d2/d14/cb5 0 2026-03-10T12:38:06.526 INFO:tasks.workunit.client.1.vm07.stdout:8/583: dwrite d1/d3/d6c/fa7 [0,4194304] 0 2026-03-10T12:38:06.532 INFO:tasks.workunit.client.1.vm07.stdout:4/697: rename d0/d4/d7a to d0/d4/df2/df6 0 2026-03-10T12:38:06.532 INFO:tasks.workunit.client.1.vm07.stdout:4/698: chown d0/d4/d10/d9a/db9/fef 1832426 1 2026-03-10T12:38:06.533 INFO:tasks.workunit.client.1.vm07.stdout:4/699: stat d0/d4/d10/d3c/d2b/d54/de1/cca 0 2026-03-10T12:38:06.538 INFO:tasks.workunit.client.0.vm00.stdout:0/640: dwrite d3/d7/d3c/fba [0,4194304] 0 2026-03-10T12:38:06.552 INFO:tasks.workunit.client.0.vm00.stdout:6/501: unlink d2/d16/d29/fa6 0 2026-03-10T12:38:06.553 INFO:tasks.workunit.client.0.vm00.stdout:6/502: chown d2/da/dc/c5b 30959865 1 2026-03-10T12:38:06.553 INFO:tasks.workunit.client.1.vm07.stdout:5/614: mkdir d0/d22/d18/d19/d36/d75/ddc 0 2026-03-10T12:38:06.553 INFO:tasks.workunit.client.0.vm00.stdout:6/503: readlink d2/d16/d29/d31/d34/la2 0 2026-03-10T12:38:06.553 INFO:tasks.workunit.client.0.vm00.stdout:6/504: write d2/d39/f9b [1423391,44579] 0 2026-03-10T12:38:06.562 INFO:tasks.workunit.client.0.vm00.stdout:7/533: link da/d41/d7b/f83 da/d41/d7b/d9d/fc2 0 2026-03-10T12:38:06.564 INFO:tasks.workunit.client.1.vm07.stdout:2/481: dwrite d0/d42/d4e/d77/d70/f8a [0,4194304] 0 2026-03-10T12:38:06.572 INFO:tasks.workunit.client.1.vm07.stdout:1/571: write d9/f1f [118673,49551] 0 2026-03-10T12:38:06.579 INFO:tasks.workunit.client.0.vm00.stdout:4/774: write df/d1f/d22/d26/dab/fc5 [286267,89627] 0 2026-03-10T12:38:06.588 INFO:tasks.workunit.client.0.vm00.stdout:8/626: 
truncate d0/d5c/fa0 1556518 0 2026-03-10T12:38:06.588 INFO:tasks.workunit.client.1.vm07.stdout:9/656: rmdir d5/d13/d9b/dde 0 2026-03-10T12:38:06.589 INFO:tasks.workunit.client.1.vm07.stdout:9/657: write d5/d13/d57/d4f/d6a/f8e [2429285,27579] 0 2026-03-10T12:38:06.593 INFO:tasks.workunit.client.1.vm07.stdout:4/700: chown d0/d4/d10/d9a/c84 3233 1 2026-03-10T12:38:06.593 INFO:tasks.workunit.client.0.vm00.stdout:6/505: chown d2/f30 6769 1 2026-03-10T12:38:06.594 INFO:tasks.workunit.client.1.vm07.stdout:3/616: getdents dc/dd/d43/d76/d95 0 2026-03-10T12:38:06.597 INFO:tasks.workunit.client.1.vm07.stdout:5/615: read d0/d22/d18/d19/d21/d54/faf [4098610,82386] 0 2026-03-10T12:38:06.600 INFO:tasks.workunit.client.0.vm00.stdout:4/775: dread df/f1e [0,4194304] 0 2026-03-10T12:38:06.604 INFO:tasks.workunit.client.0.vm00.stdout:6/506: creat d2/d16/d29/d31/d88/d92/fb6 x:0 0 0 2026-03-10T12:38:06.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.604+0000 7fc267fe4700 1 -- 192.168.123.100:0/4083469083 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc260072360 msgr2=0x7fc2600770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:06.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.604+0000 7fc267fe4700 1 --2- 192.168.123.100:0/4083469083 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc260072360 0x7fc2600770e0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fc258009230 tx=0x7fc258009260 comp rx=0 tx=0).stop 2026-03-10T12:38:06.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.604+0000 7fc267fe4700 1 -- 192.168.123.100:0/4083469083 shutdown_connections 2026-03-10T12:38:06.605 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.604+0000 7fc267fe4700 1 --2- 192.168.123.100:0/4083469083 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc260072360 0x7fc2600770e0 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:38:06.605 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.604+0000 7fc267fe4700 1 --2- 192.168.123.100:0/4083469083 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc260071980 0x7fc260071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:06.605 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.604+0000 7fc267fe4700 1 -- 192.168.123.100:0/4083469083 >> 192.168.123.100:0/4083469083 conn(0x7fc26006d1a0 msgr2=0x7fc26006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:06.605 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:06 vm00.local ceph-mon[50686]: pgmap v168: 65 pgs: 65 active+clean; 2.5 GiB data, 8.8 GiB used, 111 GiB / 120 GiB avail; 58 MiB/s rd, 137 MiB/s wr, 334 op/s 2026-03-10T12:38:06.605 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:06 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:38:06.605 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:06 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:38:06.605 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:06 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:38:06.605 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:06 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:38:06.605 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:06 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' 2026-03-10T12:38:06.605 INFO:tasks.workunit.client.1.vm07.stdout:1/572: unlink d9/df/d29/d2b/d31/d91/d59/fa2 0 2026-03-10T12:38:06.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.605+0000 
7fc267fe4700 1 -- 192.168.123.100:0/4083469083 shutdown_connections 2026-03-10T12:38:06.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.605+0000 7fc267fe4700 1 -- 192.168.123.100:0/4083469083 wait complete. 2026-03-10T12:38:06.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.605+0000 7fc267fe4700 1 Processor -- start 2026-03-10T12:38:06.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.605+0000 7fc267fe4700 1 -- start start 2026-03-10T12:38:06.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.605+0000 7fc267fe4700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc260071980 0x7fc260082550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:06.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.605+0000 7fc267fe4700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc260082a90 0x7fc260082f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:06.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.605+0000 7fc267fe4700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc26012dd80 con 0x7fc260082a90 2026-03-10T12:38:06.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.605+0000 7fc267fe4700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc26012def0 con 0x7fc260071980 2026-03-10T12:38:06.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.605+0000 7fc265d80700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc260071980 0x7fc260082550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:06.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.605+0000 7fc265d80700 1 --2- >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc260071980 0x7fc260082550 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:36240/0 (socket says 192.168.123.100:36240)
2026-03-10T12:38:06.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.605+0000 7fc265d80700 1 -- 192.168.123.100:0/1897598474 learned_addr learned my addr 192.168.123.100:0/1897598474 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:38:06.614 INFO:tasks.workunit.client.1.vm07.stdout:9/658: rmdir d5/d16/da3 39
2026-03-10T12:38:06.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.608+0000 7fc26557f700 1 --2- 192.168.123.100:0/1897598474 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc260082a90 0x7fc260082f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:38:06.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.608+0000 7fc265d80700 1 -- 192.168.123.100:0/1897598474 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc260082a90 msgr2=0x7fc260082f00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:38:06.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.608+0000 7fc265d80700 1 --2- 192.168.123.100:0/1897598474 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc260082a90 0x7fc260082f00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:38:06.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.608+0000 7fc265d80700 1 -- 192.168.123.100:0/1897598474 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc258008ee0 con 0x7fc260071980
2026-03-10T12:38:06.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.608+0000 7fc265d80700 1 --2- 192.168.123.100:0/1897598474 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc260071980 0x7fc260082550 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fc25c00bfd0 tx=0x7fc25c009d70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:38:06.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.608+0000 7fc256ffd700 1 -- 192.168.123.100:0/1897598474 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc25c010040 con 0x7fc260071980
2026-03-10T12:38:06.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.609+0000 7fc267fe4700 1 -- 192.168.123.100:0/1897598474 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc26012e170 con 0x7fc260071980
2026-03-10T12:38:06.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.609+0000 7fc267fe4700 1 -- 192.168.123.100:0/1897598474 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc26012e6c0 con 0x7fc260071980
2026-03-10T12:38:06.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.609+0000 7fc256ffd700 1 -- 192.168.123.100:0/1897598474 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc25c00ec20 con 0x7fc260071980
2026-03-10T12:38:06.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.609+0000 7fc256ffd700 1 -- 192.168.123.100:0/1897598474 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc25c014e40 con 0x7fc260071980
2026-03-10T12:38:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.610+0000 7fc256ffd700 1 -- 192.168.123.100:0/1897598474 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 20) v1 ==== 44873+0+0 (secure 0 0 0) 0x7fc25c01e4a0 con 0x7fc260071980
2026-03-10T12:38:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.610+0000 7fc256ffd700 1 -- 192.168.123.100:0/1897598474 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5448+0+0 (secure 0 0 0) 0x7fc25c04ca10 con 0x7fc260071980
2026-03-10T12:38:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.610+0000 7fc267fe4700 1 -- 192.168.123.100:0/1897598474 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc26004ea50 con 0x7fc260071980
2026-03-10T12:38:06.615 INFO:tasks.workunit.client.0.vm00.stdout:3/777: creat dd/d3d/d8a/f102 x:0 0 0
2026-03-10T12:38:06.615 INFO:tasks.workunit.client.1.vm07.stdout:3/617: sync
2026-03-10T12:38:06.620 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:06.620+0000 7fc256ffd700 1 -- 192.168.123.100:0/1897598474 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+177933 (secure 0 0 0) 0x7fc25c0128b0 con 0x7fc260071980
2026-03-10T12:38:06.620 INFO:tasks.workunit.client.1.vm07.stdout:7/546: dread d0/f3f [0,4194304] 0
2026-03-10T12:38:06.621 INFO:tasks.workunit.client.1.vm07.stdout:7/547: dread - d0/d57/d62/f75 zero size
2026-03-10T12:38:06.623 INFO:tasks.workunit.client.0.vm00.stdout:1/765: truncate da/d12/fc5 474829 0
2026-03-10T12:38:06.625 INFO:tasks.workunit.client.1.vm07.stdout:4/701: rename d0/d4/d10/d8d to d0/d4/d5/d78/dc5/df7 0
2026-03-10T12:38:06.626 INFO:tasks.workunit.client.1.vm07.stdout:4/702: stat d0/d4/d10/d3c/d2b/d54/de1/c7d 0
2026-03-10T12:38:06.626 INFO:tasks.workunit.client.1.vm07.stdout:4/703: chown d0/d4/d5/da/d95 99573308 1
2026-03-10T12:38:06.630 INFO:tasks.workunit.client.0.vm00.stdout:5/797: rename d1f/d26/d2b/d35/d53/d72/d9d to d1f/d6a/d118 0
2026-03-10T12:38:06.638 INFO:tasks.workunit.client.0.vm00.stdout:9/770: write d0/d3d/d59/fad [4728322,33768] 0
2026-03-10T12:38:06.639 INFO:tasks.workunit.client.0.vm00.stdout:9/771: chown d0/d3d/d59/d4e/dba/d1e/d85/fe7 1886 1
2026-03-10T12:38:06.640 INFO:tasks.workunit.client.0.vm00.stdout:9/772: dread - d0/d7f/db8/dc4/f111 zero size
2026-03-10T12:38:06.642 INFO:tasks.workunit.client.0.vm00.stdout:9/773: dread d0/d3d/d59/fad [0,4194304] 0
2026-03-10T12:38:06.644 INFO:tasks.workunit.client.1.vm07.stdout:9/659: dread d5/d16/d23/d26/f46 [0,4194304] 0
2026-03-10T12:38:06.646 INFO:tasks.workunit.client.1.vm07.stdout:6/564: dread d1/f38 [0,4194304] 0
2026-03-10T12:38:06.653 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:06 vm07.local ceph-mon[58582]: pgmap v168: 65 pgs: 65 active+clean; 2.5 GiB data, 8.8 GiB used, 111 GiB / 120 GiB avail; 58 MiB/s rd, 137 MiB/s wr, 334 op/s
2026-03-10T12:38:06.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:06 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:38:06.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:06 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:38:06.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:06 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:38:06.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:06 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:38:06.654 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:06 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq'
2026-03-10T12:38:06.701 INFO:tasks.workunit.client.0.vm00.stdout:1/766: write da/d21/d27/d6a/f9e [4072423,117930] 0
2026-03-10T12:38:06.711 INFO:tasks.workunit.client.0.vm00.stdout:2/746: dwrite d4/d6/f9c [0,4194304] 0
2026-03-10T12:38:06.716 INFO:tasks.workunit.client.1.vm07.stdout:5/616: dwrite d0/d22/d18/d19/d21/d54/dcb/db8/fca [0,4194304] 0
2026-03-10T12:38:06.719 INFO:tasks.workunit.client.1.vm07.stdout:5/617: dwrite d0/d22/d18/fb4 [0,4194304] 0
2026-03-10T12:38:06.721 INFO:tasks.workunit.client.1.vm07.stdout:5/618: chown d0/d22/d18/d3e/d53/faa 1 1
2026-03-10T12:38:06.732 INFO:tasks.workunit.client.1.vm07.stdout:2/482: link d0/d42/d26/d38/f3d d0/d29/d64/d6c/d94/fa7 0
2026-03-10T12:38:06.734 INFO:tasks.workunit.client.1.vm07.stdout:0/630: write d0/d14/d5f/d76/d2f/d31/d4f/f70 [572144,100664] 0
2026-03-10T12:38:06.740 INFO:tasks.workunit.client.0.vm00.stdout:0/641: write d3/d22/f55 [943000,36110] 0
2026-03-10T12:38:06.741 INFO:tasks.workunit.client.0.vm00.stdout:0/642: chown d3/d22/f2e 2367 1
2026-03-10T12:38:06.741 INFO:tasks.workunit.client.0.vm00.stdout:0/643: write d3/db/d24/d25/fb8 [413607,27394] 0
2026-03-10T12:38:06.742 INFO:tasks.workunit.client.0.vm00.stdout:7/534: rename da/d1b/ca3 to da/d3f/cc3 0
2026-03-10T12:38:06.743 INFO:tasks.workunit.client.0.vm00.stdout:0/644: chown d3/db/d77/d82/fce 29536067 1
2026-03-10T12:38:06.764 INFO:tasks.workunit.client.0.vm00.stdout:3/778: mknod dd/d18/d14/d2b/c103 0
2026-03-10T12:38:06.764 INFO:tasks.workunit.client.0.vm00.stdout:3/779: stat dd/d2a/da2/de1/d45 0
2026-03-10T12:38:06.766 INFO:tasks.workunit.client.0.vm00.stdout:5/798: mknod d1f/d26/d2e/d58/d6b/d113/c119 0
2026-03-10T12:38:06.772 INFO:tasks.workunit.client.1.vm07.stdout:7/548: mkdir d0/d61/db3 0
2026-03-10T12:38:06.774 INFO:tasks.workunit.client.1.vm07.stdout:7/549: dread d0/d61/d79/f95 [0,4194304] 0
2026-03-10T12:38:06.774 INFO:tasks.workunit.client.0.vm00.stdout:1/767: creat da/d21/db3/d59/da6/da4/dda/dc0/dc3/f100 x:0 0 0
2026-03-10T12:38:06.786 INFO:tasks.workunit.client.0.vm00.stdout:3/780: creat dd/d64/d92/f104 x:0 0 0
2026-03-10T12:38:06.790 INFO:tasks.workunit.client.0.vm00.stdout:4/776: rename df/d1f/d22/d26/ca3 to df/d32/d76/c100 0
2026-03-10T12:38:06.795 INFO:tasks.workunit.client.0.vm00.stdout:5/799: fdatasync d1f/d26/d2b/f52 0
2026-03-10T12:38:06.811 INFO:tasks.workunit.client.0.vm00.stdout:0/645: getdents d3/db/da4 0
2026-03-10T12:38:06.817 INFO:tasks.workunit.client.0.vm00.stdout:5/800: fsync d1f/d26/f28 0
2026-03-10T12:38:06.825 INFO:tasks.workunit.client.0.vm00.stdout:9/774: link d0/d3d/d59/d4e/dba/d19/f5c d0/f116 0
2026-03-10T12:38:06.825 INFO:tasks.workunit.client.0.vm00.stdout:8/627: getdents d0/d93/d17/db1 0
2026-03-10T12:38:06.825 INFO:tasks.workunit.client.0.vm00.stdout:2/747: getdents d4/dd 0
2026-03-10T12:38:06.825 INFO:tasks.workunit.client.0.vm00.stdout:0/646: creat d3/d7/d4c/d9d/fd1 x:0 0 0
2026-03-10T12:38:06.825 INFO:tasks.workunit.client.0.vm00.stdout:5/801: mkdir d1f/d39/d11a 0
2026-03-10T12:38:06.826 INFO:tasks.workunit.client.1.vm07.stdout:0/631: creat d0/d14/d5f/d76/d2f/d31/d79/d85/fcf x:0 0 0
2026-03-10T12:38:06.827 INFO:tasks.workunit.client.0.vm00.stdout:8/628: creat d0/d93/d17/d48/fc7 x:0 0 0
2026-03-10T12:38:06.829 INFO:tasks.workunit.client.0.vm00.stdout:5/802: creat d1f/d26/d2e/d58/d6b/d113/f11b x:0 0 0
2026-03-10T12:38:06.831 INFO:tasks.workunit.client.1.vm07.stdout:4/704: fdatasync d0/d4/d10/d3c/d2b/d2d/f99 0
2026-03-10T12:38:06.832 INFO:tasks.workunit.client.1.vm07.stdout:4/705: read - d0/d4/d5/d78/dc5/df7/fb0 zero size
2026-03-10T12:38:06.834 INFO:tasks.workunit.client.0.vm00.stdout:9/775: dread d0/d5/f3b [0,4194304] 0
2026-03-10T12:38:06.835 INFO:tasks.workunit.client.0.vm00.stdout:3/781: sync
2026-03-10T12:38:06.840 INFO:tasks.workunit.client.1.vm07.stdout:6/565: fsync d1/f3d 0
2026-03-10T12:38:06.840 INFO:tasks.workunit.client.1.vm07.stdout:6/566: readlink d1/d4/d6/d16/d1a/lad 0
2026-03-10T12:38:06.841 INFO:tasks.workunit.client.0.vm00.stdout:4/777: getdents df 0
2026-03-10T12:38:06.842 INFO:tasks.workunit.client.1.vm07.stdout:3/618: unlink dc/dd/d1f/d45/f56 0
2026-03-10T12:38:06.843 INFO:tasks.workunit.client.0.vm00.stdout:0/647: dread d3/d7/d4c/d5b/f2b [0,4194304] 0
2026-03-10T12:38:06.843 INFO:tasks.workunit.client.1.vm07.stdout:2/483: mkdir d0/d5b/d98/da8 0
2026-03-10T12:38:06.848 INFO:tasks.workunit.client.0.vm00.stdout:3/782: symlink dd/d2a/da2/de1/d101/l105 0
2026-03-10T12:38:06.850 INFO:tasks.workunit.client.0.vm00.stdout:3/783: chown dd/d18/d13/d99/da5/dd0/cd1 139838602 1
2026-03-10T12:38:06.851 INFO:tasks.workunit.client.1.vm07.stdout:4/706: truncate d0/d4/d5/d78/dc5/df7/f97 311693 0
2026-03-10T12:38:06.851 INFO:tasks.workunit.client.0.vm00.stdout:3/784: readlink dd/d18/d14/d2b/le7 0
2026-03-10T12:38:06.852 INFO:tasks.workunit.client.0.vm00.stdout:3/785: write dd/d2a/da2/de1/d45/f47 [4440002,52503] 0
2026-03-10T12:38:06.855 INFO:tasks.workunit.client.0.vm00.stdout:8/629: getdents d0/dd 0
2026-03-10T12:38:06.856 INFO:tasks.workunit.client.0.vm00.stdout:8/630: write d0/dd/d38/f3d [232334,69420] 0
2026-03-10T12:38:06.860 INFO:tasks.workunit.client.0.vm00.stdout:1/768: write da/d24/f45 [1776944,100513] 0
2026-03-10T12:38:06.861 INFO:tasks.workunit.client.0.vm00.stdout:2/748: write d4/dd/db9/d6d/faa [1015692,8043] 0
2026-03-10T12:38:06.865 INFO:tasks.workunit.client.0.vm00.stdout:9/776: dwrite d0/d3d/d59/d4e/f6f [0,4194304] 0
2026-03-10T12:38:06.875 INFO:tasks.workunit.client.0.vm00.stdout:1/769: mknod da/d21/d27/d6a/d94/c101 0
2026-03-10T12:38:06.878 INFO:tasks.workunit.client.0.vm00.stdout:5/803: link d1f/f27 d1f/d26/d2e/d58/d6b/d113/f11c 0
2026-03-10T12:38:06.881 INFO:tasks.workunit.client.0.vm00.stdout:5/804: creat d1f/d26/d2b/de4/f11d x:0 0 0
2026-03-10T12:38:06.884 INFO:tasks.workunit.client.0.vm00.stdout:5/805: dread d1f/d26/d2b/fce [0,4194304] 0
2026-03-10T12:38:06.884 INFO:tasks.workunit.client.0.vm00.stdout:5/806: fsync d1f/d6a/f84 0
2026-03-10T12:38:06.887 INFO:tasks.workunit.client.1.vm07.stdout:6/567: creat d1/d4/d6/d53/fb5 x:0 0 0
2026-03-10T12:38:06.890 INFO:tasks.workunit.client.0.vm00.stdout:9/777: creat d0/d3d/d59/d4e/dba/d1e/d85/f117 x:0 0 0
2026-03-10T12:38:06.893 INFO:tasks.workunit.client.0.vm00.stdout:9/778: dwrite d0/f17 [0,4194304] 0
2026-03-10T12:38:06.898 INFO:tasks.workunit.client.0.vm00.stdout:5/807: sync
2026-03-10T12:38:06.899 INFO:tasks.workunit.client.1.vm07.stdout:0/632: mkdir d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/dd0 0
2026-03-10T12:38:06.900 INFO:tasks.workunit.client.1.vm07.stdout:0/633: truncate d0/d14/d5f/d76/d2f/d31/d4f/f70 843197 0
2026-03-10T12:38:06.904 INFO:tasks.workunit.client.1.vm07.stdout:2/484: getdents d0/d42/d4e/d77 0
2026-03-10T12:38:06.908 INFO:tasks.workunit.client.0.vm00.stdout:5/808: creat d1f/d26/d2e/d58/d6b/deb/f11e x:0 0 0
2026-03-10T12:38:06.909 INFO:tasks.workunit.client.1.vm07.stdout:0/634: mknod d0/d14/d5f/d76/d2f/d31/d79/dcc/cd1 0
2026-03-10T12:38:06.910 INFO:tasks.workunit.client.1.vm07.stdout:2/485: creat d0/d42/d1f/d20/fa9 x:0 0 0
2026-03-10T12:38:06.910 INFO:tasks.workunit.client.1.vm07.stdout:0/635: chown d0/d14/d5f/d41/d86 17 1
2026-03-10T12:38:06.911 INFO:tasks.workunit.client.0.vm00.stdout:3/786: write dd/d18/d14/d2b/f31 [1790599,30693] 0
2026-03-10T12:38:06.920 INFO:tasks.workunit.client.0.vm00.stdout:8/631: dwrite d0/d93/d2d/f52 [0,4194304] 0
2026-03-10T12:38:06.921 INFO:tasks.workunit.client.1.vm07.stdout:0/636: mknod d0/d14/d5f/d41/cd2 0
2026-03-10T12:38:06.921 INFO:tasks.workunit.client.0.vm00.stdout:8/632: write d0/d5c/f4a [132570,130415] 0
2026-03-10T12:38:06.922 INFO:tasks.workunit.client.0.vm00.stdout:1/770: dread da/d21/d27/f6e [0,4194304] 0
2026-03-10T12:38:06.922 INFO:tasks.workunit.client.1.vm07.stdout:0/637: mknod d0/d14/d7c/cd3 0
2026-03-10T12:38:06.923 INFO:tasks.workunit.client.1.vm07.stdout:0/638: fdatasync d0/d14/d5f/d76/d2f/d31/d79/f7b 0
2026-03-10T12:38:06.925 INFO:tasks.workunit.client.0.vm00.stdout:2/749: dwrite d4/dd/db9/f7a [0,4194304] 0
2026-03-10T12:38:06.926 INFO:tasks.workunit.client.0.vm00.stdout:2/750: fsync f1 0
2026-03-10T12:38:06.934 INFO:tasks.workunit.client.0.vm00.stdout:9/779: write d0/d3d/d59/d4e/dba/d19/d50/fbd [143711,44451] 0
2026-03-10T12:38:06.936 INFO:tasks.workunit.client.0.vm00.stdout:5/809: fsync d1f/d26/d2b/d35/d53/fa7 0
2026-03-10T12:38:06.940 INFO:tasks.workunit.client.0.vm00.stdout:5/810: dwrite d1f/d26/f9f [0,4194304] 0
2026-03-10T12:38:06.951 INFO:tasks.workunit.client.1.vm07.stdout:1/573: creat d9/d2d/d80/fc0 x:0 0 0
2026-03-10T12:38:06.951 INFO:tasks.workunit.client.0.vm00.stdout:8/633: mkdir d0/d93/d2d/dc8 0
2026-03-10T12:38:06.951 INFO:tasks.workunit.client.0.vm00.stdout:8/634: stat d0/d58/d68 0
2026-03-10T12:38:06.951 INFO:tasks.workunit.client.0.vm00.stdout:2/751: mknod d4/dd/db9/cf6 0
2026-03-10T12:38:06.951 INFO:tasks.workunit.client.0.vm00.stdout:2/752: stat d4/d6/l24 0
2026-03-10T12:38:06.954 INFO:tasks.workunit.client.0.vm00.stdout:5/811: mkdir d1f/d26/d2e/d58/d6b/d113/d11f 0
2026-03-10T12:38:06.971 INFO:tasks.workunit.client.0.vm00.stdout:5/812: dwrite d1f/d26/d2e/d58/ff6 [0,4194304] 0
2026-03-10T12:38:06.978 INFO:tasks.workunit.client.0.vm00.stdout:9/780: link d0/d3d/d59/d4e/dba/d1e/d27/d115/fdd d0/d3d/d59/d4e/dba/d1e/d85/f118 0
2026-03-10T12:38:06.978 INFO:tasks.workunit.client.0.vm00.stdout:9/781: fsync d0/d3d/d43/f68 0
2026-03-10T12:38:06.980 INFO:tasks.workunit.client.0.vm00.stdout:1/771: getdents da/d21/db3/d59/da6/da4/dda/dc0 0
2026-03-10T12:38:06.982 INFO:tasks.workunit.client.0.vm00.stdout:2/753: link d4/d53/d76/cc0 d4/d53/d68/dc2/cf7 0
2026-03-10T12:38:06.983 INFO:tasks.workunit.client.0.vm00.stdout:9/782: creat d0/d3d/d43/f119 x:0 0 0
2026-03-10T12:38:06.987 INFO:tasks.workunit.client.0.vm00.stdout:1/772: mknod da/d24/d28/d67/da2/d78/c102 0
2026-03-10T12:38:06.987 INFO:tasks.workunit.client.0.vm00.stdout:9/783: mknod d0/d3d/d59/d4e/c11a 0
2026-03-10T12:38:06.989 INFO:tasks.workunit.client.1.vm07.stdout:1/574: sync
2026-03-10T12:38:06.989 INFO:tasks.workunit.client.0.vm00.stdout:1/773: creat da/d21/d27/d6a/d94/db9/f103 x:0 0 0
2026-03-10T12:38:06.992 INFO:tasks.workunit.client.1.vm07.stdout:1/575: truncate d9/d2d/d80/d8e/fa0 637387 0
2026-03-10T12:38:06.993 INFO:tasks.workunit.client.0.vm00.stdout:9/784: creat d0/d7f/db8/f11b x:0 0 0
2026-03-10T12:38:06.993 INFO:tasks.workunit.client.0.vm00.stdout:2/754: dread d4/d6/d2d/d31/f46 [0,4194304] 0
2026-03-10T12:38:06.995 INFO:tasks.workunit.client.0.vm00.stdout:2/755: fsync d4/d6/d93/dc6/fd1 0
2026-03-10T12:38:06.996 INFO:tasks.workunit.client.0.vm00.stdout:2/756: dread - d4/dd/fe6 zero size
2026-03-10T12:38:06.998 INFO:tasks.workunit.client.0.vm00.stdout:2/757: truncate d4/d6/d2d/fa9 1003511 0
2026-03-10T12:38:07.014 INFO:tasks.workunit.client.0.vm00.stdout:9/785: link d0/d3d/d59/d4e/dba/d19/f20 d0/d3d/d43/d114/f11c 0
2026-03-10T12:38:07.014 INFO:tasks.workunit.client.0.vm00.stdout:2/758: symlink d4/lf8 0
2026-03-10T12:38:07.014 INFO:tasks.workunit.client.0.vm00.stdout:9/786: truncate d0/fc9 577547 0
2026-03-10T12:38:07.014 INFO:tasks.workunit.client.0.vm00.stdout:9/787: stat d0/d3d/d59 0
2026-03-10T12:38:07.014 INFO:tasks.workunit.client.0.vm00.stdout:2/759: mkdir d4/d78/df9 0
2026-03-10T12:38:07.014 INFO:tasks.workunit.client.0.vm00.stdout:2/760: chown d4/d6/d2d/dc3 1867469946 1
2026-03-10T12:38:07.014 INFO:tasks.workunit.client.0.vm00.stdout:2/761: chown d4/f67 29363 1
2026-03-10T12:38:07.014 INFO:tasks.workunit.client.0.vm00.stdout:2/762: write d4/dd/fe6 [233430,69299] 0
2026-03-10T12:38:07.014 INFO:tasks.workunit.client.0.vm00.stdout:2/763: write d4/dd/f3c [2060785,128826] 0
2026-03-10T12:38:07.014 INFO:tasks.workunit.client.0.vm00.stdout:2/764: stat d4/dd/db9 0
2026-03-10T12:38:07.016 INFO:tasks.workunit.client.0.vm00.stdout:1/774: sync
2026-03-10T12:38:07.017 INFO:tasks.workunit.client.0.vm00.stdout:9/788: creat d0/d3d/d59/d4e/dba/d1e/d85/f11d x:0 0 0
2026-03-10T12:38:07.019 INFO:tasks.workunit.client.0.vm00.stdout:2/765: mkdir d4/d6/d2d/d3a/dfa 0
2026-03-10T12:38:07.020 INFO:tasks.workunit.client.0.vm00.stdout:2/766: readlink d4/dd/db9/d6d/l48 0
2026-03-10T12:38:07.023 INFO:tasks.workunit.client.0.vm00.stdout:1/775: symlink da/d21/db3/d59/da6/da4/dda/dc0/dc3/l104 0
2026-03-10T12:38:07.028 INFO:tasks.workunit.client.0.vm00.stdout:9/789: mknod d0/d7f/db8/dc4/c11e 0
2026-03-10T12:38:07.029 INFO:tasks.workunit.client.0.vm00.stdout:2/767: write d4/d6/d93/fbf [1719274,101954] 0
2026-03-10T12:38:07.031 INFO:tasks.workunit.client.0.vm00.stdout:1/776: rmdir da/d21/db3/d5d 39
2026-03-10T12:38:07.040 INFO:tasks.workunit.client.0.vm00.stdout:8/635: truncate d0/f9d 2890121 0
2026-03-10T12:38:07.043 INFO:tasks.workunit.client.0.vm00.stdout:8/636: dwrite d0/dd/f9a [0,4194304] 0
2026-03-10T12:38:07.047 INFO:tasks.workunit.client.0.vm00.stdout:5/813: dwrite f11 [0,4194304] 0
2026-03-10T12:38:07.049 INFO:tasks.workunit.client.0.vm00.stdout:5/814: chown d1f/d6a/d94/dc9/f114 237472198 1
2026-03-10T12:38:07.052 INFO:tasks.workunit.client.0.vm00.stdout:2/768: truncate d4/d6/d93/dc6/fd1 544570 0
2026-03-10T12:38:07.055 INFO:tasks.workunit.client.0.vm00.stdout:2/769: dread d4/dd/db9/f4c [0,4194304] 0
2026-03-10T12:38:07.055 INFO:tasks.workunit.client.0.vm00.stdout:2/770: readlink d4/d6/d2d/lb5 0
2026-03-10T12:38:07.065 INFO:tasks.workunit.client.0.vm00.stdout:9/790: getdents d0/d7f/db8/dc4/db0 0
2026-03-10T12:38:07.066 INFO:tasks.workunit.client.0.vm00.stdout:9/791: write d0/d7f/d88/f10e [768536,89696] 0
2026-03-10T12:38:07.068 INFO:tasks.workunit.client.0.vm00.stdout:7/535: rename da/fb to da/d26/d37/fc4 0
2026-03-10T12:38:07.080 INFO:tasks.workunit.client.0.vm00.stdout:7/536: unlink da/d26/laa 0
2026-03-10T12:38:07.080 INFO:tasks.workunit.client.0.vm00.stdout:7/537: fsync da/d3f/d60/fb1 0
2026-03-10T12:38:07.080 INFO:tasks.workunit.client.0.vm00.stdout:7/538: symlink da/d41/d48/lc5 0
2026-03-10T12:38:07.081 INFO:tasks.workunit.client.0.vm00.stdout:7/539: symlink da/d3f/lc6 0
2026-03-10T12:38:07.081 INFO:tasks.workunit.client.0.vm00.stdout:7/540: truncate da/d25/d2e/d4c/f6e 2725576 0
2026-03-10T12:38:07.083 INFO:tasks.workunit.client.0.vm00.stdout:9/792: mkdir d0/d3d/d59/d4e/dba/d1e/d85/d11f 0
2026-03-10T12:38:07.084 INFO:tasks.workunit.client.0.vm00.stdout:9/793: readlink d0/d3d/d59/d4e/dba/d1e/d27/lc8 0
2026-03-10T12:38:07.085 INFO:tasks.workunit.client.0.vm00.stdout:2/771: truncate f1 2309522 0
2026-03-10T12:38:07.088 INFO:tasks.workunit.client.0.vm00.stdout:4/778: rename df/d57 to df/d1f/d22/d26/d65/d91/d101 0
2026-03-10T12:38:07.095 INFO:tasks.workunit.client.0.vm00.stdout:3/787: rename dd/d64/d92 to dd/d18/d13/d1d/dc6/d106 0
2026-03-10T12:38:07.104 INFO:tasks.workunit.client.0.vm00.stdout:7/541: dread f9 [0,4194304] 0
2026-03-10T12:38:07.105 INFO:tasks.workunit.client.0.vm00.stdout:9/794: dread d0/d3d/d59/d4e/dba/d1e/d2b/f5f [0,4194304] 0
2026-03-10T12:38:07.106 INFO:tasks.workunit.client.0.vm00.stdout:7/542: mkdir da/d26/d37/dc7 0
2026-03-10T12:38:07.106 INFO:tasks.workunit.client.0.vm00.stdout:7/543: stat da/d26/d37/f96 0
2026-03-10T12:38:07.113 INFO:tasks.workunit.client.0.vm00.stdout:2/772: write d4/d6/f2b [3927346,67030] 0
2026-03-10T12:38:07.115 INFO:tasks.workunit.client.0.vm00.stdout:2/773: write d4/d53/d68/f69 [1052958,61537] 0
2026-03-10T12:38:07.116 INFO:tasks.workunit.client.0.vm00.stdout:2/774: read - d4/d53/d68/dc2/fec zero size
2026-03-10T12:38:07.116 INFO:tasks.workunit.client.0.vm00.stdout:8/637: dwrite d0/d93/d17/f67 [0,4194304] 0
2026-03-10T12:38:07.127 INFO:tasks.workunit.client.1.vm07.stdout:9/660: write d5/fcd [360953,127353] 0
2026-03-10T12:38:07.128 INFO:tasks.workunit.client.1.vm07.stdout:7/550: dwrite d0/d57/d62/f8b [0,4194304] 0
2026-03-10T12:38:07.139 INFO:tasks.workunit.client.1.vm07.stdout:9/661: getdents d5/d13/d22 0
2026-03-10T12:38:07.139 INFO:tasks.workunit.client.1.vm07.stdout:9/662: stat d5/d13/d6c/l87 0
2026-03-10T12:38:07.140 INFO:tasks.workunit.client.1.vm07.stdout:9/663: write d5/d1f/d31/f43 [671441,61187] 0
2026-03-10T12:38:07.140 INFO:tasks.workunit.client.1.vm07.stdout:9/664: readlink d5/d1f/d31/d64/lbb 0
2026-03-10T12:38:07.143 INFO:tasks.workunit.client.1.vm07.stdout:9/665: symlink d5/d1f/d31/d74/le1 0
2026-03-10T12:38:07.145 INFO:tasks.workunit.client.0.vm00.stdout:0/648: dwrite d3/d7/d3c/f30 [4194304,4194304] 0
2026-03-10T12:38:07.145 INFO:tasks.workunit.client.1.vm07.stdout:9/666: stat d5/d13/d2c/f44 0
2026-03-10T12:38:07.146 INFO:tasks.workunit.client.0.vm00.stdout:0/649: chown d3/d40/d65 31586 1
2026-03-10T12:38:07.155 INFO:tasks.workunit.client.1.vm07.stdout:9/667: fsync d5/d1f/f9f 0
2026-03-10T12:38:07.158 INFO:tasks.workunit.client.0.vm00.stdout:7/544: dread da/d25/d2c/f30 [0,4194304] 0
2026-03-10T12:38:07.158 INFO:tasks.workunit.client.0.vm00.stdout:8/638: symlink d0/d5c/lc9 0
2026-03-10T12:38:07.158 INFO:tasks.workunit.client.1.vm07.stdout:9/668: symlink d5/d1f/le2 0
2026-03-10T12:38:07.159 INFO:tasks.workunit.client.1.vm07.stdout:9/669: chown d5/d13/d57 184966837 1
2026-03-10T12:38:07.162 INFO:tasks.workunit.client.0.vm00.stdout:9/795: fsync d0/fc9 0
2026-03-10T12:38:07.163 INFO:tasks.workunit.client.0.vm00.stdout:7/545: stat da/f35 0
2026-03-10T12:38:07.164 INFO:tasks.workunit.client.1.vm07.stdout:9/670: rmdir d5/d13/d6c/d89 39
2026-03-10T12:38:07.165 INFO:tasks.workunit.client.0.vm00.stdout:0/650: mknod d3/cd2 0
2026-03-10T12:38:07.170 INFO:tasks.workunit.client.0.vm00.stdout:2/775: mkdir d4/d6/d2d/de5/dfb 0
2026-03-10T12:38:07.171 INFO:tasks.workunit.client.0.vm00.stdout:0/651: mknod d3/d40/d65/cd3 0
2026-03-10T12:38:07.175 INFO:tasks.workunit.client.0.vm00.stdout:0/652: creat d3/d7/d58/fd4 x:0 0 0
2026-03-10T12:38:07.176 INFO:tasks.workunit.client.0.vm00.stdout:2/776: creat d4/dd/da7/ffc x:0 0 0
2026-03-10T12:38:07.178 INFO:tasks.workunit.client.0.vm00.stdout:9/796: mknod d0/c120 0
2026-03-10T12:38:07.179 INFO:tasks.workunit.client.0.vm00.stdout:0/653: mkdir d3/d7/db0/dc4/dd5 0
2026-03-10T12:38:07.181 INFO:tasks.workunit.client.0.vm00.stdout:2/777: symlink d4/d53/d76/dba/deb/lfd 0
2026-03-10T12:38:07.182 INFO:tasks.workunit.client.0.vm00.stdout:2/778: write d4/dd/da7/fd2 [281915,66880] 0
2026-03-10T12:38:07.182 INFO:tasks.workunit.client.0.vm00.stdout:4/779: link df/d1f/d22/d26/d65/d91/d101/db7/le2 df/d1f/d36/d3a/l102 0
2026-03-10T12:38:07.183 INFO:tasks.workunit.client.0.vm00.stdout:2/779: write d4/d6/d2d/d3a/d43/d85/f8f [1468101,13178] 0
2026-03-10T12:38:07.194 INFO:tasks.workunit.client.0.vm00.stdout:9/797: link d0/d3d/d59/d4e/f7c d0/d3d/d59/d4e/dba/d1e/f121 0
2026-03-10T12:38:07.194 INFO:tasks.workunit.client.0.vm00.stdout:2/780: dread d4/d6/d93/dc6/fd1 [0,4194304] 0
2026-03-10T12:38:07.194 INFO:tasks.workunit.client.0.vm00.stdout:2/781: fdatasync d4/d6/d93/fdb 0
2026-03-10T12:38:07.194 INFO:tasks.workunit.client.0.vm00.stdout:9/798: dwrite d0/d3d/d59/d4e/dba/d19/fb1 [0,4194304] 0
2026-03-10T12:38:07.194 INFO:tasks.workunit.client.0.vm00.stdout:9/799: readlink d0/d3d/d59/d4e/l76 0
2026-03-10T12:38:07.195 INFO:tasks.workunit.client.0.vm00.stdout:2/782: fsync d4/d53/d9e/f60 0
2026-03-10T12:38:07.202 INFO:tasks.workunit.client.0.vm00.stdout:8/639: sync
2026-03-10T12:38:07.203 INFO:tasks.workunit.client.1.vm07.stdout:9/671: sync
2026-03-10T12:38:07.203 INFO:tasks.workunit.client.1.vm07.stdout:9/672: chown d5/d1f/d31/d74 783186484 1
2026-03-10T12:38:07.204 INFO:tasks.workunit.client.0.vm00.stdout:9/800: creat d0/d7f/db8/dc4/de6/f122 x:0 0 0
2026-03-10T12:38:07.205 INFO:tasks.workunit.client.0.vm00.stdout:9/801: chown d0/d7f/db8/dc4/c10f 1488608 1
2026-03-10T12:38:07.205 INFO:tasks.workunit.client.0.vm00.stdout:9/802: chown d0/d7f/db8/dc4/d106 802 1
2026-03-10T12:38:07.207 INFO:tasks.workunit.client.0.vm00.stdout:8/640: symlink d0/d58/lca 0
2026-03-10T12:38:07.208 INFO:tasks.workunit.client.0.vm00.stdout:8/641: getdents d0/d93/d17/db1 0
2026-03-10T12:38:07.209 INFO:tasks.workunit.client.0.vm00.stdout:8/642: write d0/d5c/f4a [5136342,10815] 0
2026-03-10T12:38:07.209 INFO:tasks.workunit.client.0.vm00.stdout:8/643: write d0/dd/d38/d81/f88 [75259,9184] 0
2026-03-10T12:38:07.210 INFO:tasks.workunit.client.0.vm00.stdout:8/644: readlink d0/d93/d60/lb5 0
2026-03-10T12:38:07.213 INFO:tasks.workunit.client.0.vm00.stdout:8/645: rmdir d0/d5c 39
2026-03-10T12:38:07.214 INFO:tasks.workunit.client.0.vm00.stdout:8/646: stat d0/d93/d2d/d49/c77 0
2026-03-10T12:38:07.214 INFO:tasks.workunit.client.0.vm00.stdout:8/647: chown d0/d93/d17/d48/lb3 27166051 1
2026-03-10T12:38:07.222 INFO:tasks.workunit.client.0.vm00.stdout:8/648: mknod d0/d58/ccb 0
2026-03-10T12:38:07.222 INFO:tasks.workunit.client.0.vm00.stdout:8/649: chown d0/d93/d36 38 1
2026-03-10T12:38:07.222 INFO:tasks.workunit.client.0.vm00.stdout:8/650: creat d0/d93/fcc x:0 0 0
2026-03-10T12:38:07.222 INFO:tasks.workunit.client.0.vm00.stdout:8/651: chown d0/dd 140 1
2026-03-10T12:38:07.222 INFO:tasks.workunit.client.0.vm00.stdout:8/652: rmdir d0/d93/d2d 39
2026-03-10T12:38:07.222 INFO:tasks.workunit.client.0.vm00.stdout:8/653: write d0/dd/fbc [1001036,124144] 0
2026-03-10T12:38:07.222 INFO:tasks.workunit.client.0.vm00.stdout:8/654: truncate d0/d93/fcc 545223 0
2026-03-10T12:38:07.232 INFO:tasks.workunit.client.0.vm00.stdout:8/655: mknod d0/d93/d2d/d49/ccd 0
2026-03-10T12:38:07.233 INFO:tasks.workunit.client.0.vm00.stdout:8/656: readlink d0/d93/d17/l18 0
2026-03-10T12:38:07.245 INFO:tasks.workunit.client.0.vm00.stdout:9/803: dread d0/d3d/d59/d4e/dba/f49 [0,4194304] 0
2026-03-10T12:38:07.246 INFO:tasks.workunit.client.0.vm00.stdout:9/804: mknod d0/d7f/db8/dc4/de6/c123 0
2026-03-10T12:38:07.250 INFO:tasks.workunit.client.0.vm00.stdout:8/657: sync
2026-03-10T12:38:07.254 INFO:tasks.workunit.client.0.vm00.stdout:4/780: write df/d1f/d22/d26/d65/d91/fad [840101,37739] 0
2026-03-10T12:38:07.255 INFO:tasks.workunit.client.0.vm00.stdout:4/781: truncate df/d1f/fd3 258843 0
2026-03-10T12:38:07.259 INFO:tasks.workunit.client.1.vm07.stdout:5/619: rename d0/fd to d0/d22/d18/d3e/d5d/dcf/fdd 0
2026-03-10T12:38:07.260 INFO:tasks.workunit.client.0.vm00.stdout:4/782: chown df/d1f/d22/d26/d70/cbe 458 1
2026-03-10T12:38:07.260 INFO:tasks.workunit.client.0.vm00.stdout:4/783: chown df/d1f/d36/d3a/d41/f2f 1026 1
2026-03-10T12:38:07.260 INFO:tasks.workunit.client.0.vm00.stdout:4/784: chown le 1220462 1
2026-03-10T12:38:07.261 INFO:tasks.workunit.client.0.vm00.stdout:8/658: chown d0/d5c/l59 0 1
2026-03-10T12:38:07.263 INFO:tasks.workunit.client.1.vm07.stdout:5/620: creat d0/d22/d18/d19/d21/d3a/fde x:0 0 0
2026-03-10T12:38:07.264 INFO:tasks.workunit.client.1.vm07.stdout:5/621: dread - d0/d22/d18/f86 zero size
2026-03-10T12:38:07.264 INFO:tasks.workunit.client.0.vm00.stdout:8/659: dwrite d0/d93/d17/d48/f87 [0,4194304] 0
2026-03-10T12:38:07.264 INFO:tasks.workunit.client.1.vm07.stdout:3/619: write dc/dd/d28/d7a/fba [334402,40088] 0
2026-03-10T12:38:07.271 INFO:tasks.workunit.client.1.vm07.stdout:4/707: dwrite d0/d4/df2/df6/f87 [0,4194304] 0
2026-03-10T12:38:07.272 INFO:tasks.workunit.client.0.vm00.stdout:4/785: fdatasync df/d1f/d22/f52 0
2026-03-10T12:38:07.278 INFO:tasks.workunit.client.0.vm00.stdout:8/660: rmdir d0/d93/d60 39
2026-03-10T12:38:07.279 INFO:tasks.workunit.client.1.vm07.stdout:2/486: dwrite d0/d42/d4e/d77/f89 [0,4194304] 0
2026-03-10T12:38:07.280 INFO:tasks.workunit.client.1.vm07.stdout:3/620: dread dc/dd/f19 [0,4194304] 0
2026-03-10T12:38:07.282 INFO:tasks.workunit.client.1.vm07.stdout:0/639: truncate d0/d14/d5f/d76/d2f/d31/d79/f7b 3224398 0
2026-03-10T12:38:07.282 INFO:tasks.workunit.client.0.vm00.stdout:8/661: dwrite d0/d93/d17/d48/f87 [0,4194304] 0
2026-03-10T12:38:07.285 INFO:tasks.workunit.client.1.vm07.stdout:5/622: dread d0/d22/f16 [0,4194304] 0
2026-03-10T12:38:07.288 INFO:tasks.workunit.client.0.vm00.stdout:8/662: sync
2026-03-10T12:38:07.292 INFO:tasks.workunit.client.0.vm00.stdout:8/663: mknod d0/dd/d38/d81/cce 0
2026-03-10T12:38:07.292 INFO:tasks.workunit.client.0.vm00.stdout:8/664: read d0/d93/d36/f41 [1984508,5896] 0
2026-03-10T12:38:07.293 INFO:tasks.workunit.client.1.vm07.stdout:6/568: rename d1/d4/d71/cae to d1/d4/d6/cb6 0
2026-03-10T12:38:07.294 INFO:tasks.workunit.client.1.vm07.stdout:6/569: chown d1/d4/d6/d43/d65/f9c 8564624 1
2026-03-10T12:38:07.295 INFO:tasks.workunit.client.0.vm00.stdout:4/786: creat df/f103 x:0 0 0
2026-03-10T12:38:07.298 INFO:tasks.workunit.client.0.vm00.stdout:8/665: truncate d0/d93/d36/f39 2501152 0
2026-03-10T12:38:07.298 INFO:tasks.workunit.client.1.vm07.stdout:3/621: creat dc/dd/db5/fd3 x:0 0 0
2026-03-10T12:38:07.303 INFO:tasks.workunit.client.0.vm00.stdout:0/654: unlink d3/db/d24/c6a 0
2026-03-10T12:38:07.307 INFO:tasks.workunit.client.0.vm00.stdout:8/666: mknod d0/d93/d2d/d49/ccf 0
2026-03-10T12:38:07.307 INFO:tasks.workunit.client.1.vm07.stdout:7/551: rename d0/d47/d48 to d0/d61/db4 0
2026-03-10T12:38:07.308 INFO:tasks.workunit.client.1.vm07.stdout:6/570: mkdir d1/d4/d6/d16/d49/db7 0
2026-03-10T12:38:07.310 INFO:tasks.workunit.client.0.vm00.stdout:8/667: symlink d0/d93/d17/ld0 0
2026-03-10T12:38:07.311 INFO:tasks.workunit.client.0.vm00.stdout:8/668: chown d0/d93/d36/d5b/l99 29314 1
2026-03-10T12:38:07.313 INFO:tasks.workunit.client.0.vm00.stdout:8/669: write d0/d93/f8f [160589,39755] 0
2026-03-10T12:38:07.314 INFO:tasks.workunit.client.1.vm07.stdout:3/622: truncate dc/d18/f79 266715 0
2026-03-10T12:38:07.314 INFO:tasks.workunit.client.0.vm00.stdout:8/670: dread d0/d93/d36/d7d/fb0 [0,4194304] 0
2026-03-10T12:38:07.319 INFO:tasks.workunit.client.1.vm07.stdout:8/584: creat d1/d3/fbb x:0 0 0
2026-03-10T12:38:07.321 INFO:tasks.workunit.client.0.vm00.stdout:4/787: getdents df/d1f/d36/dc6 0
2026-03-10T12:38:07.322 INFO:tasks.workunit.client.1.vm07.stdout:6/571: dread - d1/d4/d71/f79 zero size
2026-03-10T12:38:07.322 INFO:tasks.workunit.client.1.vm07.stdout:1/576: dwrite d9/df/d29/f82 [0,4194304] 0
2026-03-10T12:38:07.325 INFO:tasks.workunit.client.0.vm00.stdout:4/788: dread df/d1f/d22/d26/dab/fd7 [0,4194304] 0
2026-03-10T12:38:07.329 INFO:tasks.workunit.client.1.vm07.stdout:2/487: link d0/d42/f22 d0/d42/d26/d7d/faa 0
2026-03-10T12:38:07.330 INFO:tasks.workunit.client.1.vm07.stdout:2/488: readlink d0/d29/d64/la3 0
2026-03-10T12:38:07.331 INFO:tasks.workunit.client.1.vm07.stdout:8/585: symlink d1/d3/db2/lbc 0
2026-03-10T12:38:07.333 INFO:tasks.workunit.client.1.vm07.stdout:1/577: dread d9/df/d29/d2b/d3d/f4c [0,4194304] 0
2026-03-10T12:38:07.334 INFO:tasks.workunit.client.0.vm00.stdout:4/789: truncate df/f1e 1852115 0
2026-03-10T12:38:07.339 INFO:tasks.workunit.client.1.vm07.stdout:8/586: truncate d1/f4b 2085696 0
2026-03-10T12:38:07.339 INFO:tasks.workunit.client.1.vm07.stdout:8/587: chown d1/d3/d6/d54/f7d 1464471 1
2026-03-10T12:38:07.346 INFO:tasks.workunit.client.0.vm00.stdout:4/790: truncate df/d63/d77/f8d 5916466 0
2026-03-10T12:38:07.362 INFO:tasks.workunit.client.0.vm00.stdout:9/805: dread d0/d3d/d59/d4e/dba/d1e/f121 [0,4194304] 0
2026-03-10T12:38:07.369 INFO:tasks.workunit.client.0.vm00.stdout:9/806: dwrite d0/d7f/db8/dc4/f4f [0,4194304] 0
2026-03-10T12:38:07.371 INFO:tasks.workunit.client.0.vm00.stdout:4/791: dread df/d1f/d22/d26/dab/fc5 [0,4194304] 0
2026-03-10T12:38:07.384 INFO:tasks.workunit.client.0.vm00.stdout:4/792: dread df/d1f/fd3 [0,4194304] 0
2026-03-10T12:38:07.385 INFO:tasks.workunit.client.0.vm00.stdout:4/793: readlink df/d1f/d22/d26/d65/lf6 0
2026-03-10T12:38:07.386 INFO:tasks.workunit.client.0.vm00.stdout:4/794: fsync df/d63/ddb/ff8 0
2026-03-10T12:38:07.386 INFO:tasks.workunit.client.1.vm07.stdout:3/623: dread dc/dd/d28/d3b/f4c [0,4194304] 0
2026-03-10T12:38:07.387 INFO:tasks.workunit.client.0.vm00.stdout:4/795: mknod df/d63/c104 0
2026-03-10T12:38:07.388 INFO:tasks.workunit.client.0.vm00.stdout:4/796: mknod df/d1f/d22/dcb/c105 0
2026-03-10T12:38:07.389 INFO:tasks.workunit.client.0.vm00.stdout:4/797: mknod df/d93/d9e/c106 0
2026-03-10T12:38:07.391 INFO:tasks.workunit.client.0.vm00.stdout:4/798: rename df/d1f/d22/d26/d65/da7/lf3 to df/d1f/d22/d26/d65/da7/l107 0
2026-03-10T12:38:07.428 INFO:tasks.workunit.client.1.vm07.stdout:4/708: rename d0/d4/d5/l20 to d0/lf8 0
2026-03-10T12:38:07.434 INFO:tasks.workunit.client.1.vm07.stdout:7/552: rename d0/d61/db3 to d0/d61/d79/db5 0
2026-03-10T12:38:07.435 INFO:tasks.workunit.client.1.vm07.stdout:3/624: creat dc/d18/fd4 x:0 0 0
2026-03-10T12:38:07.440 INFO:tasks.workunit.client.1.vm07.stdout:7/553: dwrite d0/d61/db4/f4b [4194304,4194304] 0
2026-03-10T12:38:07.450 INFO:tasks.workunit.client.1.vm07.stdout:2/489: read d0/d42/d4e/d77/d70/f96 [1586388,67231] 0
2026-03-10T12:38:07.450 INFO:tasks.workunit.client.1.vm07.stdout:1/578: fdatasync d9/df/d29/f82 0
2026-03-10T12:38:07.452 INFO:tasks.workunit.client.0.vm00.stdout:7/546: write f0 [9675432,109138] 0
2026-03-10T12:38:07.452 INFO:tasks.workunit.client.1.vm07.stdout:7/554: mknod d0/d67/d6f/d80/cb6 0
2026-03-10T12:38:07.454 INFO:tasks.workunit.client.1.vm07.stdout:2/490: mkdir d0/d42/d4e/dab 0
2026-03-10T12:38:07.457 INFO:tasks.workunit.client.1.vm07.stdout:1/579: mknod d9/df/d29/d2b/d92/cc1 0
2026-03-10T12:38:07.460 INFO:tasks.workunit.client.0.vm00.stdout:7/547: mkdir da/d41/d7b/d9d/dc8 0
2026-03-10T12:38:07.460 INFO:tasks.workunit.client.1.vm07.stdout:9/673: write d5/d13/d6c/da4/fd0 [503663,80883] 0
2026-03-10T12:38:07.464 INFO:tasks.workunit.client.0.vm00.stdout:7/548: dread da/d25/d2c/f30 [0,4194304] 0
2026-03-10T12:38:07.467 INFO:tasks.workunit.client.1.vm07.stdout:7/555: fdatasync d0/d61/db4/fad 0
2026-03-10T12:38:07.468 INFO:tasks.workunit.client.0.vm00.stdout:4/799: dread df/d1f/d22/d26/d65/fba [0,4194304] 0
2026-03-10T12:38:07.469 INFO:tasks.workunit.client.1.vm07.stdout:1/580: fsync d9/fe 0
2026-03-10T12:38:07.473 INFO:tasks.workunit.client.0.vm00.stdout:8/671: dwrite d0/d46/d6e/f7b [0,4194304] 0
2026-03-10T12:38:07.474 INFO:tasks.workunit.client.1.vm07.stdout:9/674: creat d5/d69/d93/d97/fe3 x:0 0 0
2026-03-10T12:38:07.477 INFO:tasks.workunit.client.1.vm07.stdout:7/556: unlink d0/d57/d62/f6c 0
2026-03-10T12:38:07.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:07 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:38:07.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:07 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr fail", "who": "vm00.nescmq"}]: dispatch
2026-03-10T12:38:07.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:07 vm00.local ceph-mon[50686]: osdmap e40: 6 total, 6 up, 6 in
2026-03-10T12:38:07.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:07 vm00.local ceph-mon[50686]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "mgr fail", "who": "vm00.nescmq"}]': finished
2026-03-10T12:38:07.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:07 vm00.local ceph-mon[50686]: mgrmap e20: vm07.kfawlb(active, starting, since 0.0440459s)
2026-03-10T12:38:07.485 INFO:tasks.workunit.client.1.vm07.stdout:2/491: rename d0/d42/d26/d7d/c83 to d0/d42/d4e/d77/d70/cac 0
2026-03-10T12:38:07.487 INFO:tasks.workunit.client.1.vm07.stdout:1/581: dread d9/f1f [0,4194304] 0
2026-03-10T12:38:07.495 INFO:tasks.workunit.client.1.vm07.stdout:0/640: write d0/d14/d5f/d76/f3d [3854891,7781] 0
2026-03-10T12:38:07.497 INFO:tasks.workunit.client.0.vm00.stdout:9/807: write d0/d3d/d59/d4e/f70 [281475,113095] 0
2026-03-10T12:38:07.498 INFO:tasks.workunit.client.1.vm07.stdout:7/557: creat d0/d61/db4/d8a/d9d/fb7 x:0 0 0
2026-03-10T12:38:07.500 INFO:tasks.workunit.client.0.vm00.stdout:8/672: creat d0/d93/d2d/dc8/fd1 x:0 0 0
2026-03-10T12:38:07.507 INFO:tasks.workunit.client.1.vm07.stdout:5/623: read d0/d22/d18/d3e/d5d/dcf/fdd [220590,53920] 0
2026-03-10T12:38:07.507 INFO:tasks.workunit.client.1.vm07.stdout:2/492: mkdir d0/d42/d26/d38/d4f/dad 0
2026-03-10T12:38:07.507 INFO:tasks.workunit.client.0.vm00.stdout:9/808: write d0/f1a [1388176,115150] 0
2026-03-10T12:38:07.507 INFO:tasks.workunit.client.0.vm00.stdout:8/673: stat d0/d93/d60/f98 0
2026-03-10T12:38:07.507 INFO:tasks.workunit.client.1.vm07.stdout:5/624: dread d0/d22/d18/d19/d2e/f52 [0,4194304] 0
2026-03-10T12:38:07.508 INFO:tasks.workunit.client.1.vm07.stdout:5/625: stat d0/d22/d18/d19/d21/f61 0
2026-03-10T12:38:07.509 INFO:tasks.workunit.client.0.vm00.stdout:0/655: dwrite d3/d7/d4c/f76 [0,4194304] 0
2026-03-10T12:38:07.517 INFO:tasks.workunit.client.0.vm00.stdout:8/674: stat d0/f9d 0
2026-03-10T12:38:07.518 INFO:tasks.workunit.client.1.vm07.stdout:7/558: read d0/f40 [2406762,13743] 0
2026-03-10T12:38:07.519 INFO:tasks.workunit.client.1.vm07.stdout:7/559: truncate d0/d67/f71 777058 0
2026-03-10T12:38:07.519 INFO:tasks.workunit.client.1.vm07.stdout:7/560: stat d0 0
2026-03-10T12:38:07.525 INFO:tasks.workunit.client.0.vm00.stdout:6/507: rename d2/d16/d29/d31/d34/la2 to d2/d51/d70/lb7 0
2026-03-10T12:38:07.527 INFO:tasks.workunit.client.1.vm07.stdout:6/572: dwrite d1/d4/d6/d16/d1a/d2c/f78 [0,4194304] 0
2026-03-10T12:38:07.529 INFO:tasks.workunit.client.1.vm07.stdout:6/573: write d1/d4/d6/d43/d88/d97/fa2 [154535,74881] 0
2026-03-10T12:38:07.530 INFO:tasks.workunit.client.1.vm07.stdout:6/574: readlink d1/d4/d6/d16/d49/la6 0
2026-03-10T12:38:07.530 INFO:tasks.workunit.client.1.vm07.stdout:6/575: stat d1/d4/d6/d46 0
2026-03-10T12:38:07.531 INFO:tasks.workunit.client.0.vm00.stdout:1/777: rename da/d21/d27/cd1 to da/d21/db3/d59/da6/da4/dda/dc0/c105 0
2026-03-10T12:38:07.531 INFO:tasks.workunit.client.1.vm07.stdout:6/576: write d1/d4/d6/d16/d1a/d2c/f78 [4879119,22267] 0
2026-03-10T12:38:07.531 INFO:tasks.workunit.client.0.vm00.stdout:8/675: link d0/d46/d6e/d9b/faf d0/d93/d43/fd2 0
2026-03-10T12:38:07.532 INFO:tasks.workunit.client.0.vm00.stdout:8/676: write d0/d93/d17/d48/fc7 [1025605,3776] 0
2026-03-10T12:38:07.533 INFO:tasks.workunit.client.1.vm07.stdout:2/493: symlink d0/d42/d26/d38/d4f/lae 0
2026-03-10T12:38:07.533 INFO:tasks.workunit.client.1.vm07.stdout:2/494: dread - d0/d42/d26/f85 zero size
2026-03-10T12:38:07.537 INFO:tasks.workunit.client.1.vm07.stdout:6/577: dwrite d1/d4/d6/d16/d1a/d99/fa8 [0,4194304] 0
2026-03-10T12:38:07.539 INFO:tasks.workunit.client.1.vm07.stdout:6/578: write d1/d4/d6/f41 [4263447,6130] 0
2026-03-10T12:38:07.545
INFO:tasks.workunit.client.0.vm00.stdout:4/800: dread df/f1c [0,4194304] 0 2026-03-10T12:38:07.546 INFO:tasks.workunit.client.0.vm00.stdout:0/656: creat d3/d22/d3a/fd6 x:0 0 0 2026-03-10T12:38:07.546 INFO:tasks.workunit.client.0.vm00.stdout:4/801: chown df/d1f/ff9 603634129 1 2026-03-10T12:38:07.548 INFO:tasks.workunit.client.0.vm00.stdout:9/809: read d0/d7f/db8/dc4/fde [4049069,1287] 0 2026-03-10T12:38:07.551 INFO:tasks.workunit.client.1.vm07.stdout:8/588: write d1/f19 [5612220,42547] 0 2026-03-10T12:38:07.553 INFO:tasks.workunit.client.0.vm00.stdout:0/657: symlink d3/d7/db0/dc4/ld7 0 2026-03-10T12:38:07.558 INFO:tasks.workunit.client.0.vm00.stdout:8/677: symlink d0/d46/d89/ld3 0 2026-03-10T12:38:07.561 INFO:tasks.workunit.client.0.vm00.stdout:5/815: rename d1f/d26/d2b/d37/l62 to d1f/d26/d2b/d37/dc4/l120 0 2026-03-10T12:38:07.564 INFO:tasks.workunit.client.0.vm00.stdout:5/816: dwrite d1f/d6a/d94/dc3/fd9 [4194304,4194304] 0 2026-03-10T12:38:07.570 INFO:tasks.workunit.client.0.vm00.stdout:9/810: sync 2026-03-10T12:38:07.573 INFO:tasks.workunit.client.0.vm00.stdout:4/802: dwrite df/d1f/d36/d3a/d41/fc7 [0,4194304] 0 2026-03-10T12:38:07.576 INFO:tasks.workunit.client.0.vm00.stdout:3/788: rename dd/d64/f87 to dd/d2a/da2/db4/f107 0 2026-03-10T12:38:07.583 INFO:tasks.workunit.client.1.vm07.stdout:9/675: rmdir d5/d1f/d5e/d8d 0 2026-03-10T12:38:07.585 INFO:tasks.workunit.client.1.vm07.stdout:4/709: dwrite d0/f53 [0,4194304] 0 2026-03-10T12:38:07.587 INFO:tasks.workunit.client.1.vm07.stdout:0/641: mkdir d0/d14/d5f/d76/d2f/d31/d4f/d9d/dd4 0 2026-03-10T12:38:07.589 INFO:tasks.workunit.client.0.vm00.stdout:4/803: dwrite df/f4e [4194304,4194304] 0 2026-03-10T12:38:07.591 INFO:tasks.workunit.client.1.vm07.stdout:7/561: chown d0/c34 934478 1 2026-03-10T12:38:07.597 INFO:tasks.workunit.client.1.vm07.stdout:7/562: chown d0/d61/db4/f53 121742 1 2026-03-10T12:38:07.602 INFO:tasks.workunit.client.1.vm07.stdout:3/625: write dc/d18/d2d/f71 [1123396,37636] 0 2026-03-10T12:38:07.604 
INFO:tasks.workunit.client.0.vm00.stdout:9/811: sync 2026-03-10T12:38:07.609 INFO:tasks.workunit.client.0.vm00.stdout:0/658: rename d3/d7/d3c/l7b to d3/d7/d3c/ld8 0 2026-03-10T12:38:07.618 INFO:tasks.workunit.client.1.vm07.stdout:2/495: mkdir d0/d42/d4e/daf 0 2026-03-10T12:38:07.623 INFO:tasks.workunit.client.0.vm00.stdout:2/783: rename d4/d53/d68/dc2/cf7 to d4/dd/def/cfe 0 2026-03-10T12:38:07.632 INFO:tasks.workunit.client.1.vm07.stdout:6/579: rename d1/d4/d6/d16/d1a/d99/lac to d1/d4/d6/d16/d1a/d9d/db2/lb8 0 2026-03-10T12:38:07.632 INFO:tasks.workunit.client.1.vm07.stdout:6/580: dread d1/d4/d6/d43/d65/f76 [0,4194304] 0 2026-03-10T12:38:07.632 INFO:tasks.workunit.client.1.vm07.stdout:6/581: chown d1/d4/d6/l32 536519 1 2026-03-10T12:38:07.636 INFO:tasks.workunit.client.1.vm07.stdout:1/582: rename d9/df/d54 to d9/df/dc2 0 2026-03-10T12:38:07.637 INFO:tasks.workunit.client.0.vm00.stdout:9/812: creat d0/d3d/d59/d4e/dba/d19/d50/f124 x:0 0 0 2026-03-10T12:38:07.640 INFO:tasks.workunit.client.0.vm00.stdout:1/778: dread da/d24/d28/faa [0,4194304] 0 2026-03-10T12:38:07.641 INFO:tasks.workunit.client.1.vm07.stdout:9/676: creat d5/d1f/fe4 x:0 0 0 2026-03-10T12:38:07.645 INFO:tasks.workunit.client.0.vm00.stdout:1/779: dwrite da/d12/d91/fb8 [0,4194304] 0 2026-03-10T12:38:07.655 INFO:tasks.workunit.client.1.vm07.stdout:7/563: mknod d0/d61/d79/cb8 0 2026-03-10T12:38:07.655 INFO:tasks.workunit.client.0.vm00.stdout:1/780: creat da/d12/d26/dd2/ddf/f106 x:0 0 0 2026-03-10T12:38:07.655 INFO:tasks.workunit.client.0.vm00.stdout:7/549: rename da/fe to da/d26/d50/fc9 0 2026-03-10T12:38:07.655 INFO:tasks.workunit.client.0.vm00.stdout:1/781: unlink da/d12/d91/dcb/cf2 0 2026-03-10T12:38:07.655 INFO:tasks.workunit.client.0.vm00.stdout:1/782: dread - da/d24/d73/fc8 zero size 2026-03-10T12:38:07.655 INFO:tasks.workunit.client.0.vm00.stdout:9/813: getdents d0/d3d/d59/d74 0 2026-03-10T12:38:07.656 INFO:tasks.workunit.client.0.vm00.stdout:9/814: mkdir d0/d3d/d125 0 2026-03-10T12:38:07.656 
INFO:tasks.workunit.client.1.vm07.stdout:2/496: symlink d0/d29/d64/d74/d88/lb0 0 2026-03-10T12:38:07.657 INFO:tasks.workunit.client.1.vm07.stdout:2/497: truncate d0/d42/d26/f85 357930 0 2026-03-10T12:38:07.657 INFO:tasks.workunit.client.0.vm00.stdout:9/815: mkdir d0/d3d/d43/d53/d126 0 2026-03-10T12:38:07.664 INFO:tasks.workunit.client.1.vm07.stdout:0/642: rename d0/d14/d5f/d76/da1/fa2 to d0/d14/d5f/d76/d2f/d31/d4f/d9d/dd4/fd5 0 2026-03-10T12:38:07.665 INFO:tasks.workunit.client.0.vm00.stdout:6/508: rename d2/d14/f2b to d2/d42/d80/d89/fb8 0 2026-03-10T12:38:07.668 INFO:tasks.workunit.client.0.vm00.stdout:4/804: rename df/d93/dbc/ffd to df/d1f/d36/dc6/df1/f108 0 2026-03-10T12:38:07.669 INFO:tasks.workunit.client.0.vm00.stdout:4/805: dread df/d1f/d22/d26/dab/fc5 [0,4194304] 0 2026-03-10T12:38:07.671 INFO:tasks.workunit.client.0.vm00.stdout:1/783: sync 2026-03-10T12:38:07.675 INFO:tasks.workunit.client.1.vm07.stdout:9/677: creat d5/d13/d6c/d7a/fe5 x:0 0 0 2026-03-10T12:38:07.676 INFO:tasks.workunit.client.0.vm00.stdout:1/784: creat da/d21/db3/d59/da6/da4/dda/dc0/dfe/f107 x:0 0 0 2026-03-10T12:38:07.677 INFO:tasks.workunit.client.0.vm00.stdout:1/785: rmdir da 39 2026-03-10T12:38:07.678 INFO:tasks.workunit.client.0.vm00.stdout:0/659: rename d3/d22/f2e to d3/d22/d3a/fd9 0 2026-03-10T12:38:07.678 INFO:tasks.workunit.client.0.vm00.stdout:0/660: chown d3/d7/d4c/d5b/d38/fbf 41 1 2026-03-10T12:38:07.697 INFO:tasks.workunit.client.0.vm00.stdout:0/661: dwrite d3/d7/d3c/f72 [0,4194304] 0 2026-03-10T12:38:07.697 INFO:tasks.workunit.client.0.vm00.stdout:0/662: mknod d3/d7/d4c/d9d/cda 0 2026-03-10T12:38:07.697 INFO:tasks.workunit.client.0.vm00.stdout:1/786: creat da/d12/d91/f108 x:0 0 0 2026-03-10T12:38:07.697 INFO:tasks.workunit.client.1.vm07.stdout:7/564: creat d0/d52/fb9 x:0 0 0 2026-03-10T12:38:07.697 INFO:tasks.workunit.client.1.vm07.stdout:2/498: stat d0/d42/d26/f50 0 2026-03-10T12:38:07.697 INFO:tasks.workunit.client.1.vm07.stdout:0/643: mknod 
d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dc8/cd6 0 2026-03-10T12:38:07.697 INFO:tasks.workunit.client.1.vm07.stdout:6/582: getdents d1/d4/d6/d46/d4d 0 2026-03-10T12:38:07.697 INFO:tasks.workunit.client.1.vm07.stdout:1/583: mknod d9/df/d29/d2b/d92/db6/cc3 0 2026-03-10T12:38:07.697 INFO:tasks.workunit.client.1.vm07.stdout:8/589: creat d1/d3/d11/fbd x:0 0 0 2026-03-10T12:38:07.697 INFO:tasks.workunit.client.1.vm07.stdout:9/678: rename d5/d1f/d31 to d5/d13/d2c/de6 0 2026-03-10T12:38:07.701 INFO:tasks.workunit.client.1.vm07.stdout:7/565: creat d0/d61/d79/fba x:0 0 0 2026-03-10T12:38:07.706 INFO:tasks.workunit.client.0.vm00.stdout:8/678: truncate d0/d93/d2d/d49/f6c 2283608 0 2026-03-10T12:38:07.709 INFO:tasks.workunit.client.0.vm00.stdout:5/817: dwrite d1f/d26/d2e/d58/d6b/d113/f11c [4194304,4194304] 0 2026-03-10T12:38:07.711 INFO:tasks.workunit.client.0.vm00.stdout:0/663: mknod d3/d7/d3c/cdb 0 2026-03-10T12:38:07.711 INFO:tasks.workunit.client.0.vm00.stdout:7/550: read f0 [4774663,101814] 0 2026-03-10T12:38:07.718 INFO:tasks.workunit.client.0.vm00.stdout:5/818: dread d1f/d26/d2b/d35/fe8 [0,4194304] 0 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.0.vm00.stdout:5/819: stat d1f/d26/d2b/d35/d53/d72/fa0 0 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.0.vm00.stdout:5/820: mknod d1f/d26/d2b/d37/db2/c121 0 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.0.vm00.stdout:5/821: mkdir d1f/d26/d2b/d35/d78/d99/dcd/d122 0 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.0.vm00.stdout:5/822: write d1f/d26/d2e/d58/ff6 [2044476,113732] 0 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.0.vm00.stdout:0/664: symlink d3/db/d77/d82/ldc 0 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.0.vm00.stdout:5/823: rename d1f/d26/d2b/d35/d53 to d1f/d26/d2e/d58/d10c/d123 0 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.1.vm07.stdout:1/584: symlink d9/df/d29/d2b/d92/d9d/lc4 0 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.1.vm07.stdout:6/583: rename d1/d4/d6/l23 to 
d1/d4/d6/d43/d88/d97/lb9 0 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.1.vm07.stdout:9/679: fsync d5/d13/f67 0 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.1.vm07.stdout:8/590: mknod d1/d3/d6/d7b/db8/cbe 0 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.1.vm07.stdout:6/584: creat d1/d4/d6/d53/d66/fba x:0 0 0 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.1.vm07.stdout:9/680: rename d5/d13/l47 to d5/d13/d9b/le7 0 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.1.vm07.stdout:1/585: fsync d9/df/d29/d2b/f32 0 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.1.vm07.stdout:6/585: creat d1/d4/d44/fbb x:0 0 0 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.1.vm07.stdout:9/681: rmdir d5/d13/d22 39 2026-03-10T12:38:07.753 INFO:tasks.workunit.client.1.vm07.stdout:1/586: dread d9/d2d/d4f/fb5 [0,4194304] 0 2026-03-10T12:38:07.755 INFO:tasks.workunit.client.1.vm07.stdout:9/682: dread - d5/d13/d22/f9e zero size 2026-03-10T12:38:07.767 INFO:tasks.workunit.client.1.vm07.stdout:9/683: fsync d5/d13/d2c/de6/f43 0 2026-03-10T12:38:07.767 INFO:tasks.workunit.client.1.vm07.stdout:9/684: creat d5/d69/d93/d97/fe8 x:0 0 0 2026-03-10T12:38:07.767 INFO:tasks.workunit.client.1.vm07.stdout:9/685: dwrite d5/d69/d93/d97/fe3 [0,4194304] 0 2026-03-10T12:38:07.768 INFO:tasks.workunit.client.1.vm07.stdout:1/587: read d9/d2d/d80/f8d [2910955,87276] 0 2026-03-10T12:38:07.769 INFO:tasks.workunit.client.1.vm07.stdout:1/588: write d9/df/f96 [5185822,127806] 0 2026-03-10T12:38:07.772 INFO:tasks.workunit.client.0.vm00.stdout:7/551: read da/d26/d37/f96 [1554042,72777] 0 2026-03-10T12:38:07.775 INFO:tasks.workunit.client.0.vm00.stdout:7/552: stat da/d26/d37/d61/l70 0 2026-03-10T12:38:07.775 INFO:tasks.workunit.client.1.vm07.stdout:7/566: sync 2026-03-10T12:38:07.775 INFO:tasks.workunit.client.1.vm07.stdout:7/567: write d0/f2b [4018497,31851] 0 2026-03-10T12:38:07.779 INFO:tasks.workunit.client.0.vm00.stdout:2/784: write d4/d53/d76/fac [142363,71262] 0 2026-03-10T12:38:07.779 
INFO:tasks.workunit.client.1.vm07.stdout:7/568: mknod d0/d61/db4/cbb 0 2026-03-10T12:38:07.782 INFO:tasks.workunit.client.0.vm00.stdout:3/789: dwrite dd/d3d/d8a/de0/f95 [4194304,4194304] 0 2026-03-10T12:38:07.788 INFO:tasks.workunit.client.0.vm00.stdout:9/816: dwrite d0/d3d/d59/d4e/dba/d1e/d2b/f47 [0,4194304] 0 2026-03-10T12:38:07.788 INFO:tasks.workunit.client.1.vm07.stdout:7/569: link d0/d47/f73 d0/d67/d6f/d80/fbc 0 2026-03-10T12:38:07.789 INFO:tasks.workunit.client.1.vm07.stdout:7/570: dread - d0/d61/d79/f83 zero size 2026-03-10T12:38:07.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:07 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:07.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:07 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr fail", "who": "vm00.nescmq"}]: dispatch 2026-03-10T12:38:07.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:07 vm07.local ceph-mon[58582]: osdmap e40: 6 total, 6 up, 6 in 2026-03-10T12:38:07.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:07 vm07.local ceph-mon[58582]: from='mgr.14223 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd='[{"prefix": "mgr fail", "who": "vm00.nescmq"}]': finished 2026-03-10T12:38:07.789 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:07 vm07.local ceph-mon[58582]: mgrmap e20: vm07.kfawlb(active, starting, since 0.0440459s) 2026-03-10T12:38:07.789 INFO:tasks.workunit.client.0.vm00.stdout:9/817: write d0/d7f/db8/f11b [533117,52793] 0 2026-03-10T12:38:07.796 INFO:tasks.workunit.client.0.vm00.stdout:3/790: creat dd/d3d/d73/f108 x:0 0 0 2026-03-10T12:38:07.800 INFO:tasks.workunit.client.0.vm00.stdout:3/791: chown dd/ff1 20 1 2026-03-10T12:38:07.807 INFO:tasks.workunit.client.0.vm00.stdout:2/785: unlink d4/d6/d2d/fa9 0 2026-03-10T12:38:07.807 
INFO:tasks.workunit.client.0.vm00.stdout:2/786: read d4/dd/db9/f7a [232624,65470] 0 2026-03-10T12:38:07.807 INFO:tasks.workunit.client.0.vm00.stdout:2/787: fsync d4/f67 0 2026-03-10T12:38:07.812 INFO:tasks.workunit.client.0.vm00.stdout:9/818: rmdir d0/d5 39 2026-03-10T12:38:07.816 INFO:tasks.workunit.client.0.vm00.stdout:3/792: dread dd/d3d/d8a/f8b [0,4194304] 0 2026-03-10T12:38:07.822 INFO:tasks.workunit.client.1.vm07.stdout:5/626: dwrite d0/d22/d18/d19/d2e/f88 [4194304,4194304] 0 2026-03-10T12:38:07.826 INFO:tasks.workunit.client.0.vm00.stdout:9/819: mknod d0/d3d/d59/d4e/dba/d1e/d27/c127 0 2026-03-10T12:38:07.833 INFO:tasks.workunit.client.0.vm00.stdout:3/793: stat dd/d27/l79 0 2026-03-10T12:38:07.835 INFO:tasks.workunit.client.0.vm00.stdout:4/806: truncate df/d1f/d36/d3a/fd5 50786 0 2026-03-10T12:38:07.841 INFO:tasks.workunit.client.0.vm00.stdout:1/787: dwrite da/d24/d28/d67/f6c [0,4194304] 0 2026-03-10T12:38:07.843 INFO:tasks.workunit.client.0.vm00.stdout:1/788: fsync da/d24/d5a/d71/dd4/ff7 0 2026-03-10T12:38:07.843 INFO:tasks.workunit.client.0.vm00.stdout:1/789: chown da/d12/c25 185956 1 2026-03-10T12:38:07.844 INFO:tasks.workunit.client.0.vm00.stdout:1/790: write da/d12/d26/f31 [5168005,97013] 0 2026-03-10T12:38:07.845 INFO:tasks.workunit.client.0.vm00.stdout:1/791: write da/d24/d73/fc8 [379505,56438] 0 2026-03-10T12:38:07.845 INFO:tasks.workunit.client.0.vm00.stdout:1/792: write da/d21/f74 [5003125,57600] 0 2026-03-10T12:38:07.846 INFO:tasks.workunit.client.0.vm00.stdout:8/679: write d0/dd/f2b [4829564,60599] 0 2026-03-10T12:38:07.847 INFO:tasks.workunit.client.0.vm00.stdout:1/793: stat da/d21/db3/d59/da6/da4/dda/dc0/dc3 0 2026-03-10T12:38:07.855 INFO:tasks.workunit.client.0.vm00.stdout:2/788: mknod d4/d78/df9/cff 0 2026-03-10T12:38:07.860 INFO:tasks.workunit.client.0.vm00.stdout:9/820: creat d0/d3d/d43/f128 x:0 0 0 2026-03-10T12:38:07.865 INFO:tasks.workunit.client.0.vm00.stdout:4/807: mknod df/d1f/d36/dc6/c109 0 2026-03-10T12:38:07.866 
INFO:tasks.workunit.client.0.vm00.stdout:4/808: truncate df/d1f/d22/d26/d65/d91/fff 132391 0 2026-03-10T12:38:07.876 INFO:tasks.workunit.client.0.vm00.stdout:8/680: fsync d0/d46/d6e/d9b/faf 0 2026-03-10T12:38:07.877 INFO:tasks.workunit.client.0.vm00.stdout:7/553: dread da/f23 [0,4194304] 0 2026-03-10T12:38:07.879 INFO:tasks.workunit.client.0.vm00.stdout:7/554: creat da/d1b/d40/fca x:0 0 0 2026-03-10T12:38:07.881 INFO:tasks.workunit.client.0.vm00.stdout:2/789: mknod d4/d53/d9e/c100 0 2026-03-10T12:38:07.882 INFO:tasks.workunit.client.0.vm00.stdout:3/794: mknod dd/d18/c109 0 2026-03-10T12:38:07.885 INFO:tasks.workunit.client.0.vm00.stdout:5/824: write d1f/d26/d2b/d37/f81 [1507134,121609] 0 2026-03-10T12:38:07.891 INFO:tasks.workunit.client.0.vm00.stdout:4/809: chown df/d32/d76/c9f 2051870245 1 2026-03-10T12:38:07.897 INFO:tasks.workunit.client.0.vm00.stdout:1/794: truncate da/d24/d73/fb6 159506 0 2026-03-10T12:38:07.897 INFO:tasks.workunit.client.0.vm00.stdout:3/795: dread - dd/d18/d13/d1d/dc6/d106/f104 zero size 2026-03-10T12:38:07.897 INFO:tasks.workunit.client.0.vm00.stdout:4/810: creat df/d93/dbc/f10a x:0 0 0 2026-03-10T12:38:07.898 INFO:tasks.workunit.client.0.vm00.stdout:4/811: rmdir df/d1f/d22/dcb 39 2026-03-10T12:38:07.899 INFO:tasks.workunit.client.0.vm00.stdout:3/796: symlink dd/d3d/l10a 0 2026-03-10T12:38:07.902 INFO:tasks.workunit.client.0.vm00.stdout:5/825: getdents d1f/d26/d2e/d58/d10c/d123/d5b/dd1 0 2026-03-10T12:38:07.904 INFO:tasks.workunit.client.0.vm00.stdout:4/812: mknod df/d1f/d22/c10b 0 2026-03-10T12:38:07.906 INFO:tasks.workunit.client.0.vm00.stdout:3/797: link dd/d18/d14/c46 dd/d64/d93/c10b 0 2026-03-10T12:38:07.907 INFO:tasks.workunit.client.0.vm00.stdout:5/826: dwrite d1f/d26/d2b/de4/f11d [0,4194304] 0 2026-03-10T12:38:07.908 INFO:tasks.workunit.client.0.vm00.stdout:7/555: sync 2026-03-10T12:38:07.908 INFO:tasks.workunit.client.0.vm00.stdout:2/790: sync 2026-03-10T12:38:07.908 INFO:tasks.workunit.client.0.vm00.stdout:7/556: readlink 
da/d1b/l6d 0 2026-03-10T12:38:07.912 INFO:tasks.workunit.client.0.vm00.stdout:7/557: mknod da/d1b/ccb 0 2026-03-10T12:38:07.914 INFO:tasks.workunit.client.0.vm00.stdout:7/558: read da/d1b/f39 [2855545,17183] 0 2026-03-10T12:38:07.914 INFO:tasks.workunit.client.1.vm07.stdout:4/710: write d0/d4/d10/d3c/d2b/d54/de1/f91 [319676,41200] 0 2026-03-10T12:38:07.916 INFO:tasks.workunit.client.0.vm00.stdout:7/559: creat da/d41/d48/d81/fcc x:0 0 0 2026-03-10T12:38:07.917 INFO:tasks.workunit.client.1.vm07.stdout:4/711: creat d0/d5c/d7c/ff9 x:0 0 0 2026-03-10T12:38:07.917 INFO:tasks.workunit.client.0.vm00.stdout:7/560: chown da/d26/d37/f4a 63450 1 2026-03-10T12:38:07.924 INFO:tasks.workunit.client.0.vm00.stdout:3/798: creat dd/d18/d13/d1d/f10c x:0 0 0 2026-03-10T12:38:07.930 INFO:tasks.workunit.client.1.vm07.stdout:4/712: dread d0/d4/d10/d5f/f63 [0,4194304] 0 2026-03-10T12:38:07.931 INFO:tasks.workunit.client.1.vm07.stdout:4/713: write d0/d4/df2/df6/fcd [3012565,66734] 0 2026-03-10T12:38:07.931 INFO:tasks.workunit.client.0.vm00.stdout:2/791: fsync d4/f73 0 2026-03-10T12:38:07.935 INFO:tasks.workunit.client.1.vm07.stdout:4/714: dread d0/d4/d10/fc7 [0,4194304] 0 2026-03-10T12:38:07.938 INFO:tasks.workunit.client.1.vm07.stdout:4/715: mknod d0/d4/d5/d78/dc5/df7/db2/dd5/cfa 0 2026-03-10T12:38:07.940 INFO:tasks.workunit.client.0.vm00.stdout:2/792: chown d4/d53/d76/cc0 1463 1 2026-03-10T12:38:07.941 INFO:tasks.workunit.client.1.vm07.stdout:3/626: dwrite dc/dd/d28/f46 [0,4194304] 0 2026-03-10T12:38:07.943 INFO:tasks.workunit.client.1.vm07.stdout:4/716: fdatasync d0/d4/df2/df6/f27 0 2026-03-10T12:38:07.945 INFO:tasks.workunit.client.0.vm00.stdout:7/561: sync 2026-03-10T12:38:07.946 INFO:tasks.workunit.client.1.vm07.stdout:3/627: dread dc/d18/d24/f55 [0,4194304] 0 2026-03-10T12:38:07.948 INFO:tasks.workunit.client.0.vm00.stdout:3/799: sync 2026-03-10T12:38:07.949 INFO:tasks.workunit.client.0.vm00.stdout:3/800: write dd/d64/d93/ff7 [304983,101466] 0 2026-03-10T12:38:07.957 
INFO:tasks.workunit.client.1.vm07.stdout:4/717: rename d0/d4/d5/da/l26 to d0/d4/df2/df6/lfb 0 2026-03-10T12:38:07.959 INFO:tasks.workunit.client.0.vm00.stdout:2/793: chown d4/dd/db9/d6d/c47 5960715 1 2026-03-10T12:38:07.962 INFO:tasks.workunit.client.0.vm00.stdout:4/813: dread df/d1f/d36/d3a/fe1 [0,4194304] 0 2026-03-10T12:38:07.964 INFO:tasks.workunit.client.0.vm00.stdout:2/794: dwrite d4/d6/d2d/d3a/d43/d85/fa3 [0,4194304] 0 2026-03-10T12:38:07.970 INFO:tasks.workunit.client.0.vm00.stdout:7/562: link da/d26/f97 da/d25/d2c/d82/d68/fcd 0 2026-03-10T12:38:07.981 INFO:tasks.workunit.client.0.vm00.stdout:7/563: dread f0 [0,4194304] 0 2026-03-10T12:38:07.985 INFO:tasks.workunit.client.0.vm00.stdout:3/801: read - dd/d3d/d8a/de0/de4/fc5 zero size 2026-03-10T12:38:07.985 INFO:tasks.workunit.client.0.vm00.stdout:3/802: chown dd/d3d/d65 722094 1 2026-03-10T12:38:07.988 INFO:tasks.workunit.client.0.vm00.stdout:2/795: mkdir d4/d53/d9e/d101 0 2026-03-10T12:38:07.988 INFO:tasks.workunit.client.0.vm00.stdout:7/564: rename da/d1b/f1e to da/d26/d50/d73/fce 0 2026-03-10T12:38:07.989 INFO:tasks.workunit.client.0.vm00.stdout:2/796: fsync d4/d53/d68/f69 0 2026-03-10T12:38:07.989 INFO:tasks.workunit.client.0.vm00.stdout:7/565: chown da/d26/d37/d61 149 1 2026-03-10T12:38:07.992 INFO:tasks.workunit.client.1.vm07.stdout:3/628: dread dc/d18/d24/f3f [0,4194304] 0 2026-03-10T12:38:07.995 INFO:tasks.workunit.client.0.vm00.stdout:6/509: dwrite d2/d16/d74/f62 [0,4194304] 0 2026-03-10T12:38:07.998 INFO:tasks.workunit.client.1.vm07.stdout:2/499: write d0/d42/d26/d38/f3d [1339306,107435] 0 2026-03-10T12:38:08.004 INFO:tasks.workunit.client.0.vm00.stdout:6/510: sync 2026-03-10T12:38:08.008 INFO:tasks.workunit.client.0.vm00.stdout:9/821: write d0/d3d/d59/d4e/dba/f24 [2584642,116454] 0 2026-03-10T12:38:08.008 INFO:tasks.workunit.client.1.vm07.stdout:0/644: write d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/f9b [2598024,20389] 0 2026-03-10T12:38:08.009 INFO:tasks.workunit.client.0.vm00.stdout:8/681: write 
d0/dd/d38/f3d [1860850,96120] 0 2026-03-10T12:38:08.013 INFO:tasks.workunit.client.0.vm00.stdout:8/682: dwrite d0/f10 [0,4194304] 0 2026-03-10T12:38:08.020 INFO:tasks.workunit.client.0.vm00.stdout:8/683: dread - d0/d46/fc6 zero size 2026-03-10T12:38:08.020 INFO:tasks.workunit.client.0.vm00.stdout:1/795: dwrite da/d21/d27/fa0 [0,4194304] 0 2026-03-10T12:38:08.020 INFO:tasks.workunit.client.0.vm00.stdout:1/796: dread - da/d24/d28/d67/fed zero size 2026-03-10T12:38:08.020 INFO:tasks.workunit.client.1.vm07.stdout:8/591: dwrite d1/d3/d6c/f74 [0,4194304] 0 2026-03-10T12:38:08.021 INFO:tasks.workunit.client.0.vm00.stdout:0/665: dwrite d3/db/da4/fa7 [0,4194304] 0 2026-03-10T12:38:08.027 INFO:tasks.workunit.client.0.vm00.stdout:5/827: link d1f/d26/cb4 d1f/d96/dbd/df0/c124 0 2026-03-10T12:38:08.031 INFO:tasks.workunit.client.0.vm00.stdout:4/814: symlink df/d1f/d22/d26/d65/d91/l10c 0 2026-03-10T12:38:08.033 INFO:tasks.workunit.client.0.vm00.stdout:4/815: fdatasync df/d1f/d36/d3a/d41/fc7 0 2026-03-10T12:38:08.033 INFO:tasks.workunit.client.1.vm07.stdout:3/629: chown dc/dd/c1a 0 1 2026-03-10T12:38:08.035 INFO:tasks.workunit.client.0.vm00.stdout:6/511: rename d2/d39 to d2/d14/d7a/db9 0 2026-03-10T12:38:08.037 INFO:tasks.workunit.client.1.vm07.stdout:6/586: dwrite d1/d4/d6/d43/d65/f7f [0,4194304] 0 2026-03-10T12:38:08.038 INFO:tasks.workunit.client.0.vm00.stdout:4/816: dwrite df/d1f/d22/d26/d65/d91/d101/fe6 [0,4194304] 0 2026-03-10T12:38:08.045 INFO:tasks.workunit.client.1.vm07.stdout:9/686: write d5/d13/d57/d4f/d6a/f8a [768633,12457] 0 2026-03-10T12:38:08.051 INFO:tasks.workunit.client.0.vm00.stdout:2/797: fsync d4/d53/d76/d9b/dad/f65 0 2026-03-10T12:38:08.051 INFO:tasks.workunit.client.1.vm07.stdout:1/589: dwrite d9/d2d/d4f/d5a/f65 [0,4194304] 0 2026-03-10T12:38:08.051 INFO:tasks.workunit.client.1.vm07.stdout:2/500: symlink d0/d45/lb1 0 2026-03-10T12:38:08.051 INFO:tasks.workunit.client.1.vm07.stdout:2/501: chown d0/d42/d4e/d77/d70 429 1 2026-03-10T12:38:08.051 
INFO:tasks.workunit.client.1.vm07.stdout:1/590: chown d9/df/d29/f82 0 1 2026-03-10T12:38:08.060 INFO:tasks.workunit.client.1.vm07.stdout:0/645: rmdir d0/d14/d5f/d76/d2f/d31/d4f/d60/d87 39 2026-03-10T12:38:08.063 INFO:tasks.workunit.client.0.vm00.stdout:8/684: unlink d0/dd/le 0 2026-03-10T12:38:08.065 INFO:tasks.workunit.client.0.vm00.stdout:7/566: mknod da/d26/d50/ccf 0 2026-03-10T12:38:08.068 INFO:tasks.workunit.client.0.vm00.stdout:4/817: symlink df/d1f/d22/d26/d65/d91/db9/l10d 0 2026-03-10T12:38:08.070 INFO:tasks.workunit.client.1.vm07.stdout:7/571: dwrite d0/d61/db4/d8a/d9d/fb1 [0,4194304] 0 2026-03-10T12:38:08.071 INFO:tasks.workunit.client.0.vm00.stdout:3/803: symlink dd/d3d/d8a/de0/d55/dfd/l10d 0 2026-03-10T12:38:08.073 INFO:tasks.workunit.client.0.vm00.stdout:2/798: mkdir d4/dd/d102 0 2026-03-10T12:38:08.074 INFO:tasks.workunit.client.0.vm00.stdout:8/685: mknod d0/d58/cd4 0 2026-03-10T12:38:08.075 INFO:tasks.workunit.client.0.vm00.stdout:8/686: chown d0/d46/d7e/f8a 3894902 1 2026-03-10T12:38:08.075 INFO:tasks.workunit.client.1.vm07.stdout:8/592: symlink d1/d3/d6/d7b/db8/lbf 0 2026-03-10T12:38:08.075 INFO:tasks.workunit.client.0.vm00.stdout:1/797: symlink da/l109 0 2026-03-10T12:38:08.077 INFO:tasks.workunit.client.0.vm00.stdout:6/512: readlink d2/l7 0 2026-03-10T12:38:08.079 INFO:tasks.workunit.client.1.vm07.stdout:5/627: dwrite d0/d22/d18/d3e/d53/d9e/f76 [0,4194304] 0 2026-03-10T12:38:08.095 INFO:tasks.workunit.client.1.vm07.stdout:6/587: fsync d1/f38 0 2026-03-10T12:38:08.095 INFO:tasks.workunit.client.0.vm00.stdout:3/804: mkdir dd/d2a/da2/d10e 0 2026-03-10T12:38:08.095 INFO:tasks.workunit.client.0.vm00.stdout:1/798: fdatasync da/d24/d28/d67/da2/f9c 0 2026-03-10T12:38:08.095 INFO:tasks.workunit.client.0.vm00.stdout:2/799: creat d4/d53/d76/d9b/dad/d8e/f103 x:0 0 0 2026-03-10T12:38:08.095 INFO:tasks.workunit.client.0.vm00.stdout:1/799: mkdir da/d12/d26/d10a 0 2026-03-10T12:38:08.098 INFO:tasks.workunit.client.0.vm00.stdout:1/800: dread 
da/d21/db3/d5d/d72/d7e/fac [0,4194304] 0 2026-03-10T12:38:08.098 INFO:tasks.workunit.client.1.vm07.stdout:1/591: symlink d9/df/d29/d2b/d92/db6/lc5 0 2026-03-10T12:38:08.101 INFO:tasks.workunit.client.0.vm00.stdout:1/801: rename da/d24/d73/dc6 to da/d24/d28/d67/da2/d10b 0 2026-03-10T12:38:08.102 INFO:tasks.workunit.client.0.vm00.stdout:1/802: mkdir da/d24/d5a/d71/d10c 0 2026-03-10T12:38:08.106 INFO:tasks.workunit.client.1.vm07.stdout:0/646: mkdir d0/d14/d5f/d76/d2f/d31/d79/dd7 0 2026-03-10T12:38:08.109 INFO:tasks.workunit.client.0.vm00.stdout:1/803: rename da/d12/f62 to da/d21/db3/d5d/d72/d7e/dbf/f10d 0 2026-03-10T12:38:08.112 INFO:tasks.workunit.client.1.vm07.stdout:3/630: mknod dc/d18/cd5 0 2026-03-10T12:38:08.113 INFO:tasks.workunit.client.0.vm00.stdout:1/804: mkdir da/d21/db3/d59/da6/da4/dda/dc0/dfe/d10e 0 2026-03-10T12:38:08.114 INFO:tasks.workunit.client.1.vm07.stdout:5/628: fsync d0/d22/f89 0 2026-03-10T12:38:08.115 INFO:tasks.workunit.client.0.vm00.stdout:1/805: creat da/d24/d28/d67/da2/f10f x:0 0 0 2026-03-10T12:38:08.149 INFO:tasks.workunit.client.1.vm07.stdout:6/588: truncate d1/d4/d6/d4e/d64/f6f 6062115 0 2026-03-10T12:38:08.152 INFO:tasks.workunit.client.1.vm07.stdout:1/592: creat d9/df/d29/d2b/d31/fc6 x:0 0 0 2026-03-10T12:38:08.165 INFO:tasks.workunit.client.1.vm07.stdout:0/647: unlink d0/d14/d5f/d41/d6a/d74/fc2 0 2026-03-10T12:38:08.165 INFO:tasks.workunit.client.1.vm07.stdout:7/572: link d0/d61/f93 d0/d47/dab/dae/fbd 0 2026-03-10T12:38:08.165 INFO:tasks.workunit.client.1.vm07.stdout:3/631: rename dc/dd/d28/d3b/f9e to dc/dd/d43/d5c/fd6 0 2026-03-10T12:38:08.165 INFO:tasks.workunit.client.1.vm07.stdout:5/629: symlink d0/d22/d18/d19/d36/d75/ldf 0 2026-03-10T12:38:08.165 INFO:tasks.workunit.client.1.vm07.stdout:9/687: link d5/l80 d5/d13/d2c/le9 0 2026-03-10T12:38:08.166 INFO:tasks.workunit.client.1.vm07.stdout:2/502: rmdir d0/d5b/d98/da8 0 2026-03-10T12:38:08.171 INFO:tasks.workunit.client.1.vm07.stdout:1/593: mkdir d9/d2d/d80/d8e/dc7 0 
2026-03-10T12:38:08.174 INFO:tasks.workunit.client.0.vm00.stdout:4/818: dread df/d1f/d22/f52 [0,4194304] 0 2026-03-10T12:38:08.174 INFO:tasks.workunit.client.0.vm00.stdout:4/819: chown df/d63 6287 1 2026-03-10T12:38:08.176 INFO:tasks.workunit.client.0.vm00.stdout:4/820: rmdir df/d32/d64 39 2026-03-10T12:38:08.177 INFO:tasks.workunit.client.0.vm00.stdout:4/821: mkdir df/d1f/d22/d26/d65/da7/d10e 0 2026-03-10T12:38:08.178 INFO:tasks.workunit.client.0.vm00.stdout:4/822: truncate df/d1f/d22/d26/d65/d91/db9/fea 66268 0 2026-03-10T12:38:08.182 INFO:tasks.workunit.client.0.vm00.stdout:4/823: dwrite f3 [0,4194304] 0 2026-03-10T12:38:08.186 INFO:tasks.workunit.client.0.vm00.stdout:4/824: mknod df/d1f/d36/dc6/c10f 0 2026-03-10T12:38:08.186 INFO:tasks.workunit.client.1.vm07.stdout:2/503: unlink d0/d42/d4e/l7e 0 2026-03-10T12:38:08.188 INFO:tasks.workunit.client.1.vm07.stdout:1/594: mkdir d9/df/d29/d2b/d3d/dc8 0 2026-03-10T12:38:08.189 INFO:tasks.workunit.client.0.vm00.stdout:4/825: mkdir df/d6c/d90/d110 0 2026-03-10T12:38:08.189 INFO:tasks.workunit.client.1.vm07.stdout:8/593: getdents d1/d3/d5d 0 2026-03-10T12:38:08.195 INFO:tasks.workunit.client.0.vm00.stdout:4/826: dread df/d1f/d36/f69 [0,4194304] 0 2026-03-10T12:38:08.195 INFO:tasks.workunit.client.1.vm07.stdout:6/589: creat d1/d4/d6/d16/fbc x:0 0 0 2026-03-10T12:38:08.196 INFO:tasks.workunit.client.1.vm07.stdout:6/590: stat d1/d4/d6/d16/d1a/d9d 0 2026-03-10T12:38:08.200 INFO:tasks.workunit.client.1.vm07.stdout:2/504: rename d0/d29/d64/fa6 to d0/d42/d1f/d90/fb2 0 2026-03-10T12:38:08.201 INFO:tasks.workunit.client.1.vm07.stdout:2/505: chown d0/d42/d26/d7d 13972877 1 2026-03-10T12:38:08.205 INFO:tasks.workunit.client.1.vm07.stdout:3/632: link dc/d18/d24/f49 dc/dd/d1f/dac/fd7 0 2026-03-10T12:38:08.205 INFO:tasks.workunit.client.0.vm00.stdout:4/827: mkdir df/d1f/d36/d3a/d41/d111 0 2026-03-10T12:38:08.207 INFO:tasks.workunit.client.1.vm07.stdout:3/633: read dc/dd/f85 [2671089,110974] 0 2026-03-10T12:38:08.212 
INFO:tasks.workunit.client.1.vm07.stdout:2/506: creat d0/d29/fb3 x:0 0 0 2026-03-10T12:38:08.215 INFO:tasks.workunit.client.1.vm07.stdout:2/507: dwrite d0/d42/d4e/d77/f89 [0,4194304] 0 2026-03-10T12:38:08.224 INFO:tasks.workunit.client.1.vm07.stdout:1/595: mkdir d9/df/dc9 0 2026-03-10T12:38:08.228 INFO:tasks.workunit.client.1.vm07.stdout:5/630: getdents d0/d22/d18/d19/d2e/da9 0 2026-03-10T12:38:08.242 INFO:tasks.workunit.client.1.vm07.stdout:3/634: mkdir dc/dd/d43/d76/d95/dd8 0 2026-03-10T12:38:08.243 INFO:tasks.workunit.client.1.vm07.stdout:2/508: mknod d0/d29/d64/d6c/d94/cb4 0 2026-03-10T12:38:08.243 INFO:tasks.workunit.client.1.vm07.stdout:5/631: creat d0/d22/d18/fe0 x:0 0 0 2026-03-10T12:38:08.243 INFO:tasks.workunit.client.1.vm07.stdout:5/632: chown d0/d22/l92 24 1 2026-03-10T12:38:08.247 INFO:tasks.workunit.client.1.vm07.stdout:5/633: unlink d0/d22/d18/d3e/d53/fce 0 2026-03-10T12:38:08.247 INFO:tasks.workunit.client.1.vm07.stdout:5/634: stat d0/d22/d18/d19/d2e/l78 0 2026-03-10T12:38:08.250 INFO:tasks.workunit.client.1.vm07.stdout:2/509: dread d0/d42/d26/d7d/f9a [0,4194304] 0 2026-03-10T12:38:08.251 INFO:tasks.workunit.client.1.vm07.stdout:5/635: dwrite d0/d22/d18/d3e/d5d/f6d [0,4194304] 0 2026-03-10T12:38:08.261 INFO:tasks.workunit.client.1.vm07.stdout:4/718: dwrite d0/d4/d5/d34/f5d [0,4194304] 0 2026-03-10T12:38:08.267 INFO:tasks.workunit.client.1.vm07.stdout:2/510: mkdir d0/d29/d64/db5 0 2026-03-10T12:38:08.272 INFO:tasks.workunit.client.1.vm07.stdout:2/511: creat d0/d80/d93/fb6 x:0 0 0 2026-03-10T12:38:08.273 INFO:tasks.workunit.client.1.vm07.stdout:4/719: creat d0/d4/d5/ffc x:0 0 0 2026-03-10T12:38:08.280 INFO:tasks.workunit.client.1.vm07.stdout:4/720: link d0/d4/d10/d3c/d2b/d54/de1/ca9 d0/d5c/d7c/cfd 0 2026-03-10T12:38:08.281 INFO:tasks.workunit.client.1.vm07.stdout:4/721: truncate d0/d4/d5/d8f/fdd 603766 0 2026-03-10T12:38:08.282 INFO:tasks.workunit.client.1.vm07.stdout:4/722: stat d0/d4/d10/d3c/d2b/d54/ldc 0 2026-03-10T12:38:08.284 
INFO:tasks.workunit.client.1.vm07.stdout:4/723: rmdir d0/d4/d10/d9a 39 2026-03-10T12:38:08.286 INFO:tasks.workunit.client.1.vm07.stdout:4/724: dread - d0/d4/d5/da/d66/fa8 zero size 2026-03-10T12:38:08.350 INFO:tasks.workunit.client.0.vm00.stdout:9/822: write d0/d3d/d59/fad [3076890,33820] 0 2026-03-10T12:38:08.354 INFO:tasks.workunit.client.0.vm00.stdout:7/567: dwrite da/d25/d2c/f98 [0,4194304] 0 2026-03-10T12:38:08.354 INFO:tasks.workunit.client.0.vm00.stdout:6/513: dwrite d2/d14/d7a/f8d [0,4194304] 0 2026-03-10T12:38:08.354 INFO:tasks.workunit.client.0.vm00.stdout:0/666: dwrite d3/d40/d65/f92 [0,4194304] 0 2026-03-10T12:38:08.363 INFO:tasks.workunit.client.0.vm00.stdout:5/828: write d1f/d26/d2b/fce [138280,73438] 0 2026-03-10T12:38:08.363 INFO:tasks.workunit.client.0.vm00.stdout:9/823: creat d0/d3d/d59/d4e/dba/d19/d50/f129 x:0 0 0 2026-03-10T12:38:08.368 INFO:tasks.workunit.client.0.vm00.stdout:7/568: mkdir da/d41/d7b/d9d/dc8/dd0 0 2026-03-10T12:38:08.369 INFO:tasks.workunit.client.0.vm00.stdout:5/829: dwrite d1f/d6a/d94/dc9/fae [0,4194304] 0 2026-03-10T12:38:08.369 INFO:tasks.workunit.client.0.vm00.stdout:0/667: fsync d3/d7/d4c/d5b/d38/f8b 0 2026-03-10T12:38:08.371 INFO:tasks.workunit.client.1.vm07.stdout:0/648: write d0/d14/d5f/d3b/f4b [3355849,112360] 0 2026-03-10T12:38:08.371 INFO:tasks.workunit.client.1.vm07.stdout:7/573: write d0/d57/d62/f75 [640500,36910] 0 2026-03-10T12:38:08.373 INFO:tasks.workunit.client.1.vm07.stdout:0/649: chown d0/d14/d5f/l29 3433 1 2026-03-10T12:38:08.374 INFO:tasks.workunit.client.0.vm00.stdout:8/687: write d0/d93/d60/f7f [1947078,60594] 0 2026-03-10T12:38:08.375 INFO:tasks.workunit.client.0.vm00.stdout:8/688: dread - d0/d93/d2d/d49/fae zero size 2026-03-10T12:38:08.376 INFO:tasks.workunit.client.0.vm00.stdout:8/689: readlink d0/d58/d68/l90 0 2026-03-10T12:38:08.376 INFO:tasks.workunit.client.1.vm07.stdout:9/688: dwrite d5/d13/f67 [0,4194304] 0 2026-03-10T12:38:08.380 INFO:tasks.workunit.client.0.vm00.stdout:6/514: sync 
2026-03-10T12:38:08.381 INFO:tasks.workunit.client.0.vm00.stdout:9/824: creat d0/d3d/df2/f12a x:0 0 0 2026-03-10T12:38:08.382 INFO:tasks.workunit.client.1.vm07.stdout:8/594: dwrite d1/d3/d6/d54/f9c [0,4194304] 0 2026-03-10T12:38:08.384 INFO:tasks.workunit.client.0.vm00.stdout:3/805: dwrite dd/d3d/d8a/f8b [0,4194304] 0 2026-03-10T12:38:08.384 INFO:tasks.workunit.client.1.vm07.stdout:7/574: fsync d0/d47/dab/dae/fbd 0 2026-03-10T12:38:08.389 INFO:tasks.workunit.client.0.vm00.stdout:3/806: sync 2026-03-10T12:38:08.389 INFO:tasks.workunit.client.0.vm00.stdout:3/807: stat dd/d18 0 2026-03-10T12:38:08.395 INFO:tasks.workunit.client.0.vm00.stdout:8/690: symlink d0/d58/d68/ld5 0 2026-03-10T12:38:08.398 INFO:tasks.workunit.client.1.vm07.stdout:0/650: symlink d0/d14/d5f/d76/d2f/d31/d4f/d9d/dd4/ld8 0 2026-03-10T12:38:08.408 INFO:tasks.workunit.client.0.vm00.stdout:9/825: creat d0/d3d/f12b x:0 0 0 2026-03-10T12:38:08.409 INFO:tasks.workunit.client.0.vm00.stdout:2/800: dwrite d4/d53/d76/f92 [0,4194304] 0 2026-03-10T12:38:08.409 INFO:tasks.workunit.client.0.vm00.stdout:8/691: read - d0/d93/d36/d5b/f95 zero size 2026-03-10T12:38:08.410 INFO:tasks.workunit.client.0.vm00.stdout:6/515: fsync d2/d14/d7a/db9/f6c 0 2026-03-10T12:38:08.418 INFO:tasks.workunit.client.1.vm07.stdout:1/596: write d9/d2d/d80/d8e/fa0 [30685,27594] 0 2026-03-10T12:38:08.419 INFO:tasks.workunit.client.0.vm00.stdout:0/668: rename d3/d7/d3c/l3d to d3/db/d24/ldd 0 2026-03-10T12:38:08.420 INFO:tasks.workunit.client.0.vm00.stdout:5/830: rmdir d1f/d26/d2b/de4/d103 0 2026-03-10T12:38:08.430 INFO:tasks.workunit.client.0.vm00.stdout:6/516: creat d2/d16/d29/d31/d88/d92/fba x:0 0 0 2026-03-10T12:38:08.433 INFO:tasks.workunit.client.0.vm00.stdout:2/801: creat d4/d6/dca/f104 x:0 0 0 2026-03-10T12:38:08.434 INFO:tasks.workunit.client.0.vm00.stdout:2/802: fsync d4/d53/d76/d9b/dad/d8e/f103 0 2026-03-10T12:38:08.434 INFO:tasks.workunit.client.1.vm07.stdout:7/575: creat d0/d61/db4/d8a/fbe x:0 0 0 2026-03-10T12:38:08.436 
INFO:tasks.workunit.client.0.vm00.stdout:2/803: dread - d4/d6/d93/fdb zero size 2026-03-10T12:38:08.438 INFO:tasks.workunit.client.1.vm07.stdout:5/636: dwrite d0/d22/d18/d3e/d53/fa3 [0,4194304] 0 2026-03-10T12:38:08.442 INFO:tasks.workunit.client.0.vm00.stdout:0/669: rmdir d3/d7/d58 39 2026-03-10T12:38:08.444 INFO:tasks.workunit.client.0.vm00.stdout:1/806: dwrite f3 [0,4194304] 0 2026-03-10T12:38:08.447 INFO:tasks.workunit.client.1.vm07.stdout:0/651: dwrite d0/d14/d5f/d76/d2f/d31/d79/d85/fc6 [0,4194304] 0 2026-03-10T12:38:08.449 INFO:tasks.workunit.client.0.vm00.stdout:2/804: sync 2026-03-10T12:38:08.455 INFO:tasks.workunit.client.1.vm07.stdout:2/512: write d0/f4 [1788990,88887] 0 2026-03-10T12:38:08.470 INFO:tasks.workunit.client.1.vm07.stdout:7/576: symlink d0/d61/d79/db5/lbf 0 2026-03-10T12:38:08.480 INFO:tasks.workunit.client.1.vm07.stdout:0/652: readlink d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/l7d 0 2026-03-10T12:38:08.483 INFO:tasks.workunit.client.1.vm07.stdout:2/513: dread d0/d42/d26/d38/d4f/f65 [0,4194304] 0 2026-03-10T12:38:08.484 INFO:tasks.workunit.client.1.vm07.stdout:2/514: chown d0/d42/d26/d38/d4f/f65 1 1 2026-03-10T12:38:08.496 INFO:tasks.workunit.client.0.vm00.stdout:3/808: creat dd/d2a/da2/de1/f10f x:0 0 0 2026-03-10T12:38:08.496 INFO:tasks.workunit.client.1.vm07.stdout:7/577: rename d0/d57/d62/fb0 to d0/d52/fc0 0 2026-03-10T12:38:08.498 INFO:tasks.workunit.client.0.vm00.stdout:5/831: creat d1f/d26/de3/d104/f125 x:0 0 0 2026-03-10T12:38:08.499 INFO:tasks.workunit.client.1.vm07.stdout:5/637: symlink d0/d22/d18/d19/d21/d54/dcb/le1 0 2026-03-10T12:38:08.499 INFO:tasks.workunit.client.1.vm07.stdout:0/653: mkdir d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dd9 0 2026-03-10T12:38:08.500 INFO:tasks.workunit.client.1.vm07.stdout:2/515: mkdir d0/d29/d64/d74/d75/db7 0 2026-03-10T12:38:08.500 INFO:tasks.workunit.client.1.vm07.stdout:8/595: link d1/d3/l99 d1/d3/d6/lc0 0 2026-03-10T12:38:08.500 INFO:tasks.workunit.client.0.vm00.stdout:8/692: dwrite d0/dd/d38/f3d 
[0,4194304] 0 2026-03-10T12:38:08.500 INFO:tasks.workunit.client.1.vm07.stdout:7/578: unlink d0/d67/d6f/fa2 0 2026-03-10T12:38:08.501 INFO:tasks.workunit.client.0.vm00.stdout:9/826: link d0/lef d0/d3d/d59/d4e/dba/d1e/d27/l12c 0 2026-03-10T12:38:08.501 INFO:tasks.workunit.client.1.vm07.stdout:7/579: write d0/d61/d79/fba [981247,59082] 0 2026-03-10T12:38:08.501 INFO:tasks.workunit.client.0.vm00.stdout:8/693: write d0/f10 [1758075,30844] 0 2026-03-10T12:38:08.505 INFO:tasks.workunit.client.1.vm07.stdout:0/654: creat d0/d14/d5f/d76/d2f/d31/d4f/d9d/fda x:0 0 0 2026-03-10T12:38:08.505 INFO:tasks.workunit.client.1.vm07.stdout:7/580: symlink d0/d47/lc1 0 2026-03-10T12:38:08.505 INFO:tasks.workunit.client.0.vm00.stdout:2/805: dwrite d4/f7b [0,4194304] 0 2026-03-10T12:38:08.509 INFO:tasks.workunit.client.0.vm00.stdout:1/807: symlink da/d21/db3/d59/da6/da4/dda/dc0/dfe/d10e/l110 0 2026-03-10T12:38:08.510 INFO:tasks.workunit.client.0.vm00.stdout:5/832: dread - d1f/d26/d2e/ff1 zero size 2026-03-10T12:38:08.513 INFO:tasks.workunit.client.0.vm00.stdout:8/694: creat d0/d46/d7e/fd6 x:0 0 0 2026-03-10T12:38:08.513 INFO:tasks.workunit.client.0.vm00.stdout:8/695: write d0/d93/d17/da2/fc1 [89970,7785] 0 2026-03-10T12:38:08.514 INFO:tasks.workunit.client.0.vm00.stdout:8/696: fdatasync d0/dd/d38/d81/fbd 0 2026-03-10T12:38:08.519 INFO:tasks.workunit.client.0.vm00.stdout:9/827: mkdir d0/d3d/d59/d4e/d104/d12d 0 2026-03-10T12:38:08.530 INFO:tasks.workunit.client.0.vm00.stdout:9/828: truncate d0/d3d/d59/f4a 3325970 0 2026-03-10T12:38:08.530 INFO:tasks.workunit.client.0.vm00.stdout:9/829: fsync d0/f21 0 2026-03-10T12:38:08.530 INFO:tasks.workunit.client.1.vm07.stdout:7/581: chown d0/d52/f98 217 1 2026-03-10T12:38:08.532 INFO:tasks.workunit.client.0.vm00.stdout:2/806: mkdir d4/dd/da7/d105 0 2026-03-10T12:38:08.534 INFO:tasks.workunit.client.0.vm00.stdout:2/807: write d4/d53/d76/d9b/dad/d8e/f103 [638146,30209] 0 2026-03-10T12:38:08.538 INFO:tasks.workunit.client.1.vm07.stdout:0/655: symlink 
d0/d14/d5f/d76/d2f/d31/d4f/d60/ldb 0 2026-03-10T12:38:08.539 INFO:tasks.workunit.client.1.vm07.stdout:7/582: rename d0/f40 to d0/d61/d79/db5/fc2 0 2026-03-10T12:38:08.540 INFO:tasks.workunit.client.1.vm07.stdout:8/596: link d1/d3/d6/d50/d70/f7f d1/d3/d11/d87/fc1 0 2026-03-10T12:38:08.540 INFO:tasks.workunit.client.0.vm00.stdout:2/808: chown d4/dd/f17 834 1 2026-03-10T12:38:08.542 INFO:tasks.workunit.client.0.vm00.stdout:9/830: creat d0/d3d/d59/d4e/dba/f12e x:0 0 0 2026-03-10T12:38:08.543 INFO:tasks.workunit.client.0.vm00.stdout:9/831: write d0/d3d/d59/fe8 [178360,46438] 0 2026-03-10T12:38:08.563 INFO:tasks.workunit.client.1.vm07.stdout:0/656: creat d0/d14/d5f/d76/d2f/d31/d79/fdc x:0 0 0 2026-03-10T12:38:08.563 INFO:tasks.workunit.client.1.vm07.stdout:8/597: read - d1/d3/d6/f81 zero size 2026-03-10T12:38:08.563 INFO:tasks.workunit.client.0.vm00.stdout:2/809: rmdir d4/dd/da7/d105 0 2026-03-10T12:38:08.564 INFO:tasks.workunit.client.0.vm00.stdout:2/810: write d4/d6/d2d/d31/f79 [3690217,79401] 0 2026-03-10T12:38:08.564 INFO:tasks.workunit.client.0.vm00.stdout:9/832: dwrite d0/d5/f10d [0,4194304] 0 2026-03-10T12:38:08.564 INFO:tasks.workunit.client.0.vm00.stdout:9/833: symlink d0/d3d/d43/d53/d126/l12f 0 2026-03-10T12:38:08.564 INFO:tasks.workunit.client.0.vm00.stdout:1/808: dread da/d24/d5a/f75 [0,4194304] 0 2026-03-10T12:38:08.564 INFO:tasks.workunit.client.0.vm00.stdout:2/811: getdents d4 0 2026-03-10T12:38:08.572 INFO:tasks.workunit.client.0.vm00.stdout:1/809: truncate da/d24/d5a/f7c 1369861 0 2026-03-10T12:38:08.576 INFO:tasks.workunit.client.0.vm00.stdout:1/810: creat da/d21/db3/d59/da6/f111 x:0 0 0 2026-03-10T12:38:08.587 INFO:tasks.workunit.client.0.vm00.stdout:1/811: truncate da/d24/d5a/f68 3617348 0 2026-03-10T12:38:08.587 INFO:tasks.workunit.client.0.vm00.stdout:1/812: fdatasync da/d12/da8/fc9 0 2026-03-10T12:38:08.587 INFO:tasks.workunit.client.0.vm00.stdout:1/813: mkdir da/d12/da8/d112 0 2026-03-10T12:38:08.595 INFO:tasks.workunit.client.0.vm00.stdout:1/814: 
rmdir da/d12/d91/dcb 0 2026-03-10T12:38:08.601 INFO:tasks.workunit.client.1.vm07.stdout:3/635: dwrite dc/dd/d43/d5c/fd6 [0,4194304] 0 2026-03-10T12:38:08.603 INFO:tasks.workunit.client.1.vm07.stdout:6/591: dwrite d1/d4/d6/d16/d49/f67 [4194304,4194304] 0 2026-03-10T12:38:08.603 INFO:tasks.workunit.client.1.vm07.stdout:4/725: dwrite d0/d4/d5/f75 [0,4194304] 0 2026-03-10T12:38:08.605 INFO:tasks.workunit.client.0.vm00.stdout:1/815: dwrite da/d24/d28/d67/da2/d78/f86 [0,4194304] 0 2026-03-10T12:38:08.606 INFO:tasks.workunit.client.1.vm07.stdout:4/726: readlink d0/d4/df2/df6/d46/d76/la4 0 2026-03-10T12:38:08.606 INFO:tasks.workunit.client.0.vm00.stdout:1/816: write da/d21/d27/d6a/f9e [3647801,14334] 0 2026-03-10T12:38:08.615 INFO:tasks.workunit.client.0.vm00.stdout:1/817: truncate da/d24/f76 998443 0 2026-03-10T12:38:08.615 INFO:tasks.workunit.client.1.vm07.stdout:6/592: dwrite d1/d4/d6/d16/faf [0,4194304] 0 2026-03-10T12:38:08.615 INFO:tasks.workunit.client.0.vm00.stdout:1/818: chown da/d21/db3/d5d/d80/fcc 1544 1 2026-03-10T12:38:08.618 INFO:tasks.workunit.client.0.vm00.stdout:1/819: dread da/d21/db3/d59/da6/fd3 [0,4194304] 0 2026-03-10T12:38:08.620 INFO:tasks.workunit.client.0.vm00.stdout:1/820: unlink da/d21/d27/d6a/f6b 0 2026-03-10T12:38:08.624 INFO:tasks.workunit.client.0.vm00.stdout:1/821: rename da/d21/d27/d6a/l84 to da/d21/db3/d5d/dab/l113 0 2026-03-10T12:38:08.653 INFO:tasks.workunit.client.0.vm00.stdout:9/834: dread d0/d3d/d43/d53/f66 [0,4194304] 0 2026-03-10T12:38:08.654 INFO:tasks.workunit.client.0.vm00.stdout:9/835: chown d0/la4 45730666 1 2026-03-10T12:38:08.655 INFO:tasks.workunit.client.0.vm00.stdout:9/836: mknod d0/d3d/df2/c130 0 2026-03-10T12:38:08.657 INFO:tasks.workunit.client.0.vm00.stdout:9/837: mknod d0/d3d/d43/d53/c131 0 2026-03-10T12:38:08.692 INFO:tasks.workunit.client.0.vm00.stdout:7/569: truncate f9 8199595 0 2026-03-10T12:38:08.693 INFO:tasks.workunit.client.1.vm07.stdout:9/689: write d5/d13/d22/f9e [862079,20398] 0 2026-03-10T12:38:08.694 
INFO:tasks.workunit.client.1.vm07.stdout:1/597: write d9/f61 [1838138,83199] 0 2026-03-10T12:38:08.695 INFO:tasks.workunit.client.1.vm07.stdout:9/690: rmdir d5/d13 39 2026-03-10T12:38:08.696 INFO:tasks.workunit.client.1.vm07.stdout:1/598: symlink d9/df/d79/lca 0 2026-03-10T12:38:08.699 INFO:tasks.workunit.client.0.vm00.stdout:7/570: mkdir da/d3f/dd1 0 2026-03-10T12:38:08.700 INFO:tasks.workunit.client.1.vm07.stdout:1/599: dwrite d9/d2d/d4f/d75/f83 [4194304,4194304] 0 2026-03-10T12:38:08.705 INFO:tasks.workunit.client.1.vm07.stdout:1/600: rmdir d9/df/d55/d9f 39 2026-03-10T12:38:08.710 INFO:tasks.workunit.client.0.vm00.stdout:7/571: getdents da/d41/d48/d81 0 2026-03-10T12:38:08.711 INFO:tasks.workunit.client.1.vm07.stdout:1/601: creat d9/d2d/fcb x:0 0 0 2026-03-10T12:38:08.713 INFO:tasks.workunit.client.1.vm07.stdout:1/602: creat d9/df/d29/d6b/fcc x:0 0 0 2026-03-10T12:38:08.715 INFO:tasks.workunit.client.1.vm07.stdout:1/603: creat d9/d2d/d4f/d75/d77/da7/fcd x:0 0 0 2026-03-10T12:38:08.716 INFO:tasks.workunit.client.1.vm07.stdout:1/604: creat d9/df/d55/fce x:0 0 0 2026-03-10T12:38:08.717 INFO:tasks.workunit.client.1.vm07.stdout:1/605: symlink d9/d2d/lcf 0 2026-03-10T12:38:08.718 INFO:tasks.workunit.client.1.vm07.stdout:1/606: rmdir d9/d2d/d4f/d75/d77 39 2026-03-10T12:38:08.720 INFO:tasks.workunit.client.1.vm07.stdout:1/607: unlink d9/df/f96 0 2026-03-10T12:38:08.724 INFO:tasks.workunit.client.1.vm07.stdout:1/608: chown d9/df/d29/d2b/d31/f72 56985 1 2026-03-10T12:38:08.732 INFO:tasks.workunit.client.0.vm00.stdout:4/828: write df/d1f/d22/d26/dab/f89 [479537,48629] 0 2026-03-10T12:38:08.736 INFO:tasks.workunit.client.1.vm07.stdout:1/609: creat d9/df/d29/d2b/d30/fd0 x:0 0 0 2026-03-10T12:38:08.739 INFO:tasks.workunit.client.1.vm07.stdout:1/610: dwrite d9/d2d/fcb [0,4194304] 0 2026-03-10T12:38:08.759 INFO:tasks.workunit.client.1.vm07.stdout:0/657: dread d0/d14/d5f/d76/f30 [4194304,4194304] 0 2026-03-10T12:38:08.760 INFO:tasks.workunit.client.1.vm07.stdout:0/658: write 
d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/fbf [1061806,6715] 0 2026-03-10T12:38:08.762 INFO:tasks.workunit.client.1.vm07.stdout:0/659: truncate d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/fbf 1595314 0 2026-03-10T12:38:08.763 INFO:tasks.workunit.client.1.vm07.stdout:0/660: write d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/fbf [2051564,29389] 0 2026-03-10T12:38:08.795 INFO:tasks.workunit.client.0.vm00.stdout:6/517: write d2/d14/f32 [711608,56346] 0 2026-03-10T12:38:08.797 INFO:tasks.workunit.client.0.vm00.stdout:6/518: chown d2/d16/d74/f7d 8037 1 2026-03-10T12:38:08.797 INFO:tasks.workunit.client.0.vm00.stdout:6/519: chown d2/da/dc/l87 114077 1 2026-03-10T12:38:08.799 INFO:tasks.workunit.client.0.vm00.stdout:6/520: mkdir d2/d14/dbb 0 2026-03-10T12:38:08.805 INFO:tasks.workunit.client.1.vm07.stdout:5/638: dwrite d0/d22/d18/d19/d36/d75/fdb [0,4194304] 0 2026-03-10T12:38:08.805 INFO:tasks.workunit.client.0.vm00.stdout:6/521: dread - d2/d42/f71 zero size 2026-03-10T12:38:08.816 INFO:tasks.workunit.client.0.vm00.stdout:6/522: rename d2/da/dc/l87 to d2/d42/d80/lbc 0 2026-03-10T12:38:08.816 INFO:tasks.workunit.client.1.vm07.stdout:5/639: creat d0/d22/d18/d19/d2e/d67/fe2 x:0 0 0 2026-03-10T12:38:08.816 INFO:tasks.workunit.client.0.vm00.stdout:6/523: chown d2/d16/f20 153412 1 2026-03-10T12:38:08.818 INFO:tasks.workunit.client.1.vm07.stdout:2/516: dwrite d0/d29/d64/d74/d88/f51 [0,4194304] 0 2026-03-10T12:38:08.820 INFO:tasks.workunit.client.0.vm00.stdout:6/524: creat d2/d42/d80/fbd x:0 0 0 2026-03-10T12:38:08.820 INFO:tasks.workunit.client.1.vm07.stdout:5/640: chown d0/d22/l45 48064 1 2026-03-10T12:38:08.824 INFO:tasks.workunit.client.1.vm07.stdout:2/517: rename d0/d42/d26/f50 to d0/d29/d64/d6c/d94/fb8 0 2026-03-10T12:38:08.830 INFO:tasks.workunit.client.1.vm07.stdout:2/518: creat d0/d29/d64/d6c/fb9 x:0 0 0 2026-03-10T12:38:08.836 INFO:tasks.workunit.client.1.vm07.stdout:2/519: link d0/d29/d64/d6c/d94/fa7 d0/d42/d26/d38/d4f/d62/fba 0 2026-03-10T12:38:08.837 
INFO:tasks.workunit.client.1.vm07.stdout:2/520: readlink d0/d80/d93/l97 0 2026-03-10T12:38:08.838 INFO:tasks.workunit.client.1.vm07.stdout:5/641: dread d0/d22/d18/d19/d21/d54/f7d [0,4194304] 0 2026-03-10T12:38:08.841 INFO:tasks.workunit.client.1.vm07.stdout:2/521: fdatasync d0/d42/d1f/f84 0 2026-03-10T12:38:08.847 INFO:tasks.workunit.client.0.vm00.stdout:6/525: dread d2/d16/f2a [0,4194304] 0 2026-03-10T12:38:08.860 INFO:tasks.workunit.client.1.vm07.stdout:5/642: unlink d0/cbe 0 2026-03-10T12:38:08.862 INFO:tasks.workunit.client.1.vm07.stdout:2/522: mkdir d0/d29/d64/db5/dbb 0 2026-03-10T12:38:08.877 INFO:tasks.workunit.client.1.vm07.stdout:5/643: rename d0/d22/d18/d19/d21/d54/c6c to d0/d22/d18/d3e/d5d/db6/ce3 0 2026-03-10T12:38:08.877 INFO:tasks.workunit.client.1.vm07.stdout:5/644: creat d0/d22/d18/d3e/d5d/db6/fe4 x:0 0 0 2026-03-10T12:38:08.877 INFO:tasks.workunit.client.1.vm07.stdout:2/523: write d0/f8d [1584587,15443] 0 2026-03-10T12:38:08.877 INFO:tasks.workunit.client.1.vm07.stdout:2/524: creat d0/d80/fbc x:0 0 0 2026-03-10T12:38:08.877 INFO:tasks.workunit.client.1.vm07.stdout:2/525: chown d0/d42/d26/d38/d4f/dad 7 1 2026-03-10T12:38:08.877 INFO:tasks.workunit.client.1.vm07.stdout:5/645: fsync d0/d22/d18/d19/fa8 0 2026-03-10T12:38:08.877 INFO:tasks.workunit.client.1.vm07.stdout:5/646: mkdir d0/d22/d18/d19/de5 0 2026-03-10T12:38:08.877 INFO:tasks.workunit.client.1.vm07.stdout:2/526: read d0/d42/d26/d7d/faa [594677,78542] 0 2026-03-10T12:38:08.879 INFO:tasks.workunit.client.1.vm07.stdout:5/647: creat d0/d22/d18/d3e/d5d/db6/fe6 x:0 0 0 2026-03-10T12:38:08.880 INFO:tasks.workunit.client.1.vm07.stdout:2/527: mknod d0/d42/d1f/d20/cbd 0 2026-03-10T12:38:08.883 INFO:tasks.workunit.client.1.vm07.stdout:2/528: mknod d0/d5b/cbe 0 2026-03-10T12:38:08.889 INFO:tasks.workunit.client.0.vm00.stdout:0/670: dread d3/d33/f4d [0,4194304] 0 2026-03-10T12:38:08.894 INFO:tasks.workunit.client.0.vm00.stdout:0/671: dread d3/d40/d65/fc0 [0,4194304] 0 2026-03-10T12:38:08.896 
INFO:tasks.workunit.client.0.vm00.stdout:6/526: sync 2026-03-10T12:38:08.896 INFO:tasks.workunit.client.0.vm00.stdout:0/672: creat d3/d22/fde x:0 0 0 2026-03-10T12:38:08.898 INFO:tasks.workunit.client.0.vm00.stdout:6/527: rename d2/f30 to d2/d16/d29/d31/d88/fbe 0 2026-03-10T12:38:08.900 INFO:tasks.workunit.client.0.vm00.stdout:0/673: mkdir d3/d22/da5/ddf 0 2026-03-10T12:38:08.902 INFO:tasks.workunit.client.0.vm00.stdout:6/528: rename d2/d42/d80/d98 to d2/da/dbf 0 2026-03-10T12:38:08.902 INFO:tasks.workunit.client.0.vm00.stdout:6/529: stat d2/d42/d80/d9d/caf 0 2026-03-10T12:38:08.907 INFO:tasks.workunit.client.0.vm00.stdout:6/530: dwrite d2/da/dc/d2f/f56 [0,4194304] 0 2026-03-10T12:38:08.910 INFO:tasks.workunit.client.0.vm00.stdout:6/531: mkdir d2/d14/dc0 0 2026-03-10T12:38:08.911 INFO:tasks.workunit.client.0.vm00.stdout:6/532: read d2/d16/d74/f62 [3646462,105842] 0 2026-03-10T12:38:08.914 INFO:tasks.workunit.client.0.vm00.stdout:6/533: mkdir d2/d16/d29/d31/d88/d92/daa/dc1 0 2026-03-10T12:38:08.926 INFO:tasks.workunit.client.1.vm07.stdout:7/583: dwrite d0/d67/d6f/d80/fbc [0,4194304] 0 2026-03-10T12:38:08.929 INFO:tasks.workunit.client.1.vm07.stdout:7/584: creat d0/d67/d6f/d80/fc3 x:0 0 0 2026-03-10T12:38:08.931 INFO:tasks.workunit.client.1.vm07.stdout:3/636: write dc/d18/d24/f37 [2531615,124036] 0 2026-03-10T12:38:08.936 INFO:tasks.workunit.client.1.vm07.stdout:3/637: readlink dc/dd/d1f/d6f/lc4 0 2026-03-10T12:38:08.936 INFO:tasks.workunit.client.1.vm07.stdout:8/598: dwrite d1/d3/ff [0,4194304] 0 2026-03-10T12:38:08.936 INFO:tasks.workunit.client.1.vm07.stdout:7/585: rename d0/d47/f51 to d0/d61/db4/fc4 0 2026-03-10T12:38:08.937 INFO:tasks.workunit.client.1.vm07.stdout:3/638: mknod dc/dd/d43/d76/d95/cd9 0 2026-03-10T12:38:08.941 INFO:tasks.workunit.client.1.vm07.stdout:4/727: dwrite d0/d4/d10/d3c/d2b/d54/de1/f25 [0,4194304] 0 2026-03-10T12:38:08.943 INFO:tasks.workunit.client.1.vm07.stdout:6/593: write d1/d4/f11 [1425306,116803] 0 2026-03-10T12:38:08.944 
INFO:tasks.workunit.client.0.vm00.stdout:3/809: dwrite dd/d64/fa4 [0,4194304] 0 2026-03-10T12:38:08.947 INFO:tasks.workunit.client.0.vm00.stdout:7/572: write da/d26/d37/d56/f6c [20879,10432] 0 2026-03-10T12:38:08.947 INFO:tasks.workunit.client.1.vm07.stdout:9/691: write d5/f8 [7695732,57680] 0 2026-03-10T12:38:08.953 INFO:tasks.workunit.client.0.vm00.stdout:3/810: fdatasync dd/d18/d13/d99/da5/fd4 0 2026-03-10T12:38:08.954 INFO:tasks.workunit.client.1.vm07.stdout:7/586: symlink d0/d67/lc5 0 2026-03-10T12:38:08.955 INFO:tasks.workunit.client.0.vm00.stdout:3/811: creat dd/d3d/d8a/f110 x:0 0 0 2026-03-10T12:38:08.959 INFO:tasks.workunit.client.0.vm00.stdout:3/812: dwrite dd/d18/d13/d1d/f69 [0,4194304] 0 2026-03-10T12:38:08.960 INFO:tasks.workunit.client.1.vm07.stdout:3/639: rename dc/dd/f29 to dc/dd/d1f/dc7/dc9/fda 0 2026-03-10T12:38:08.971 INFO:tasks.workunit.client.0.vm00.stdout:7/573: unlink da/f23 0 2026-03-10T12:38:08.975 INFO:tasks.workunit.client.1.vm07.stdout:1/611: dwrite d9/df/d29/d2b/d31/f3c [0,4194304] 0 2026-03-10T12:38:08.986 INFO:tasks.workunit.client.1.vm07.stdout:0/661: write d0/d14/d5f/d76/d2f/d31/f5a [92245,96568] 0 2026-03-10T12:38:08.986 INFO:tasks.workunit.client.1.vm07.stdout:8/599: mknod d1/d3/d40/d92/db6/cc2 0 2026-03-10T12:38:08.986 INFO:tasks.workunit.client.1.vm07.stdout:8/600: dread - d1/d3/d40/f5a zero size 2026-03-10T12:38:08.989 INFO:tasks.workunit.client.1.vm07.stdout:6/594: creat d1/d4/d6/d46/d4d/fbd x:0 0 0 2026-03-10T12:38:08.990 INFO:tasks.workunit.client.1.vm07.stdout:6/595: write d1/d4/d6/d16/d1a/d2c/f78 [2044410,91940] 0 2026-03-10T12:38:09.005 INFO:tasks.workunit.client.1.vm07.stdout:5/648: write d0/f47 [4772725,3434] 0 2026-03-10T12:38:09.008 INFO:tasks.workunit.client.1.vm07.stdout:4/728: rename d0/d4/d5/da/d66/l9e to d0/d4/d10/d3c/d2b/lfe 0 2026-03-10T12:38:09.013 INFO:tasks.workunit.client.1.vm07.stdout:4/729: chown d0/d4/d10/fc7 1933100 1 2026-03-10T12:38:09.014 INFO:tasks.workunit.client.1.vm07.stdout:3/640: chown 
dc/dd/d1f/l23 28 1 2026-03-10T12:38:09.014 INFO:tasks.workunit.client.1.vm07.stdout:6/596: creat d1/d4/d6/d16/d1a/d6e/fbe x:0 0 0 2026-03-10T12:38:09.016 INFO:tasks.workunit.client.1.vm07.stdout:5/649: chown d0/d22/d18/d19/d2e/d67/dd9 9 1 2026-03-10T12:38:09.017 INFO:tasks.workunit.client.1.vm07.stdout:6/597: dwrite d1/d4/d6/d16/fbc [0,4194304] 0 2026-03-10T12:38:09.019 INFO:tasks.workunit.client.1.vm07.stdout:6/598: dread - d1/d4/d6/d16/d1a/d6e/fbe zero size 2026-03-10T12:38:09.020 INFO:tasks.workunit.client.1.vm07.stdout:9/692: creat d5/d13/d22/fea x:0 0 0 2026-03-10T12:38:09.024 INFO:tasks.workunit.client.1.vm07.stdout:9/693: dwrite d5/d13/d6c/da4/fd0 [0,4194304] 0 2026-03-10T12:38:09.025 INFO:tasks.workunit.client.1.vm07.stdout:4/730: mknod d0/d4/d5/d34/cff 0 2026-03-10T12:38:09.037 INFO:tasks.workunit.client.1.vm07.stdout:0/662: symlink d0/d14/ldd 0 2026-03-10T12:38:09.054 INFO:tasks.workunit.client.1.vm07.stdout:0/663: dwrite d0/d14/d5f/d76/d2f/d31/d79/fdc [0,4194304] 0 2026-03-10T12:38:09.054 INFO:tasks.workunit.client.1.vm07.stdout:0/664: stat d0/d14/d5f/d76/d2f/db2 0 2026-03-10T12:38:09.054 INFO:tasks.workunit.client.1.vm07.stdout:4/731: creat d0/d5c/d7c/f100 x:0 0 0 2026-03-10T12:38:09.054 INFO:tasks.workunit.client.1.vm07.stdout:9/694: creat d5/d13/d6c/da4/feb x:0 0 0 2026-03-10T12:38:09.054 INFO:tasks.workunit.client.1.vm07.stdout:0/665: creat d0/d14/d7c/fde x:0 0 0 2026-03-10T12:38:09.055 INFO:tasks.workunit.client.1.vm07.stdout:7/587: getdents d0/d57/d62/d90 0 2026-03-10T12:38:09.056 INFO:tasks.workunit.client.1.vm07.stdout:4/732: creat d0/d5c/d7c/f101 x:0 0 0 2026-03-10T12:38:09.057 INFO:tasks.workunit.client.1.vm07.stdout:9/695: fsync d5/d13/d2c/f44 0 2026-03-10T12:38:09.058 INFO:tasks.workunit.client.1.vm07.stdout:3/641: link dc/dd/db5/f73 dc/dd/d28/dd0/fdb 0 2026-03-10T12:38:09.059 INFO:tasks.workunit.client.1.vm07.stdout:3/642: chown dc/dd/d1f/d45 279460788 1 2026-03-10T12:38:09.059 INFO:tasks.workunit.client.1.vm07.stdout:0/666: rmdir 
d0/d14/d5f/d76/d2f/d31/d4f/d9d/dd4 39 2026-03-10T12:38:09.062 INFO:tasks.workunit.client.1.vm07.stdout:4/733: symlink d0/d4/df2/df6/d46/d76/l102 0 2026-03-10T12:38:09.095 INFO:tasks.workunit.client.1.vm07.stdout:4/734: chown d0/d4/d5/fd3 6576018 1 2026-03-10T12:38:09.096 INFO:tasks.workunit.client.1.vm07.stdout:4/735: dread d0/d4/d10/d5f/f63 [0,4194304] 0 2026-03-10T12:38:09.096 INFO:tasks.workunit.client.1.vm07.stdout:4/736: readlink d0/d4/d5/da/d95/l9d 0 2026-03-10T12:38:09.096 INFO:tasks.workunit.client.1.vm07.stdout:4/737: chown d0/d4/d5/da/ced 15366497 1 2026-03-10T12:38:09.096 INFO:tasks.workunit.client.1.vm07.stdout:9/696: fsync d5/d16/d23/d26/d68/fa0 0 2026-03-10T12:38:09.096 INFO:tasks.workunit.client.1.vm07.stdout:3/643: write dc/dd/d28/d3b/fa5 [6294579,13950] 0 2026-03-10T12:38:09.096 INFO:tasks.workunit.client.1.vm07.stdout:0/667: creat d0/d14/d5f/d76/d93/fdf x:0 0 0 2026-03-10T12:38:09.096 INFO:tasks.workunit.client.1.vm07.stdout:7/588: truncate d0/f10 539578 0 2026-03-10T12:38:09.096 INFO:tasks.workunit.client.1.vm07.stdout:9/697: creat d5/d13/d9b/fec x:0 0 0 2026-03-10T12:38:09.096 INFO:tasks.workunit.client.1.vm07.stdout:7/589: chown d0/d57/d62/c74 2837779 1 2026-03-10T12:38:09.096 INFO:tasks.workunit.client.1.vm07.stdout:9/698: mknod d5/d13/d22/ced 0 2026-03-10T12:38:09.096 INFO:tasks.workunit.client.1.vm07.stdout:9/699: fsync d5/f65 0 2026-03-10T12:38:09.096 INFO:tasks.workunit.client.1.vm07.stdout:0/668: creat d0/d14/d5f/d41/d6a/fe0 x:0 0 0 2026-03-10T12:38:09.096 INFO:tasks.workunit.client.1.vm07.stdout:9/700: creat d5/d16/d23/fee x:0 0 0 2026-03-10T12:38:09.116 INFO:tasks.workunit.client.1.vm07.stdout:4/738: dread d0/d4/d10/d3c/d2b/f60 [0,4194304] 0 2026-03-10T12:38:09.123 INFO:tasks.workunit.client.0.vm00.stdout:5/833: dwrite d1f/d96/dbd/ffb [0,4194304] 0 2026-03-10T12:38:09.141 INFO:tasks.workunit.client.0.vm00.stdout:5/834: creat d1f/d6a/d118/dcb/f126 x:0 0 0 2026-03-10T12:38:09.141 INFO:tasks.workunit.client.0.vm00.stdout:5/835: creat 
d1f/d6a/d118/f127 x:0 0 0 2026-03-10T12:38:09.141 INFO:tasks.workunit.client.0.vm00.stdout:5/836: unlink d1f/d39/l47 0 2026-03-10T12:38:09.141 INFO:tasks.workunit.client.0.vm00.stdout:5/837: stat d1f/d26/d2e/d58/d6b/f87 0 2026-03-10T12:38:09.141 INFO:tasks.workunit.client.0.vm00.stdout:5/838: fdatasync d1f/d26/f48 0 2026-03-10T12:38:09.141 INFO:tasks.workunit.client.1.vm07.stdout:1/612: sync 2026-03-10T12:38:09.141 INFO:tasks.workunit.client.1.vm07.stdout:0/669: sync 2026-03-10T12:38:09.144 INFO:tasks.workunit.client.0.vm00.stdout:5/839: dread d1f/d26/d2e/d58/d10c/d123/fa7 [0,4194304] 0 2026-03-10T12:38:09.146 INFO:tasks.workunit.client.0.vm00.stdout:5/840: creat d1f/d26/d101/f128 x:0 0 0 2026-03-10T12:38:09.147 INFO:tasks.workunit.client.0.vm00.stdout:5/841: rmdir d1f/d26/d2b/d35/d78 39 2026-03-10T12:38:09.147 INFO:tasks.workunit.client.0.vm00.stdout:5/842: read d1f/d6a/d94/dc9/fae [4203027,59649] 0 2026-03-10T12:38:09.156 INFO:tasks.workunit.client.1.vm07.stdout:0/670: dread d0/d14/d5f/d76/d2f/d31/d4f/d60/f89 [0,4194304] 0 2026-03-10T12:38:09.159 INFO:tasks.workunit.client.1.vm07.stdout:0/671: getdents d0/d14/d5f/d76/d2f/d31/d79/d85 0 2026-03-10T12:38:09.162 INFO:tasks.workunit.client.1.vm07.stdout:0/672: creat d0/d14/d5f/d76/d2f/d31/d4f/fe1 x:0 0 0 2026-03-10T12:38:09.204 INFO:tasks.workunit.client.1.vm07.stdout:9/701: dread d5/d16/d23/d26/f86 [0,4194304] 0 2026-03-10T12:38:09.211 INFO:tasks.workunit.client.1.vm07.stdout:2/529: dwrite d0/d42/d1f/d20/f3f [4194304,4194304] 0 2026-03-10T12:38:09.212 INFO:tasks.workunit.client.1.vm07.stdout:2/530: fdatasync d0/d42/d26/f2e 0 2026-03-10T12:38:09.214 INFO:tasks.workunit.client.1.vm07.stdout:2/531: creat d0/d42/d1f/fbf x:0 0 0 2026-03-10T12:38:09.239 INFO:tasks.workunit.client.0.vm00.stdout:6/534: getdents d2 0 2026-03-10T12:38:09.240 INFO:tasks.workunit.client.0.vm00.stdout:0/674: truncate d3/d7/d3c/f30 1373491 0 2026-03-10T12:38:09.241 INFO:tasks.workunit.client.0.vm00.stdout:0/675: stat d3/d7/d4c/d5b/d38/d44/d5a/cb9 
0 2026-03-10T12:38:09.261 INFO:tasks.workunit.client.0.vm00.stdout:6/535: dread d2/da/dc/fd [0,4194304] 0 2026-03-10T12:38:09.261 INFO:tasks.workunit.client.0.vm00.stdout:6/536: write d2/d14/f32 [114096,112550] 0 2026-03-10T12:38:09.262 INFO:tasks.workunit.client.0.vm00.stdout:6/537: chown d2/da/dc/f25 4 1 2026-03-10T12:38:09.270 INFO:tasks.workunit.client.0.vm00.stdout:2/812: dwrite d4/d6/d2d/d3a/f74 [0,4194304] 0 2026-03-10T12:38:09.271 INFO:tasks.workunit.client.0.vm00.stdout:2/813: truncate d4/d6/dca/f104 62337 0 2026-03-10T12:38:09.277 INFO:tasks.workunit.client.0.vm00.stdout:7/574: dwrite da/d26/d37/f4a [0,4194304] 0 2026-03-10T12:38:09.278 INFO:tasks.workunit.client.0.vm00.stdout:7/575: dread - da/d26/d50/fbf zero size 2026-03-10T12:38:09.279 INFO:tasks.workunit.client.0.vm00.stdout:1/822: truncate da/d24/d28/d67/f52 1467633 0 2026-03-10T12:38:09.285 INFO:tasks.workunit.client.1.vm07.stdout:8/601: dwrite d1/f68 [0,4194304] 0 2026-03-10T12:38:09.287 INFO:tasks.workunit.client.0.vm00.stdout:9/838: rmdir d0/d3d/d43/d53 39 2026-03-10T12:38:09.289 INFO:tasks.workunit.client.0.vm00.stdout:4/829: write df/d63/d77/f8d [5221988,119256] 0 2026-03-10T12:38:09.290 INFO:tasks.workunit.client.0.vm00.stdout:4/830: chown df/d1f/d36/d3a/d41/lc9 76252760 1 2026-03-10T12:38:09.300 INFO:tasks.workunit.client.0.vm00.stdout:2/814: symlink d4/d53/d9e/l106 0 2026-03-10T12:38:09.300 INFO:tasks.workunit.client.0.vm00.stdout:2/815: write d4/d6/d2d/d3a/d43/fa1 [4019252,26150] 0 2026-03-10T12:38:09.300 INFO:tasks.workunit.client.0.vm00.stdout:8/697: truncate d0/d5c/f4a 3980824 0 2026-03-10T12:38:09.300 INFO:tasks.workunit.client.0.vm00.stdout:4/831: dread df/f1e [0,4194304] 0 2026-03-10T12:38:09.303 INFO:tasks.workunit.client.0.vm00.stdout:9/839: dread d0/d3d/d59/d4e/dba/d1e/d27/d115/f87 [0,4194304] 0 2026-03-10T12:38:09.304 INFO:tasks.workunit.client.1.vm07.stdout:6/599: write d1/f3d [1509082,125180] 0 2026-03-10T12:38:09.309 INFO:tasks.workunit.client.0.vm00.stdout:4/832: dwrite f9 
[4194304,4194304] 0 2026-03-10T12:38:09.309 INFO:tasks.workunit.client.0.vm00.stdout:9/840: dwrite d0/d7f/db8/f11b [0,4194304] 0 2026-03-10T12:38:09.314 INFO:tasks.workunit.client.1.vm07.stdout:6/600: rename d1/d4/d6/d46/d4d/c81 to d1/d4/d6/d53/da3/cbf 0 2026-03-10T12:38:09.315 INFO:tasks.workunit.client.0.vm00.stdout:7/576: creat da/d25/d2c/d82/fd2 x:0 0 0 2026-03-10T12:38:09.330 INFO:tasks.workunit.client.0.vm00.stdout:7/577: chown da/d3f/d71/f95 21123115 1 2026-03-10T12:38:09.331 INFO:tasks.workunit.client.1.vm07.stdout:6/601: mknod d1/d4/d6/d16/d1a/d9d/cc0 0 2026-03-10T12:38:09.332 INFO:tasks.workunit.client.1.vm07.stdout:6/602: fdatasync d1/d4/d6/d53/d66/fba 0 2026-03-10T12:38:09.332 INFO:tasks.workunit.client.1.vm07.stdout:5/650: dwrite d0/d22/d18/d19/d21/d54/dcb/f6a [0,4194304] 0 2026-03-10T12:38:09.334 INFO:tasks.workunit.client.0.vm00.stdout:7/578: dread da/d25/d2c/f98 [0,4194304] 0 2026-03-10T12:38:09.337 INFO:tasks.workunit.client.0.vm00.stdout:7/579: read da/d26/d37/d56/f9a [356353,99116] 0 2026-03-10T12:38:09.337 INFO:tasks.workunit.client.1.vm07.stdout:6/603: rename d1/f38 to d1/d4/d6/d46/d4d/fc1 0 2026-03-10T12:38:09.337 INFO:tasks.workunit.client.0.vm00.stdout:2/816: sync 2026-03-10T12:38:09.338 INFO:tasks.workunit.client.0.vm00.stdout:0/676: read d3/d7/d4c/d5b/f37 [44846,67519] 0 2026-03-10T12:38:09.340 INFO:tasks.workunit.client.0.vm00.stdout:9/841: mknod d0/d7f/db8/dc4/db0/c132 0 2026-03-10T12:38:09.348 INFO:tasks.workunit.client.1.vm07.stdout:5/651: getdents d0/d22/d18/d19/d2e/d67/dd9 0 2026-03-10T12:38:09.349 INFO:tasks.workunit.client.0.vm00.stdout:2/817: sync 2026-03-10T12:38:09.349 INFO:tasks.workunit.client.0.vm00.stdout:3/813: write dd/d64/fc2 [102669,53513] 0 2026-03-10T12:38:09.350 INFO:tasks.workunit.client.0.vm00.stdout:2/818: write d4/d6/dca/f104 [368495,40950] 0 2026-03-10T12:38:09.350 INFO:tasks.workunit.client.0.vm00.stdout:3/814: truncate dd/d64/d93/ff7 973076 0 2026-03-10T12:38:09.352 
INFO:tasks.workunit.client.0.vm00.stdout:7/580: unlink da/d26/d37/c99 0 2026-03-10T12:38:09.353 INFO:tasks.workunit.client.1.vm07.stdout:5/652: fdatasync d0/d22/d18/d19/d21/f42 0 2026-03-10T12:38:09.355 INFO:tasks.workunit.client.0.vm00.stdout:2/819: dwrite d4/d53/d68/fb1 [0,4194304] 0 2026-03-10T12:38:09.359 INFO:tasks.workunit.client.1.vm07.stdout:3/644: dwrite dc/d18/d24/f3e [4194304,4194304] 0 2026-03-10T12:38:09.366 INFO:tasks.workunit.client.0.vm00.stdout:3/815: symlink dd/d64/d93/l111 0 2026-03-10T12:38:09.366 INFO:tasks.workunit.client.1.vm07.stdout:3/645: readlink dc/d18/d2d/l78 0 2026-03-10T12:38:09.368 INFO:tasks.workunit.client.0.vm00.stdout:5/843: write d1f/d26/f48 [2482324,119447] 0 2026-03-10T12:38:09.369 INFO:tasks.workunit.client.1.vm07.stdout:7/590: dwrite d0/f70 [0,4194304] 0 2026-03-10T12:38:09.370 INFO:tasks.workunit.client.0.vm00.stdout:2/820: truncate d4/d53/d76/d9b/dad/f80 1684752 0 2026-03-10T12:38:09.371 INFO:tasks.workunit.client.1.vm07.stdout:3/646: truncate dc/dd/d43/f61 118709 0 2026-03-10T12:38:09.372 INFO:tasks.workunit.client.1.vm07.stdout:3/647: write dc/dd/d43/d5c/fd6 [3571366,17749] 0 2026-03-10T12:38:09.373 INFO:tasks.workunit.client.0.vm00.stdout:3/816: mkdir dd/d3d/d8a/de0/d55/d112 0 2026-03-10T12:38:09.374 INFO:tasks.workunit.client.1.vm07.stdout:7/591: mknod d0/d47/dab/cc6 0 2026-03-10T12:38:09.374 INFO:tasks.workunit.client.0.vm00.stdout:3/817: read - dd/d3d/d8a/f102 zero size 2026-03-10T12:38:09.380 INFO:tasks.workunit.client.0.vm00.stdout:0/677: dread d3/d40/f7a [0,4194304] 0 2026-03-10T12:38:09.380 INFO:tasks.workunit.client.0.vm00.stdout:0/678: read - d3/d7/d3c/d4b/f79 zero size 2026-03-10T12:38:09.380 INFO:tasks.workunit.client.0.vm00.stdout:5/844: rename d1f/d26/d2b/f111 to d1f/d26/d2b/d37/dcc/f129 0 2026-03-10T12:38:09.380 INFO:tasks.workunit.client.0.vm00.stdout:5/845: chown d1f/d26/d2e/f8c 401 1 2026-03-10T12:38:09.381 INFO:tasks.workunit.client.0.vm00.stdout:5/846: chown f19 74300 1 2026-03-10T12:38:09.383 
INFO:tasks.workunit.client.1.vm07.stdout:5/653: creat d0/d22/d18/d19/d72/dcc/fe7 x:0 0 0 2026-03-10T12:38:09.383 INFO:tasks.workunit.client.1.vm07.stdout:5/654: fsync d0/d22/d18/d3e/d53/fa3 0 2026-03-10T12:38:09.386 INFO:tasks.workunit.client.0.vm00.stdout:3/818: creat dd/d3d/d8a/f113 x:0 0 0 2026-03-10T12:38:09.386 INFO:tasks.workunit.client.0.vm00.stdout:3/819: readlink dd/d3d/d65/l96 0 2026-03-10T12:38:09.387 INFO:tasks.workunit.client.1.vm07.stdout:4/739: write d0/d4/d10/d3c/f6c [1225489,127072] 0 2026-03-10T12:38:09.388 INFO:tasks.workunit.client.0.vm00.stdout:0/679: truncate d3/d33/f4d 993162 0 2026-03-10T12:38:09.389 INFO:tasks.workunit.client.0.vm00.stdout:2/821: mkdir d4/d53/d76/d9b/d107 0 2026-03-10T12:38:09.390 INFO:tasks.workunit.client.0.vm00.stdout:2/822: fsync d4/d53/d76/f92 0 2026-03-10T12:38:09.390 INFO:tasks.workunit.client.0.vm00.stdout:2/823: write d4/d6/d2d/d31/f79 [215884,96668] 0 2026-03-10T12:38:09.391 INFO:tasks.workunit.client.0.vm00.stdout:5/847: sync 2026-03-10T12:38:09.391 INFO:tasks.workunit.client.0.vm00.stdout:5/848: chown d1f/d26/d2b/d35 168 1 2026-03-10T12:38:09.391 INFO:tasks.workunit.client.0.vm00.stdout:2/824: stat d4/d6/d2d/dc3/de1/ff1 0 2026-03-10T12:38:09.392 INFO:tasks.workunit.client.0.vm00.stdout:5/849: readlink d1f/d6a/d94/dc9/lab 0 2026-03-10T12:38:09.392 INFO:tasks.workunit.client.1.vm07.stdout:1/613: dwrite d9/df/d29/f70 [0,4194304] 0 2026-03-10T12:38:09.395 INFO:tasks.workunit.client.1.vm07.stdout:3/648: rmdir dc/d18 39 2026-03-10T12:38:09.395 INFO:tasks.workunit.client.0.vm00.stdout:5/850: dwrite d1f/d6a/d94/dc9/f114 [0,4194304] 0 2026-03-10T12:38:09.395 INFO:tasks.workunit.client.1.vm07.stdout:1/614: chown d9/df/d29/d2b/d31/fc6 126536 1 2026-03-10T12:38:09.398 INFO:tasks.workunit.client.1.vm07.stdout:7/592: rename d0/d57/d62/f77 to d0/d52/fc7 0 2026-03-10T12:38:09.405 INFO:tasks.workunit.client.1.vm07.stdout:0/673: dwrite d0/d14/d5f/d76/d2f/d31/d4f/f92 [0,4194304] 0 2026-03-10T12:38:09.405 
INFO:tasks.workunit.client.1.vm07.stdout:9/702: write d5/d13/d2c/de6/f56 [2833110,17732] 0 2026-03-10T12:38:09.406 INFO:tasks.workunit.client.0.vm00.stdout:2/825: truncate d4/d6/f89 818906 0 2026-03-10T12:38:09.411 INFO:tasks.workunit.client.0.vm00.stdout:0/680: symlink d3/d40/le0 0 2026-03-10T12:38:09.412 INFO:tasks.workunit.client.1.vm07.stdout:4/740: creat d0/d4/d10/d5f/d6d/f103 x:0 0 0 2026-03-10T12:38:09.427 INFO:tasks.workunit.client.1.vm07.stdout:1/615: mknod d9/df/d29/d2b/d3d/cd1 0 2026-03-10T12:38:09.429 INFO:tasks.workunit.client.0.vm00.stdout:2/826: dread d4/d6/dca/f3f [0,4194304] 0 2026-03-10T12:38:09.432 INFO:tasks.workunit.client.1.vm07.stdout:7/593: sync 2026-03-10T12:38:09.432 INFO:tasks.workunit.client.1.vm07.stdout:7/594: stat d0/d52/c7f 0 2026-03-10T12:38:09.432 INFO:tasks.workunit.client.0.vm00.stdout:0/681: truncate d3/d7/d3c/d74/f78 1179151 0 2026-03-10T12:38:09.435 INFO:tasks.workunit.client.0.vm00.stdout:0/682: truncate d3/d7/d4c/d5b/d38/fbf 1337789 0 2026-03-10T12:38:09.439 INFO:tasks.workunit.client.1.vm07.stdout:5/655: read d0/d22/f93 [267869,79086] 0 2026-03-10T12:38:09.439 INFO:tasks.workunit.client.1.vm07.stdout:0/674: unlink d0/d14/d5f/d76/d2f/d31/d4f/l50 0 2026-03-10T12:38:09.444 INFO:tasks.workunit.client.1.vm07.stdout:1/616: dread - d9/df/d55/f87 zero size 2026-03-10T12:38:09.466 INFO:tasks.workunit.client.0.vm00.stdout:0/683: fdatasync d3/d33/f4d 0 2026-03-10T12:38:09.466 INFO:tasks.workunit.client.0.vm00.stdout:2/827: getdents d4/d53/d76/dba 0 2026-03-10T12:38:09.466 INFO:tasks.workunit.client.1.vm07.stdout:7/595: dread - d0/d52/fa4 zero size 2026-03-10T12:38:09.466 INFO:tasks.workunit.client.1.vm07.stdout:0/675: mkdir d0/d14/d5f/d76/d2f/d31/d4f/da8/de2 0 2026-03-10T12:38:09.466 INFO:tasks.workunit.client.1.vm07.stdout:9/703: creat d5/d1f/d5e/d6b/de0/fef x:0 0 0 2026-03-10T12:38:09.466 INFO:tasks.workunit.client.1.vm07.stdout:7/596: fsync d0/f5f 0 2026-03-10T12:38:09.466 INFO:tasks.workunit.client.1.vm07.stdout:5/656: mkdir 
d0/d22/d18/d19/d21/d54/dcb/de8 0 2026-03-10T12:38:09.466 INFO:tasks.workunit.client.1.vm07.stdout:0/676: rename d0/d14/d5f/d76/d2f/fa9 to d0/d14/d5f/d76/d2f/d31/d79/dcc/fe3 0 2026-03-10T12:38:09.466 INFO:tasks.workunit.client.1.vm07.stdout:7/597: creat d0/d57/d62/fc8 x:0 0 0 2026-03-10T12:38:09.466 INFO:tasks.workunit.client.1.vm07.stdout:5/657: mknod d0/d22/d18/d19/d2e/d67/dd9/ce9 0 2026-03-10T12:38:09.466 INFO:tasks.workunit.client.1.vm07.stdout:5/658: symlink d0/lea 0 2026-03-10T12:38:09.500 INFO:tasks.workunit.client.1.vm07.stdout:0/677: sync 2026-03-10T12:38:09.502 INFO:tasks.workunit.client.1.vm07.stdout:0/678: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/dd0/fe4 x:0 0 0 2026-03-10T12:38:09.506 INFO:tasks.workunit.client.1.vm07.stdout:5/659: sync 2026-03-10T12:38:09.509 INFO:tasks.workunit.client.1.vm07.stdout:5/660: mknod d0/d22/ceb 0 2026-03-10T12:38:09.513 INFO:tasks.workunit.client.1.vm07.stdout:0/679: creat d0/d14/d5f/d76/d2f/d31/d4f/d9d/fe5 x:0 0 0 2026-03-10T12:38:09.519 INFO:tasks.workunit.client.1.vm07.stdout:0/680: symlink d0/d14/d5f/d76/d2f/d31/d79/dd7/le6 0 2026-03-10T12:38:09.519 INFO:tasks.workunit.client.1.vm07.stdout:0/681: rmdir d0 39 2026-03-10T12:38:09.522 INFO:tasks.workunit.client.1.vm07.stdout:2/532: write d0/d29/d64/d74/f8e [2660124,22382] 0 2026-03-10T12:38:09.523 INFO:tasks.workunit.client.1.vm07.stdout:2/533: read - d0/d80/d93/fb6 zero size 2026-03-10T12:38:09.523 INFO:tasks.workunit.client.1.vm07.stdout:2/534: chown d0/d29/d64/d74/d75 906 1 2026-03-10T12:38:09.524 INFO:tasks.workunit.client.1.vm07.stdout:2/535: readlink d0/d42/d26/d38/d4f/d62/l8c 0 2026-03-10T12:38:09.524 INFO:tasks.workunit.client.1.vm07.stdout:2/536: fsync d0/d29/fb3 0 2026-03-10T12:38:09.525 INFO:tasks.workunit.client.1.vm07.stdout:2/537: chown d0/d42/d1f 457203227 1 2026-03-10T12:38:09.525 INFO:tasks.workunit.client.1.vm07.stdout:2/538: fsync d0/d42/d1f/d20/fa0 0 2026-03-10T12:38:09.528 INFO:tasks.workunit.client.1.vm07.stdout:8/602: write d1/d3/d6/f81 
[377542,114119] 0 2026-03-10T12:38:09.529 INFO:tasks.workunit.client.0.vm00.stdout:6/538: write d2/d16/d74/f6e [496067,60685] 0 2026-03-10T12:38:09.534 INFO:tasks.workunit.client.1.vm07.stdout:2/539: write d0/f46 [1810266,60607] 0 2026-03-10T12:38:09.537 INFO:tasks.workunit.client.1.vm07.stdout:2/540: mkdir d0/d42/d1f/dc0 0 2026-03-10T12:38:09.539 INFO:tasks.workunit.client.0.vm00.stdout:6/539: link d2/d16/d74/f4d d2/d42/d80/d89/fc2 0 2026-03-10T12:38:09.543 INFO:tasks.workunit.client.0.vm00.stdout:6/540: symlink d2/d14/d7a/lc3 0 2026-03-10T12:38:09.543 INFO:tasks.workunit.client.1.vm07.stdout:8/603: rmdir d1/d3/d40/dac 0 2026-03-10T12:38:09.543 INFO:tasks.workunit.client.1.vm07.stdout:0/682: getdents d0/d14/d5f/d76/d2f/d31/d79/d9e 0 2026-03-10T12:38:09.546 INFO:tasks.workunit.client.1.vm07.stdout:0/683: dwrite d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/fbf [0,4194304] 0 2026-03-10T12:38:09.551 INFO:tasks.workunit.client.0.vm00.stdout:6/541: fsync d2/d16/d29/f54 0 2026-03-10T12:38:09.551 INFO:tasks.workunit.client.1.vm07.stdout:8/604: creat d1/d3/d40/d92/dba/fc3 x:0 0 0 2026-03-10T12:38:09.554 INFO:tasks.workunit.client.1.vm07.stdout:0/684: fdatasync d0/d14/d5f/f54 0 2026-03-10T12:38:09.555 INFO:tasks.workunit.client.1.vm07.stdout:0/685: dread - d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/dd0/fe4 zero size 2026-03-10T12:38:09.556 INFO:tasks.workunit.client.1.vm07.stdout:0/686: chown d0/d14/cbb 1622 1 2026-03-10T12:38:09.558 INFO:tasks.workunit.client.1.vm07.stdout:8/605: creat d1/d3/d40/d92/dba/fc4 x:0 0 0 2026-03-10T12:38:09.558 INFO:tasks.workunit.client.1.vm07.stdout:8/606: chown d1/d3/d5d/f5f 14870 1 2026-03-10T12:38:09.559 INFO:tasks.workunit.client.1.vm07.stdout:0/687: mknod d0/d14/d5f/d76/d2f/d31/d79/d9e/ce7 0 2026-03-10T12:38:09.563 INFO:tasks.workunit.client.1.vm07.stdout:0/688: creat d0/d14/d5f/d41/fe8 x:0 0 0 2026-03-10T12:38:09.563 INFO:tasks.workunit.client.0.vm00.stdout:6/542: unlink d2/d51/f5c 0 2026-03-10T12:38:09.563 
INFO:tasks.workunit.client.1.vm07.stdout:0/689: dread - d0/d14/d5f/d41/f77 zero size 2026-03-10T12:38:09.564 INFO:tasks.workunit.client.0.vm00.stdout:6/543: chown d2/d42/d80/d89/fb8 825551 1 2026-03-10T12:38:09.565 INFO:tasks.workunit.client.1.vm07.stdout:0/690: symlink d0/d14/le9 0 2026-03-10T12:38:09.565 INFO:tasks.workunit.client.1.vm07.stdout:0/691: readlink d0/d14/le9 0 2026-03-10T12:38:09.566 INFO:tasks.workunit.client.0.vm00.stdout:6/544: mknod d2/d16/d29/d31/d88/d92/daa/cc4 0 2026-03-10T12:38:09.569 INFO:tasks.workunit.client.1.vm07.stdout:0/692: rmdir d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c 39 2026-03-10T12:38:09.569 INFO:tasks.workunit.client.1.vm07.stdout:0/693: dread - d0/d14/d5f/fb3 zero size 2026-03-10T12:38:09.571 INFO:tasks.workunit.client.1.vm07.stdout:0/694: dwrite d0/d14/d5f/d76/f3d [0,4194304] 0 2026-03-10T12:38:09.582 INFO:tasks.workunit.client.0.vm00.stdout:6/545: creat d2/d9f/fc5 x:0 0 0 2026-03-10T12:38:09.582 INFO:tasks.workunit.client.0.vm00.stdout:6/546: readlink d2/d14/l95 0 2026-03-10T12:38:09.584 INFO:tasks.workunit.client.1.vm07.stdout:0/695: mknod d0/d14/d5f/d41/d6a/cea 0 2026-03-10T12:38:09.585 INFO:tasks.workunit.client.1.vm07.stdout:0/696: stat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87 0 2026-03-10T12:38:09.592 INFO:tasks.workunit.client.1.vm07.stdout:0/697: dread - d0/d14/d5f/d76/d2f/d31/d4f/fa7 zero size 2026-03-10T12:38:09.613 INFO:tasks.workunit.client.0.vm00.stdout:1/823: write da/d12/d26/f57 [1718307,67542] 0 2026-03-10T12:38:09.615 INFO:tasks.workunit.client.0.vm00.stdout:1/824: rename da/d24/d5a/dd9/le2 to da/l114 0 2026-03-10T12:38:09.616 INFO:tasks.workunit.client.0.vm00.stdout:1/825: write da/d24/d28/d67/f52 [1536900,11701] 0 2026-03-10T12:38:09.617 INFO:tasks.workunit.client.0.vm00.stdout:1/826: chown da/d12/d91/f108 188 1 2026-03-10T12:38:09.619 INFO:tasks.workunit.client.0.vm00.stdout:8/698: dwrite d0/d46/d6e/f70 [0,4194304] 0 2026-03-10T12:38:09.621 INFO:tasks.workunit.client.0.vm00.stdout:1/827: symlink da/d21/d27/d6a/l115 
0 2026-03-10T12:38:09.626 INFO:tasks.workunit.client.0.vm00.stdout:4/833: dwrite df/d1f/d22/f52 [0,4194304] 0 2026-03-10T12:38:09.631 INFO:tasks.workunit.client.0.vm00.stdout:8/699: sync 2026-03-10T12:38:09.636 INFO:tasks.workunit.client.0.vm00.stdout:1/828: link da/d12/d91/fb8 da/d24/d28/f116 0 2026-03-10T12:38:09.638 INFO:tasks.workunit.client.0.vm00.stdout:4/834: mkdir df/d1f/d36/d3a/d41/df7/d112 0 2026-03-10T12:38:09.641 INFO:tasks.workunit.client.0.vm00.stdout:7/581: write da/d1b/d40/f7d [4456659,64944] 0 2026-03-10T12:38:09.641 INFO:tasks.workunit.client.0.vm00.stdout:1/829: dread - da/d12/d26/fec zero size 2026-03-10T12:38:09.644 INFO:tasks.workunit.client.0.vm00.stdout:7/582: mknod da/d25/d2c/d82/d68/cd3 0 2026-03-10T12:38:09.645 INFO:tasks.workunit.client.1.vm07.stdout:6/604: symlink d1/d4/d44/d98/lc2 0 2026-03-10T12:38:09.648 INFO:tasks.workunit.client.0.vm00.stdout:4/835: rename fa to df/d1f/d22/d26/d65/da7/d10e/f113 0 2026-03-10T12:38:09.650 INFO:tasks.workunit.client.1.vm07.stdout:7/598: rmdir d0/d52 39 2026-03-10T12:38:09.652 INFO:tasks.workunit.client.0.vm00.stdout:1/830: symlink da/d21/d27/d6a/d94/l117 0 2026-03-10T12:38:09.653 INFO:tasks.workunit.client.0.vm00.stdout:3/820: write dd/f25 [1884102,63061] 0 2026-03-10T12:38:09.654 INFO:tasks.workunit.client.1.vm07.stdout:5/661: read d0/d22/d18/d19/d2e/d67/fc8 [1430152,660] 0 2026-03-10T12:38:09.657 INFO:tasks.workunit.client.0.vm00.stdout:4/836: read df/d1f/d36/f51 [499092,79263] 0 2026-03-10T12:38:09.660 INFO:tasks.workunit.client.1.vm07.stdout:3/649: write dc/d18/d2d/f80 [964601,58124] 0 2026-03-10T12:38:09.661 INFO:tasks.workunit.client.1.vm07.stdout:3/650: chown dc/dd/d43/d76/d95/db8 1 1 2026-03-10T12:38:09.663 INFO:tasks.workunit.client.0.vm00.stdout:3/821: unlink dd/d18/d13/c1c 0 2026-03-10T12:38:09.665 INFO:tasks.workunit.client.1.vm07.stdout:1/617: dwrite d9/df/d29/d2b/f32 [8388608,4194304] 0 2026-03-10T12:38:09.667 INFO:tasks.workunit.client.1.vm07.stdout:1/618: chown d9/df/d29/d2b/d30/fd0 
231935 1 2026-03-10T12:38:09.671 INFO:tasks.workunit.client.0.vm00.stdout:5/851: dwrite d1f/d26/d2b/d37/da4/fde [0,4194304] 0 2026-03-10T12:38:09.673 INFO:tasks.workunit.client.0.vm00.stdout:4/837: creat df/d1f/d22/d26/d65/f114 x:0 0 0 2026-03-10T12:38:09.673 INFO:tasks.workunit.client.0.vm00.stdout:3/822: truncate dd/d3d/d73/f8f 471792 0 2026-03-10T12:38:09.674 INFO:tasks.workunit.client.0.vm00.stdout:4/838: readlink df/d1f/d22/d26/dab/d73/lae 0 2026-03-10T12:38:09.677 INFO:tasks.workunit.client.0.vm00.stdout:3/823: mkdir dd/d3d/d65/d114 0 2026-03-10T12:38:09.678 INFO:tasks.workunit.client.1.vm07.stdout:5/662: mkdir d0/d22/d18/d19/d21/d54/dcb/db8/dec 0 2026-03-10T12:38:09.680 INFO:tasks.workunit.client.0.vm00.stdout:0/684: dwrite d3/db/d24/d25/f7d [0,4194304] 0 2026-03-10T12:38:09.681 INFO:tasks.workunit.client.1.vm07.stdout:9/704: dwrite d5/f1c [4194304,4194304] 0 2026-03-10T12:38:09.693 INFO:tasks.workunit.client.1.vm07.stdout:3/651: mknod dc/dd/d1f/dc7/cdc 0 2026-03-10T12:38:09.694 INFO:tasks.workunit.client.1.vm07.stdout:3/652: truncate dc/d18/d2d/f71 1166704 0 2026-03-10T12:38:09.697 INFO:tasks.workunit.client.0.vm00.stdout:4/839: fdatasync df/d1f/d22/d26/d65/d91/d101/f7c 0 2026-03-10T12:38:09.699 INFO:tasks.workunit.client.0.vm00.stdout:0/685: symlink d3/d7/d3c/d74/le1 0 2026-03-10T12:38:09.699 INFO:tasks.workunit.client.1.vm07.stdout:1/619: symlink d9/d2d/d4f/ld2 0 2026-03-10T12:38:09.699 INFO:tasks.workunit.client.0.vm00.stdout:0/686: chown d3/d7/d3c/laf 338321420 1 2026-03-10T12:38:09.704 INFO:tasks.workunit.client.0.vm00.stdout:7/583: rename da/d41/d48/d81/fab to da/d41/d48/fd4 0 2026-03-10T12:38:09.714 INFO:tasks.workunit.client.0.vm00.stdout:3/824: mkdir dd/d3d/d115 0 2026-03-10T12:38:09.714 INFO:tasks.workunit.client.1.vm07.stdout:7/599: rmdir d0/d52 39 2026-03-10T12:38:09.714 INFO:tasks.workunit.client.1.vm07.stdout:2/541: dwrite d0/d42/d26/d7d/faa [0,4194304] 0 2026-03-10T12:38:09.714 INFO:tasks.workunit.client.1.vm07.stdout:8/607: write 
d1/d3/d40/f4c [793804,102403] 0 2026-03-10T12:38:09.714 INFO:tasks.workunit.client.0.vm00.stdout:5/852: rename d1f/d6a/d118/f127 to d1f/d26/d2b/d35/d78/d99/dcd/d122/f12a 0 2026-03-10T12:38:09.715 INFO:tasks.workunit.client.0.vm00.stdout:5/853: write d1f/d26/d2b/fe6 [234883,54456] 0 2026-03-10T12:38:09.722 INFO:tasks.workunit.client.0.vm00.stdout:3/825: readlink dd/d18/lba 0 2026-03-10T12:38:09.724 INFO:tasks.workunit.client.0.vm00.stdout:5/854: truncate d1f/d96/dbd/fc5 351151 0 2026-03-10T12:38:09.726 INFO:tasks.workunit.client.0.vm00.stdout:3/826: truncate dd/d18/d13/d1d/dc6/ffe 340118 0 2026-03-10T12:38:09.729 INFO:tasks.workunit.client.0.vm00.stdout:7/584: dread da/f15 [0,4194304] 0 2026-03-10T12:38:09.729 INFO:tasks.workunit.client.1.vm07.stdout:4/741: fdatasync d0/d4/d10/d3c/f6c 0 2026-03-10T12:38:09.729 INFO:tasks.workunit.client.1.vm07.stdout:2/542: dread d0/d29/d64/f78 [0,4194304] 0 2026-03-10T12:38:09.730 INFO:tasks.workunit.client.0.vm00.stdout:3/827: fdatasync dd/d18/d13/d99/da5/dd0/fed 0 2026-03-10T12:38:09.732 INFO:tasks.workunit.client.1.vm07.stdout:2/543: dread d0/d29/f32 [0,4194304] 0 2026-03-10T12:38:09.732 INFO:tasks.workunit.client.1.vm07.stdout:7/600: symlink d0/d52/lc9 0 2026-03-10T12:38:09.734 INFO:tasks.workunit.client.1.vm07.stdout:3/653: rename dc/dd/d43/fce to dc/d18/fdd 0 2026-03-10T12:38:09.736 INFO:tasks.workunit.client.0.vm00.stdout:2/828: write d4/d53/f7d [863992,12884] 0 2026-03-10T12:38:09.737 INFO:tasks.workunit.client.0.vm00.stdout:2/829: read - d4/dd/da7/ffc zero size 2026-03-10T12:38:09.737 INFO:tasks.workunit.client.1.vm07.stdout:4/742: rmdir d0/d4/d10/d3c 39 2026-03-10T12:38:09.739 INFO:tasks.workunit.client.1.vm07.stdout:7/601: symlink d0/d61/d79/lca 0 2026-03-10T12:38:09.740 INFO:tasks.workunit.client.0.vm00.stdout:7/585: dread da/d1b/f22 [0,4194304] 0 2026-03-10T12:38:09.745 INFO:tasks.workunit.client.1.vm07.stdout:4/743: mknod d0/d4/df2/df6/c104 0 2026-03-10T12:38:09.747 INFO:tasks.workunit.client.1.vm07.stdout:4/744: 
dread d0/d4/d5/f75 [0,4194304] 0 2026-03-10T12:38:09.753 INFO:tasks.workunit.client.1.vm07.stdout:8/608: link d1/d3/d11/f77 d1/d3/d6c/fc5 0 2026-03-10T12:38:09.753 INFO:tasks.workunit.client.0.vm00.stdout:2/830: dread d4/dd/f3e [0,4194304] 0 2026-03-10T12:38:09.754 INFO:tasks.workunit.client.0.vm00.stdout:2/831: readlink d4/d53/d9e/l106 0 2026-03-10T12:38:09.754 INFO:tasks.workunit.client.0.vm00.stdout:2/832: readlink d4/d78/la4 0 2026-03-10T12:38:09.755 INFO:tasks.workunit.client.0.vm00.stdout:2/833: creat d4/d78/df9/f108 x:0 0 0 2026-03-10T12:38:09.757 INFO:tasks.workunit.client.0.vm00.stdout:2/834: mknod d4/dd/db9/d6d/c109 0 2026-03-10T12:38:09.757 INFO:tasks.workunit.client.0.vm00.stdout:2/835: chown d4/d6/d2d/d3a/dd3 29309 1 2026-03-10T12:38:09.782 INFO:tasks.workunit.client.0.vm00.stdout:3/828: sync 2026-03-10T12:38:09.782 INFO:tasks.workunit.client.0.vm00.stdout:7/586: sync 2026-03-10T12:38:09.782 INFO:tasks.workunit.client.1.vm07.stdout:4/745: sync 2026-03-10T12:38:09.783 INFO:tasks.workunit.client.0.vm00.stdout:3/829: mknod dd/d2a/c116 0 2026-03-10T12:38:09.784 INFO:tasks.workunit.client.0.vm00.stdout:2/836: dread d4/d53/d76/fac [0,4194304] 0 2026-03-10T12:38:09.785 INFO:tasks.workunit.client.1.vm07.stdout:4/746: fdatasync d0/d4/d10/d3c/d2b/d54/fd6 0 2026-03-10T12:38:09.787 INFO:tasks.workunit.client.0.vm00.stdout:2/837: dread d4/d6/d2d/d31/f46 [0,4194304] 0 2026-03-10T12:38:09.790 INFO:tasks.workunit.client.1.vm07.stdout:4/747: symlink d0/d4/d5/d78/dc5/l105 0 2026-03-10T12:38:09.792 INFO:tasks.workunit.client.1.vm07.stdout:4/748: mknod d0/d4/d10/c106 0 2026-03-10T12:38:09.797 INFO:tasks.workunit.client.0.vm00.stdout:8/700: dwrite d0/d93/d43/fd2 [0,4194304] 0 2026-03-10T12:38:09.800 INFO:tasks.workunit.client.0.vm00.stdout:3/830: dread dd/d27/f56 [0,4194304] 0 2026-03-10T12:38:09.806 INFO:tasks.workunit.client.0.vm00.stdout:2/838: fdatasync d4/dd/f10 0 2026-03-10T12:38:09.807 INFO:tasks.workunit.client.0.vm00.stdout:2/839: chown d4/d53/f61 27116 1 
2026-03-10T12:38:09.812 INFO:tasks.workunit.client.0.vm00.stdout:8/701: readlink d0/d93/l2c 0 2026-03-10T12:38:09.821 INFO:tasks.workunit.client.0.vm00.stdout:1/831: write da/d24/d73/fb6 [497980,31445] 0 2026-03-10T12:38:09.822 INFO:tasks.workunit.client.0.vm00.stdout:1/832: readlink da/d12/db4/lf6 0 2026-03-10T12:38:09.824 INFO:tasks.workunit.client.1.vm07.stdout:0/698: dwrite d0/d14/d5f/d3b/dbc/fbe [0,4194304] 0 2026-03-10T12:38:09.825 INFO:tasks.workunit.client.0.vm00.stdout:1/833: dread da/d21/db3/f7a [0,4194304] 0 2026-03-10T12:38:09.825 INFO:tasks.workunit.client.0.vm00.stdout:2/840: chown d4/dd/d63/f83 14560 1 2026-03-10T12:38:09.826 INFO:tasks.workunit.client.0.vm00.stdout:1/834: readlink da/d21/db3/d59/da6/l8f 0 2026-03-10T12:38:09.827 INFO:tasks.workunit.client.0.vm00.stdout:2/841: stat d4/d6/d2d/d31/f79 0 2026-03-10T12:38:09.827 INFO:tasks.workunit.client.1.vm07.stdout:0/699: mknod d0/d14/d5f/d76/d2f/ceb 0 2026-03-10T12:38:09.827 INFO:tasks.workunit.client.0.vm00.stdout:9/842: dread d0/d3d/d59/d4e/dba/d19/f7d [0,4194304] 0 2026-03-10T12:38:09.829 INFO:tasks.workunit.client.0.vm00.stdout:9/843: truncate d0/d3d/d59/d4e/dba/d19/d50/fe0 762718 0 2026-03-10T12:38:09.830 INFO:tasks.workunit.client.1.vm07.stdout:0/700: mknod d0/d14/d5f/d76/d2f/d31/d79/d85/cec 0 2026-03-10T12:38:09.831 INFO:tasks.workunit.client.0.vm00.stdout:8/702: symlink d0/d93/d17/db1/ld7 0 2026-03-10T12:38:09.840 INFO:tasks.workunit.client.1.vm07.stdout:0/701: symlink d0/d14/d5f/d76/d2f/d31/d4f/led 0 2026-03-10T12:38:09.844 INFO:tasks.workunit.client.0.vm00.stdout:4/840: dwrite fb [4194304,4194304] 0 2026-03-10T12:38:09.846 INFO:tasks.workunit.client.1.vm07.stdout:6/605: dwrite d1/d4/d6/f60 [0,4194304] 0 2026-03-10T12:38:09.849 INFO:tasks.workunit.client.0.vm00.stdout:2/842: mkdir d4/d53/d9e/d10a 0 2026-03-10T12:38:09.864 INFO:tasks.workunit.client.0.vm00.stdout:0/687: write d3/db/f97 [5063006,101381] 0 2026-03-10T12:38:09.865 INFO:tasks.workunit.client.0.vm00.stdout:9/844: truncate 
d0/d3d/d59/d4e/dba/d1e/d2b/f36 5340193 0 2026-03-10T12:38:09.868 INFO:tasks.workunit.client.1.vm07.stdout:0/702: mknod d0/d14/d5f/d76/d2f/d31/d79/d85/cee 0 2026-03-10T12:38:09.870 INFO:tasks.workunit.client.0.vm00.stdout:5/855: dwrite d1f/d26/d2b/f44 [0,4194304] 0 2026-03-10T12:38:09.875 INFO:tasks.workunit.client.1.vm07.stdout:9/705: write d5/d69/d93/d97/fa2 [532271,8840] 0 2026-03-10T12:38:09.877 INFO:tasks.workunit.client.0.vm00.stdout:1/835: mkdir da/d21/d27/d118 0 2026-03-10T12:38:09.879 INFO:tasks.workunit.client.1.vm07.stdout:1/620: dwrite d9/df/d29/d2b/d31/f72 [0,4194304] 0 2026-03-10T12:38:09.880 INFO:tasks.workunit.client.1.vm07.stdout:5/663: dwrite d0/d22/d18/d19/d36/fc1 [0,4194304] 0 2026-03-10T12:38:09.887 INFO:tasks.workunit.client.1.vm07.stdout:2/544: dwrite d0/d29/d64/f67 [0,4194304] 0 2026-03-10T12:38:09.891 INFO:tasks.workunit.client.0.vm00.stdout:9/845: creat d0/d3d/d59/d4e/dba/d1e/d85/de5/f133 x:0 0 0 2026-03-10T12:38:09.892 INFO:tasks.workunit.client.1.vm07.stdout:6/606: mkdir d1/d4/d6/d43/d88/dc3 0 2026-03-10T12:38:09.892 INFO:tasks.workunit.client.1.vm07.stdout:9/706: mkdir d5/d13/d22/df0 0 2026-03-10T12:38:09.896 INFO:tasks.workunit.client.0.vm00.stdout:8/703: truncate d0/d5c/f42 140518 0 2026-03-10T12:38:09.897 INFO:tasks.workunit.client.1.vm07.stdout:9/707: dwrite d5/d16/d23/fee [0,4194304] 0 2026-03-10T12:38:09.898 INFO:tasks.workunit.client.0.vm00.stdout:0/688: getdents d3/d7/d4c/d5b/d38/d44/d5a 0 2026-03-10T12:38:09.899 INFO:tasks.workunit.client.1.vm07.stdout:9/708: fsync d5/d69/d93/d97/fd9 0 2026-03-10T12:38:09.900 INFO:tasks.workunit.client.0.vm00.stdout:8/704: dwrite d0/d93/d17/da2/fc1 [0,4194304] 0 2026-03-10T12:38:09.902 INFO:tasks.workunit.client.1.vm07.stdout:1/621: fdatasync d9/df/dc2/fa6 0 2026-03-10T12:38:09.902 INFO:tasks.workunit.client.0.vm00.stdout:1/836: write da/d21/db3/d5d/d80/fcc [23093,123564] 0 2026-03-10T12:38:09.903 INFO:tasks.workunit.client.1.vm07.stdout:5/664: dread - d0/d22/d18/d19/d21/fa1 zero size 
2026-03-10T12:38:09.904 INFO:tasks.workunit.client.1.vm07.stdout:5/665: truncate d0/d22/d18/d19/d21/fd4 407345 0 2026-03-10T12:38:09.907 INFO:tasks.workunit.client.1.vm07.stdout:2/545: fdatasync d0/d42/d1f/d90/fb2 0 2026-03-10T12:38:09.907 INFO:tasks.workunit.client.1.vm07.stdout:2/546: chown d0/d42/d26/f2e 644418 1 2026-03-10T12:38:09.907 INFO:tasks.workunit.client.0.vm00.stdout:9/846: rename d0/d7f/l112 to d0/d3d/d59/d74/l134 0 2026-03-10T12:38:09.914 INFO:tasks.workunit.client.1.vm07.stdout:6/607: fsync d1/d4/d6/f80 0 2026-03-10T12:38:09.918 INFO:tasks.workunit.client.1.vm07.stdout:6/608: dwrite d1/d4/d6/d53/d66/fba [0,4194304] 0 2026-03-10T12:38:09.918 INFO:tasks.workunit.client.0.vm00.stdout:3/831: dwrite dd/d18/d13/d99/da5/fd4 [0,4194304] 0 2026-03-10T12:38:09.918 INFO:tasks.workunit.client.0.vm00.stdout:5/856: symlink d1f/d26/d2b/d35/l12b 0 2026-03-10T12:38:09.920 INFO:tasks.workunit.client.0.vm00.stdout:5/857: dwrite d1f/d26/d2e/d58/ff6 [0,4194304] 0 2026-03-10T12:38:09.920 INFO:tasks.workunit.client.1.vm07.stdout:6/609: write d1/d4/d6/d46/d4d/fb [294821,71170] 0 2026-03-10T12:38:09.921 INFO:tasks.workunit.client.0.vm00.stdout:5/858: write d1f/d26/d101/f128 [636211,26966] 0 2026-03-10T12:38:09.927 INFO:tasks.workunit.client.0.vm00.stdout:8/705: chown d0/d5c/f42 1 1 2026-03-10T12:38:09.927 INFO:tasks.workunit.client.1.vm07.stdout:3/654: write dc/dd/d43/d5c/fa9 [269760,101932] 0 2026-03-10T12:38:09.927 INFO:tasks.workunit.client.0.vm00.stdout:7/587: write da/f16 [2276339,21573] 0 2026-03-10T12:38:09.928 INFO:tasks.workunit.client.0.vm00.stdout:8/706: chown d0/d46/d6e/d9b/la9 55165 1 2026-03-10T12:38:09.928 INFO:tasks.workunit.client.1.vm07.stdout:3/655: stat dc/d18/d24/l62 0 2026-03-10T12:38:09.928 INFO:tasks.workunit.client.0.vm00.stdout:7/588: truncate da/d1b/d40/fca 543091 0 2026-03-10T12:38:09.930 INFO:tasks.workunit.client.1.vm07.stdout:7/602: dwrite d0/d47/f8e [0,4194304] 0 2026-03-10T12:38:09.934 INFO:tasks.workunit.client.1.vm07.stdout:8/609: dwrite 
d1/d3/d6/d50/faa [0,4194304] 0 2026-03-10T12:38:09.939 INFO:tasks.workunit.client.1.vm07.stdout:4/749: dwrite d0/d4/d5/d78/dc5/df7/f97 [0,4194304] 0 2026-03-10T12:38:09.946 INFO:tasks.workunit.client.0.vm00.stdout:5/859: creat d1f/d26/d2e/d58/d6b/deb/f12c x:0 0 0 2026-03-10T12:38:09.954 INFO:tasks.workunit.client.0.vm00.stdout:8/707: dwrite d0/d93/d36/d5b/f65 [0,4194304] 0 2026-03-10T12:38:09.961 INFO:tasks.workunit.client.0.vm00.stdout:0/689: rename d3/d7/d58 to d3/d7/d4c/d5b/d38/db3/de2 0 2026-03-10T12:38:09.964 INFO:tasks.workunit.client.0.vm00.stdout:0/690: dwrite d3/d33/f4d [0,4194304] 0 2026-03-10T12:38:09.977 INFO:tasks.workunit.client.1.vm07.stdout:3/656: mkdir dc/dd/d43/d76/d95/dde 0 2026-03-10T12:38:09.980 INFO:tasks.workunit.client.0.vm00.stdout:0/691: creat d3/d7/d4c/d5b/d38/db3/fe3 x:0 0 0 2026-03-10T12:38:09.980 INFO:tasks.workunit.client.0.vm00.stdout:5/860: truncate d1f/d26/d2b/d35/d78/d99/daf/ff5 1573157 0 2026-03-10T12:38:09.980 INFO:tasks.workunit.client.1.vm07.stdout:8/610: mknod d1/d3/db2/cc6 0 2026-03-10T12:38:09.981 INFO:tasks.workunit.client.1.vm07.stdout:4/750: read d0/d4/df2/df6/fcd [1715634,38611] 0 2026-03-10T12:38:09.981 INFO:tasks.workunit.client.1.vm07.stdout:4/751: fdatasync d0/d5c/d7c/f101 0 2026-03-10T12:38:09.985 INFO:tasks.workunit.client.1.vm07.stdout:1/622: symlink d9/df/ld3 0 2026-03-10T12:38:09.988 INFO:tasks.workunit.client.0.vm00.stdout:6/547: getdents d2/d51/d70 0 2026-03-10T12:38:09.988 INFO:tasks.workunit.client.1.vm07.stdout:5/666: mkdir d0/d22/d18/d19/d21/dc2/ded 0 2026-03-10T12:38:09.996 INFO:tasks.workunit.client.1.vm07.stdout:2/547: symlink d0/d42/lc1 0 2026-03-10T12:38:09.997 INFO:tasks.workunit.client.0.vm00.stdout:0/692: creat d3/d7/d3c/fe4 x:0 0 0 2026-03-10T12:38:09.997 INFO:tasks.workunit.client.1.vm07.stdout:2/548: chown d0/d29/d64/d6c/d94/cb4 116106912 1 2026-03-10T12:38:09.997 INFO:tasks.workunit.client.1.vm07.stdout:2/549: readlink d0/d29/d64/d74/d88/lb0 0 2026-03-10T12:38:09.999 
INFO:tasks.workunit.client.0.vm00.stdout:6/548: mknod d2/d16/d29/d31/d88/cc6 0 2026-03-10T12:38:09.999 INFO:tasks.workunit.client.1.vm07.stdout:7/603: symlink d0/lcb 0 2026-03-10T12:38:10.000 INFO:tasks.workunit.client.0.vm00.stdout:0/693: truncate d3/d40/d65/fc0 1708714 0 2026-03-10T12:38:10.007 INFO:tasks.workunit.client.0.vm00.stdout:0/694: fsync d3/d7/d3c/d4b/f79 0 2026-03-10T12:38:10.009 INFO:tasks.workunit.client.0.vm00.stdout:0/695: write d3/d7/d4c/d5b/d38/db3/de2/fd4 [171375,44717] 0 2026-03-10T12:38:10.009 INFO:tasks.workunit.client.0.vm00.stdout:0/696: stat d3/d22/f46 0 2026-03-10T12:38:10.011 INFO:tasks.workunit.client.1.vm07.stdout:5/667: creat d0/d22/d18/d3e/d53/fee x:0 0 0 2026-03-10T12:38:10.015 INFO:tasks.workunit.client.1.vm07.stdout:7/604: creat d0/d57/d62/d90/fcc x:0 0 0 2026-03-10T12:38:10.026 INFO:tasks.workunit.client.1.vm07.stdout:5/668: truncate d0/d22/d18/d19/d2e/f62 894821 0 2026-03-10T12:38:10.027 INFO:tasks.workunit.client.1.vm07.stdout:4/752: dread d0/d4/d10/d3c/d2b/d54/de1/f91 [0,4194304] 0 2026-03-10T12:38:10.031 INFO:tasks.workunit.client.1.vm07.stdout:7/605: dread - d0/d52/f97 zero size 2026-03-10T12:38:10.032 INFO:tasks.workunit.client.1.vm07.stdout:1/623: creat d9/df/d29/fd4 x:0 0 0 2026-03-10T12:38:10.036 INFO:tasks.workunit.client.1.vm07.stdout:4/753: mknod d0/d4/d10/d3c/d2b/d54/c107 0 2026-03-10T12:38:10.039 INFO:tasks.workunit.client.1.vm07.stdout:7/606: mknod d0/d61/d79/ccd 0 2026-03-10T12:38:10.057 INFO:tasks.workunit.client.0.vm00.stdout:8/708: dread d0/d93/d17/d48/f4c [0,4194304] 0 2026-03-10T12:38:10.059 INFO:tasks.workunit.client.0.vm00.stdout:8/709: truncate d0/d93/d36/d5b/f65 4617439 0 2026-03-10T12:38:10.059 INFO:tasks.workunit.client.0.vm00.stdout:5/861: dread d1f/d26/d6f/fa9 [0,4194304] 0 2026-03-10T12:38:10.059 INFO:tasks.workunit.client.0.vm00.stdout:5/862: stat d1f/d6a/d118/dcb 0 2026-03-10T12:38:10.062 INFO:tasks.workunit.client.0.vm00.stdout:5/863: creat d1f/d6a/d118/f12d x:0 0 0 2026-03-10T12:38:10.063 
INFO:tasks.workunit.client.0.vm00.stdout:5/864: write d1f/d96/dbd/ffb [1312440,22302] 0 2026-03-10T12:38:10.066 INFO:tasks.workunit.client.0.vm00.stdout:8/710: dwrite d0/d93/d2d/f44 [4194304,4194304] 0 2026-03-10T12:38:10.068 INFO:tasks.workunit.client.0.vm00.stdout:5/865: mkdir d1f/d6a/d118/d8e/d12e 0 2026-03-10T12:38:10.073 INFO:tasks.workunit.client.0.vm00.stdout:8/711: dread d0/d46/d6e/f70 [0,4194304] 0 2026-03-10T12:38:10.073 INFO:tasks.workunit.client.0.vm00.stdout:8/712: fdatasync d0/d93/d2d/dc8/fd1 0 2026-03-10T12:38:10.074 INFO:tasks.workunit.client.0.vm00.stdout:8/713: write d0/d93/f27 [1317072,52688] 0 2026-03-10T12:38:10.076 INFO:tasks.workunit.client.0.vm00.stdout:8/714: dread - d0/dd/f9e zero size 2026-03-10T12:38:10.076 INFO:tasks.workunit.client.0.vm00.stdout:8/715: stat d0/d93/d36/l53 0 2026-03-10T12:38:10.077 INFO:tasks.workunit.client.0.vm00.stdout:8/716: readlink d0/l8e 0 2026-03-10T12:38:10.077 INFO:tasks.workunit.client.0.vm00.stdout:8/717: readlink d0/l73 0 2026-03-10T12:38:10.078 INFO:tasks.workunit.client.0.vm00.stdout:8/718: readlink d0/d93/l15 0 2026-03-10T12:38:10.090 INFO:tasks.workunit.client.0.vm00.stdout:4/841: dread df/f19 [0,4194304] 0 2026-03-10T12:38:10.095 INFO:tasks.workunit.client.0.vm00.stdout:4/842: dwrite df/d1f/d22/d26/d65/d91/db9/fea [0,4194304] 0 2026-03-10T12:38:10.100 INFO:tasks.workunit.client.0.vm00.stdout:4/843: mkdir df/d1f/d22/d26/dab/d73/d115 0 2026-03-10T12:38:10.104 INFO:tasks.workunit.client.0.vm00.stdout:4/844: creat df/d1f/f116 x:0 0 0 2026-03-10T12:38:10.105 INFO:tasks.workunit.client.0.vm00.stdout:4/845: link df/d1f/d36/d3a/d41/fe0 df/d1f/d22/d26/d70/f117 0 2026-03-10T12:38:10.106 INFO:tasks.workunit.client.0.vm00.stdout:4/846: chown f3 117 1 2026-03-10T12:38:10.106 INFO:tasks.workunit.client.0.vm00.stdout:5/866: sync 2026-03-10T12:38:10.108 INFO:tasks.workunit.client.0.vm00.stdout:4/847: creat df/d1f/d36/dc6/f118 x:0 0 0 2026-03-10T12:38:10.120 INFO:tasks.workunit.client.1.vm07.stdout:5/669: read 
d0/d22/dbc/f8b [198862,66801] 0 2026-03-10T12:38:10.122 INFO:tasks.workunit.client.0.vm00.stdout:8/719: dread d0/d93/f8f [0,4194304] 0 2026-03-10T12:38:10.123 INFO:tasks.workunit.client.1.vm07.stdout:5/670: creat d0/d22/d18/d19/d2e/d67/dd9/fef x:0 0 0 2026-03-10T12:38:10.123 INFO:tasks.workunit.client.0.vm00.stdout:8/720: dread - d0/dd/d38/f6d zero size 2026-03-10T12:38:10.124 INFO:tasks.workunit.client.1.vm07.stdout:5/671: mkdir d0/d22/d18/d19/d21/dc2/df0 0 2026-03-10T12:38:10.125 INFO:tasks.workunit.client.0.vm00.stdout:8/721: creat d0/d46/d7e/fd8 x:0 0 0 2026-03-10T12:38:10.125 INFO:tasks.workunit.client.0.vm00.stdout:8/722: chown d0/d93/l15 386 1 2026-03-10T12:38:10.128 INFO:tasks.workunit.client.0.vm00.stdout:8/723: symlink d0/d93/d17/ld9 0 2026-03-10T12:38:10.132 INFO:tasks.workunit.client.1.vm07.stdout:5/672: dwrite d0/d22/d18/d19/d2e/d67/dd9/fef [0,4194304] 0 2026-03-10T12:38:10.132 INFO:tasks.workunit.client.0.vm00.stdout:8/724: rename d0/dd/c50 to d0/d93/d36/d7d/cda 0 2026-03-10T12:38:10.132 INFO:tasks.workunit.client.0.vm00.stdout:8/725: chown d0/d93/d17/db1 4 1 2026-03-10T12:38:10.132 INFO:tasks.workunit.client.0.vm00.stdout:8/726: creat d0/d93/d36/d5b/fdb x:0 0 0 2026-03-10T12:38:10.132 INFO:tasks.workunit.client.0.vm00.stdout:8/727: readlink d0/d93/d17/d48/lab 0 2026-03-10T12:38:10.134 INFO:tasks.workunit.client.0.vm00.stdout:8/728: truncate d0/d5c/fa0 1429070 0 2026-03-10T12:38:10.136 INFO:tasks.workunit.client.0.vm00.stdout:8/729: chown d0/d5c/f42 89004768 1 2026-03-10T12:38:10.136 INFO:tasks.workunit.client.0.vm00.stdout:8/730: readlink d0/d58/lca 0 2026-03-10T12:38:10.137 INFO:tasks.workunit.client.0.vm00.stdout:8/731: write d0/d93/f27 [823799,16901] 0 2026-03-10T12:38:10.138 INFO:tasks.workunit.client.0.vm00.stdout:8/732: symlink d0/d93/d36/d7d/ldc 0 2026-03-10T12:38:10.140 INFO:tasks.workunit.client.0.vm00.stdout:8/733: rename d0/d93/d17/d48/f87 to d0/d46/d7e/fdd 0 2026-03-10T12:38:10.140 INFO:tasks.workunit.client.0.vm00.stdout:8/734: stat 
d0/d93/d36/d7d 0 2026-03-10T12:38:10.142 INFO:tasks.workunit.client.0.vm00.stdout:5/867: sync 2026-03-10T12:38:10.145 INFO:tasks.workunit.client.0.vm00.stdout:8/735: dwrite d0/d93/d17/f67 [0,4194304] 0 2026-03-10T12:38:10.147 INFO:tasks.workunit.client.0.vm00.stdout:8/736: readlink d0/d93/d36/d7d/ldc 0 2026-03-10T12:38:10.153 INFO:tasks.workunit.client.0.vm00.stdout:5/868: rmdir d1f/d39 39 2026-03-10T12:38:10.163 INFO:tasks.workunit.client.0.vm00.stdout:5/869: dread - d1f/d26/d2e/fb8 zero size 2026-03-10T12:38:10.163 INFO:tasks.workunit.client.0.vm00.stdout:5/870: truncate d1f/d6a/f84 3036128 0 2026-03-10T12:38:10.164 INFO:tasks.workunit.client.0.vm00.stdout:5/871: creat d1f/d26/d2e/d58/d6b/d86/f12f x:0 0 0 2026-03-10T12:38:10.164 INFO:tasks.workunit.client.0.vm00.stdout:5/872: truncate d1f/d6a/d118/d8e/fb6 1378388 0 2026-03-10T12:38:10.170 INFO:tasks.workunit.client.0.vm00.stdout:8/737: dread d0/d93/d2d/f6f [0,4194304] 0 2026-03-10T12:38:10.172 INFO:tasks.workunit.client.0.vm00.stdout:8/738: rename d0/d93/d43 to d0/d93/d17/db1/dde 0 2026-03-10T12:38:10.175 INFO:tasks.workunit.client.0.vm00.stdout:8/739: link d0/d93/d17/ld0 d0/d46/ldf 0 2026-03-10T12:38:10.176 INFO:tasks.workunit.client.0.vm00.stdout:5/873: sync 2026-03-10T12:38:10.176 INFO:tasks.workunit.client.0.vm00.stdout:8/740: creat d0/d93/d36/d51/fe0 x:0 0 0 2026-03-10T12:38:10.179 INFO:tasks.workunit.client.0.vm00.stdout:8/741: mknod d0/d46/d89/ce1 0 2026-03-10T12:38:10.220 INFO:tasks.workunit.client.0.vm00.stdout:2/843: write d4/dd/f17 [3927407,2929] 0 2026-03-10T12:38:10.222 INFO:tasks.workunit.client.0.vm00.stdout:2/844: rename d4/l1a to d4/d53/d9e/d101/l10b 0 2026-03-10T12:38:10.222 INFO:tasks.workunit.client.1.vm07.stdout:0/703: dread d0/d14/d5f/d3b/f4b [0,4194304] 0 2026-03-10T12:38:10.248 INFO:tasks.workunit.client.0.vm00.stdout:1/837: write da/d12/f1d [1156545,105196] 0 2026-03-10T12:38:10.251 INFO:tasks.workunit.client.0.vm00.stdout:9/847: dwrite d0/f116 [4194304,4194304] 0 2026-03-10T12:38:10.252 
INFO:tasks.workunit.client.0.vm00.stdout:3/832: dwrite dd/d64/fb5 [4194304,4194304] 0 2026-03-10T12:38:10.261 INFO:tasks.workunit.client.0.vm00.stdout:1/838: mknod da/d24/d5a/c119 0 2026-03-10T12:38:10.264 INFO:tasks.workunit.client.0.vm00.stdout:9/848: mknod d0/d3d/d43/d53/d126/c135 0 2026-03-10T12:38:10.266 INFO:tasks.workunit.client.0.vm00.stdout:1/839: creat da/d21/db3/d59/da6/da4/f11a x:0 0 0 2026-03-10T12:38:10.270 INFO:tasks.workunit.client.0.vm00.stdout:1/840: dwrite da/d21/d27/fe8 [0,4194304] 0 2026-03-10T12:38:10.280 INFO:tasks.workunit.client.0.vm00.stdout:1/841: rename da/d21/db3/d59/da6/fd3 to da/d21/db3/d59/da6/da4/dda/dc0/dfe/d10e/f11b 0 2026-03-10T12:38:10.283 INFO:tasks.workunit.client.0.vm00.stdout:1/842: unlink da/l109 0 2026-03-10T12:38:10.288 INFO:tasks.workunit.client.0.vm00.stdout:1/843: rmdir da/d24/d28/d67/dfb 0 2026-03-10T12:38:10.289 INFO:tasks.workunit.client.0.vm00.stdout:1/844: symlink da/d21/db3/d5d/dab/l11c 0 2026-03-10T12:38:10.295 INFO:tasks.workunit.client.0.vm00.stdout:9/849: sync 2026-03-10T12:38:10.305 INFO:tasks.workunit.client.0.vm00.stdout:9/850: mknod d0/d3d/d59/d4e/dba/d1e/d27/d115/c136 0 2026-03-10T12:38:10.306 INFO:tasks.workunit.client.0.vm00.stdout:9/851: write d0/d7f/db8/dc4/fde [3203058,98799] 0 2026-03-10T12:38:10.308 INFO:tasks.workunit.client.0.vm00.stdout:9/852: chown d0/d3d/d59/d4e/dba/d1e/d27/d115/fdd 3434376 1 2026-03-10T12:38:10.313 INFO:tasks.workunit.client.0.vm00.stdout:9/853: dwrite d0/d7f/db8/dc4/f4f [0,4194304] 0 2026-03-10T12:38:10.317 INFO:tasks.workunit.client.0.vm00.stdout:9/854: stat d0/d3d/d59/d4e/dba/f8d 0 2026-03-10T12:38:10.357 INFO:tasks.workunit.client.0.vm00.stdout:4/848: dwrite df/f12 [4194304,4194304] 0 2026-03-10T12:38:10.366 INFO:tasks.workunit.client.0.vm00.stdout:8/742: rename d0/d93/d60/f7f to d0/dd/d38/d81/fe2 0 2026-03-10T12:38:10.367 INFO:tasks.workunit.client.1.vm07.stdout:5/673: dread d0/d22/d18/d19/d21/f42 [0,4194304] 0 2026-03-10T12:38:10.370 
INFO:tasks.workunit.client.0.vm00.stdout:8/743: dread d0/d93/d36/d5b/f65 [0,4194304] 0 2026-03-10T12:38:10.374 INFO:tasks.workunit.client.0.vm00.stdout:8/744: dread d0/f10 [4194304,4194304] 0 2026-03-10T12:38:10.378 INFO:tasks.workunit.client.0.vm00.stdout:4/849: creat df/d1f/d36/d3a/f119 x:0 0 0 2026-03-10T12:38:10.393 INFO:tasks.workunit.client.0.vm00.stdout:3/833: dwrite dd/d64/fb9 [0,4194304] 0 2026-03-10T12:38:10.399 INFO:tasks.workunit.client.0.vm00.stdout:8/745: rename d0/dd/d38/d81/fbd to d0/d93/d17/db1/dde/fe3 0 2026-03-10T12:38:10.402 INFO:tasks.workunit.client.0.vm00.stdout:4/850: chown df/d1f/d22/d26/ff0 172 1 2026-03-10T12:38:10.407 INFO:tasks.workunit.client.0.vm00.stdout:4/851: dwrite df/d1f/d22/f52 [4194304,4194304] 0 2026-03-10T12:38:10.408 INFO:tasks.workunit.client.1.vm07.stdout:5/674: sync 2026-03-10T12:38:10.412 INFO:tasks.workunit.client.0.vm00.stdout:2/845: write d4/d53/d68/fb1 [1292116,66397] 0 2026-03-10T12:38:10.412 INFO:tasks.workunit.client.1.vm07.stdout:5/675: dwrite d0/d22/d18/d3e/d5d/f6d [0,4194304] 0 2026-03-10T12:38:10.415 INFO:tasks.workunit.client.1.vm07.stdout:5/676: creat d0/d22/d18/d19/d72/ff1 x:0 0 0 2026-03-10T12:38:10.419 INFO:tasks.workunit.client.1.vm07.stdout:5/677: dwrite d0/d22/d18/d19/d2e/f88 [4194304,4194304] 0 2026-03-10T12:38:10.432 INFO:tasks.workunit.client.1.vm07.stdout:9/709: write d5/d13/d6c/d7a/f94 [381847,89655] 0 2026-03-10T12:38:10.435 INFO:tasks.workunit.client.1.vm07.stdout:9/710: fdatasync d5/d16/d23/d26/f42 0 2026-03-10T12:38:10.435 INFO:tasks.workunit.client.1.vm07.stdout:9/711: stat d5/d13/d2c/f44 0 2026-03-10T12:38:10.437 INFO:tasks.workunit.client.0.vm00.stdout:7/589: dwrite da/d1b/d40/f74 [0,4194304] 0 2026-03-10T12:38:10.438 INFO:tasks.workunit.client.1.vm07.stdout:6/610: write d1/d4/d6/d16/f5f [51827,75852] 0 2026-03-10T12:38:10.444 INFO:tasks.workunit.client.1.vm07.stdout:6/611: truncate d1/d4/d6/d16/d1a/f6a 999018 0 2026-03-10T12:38:10.454 INFO:tasks.workunit.client.1.vm07.stdout:3/657: write 
dc/dd/d43/d76/d95/da0/faa [4951187,101897] 0 2026-03-10T12:38:10.457 INFO:tasks.workunit.client.0.vm00.stdout:6/549: dwrite d2/da/dc/f27 [0,4194304] 0 2026-03-10T12:38:10.458 INFO:tasks.workunit.client.1.vm07.stdout:8/611: dwrite d1/fc [0,4194304] 0 2026-03-10T12:38:10.458 INFO:tasks.workunit.client.0.vm00.stdout:6/550: chown d2/d42/l66 11657547 1 2026-03-10T12:38:10.463 INFO:tasks.workunit.client.1.vm07.stdout:3/658: symlink dc/d18/d24/ldf 0 2026-03-10T12:38:10.464 INFO:tasks.workunit.client.0.vm00.stdout:8/746: mknod d0/d93/d36/d7d/ce4 0 2026-03-10T12:38:10.470 INFO:tasks.workunit.client.1.vm07.stdout:8/612: rmdir d1/d3 39 2026-03-10T12:38:10.470 INFO:tasks.workunit.client.0.vm00.stdout:3/834: dread dd/d18/d13/d1d/f5b [0,4194304] 0 2026-03-10T12:38:10.473 INFO:tasks.workunit.client.1.vm07.stdout:2/550: dwrite d0/d42/d26/f5a [0,4194304] 0 2026-03-10T12:38:10.478 INFO:tasks.workunit.client.1.vm07.stdout:9/712: dread d5/d16/d23/d26/f5c [0,4194304] 0 2026-03-10T12:38:10.479 INFO:tasks.workunit.client.1.vm07.stdout:9/713: dread - d5/d13/d6c/da4/feb zero size 2026-03-10T12:38:10.482 INFO:tasks.workunit.client.0.vm00.stdout:0/697: write d3/d7/d3c/d74/f78 [1075300,72380] 0 2026-03-10T12:38:10.486 INFO:tasks.workunit.client.1.vm07.stdout:1/624: write d9/df/d29/d2b/d30/f38 [327828,16330] 0 2026-03-10T12:38:10.488 INFO:tasks.workunit.client.0.vm00.stdout:7/590: dwrite da/d1b/f39 [0,4194304] 0 2026-03-10T12:38:10.491 INFO:tasks.workunit.client.1.vm07.stdout:1/625: dwrite d9/df/d29/f70 [0,4194304] 0 2026-03-10T12:38:10.491 INFO:tasks.workunit.client.0.vm00.stdout:9/855: write d0/d3d/d43/d114/f11c [7477361,16186] 0 2026-03-10T12:38:10.492 INFO:tasks.workunit.client.0.vm00.stdout:7/591: chown da/d25/d2c/d82/d68/cd3 4736386 1 2026-03-10T12:38:10.492 INFO:tasks.workunit.client.1.vm07.stdout:1/626: chown d9/f1b 12727 1 2026-03-10T12:38:10.493 INFO:tasks.workunit.client.0.vm00.stdout:7/592: read - da/d41/d48/fbc zero size 2026-03-10T12:38:10.496 
INFO:tasks.workunit.client.0.vm00.stdout:1/845: dwrite da/d24/d28/fb1 [0,4194304] 0 2026-03-10T12:38:10.500 INFO:tasks.workunit.client.0.vm00.stdout:1/846: readlink da/l16 0 2026-03-10T12:38:10.501 INFO:tasks.workunit.client.0.vm00.stdout:6/551: creat d2/da/dc/d94/fc7 x:0 0 0 2026-03-10T12:38:10.501 INFO:tasks.workunit.client.0.vm00.stdout:1/847: truncate da/d12/f1d 3649656 0 2026-03-10T12:38:10.501 INFO:tasks.workunit.client.1.vm07.stdout:1/627: sync 2026-03-10T12:38:10.502 INFO:tasks.workunit.client.1.vm07.stdout:4/754: write d0/d4/df2/df6/d46/f85 [3160477,13683] 0 2026-03-10T12:38:10.507 INFO:tasks.workunit.client.0.vm00.stdout:0/698: mkdir d3/d7/db0/dc4/de5 0 2026-03-10T12:38:10.509 INFO:tasks.workunit.client.1.vm07.stdout:7/607: dwrite d0/d57/d62/fa9 [0,4194304] 0 2026-03-10T12:38:10.509 INFO:tasks.workunit.client.0.vm00.stdout:0/699: truncate d3/d7/d4c/d5b/d38/db3/de2/fd4 917789 0 2026-03-10T12:38:10.512 INFO:tasks.workunit.client.0.vm00.stdout:0/700: readlink d3/d7/d3c/ld8 0 2026-03-10T12:38:10.513 INFO:tasks.workunit.client.1.vm07.stdout:0/704: write d0/d14/d5f/d76/f30 [592600,75224] 0 2026-03-10T12:38:10.513 INFO:tasks.workunit.client.1.vm07.stdout:2/551: truncate d0/d42/d26/d7d/f9a 664608 0 2026-03-10T12:38:10.513 INFO:tasks.workunit.client.1.vm07.stdout:9/714: mknod d5/d1f/d5e/d6b/de0/cf1 0 2026-03-10T12:38:10.519 INFO:tasks.workunit.client.1.vm07.stdout:4/755: rmdir d0/d5c 39 2026-03-10T12:38:10.529 INFO:tasks.workunit.client.0.vm00.stdout:0/701: getdents d3/db/d77 0 2026-03-10T12:38:10.529 INFO:tasks.workunit.client.0.vm00.stdout:0/702: creat d3/d7/d4c/d5b/dc5/fe6 x:0 0 0 2026-03-10T12:38:10.529 INFO:tasks.workunit.client.1.vm07.stdout:9/715: mkdir d5/d13/d9d/df2 0 2026-03-10T12:38:10.529 INFO:tasks.workunit.client.1.vm07.stdout:3/659: rmdir dc/d18/dcb 0 2026-03-10T12:38:10.529 INFO:tasks.workunit.client.1.vm07.stdout:4/756: creat d0/d4/df2/f108 x:0 0 0 2026-03-10T12:38:10.529 INFO:tasks.workunit.client.1.vm07.stdout:0/705: creat 
d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/fef x:0 0 0 2026-03-10T12:38:10.530 INFO:tasks.workunit.client.0.vm00.stdout:6/552: sync 2026-03-10T12:38:10.536 INFO:tasks.workunit.client.1.vm07.stdout:2/552: creat d0/d42/d4e/daf/fc2 x:0 0 0 2026-03-10T12:38:10.537 INFO:tasks.workunit.client.1.vm07.stdout:9/716: creat d5/d1f/ff3 x:0 0 0 2026-03-10T12:38:10.539 INFO:tasks.workunit.client.1.vm07.stdout:1/628: creat d9/df/d29/fd5 x:0 0 0 2026-03-10T12:38:10.540 INFO:tasks.workunit.client.1.vm07.stdout:4/757: mknod d0/d4/d5/d78/dc5/df7/db2/dd5/c109 0 2026-03-10T12:38:10.543 INFO:tasks.workunit.client.1.vm07.stdout:3/660: symlink dc/le0 0 2026-03-10T12:38:10.549 INFO:tasks.workunit.client.1.vm07.stdout:1/629: unlink d9/d2d/d4f/fb5 0 2026-03-10T12:38:10.549 INFO:tasks.workunit.client.1.vm07.stdout:3/661: dwrite dc/d18/d24/f3e [0,4194304] 0 2026-03-10T12:38:10.559 INFO:tasks.workunit.client.0.vm00.stdout:5/874: dread d1f/d6a/f84 [0,4194304] 0 2026-03-10T12:38:10.559 INFO:tasks.workunit.client.1.vm07.stdout:0/706: mkdir d0/d14/d5f/d76/d2f/d31/df0 0 2026-03-10T12:38:10.560 INFO:tasks.workunit.client.0.vm00.stdout:6/553: rmdir d2/d16/d29/d31 39 2026-03-10T12:38:10.560 INFO:tasks.workunit.client.1.vm07.stdout:2/553: truncate d0/d42/d4e/d77/f6f 602089 0 2026-03-10T12:38:10.561 INFO:tasks.workunit.client.1.vm07.stdout:2/554: write d0/d42/d26/d38/d4f/d62/fba [4395272,78506] 0 2026-03-10T12:38:10.562 INFO:tasks.workunit.client.1.vm07.stdout:2/555: write d0/d42/d26/d7d/faa [2986845,112304] 0 2026-03-10T12:38:10.565 INFO:tasks.workunit.client.1.vm07.stdout:9/717: mkdir d5/d13/d9d/df2/df4 0 2026-03-10T12:38:10.566 INFO:tasks.workunit.client.1.vm07.stdout:9/718: chown d5/d13/d22/f9e 241 1 2026-03-10T12:38:10.567 INFO:tasks.workunit.client.1.vm07.stdout:9/719: chown d5/d16/d18/l4d 30963954 1 2026-03-10T12:38:10.569 INFO:tasks.workunit.client.0.vm00.stdout:6/554: creat d2/d42/dae/fc8 x:0 0 0 2026-03-10T12:38:10.574 INFO:tasks.workunit.client.1.vm07.stdout:1/630: fdatasync d9/d2d/d4f/d75/fab 
0 2026-03-10T12:38:10.583 INFO:tasks.workunit.client.1.vm07.stdout:2/556: mknod d0/d29/d64/d74/d75/cc3 0 2026-03-10T12:38:10.583 INFO:tasks.workunit.client.0.vm00.stdout:8/747: symlink d0/d93/d36/d7d/le5 0 2026-03-10T12:38:10.583 INFO:tasks.workunit.client.0.vm00.stdout:1/848: symlink da/d21/d27/d6a/d94/l11d 0 2026-03-10T12:38:10.583 INFO:tasks.workunit.client.1.vm07.stdout:0/707: dread d0/d14/d5f/d76/d2f/d31/d4f/f5c [0,4194304] 0 2026-03-10T12:38:10.584 INFO:tasks.workunit.client.0.vm00.stdout:2/846: symlink d4/d53/d76/dba/de8/l10c 0 2026-03-10T12:38:10.585 INFO:tasks.workunit.client.0.vm00.stdout:6/555: creat d2/d16/d29/d31/d88/d92/daa/dc1/fc9 x:0 0 0 2026-03-10T12:38:10.586 INFO:tasks.workunit.client.1.vm07.stdout:1/631: symlink d9/df/d29/d2b/d92/d9d/ld6 0 2026-03-10T12:38:10.589 INFO:tasks.workunit.client.1.vm07.stdout:2/557: readlink d0/l79 0 2026-03-10T12:38:10.589 INFO:tasks.workunit.client.0.vm00.stdout:6/556: dwrite d2/d14/d7a/db9/f85 [0,4194304] 0 2026-03-10T12:38:10.592 INFO:tasks.workunit.client.0.vm00.stdout:8/748: creat d0/d93/d17/da2/fe6 x:0 0 0 2026-03-10T12:38:10.598 INFO:tasks.workunit.client.1.vm07.stdout:2/558: creat d0/d42/d26/d38/d4f/fc4 x:0 0 0 2026-03-10T12:38:10.598 INFO:tasks.workunit.client.1.vm07.stdout:0/708: dread d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dc8/fc7 [0,4194304] 0 2026-03-10T12:38:10.598 INFO:tasks.workunit.client.1.vm07.stdout:1/632: getdents d9/df/d29/d2b/d31/d91 0 2026-03-10T12:38:10.598 INFO:tasks.workunit.client.1.vm07.stdout:1/633: readlink d9/df/d29/d2b/d31/d91/l3f 0 2026-03-10T12:38:10.599 INFO:tasks.workunit.client.1.vm07.stdout:1/634: chown d9/df/c23 11725 1 2026-03-10T12:38:10.599 INFO:tasks.workunit.client.1.vm07.stdout:9/720: sync 2026-03-10T12:38:10.600 INFO:tasks.workunit.client.0.vm00.stdout:1/849: write da/d21/d27/f6e [5053876,82299] 0 2026-03-10T12:38:10.602 INFO:tasks.workunit.client.0.vm00.stdout:1/850: dread - da/d21/db3/d59/da6/da4/dda/dc0/dfe/f107 zero size 2026-03-10T12:38:10.602 
INFO:tasks.workunit.client.1.vm07.stdout:2/559: read d0/f44 [393558,96684] 0 2026-03-10T12:38:10.603 INFO:tasks.workunit.client.1.vm07.stdout:2/560: readlink d0/d42/lc1 0 2026-03-10T12:38:10.603 INFO:tasks.workunit.client.0.vm00.stdout:1/851: truncate da/d21/db3/d5d/d80/fcc 1263381 0 2026-03-10T12:38:10.603 INFO:tasks.workunit.client.0.vm00.stdout:4/852: creat df/d1f/d22/d26/f11a x:0 0 0 2026-03-10T12:38:10.605 INFO:tasks.workunit.client.0.vm00.stdout:5/875: mknod d1f/d26/c130 0 2026-03-10T12:38:10.606 INFO:tasks.workunit.client.0.vm00.stdout:8/749: symlink d0/d93/d60/le7 0 2026-03-10T12:38:10.606 INFO:tasks.workunit.client.1.vm07.stdout:2/561: dwrite d0/d42/d4e/d77/f89 [0,4194304] 0 2026-03-10T12:38:10.608 INFO:tasks.workunit.client.1.vm07.stdout:2/562: readlink d0/l79 0 2026-03-10T12:38:10.610 INFO:tasks.workunit.client.0.vm00.stdout:8/750: dwrite d0/d58/d68/f74 [0,4194304] 0 2026-03-10T12:38:10.610 INFO:tasks.workunit.client.1.vm07.stdout:0/709: read - d0/d14/d5f/d41/d6a/d74/fb9 zero size 2026-03-10T12:38:10.612 INFO:tasks.workunit.client.0.vm00.stdout:3/835: creat dd/d3d/d73/f117 x:0 0 0 2026-03-10T12:38:10.613 INFO:tasks.workunit.client.0.vm00.stdout:3/836: chown dd/d2a/da2/d10e 1473155709 1 2026-03-10T12:38:10.614 INFO:tasks.workunit.client.1.vm07.stdout:1/635: chown d9/d2d/d4f/d75/d77/da7/fcd 55 1 2026-03-10T12:38:10.630 INFO:tasks.workunit.client.1.vm07.stdout:5/678: dwrite d0/d22/dbc/f8b [4194304,4194304] 0 2026-03-10T12:38:10.630 INFO:tasks.workunit.client.1.vm07.stdout:7/608: dread d0/d61/db4/f4b [0,4194304] 0 2026-03-10T12:38:10.630 INFO:tasks.workunit.client.0.vm00.stdout:9/856: dread d0/d3d/d59/d4e/dba/d19/f65 [0,4194304] 0 2026-03-10T12:38:10.630 INFO:tasks.workunit.client.0.vm00.stdout:9/857: write d0/d3d/d59/d4e/dba/d1e/d2b/f47 [505227,43236] 0 2026-03-10T12:38:10.630 INFO:tasks.workunit.client.0.vm00.stdout:1/852: rmdir da/d21/db3/d59/da6/d8b 39 2026-03-10T12:38:10.639 INFO:tasks.workunit.client.0.vm00.stdout:9/858: symlink 
d0/d3d/d59/d4e/dba/d1e/dcb/l137 0 2026-03-10T12:38:10.640 INFO:tasks.workunit.client.1.vm07.stdout:5/679: symlink d0/d22/d18/d19/d2e/d67/dd9/lf2 0 2026-03-10T12:38:10.641 INFO:tasks.workunit.client.0.vm00.stdout:1/853: read - da/d12/d26/dd2/ff9 zero size 2026-03-10T12:38:10.642 INFO:tasks.workunit.client.0.vm00.stdout:2/847: creat d4/f10d x:0 0 0 2026-03-10T12:38:10.643 INFO:tasks.workunit.client.0.vm00.stdout:5/876: mkdir d1f/d26/d2b/d131 0 2026-03-10T12:38:10.647 INFO:tasks.workunit.client.1.vm07.stdout:8/613: dread d1/d3/d5d/f5f [0,4194304] 0 2026-03-10T12:38:10.648 INFO:tasks.workunit.client.1.vm07.stdout:8/614: chown d1/d3/c23 6070 1 2026-03-10T12:38:10.655 INFO:tasks.workunit.client.1.vm07.stdout:7/609: rename d0/d61/db4/d8a/da8 to d0/d57/d62/d90/dce 0 2026-03-10T12:38:10.659 INFO:tasks.workunit.client.0.vm00.stdout:2/848: creat d4/d6/d93/f10e x:0 0 0 2026-03-10T12:38:10.660 INFO:tasks.workunit.client.1.vm07.stdout:0/710: creat d0/d14/d5f/d41/ff1 x:0 0 0 2026-03-10T12:38:10.660 INFO:tasks.workunit.client.0.vm00.stdout:8/751: link d0/dd/d38/d81/la8 d0/d93/d2d/d49/le8 0 2026-03-10T12:38:10.660 INFO:tasks.workunit.client.0.vm00.stdout:2/849: write d4/d6/f9c [1004895,123725] 0 2026-03-10T12:38:10.662 INFO:tasks.workunit.client.0.vm00.stdout:9/859: getdents d0/d7f/db8 0 2026-03-10T12:38:10.668 INFO:tasks.workunit.client.0.vm00.stdout:2/850: read d4/d53/d76/fac [88772,75280] 0 2026-03-10T12:38:10.672 INFO:tasks.workunit.client.0.vm00.stdout:2/851: dread d4/d53/d68/fb1 [0,4194304] 0 2026-03-10T12:38:10.677 INFO:tasks.workunit.client.0.vm00.stdout:8/752: symlink d0/d93/d60/le9 0 2026-03-10T12:38:10.678 INFO:tasks.workunit.client.1.vm07.stdout:0/711: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dc8/ff2 x:0 0 0 2026-03-10T12:38:10.680 INFO:tasks.workunit.client.0.vm00.stdout:8/753: chown d0/d46/d89/f91 65580 1 2026-03-10T12:38:10.682 INFO:tasks.workunit.client.1.vm07.stdout:1/636: dread d9/f36 [0,4194304] 0 2026-03-10T12:38:10.683 
INFO:tasks.workunit.client.1.vm07.stdout:1/637: chown d9/df/d29/d2b/d3d/cd1 31682346 1 2026-03-10T12:38:10.684 INFO:tasks.workunit.client.1.vm07.stdout:9/721: getdents d5/d13/d2c/de6/d76 0 2026-03-10T12:38:10.689 INFO:tasks.workunit.client.0.vm00.stdout:6/557: rmdir d2/d51/d70 39 2026-03-10T12:38:10.690 INFO:tasks.workunit.client.0.vm00.stdout:1/854: read da/d24/f76 [900656,39016] 0 2026-03-10T12:38:10.692 INFO:tasks.workunit.client.0.vm00.stdout:5/877: dread d1f/f46 [0,4194304] 0 2026-03-10T12:38:10.693 INFO:tasks.workunit.client.0.vm00.stdout:8/754: symlink d0/d93/d17/db1/dde/lea 0 2026-03-10T12:38:10.695 INFO:tasks.workunit.client.0.vm00.stdout:1/855: unlink da/d21/db3/d5d/l65 0 2026-03-10T12:38:10.703 INFO:tasks.workunit.client.0.vm00.stdout:2/852: rename d4/d6/d2d/dc3 to d4/d10f 0 2026-03-10T12:38:10.704 INFO:tasks.workunit.client.0.vm00.stdout:5/878: creat d1f/d39/d11a/f132 x:0 0 0 2026-03-10T12:38:10.705 INFO:tasks.workunit.client.1.vm07.stdout:8/615: fsync d1/d3/d11/f35 0 2026-03-10T12:38:10.711 INFO:tasks.workunit.client.0.vm00.stdout:6/558: creat d2/d42/d80/d9d/fca x:0 0 0 2026-03-10T12:38:10.711 INFO:tasks.workunit.client.0.vm00.stdout:6/559: readlink d2/da/dc/l12 0 2026-03-10T12:38:10.712 INFO:tasks.workunit.client.1.vm07.stdout:1/638: mkdir d9/d2d/dd7 0 2026-03-10T12:38:10.721 INFO:tasks.workunit.client.1.vm07.stdout:3/662: dread dc/f94 [0,4194304] 0 2026-03-10T12:38:10.724 INFO:tasks.workunit.client.1.vm07.stdout:5/680: link d0/d22/d18/d19/d21/d54/c96 d0/d22/d18/d19/d21/dc2/ded/cf3 0 2026-03-10T12:38:10.735 INFO:tasks.workunit.client.1.vm07.stdout:8/616: dread d1/d3/d6c/fa7 [0,4194304] 0 2026-03-10T12:38:10.736 INFO:tasks.workunit.client.1.vm07.stdout:8/617: write d1/f3d [2777998,76789] 0 2026-03-10T12:38:10.741 INFO:tasks.workunit.client.1.vm07.stdout:3/663: unlink f2 0 2026-03-10T12:38:10.741 INFO:tasks.workunit.client.1.vm07.stdout:3/664: chown dc/dd/lad 5 1 2026-03-10T12:38:10.760 INFO:tasks.workunit.client.0.vm00.stdout:7/593: dwrite da/f13 
[0,4194304] 0 2026-03-10T12:38:10.761 INFO:tasks.workunit.client.0.vm00.stdout:0/703: write d3/d7/d3c/f99 [985848,28111] 0 2026-03-10T12:38:10.766 INFO:tasks.workunit.client.0.vm00.stdout:7/594: chown da/d3f/d71/l77 103873 1 2026-03-10T12:38:10.767 INFO:tasks.workunit.client.0.vm00.stdout:0/704: mkdir d3/db/da4/de7 0 2026-03-10T12:38:10.769 INFO:tasks.workunit.client.0.vm00.stdout:0/705: rmdir d3/db/d77/d82 39 2026-03-10T12:38:10.776 INFO:tasks.workunit.client.1.vm07.stdout:4/758: write d0/d4/d10/d3c/d2b/d2d/d9c/fcc [1408937,121993] 0 2026-03-10T12:38:10.783 INFO:tasks.workunit.client.1.vm07.stdout:8/618: mkdir d1/d3/d6/d7b/dc7 0 2026-03-10T12:38:10.784 INFO:tasks.workunit.client.0.vm00.stdout:7/595: truncate da/f35 905940 0 2026-03-10T12:38:10.794 INFO:tasks.workunit.client.1.vm07.stdout:3/665: dread dc/dd/d28/d3b/f70 [0,4194304] 0 2026-03-10T12:38:10.798 INFO:tasks.workunit.client.1.vm07.stdout:2/563: write d0/f15 [4642907,5203] 0 2026-03-10T12:38:10.799 INFO:tasks.workunit.client.1.vm07.stdout:2/564: fdatasync d0/d42/d26/d38/f3d 0 2026-03-10T12:38:10.800 INFO:tasks.workunit.client.1.vm07.stdout:2/565: write d0/d29/d64/d6c/f71 [964735,46126] 0 2026-03-10T12:38:10.800 INFO:tasks.workunit.client.1.vm07.stdout:2/566: truncate d0/d45/fa1 555919 0 2026-03-10T12:38:10.804 INFO:tasks.workunit.client.1.vm07.stdout:6/612: dread d1/d4/d6/d46/d4d/fb [4194304,4194304] 0 2026-03-10T12:38:10.810 INFO:tasks.workunit.client.1.vm07.stdout:2/567: dread d0/f44 [0,4194304] 0 2026-03-10T12:38:10.815 INFO:tasks.workunit.client.0.vm00.stdout:0/706: mknod d3/d7/db0/dc4/ce8 0 2026-03-10T12:38:10.815 INFO:tasks.workunit.client.1.vm07.stdout:7/610: write d0/f3a [1079631,99483] 0 2026-03-10T12:38:10.820 INFO:tasks.workunit.client.1.vm07.stdout:1/639: creat d9/df/d29/d2b/d31/fd8 x:0 0 0 2026-03-10T12:38:10.823 INFO:tasks.workunit.client.1.vm07.stdout:0/712: write d0/d14/d7c/fad [274659,62362] 0 2026-03-10T12:38:10.828 INFO:tasks.workunit.client.1.vm07.stdout:1/640: dread d9/fd 
[4194304,4194304] 0 2026-03-10T12:38:10.831 INFO:tasks.workunit.client.1.vm07.stdout:9/722: dwrite d5/d16/d23/d26/d68/fa0 [0,4194304] 0 2026-03-10T12:38:10.832 INFO:tasks.workunit.client.0.vm00.stdout:6/560: write d2/da/dc/d83/f97 [247465,118436] 0 2026-03-10T12:38:10.834 INFO:tasks.workunit.client.0.vm00.stdout:0/707: creat d3/db/d77/d82/fe9 x:0 0 0 2026-03-10T12:38:10.835 INFO:tasks.workunit.client.0.vm00.stdout:0/708: chown d3/db/d24/c60 3 1 2026-03-10T12:38:10.835 INFO:tasks.workunit.client.1.vm07.stdout:5/681: write d0/d22/d18/d19/d2e/d67/f94 [207517,121287] 0 2026-03-10T12:38:10.840 INFO:tasks.workunit.client.0.vm00.stdout:7/596: link da/d26/d50/ccf da/d26/d37/d61/cd5 0 2026-03-10T12:38:10.848 INFO:tasks.workunit.client.0.vm00.stdout:0/709: mkdir d3/d7/d4c/dcc/dea 0 2026-03-10T12:38:10.849 INFO:tasks.workunit.client.0.vm00.stdout:0/710: fdatasync d3/d7/d4c/d5b/d38/db3/fe3 0 2026-03-10T12:38:10.849 INFO:tasks.workunit.client.1.vm07.stdout:6/613: readlink d1/d4/d4a/l89 0 2026-03-10T12:38:10.850 INFO:tasks.workunit.client.1.vm07.stdout:6/614: dread - d1/d4/d71/f79 zero size 2026-03-10T12:38:10.850 INFO:tasks.workunit.client.0.vm00.stdout:6/561: mknod d2/d51/d70/ccb 0 2026-03-10T12:38:10.850 INFO:tasks.workunit.client.1.vm07.stdout:6/615: truncate d1/d4/d4a/f94 818482 0 2026-03-10T12:38:10.851 INFO:tasks.workunit.client.1.vm07.stdout:6/616: fsync d1/d4/d6/d53/fb5 0 2026-03-10T12:38:10.859 INFO:tasks.workunit.client.0.vm00.stdout:0/711: unlink d3/d7/d3c/lcb 0 2026-03-10T12:38:10.865 INFO:tasks.workunit.client.0.vm00.stdout:8/755: fsync d0/dd/d38/f3d 0 2026-03-10T12:38:10.872 INFO:tasks.workunit.client.0.vm00.stdout:9/860: dread d0/d3d/d59/d4e/dba/d19/fb1 [0,4194304] 0 2026-03-10T12:38:10.873 INFO:tasks.workunit.client.1.vm07.stdout:0/713: unlink d0/d14/d5f/d76/f3d 0 2026-03-10T12:38:10.874 INFO:tasks.workunit.client.1.vm07.stdout:0/714: chown d0/d14/d5f/d76/d2f/d31/d4f/f70 116951903 1 2026-03-10T12:38:10.875 INFO:tasks.workunit.client.0.vm00.stdout:8/756: dread 
d0/dd/f9a [0,4194304] 0 2026-03-10T12:38:10.878 INFO:tasks.workunit.client.1.vm07.stdout:4/759: creat d0/d4/d10/d9a/db9/f10a x:0 0 0 2026-03-10T12:38:10.879 INFO:tasks.workunit.client.0.vm00.stdout:9/861: dwrite d0/d3d/d59/d4e/f6f [4194304,4194304] 0 2026-03-10T12:38:10.881 INFO:tasks.workunit.client.1.vm07.stdout:1/641: readlink d9/df/d29/d6b/l9b 0 2026-03-10T12:38:10.884 INFO:tasks.workunit.client.1.vm07.stdout:9/723: truncate d5/d16/f8f 360408 0 2026-03-10T12:38:10.888 INFO:tasks.workunit.client.0.vm00.stdout:7/597: sync 2026-03-10T12:38:10.892 INFO:tasks.workunit.client.1.vm07.stdout:5/682: rename d0/d22/d18/f86 to d0/d22/d18/d19/d2e/d67/ff4 0 2026-03-10T12:38:10.894 INFO:tasks.workunit.client.0.vm00.stdout:9/862: link d0/l62 d0/d3d/d59/d4e/d104/l138 0 2026-03-10T12:38:10.894 INFO:tasks.workunit.client.1.vm07.stdout:3/666: symlink dc/d18/le1 0 2026-03-10T12:38:10.896 INFO:tasks.workunit.client.0.vm00.stdout:9/863: mkdir d0/d3d/d43/d114/d139 0 2026-03-10T12:38:10.897 INFO:tasks.workunit.client.1.vm07.stdout:3/667: dread dc/d18/d2d/f80 [0,4194304] 0 2026-03-10T12:38:10.898 INFO:tasks.workunit.client.0.vm00.stdout:9/864: creat d0/d3d/d59/d4e/dba/d19/d50/f13a x:0 0 0 2026-03-10T12:38:10.902 INFO:tasks.workunit.client.0.vm00.stdout:9/865: write d0/f21 [4549652,57088] 0 2026-03-10T12:38:10.904 INFO:tasks.workunit.client.0.vm00.stdout:9/866: dwrite d0/d3d/d59/d4e/f70 [0,4194304] 0 2026-03-10T12:38:10.917 INFO:tasks.workunit.client.1.vm07.stdout:6/617: truncate d1/d4/d6/f30 1604825 0 2026-03-10T12:38:10.920 INFO:tasks.workunit.client.1.vm07.stdout:6/618: dwrite d1/d4/d6/d43/d65/f7f [0,4194304] 0 2026-03-10T12:38:10.923 INFO:tasks.workunit.client.1.vm07.stdout:2/568: truncate d0/d42/f1b 47702 0 2026-03-10T12:38:10.927 INFO:tasks.workunit.client.0.vm00.stdout:4/853: dwrite df/f1b [0,4194304] 0 2026-03-10T12:38:10.930 INFO:tasks.workunit.client.0.vm00.stdout:3/837: dwrite dd/d18/d14/fbe [0,4194304] 0 2026-03-10T12:38:10.933 INFO:tasks.workunit.client.1.vm07.stdout:0/715: 
mkdir d0/d14/d5f/d76/d2f/d31/d4f/da8/df3 0 2026-03-10T12:38:10.940 INFO:tasks.workunit.client.0.vm00.stdout:3/838: mkdir dd/d27/d2c/def/d118 0 2026-03-10T12:38:10.940 INFO:tasks.workunit.client.1.vm07.stdout:4/760: mknod d0/d4/d10/d3c/d2b/d2d/d9c/c10b 0 2026-03-10T12:38:10.945 INFO:tasks.workunit.client.1.vm07.stdout:9/724: creat d5/d13/d2c/ff5 x:0 0 0 2026-03-10T12:38:10.955 INFO:tasks.workunit.client.0.vm00.stdout:3/839: stat dd/d2a/da2/de1/d45/f47 0 2026-03-10T12:38:10.955 INFO:tasks.workunit.client.0.vm00.stdout:4/854: dread df/f16 [0,4194304] 0 2026-03-10T12:38:10.955 INFO:tasks.workunit.client.1.vm07.stdout:9/725: write d5/d13/d6c/da4/feb [575194,126291] 0 2026-03-10T12:38:10.955 INFO:tasks.workunit.client.1.vm07.stdout:8/619: creat d1/d3/d6/d50/fc8 x:0 0 0 2026-03-10T12:38:10.955 INFO:tasks.workunit.client.1.vm07.stdout:5/683: chown d0/c8 0 1 2026-03-10T12:38:10.955 INFO:tasks.workunit.client.1.vm07.stdout:3/668: unlink dc/dd/d28/d3b/l6a 0 2026-03-10T12:38:10.955 INFO:tasks.workunit.client.1.vm07.stdout:6/619: readlink d1/d4/d6/d43/d88/d97/lb9 0 2026-03-10T12:38:10.956 INFO:tasks.workunit.client.1.vm07.stdout:6/620: readlink d1/d4/d4a/l89 0 2026-03-10T12:38:10.958 INFO:tasks.workunit.client.1.vm07.stdout:2/569: rmdir d0/d5b 39 2026-03-10T12:38:10.961 INFO:tasks.workunit.client.1.vm07.stdout:7/611: symlink d0/d61/db4/d8a/d9d/lcf 0 2026-03-10T12:38:10.961 INFO:tasks.workunit.client.1.vm07.stdout:7/612: chown d0/l33 123962 1 2026-03-10T12:38:10.961 INFO:tasks.workunit.client.1.vm07.stdout:7/613: chown d0/f21 5 1 2026-03-10T12:38:10.962 INFO:tasks.workunit.client.0.vm00.stdout:1/856: dwrite da/d24/f76 [0,4194304] 0 2026-03-10T12:38:10.962 INFO:tasks.workunit.client.1.vm07.stdout:7/614: write d0/d61/db4/d8a/fbe [135603,94799] 0 2026-03-10T12:38:10.966 INFO:tasks.workunit.client.1.vm07.stdout:0/716: mkdir d0/d14/d5f/d76/d2f/df4 0 2026-03-10T12:38:10.969 INFO:tasks.workunit.client.1.vm07.stdout:1/642: mknod d9/cd9 0 2026-03-10T12:38:10.975 
INFO:tasks.workunit.client.0.vm00.stdout:4/855: sync 2026-03-10T12:38:10.982 INFO:tasks.workunit.client.1.vm07.stdout:9/726: dread d5/d16/d23/fb2 [0,4194304] 0 2026-03-10T12:38:10.988 INFO:tasks.workunit.client.1.vm07.stdout:6/621: mknod d1/d4/d44/cc4 0 2026-03-10T12:38:10.988 INFO:tasks.workunit.client.0.vm00.stdout:4/856: read - df/f4f zero size 2026-03-10T12:38:10.989 INFO:tasks.workunit.client.0.vm00.stdout:4/857: write df/d1f/d22/d26/dab/f89 [892195,57969] 0 2026-03-10T12:38:10.992 INFO:tasks.workunit.client.1.vm07.stdout:2/570: unlink d0/d42/d26/d38/l91 0 2026-03-10T12:38:10.997 INFO:tasks.workunit.client.0.vm00.stdout:1/857: truncate da/d24/d28/f116 3876977 0 2026-03-10T12:38:10.999 INFO:tasks.workunit.client.0.vm00.stdout:1/858: truncate da/d21/db3/d5d/d72/d7e/fac 1040969 0 2026-03-10T12:38:11.000 INFO:tasks.workunit.client.1.vm07.stdout:0/717: symlink d0/d14/d5f/d3b/dbc/lf5 0 2026-03-10T12:38:11.004 INFO:tasks.workunit.client.0.vm00.stdout:1/859: rmdir da/d12/da8/d112 0 2026-03-10T12:38:11.006 INFO:tasks.workunit.client.0.vm00.stdout:1/860: unlink da/d24/d28/d67/db0/cde 0 2026-03-10T12:38:11.009 INFO:tasks.workunit.client.1.vm07.stdout:3/669: mkdir dc/d18/de2 0 2026-03-10T12:38:11.009 INFO:tasks.workunit.client.0.vm00.stdout:0/712: dwrite d3/d22/f46 [0,4194304] 0 2026-03-10T12:38:11.018 INFO:tasks.workunit.client.0.vm00.stdout:0/713: unlink d3/db/d24/d25/c34 0 2026-03-10T12:38:11.020 INFO:tasks.workunit.client.1.vm07.stdout:1/643: rmdir d9/df/d29/d2b/d31/d91 39 2026-03-10T12:38:11.020 INFO:tasks.workunit.client.0.vm00.stdout:0/714: chown d3/d33/f4d 977 1 2026-03-10T12:38:11.021 INFO:tasks.workunit.client.0.vm00.stdout:0/715: rmdir d3/d7/d3c/d74 39 2026-03-10T12:38:11.025 INFO:tasks.workunit.client.1.vm07.stdout:9/727: symlink d5/d16/da3/lf6 0 2026-03-10T12:38:11.039 INFO:tasks.workunit.client.1.vm07.stdout:9/728: dread d5/d13/d6c/da4/fa6 [0,4194304] 0 2026-03-10T12:38:11.041 INFO:tasks.workunit.client.1.vm07.stdout:2/571: getdents d0/d42/d26/d38/d4f/d5d 0 
2026-03-10T12:38:11.043 INFO:tasks.workunit.client.1.vm07.stdout:9/729: mknod d5/d13/d57/d3e/cf7 0 2026-03-10T12:38:11.045 INFO:tasks.workunit.client.1.vm07.stdout:1/644: creat d9/d2d/d4f/d75/fda x:0 0 0 2026-03-10T12:38:11.048 INFO:tasks.workunit.client.1.vm07.stdout:6/622: sync 2026-03-10T12:38:11.056 INFO:tasks.workunit.client.1.vm07.stdout:6/623: readlink d1/d4/d6/d16/d49/la6 0 2026-03-10T12:38:11.057 INFO:tasks.workunit.client.1.vm07.stdout:1/645: dread d9/df/d29/f70 [0,4194304] 0 2026-03-10T12:38:11.057 INFO:tasks.workunit.client.1.vm07.stdout:6/624: chown d1/d4/d4a/f56 59850212 1 2026-03-10T12:38:11.068 INFO:tasks.workunit.client.1.vm07.stdout:4/761: dread d0/d4/d10/f36 [0,4194304] 0 2026-03-10T12:38:11.069 INFO:tasks.workunit.client.1.vm07.stdout:4/762: fdatasync d0/d4/df2/df6/d46/f85 0 2026-03-10T12:38:11.072 INFO:tasks.workunit.client.1.vm07.stdout:9/730: creat d5/d13/d6c/d7a/ff8 x:0 0 0 2026-03-10T12:38:11.088 INFO:tasks.workunit.client.1.vm07.stdout:4/763: mknod d0/d4/d10/d3c/d2b/d54/c10c 0 2026-03-10T12:38:11.112 INFO:tasks.workunit.client.1.vm07.stdout:1/646: mkdir d9/ddb 0 2026-03-10T12:38:11.120 INFO:tasks.workunit.client.0.vm00.stdout:5/879: write d1f/d26/d2e/d58/d10c/d123/d72/ffa [164151,57731] 0 2026-03-10T12:38:11.121 INFO:tasks.workunit.client.0.vm00.stdout:5/880: mkdir d1f/d39/d133 0 2026-03-10T12:38:11.124 INFO:tasks.workunit.client.0.vm00.stdout:5/881: creat d1f/d6a/d94/dc3/de7/f134 x:0 0 0 2026-03-10T12:38:11.126 INFO:tasks.workunit.client.0.vm00.stdout:5/882: truncate d1f/d6a/d94/dc9/fae 6544940 0 2026-03-10T12:38:11.128 INFO:tasks.workunit.client.0.vm00.stdout:5/883: symlink d1f/d26/d2b/d37/l135 0 2026-03-10T12:38:11.135 INFO:tasks.workunit.client.0.vm00.stdout:6/562: rmdir d2/d51/d70 39 2026-03-10T12:38:11.135 INFO:tasks.workunit.client.1.vm07.stdout:1/647: truncate d9/df/d29/d2b/d31/d91/faf 4938785 0 2026-03-10T12:38:11.135 INFO:tasks.workunit.client.0.vm00.stdout:5/884: sync 2026-03-10T12:38:11.138 
INFO:tasks.workunit.client.0.vm00.stdout:6/563: sync 2026-03-10T12:38:11.143 INFO:tasks.workunit.client.0.vm00.stdout:6/564: read d2/d16/f1e [1179238,118835] 0 2026-03-10T12:38:11.144 INFO:tasks.workunit.client.1.vm07.stdout:4/764: rename d0/d4/d10/d3c/d2b/d2d/l3d to d0/d4/d5/l10d 0 2026-03-10T12:38:11.154 INFO:tasks.workunit.client.0.vm00.stdout:9/867: write d0/d3d/d59/d74/faa [1190393,39891] 0 2026-03-10T12:38:11.158 INFO:tasks.workunit.client.0.vm00.stdout:9/868: getdents d0/d3d/d59/d4e/dba/d1e/d85 0 2026-03-10T12:38:11.159 INFO:tasks.workunit.client.0.vm00.stdout:9/869: write d0/f17 [4161114,3061] 0 2026-03-10T12:38:11.163 INFO:tasks.workunit.client.1.vm07.stdout:4/765: mknod d0/d4/d5/c10e 0 2026-03-10T12:38:11.167 INFO:tasks.workunit.client.0.vm00.stdout:6/565: creat d2/da/fcc x:0 0 0 2026-03-10T12:38:11.181 INFO:tasks.workunit.client.1.vm07.stdout:8/620: dwrite d1/f79 [0,4194304] 0 2026-03-10T12:38:11.183 INFO:tasks.workunit.client.1.vm07.stdout:8/621: chown d1/d3/l51 344138 1 2026-03-10T12:38:11.191 INFO:tasks.workunit.client.1.vm07.stdout:8/622: creat d1/d3/d6c/fc9 x:0 0 0 2026-03-10T12:38:11.201 INFO:tasks.workunit.client.0.vm00.stdout:0/716: dwrite d3/db/d77/faa [0,4194304] 0 2026-03-10T12:38:11.203 INFO:tasks.workunit.client.1.vm07.stdout:0/718: dwrite d0/d14/d5f/d76/d2f/d31/d4f/f5c [0,4194304] 0 2026-03-10T12:38:11.205 INFO:tasks.workunit.client.1.vm07.stdout:3/670: dwrite dc/d18/f34 [0,4194304] 0 2026-03-10T12:38:11.207 INFO:tasks.workunit.client.1.vm07.stdout:5/684: dwrite d0/d22/d18/d19/d2e/f62 [0,4194304] 0 2026-03-10T12:38:11.209 INFO:tasks.workunit.client.1.vm07.stdout:8/623: sync 2026-03-10T12:38:11.211 INFO:tasks.workunit.client.1.vm07.stdout:7/615: dread d0/d61/db4/d8a/d9d/fb1 [0,4194304] 0 2026-03-10T12:38:11.215 INFO:tasks.workunit.client.1.vm07.stdout:2/572: dwrite d0/f1d [0,4194304] 0 2026-03-10T12:38:11.215 INFO:tasks.workunit.client.1.vm07.stdout:9/731: dwrite d5/fb [4194304,4194304] 0 2026-03-10T12:38:11.225 
INFO:tasks.workunit.client.1.vm07.stdout:6/625: write d1/d4/d6/f30 [2361180,32828] 0 2026-03-10T12:38:11.228 INFO:tasks.workunit.client.1.vm07.stdout:0/719: fsync d0/d14/d5f/d76/f8a 0 2026-03-10T12:38:11.231 INFO:tasks.workunit.client.1.vm07.stdout:3/671: creat dc/d18/d24/fe3 x:0 0 0 2026-03-10T12:38:11.231 INFO:tasks.workunit.client.0.vm00.stdout:0/717: dread d3/d7/d4c/d5b/f37 [0,4194304] 0 2026-03-10T12:38:11.232 INFO:tasks.workunit.client.1.vm07.stdout:1/648: write d9/df/d29/f49 [91456,91427] 0 2026-03-10T12:38:11.232 INFO:tasks.workunit.client.0.vm00.stdout:0/718: write d3/d7/d4c/d5b/d38/db3/fe3 [416144,61665] 0 2026-03-10T12:38:11.239 INFO:tasks.workunit.client.1.vm07.stdout:4/766: write d0/d4/d5/fe8 [90863,60464] 0 2026-03-10T12:38:11.239 INFO:tasks.workunit.client.1.vm07.stdout:5/685: mknod d0/d22/d18/d3e/d53/cf5 0 2026-03-10T12:38:11.241 INFO:tasks.workunit.client.1.vm07.stdout:5/686: write d0/d22/d18/d19/d36/d75/d77/fd7 [20144,70374] 0 2026-03-10T12:38:11.244 INFO:tasks.workunit.client.1.vm07.stdout:4/767: dwrite d0/d4/d10/d5f/d6d/f103 [0,4194304] 0 2026-03-10T12:38:11.246 INFO:tasks.workunit.client.1.vm07.stdout:5/687: dwrite d0/d22/d18/d3e/d5d/f6d [0,4194304] 0 2026-03-10T12:38:11.260 INFO:tasks.workunit.client.1.vm07.stdout:9/732: rmdir d5/d13/d6c 39 2026-03-10T12:38:11.265 INFO:tasks.workunit.client.1.vm07.stdout:0/720: fdatasync d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/faf 0 2026-03-10T12:38:11.276 INFO:tasks.workunit.client.1.vm07.stdout:3/672: symlink dc/dd/d28/d3b/le4 0 2026-03-10T12:38:11.300 INFO:tasks.workunit.client.1.vm07.stdout:2/573: symlink d0/lc5 0 2026-03-10T12:38:11.314 INFO:tasks.workunit.client.1.vm07.stdout:3/673: mkdir dc/d18/d2d/de5 0 2026-03-10T12:38:11.324 INFO:tasks.workunit.client.1.vm07.stdout:5/688: mkdir d0/d22/d18/d3e/df6 0 2026-03-10T12:38:11.325 INFO:tasks.workunit.client.1.vm07.stdout:5/689: chown d0/d22/dbc/la6 10126 1 2026-03-10T12:38:11.326 INFO:tasks.workunit.client.1.vm07.stdout:2/574: rmdir d0/d42/d4e 39 
2026-03-10T12:38:11.327 INFO:tasks.workunit.client.1.vm07.stdout:9/733: creat d5/d13/d2c/de6/dce/ff9 x:0 0 0
2026-03-10T12:38:11.328 INFO:tasks.workunit.client.1.vm07.stdout:3/674: mkdir dc/dd/d1f/dac/de6 0
2026-03-10T12:38:11.330 INFO:tasks.workunit.client.1.vm07.stdout:9/734: fdatasync d5/d69/d93/d97/fc3 0
2026-03-10T12:38:11.335 INFO:tasks.workunit.client.1.vm07.stdout:5/690: creat d0/d22/d18/d19/d21/d3a/ff7 x:0 0 0
2026-03-10T12:38:11.338 INFO:tasks.workunit.client.1.vm07.stdout:5/691: unlink d0/d22/d18/d19/d2e/d67/l6b 0
2026-03-10T12:38:11.340 INFO:tasks.workunit.client.1.vm07.stdout:5/692: creat d0/d22/d18/d3e/df6/ff8 x:0 0 0
2026-03-10T12:38:11.343 INFO:tasks.workunit.client.1.vm07.stdout:5/693: dread d0/d22/dbc/f8b [4194304,4194304] 0
2026-03-10T12:38:11.356 INFO:tasks.workunit.client.1.vm07.stdout:2/575: sync
2026-03-10T12:38:11.356 INFO:tasks.workunit.client.1.vm07.stdout:9/735: sync
2026-03-10T12:38:11.357 INFO:tasks.workunit.client.1.vm07.stdout:2/576: write d0/d42/d1f/d20/fa9 [771544,6415] 0
2026-03-10T12:38:11.364 INFO:tasks.workunit.client.1.vm07.stdout:2/577: readlink d0/d42/d1f/d20/l4d 0
2026-03-10T12:38:11.366 INFO:tasks.workunit.client.0.vm00.stdout:3/840: dwrite dd/d18/d14/fc0 [0,4194304] 0
2026-03-10T12:38:11.366 INFO:tasks.workunit.client.1.vm07.stdout:2/578: rename d0/d42/d4e/l8b to d0/d42/d1f/lc6 0
2026-03-10T12:38:11.369 INFO:tasks.workunit.client.0.vm00.stdout:3/841: symlink dd/d27/d2c/def/d118/l119 0
2026-03-10T12:38:11.370 INFO:tasks.workunit.client.0.vm00.stdout:3/842: write dd/d3d/d8a/f8b [215957,63574] 0
2026-03-10T12:38:11.373 INFO:tasks.workunit.client.0.vm00.stdout:3/843: fdatasync dd/d3d/d8a/ffb 0
2026-03-10T12:38:11.385 INFO:tasks.workunit.client.1.vm07.stdout:8/624: write d1/d3/f1f [2373517,55942] 0
2026-03-10T12:38:11.386 INFO:tasks.workunit.client.0.vm00.stdout:0/719: dwrite d3/d40/f7a [0,4194304] 0
2026-03-10T12:38:11.388 INFO:tasks.workunit.client.0.vm00.stdout:0/720: write d3/d22/d3a/fd6 [349482,111676] 0
2026-03-10T12:38:11.392 INFO:tasks.workunit.client.0.vm00.stdout:0/721: read d3/d7/d4c/f73 [1163487,8527] 0
2026-03-10T12:38:11.392 INFO:tasks.workunit.client.0.vm00.stdout:0/722: readlink d3/d7/d4c/d5b/d38/d44/d5a/l5c 0
2026-03-10T12:38:11.396 INFO:tasks.workunit.client.1.vm07.stdout:8/625: dread d1/d3/d18/f2e [0,4194304] 0
2026-03-10T12:38:11.399 INFO:tasks.workunit.client.1.vm07.stdout:8/626: mknod d1/d3/d40/d92/db6/cca 0
2026-03-10T12:38:11.406 INFO:tasks.workunit.client.0.vm00.stdout:0/723: truncate d3/d22/d3a/fd9 2033833 0
2026-03-10T12:38:11.407 INFO:tasks.workunit.client.1.vm07.stdout:8/627: rename d1/d3/ff to d1/d3/d40/d92/db6/fcb 0
2026-03-10T12:38:11.411 INFO:tasks.workunit.client.1.vm07.stdout:6/626: write d1/d4/d6/d53/fa9 [797599,17754] 0
2026-03-10T12:38:11.414 INFO:tasks.workunit.client.1.vm07.stdout:5/694: fdatasync d0/d22/d18/d19/d2e/f62 0
2026-03-10T12:38:11.420 INFO:tasks.workunit.client.0.vm00.stdout:0/724: truncate d3/d7/d4c/d5b/d38/fa2 862846 0
2026-03-10T12:38:11.421 INFO:tasks.workunit.client.1.vm07.stdout:8/628: sync
2026-03-10T12:38:11.422 INFO:tasks.workunit.client.1.vm07.stdout:5/695: truncate d0/d22/d18/d19/d21/fa1 92748 0
2026-03-10T12:38:11.423 INFO:tasks.workunit.client.0.vm00.stdout:0/725: dwrite d3/db/f97 [0,4194304] 0
2026-03-10T12:38:11.427 INFO:tasks.workunit.client.1.vm07.stdout:6/627: sync
2026-03-10T12:38:11.428 INFO:tasks.workunit.client.1.vm07.stdout:5/696: sync
2026-03-10T12:38:11.428 INFO:tasks.workunit.client.1.vm07.stdout:8/629: sync
2026-03-10T12:38:11.436 INFO:tasks.workunit.client.1.vm07.stdout:1/649: dwrite d9/df/dc2/f57 [0,4194304] 0
2026-03-10T12:38:11.436 INFO:tasks.workunit.client.1.vm07.stdout:1/650: write d9/d2d/fcb [1372831,51890] 0
2026-03-10T12:38:11.437 INFO:tasks.workunit.client.1.vm07.stdout:7/616: truncate d0/f9b 3431525 0
2026-03-10T12:38:11.458 INFO:tasks.workunit.client.1.vm07.stdout:5/697: creat d0/d22/d18/d19/d2e/da9/ff9 x:0 0 0
2026-03-10T12:38:11.470 INFO:tasks.workunit.client.1.vm07.stdout:4/768: dwrite d0/d5c/fe2 [0,4194304] 0
2026-03-10T12:38:11.471 INFO:tasks.workunit.client.0.vm00.stdout:5/885: write d1f/d26/f79 [1785250,82865] 0
2026-03-10T12:38:11.472 INFO:tasks.workunit.client.1.vm07.stdout:4/769: chown d0/d4/df2/df6/d46 1622201624 1
2026-03-10T12:38:11.478 INFO:tasks.workunit.client.0.vm00.stdout:9/870: write d0/d3d/d59/d4e/dba/d19/d50/fe0 [1420702,116887] 0
2026-03-10T12:38:11.479 INFO:tasks.workunit.client.0.vm00.stdout:9/871: chown d0/d3d/d43/f54 2 1
2026-03-10T12:38:11.480 INFO:tasks.workunit.client.1.vm07.stdout:0/721: write d0/d14/d5f/d76/d2f/d31/d79/dcc/fe3 [604050,20562] 0
2026-03-10T12:38:11.481 INFO:tasks.workunit.client.0.vm00.stdout:9/872: dread d0/d3d/d59/d4e/dba/d19/f5c [4194304,4194304] 0
2026-03-10T12:38:11.492 INFO:tasks.workunit.client.0.vm00.stdout:9/873: truncate d0/d7f/db8/fc6 626127 0
2026-03-10T12:38:11.497 INFO:tasks.workunit.client.0.vm00.stdout:9/874: truncate d0/d3d/d43/ff7 561224 0
2026-03-10T12:38:11.500 INFO:tasks.workunit.client.1.vm07.stdout:5/698: rename d0/d22/d18/d3e/d53/d9e/l6e to d0/d22/d18/d19/d21/d54/dcb/db8/dec/lfa 0
2026-03-10T12:38:11.501 INFO:tasks.workunit.client.1.vm07.stdout:3/675: write dc/dd/d43/d5c/f65 [671428,93403] 0
2026-03-10T12:38:11.501 INFO:tasks.workunit.client.0.vm00.stdout:3/844: write dd/d64/f98 [3296421,77995] 0
2026-03-10T12:38:11.503 INFO:tasks.workunit.client.1.vm07.stdout:6/628: creat d1/d4/fc5 x:0 0 0
2026-03-10T12:38:11.504 INFO:tasks.workunit.client.1.vm07.stdout:6/629: chown d1/d4/d6/f7d 10739 1
2026-03-10T12:38:11.507 INFO:tasks.workunit.client.0.vm00.stdout:3/845: dread dd/d64/d93/ff7 [0,4194304] 0
2026-03-10T12:38:11.509 INFO:tasks.workunit.client.1.vm07.stdout:4/770: dread d0/d4/df2/df6/d46/f56 [0,4194304] 0
2026-03-10T12:38:11.509 INFO:tasks.workunit.client.1.vm07.stdout:8/630: creat d1/d3/fcc x:0 0 0
2026-03-10T12:38:11.511 INFO:tasks.workunit.client.0.vm00.stdout:3/846: mkdir dd/d3d/d8a/de0/d55/d11a 0
2026-03-10T12:38:11.514 INFO:tasks.workunit.client.0.vm00.stdout:3/847: dwrite dd/d64/fb9 [0,4194304] 0
2026-03-10T12:38:11.519 INFO:tasks.workunit.client.0.vm00.stdout:3/848: mknod dd/d3d/d8a/de0/c11b 0
2026-03-10T12:38:11.521 INFO:tasks.workunit.client.1.vm07.stdout:1/651: creat d9/df/d29/d2b/db1/fdc x:0 0 0
2026-03-10T12:38:11.521 INFO:tasks.workunit.client.1.vm07.stdout:4/771: dread d0/d4/d10/d9a/db9/fef [0,4194304] 0
2026-03-10T12:38:11.525 INFO:tasks.workunit.client.1.vm07.stdout:9/736: dwrite d5/d13/d57/fa7 [0,4194304] 0
2026-03-10T12:38:11.529 INFO:tasks.workunit.client.1.vm07.stdout:0/722: symlink d0/d14/d5f/d3b/lf6 0
2026-03-10T12:38:11.529 INFO:tasks.workunit.client.1.vm07.stdout:0/723: fdatasync d0/d14/d7c/fde 0
2026-03-10T12:38:11.530 INFO:tasks.workunit.client.1.vm07.stdout:2/579: write d0/f9c [645858,76323] 0
2026-03-10T12:38:11.530 INFO:tasks.workunit.client.1.vm07.stdout:2/580: fsync d0/d42/d1f/d20/fa9 0
2026-03-10T12:38:11.536 INFO:tasks.workunit.client.0.vm00.stdout:9/875: dread d0/d3d/d59/d4e/dba/d1e/d27/f28 [0,4194304] 0
2026-03-10T12:38:11.547 INFO:tasks.workunit.client.0.vm00.stdout:2/853: rename d4/d6/d2d/l2f to d4/d6/d2d/de5/l110 0
2026-03-10T12:38:11.550 INFO:tasks.workunit.client.0.vm00.stdout:2/854: link d4/f73 d4/d53/d9e/f111 0
2026-03-10T12:38:11.550 INFO:tasks.workunit.client.0.vm00.stdout:8/757: rename d0/l3 to d0/dd/leb 0
2026-03-10T12:38:11.552 INFO:tasks.workunit.client.1.vm07.stdout:6/630: mknod d1/d4/d6/d4e/d64/cc6 0
2026-03-10T12:38:11.553 INFO:tasks.workunit.client.0.vm00.stdout:8/758: fdatasync d0/d46/d7e/f8a 0
2026-03-10T12:38:11.553 INFO:tasks.workunit.client.0.vm00.stdout:2/855: dread - d4/dd/fe2 zero size
2026-03-10T12:38:11.560 INFO:tasks.workunit.client.0.vm00.stdout:2/856: link d4/dd/f3c d4/d6/de7/f112 0
2026-03-10T12:38:11.560 INFO:tasks.workunit.client.0.vm00.stdout:2/857: dread - d4/dd/fe2 zero size
2026-03-10T12:38:11.561 INFO:tasks.workunit.client.0.vm00.stdout:7/598: rename da/d47/f62 to da/d26/d37/fd6 0
2026-03-10T12:38:11.562 INFO:tasks.workunit.client.0.vm00.stdout:7/599: fdatasync da/f16 0
2026-03-10T12:38:11.566 INFO:tasks.workunit.client.0.vm00.stdout:4/858: rename df/d32/l4a to df/d1f/d36/l11b 0
2026-03-10T12:38:11.569 INFO:tasks.workunit.client.0.vm00.stdout:8/759: dread d0/d93/d2d/d49/f6c [0,4194304] 0
2026-03-10T12:38:11.570 INFO:tasks.workunit.client.1.vm07.stdout:8/631: rename d1/d3/d6/d7b to d1/d3/db2/dcd 0
2026-03-10T12:38:11.571 INFO:tasks.workunit.client.0.vm00.stdout:7/600: dread da/d26/f27 [0,4194304] 0
2026-03-10T12:38:11.572 INFO:tasks.workunit.client.0.vm00.stdout:7/601: chown da/d25/f5a 1665 1
2026-03-10T12:38:11.572 INFO:tasks.workunit.client.0.vm00.stdout:3/849: rename dd/d3d/d73 to dd/d18/d14/d2b/d11c 0
2026-03-10T12:38:11.573 INFO:tasks.workunit.client.1.vm07.stdout:1/652: creat d9/d2d/d4f/d5a/fdd x:0 0 0
2026-03-10T12:38:11.575 INFO:tasks.workunit.client.0.vm00.stdout:7/602: dread da/d1b/d40/fca [0,4194304] 0
2026-03-10T12:38:11.575 INFO:tasks.workunit.client.0.vm00.stdout:4/859: symlink df/d1f/d36/d3a/d41/l11c 0
2026-03-10T12:38:11.576 INFO:tasks.workunit.client.0.vm00.stdout:4/860: read - df/d63/ddb/ff8 zero size
2026-03-10T12:38:11.585 INFO:tasks.workunit.client.0.vm00.stdout:0/726: dread d3/d22/d3a/f8c [0,4194304] 0
2026-03-10T12:38:11.591 INFO:tasks.workunit.client.0.vm00.stdout:1/861: write da/d24/d28/f116 [2957789,14123] 0
2026-03-10T12:38:11.593 INFO:tasks.workunit.client.0.vm00.stdout:7/603: dread f0 [4194304,4194304] 0
2026-03-10T12:38:11.594 INFO:tasks.workunit.client.0.vm00.stdout:8/760: mknod d0/d58/d68/cec 0
2026-03-10T12:38:11.595 INFO:tasks.workunit.client.0.vm00.stdout:8/761: stat d0/d93/d36/d5b/f95 0
2026-03-10T12:38:11.596 INFO:tasks.workunit.client.1.vm07.stdout:2/581: fdatasync d0/d42/f2c 0
2026-03-10T12:38:11.597 INFO:tasks.workunit.client.0.vm00.stdout:7/604: dread da/f13 [0,4194304] 0
2026-03-10T12:38:11.597 INFO:tasks.workunit.client.1.vm07.stdout:2/582: chown d0/d29/d64/f78 878812207 1
2026-03-10T12:38:11.598 INFO:tasks.workunit.client.1.vm07.stdout:4/772: dread d0/d4/d5/da/f15 [4194304,4194304] 0
2026-03-10T12:38:11.600 INFO:tasks.workunit.client.0.vm00.stdout:9/876: rename d0/d3d/df2/f12a to d0/d3d/d43/d53/d126/f13b 0
2026-03-10T12:38:11.604 INFO:tasks.workunit.client.0.vm00.stdout:1/862: dwrite da/d24/d73/fb6 [0,4194304] 0
2026-03-10T12:38:11.604 INFO:tasks.workunit.client.0.vm00.stdout:3/850: mknod dd/d18/d13/d1d/dc6/c11d 0
2026-03-10T12:38:11.605 INFO:tasks.workunit.client.1.vm07.stdout:7/617: link d0/f2f d0/d57/d62/d90/dce/fd0 0
2026-03-10T12:38:11.610 INFO:tasks.workunit.client.0.vm00.stdout:7/605: write da/d26/d37/f6f [66626,91957] 0
2026-03-10T12:38:11.613 INFO:tasks.workunit.client.1.vm07.stdout:4/773: sync
2026-03-10T12:38:11.614 INFO:tasks.workunit.client.0.vm00.stdout:3/851: dwrite dd/d18/d13/d1d/f69 [0,4194304] 0
2026-03-10T12:38:11.616 INFO:tasks.workunit.client.0.vm00.stdout:5/886: write d1f/d26/d2e/fb8 [394252,61714] 0
2026-03-10T12:38:11.625 INFO:tasks.workunit.client.1.vm07.stdout:6/631: mkdir d1/d4/d6/d46/d4d/dc7 0
2026-03-10T12:38:11.628 INFO:tasks.workunit.client.1.vm07.stdout:3/676: dwrite dc/dd/f9a [0,4194304] 0
2026-03-10T12:38:11.636 INFO:tasks.workunit.client.0.vm00.stdout:8/762: write d0/d46/d7e/f8a [538440,44022] 0
2026-03-10T12:38:11.636 INFO:tasks.workunit.client.0.vm00.stdout:8/763: readlink d0/d46/d89/ld3 0
2026-03-10T12:38:11.648 INFO:tasks.workunit.client.0.vm00.stdout:5/887: creat d1f/d26/d2b/d35/d78/d99/dcd/f136 x:0 0 0
2026-03-10T12:38:11.667 INFO:tasks.workunit.client.0.vm00.stdout:2/858: rename d4/dd/c25 to d4/d53/d76/d9b/d107/c113 0
2026-03-10T12:38:11.670 INFO:tasks.workunit.client.1.vm07.stdout:1/653: dread d9/fe [0,4194304] 0
2026-03-10T12:38:11.670 INFO:tasks.workunit.client.1.vm07.stdout:1/654: write d9/d2d/d80/fc0 [736441,110823] 0
2026-03-10T12:38:11.689 INFO:tasks.workunit.client.1.vm07.stdout:4/774: unlink d0/d4/d5/fd3 0
2026-03-10T12:38:11.689 INFO:tasks.workunit.client.1.vm07.stdout:2/583: dread d0/d42/d4e/d77/f6f [0,4194304] 0
2026-03-10T12:38:11.691 INFO:tasks.workunit.client.1.vm07.stdout:9/737: dwrite d5/d1f/d75/fbc [0,4194304] 0
2026-03-10T12:38:11.697 INFO:tasks.workunit.client.1.vm07.stdout:0/724: link d0/d14/d5f/d76/d2f/d31/l3f d0/d14/d5f/d41/d6a/d9a/lf7 0
2026-03-10T12:38:11.697 INFO:tasks.workunit.client.0.vm00.stdout:5/888: rename d1f/f59 to d1f/d26/d2b/d37/f137 0
2026-03-10T12:38:11.699 INFO:tasks.workunit.client.1.vm07.stdout:0/725: truncate d0/d14/d5f/d76/d2f/d31/d79/d85/fc6 4235061 0
2026-03-10T12:38:11.701 INFO:tasks.workunit.client.1.vm07.stdout:9/738: sync
2026-03-10T12:38:11.713 INFO:tasks.workunit.client.1.vm07.stdout:5/699: truncate d0/d22/d18/d19/d2e/f62 3305889 0
2026-03-10T12:38:11.714 INFO:tasks.workunit.client.0.vm00.stdout:6/566: link d2/d16/f23 d2/d51/fcd 0
2026-03-10T12:38:11.714 INFO:tasks.workunit.client.0.vm00.stdout:6/567: chown d2/d16/l65 13745 1
2026-03-10T12:38:11.714 INFO:tasks.workunit.client.0.vm00.stdout:6/568: readlink d2/l7 0
2026-03-10T12:38:11.718 INFO:tasks.workunit.client.0.vm00.stdout:6/569: mkdir d2/d9f/dce 0
2026-03-10T12:38:11.732 INFO:tasks.workunit.client.0.vm00.stdout:7/606: rmdir da/d26/d37/d56 39
2026-03-10T12:38:11.733 INFO:tasks.workunit.client.0.vm00.stdout:4/861: dwrite df/d1f/d36/d3a/fa9 [0,4194304] 0
2026-03-10T12:38:11.733 INFO:tasks.workunit.client.0.vm00.stdout:9/877: write d0/d3d/d59/f45 [1169110,88562] 0
2026-03-10T12:38:11.733 INFO:tasks.workunit.client.1.vm07.stdout:2/584: getdents d0/d29/d64/db5/dbb 0
2026-03-10T12:38:11.733 INFO:tasks.workunit.client.1.vm07.stdout:8/632: link d1/d3/d6/d54/fa1 d1/d3/d6c/fce 0
2026-03-10T12:38:11.733 INFO:tasks.workunit.client.1.vm07.stdout:0/726: rename d0/d14/d5f/d76/d2f/d31/d4f/l53 to d0/d14/d5f/d76/d2f/d31/d4f/da8/de2/lf8 0
2026-03-10T12:38:11.733 INFO:tasks.workunit.client.1.vm07.stdout:9/739: dread d5/d69/d93/d97/fe3 [0,4194304] 0
2026-03-10T12:38:11.733 INFO:tasks.workunit.client.1.vm07.stdout:9/740: write d5/d16/d23/fc8 [2163973,109499] 0
2026-03-10T12:38:11.733 INFO:tasks.workunit.client.1.vm07.stdout:9/741: dwrite d5/f1c [0,4194304] 0
2026-03-10T12:38:11.737 INFO:tasks.workunit.client.0.vm00.stdout:7/607: creat da/d26/d50/d73/fd7 x:0 0 0
2026-03-10T12:38:11.737 INFO:tasks.workunit.client.1.vm07.stdout:0/727: readlink d0/l34 0
2026-03-10T12:38:11.739 INFO:tasks.workunit.client.1.vm07.stdout:5/700: creat d0/d22/d18/d19/ffb x:0 0 0
2026-03-10T12:38:11.740 INFO:tasks.workunit.client.1.vm07.stdout:5/701: truncate d0/d22/d18/d19/d72/fd8 263895 0
2026-03-10T12:38:11.740 INFO:tasks.workunit.client.1.vm07.stdout:0/728: dread d0/d14/d5f/d76/d2f/d31/d4f/d60/f89 [0,4194304] 0
2026-03-10T12:38:11.742 INFO:tasks.workunit.client.0.vm00.stdout:7/608: dwrite da/f16 [0,4194304] 0
2026-03-10T12:38:11.744 INFO:tasks.workunit.client.0.vm00.stdout:5/889: sync
2026-03-10T12:38:11.749 INFO:tasks.workunit.client.0.vm00.stdout:7/609: mknod da/d47/d87/cd8 0
2026-03-10T12:38:11.753 INFO:tasks.workunit.client.0.vm00.stdout:1/863: dwrite da/d12/d26/dd2/ff9 [0,4194304] 0
2026-03-10T12:38:11.757 INFO:tasks.workunit.client.0.vm00.stdout:9/878: unlink d0/d3d/d59/c96 0
2026-03-10T12:38:11.763 INFO:tasks.workunit.client.0.vm00.stdout:5/890: fsync d1f/d26/d2e/d58/d10c/d123/d72/f85 0
2026-03-10T12:38:11.767 INFO:tasks.workunit.client.1.vm07.stdout:9/742: dwrite d5/d13/d6c/d7a/fe5 [0,4194304] 0
2026-03-10T12:38:11.768 INFO:tasks.workunit.client.0.vm00.stdout:3/852: dwrite dd/d3d/d8a/de0/d55/fda [0,4194304] 0
2026-03-10T12:38:11.772 INFO:tasks.workunit.client.0.vm00.stdout:0/727: write d3/f9c [1169556,21588] 0
2026-03-10T12:38:11.773 INFO:tasks.workunit.client.0.vm00.stdout:1/864: symlink da/d21/d39/l11e 0
2026-03-10T12:38:11.775 INFO:tasks.workunit.client.0.vm00.stdout:9/879: mknod d0/d3d/d59/d4e/dba/d19/d50/c13c 0
2026-03-10T12:38:11.782 INFO:tasks.workunit.client.0.vm00.stdout:9/880: write d0/d3d/d59/d4e/dba/d1e/d85/d98/fd0 [2432882,44204] 0
2026-03-10T12:38:11.782 INFO:tasks.workunit.client.0.vm00.stdout:8/764: write d0/d58/f8c [82175,123931] 0
2026-03-10T12:38:11.782 INFO:tasks.workunit.client.0.vm00.stdout:8/765: write d0/d46/d7e/f8a [1759980,1688] 0
2026-03-10T12:38:11.782 INFO:tasks.workunit.client.0.vm00.stdout:2/859: dwrite d4/d6/dca/faf [0,4194304] 0
2026-03-10T12:38:11.790 INFO:tasks.workunit.client.0.vm00.stdout:1/865: mkdir da/d21/db3/d5d/d11f 0
2026-03-10T12:38:11.790 INFO:tasks.workunit.client.0.vm00.stdout:7/610: fsync da/d25/d2c/d82/d68/fcd 0
2026-03-10T12:38:11.794 INFO:tasks.workunit.client.0.vm00.stdout:8/766: creat d0/d93/d36/d7d/fed x:0 0 0
2026-03-10T12:38:11.801 INFO:tasks.workunit.client.1.vm07.stdout:5/702: dread d0/d22/d18/f4c [0,4194304] 0
2026-03-10T12:38:11.801 INFO:tasks.workunit.client.0.vm00.stdout:2/860: mknod d4/d6/c114 0
2026-03-10T12:38:11.801 INFO:tasks.workunit.client.0.vm00.stdout:2/861: fsync d4/dd/da7/ffc 0
2026-03-10T12:38:11.801 INFO:tasks.workunit.client.0.vm00.stdout:8/767: creat d0/d93/d17/db1/fee x:0 0 0
2026-03-10T12:38:11.804 INFO:tasks.workunit.client.0.vm00.stdout:7/611: dread da/f17 [0,4194304] 0
2026-03-10T12:38:11.813 INFO:tasks.workunit.client.0.vm00.stdout:8/768: symlink d0/d93/d36/d5b/lef 0
2026-03-10T12:38:11.817 INFO:tasks.workunit.client.0.vm00.stdout:2/862: unlink d4/dd/c21 0
2026-03-10T12:38:11.823 INFO:tasks.workunit.client.0.vm00.stdout:8/769: creat d0/d93/d2d/d49/ff0 x:0 0 0
2026-03-10T12:38:11.823 INFO:tasks.workunit.client.0.vm00.stdout:1/866: getdents da/d21/db3/d59/da6/da4/dda/dc0/dfe 0
2026-03-10T12:38:11.831 INFO:tasks.workunit.client.0.vm00.stdout:0/728: dread d3/db/f16 [0,4194304] 0
2026-03-10T12:38:11.832 INFO:tasks.workunit.client.0.vm00.stdout:0/729: dread d3/d22/d3a/f8c [0,4194304] 0
2026-03-10T12:38:11.832 INFO:tasks.workunit.client.1.vm07.stdout:0/729: getdents d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dd9 0
2026-03-10T12:38:11.833 INFO:tasks.workunit.client.0.vm00.stdout:8/770: link d0/d93/d36/l53 d0/d46/d6e/lf1 0
2026-03-10T12:38:11.834 INFO:tasks.workunit.client.0.vm00.stdout:0/730: fdatasync d3/db/d24/f2f 0
2026-03-10T12:38:11.835 INFO:tasks.workunit.client.1.vm07.stdout:5/703: sync
2026-03-10T12:38:11.836 INFO:tasks.workunit.client.1.vm07.stdout:5/704: fdatasync d0/d22/d18/d3e/d5d/f6d 0
2026-03-10T12:38:11.838 INFO:tasks.workunit.client.1.vm07.stdout:9/743: creat d5/d13/d2c/de6/ffa x:0 0 0
2026-03-10T12:38:11.847 INFO:tasks.workunit.client.0.vm00.stdout:7/612: sync
2026-03-10T12:38:11.847 INFO:tasks.workunit.client.0.vm00.stdout:2/863: sync
2026-03-10T12:38:11.849 INFO:tasks.workunit.client.0.vm00.stdout:2/864: creat d4/d53/d76/dba/f115 x:0 0 0
2026-03-10T12:38:11.857 INFO:tasks.workunit.client.0.vm00.stdout:7/613: symlink da/d25/d2c/d82/d68/ld9 0
2026-03-10T12:38:11.858 INFO:tasks.workunit.client.1.vm07.stdout:5/705: chown d0/d22/d18/d3e/d53/cc0 8505837 1
2026-03-10T12:38:11.862 INFO:tasks.workunit.client.0.vm00.stdout:7/614: write da/d41/f4b [2605804,53160] 0
2026-03-10T12:38:11.879 INFO:tasks.workunit.client.0.vm00.stdout:4/862: dwrite df/d1f/d22/d26/d65/d91/f50 [4194304,4194304] 0
2026-03-10T12:38:11.879 INFO:tasks.workunit.client.0.vm00.stdout:4/863: stat df/d1f/d36/dc6 0
2026-03-10T12:38:11.892 INFO:tasks.workunit.client.0.vm00.stdout:5/891: write d1f/d26/d2b/d35/fad [5708308,97896] 0
2026-03-10T12:38:11.898 INFO:tasks.workunit.client.0.vm00.stdout:3/853: write dd/d18/d13/f22 [3881837,111403] 0
2026-03-10T12:38:11.900 INFO:tasks.workunit.client.0.vm00.stdout:5/892: symlink d1f/d26/d2b/d37/db2/l138 0
2026-03-10T12:38:11.903 INFO:tasks.workunit.client.0.vm00.stdout:1/867: rename da/d21/db3/d5d to da/d21/db3/d59/d120 0
2026-03-10T12:38:11.913 INFO:tasks.workunit.client.0.vm00.stdout:1/868: rmdir da/d12/d91 39
2026-03-10T12:38:11.913 INFO:tasks.workunit.client.0.vm00.stdout:8/771: write d0/d93/d17/f63 [3905556,26893] 0
2026-03-10T12:38:11.914 INFO:tasks.workunit.client.0.vm00.stdout:1/869: write da/d21/db3/d59/da6/da4/dda/dc0/dc3/f100 [148768,90934] 0
2026-03-10T12:38:11.918 INFO:tasks.workunit.client.0.vm00.stdout:3/854: creat dd/d3d/f11e x:0 0 0
2026-03-10T12:38:11.918 INFO:tasks.workunit.client.0.vm00.stdout:1/870: read da/d24/d28/f116 [1953898,108364] 0
2026-03-10T12:38:11.924 INFO:tasks.workunit.client.0.vm00.stdout:2/865: write d4/d6/f30 [3097443,20995] 0
2026-03-10T12:38:11.926 INFO:tasks.workunit.client.0.vm00.stdout:3/855: dread dd/f25 [0,4194304] 0
2026-03-10T12:38:11.932 INFO:tasks.workunit.client.0.vm00.stdout:4/864: dwrite df/f42 [0,4194304] 0
2026-03-10T12:38:11.934 INFO:tasks.workunit.client.0.vm00.stdout:4/865: write df/f42 [7450083,68265] 0
2026-03-10T12:38:11.934 INFO:tasks.workunit.client.0.vm00.stdout:4/866: readlink df/d1f/l59 0
2026-03-10T12:38:11.935 INFO:tasks.workunit.client.0.vm00.stdout:5/893: dwrite d1f/d26/d2e/d58/d10c/d123/f102 [0,4194304] 0
2026-03-10T12:38:11.966 INFO:tasks.workunit.client.0.vm00.stdout:5/894: mknod d1f/d26/d2b/d37/dbf/c139 0
2026-03-10T12:38:11.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:11.974+0000 7fc256ffd700 1 -- 192.168.123.100:0/1897598474 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mgrmap(e 21) v1 ==== 49900+0+0 (secure 0 0 0) 0x7fc25c01d550 con 0x7fc260071980
2026-03-10T12:38:11.988 INFO:tasks.workunit.client.0.vm00.stdout:2/866: dwrite d4/d53/d76/d9b/dad/d8e/fd0 [0,4194304] 0
2026-03-10T12:38:11.991 INFO:tasks.workunit.client.0.vm00.stdout:3/856: dwrite dd/d18/d14/d2b/f8d [0,4194304] 0
2026-03-10T12:38:11.991 INFO:tasks.workunit.client.0.vm00.stdout:4/867: dwrite df/d63/d77/fe8 [0,4194304] 0
2026-03-10T12:38:12.001 INFO:tasks.workunit.client.0.vm00.stdout:1/871: mkdir da/d21/db3/d59/d120/d72/d121 0
2026-03-10T12:38:12.002 INFO:tasks.workunit.client.0.vm00.stdout:5/895: rmdir d1f/d26/d2e/d58/d10c/d123/d5b 39
2026-03-10T12:38:12.002 INFO:tasks.workunit.client.0.vm00.stdout:1/872: fdatasync da/d21/f74 0
2026-03-10T12:38:12.007 INFO:tasks.workunit.client.0.vm00.stdout:8/772: getdents d0/d93/d2d/d49 0
2026-03-10T12:38:12.010 INFO:tasks.workunit.client.1.vm07.stdout:3/677: mknod dc/dd/d1f/ce7 0
2026-03-10T12:38:12.012 INFO:tasks.workunit.client.1.vm07.stdout:3/678: creat dc/d18/d24/fe8 x:0 0 0
2026-03-10T12:38:12.016 INFO:tasks.workunit.client.0.vm00.stdout:7/615: rmdir da/d3f 39
2026-03-10T12:38:12.027 INFO:tasks.workunit.client.1.vm07.stdout:7/618: dwrite d0/d47/f81 [0,4194304] 0
2026-03-10T12:38:12.030 INFO:tasks.workunit.client.1.vm07.stdout:6/632: write d1/d4/d6/d43/f90 [387138,74294] 0
2026-03-10T12:38:12.033 INFO:tasks.workunit.client.1.vm07.stdout:1/655: dwrite d9/df/d55/f6f [0,4194304] 0
2026-03-10T12:38:12.035 INFO:tasks.workunit.client.1.vm07.stdout:1/656: chown d9/df/f4a 0 1
2026-03-10T12:38:12.047 INFO:tasks.workunit.client.1.vm07.stdout:6/633: rename d1/d4/d6/d43/d88/d97/fa7 to d1/d4/d9b/fc8 0
2026-03-10T12:38:12.054 INFO:tasks.workunit.client.0.vm00.stdout:8/773: read d0/dd/d38/d81/f88 [180686,13220] 0
2026-03-10T12:38:12.055 INFO:tasks.workunit.client.1.vm07.stdout:6/634: unlink d1/d4/d6/d16/f63 0
2026-03-10T12:38:12.059 INFO:tasks.workunit.client.0.vm00.stdout:4/868: fdatasync df/d93/dbc/ff5 0
2026-03-10T12:38:12.060 INFO:tasks.workunit.client.1.vm07.stdout:6/635: read d1/d4/d6/f41 [2794749,80959] 0
2026-03-10T12:38:12.061 INFO:tasks.workunit.client.1.vm07.stdout:7/619: getdents d0/d61/db4 0
2026-03-10T12:38:12.071 INFO:tasks.workunit.client.0.vm00.stdout:7/616: dread f9 [0,4194304] 0
2026-03-10T12:38:12.075 INFO:tasks.workunit.client.0.vm00.stdout:8/774: fsync d0/d46/d6e/f70 0
2026-03-10T12:38:12.088 INFO:tasks.workunit.client.1.vm07.stdout:6/636: creat d1/fc9 x:0 0 0
2026-03-10T12:38:12.092 INFO:tasks.workunit.client.1.vm07.stdout:4/775: write d0/d4/d5/d34/f94 [1367999,82695] 0
2026-03-10T12:38:12.095 INFO:tasks.workunit.client.1.vm07.stdout:4/776: mknod d0/d4/df2/df6/c10f 0
2026-03-10T12:38:12.102 INFO:tasks.workunit.client.0.vm00.stdout:6/570: write d2/d16/f41 [448469,103616] 0
2026-03-10T12:38:12.102 INFO:tasks.workunit.client.1.vm07.stdout:2/585: write d0/d29/d64/d74/d88/f58 [1963237,43667] 0
2026-03-10T12:38:12.107 INFO:tasks.workunit.client.1.vm07.stdout:6/637: fsync d1/d4/d6/d43/d65/f76 0
2026-03-10T12:38:12.108 INFO:tasks.workunit.client.0.vm00.stdout:2/867: rename d4/f73 to d4/d6/d2d/d3a/d43/f116 0
2026-03-10T12:38:12.111 INFO:tasks.workunit.client.1.vm07.stdout:6/638: dwrite d1/d4/d6/d43/d65/f7f [0,4194304] 0
2026-03-10T12:38:12.112 INFO:tasks.workunit.client.0.vm00.stdout:8/775: dwrite d0/dd/d38/d81/fe2 [0,4194304] 0
2026-03-10T12:38:12.114 INFO:tasks.workunit.client.0.vm00.stdout:8/776: fdatasync d0/d93/d36/d51/fe0 0
2026-03-10T12:38:12.116 INFO:tasks.workunit.client.1.vm07.stdout:8/633: dwrite d1/d3/d6/d50/f5e [0,4194304] 0
2026-03-10T12:38:12.133 INFO:tasks.workunit.client.0.vm00.stdout:5/896: link d1f/d6a/d118/dcb/ld4 d1f/d26/d2b/d37/l13a 0
2026-03-10T12:38:12.135 INFO:tasks.workunit.client.1.vm07.stdout:2/586: rename d0/d42/d1f/d20/l4d to d0/d80/lc7 0
2026-03-10T12:38:12.142 INFO:tasks.workunit.client.0.vm00.stdout:1/873: rename da/d24/d28/d67/c46 to da/d24/d28/d67/da2/d78/dbe/c122 0
2026-03-10T12:38:12.142 INFO:tasks.workunit.client.1.vm07.stdout:6/639: mknod d1/d4/d4a/cca 0
2026-03-10T12:38:12.142 INFO:tasks.workunit.client.0.vm00.stdout:6/571: rename d2 to d2/d42/d9c/dcf 22
2026-03-10T12:38:12.142 INFO:tasks.workunit.client.0.vm00.stdout:6/572: dread - d2/da/f6a zero size
2026-03-10T12:38:12.152 INFO:tasks.workunit.client.1.vm07.stdout:6/640: symlink d1/d4/d6/d16/d1a/d2c/lcb 0
2026-03-10T12:38:12.159 INFO:tasks.workunit.client.0.vm00.stdout:6/573: link d2/da/dc/d94/lad d2/d14/ld0 0
2026-03-10T12:38:12.160 INFO:tasks.workunit.client.0.vm00.stdout:6/574: symlink d2/d14/ld1 0
2026-03-10T12:38:12.163 INFO:tasks.workunit.client.0.vm00.stdout:5/897: rename d1f/d96/l98 to d1f/d26/d2e/d58/d6b/d86/l13b 0
2026-03-10T12:38:12.164 INFO:tasks.workunit.client.0.vm00.stdout:5/898: chown d1f/d26/d2e/d58/d10c/d123/d72/f85 4 1
2026-03-10T12:38:12.165 INFO:tasks.workunit.client.0.vm00.stdout:5/899: symlink d1f/d26/d2e/d58/d10c/d123/dd6/l13c 0
2026-03-10T12:38:12.167 INFO:tasks.workunit.client.0.vm00.stdout:5/900: mkdir d1f/d26/d2b/d35/d78/d99/daf/d13d 0
2026-03-10T12:38:12.167 INFO:tasks.workunit.client.0.vm00.stdout:8/777: sync
2026-03-10T12:38:12.168 INFO:tasks.workunit.client.0.vm00.stdout:8/778: dread - d0/d93/d2d/d49/ff0 zero size
2026-03-10T12:38:12.169 INFO:tasks.workunit.client.1.vm07.stdout:8/634: sync
2026-03-10T12:38:12.169 INFO:tasks.workunit.client.0.vm00.stdout:5/901: mknod d1f/d26/d2b/d37/dcc/c13e 0
2026-03-10T12:38:12.170 INFO:tasks.workunit.client.0.vm00.stdout:5/902: truncate d1f/d26/d101/f128 1625229 0
2026-03-10T12:38:12.170 INFO:tasks.workunit.client.0.vm00.stdout:5/903: chown d1f/d26/l3d 360 1
2026-03-10T12:38:12.170 INFO:tasks.workunit.client.1.vm07.stdout:8/635: unlink d1/d3/c23 0
2026-03-10T12:38:12.171 INFO:tasks.workunit.client.0.vm00.stdout:3/857: write dd/d64/f7b [1560528,27319] 0
2026-03-10T12:38:12.172 INFO:tasks.workunit.client.1.vm07.stdout:8/636: getdents d1/d3/d18 0
2026-03-10T12:38:12.175 INFO:tasks.workunit.client.0.vm00.stdout:5/904: mknod d1f/d26/de3/d104/c13f 0
2026-03-10T12:38:12.178 INFO:tasks.workunit.client.0.vm00.stdout:5/905: dwrite d1f/d26/d2b/f44 [4194304,4194304] 0
2026-03-10T12:38:12.180 INFO:tasks.workunit.client.0.vm00.stdout:3/858: chown dd/d27/f44 6 1
2026-03-10T12:38:12.180 INFO:tasks.workunit.client.0.vm00.stdout:3/859: dread - dd/d2a/da2/de1/f10f zero size
2026-03-10T12:38:12.182 INFO:tasks.workunit.client.0.vm00.stdout:6/575: sync
2026-03-10T12:38:12.183 INFO:tasks.workunit.client.0.vm00.stdout:6/576: chown d2/d14/d7a/la0 59 1
2026-03-10T12:38:12.183 INFO:tasks.workunit.client.0.vm00.stdout:6/577: fsync d2/d14/d7a/f8d 0
2026-03-10T12:38:12.184 INFO:tasks.workunit.client.0.vm00.stdout:6/578: write d2/d14/d7a/db9/f9b [802371,10828] 0
2026-03-10T12:38:12.186 INFO:tasks.workunit.client.0.vm00.stdout:3/860: chown dd/d18/d13/d1d/dc6/d106/lb3 31829 1
2026-03-10T12:38:12.189 INFO:tasks.workunit.client.0.vm00.stdout:5/906: mknod d1f/c140 0
2026-03-10T12:38:12.190 INFO:tasks.workunit.client.0.vm00.stdout:6/579: mknod d2/d14/d7a/cd2 0
2026-03-10T12:38:12.198 INFO:tasks.workunit.client.0.vm00.stdout:9/881: dread d0/f1a [0,4194304] 0
2026-03-10T12:38:12.199 INFO:tasks.workunit.client.0.vm00.stdout:9/882: write d0/d3d/d59/d4e/dba/d19/f20 [1208456,119452] 0
2026-03-10T12:38:12.202 INFO:tasks.workunit.client.0.vm00.stdout:9/883: rename d0/d9b/cb4 to d0/d3d/d59/d4e/dba/d19/c13d 0
2026-03-10T12:38:12.227 INFO:tasks.workunit.client.0.vm00.stdout:4/869: dwrite df/d1f/d36/d3a/f44 [0,4194304] 0
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: Active manager daemon vm07.kfawlb restarted
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: Activating manager daemon vm07.kfawlb
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.? 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.kfawlb/crt"}]: dispatch
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.? 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.? 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.kfawlb/key"}]: dispatch
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.? 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: osdmap e41: 6 total, 6 up, 6 in
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: mgrmap e21: vm07.kfawlb(active, starting, since 0.0156241s)
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.wdwvcu"}]: dispatch
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.lnokoe"}]: dispatch
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.wznhgu"}]: dispatch
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.rhzwnr"}]: dispatch
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mgr metadata", "who": "vm07.kfawlb", "id": "vm07.kfawlb"}]: dispatch
2026-03-10T12:38:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-10T12:38:12.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-10T12:38:12.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T12:38:12.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T12:38:12.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-10T12:38:12.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-10T12:38:12.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-10T12:38:12.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-10T12:38:12.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:12 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-10T12:38:12.240 INFO:tasks.workunit.client.0.vm00.stdout:2/868: dwrite d4/d6/f4e [0,4194304] 0
2026-03-10T12:38:12.244 INFO:tasks.workunit.client.0.vm00.stdout:1/874: write da/d12/f64 [509926,29457] 0
2026-03-10T12:38:12.249 INFO:tasks.workunit.client.0.vm00.stdout:1/875: read da/d21/d27/f54 [4818774,127906] 0
2026-03-10T12:38:12.251 INFO:tasks.workunit.client.0.vm00.stdout:8/779: write d0/d93/d36/d5b/f69 [428788,66887] 0
2026-03-10T12:38:12.255 INFO:tasks.workunit.client.0.vm00.stdout:8/780: dwrite d0/d93/d17/d48/fc7 [0,4194304] 0
2026-03-10T12:38:12.259 INFO:tasks.workunit.client.0.vm00.stdout:5/907: write d1f/d26/d2e/d58/d10c/d123/fa7 [3597802,71186] 0
2026-03-10T12:38:12.262 INFO:tasks.workunit.client.0.vm00.stdout:3/861: dwrite dd/d2a/da2/de1/d45/f75 [0,4194304] 0
2026-03-10T12:38:12.262 INFO:tasks.workunit.client.0.vm00.stdout:5/908: dread d1f/d26/d2e/fba [0,4194304] 0
2026-03-10T12:38:12.264 INFO:tasks.workunit.client.0.vm00.stdout:5/909: chown d1f/d26/c130 26325692 1
2026-03-10T12:38:12.272 INFO:tasks.workunit.client.0.vm00.stdout:4/870: symlink df/d1f/l11d 0
2026-03-10T12:38:12.274 INFO:tasks.workunit.client.0.vm00.stdout:5/910: dread d1f/d26/d2e/f10b [0,4194304] 0
2026-03-10T12:38:12.277 INFO:tasks.workunit.client.0.vm00.stdout:2/869: dread - d4/dd/d63/fd4 zero size
2026-03-10T12:38:12.277 INFO:tasks.workunit.client.0.vm00.stdout:5/911: dwrite f19 [0,4194304] 0
2026-03-10T12:38:12.285 INFO:tasks.workunit.client.0.vm00.stdout:1/876: rename da/d21/db3/d59/d120/d80/fcc to da/d21/db3/d59/da6/da4/dda/dc0/dfe/f123 0
2026-03-10T12:38:12.287 INFO:tasks.workunit.client.0.vm00.stdout:8/781: creat d0/dd/d38/ff2 x:0 0 0
2026-03-10T12:38:12.289 INFO:tasks.workunit.client.0.vm00.stdout:3/862: chown
dd/d18/d14/c9d 302 1 2026-03-10T12:38:12.304 INFO:tasks.workunit.client.0.vm00.stdout:3/863: mkdir dd/d18/d14/d2b/d11c/d11f 0 2026-03-10T12:38:12.306 INFO:tasks.workunit.client.0.vm00.stdout:4/871: fsync df/d32/d64/f67 0 2026-03-10T12:38:12.308 INFO:tasks.workunit.client.0.vm00.stdout:2/870: mknod d4/d53/d76/d9b/c117 0 2026-03-10T12:38:12.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: Active manager daemon vm07.kfawlb restarted 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: Activating manager daemon vm07.kfawlb 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.? 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.kfawlb/crt"}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.? 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.? 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.kfawlb/key"}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.? 
192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: osdmap e41: 6 total, 6 up, 6 in 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: mgrmap e21: vm07.kfawlb(active, starting, since 0.0156241s) 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.wdwvcu"}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.lnokoe"}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.wznhgu"}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.rhzwnr"}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 
10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mgr metadata", "who": "vm07.kfawlb", "id": "vm07.kfawlb"}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' 
cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T12:38:12.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:12 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T12:38:12.319 INFO:tasks.workunit.client.0.vm00.stdout:9/884: dread d0/d7f/db8/dc4/f6c [0,4194304] 0 2026-03-10T12:38:12.319 INFO:tasks.workunit.client.0.vm00.stdout:3/864: creat dd/d3d/d8a/de0/d55/dfd/f120 x:0 0 0 2026-03-10T12:38:12.321 INFO:tasks.workunit.client.0.vm00.stdout:4/872: creat df/d1f/d36/dc6/f11e x:0 0 0 2026-03-10T12:38:12.321 INFO:tasks.workunit.client.0.vm00.stdout:4/873: write df/d63/ddb/ff8 [878849,42197] 0 2026-03-10T12:38:12.326 INFO:tasks.workunit.client.0.vm00.stdout:3/865: dread - dd/d18/d13/d99/da5/fdf zero size 2026-03-10T12:38:12.334 INFO:tasks.workunit.client.0.vm00.stdout:3/866: read - dd/d18/d13/d99/da5/fdf zero size 2026-03-10T12:38:12.334 INFO:tasks.workunit.client.0.vm00.stdout:0/731: truncate d3/d40/d65/f92 855956 0 2026-03-10T12:38:12.334 INFO:tasks.workunit.client.0.vm00.stdout:0/732: truncate d3/d7/f11 3967253 0 2026-03-10T12:38:12.334 INFO:tasks.workunit.client.1.vm07.stdout:0/730: dwrite d0/d14/d5f/fb3 [0,4194304] 0 2026-03-10T12:38:12.334 INFO:tasks.workunit.client.1.vm07.stdout:0/731: dread - d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/dd0/fe4 zero size 2026-03-10T12:38:12.334 INFO:tasks.workunit.client.1.vm07.stdout:0/732: readlink d0/l26 0 2026-03-10T12:38:12.334 INFO:tasks.workunit.client.1.vm07.stdout:0/733: chown d0/d14/d7c/fde 2 1 2026-03-10T12:38:12.337 INFO:tasks.workunit.client.0.vm00.stdout:0/733: chown d3/d22/f71 147117354 1 2026-03-10T12:38:12.341 INFO:tasks.workunit.client.0.vm00.stdout:4/874: rename df/d1f/d22/l4b to df/d1f/d22/d26/d65/da7/d10e/l11f 0 2026-03-10T12:38:12.341 INFO:tasks.workunit.client.1.vm07.stdout:0/734: truncate d0/d14/d5f/d76/d2f/d31/d4f/f61 174522 0 2026-03-10T12:38:12.342 INFO:tasks.workunit.client.1.vm07.stdout:0/735: 
write d0/d14/d5f/d76/d2f/d31/d79/d85/fc6 [1666822,20193] 0 2026-03-10T12:38:12.346 INFO:tasks.workunit.client.0.vm00.stdout:0/734: mkdir d3/d22/d3a/deb 0 2026-03-10T12:38:12.352 INFO:tasks.workunit.client.1.vm07.stdout:0/736: rename d0/d14/d5f/d76/d2f/d31/d79/dd7 to d0/d14/d5f/d41/d6a/d9a/df9 0 2026-03-10T12:38:12.352 INFO:tasks.workunit.client.0.vm00.stdout:0/735: read - d3/d7/d4c/d5b/d38/db3/fbb zero size 2026-03-10T12:38:12.353 INFO:tasks.workunit.client.0.vm00.stdout:0/736: read - d3/db/d77/d82/fce zero size 2026-03-10T12:38:12.355 INFO:tasks.workunit.client.1.vm07.stdout:9/744: truncate d5/d13/d57/d4f/d6a/f8a 434733 0 2026-03-10T12:38:12.357 INFO:tasks.workunit.client.1.vm07.stdout:5/706: dwrite d0/f1f [0,4194304] 0 2026-03-10T12:38:12.358 INFO:tasks.workunit.client.0.vm00.stdout:8/782: dwrite d0/d93/d36/f41 [0,4194304] 0 2026-03-10T12:38:12.360 INFO:tasks.workunit.client.1.vm07.stdout:5/707: write d0/d22/d18/d19/d2e/da9/ff9 [555455,6834] 0 2026-03-10T12:38:12.363 INFO:tasks.workunit.client.1.vm07.stdout:0/737: sync 2026-03-10T12:38:12.365 INFO:tasks.workunit.client.1.vm07.stdout:5/708: readlink d0/d22/d18/d19/l28 0 2026-03-10T12:38:12.365 INFO:tasks.workunit.client.1.vm07.stdout:5/709: readlink d0/d22/d18/d19/d21/d54/lc5 0 2026-03-10T12:38:12.366 INFO:tasks.workunit.client.1.vm07.stdout:0/738: mknod d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dc8/cfa 0 2026-03-10T12:38:12.367 INFO:tasks.workunit.client.1.vm07.stdout:5/710: creat d0/d22/d18/d19/d72/ffc x:0 0 0 2026-03-10T12:38:12.369 INFO:tasks.workunit.client.0.vm00.stdout:9/885: write d0/d3d/d59/f94 [59567,41666] 0 2026-03-10T12:38:12.369 INFO:tasks.workunit.client.1.vm07.stdout:0/739: sync 2026-03-10T12:38:12.371 INFO:tasks.workunit.client.1.vm07.stdout:0/740: stat d0/d14/d5f/d76/d2f/d31/d79/d9e/lac 0 2026-03-10T12:38:12.376 INFO:tasks.workunit.client.0.vm00.stdout:2/871: creat d4/d53/d9e/d101/f118 x:0 0 0 2026-03-10T12:38:12.377 INFO:tasks.workunit.client.0.vm00.stdout:2/872: chown d4/d6/d2d/d3a/fcb 2 1 
2026-03-10T12:38:12.379 INFO:tasks.workunit.client.0.vm00.stdout:8/783: fdatasync d0/d46/d89/f91 0 2026-03-10T12:38:12.389 INFO:tasks.workunit.client.0.vm00.stdout:3/867: link dd/d3d/d8a/de0/d55/dfd/l10d dd/d3d/d8a/l121 0 2026-03-10T12:38:12.389 INFO:tasks.workunit.client.0.vm00.stdout:3/868: truncate dd/d18/d14/d2b/d11c/f108 586421 0 2026-03-10T12:38:12.389 INFO:tasks.workunit.client.0.vm00.stdout:9/886: symlink d0/d7f/db8/dc4/db0/l13e 0 2026-03-10T12:38:12.391 INFO:tasks.workunit.client.0.vm00.stdout:2/873: dread - d4/d10f/fce zero size 2026-03-10T12:38:12.395 INFO:tasks.workunit.client.0.vm00.stdout:8/784: rename d0/d46/d6e to d0/dd/d38/d81/df3 0 2026-03-10T12:38:12.398 INFO:tasks.workunit.client.0.vm00.stdout:5/912: dread d1f/f97 [4194304,4194304] 0 2026-03-10T12:38:12.402 INFO:tasks.workunit.client.0.vm00.stdout:3/869: rmdir dd/d18/d13/d1d/dc6/d106 39 2026-03-10T12:38:12.403 INFO:tasks.workunit.client.0.vm00.stdout:3/870: write dd/d18/d13/f22 [4073011,85107] 0 2026-03-10T12:38:12.425 INFO:tasks.workunit.client.0.vm00.stdout:9/887: symlink d0/d9b/l13f 0 2026-03-10T12:38:12.434 INFO:tasks.workunit.client.0.vm00.stdout:2/874: mknod d4/d53/d9e/d101/c119 0 2026-03-10T12:38:12.434 INFO:tasks.workunit.client.0.vm00.stdout:2/875: chown d4/d6/d2d/d31/l4f 33 1 2026-03-10T12:38:12.446 INFO:tasks.workunit.client.0.vm00.stdout:4/875: dwrite df/f20 [0,4194304] 0 2026-03-10T12:38:12.450 INFO:tasks.workunit.client.0.vm00.stdout:5/913: mkdir d1f/d26/d2e/d58/d141 0 2026-03-10T12:38:12.456 INFO:tasks.workunit.client.0.vm00.stdout:8/785: dread d0/d93/d17/f67 [0,4194304] 0 2026-03-10T12:38:12.461 INFO:tasks.workunit.client.0.vm00.stdout:9/888: dread d0/d3d/d59/d4e/dba/d19/f20 [4194304,4194304] 0 2026-03-10T12:38:12.468 INFO:tasks.workunit.client.0.vm00.stdout:3/871: creat dd/d3d/d8a/de0/de4/dac/f122 x:0 0 0 2026-03-10T12:38:12.470 INFO:tasks.workunit.client.1.vm07.stdout:3/679: write dc/dd/f96 [3235775,70888] 0 2026-03-10T12:38:12.470 
INFO:tasks.workunit.client.1.vm07.stdout:3/680: write dc/d18/d24/f37 [2871013,64390] 0 2026-03-10T12:38:12.470 INFO:tasks.workunit.client.1.vm07.stdout:3/681: chown dc/dd/f19 43603040 1 2026-03-10T12:38:12.471 INFO:tasks.workunit.client.0.vm00.stdout:3/872: dwrite dd/d3d/d8a/f113 [0,4194304] 0 2026-03-10T12:38:12.485 INFO:tasks.workunit.client.0.vm00.stdout:2/876: dread d4/d53/d68/f69 [0,4194304] 0 2026-03-10T12:38:12.491 INFO:tasks.workunit.client.1.vm07.stdout:1/657: write d9/df/d29/d2b/d31/f7d [332135,27251] 0 2026-03-10T12:38:12.491 INFO:tasks.workunit.client.1.vm07.stdout:1/658: write d9/df/dc2/fa6 [1353368,49215] 0 2026-03-10T12:38:12.492 INFO:tasks.workunit.client.1.vm07.stdout:1/659: chown d9/f52 28 1 2026-03-10T12:38:12.493 INFO:tasks.workunit.client.1.vm07.stdout:1/660: chown d9/df/d29/d2b/d31/d91/l66 2 1 2026-03-10T12:38:12.498 INFO:tasks.workunit.client.1.vm07.stdout:7/620: dwrite d0/f27 [0,4194304] 0 2026-03-10T12:38:12.499 INFO:tasks.workunit.client.0.vm00.stdout:7/617: write da/d47/d87/fb3 [222358,32359] 0 2026-03-10T12:38:12.504 INFO:tasks.workunit.client.1.vm07.stdout:4/777: dwrite d0/d4/df2/df6/f50 [0,4194304] 0 2026-03-10T12:38:12.506 INFO:tasks.workunit.client.1.vm07.stdout:4/778: readlink d0/d4/d5/d78/dc5/le3 0 2026-03-10T12:38:12.511 INFO:tasks.workunit.client.0.vm00.stdout:7/618: fdatasync da/d25/f2b 0 2026-03-10T12:38:12.511 INFO:tasks.workunit.client.0.vm00.stdout:7/619: dread - da/d41/d7b/fb0 zero size 2026-03-10T12:38:12.523 INFO:tasks.workunit.client.0.vm00.stdout:4/876: rename df/d32/d76/c9b to df/d1f/d22/d26/dab/c120 0 2026-03-10T12:38:12.528 INFO:tasks.workunit.client.1.vm07.stdout:4/779: rename d0/d5c/d7c/ff9 to d0/d4/d5/d78/dc5/df7/db2/dd5/f110 0 2026-03-10T12:38:12.531 INFO:tasks.workunit.client.0.vm00.stdout:3/873: truncate dd/d2a/da2/db4/fdb 742601 0 2026-03-10T12:38:12.533 INFO:tasks.workunit.client.0.vm00.stdout:6/580: write d2/d16/f47 [5127383,64304] 0 2026-03-10T12:38:12.534 INFO:tasks.workunit.client.1.vm07.stdout:8/637: 
write d1/d3/d11/f90 [676283,123837] 0 2026-03-10T12:38:12.537 INFO:tasks.workunit.client.1.vm07.stdout:2/587: dwrite d0/d29/f32 [0,4194304] 0 2026-03-10T12:38:12.543 INFO:tasks.workunit.client.0.vm00.stdout:2/877: dwrite d4/dd/da7/fd2 [0,4194304] 0 2026-03-10T12:38:12.543 INFO:tasks.workunit.client.1.vm07.stdout:6/641: dwrite d1/d4/d6/d46/d4d/f22 [0,4194304] 0 2026-03-10T12:38:12.553 INFO:tasks.workunit.client.0.vm00.stdout:4/877: rename df/la6 to df/d1f/d36/d3a/d41/d111/l121 0 2026-03-10T12:38:12.554 INFO:tasks.workunit.client.0.vm00.stdout:4/878: rename df/d1f/d22/d26/d65/da7 to df/d1f/d22/d26/d65/da7/d10e/d122 22 2026-03-10T12:38:12.560 INFO:tasks.workunit.client.0.vm00.stdout:5/914: getdents d1f/d26/d2e/d58/d6b/deb 0 2026-03-10T12:38:12.560 INFO:tasks.workunit.client.0.vm00.stdout:5/915: readlink d1f/d26/l2d 0 2026-03-10T12:38:12.562 INFO:tasks.workunit.client.0.vm00.stdout:3/874: rmdir dd/d3d/d8a/de0/d55/dd3/df4 0 2026-03-10T12:38:12.566 INFO:tasks.workunit.client.0.vm00.stdout:5/916: creat d1f/d26/d2b/d37/db2/f142 x:0 0 0 2026-03-10T12:38:12.574 INFO:tasks.workunit.client.1.vm07.stdout:9/745: dwrite d5/d13/d57/f95 [0,4194304] 0 2026-03-10T12:38:12.575 INFO:tasks.workunit.client.1.vm07.stdout:9/746: fdatasync d5/d16/d23/d26/f5c 0 2026-03-10T12:38:12.579 INFO:tasks.workunit.client.0.vm00.stdout:1/877: dread da/fe0 [0,4194304] 0 2026-03-10T12:38:12.580 INFO:tasks.workunit.client.0.vm00.stdout:0/737: dwrite d3/d7/d4c/d5b/d38/db3/de2/f68 [0,4194304] 0 2026-03-10T12:38:12.580 INFO:tasks.workunit.client.1.vm07.stdout:5/711: dwrite d0/d22/d18/d3e/d53/faa [0,4194304] 0 2026-03-10T12:38:12.584 INFO:tasks.workunit.client.0.vm00.stdout:3/875: link dd/d2a/da2/de1/d38/f63 dd/d2a/da2/de1/d101/f123 0 2026-03-10T12:38:12.594 INFO:tasks.workunit.client.1.vm07.stdout:0/741: dwrite d0/d14/d5f/d3b/f5b [0,4194304] 0 2026-03-10T12:38:12.594 INFO:tasks.workunit.client.0.vm00.stdout:3/876: chown dd/d64/d93/l111 940 1 2026-03-10T12:38:12.594 
INFO:tasks.workunit.client.0.vm00.stdout:1/878: creat da/d21/db3/d59/d120/d72/d7e/f124 x:0 0 0 2026-03-10T12:38:12.595 INFO:tasks.workunit.client.0.vm00.stdout:7/620: mknod da/d3f/dd1/cda 0 2026-03-10T12:38:12.597 INFO:tasks.workunit.client.0.vm00.stdout:1/879: unlink da/d21/c5c 0 2026-03-10T12:38:12.598 INFO:tasks.workunit.client.0.vm00.stdout:6/581: mknod d2/d14/dc0/cd3 0 2026-03-10T12:38:12.603 INFO:tasks.workunit.client.0.vm00.stdout:3/877: truncate dd/d18/d14/fa0 367594 0 2026-03-10T12:38:12.605 INFO:tasks.workunit.client.0.vm00.stdout:4/879: sync 2026-03-10T12:38:12.606 INFO:tasks.workunit.client.0.vm00.stdout:1/880: symlink da/d24/d28/d67/da2/d78/dbe/l125 0 2026-03-10T12:38:12.608 INFO:tasks.workunit.client.1.vm07.stdout:2/588: rename d0/d42/d26/d7d/faa to d0/d42/d26/d7d/fc8 0 2026-03-10T12:38:12.610 INFO:tasks.workunit.client.0.vm00.stdout:4/880: dwrite df/d1f/d36/d3a/d41/fc7 [4194304,4194304] 0 2026-03-10T12:38:12.610 INFO:tasks.workunit.client.0.vm00.stdout:3/878: unlink dd/d18/d13/l68 0 2026-03-10T12:38:12.615 INFO:tasks.workunit.client.1.vm07.stdout:6/642: symlink d1/d4/d6/d43/d88/d97/lcc 0 2026-03-10T12:38:12.618 INFO:tasks.workunit.client.0.vm00.stdout:7/621: mknod da/d26/cdb 0 2026-03-10T12:38:12.620 INFO:tasks.workunit.client.0.vm00.stdout:1/881: sync 2026-03-10T12:38:12.627 INFO:tasks.workunit.client.0.vm00.stdout:0/738: rename d3/d7/d4c/d5b/d38/db3/de2/fa0 to d3/d40/fec 0 2026-03-10T12:38:12.628 INFO:tasks.workunit.client.0.vm00.stdout:0/739: chown d3/d7/d4c/d5b/d38/db3/fbb 1991 1 2026-03-10T12:38:12.628 INFO:tasks.workunit.client.1.vm07.stdout:9/747: creat d5/d1f/d7d/ffb x:0 0 0 2026-03-10T12:38:12.629 INFO:tasks.workunit.client.1.vm07.stdout:9/748: chown d5/d13/d22/c2d 1 1 2026-03-10T12:38:12.631 INFO:tasks.workunit.client.0.vm00.stdout:3/879: mknod dd/d18/d13/d1d/dc6/d106/c124 0 2026-03-10T12:38:12.636 INFO:tasks.workunit.client.0.vm00.stdout:0/740: mkdir d3/d7/d4c/dcc/ded 0 2026-03-10T12:38:12.638 
INFO:tasks.workunit.client.0.vm00.stdout:0/741: dwrite d3/db/d77/d82/fe9 [0,4194304] 0 2026-03-10T12:38:12.641 INFO:tasks.workunit.client.0.vm00.stdout:7/622: mknod da/d26/d37/dc7/cdc 0 2026-03-10T12:38:12.653 INFO:tasks.workunit.client.0.vm00.stdout:0/742: symlink d3/d22/lee 0 2026-03-10T12:38:12.661 INFO:tasks.workunit.client.1.vm07.stdout:2/589: symlink d0/d42/d1f/d20/lc9 0 2026-03-10T12:38:12.663 INFO:tasks.workunit.client.0.vm00.stdout:0/743: mknod d3/db/d77/cef 0 2026-03-10T12:38:12.664 INFO:tasks.workunit.client.1.vm07.stdout:6/643: rmdir d1/d4/d6/d4e/d64 39 2026-03-10T12:38:12.669 INFO:tasks.workunit.client.0.vm00.stdout:7/623: rename da/d41/d48/l5d to da/d26/d50/ldd 0 2026-03-10T12:38:12.672 INFO:tasks.workunit.client.0.vm00.stdout:0/744: truncate d3/d7/d4c/d5b/d38/db3/fcd 410661 0 2026-03-10T12:38:12.676 INFO:tasks.workunit.client.1.vm07.stdout:0/742: creat d0/d14/d5f/d76/d2f/d31/d4f/d9d/dd4/ffb x:0 0 0 2026-03-10T12:38:12.684 INFO:tasks.workunit.client.1.vm07.stdout:6/644: unlink d1/d4/d6/d4e/l85 0 2026-03-10T12:38:12.688 INFO:tasks.workunit.client.1.vm07.stdout:2/590: mkdir d0/d29/d64/db5/dbb/dca 0 2026-03-10T12:38:12.692 INFO:tasks.workunit.client.1.vm07.stdout:6/645: mknod d1/d4/d6/d43/ccd 0 2026-03-10T12:38:12.693 INFO:tasks.workunit.client.1.vm07.stdout:6/646: write d1/d4/d6/d16/fbc [4894546,112100] 0 2026-03-10T12:38:12.697 INFO:tasks.workunit.client.1.vm07.stdout:2/591: symlink d0/d29/d64/d6c/lcb 0 2026-03-10T12:38:12.697 INFO:tasks.workunit.client.1.vm07.stdout:2/592: chown d0/d42/d4e/daf 242295778 1 2026-03-10T12:38:12.700 INFO:tasks.workunit.client.0.vm00.stdout:8/786: write d0/f10 [2615332,66911] 0 2026-03-10T12:38:12.702 INFO:tasks.workunit.client.0.vm00.stdout:8/787: creat d0/d46/d89/ff4 x:0 0 0 2026-03-10T12:38:12.703 INFO:tasks.workunit.client.0.vm00.stdout:8/788: stat d0/dd/d38/d81/df3/lf1 0 2026-03-10T12:38:12.703 INFO:tasks.workunit.client.0.vm00.stdout:8/789: dread - d0/d46/d89/f91 zero size 2026-03-10T12:38:12.705 
INFO:tasks.workunit.client.0.vm00.stdout:8/790: creat d0/d93/d36/d7d/ff5 x:0 0 0 2026-03-10T12:38:12.706 INFO:tasks.workunit.client.1.vm07.stdout:6/647: truncate d1/d4/d6/d16/d1a/d33/f3c 1748760 0 2026-03-10T12:38:12.706 INFO:tasks.workunit.client.0.vm00.stdout:8/791: write d0/d93/d36/d5b/f69 [2589113,89692] 0 2026-03-10T12:38:12.708 INFO:tasks.workunit.client.0.vm00.stdout:9/889: write d0/d3d/d59/d4e/dba/d19/fb6 [836457,2233] 0 2026-03-10T12:38:12.709 INFO:tasks.workunit.client.0.vm00.stdout:8/792: rmdir d0/d46 39 2026-03-10T12:38:12.710 INFO:tasks.workunit.client.0.vm00.stdout:8/793: write d0/d93/d36/d51/fe0 [439236,30465] 0 2026-03-10T12:38:12.711 INFO:tasks.workunit.client.0.vm00.stdout:9/890: rename d0/d3d/d59/cd3 to d0/d3d/d125/c140 0 2026-03-10T12:38:12.718 INFO:tasks.workunit.client.0.vm00.stdout:8/794: symlink d0/d46/d7e/lf6 0 2026-03-10T12:38:12.719 INFO:tasks.workunit.client.0.vm00.stdout:9/891: rename d0/f1a to d0/d3d/d59/d4e/dba/d1e/dcb/f141 0 2026-03-10T12:38:12.719 INFO:tasks.workunit.client.0.vm00.stdout:8/795: write d0/d93/d2d/f44 [6499016,47739] 0 2026-03-10T12:38:12.721 INFO:tasks.workunit.client.0.vm00.stdout:9/892: mknod d0/d9b/c142 0 2026-03-10T12:38:12.721 INFO:tasks.workunit.client.0.vm00.stdout:9/893: chown d0/d7f/l90 1147901 1 2026-03-10T12:38:12.723 INFO:tasks.workunit.client.0.vm00.stdout:8/796: mknod d0/dd/d38/d81/df3/d9b/cf7 0 2026-03-10T12:38:12.725 INFO:tasks.workunit.client.1.vm07.stdout:2/593: getdents d0/d5b/d98 0 2026-03-10T12:38:12.726 INFO:tasks.workunit.client.1.vm07.stdout:2/594: write d0/d29/d64/d6c/f71 [30698,83655] 0 2026-03-10T12:38:12.726 INFO:tasks.workunit.client.0.vm00.stdout:8/797: creat d0/d93/d60/ff8 x:0 0 0 2026-03-10T12:38:12.726 INFO:tasks.workunit.client.0.vm00.stdout:9/894: getdents d0/d3d/d59/d4e/d104/d12d 0 2026-03-10T12:38:12.727 INFO:tasks.workunit.client.0.vm00.stdout:9/895: readlink d0/d3d/d59/d4e/dba/d1e/d27/lc8 0 2026-03-10T12:38:12.728 INFO:tasks.workunit.client.1.vm07.stdout:3/682: write 
dc/dd/d28/d7a/f88 [1114382,110935] 0 2026-03-10T12:38:12.730 INFO:tasks.workunit.client.0.vm00.stdout:9/896: fsync d0/d3d/d43/d53/f79 0 2026-03-10T12:38:12.731 INFO:tasks.workunit.client.1.vm07.stdout:1/661: write d9/f1a [1500411,83876] 0 2026-03-10T12:38:12.731 INFO:tasks.workunit.client.1.vm07.stdout:3/683: dread dc/d18/f34 [0,4194304] 0 2026-03-10T12:38:12.732 INFO:tasks.workunit.client.1.vm07.stdout:3/684: chown dc/dd/d28/d7a/d8e/f9b 7467 1 2026-03-10T12:38:12.735 INFO:tasks.workunit.client.1.vm07.stdout:7/621: write d0/d67/d6f/d80/fac [841780,11185] 0 2026-03-10T12:38:12.740 INFO:tasks.workunit.client.1.vm07.stdout:4/780: write d0/d4/d10/d3c/d2b/f60 [1595067,9538] 0 2026-03-10T12:38:12.746 INFO:tasks.workunit.client.0.vm00.stdout:8/798: creat d0/d93/d17/ff9 x:0 0 0 2026-03-10T12:38:12.746 INFO:tasks.workunit.client.0.vm00.stdout:8/799: readlink d0/d93/d2d/lbb 0 2026-03-10T12:38:12.748 INFO:tasks.workunit.client.0.vm00.stdout:8/800: creat d0/d93/d17/ffa x:0 0 0 2026-03-10T12:38:12.751 INFO:tasks.workunit.client.0.vm00.stdout:8/801: creat d0/d93/d2d/dc8/ffb x:0 0 0 2026-03-10T12:38:12.752 INFO:tasks.workunit.client.0.vm00.stdout:8/802: readlink d0/d58/lca 0 2026-03-10T12:38:12.752 INFO:tasks.workunit.client.0.vm00.stdout:8/803: stat d0/d93/d2d/d49 0 2026-03-10T12:38:12.753 INFO:tasks.workunit.client.0.vm00.stdout:8/804: dread - d0/d46/d89/ff4 zero size 2026-03-10T12:38:12.753 INFO:tasks.workunit.client.1.vm07.stdout:5/712: write d0/d22/d18/d3e/d5d/dcf/fd2 [877286,77192] 0 2026-03-10T12:38:12.763 INFO:tasks.workunit.client.0.vm00.stdout:6/582: dread d2/d51/f63 [0,4194304] 0 2026-03-10T12:38:12.765 INFO:tasks.workunit.client.1.vm07.stdout:8/638: dwrite d1/d3/d6c/fce [0,4194304] 0 2026-03-10T12:38:12.766 INFO:tasks.workunit.client.0.vm00.stdout:8/805: unlink d0/d93/d36/d5b/f69 0 2026-03-10T12:38:12.767 INFO:tasks.workunit.client.0.vm00.stdout:6/583: dwrite d2/d42/d80/fbd [0,4194304] 0 2026-03-10T12:38:12.769 INFO:tasks.workunit.client.1.vm07.stdout:1/662: mkdir 
d9/d2d/d4f/dde 0 2026-03-10T12:38:12.769 INFO:tasks.workunit.client.0.vm00.stdout:6/584: readlink d2/d42/l6b 0 2026-03-10T12:38:12.773 INFO:tasks.workunit.client.0.vm00.stdout:5/917: dwrite d1f/d26/d2b/d35/d78/d99/daf/fdb [0,4194304] 0 2026-03-10T12:38:12.780 INFO:tasks.workunit.client.1.vm07.stdout:7/622: read d0/f3f [4043467,27406] 0 2026-03-10T12:38:12.787 INFO:tasks.workunit.client.1.vm07.stdout:9/749: dwrite d5/d13/d57/d4f/d6a/f8e [0,4194304] 0 2026-03-10T12:38:12.794 INFO:tasks.workunit.client.1.vm07.stdout:4/781: symlink d0/d4/d10/d3c/d2b/l111 0 2026-03-10T12:38:12.795 INFO:tasks.workunit.client.0.vm00.stdout:5/918: dwrite d1f/d26/d2e/d58/d6b/d86/f12f [0,4194304] 0 2026-03-10T12:38:12.795 INFO:tasks.workunit.client.0.vm00.stdout:6/585: creat d2/d42/fd4 x:0 0 0 2026-03-10T12:38:12.795 INFO:tasks.workunit.client.0.vm00.stdout:5/919: chown d1f/d26/d2b/d37/f8a 1058277782 1 2026-03-10T12:38:12.795 INFO:tasks.workunit.client.0.vm00.stdout:8/806: creat d0/d93/d17/d48/ffc x:0 0 0 2026-03-10T12:38:12.795 INFO:tasks.workunit.client.0.vm00.stdout:1/882: write da/d24/d28/fdd [227233,52464] 0 2026-03-10T12:38:12.795 INFO:tasks.workunit.client.0.vm00.stdout:1/883: chown da/d24/d28/d67/db0 59445705 1 2026-03-10T12:38:12.797 INFO:tasks.workunit.client.0.vm00.stdout:4/881: dwrite df/d1f/d22/d26/f9c [0,4194304] 0 2026-03-10T12:38:12.798 INFO:tasks.workunit.client.0.vm00.stdout:8/807: dwrite d0/d93/d36/d7d/fed [0,4194304] 0 2026-03-10T12:38:12.801 INFO:tasks.workunit.client.0.vm00.stdout:7/624: dwrite da/d26/d50/d73/d89/fac [0,4194304] 0 2026-03-10T12:38:12.803 INFO:tasks.workunit.client.0.vm00.stdout:7/625: chown da/f10 2113 1 2026-03-10T12:38:12.814 INFO:tasks.workunit.client.0.vm00.stdout:5/920: chown d1f/d26/d2e/d58/d10c/d123/d5b/l83 339 1 2026-03-10T12:38:12.815 INFO:tasks.workunit.client.0.vm00.stdout:8/808: symlink d0/d46/d7e/lfd 0 2026-03-10T12:38:12.815 INFO:tasks.workunit.client.1.vm07.stdout:5/713: symlink d0/d22/d18/d19/d21/d54/dcb/db8/lfd 0 2026-03-10T12:38:12.815 
INFO:tasks.workunit.client.1.vm07.stdout:2/595: getdents d0/d42/d1f/dc0 0 2026-03-10T12:38:12.815 INFO:tasks.workunit.client.1.vm07.stdout:2/596: dwrite d0/d42/d26/f2e [4194304,4194304] 0 2026-03-10T12:38:12.815 INFO:tasks.workunit.client.1.vm07.stdout:2/597: dwrite d0/d42/d4e/d77/f89 [0,4194304] 0 2026-03-10T12:38:12.823 INFO:tasks.workunit.client.0.vm00.stdout:5/921: mknod d1f/d26/d2b/d35/d78/c143 0 2026-03-10T12:38:12.823 INFO:tasks.workunit.client.0.vm00.stdout:5/922: stat d1f/d26/d2b/fd0 0 2026-03-10T12:38:12.824 INFO:tasks.workunit.client.0.vm00.stdout:5/923: chown d1f/d26/d2e/d58/d10c/d123/d5b/l93 5363 1 2026-03-10T12:38:12.825 INFO:tasks.workunit.client.0.vm00.stdout:5/924: chown d1f/d26/d2b/d35/d78/d7f/lf2 9453924 1 2026-03-10T12:38:12.828 INFO:tasks.workunit.client.1.vm07.stdout:8/639: mkdir d1/d3/d6/d50/d70/dcf 0 2026-03-10T12:38:12.828 INFO:tasks.workunit.client.0.vm00.stdout:1/884: mkdir da/d12/d126 0 2026-03-10T12:38:12.829 INFO:tasks.workunit.client.0.vm00.stdout:1/885: write da/d24/d28/d67/fed [669421,92229] 0 2026-03-10T12:38:12.832 INFO:tasks.workunit.client.1.vm07.stdout:1/663: stat d9/df/c6a 0 2026-03-10T12:38:12.839 INFO:tasks.workunit.client.0.vm00.stdout:4/882: truncate df/d1f/d36/f51 154598 0 2026-03-10T12:38:12.842 INFO:tasks.workunit.client.1.vm07.stdout:1/664: dread d9/d2d/fcb [0,4194304] 0 2026-03-10T12:38:12.844 INFO:tasks.workunit.client.0.vm00.stdout:7/626: unlink da/cf 0 2026-03-10T12:38:12.847 INFO:tasks.workunit.client.1.vm07.stdout:7/623: truncate d0/d61/db4/f54 423216 0 2026-03-10T12:38:12.855 INFO:tasks.workunit.client.0.vm00.stdout:2/878: dread d4/d6/d2d/d31/f79 [0,4194304] 0 2026-03-10T12:38:12.859 INFO:tasks.workunit.client.0.vm00.stdout:7/627: chown da/d26/d50/d73/l7f 3844 1 2026-03-10T12:38:12.859 INFO:tasks.workunit.client.1.vm07.stdout:2/598: creat d0/d29/d64/d74/d75/fcc x:0 0 0 2026-03-10T12:38:12.860 INFO:tasks.workunit.client.1.vm07.stdout:2/599: write d0/d29/d64/d74/d88/f51 [3715949,49842] 0 2026-03-10T12:38:12.861 
INFO:tasks.workunit.client.0.vm00.stdout:8/809: dwrite d0/dd/f9a [4194304,4194304] 0 2026-03-10T12:38:12.865 INFO:tasks.workunit.client.1.vm07.stdout:8/640: creat d1/d3/d18/fd0 x:0 0 0 2026-03-10T12:38:12.869 INFO:tasks.workunit.client.0.vm00.stdout:1/886: creat da/d24/d5a/d71/d10c/f127 x:0 0 0 2026-03-10T12:38:12.872 INFO:tasks.workunit.client.0.vm00.stdout:7/628: creat da/d1b/d40/db6/fde x:0 0 0 2026-03-10T12:38:12.880 INFO:tasks.workunit.client.0.vm00.stdout:2/879: mknod d4/d53/d9e/d101/c11a 0 2026-03-10T12:38:12.883 INFO:tasks.workunit.client.0.vm00.stdout:4/883: symlink df/d1f/d36/d3a/l123 0 2026-03-10T12:38:12.890 INFO:tasks.workunit.client.0.vm00.stdout:0/745: dwrite d3/db/d77/d82/fce [0,4194304] 0 2026-03-10T12:38:12.893 INFO:tasks.workunit.client.1.vm07.stdout:0/743: dwrite d0/d14/f19 [4194304,4194304] 0 2026-03-10T12:38:12.898 INFO:tasks.workunit.client.0.vm00.stdout:5/925: creat d1f/d26/d2b/f144 x:0 0 0 2026-03-10T12:38:12.902 INFO:tasks.workunit.client.0.vm00.stdout:5/926: dwrite d1f/d26/d2b/d37/f81 [0,4194304] 0 2026-03-10T12:38:12.904 INFO:tasks.workunit.client.0.vm00.stdout:5/927: write d1f/d26/d2e/d58/d10c/d123/fa7 [3520604,96017] 0 2026-03-10T12:38:12.906 INFO:tasks.workunit.client.0.vm00.stdout:1/887: truncate da/d12/d26/f2e 175486 0 2026-03-10T12:38:12.908 INFO:tasks.workunit.client.1.vm07.stdout:6/648: truncate d1/d4/d6/d4e/d64/f6f 3723005 0 2026-03-10T12:38:12.916 INFO:tasks.workunit.client.0.vm00.stdout:3/880: write dd/f25 [68531,62446] 0 2026-03-10T12:38:12.917 INFO:tasks.workunit.client.0.vm00.stdout:3/881: write dd/d3d/d8a/f102 [214172,28690] 0 2026-03-10T12:38:12.920 INFO:tasks.workunit.client.0.vm00.stdout:4/884: creat df/d6c/f124 x:0 0 0 2026-03-10T12:38:12.924 INFO:tasks.workunit.client.1.vm07.stdout:9/750: symlink d5/d16/d23/lfc 0 2026-03-10T12:38:12.924 INFO:tasks.workunit.client.1.vm07.stdout:9/751: chown d5/d13/d22/f32 26 1 2026-03-10T12:38:12.927 INFO:tasks.workunit.client.0.vm00.stdout:7/629: dread da/d26/d37/d56/f9a [0,4194304] 0 
2026-03-10T12:38:12.927 INFO:tasks.workunit.client.0.vm00.stdout:7/630: chown da/d25/d2c/d82/d68/fcd 7132 1 2026-03-10T12:38:12.928 INFO:tasks.workunit.client.0.vm00.stdout:3/882: dread - dd/d18/d14/feb zero size 2026-03-10T12:38:12.930 INFO:tasks.workunit.client.0.vm00.stdout:4/885: mknod df/d1f/d22/d26/dab/d73/c125 0 2026-03-10T12:38:12.933 INFO:tasks.workunit.client.1.vm07.stdout:7/624: rename d0/d47/c82 to d0/d57/d62/d90/dce/cd1 0 2026-03-10T12:38:12.939 INFO:tasks.workunit.client.0.vm00.stdout:9/897: rmdir d0/d3d/d59 39 2026-03-10T12:38:12.945 INFO:tasks.workunit.client.0.vm00.stdout:8/810: getdents d0 0 2026-03-10T12:38:12.946 INFO:tasks.workunit.client.1.vm07.stdout:9/752: fdatasync d5/d13/d57/d4f/f88 0 2026-03-10T12:38:12.946 INFO:tasks.workunit.client.1.vm07.stdout:2/600: mkdir d0/dcd 0 2026-03-10T12:38:12.947 INFO:tasks.workunit.client.0.vm00.stdout:2/880: rename d4/dd/db9/d6d/c6a to d4/d53/d9e/c11b 0 2026-03-10T12:38:12.949 INFO:tasks.workunit.client.0.vm00.stdout:9/898: mkdir d0/d5/d143 0 2026-03-10T12:38:12.951 INFO:tasks.workunit.client.1.vm07.stdout:7/625: dwrite d0/d61/db4/f9e [0,4194304] 0 2026-03-10T12:38:12.968 INFO:tasks.workunit.client.1.vm07.stdout:0/744: symlink d0/d14/d5f/d3b/dbc/d8d/lfc 0 2026-03-10T12:38:12.968 INFO:tasks.workunit.client.1.vm07.stdout:8/641: creat d1/d3/d40/fd1 x:0 0 0 2026-03-10T12:38:12.968 INFO:tasks.workunit.client.0.vm00.stdout:5/928: getdents d1f/d26/de3 0 2026-03-10T12:38:12.968 INFO:tasks.workunit.client.0.vm00.stdout:1/888: getdents da/d21/db3/d59/d120/dab 0 2026-03-10T12:38:12.968 INFO:tasks.workunit.client.0.vm00.stdout:0/746: rename d3/d7/d4c/d5b/dc5/fe6 to d3/d22/d3a/deb/ff0 0 2026-03-10T12:38:12.968 INFO:tasks.workunit.client.0.vm00.stdout:1/889: dwrite da/d21/d27/d6a/f9e [0,4194304] 0 2026-03-10T12:38:12.968 INFO:tasks.workunit.client.0.vm00.stdout:0/747: dwrite d3/db/da4/fa7 [4194304,4194304] 0 2026-03-10T12:38:12.968 INFO:tasks.workunit.client.0.vm00.stdout:2/881: readlink d4/d6/d2d/de5/l110 0 
2026-03-10T12:38:12.968 INFO:tasks.workunit.client.1.vm07.stdout:7/626: creat d0/d67/d6f/d80/fd2 x:0 0 0 2026-03-10T12:38:12.976 INFO:tasks.workunit.client.0.vm00.stdout:0/748: truncate d3/d22/d3a/fd9 1516209 0 2026-03-10T12:38:12.979 INFO:tasks.workunit.client.1.vm07.stdout:8/642: truncate d1/d3/d6c/fc5 5067100 0 2026-03-10T12:38:12.981 INFO:tasks.workunit.client.1.vm07.stdout:3/685: write dc/d18/d24/f3f [2714315,97018] 0 2026-03-10T12:38:12.983 INFO:tasks.workunit.client.0.vm00.stdout:8/811: mkdir d0/dd/dfe 0 2026-03-10T12:38:12.988 INFO:tasks.workunit.client.0.vm00.stdout:5/929: creat d1f/d6a/d94/dc3/f145 x:0 0 0 2026-03-10T12:38:12.988 INFO:tasks.workunit.client.1.vm07.stdout:7/627: read d0/f20 [725291,39122] 0 2026-03-10T12:38:12.993 INFO:tasks.workunit.client.0.vm00.stdout:2/882: fdatasync d4/d6/d2d/d31/f46 0 2026-03-10T12:38:13.002 INFO:tasks.workunit.client.0.vm00.stdout:1/890: symlink da/d21/db3/d59/d120/d72/d121/l128 0 2026-03-10T12:38:13.004 INFO:tasks.workunit.client.0.vm00.stdout:1/891: write da/d21/db3/d59/da6/da4/dda/dc0/dfe/f107 [656788,109568] 0 2026-03-10T12:38:13.005 INFO:tasks.workunit.client.1.vm07.stdout:3/686: read dc/d18/f79 [72838,107350] 0 2026-03-10T12:38:13.008 INFO:tasks.workunit.client.1.vm07.stdout:7/628: dwrite d0/d61/db4/fc4 [0,4194304] 0 2026-03-10T12:38:13.021 INFO:tasks.workunit.client.1.vm07.stdout:0/745: link d0/d14/d5f/d76/d2f/d31/d4f/d9d/dd4/fd5 d0/d14/d5f/d76/d2f/d31/d79/ffd 0 2026-03-10T12:38:13.021 INFO:tasks.workunit.client.0.vm00.stdout:4/886: link df/d1f/d36/d3a/d41/fe0 df/f126 0 2026-03-10T12:38:13.021 INFO:tasks.workunit.client.0.vm00.stdout:8/812: rename d0/d93/d17/l4e to d0/d93/d36/d51/lff 0 2026-03-10T12:38:13.022 INFO:tasks.workunit.client.0.vm00.stdout:6/586: dwrite d2/d16/d29/f4c [0,4194304] 0 2026-03-10T12:38:13.026 INFO:tasks.workunit.client.1.vm07.stdout:7/629: truncate d0/d61/d79/f8d 236519 0 2026-03-10T12:38:13.026 INFO:tasks.workunit.client.1.vm07.stdout:0/746: dread - d0/d14/d5f/d76/d2f/d31/d79/d85/fcf 
zero size 2026-03-10T12:38:13.028 INFO:tasks.workunit.client.1.vm07.stdout:2/601: sync 2026-03-10T12:38:13.028 INFO:tasks.workunit.client.0.vm00.stdout:1/892: rmdir da/d24/d5a 39 2026-03-10T12:38:13.030 INFO:tasks.workunit.client.1.vm07.stdout:2/602: truncate d0/d29/d64/d6c/fb9 454603 0 2026-03-10T12:38:13.031 INFO:tasks.workunit.client.1.vm07.stdout:3/687: symlink dc/dd/d43/d76/d95/dde/le9 0 2026-03-10T12:38:13.039 INFO:tasks.workunit.client.0.vm00.stdout:2/883: mknod d4/d53/d76/d9b/dad/c11c 0 2026-03-10T12:38:13.040 INFO:tasks.workunit.client.0.vm00.stdout:9/899: link d0/d3d/d43/d53/d126/l12f d0/d3d/d59/d4e/dba/d1e/d85/d98/l144 0 2026-03-10T12:38:13.040 INFO:tasks.workunit.client.0.vm00.stdout:9/900: readlink d0/d9b/l13f 0 2026-03-10T12:38:13.042 INFO:tasks.workunit.client.0.vm00.stdout:8/813: fsync d0/d93/fa5 0 2026-03-10T12:38:13.044 INFO:tasks.workunit.client.1.vm07.stdout:7/630: dread d0/d57/d62/f75 [0,4194304] 0 2026-03-10T12:38:13.048 INFO:tasks.workunit.client.0.vm00.stdout:1/893: dread - da/d24/d28/fd6 zero size 2026-03-10T12:38:13.053 INFO:tasks.workunit.client.1.vm07.stdout:0/747: creat d0/d14/d5f/d76/d2f/ffe x:0 0 0 2026-03-10T12:38:13.053 INFO:tasks.workunit.client.1.vm07.stdout:0/748: write d0/d14/d5f/d76/d2f/ffe [76387,13627] 0 2026-03-10T12:38:13.053 INFO:tasks.workunit.client.1.vm07.stdout:0/749: chown d0/d14/d5f/d76/l43 1453981 1 2026-03-10T12:38:13.053 INFO:tasks.workunit.client.1.vm07.stdout:2/603: creat d0/d80/d93/fce x:0 0 0 2026-03-10T12:38:13.053 INFO:tasks.workunit.client.0.vm00.stdout:2/884: rmdir d4/d10f 39 2026-03-10T12:38:13.059 INFO:tasks.workunit.client.1.vm07.stdout:7/631: rmdir d0/d57/d62 39 2026-03-10T12:38:13.060 INFO:tasks.workunit.client.0.vm00.stdout:8/814: creat d0/dd/d38/f100 x:0 0 0 2026-03-10T12:38:13.065 INFO:tasks.workunit.client.0.vm00.stdout:5/930: getdents d1f/d26/de3/db7 0 2026-03-10T12:38:13.072 INFO:tasks.workunit.client.0.vm00.stdout:1/894: mkdir da/d21/d39/d129 0 2026-03-10T12:38:13.073 
INFO:tasks.workunit.client.1.vm07.stdout:7/632: rename d0/d67/d6f/l8c to d0/d61/ld3 0 2026-03-10T12:38:13.075 INFO:tasks.workunit.client.1.vm07.stdout:3/688: link dc/d18/d99/da3/fb1 dc/dd/d1f/d45/fea 0 2026-03-10T12:38:13.076 INFO:tasks.workunit.client.1.vm07.stdout:3/689: stat dc/d18/d24/fe3 0 2026-03-10T12:38:13.082 INFO:tasks.workunit.client.0.vm00.stdout:8/815: dread d0/d93/fa5 [0,4194304] 0 2026-03-10T12:38:13.088 INFO:tasks.workunit.client.0.vm00.stdout:8/816: chown d0/d58/d68 4574997 1 2026-03-10T12:38:13.088 INFO:tasks.workunit.client.1.vm07.stdout:3/690: truncate dc/dd/d28/d7a/f7f 304850 0 2026-03-10T12:38:13.088 INFO:tasks.workunit.client.1.vm07.stdout:7/633: rmdir d0/d57/d62/d90/dce 39 2026-03-10T12:38:13.088 INFO:tasks.workunit.client.1.vm07.stdout:3/691: dwrite dc/d18/d2d/f71 [0,4194304] 0 2026-03-10T12:38:13.088 INFO:tasks.workunit.client.1.vm07.stdout:3/692: readlink dc/dd/d1f/l23 0 2026-03-10T12:38:13.088 INFO:tasks.workunit.client.1.vm07.stdout:0/750: sync 2026-03-10T12:38:13.096 INFO:tasks.workunit.client.1.vm07.stdout:7/634: mkdir d0/d47/da0/dd4 0 2026-03-10T12:38:13.098 INFO:tasks.workunit.client.1.vm07.stdout:3/693: creat dc/dd/d43/feb x:0 0 0 2026-03-10T12:38:13.099 INFO:tasks.workunit.client.1.vm07.stdout:3/694: read dc/d18/d2d/f71 [3560607,129386] 0 2026-03-10T12:38:13.100 INFO:tasks.workunit.client.1.vm07.stdout:0/751: creat d0/d14/d5f/d76/d2f/d31/d79/d85/fff x:0 0 0 2026-03-10T12:38:13.101 INFO:tasks.workunit.client.0.vm00.stdout:2/885: truncate d4/d6/f89 1064375 0 2026-03-10T12:38:13.107 INFO:tasks.workunit.client.1.vm07.stdout:7/635: creat d0/d57/d62/d90/fd5 x:0 0 0 2026-03-10T12:38:13.108 INFO:tasks.workunit.client.1.vm07.stdout:0/752: getdents d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65 0 2026-03-10T12:38:13.112 INFO:tasks.workunit.client.1.vm07.stdout:0/753: dwrite d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/fbf [0,4194304] 0 2026-03-10T12:38:13.113 INFO:tasks.workunit.client.1.vm07.stdout:0/754: write 
d0/d14/d5f/d76/d2f/d31/d4f/d9d/dd4/ffb [318728,28477] 0 2026-03-10T12:38:13.114 INFO:tasks.workunit.client.1.vm07.stdout:0/755: dread - d0/d14/d5f/d41/d6a/fe0 zero size 2026-03-10T12:38:13.115 INFO:tasks.workunit.client.1.vm07.stdout:0/756: write d0/d14/d5f/d76/d2f/d31/d4f/f92 [2340712,46311] 0 2026-03-10T12:38:13.127 INFO:tasks.workunit.client.1.vm07.stdout:0/757: sync 2026-03-10T12:38:13.128 INFO:tasks.workunit.client.1.vm07.stdout:0/758: stat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/fbf 0 2026-03-10T12:38:13.129 INFO:tasks.workunit.client.1.vm07.stdout:0/759: stat d0/d14/d5f/d41 0 2026-03-10T12:38:13.130 INFO:tasks.workunit.client.1.vm07.stdout:0/760: truncate d0/d14/d5f/d76/d93/fdf 386156 0 2026-03-10T12:38:13.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.131+0000 7fc256ffd700 1 -- 192.168.123.100:0/1897598474 <== mon.1 v2:192.168.123.107:3300/0 8 ==== mgrmap(e 22) v1 ==== 50027+0+0 (secure 0 0 0) 0x7fc25c052fe0 con 0x7fc260071980 2026-03-10T12:38:13.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.131+0000 7fc256ffd700 1 --2- 192.168.123.100:0/1897598474 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fc24c041c20 0x7fc24c044000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:13.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.131+0000 7fc256ffd700 1 -- 192.168.123.100:0/1897598474 --> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc24c044540 con 0x7fc24c041c20 2026-03-10T12:38:13.134 INFO:tasks.workunit.client.1.vm07.stdout:0/761: mknod d0/d14/d5f/d41/d6a/c100 0 2026-03-10T12:38:13.136 INFO:tasks.workunit.client.1.vm07.stdout:0/762: creat d0/d14/d5f/d76/d2f/d31/d79/d9e/f101 x:0 0 0 2026-03-10T12:38:13.138 INFO:tasks.workunit.client.1.vm07.stdout:0/763: fdatasync d0/d14/d5f/d76/d2f/d31/d79/d85/fb5 0 
2026-03-10T12:38:13.141 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.141+0000 7fc26557f700 1 --2- 192.168.123.100:0/1897598474 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fc24c041c20 0x7fc24c044000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:13.141 INFO:tasks.workunit.client.0.vm00.stdout:8/817: unlink d0/d93/d17/db1/dde/fd2 0 2026-03-10T12:38:13.142 INFO:tasks.workunit.client.0.vm00.stdout:9/901: rmdir d0/d7f/db8/df9 0 2026-03-10T12:38:13.142 INFO:tasks.workunit.client.0.vm00.stdout:2/886: unlink d4/dd/db9/l56 0 2026-03-10T12:38:13.144 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.144+0000 7fc26557f700 1 --2- 192.168.123.100:0/1897598474 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fc24c041c20 0x7fc24c044000 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fc258009200 tx=0x7fc25800c960 comp rx=0 tx=0).ready entity=mgr.24461 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:13.148 INFO:tasks.workunit.client.1.vm07.stdout:4/782: write d0/d4/df2/df6/d46/d76/fa2 [5007150,64579] 0 2026-03-10T12:38:13.151 INFO:tasks.workunit.client.1.vm07.stdout:4/783: mknod d0/d5c/d7c/c112 0 2026-03-10T12:38:13.152 INFO:tasks.workunit.client.1.vm07.stdout:0/764: creat d0/d14/d5f/d41/d6a/f102 x:0 0 0 2026-03-10T12:38:13.162 INFO:tasks.workunit.client.0.vm00.stdout:2/887: mkdir d4/d6/de7/d11d 0 2026-03-10T12:38:13.162 INFO:tasks.workunit.client.0.vm00.stdout:2/888: chown d4/d6/d93/dc6/cc7 25714567 1 2026-03-10T12:38:13.163 INFO:tasks.workunit.client.1.vm07.stdout:0/765: creat d0/d14/d5f/d76/d2f/d31/d4f/d9d/f103 x:0 0 0 2026-03-10T12:38:13.169 INFO:tasks.workunit.client.1.vm07.stdout:0/766: rmdir d0/d14/d5f/d76/d2f/d31/d4f/da8/df3 0 2026-03-10T12:38:13.178 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.166+0000 7fc256ffd700 1 -- 
192.168.123.100:0/1897598474 <== mgr.24461 v2:192.168.123.107:6828/3729807627 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+318 (secure 0 0 0) 0x7fc24c044540 con 0x7fc24c041c20 2026-03-10T12:38:13.178 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.169+0000 7fc254ff9700 1 -- 192.168.123.100:0/1897598474 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fc24c041c20 msgr2=0x7fc24c044000 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:13.178 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.169+0000 7fc254ff9700 1 --2- 192.168.123.100:0/1897598474 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fc24c041c20 0x7fc24c044000 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fc258009200 tx=0x7fc25800c960 comp rx=0 tx=0).stop 2026-03-10T12:38:13.178 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.169+0000 7fc254ff9700 1 -- 192.168.123.100:0/1897598474 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc260071980 msgr2=0x7fc260082550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:13.178 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.169+0000 7fc254ff9700 1 --2- 192.168.123.100:0/1897598474 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc260071980 0x7fc260082550 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fc25c00bfd0 tx=0x7fc25c009d70 comp rx=0 tx=0).stop 2026-03-10T12:38:13.178 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.169+0000 7fc254ff9700 1 -- 192.168.123.100:0/1897598474 shutdown_connections 2026-03-10T12:38:13.178 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.169+0000 7fc254ff9700 1 --2- 192.168.123.100:0/1897598474 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fc24c041c20 0x7fc24c044000 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:38:13.179 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.169+0000 7fc254ff9700 1 --2- 192.168.123.100:0/1897598474 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc260071980 0x7fc260082550 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:13.179 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.169+0000 7fc254ff9700 1 --2- 192.168.123.100:0/1897598474 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc260082a90 0x7fc260082f00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:13.179 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.169+0000 7fc254ff9700 1 -- 192.168.123.100:0/1897598474 >> 192.168.123.100:0/1897598474 conn(0x7fc26006d1a0 msgr2=0x7fc260076520 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:13.179 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.169+0000 7fc254ff9700 1 -- 192.168.123.100:0/1897598474 shutdown_connections 2026-03-10T12:38:13.179 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.169+0000 7fc254ff9700 1 -- 192.168.123.100:0/1897598474 wait complete. 
2026-03-10T12:38:13.179 INFO:tasks.workunit.client.0.vm00.stdout:8/818: dread d0/d93/d2d/fba [0,4194304] 0 2026-03-10T12:38:13.196 INFO:tasks.workunit.client.0.vm00.stdout:3/883: write dd/d18/d13/d1d/dc6/d106/f9c [1188335,108665] 0 2026-03-10T12:38:13.199 INFO:tasks.workunit.client.1.vm07.stdout:5/714: dwrite d0/d22/d18/fb4 [0,4194304] 0 2026-03-10T12:38:13.204 INFO:tasks.workunit.client.1.vm07.stdout:1/665: write d9/d2d/d4f/d75/fab [528729,54717] 0 2026-03-10T12:38:13.205 INFO:tasks.workunit.client.1.vm07.stdout:1/666: write d9/d2d/d4f/d75/f83 [6539609,65667] 0 2026-03-10T12:38:13.206 INFO:tasks.workunit.client.0.vm00.stdout:8/819: unlink d0/dd/l5d 0 2026-03-10T12:38:13.207 INFO:tasks.workunit.client.0.vm00.stdout:8/820: write d0/d93/d17/d48/fc7 [2592575,64077] 0 2026-03-10T12:38:13.213 INFO:tasks.workunit.client.1.vm07.stdout:1/667: sync 2026-03-10T12:38:13.217 INFO:tasks.workunit.client.0.vm00.stdout:5/931: write d1f/d6a/d94/dc9/d106/f109 [910906,2309] 0 2026-03-10T12:38:13.221 INFO:tasks.workunit.client.0.vm00.stdout:3/884: getdents dd/d27/d2c 0 2026-03-10T12:38:13.228 INFO:tasks.workunit.client.1.vm07.stdout:4/784: dread d0/d4/d10/fc7 [0,4194304] 0 2026-03-10T12:38:13.229 INFO:tasks.workunit.client.0.vm00.stdout:5/932: rename d1f/d26/d2e/d58/d10c/d123/d72/c7b to d1f/d26/d2e/d58/d6b/deb/c146 0 2026-03-10T12:38:13.233 INFO:tasks.workunit.client.1.vm07.stdout:4/785: getdents d0/d4/df2/df6/d46/d76 0 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: Manager daemon vm07.kfawlb is now available 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: Migrating agent root cert to cert store 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: 
Migrating agent root key to cert store 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: Checking for cert/key for grafana.vm00 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: Migrating grafana.vm00 cert to cert store 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: Migrating grafana.vm00 key to cert store 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.kfawlb/mirror_snapshot_schedule"}]: dispatch 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: 
from='mgr.24461 ' entity='mgr.vm07.kfawlb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.kfawlb/mirror_snapshot_schedule"}]: dispatch 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.kfawlb/trash_purge_schedule"}]: dispatch 2026-03-10T12:38:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:13 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.kfawlb/trash_purge_schedule"}]: dispatch 2026-03-10T12:38:13.237 INFO:tasks.workunit.client.1.vm07.stdout:4/786: unlink d0/d4/d5/d8f/ce4 0 2026-03-10T12:38:13.237 INFO:tasks.workunit.client.0.vm00.stdout:1/895: truncate da/d12/d26/dd2/ff9 1411289 0 2026-03-10T12:38:13.238 INFO:tasks.workunit.client.1.vm07.stdout:4/787: write d0/d4/d10/d5f/d6d/f103 [2562577,116552] 0 2026-03-10T12:38:13.244 INFO:tasks.workunit.client.0.vm00.stdout:9/902: write d0/d3d/d43/f68 [649596,84877] 0 2026-03-10T12:38:13.251 INFO:tasks.workunit.client.0.vm00.stdout:7/631: dwrite da/f17 [0,4194304] 0 2026-03-10T12:38:13.253 INFO:tasks.workunit.client.0.vm00.stdout:2/889: dwrite d4/d10f/fce [0,4194304] 0 2026-03-10T12:38:13.256 INFO:tasks.workunit.client.0.vm00.stdout:3/885: rename dd/d18/d14 to dd/d3d/d8a/de0/d55/dfd/d125 0 2026-03-10T12:38:13.262 INFO:tasks.workunit.client.0.vm00.stdout:7/632: mkdir da/d26/d37/d56/ddf 0 2026-03-10T12:38:13.262 INFO:tasks.workunit.client.0.vm00.stdout:7/633: fsync da/d41/d48/d81/fcc 0 2026-03-10T12:38:13.263 INFO:tasks.workunit.client.0.vm00.stdout:7/634: chown da/d26/d50/d73/lbe 326 1 2026-03-10T12:38:13.270 INFO:tasks.workunit.client.0.vm00.stdout:7/635: dwrite da/d1b/f39 [0,4194304] 0 2026-03-10T12:38:13.272 INFO:tasks.workunit.client.0.vm00.stdout:7/636: readlink da/d26/d50/d73/lbe 0 
2026-03-10T12:38:13.272 INFO:tasks.workunit.client.0.vm00.stdout:5/933: mknod d1f/d6a/d118/c147 0 2026-03-10T12:38:13.286 INFO:tasks.workunit.client.0.vm00.stdout:3/886: dread - dd/d18/d13/d1d/f86 zero size 2026-03-10T12:38:13.287 INFO:tasks.workunit.client.0.vm00.stdout:5/934: dwrite d1f/d26/d2b/d35/fad [4194304,4194304] 0 2026-03-10T12:38:13.291 INFO:tasks.workunit.client.0.vm00.stdout:3/887: dwrite dd/d3d/d8a/f113 [0,4194304] 0 2026-03-10T12:38:13.296 INFO:tasks.workunit.client.0.vm00.stdout:9/903: sync 2026-03-10T12:38:13.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: Manager daemon vm07.kfawlb is now available 2026-03-10T12:38:13.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: Migrating agent root cert to cert store 2026-03-10T12:38:13.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:13.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: Migrating agent root key to cert store 2026-03-10T12:38:13.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:13.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: Checking for cert/key for grafana.vm00 2026-03-10T12:38:13.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: Migrating grafana.vm00 cert to cert store 2026-03-10T12:38:13.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:13.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: Migrating grafana.vm00 key to cert store 2026-03-10T12:38:13.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: 
from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:13.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:13.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:13.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:38:13.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.kfawlb/mirror_snapshot_schedule"}]: dispatch 2026-03-10T12:38:13.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.kfawlb/mirror_snapshot_schedule"}]: dispatch 2026-03-10T12:38:13.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.kfawlb/trash_purge_schedule"}]: dispatch 2026-03-10T12:38:13.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:13 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm07.kfawlb/trash_purge_schedule"}]: dispatch 2026-03-10T12:38:13.317 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.316+0000 7f11baab8700 1 -- 192.168.123.100:0/3620248782 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7f11b4071980 msgr2=0x7f11b4071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:13.317 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.316+0000 7f11baab8700 1 --2- 192.168.123.100:0/3620248782 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f11b4071980 0x7f11b4071d90 secure :-1 s=READY pgs=341 cs=0 l=1 rev1=1 crypto rx=0x7f11a4007780 tx=0x7f11a400c050 comp rx=0 tx=0).stop 2026-03-10T12:38:13.317 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.317+0000 7f11baab8700 1 -- 192.168.123.100:0/3620248782 shutdown_connections 2026-03-10T12:38:13.317 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.317+0000 7f11baab8700 1 --2- 192.168.123.100:0/3620248782 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f11b4072360 0x7f11b40770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:13.318 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.317+0000 7f11baab8700 1 --2- 192.168.123.100:0/3620248782 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f11b4071980 0x7f11b4071d90 unknown :-1 s=CLOSED pgs=341 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:13.318 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.317+0000 7f11baab8700 1 -- 192.168.123.100:0/3620248782 >> 192.168.123.100:0/3620248782 conn(0x7f11b406d1a0 msgr2=0x7f11b406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:13.318 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.317+0000 7f11baab8700 1 -- 192.168.123.100:0/3620248782 shutdown_connections 2026-03-10T12:38:13.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.317+0000 7f11baab8700 1 -- 192.168.123.100:0/3620248782 wait complete. 
2026-03-10T12:38:13.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.317+0000 7f11baab8700 1 Processor -- start 2026-03-10T12:38:13.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.317+0000 7f11baab8700 1 -- start start 2026-03-10T12:38:13.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.317+0000 7f11baab8700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f11b4072360 0x7f11b41312f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:13.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.317+0000 7f11baab8700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f11b4131830 0x7f11b407f4b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:13.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.317+0000 7f11baab8700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f11b4131d30 con 0x7f11b4131830 2026-03-10T12:38:13.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.317+0000 7f11baab8700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f11b4131ea0 con 0x7f11b4072360 2026-03-10T12:38:13.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.318+0000 7f11b8854700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f11b4072360 0x7f11b41312f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.318+0000 7f11b8854700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f11b4072360 0x7f11b41312f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:60978/0 (socket says 192.168.123.100:60978) 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.318+0000 7f11b8854700 1 -- 192.168.123.100:0/2801272233 learned_addr learned my addr 192.168.123.100:0/2801272233 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.319+0000 7f11b8854700 1 -- 192.168.123.100:0/2801272233 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f11b4131830 msgr2=0x7f11b407f4b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.319+0000 7f11b8854700 1 --2- 192.168.123.100:0/2801272233 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f11b4131830 0x7f11b407f4b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.319+0000 7f11b8854700 1 -- 192.168.123.100:0/2801272233 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f11a4007430 con 0x7f11b4072360 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.319+0000 7f11b8854700 1 --2- 192.168.123.100:0/2801272233 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f11b4072360 0x7f11b41312f0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f11a4007fd0 tx=0x7f11a400da70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.319+0000 7f11b1ffb700 1 -- 192.168.123.100:0/2801272233 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f11a400f040 con 0x7f11b4072360 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.319+0000 7f11baab8700 1 -- 
192.168.123.100:0/2801272233 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f11b407f9f0 con 0x7f11b4072360 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.319+0000 7f11baab8700 1 -- 192.168.123.100:0/2801272233 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f11b407feb0 con 0x7f11b4072360 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.320+0000 7f11b1ffb700 1 -- 192.168.123.100:0/2801272233 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f11a400a5b0 con 0x7f11b4072360 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.320+0000 7f11b1ffb700 1 -- 192.168.123.100:0/2801272233 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f11a40085c0 con 0x7f11b4072360 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.321+0000 7f11b1ffb700 1 -- 192.168.123.100:0/2801272233 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 22) v1 ==== 50027+0+0 (secure 0 0 0) 0x7f11a401a070 con 0x7f11b4072360 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.323+0000 7f11b1ffb700 1 --2- 192.168.123.100:0/2801272233 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f119c03db20 0x7f119c03ffd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.323+0000 7f11b1ffb700 1 -- 192.168.123.100:0/2801272233 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f11a4054640 con 0x7f11b4072360 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.323+0000 7f11b3fff700 1 --2- 192.168.123.100:0/2801272233 >> 
[v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f119c03db20 0x7f119c03ffd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.323+0000 7f11baab8700 1 -- 192.168.123.100:0/2801272233 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f11a0005320 con 0x7f11b4072360 2026-03-10T12:38:13.325 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.324+0000 7f11b3fff700 1 --2- 192.168.123.100:0/2801272233 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f119c03db20 0x7f119c03ffd0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f11b4072ff0 tx=0x7f11ac009250 comp rx=0 tx=0).ready entity=mgr.24461 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:13.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.328+0000 7f11b1ffb700 1 -- 192.168.123.100:0/2801272233 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f11a40037a0 con 0x7f11b4072360 2026-03-10T12:38:13.340 INFO:tasks.workunit.client.0.vm00.stdout:5/935: creat d1f/d26/d2e/d58/d10c/d123/d5b/dd1/f148 x:0 0 0 2026-03-10T12:38:13.361 INFO:tasks.workunit.client.0.vm00.stdout:2/890: dwrite d4/d6/d2d/d3a/f44 [0,4194304] 0 2026-03-10T12:38:13.368 INFO:tasks.workunit.client.0.vm00.stdout:2/891: read - d4/dd/d63/f83 zero size 2026-03-10T12:38:13.378 INFO:tasks.workunit.client.0.vm00.stdout:9/904: dread d0/d3d/d43/d53/fa5 [0,4194304] 0 2026-03-10T12:38:13.381 INFO:tasks.workunit.client.1.vm07.stdout:6/649: write d1/d4/d9b/fc8 [826909,130912] 0 2026-03-10T12:38:13.382 INFO:tasks.workunit.client.1.vm07.stdout:6/650: mknod d1/d4/d6/d53/cce 0 2026-03-10T12:38:13.385 
INFO:tasks.workunit.client.1.vm07.stdout:6/651: link d1/d4/d6/f91 d1/fcf 0 2026-03-10T12:38:13.387 INFO:tasks.workunit.client.1.vm07.stdout:6/652: truncate d1/d4/d6/d16/d1a/f6a 490009 0 2026-03-10T12:38:13.390 INFO:tasks.workunit.client.0.vm00.stdout:3/888: write dd/d18/f7c [2612458,20254] 0 2026-03-10T12:38:13.391 INFO:tasks.workunit.client.0.vm00.stdout:3/889: fsync dd/d64/fc2 0 2026-03-10T12:38:13.394 INFO:tasks.workunit.client.0.vm00.stdout:5/936: getdents d1f/d26/d2e/d58/d141 0 2026-03-10T12:38:13.398 INFO:tasks.workunit.client.0.vm00.stdout:2/892: stat d4/dd/fe6 0 2026-03-10T12:38:13.403 INFO:tasks.workunit.client.1.vm07.stdout:9/753: write d5/d13/d2c/de6/fad [2991183,3468] 0 2026-03-10T12:38:13.410 INFO:tasks.workunit.client.0.vm00.stdout:7/637: read da/d26/d50/d73/fce [2887217,25420] 0 2026-03-10T12:38:13.411 INFO:tasks.workunit.client.1.vm07.stdout:8/643: write d1/d3/f57 [3941504,19766] 0 2026-03-10T12:38:13.411 INFO:tasks.workunit.client.0.vm00.stdout:0/749: write d3/d7/d4c/d5b/f2a [1534035,102874] 0 2026-03-10T12:38:13.412 INFO:tasks.workunit.client.1.vm07.stdout:8/644: mkdir d1/d3/d6/d54/dd2 0 2026-03-10T12:38:13.412 INFO:tasks.workunit.client.1.vm07.stdout:8/645: chown d1/d3/d6c/lb1 51145 1 2026-03-10T12:38:13.416 INFO:tasks.workunit.client.0.vm00.stdout:6/587: write d2/d42/d80/d89/fb8 [492767,125835] 0 2026-03-10T12:38:13.417 INFO:tasks.workunit.client.0.vm00.stdout:0/750: mknod d3/d33/cf1 0 2026-03-10T12:38:13.419 INFO:tasks.workunit.client.0.vm00.stdout:1/896: dread da/d12/d91/fb8 [0,4194304] 0 2026-03-10T12:38:13.421 INFO:tasks.workunit.client.0.vm00.stdout:6/588: mkdir d2/d16/d29/d31/d88/dd5 0 2026-03-10T12:38:13.422 INFO:tasks.workunit.client.1.vm07.stdout:9/754: sync 2026-03-10T12:38:13.422 INFO:tasks.workunit.client.1.vm07.stdout:7/636: rename d0/d67/d6f to d0/d57/dd6 0 2026-03-10T12:38:13.424 INFO:tasks.workunit.client.0.vm00.stdout:0/751: creat d3/d7/db0/ff2 x:0 0 0 2026-03-10T12:38:13.424 INFO:tasks.workunit.client.1.vm07.stdout:2/604: 
truncate d0/d42/d26/f2e 12302208 0 2026-03-10T12:38:13.425 INFO:tasks.workunit.client.0.vm00.stdout:2/893: readlink d4/d6/d93/lcf 0 2026-03-10T12:38:13.425 INFO:tasks.workunit.client.1.vm07.stdout:9/755: symlink d5/d1f/d75/lfd 0 2026-03-10T12:38:13.427 INFO:tasks.workunit.client.0.vm00.stdout:6/589: unlink d2/d16/l65 0 2026-03-10T12:38:13.428 INFO:tasks.workunit.client.1.vm07.stdout:8/646: dread d1/d3/d6/d50/f80 [0,4194304] 0 2026-03-10T12:38:13.429 INFO:tasks.workunit.client.0.vm00.stdout:1/897: sync 2026-03-10T12:38:13.431 INFO:tasks.workunit.client.1.vm07.stdout:5/715: rename d0/d22/d18/d19/d21/d3a/fde to d0/d22/d18/d19/d21/d54/dcb/de8/ffe 0 2026-03-10T12:38:13.433 INFO:tasks.workunit.client.0.vm00.stdout:0/752: stat d3/d7/d4c/d5b/l32 0 2026-03-10T12:38:13.435 INFO:tasks.workunit.client.1.vm07.stdout:3/695: write dc/dd/d1f/d45/f5e [4411017,127795] 0 2026-03-10T12:38:13.437 INFO:tasks.workunit.client.1.vm07.stdout:7/637: dwrite d0/d61/db4/f4b [0,4194304] 0 2026-03-10T12:38:13.444 INFO:tasks.workunit.client.0.vm00.stdout:2/894: rename d4/d6/d2d/de5 to d4/d6/d93/dc6/d11e 0 2026-03-10T12:38:13.450 INFO:tasks.workunit.client.1.vm07.stdout:2/605: creat d0/d42/d4e/daf/fcf x:0 0 0 2026-03-10T12:38:13.456 INFO:tasks.workunit.client.0.vm00.stdout:9/905: dwrite d0/d3d/d59/d4e/dba/d1e/d85/fe7 [0,4194304] 0 2026-03-10T12:38:13.465 INFO:tasks.workunit.client.0.vm00.stdout:0/753: rename d3/db/d24/d25/fbd to d3/d7/d4c/d9d/ff3 0 2026-03-10T12:38:13.466 INFO:tasks.workunit.client.1.vm07.stdout:8/647: mknod d1/d3/d6/d50/d70/cd3 0 2026-03-10T12:38:13.468 INFO:tasks.workunit.client.1.vm07.stdout:0/767: write d0/d14/d5f/d41/d6a/d74/fb9 [171624,85000] 0 2026-03-10T12:38:13.469 INFO:tasks.workunit.client.0.vm00.stdout:3/890: creat dd/f126 x:0 0 0 2026-03-10T12:38:13.475 INFO:tasks.workunit.client.0.vm00.stdout:0/754: rename d3/d7/d4c/d9d/fc8 to d3/d7/d4c/dcc/dea/ff4 0 2026-03-10T12:38:13.479 INFO:tasks.workunit.client.0.vm00.stdout:5/937: link d1f/d39/f5f d1f/d26/d2b/d131/f149 0 
2026-03-10T12:38:13.480 INFO:tasks.workunit.client.0.vm00.stdout:0/755: symlink d3/d7/d4c/d5b/d38/d44/lf5 0 2026-03-10T12:38:13.480 INFO:tasks.workunit.client.0.vm00.stdout:0/756: stat d3/d7/d4c/d5b/d38/db3/de2/fd4 0 2026-03-10T12:38:13.481 INFO:tasks.workunit.client.1.vm07.stdout:7/638: rmdir d0/d57/d62/d90 39 2026-03-10T12:38:13.492 INFO:tasks.workunit.client.1.vm07.stdout:5/716: creat d0/d22/d18/d19/d36/d75/ddc/fff x:0 0 0 2026-03-10T12:38:13.494 INFO:tasks.workunit.client.1.vm07.stdout:8/648: unlink d1/d3/d11/c97 0 2026-03-10T12:38:13.499 INFO:tasks.workunit.client.1.vm07.stdout:4/788: rename d0/d4/d10/d3c/d2b/d2d/da7/fc6 to d0/d4/d10/d9a/f113 0 2026-03-10T12:38:13.499 INFO:tasks.workunit.client.1.vm07.stdout:7/639: sync 2026-03-10T12:38:13.505 INFO:tasks.workunit.client.1.vm07.stdout:8/649: mkdir d1/d3/d6/d50/d70/dd4 0 2026-03-10T12:38:13.506 INFO:tasks.workunit.client.1.vm07.stdout:0/768: creat d0/d14/d5f/d76/d2f/d31/d4f/d9d/f104 x:0 0 0 2026-03-10T12:38:13.507 INFO:tasks.workunit.client.1.vm07.stdout:0/769: truncate d0/d14/d5f/d76/f30 5839373 0 2026-03-10T12:38:13.510 INFO:tasks.workunit.client.1.vm07.stdout:0/770: dwrite d0/d14/f19 [4194304,4194304] 0 2026-03-10T12:38:13.515 INFO:tasks.workunit.client.1.vm07.stdout:9/756: rename d5/d13/d2c/de6/f82 to d5/d69/ffe 0 2026-03-10T12:38:13.515 INFO:tasks.workunit.client.1.vm07.stdout:9/757: stat d5/d13/d6c/da4/fa6 0 2026-03-10T12:38:13.517 INFO:tasks.workunit.client.1.vm07.stdout:5/717: getdents d0/d22/d18/d19/d21/d54/dd1 0 2026-03-10T12:38:13.518 INFO:tasks.workunit.client.0.vm00.stdout:8/821: read d0/dd/d38/d81/df3/f70 [3894034,14746] 0 2026-03-10T12:38:13.520 INFO:tasks.workunit.client.1.vm07.stdout:2/606: getdents d0/d42/d1f 0 2026-03-10T12:38:13.522 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.521+0000 7f11baab8700 1 -- 192.168.123.100:0/2801272233 --> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", 
""]}) v1 -- 0x7f11a0000bf0 con 0x7f119c03db20 2026-03-10T12:38:13.529 INFO:tasks.workunit.client.1.vm07.stdout:0/771: rmdir d0/d14/d5f/d76/d2f/d31/d79/dcc 39 2026-03-10T12:38:13.530 INFO:tasks.workunit.client.1.vm07.stdout:3/696: dread dc/dd/d28/d3b/fc1 [0,4194304] 0 2026-03-10T12:38:13.540 INFO:tasks.workunit.client.1.vm07.stdout:9/758: mknod d5/d13/d9b/cff 0 2026-03-10T12:38:13.544 INFO:tasks.workunit.client.1.vm07.stdout:9/759: dwrite d5/d13/f67 [0,4194304] 0 2026-03-10T12:38:13.546 INFO:tasks.workunit.client.1.vm07.stdout:4/789: mkdir d0/d4/d10/d114 0 2026-03-10T12:38:13.547 INFO:tasks.workunit.client.0.vm00.stdout:6/590: dread d2/da/dc/f27 [0,4194304] 0 2026-03-10T12:38:13.547 INFO:tasks.workunit.client.0.vm00.stdout:6/591: dread - d2/da/f6a zero size 2026-03-10T12:38:13.548 INFO:tasks.workunit.client.1.vm07.stdout:5/718: dread - d0/d22/d18/d19/d21/d3a/f85 zero size 2026-03-10T12:38:13.560 INFO:tasks.workunit.client.1.vm07.stdout:8/650: truncate d1/d3/d6/d50/f56 4035869 0 2026-03-10T12:38:13.560 INFO:tasks.workunit.client.1.vm07.stdout:8/651: write d1/f79 [8960504,80232] 0 2026-03-10T12:38:13.565 INFO:tasks.workunit.client.1.vm07.stdout:1/668: dwrite d9/d2d/d4f/f95 [0,4194304] 0 2026-03-10T12:38:13.567 INFO:tasks.workunit.client.1.vm07.stdout:1/669: write d9/df/d55/f6f [2278751,90497] 0 2026-03-10T12:38:13.581 INFO:tasks.workunit.client.0.vm00.stdout:6/592: mkdir d2/d14/dbb/dd6 0 2026-03-10T12:38:13.591 INFO:tasks.workunit.client.0.vm00.stdout:6/593: creat d2/d14/dbb/fd7 x:0 0 0 2026-03-10T12:38:13.591 INFO:tasks.workunit.client.1.vm07.stdout:9/760: creat d5/d13/d9d/f100 x:0 0 0 2026-03-10T12:38:13.597 INFO:tasks.workunit.client.1.vm07.stdout:4/790: creat d0/d4/d5/d78/dc5/df7/db2/dd5/f115 x:0 0 0 2026-03-10T12:38:13.597 INFO:tasks.workunit.client.0.vm00.stdout:4/887: dread df/d1f/d22/f30 [0,4194304] 0 2026-03-10T12:38:13.604 INFO:tasks.workunit.client.1.vm07.stdout:6/653: stat d1/d4/d6/d4e/d64/f6f 0 2026-03-10T12:38:13.607 
INFO:tasks.workunit.client.1.vm07.stdout:8/652: truncate d1/d3/d40/d92/db6/f67 4801769 0 2026-03-10T12:38:13.607 INFO:tasks.workunit.client.1.vm07.stdout:8/653: fdatasync d1/d3/d6c/fc9 0 2026-03-10T12:38:13.609 INFO:tasks.workunit.client.0.vm00.stdout:2/895: dread - d4/d6/d2d/d31/f71 zero size 2026-03-10T12:38:13.610 INFO:tasks.workunit.client.0.vm00.stdout:6/594: rename d2/d14/d7a/db9/f46 to d2/d16/d29/d31/fd8 0 2026-03-10T12:38:13.610 INFO:tasks.workunit.client.0.vm00.stdout:2/896: write d4/d53/d76/d9b/dad/d8e/f103 [25732,126497] 0 2026-03-10T12:38:13.613 INFO:tasks.workunit.client.0.vm00.stdout:1/898: fdatasync da/d21/db3/d59/da6/da4/dda/dc0/dfe/d10e/f11b 0 2026-03-10T12:38:13.617 INFO:tasks.workunit.client.0.vm00.stdout:1/899: symlink da/d21/db3/d59/d120/d72/d7e/l12a 0 2026-03-10T12:38:13.618 INFO:tasks.workunit.client.1.vm07.stdout:5/719: dread d0/d22/d18/d19/d21/f37 [4194304,4194304] 0 2026-03-10T12:38:13.620 INFO:tasks.workunit.client.1.vm07.stdout:5/720: write d0/d22/d18/d19/d36/d75/ddc/fff [265359,95790] 0 2026-03-10T12:38:13.622 INFO:tasks.workunit.client.0.vm00.stdout:8/822: unlink d0/d5c/f4a 0 2026-03-10T12:38:13.623 INFO:tasks.workunit.client.1.vm07.stdout:5/721: dwrite d0/d22/d18/d19/d36/d75/fdb [0,4194304] 0 2026-03-10T12:38:13.623 INFO:tasks.workunit.client.1.vm07.stdout:5/722: readlink d0/d22/d18/d19/d21/d54/dcb/le1 0 2026-03-10T12:38:13.624 INFO:tasks.workunit.client.0.vm00.stdout:2/897: mknod d4/d6/d2d/d3a/c11f 0 2026-03-10T12:38:13.625 INFO:tasks.workunit.client.0.vm00.stdout:2/898: write d4/f10d [882266,20363] 0 2026-03-10T12:38:13.635 INFO:tasks.workunit.client.1.vm07.stdout:5/723: sync 2026-03-10T12:38:13.639 INFO:tasks.workunit.client.0.vm00.stdout:1/900: fdatasync da/d21/db3/d59/da6/da4/dda/fbb 0 2026-03-10T12:38:13.642 INFO:tasks.workunit.client.0.vm00.stdout:8/823: unlink d0/d93/fa5 0 2026-03-10T12:38:13.647 INFO:tasks.workunit.client.0.vm00.stdout:4/888: creat df/d1f/d22/d26/d65/d91/d101/f127 x:0 0 0 2026-03-10T12:38:13.647 
INFO:tasks.workunit.client.0.vm00.stdout:4/889: truncate df/d1f/d36/dc6/df1/f108 89708 0 2026-03-10T12:38:13.647 INFO:tasks.workunit.client.0.vm00.stdout:4/890: truncate df/d1f/d36/d3a/fdf 36276 0 2026-03-10T12:38:13.647 INFO:tasks.workunit.client.0.vm00.stdout:4/891: chown df/d1f/d36/d3a/d41/de4 262081 1 2026-03-10T12:38:13.647 INFO:tasks.workunit.client.0.vm00.stdout:4/892: fdatasync df/d1f/d22/d26/d65/d91/d101/f127 0 2026-03-10T12:38:13.648 INFO:tasks.workunit.client.1.vm07.stdout:0/772: mkdir d0/d14/d5f/d76/d2f/d31/df0/d105 0 2026-03-10T12:38:13.648 INFO:tasks.workunit.client.0.vm00.stdout:4/893: write df/d6c/f124 [976845,57294] 0 2026-03-10T12:38:13.649 INFO:tasks.workunit.client.1.vm07.stdout:0/773: read d0/d14/d5f/d76/d2f/d31/d4f/fc4 [2444113,77701] 0 2026-03-10T12:38:13.652 INFO:tasks.workunit.client.0.vm00.stdout:7/638: write da/d26/d50/fc9 [666377,95332] 0 2026-03-10T12:38:13.654 INFO:tasks.workunit.client.0.vm00.stdout:1/901: truncate da/fe0 2465435 0 2026-03-10T12:38:13.656 INFO:tasks.workunit.client.1.vm07.stdout:9/761: fdatasync d5/d1f/d5e/d6b/fae 0 2026-03-10T12:38:13.676 INFO:tasks.workunit.client.1.vm07.stdout:6/654: creat d1/d4/d44/fd0 x:0 0 0 2026-03-10T12:38:13.677 INFO:tasks.workunit.client.0.vm00.stdout:9/906: write d0/d3d/d43/f119 [204554,177] 0 2026-03-10T12:38:13.677 INFO:tasks.workunit.client.1.vm07.stdout:8/654: creat d1/d3/d5d/fd5 x:0 0 0 2026-03-10T12:38:13.680 INFO:tasks.workunit.client.1.vm07.stdout:8/655: dwrite d1/d3/d40/d92/dba/fc3 [0,4194304] 0 2026-03-10T12:38:13.684 INFO:tasks.workunit.client.0.vm00.stdout:7/639: mknod da/d26/d37/ce0 0 2026-03-10T12:38:13.690 INFO:tasks.workunit.client.0.vm00.stdout:3/891: write dd/d3d/d8a/de0/d55/dfd/d125/feb [303115,75740] 0 2026-03-10T12:38:13.695 INFO:tasks.workunit.client.0.vm00.stdout:5/938: write d1f/d26/d2e/d58/d6b/deb/fef [347903,128738] 0 2026-03-10T12:38:13.698 INFO:tasks.workunit.client.0.vm00.stdout:7/640: creat da/d47/fe1 x:0 0 0 2026-03-10T12:38:13.701 
INFO:tasks.workunit.client.0.vm00.stdout:1/902: link da/d21/d27/fe8 da/d21/db3/d59/d120/dab/f12b 0 2026-03-10T12:38:13.704 INFO:tasks.workunit.client.1.vm07.stdout:7/640: rename d0/d61/l63 to d0/d57/d62/ld7 0 2026-03-10T12:38:13.708 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:38:13.708 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (4m) 2m ago 5m 22.8M - 0.25.0 c8568f914cd2 1897443e8fdf 2026-03-10T12:38:13.708 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (5m) 2m ago 5m 8074k - 18.2.0 dc2bc1663786 d9c35bbdf4cd 2026-03-10T12:38:13.708 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (4m) 10s ago 4m 8568k - 18.2.0 dc2bc1663786 2a98961ae9ca 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (5m) 2m ago 5m 7407k - 18.2.0 dc2bc1663786 4726e39e7eb0 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (4m) 10s ago 4m 7402k - 18.2.0 dc2bc1663786 f917dac1f418 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (4m) 2m ago 5m 82.0M - 9.4.7 954c08fa6188 16c12a6ce1fc 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (3m) 2m ago 3m 17.2M - 18.2.0 dc2bc1663786 13dfd2469732 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (3m) 2m ago 3m 14.2M - 18.2.0 dc2bc1663786 d65368ac6dfa 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (3m) 10s ago 3m 15.5M - 18.2.0 dc2bc1663786 1b9425223bd2 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (3m) 10s ago 3m 163M - 18.2.0 dc2bc1663786 9c2b36fc1ac1 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 
*:9283,8765,8443 running (5m) 2m ago 5m 498M - 18.2.0 dc2bc1663786 8dc0a869be20 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (13s) 10s ago 4m 39.7M - 19.2.3-678-ge911bdeb 654f31e6858e ca47c92cac17 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (5m) 2m ago 5m 50.6M 2048M 18.2.0 dc2bc1663786 c8d836b38502 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (4m) 10s ago 4m 35.9M 2048M 18.2.0 dc2bc1663786 7712955135fc 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (5m) 2m ago 5m 14.4M - 1.5.0 0da6a335fe13 7a5c14d6ba46 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (4m) 10s ago 4m 14.9M - 1.5.0 0da6a335fe13 2fac2415b763 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (4m) 2m ago 4m 45.5M 4096M 18.2.0 dc2bc1663786 d5b05007694d 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (4m) 2m ago 4m 45.9M 4096M 18.2.0 dc2bc1663786 5bc971fe4d49 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (3m) 2m ago 3m 46.7M 4096M 18.2.0 dc2bc1663786 a5a89ccf847e 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (3m) 10s ago 3m 367M 4096M 18.2.0 dc2bc1663786 0c6249fe3951 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (3m) 10s ago 3m 317M 4096M 18.2.0 dc2bc1663786 5c66bfb63a83 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (3m) 10s ago 3m 324M 4096M 18.2.0 dc2bc1663786 277bdfe4ec55 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (4m) 2m ago 4m 39.1M - 2.43.0 a07b618ecd1d 5d567c813f4b 2026-03-10T12:38:13.709 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.702+0000 7f11b1ffb700 1 -- 192.168.123.100:0/2801272233 <== mgr.24461 v2:192.168.123.107:6828/3729807627 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f11a0000bf0 con 0x7f119c03db20 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.707+0000 7f119b7fe700 1 -- 192.168.123.100:0/2801272233 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f119c03db20 msgr2=0x7f119c03ffd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.707+0000 7f119b7fe700 1 --2- 192.168.123.100:0/2801272233 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f119c03db20 0x7f119c03ffd0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f11b4072ff0 tx=0x7f11ac009250 comp rx=0 tx=0).stop 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.707+0000 7f119b7fe700 1 -- 192.168.123.100:0/2801272233 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f11b4072360 msgr2=0x7f11b41312f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:13.709 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.707+0000 7f119b7fe700 1 --2- 192.168.123.100:0/2801272233 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f11b4072360 0x7f11b41312f0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f11a4007fd0 tx=0x7f11a400da70 comp rx=0 tx=0).stop 2026-03-10T12:38:13.709 INFO:tasks.workunit.client.1.vm07.stdout:9/762: creat d5/d13/d57/d4f/d6a/f101 x:0 0 0 2026-03-10T12:38:13.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.710+0000 7f119b7fe700 1 -- 192.168.123.100:0/2801272233 shutdown_connections 2026-03-10T12:38:13.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.710+0000 7f119b7fe700 1 --2- 192.168.123.100:0/2801272233 
>> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f119c03db20 0x7f119c03ffd0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:13.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.710+0000 7f119b7fe700 1 --2- 192.168.123.100:0/2801272233 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f11b4072360 0x7f11b41312f0 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:13.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.710+0000 7f119b7fe700 1 --2- 192.168.123.100:0/2801272233 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f11b4131830 0x7f11b407f4b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:13.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.710+0000 7f119b7fe700 1 -- 192.168.123.100:0/2801272233 >> 192.168.123.100:0/2801272233 conn(0x7f11b406d1a0 msgr2=0x7f11b4076480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:13.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.711+0000 7f119b7fe700 1 -- 192.168.123.100:0/2801272233 shutdown_connections 2026-03-10T12:38:13.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.712+0000 7f119b7fe700 1 -- 192.168.123.100:0/2801272233 wait complete. 
2026-03-10T12:38:13.738 INFO:tasks.workunit.client.1.vm07.stdout:7/641: unlink d0/d61/d79/ccd 0 2026-03-10T12:38:13.742 INFO:tasks.workunit.client.1.vm07.stdout:4/791: creat d0/d4/d10/f116 x:0 0 0 2026-03-10T12:38:13.743 INFO:tasks.workunit.client.1.vm07.stdout:4/792: chown d0/d8e 20684 1 2026-03-10T12:38:13.743 INFO:tasks.workunit.client.1.vm07.stdout:4/793: write d0/d4/d5/d78/dc5/df7/f97 [2574276,49770] 0 2026-03-10T12:38:13.746 INFO:tasks.workunit.client.1.vm07.stdout:1/670: write d9/df/d29/f8b [5230449,11284] 0 2026-03-10T12:38:13.746 INFO:tasks.workunit.client.0.vm00.stdout:6/595: write d2/da/dc/d2f/f4f [1560849,80077] 0 2026-03-10T12:38:13.751 INFO:tasks.workunit.client.1.vm07.stdout:8/656: creat d1/d3/d18/d8e/fd6 x:0 0 0 2026-03-10T12:38:13.752 INFO:tasks.workunit.client.1.vm07.stdout:3/697: write dc/d18/d24/f49 [1170905,103694] 0 2026-03-10T12:38:13.752 INFO:tasks.workunit.client.0.vm00.stdout:3/892: rename dd/d3d/d8a/de0/d55/dfd/d125/l58 to dd/d27/d2c/def/d118/l127 0 2026-03-10T12:38:13.754 INFO:tasks.workunit.client.0.vm00.stdout:0/757: truncate d3/d22/d3a/f8c 363892 0 2026-03-10T12:38:13.756 INFO:tasks.workunit.client.0.vm00.stdout:3/893: dwrite dd/f126 [0,4194304] 0 2026-03-10T12:38:13.757 INFO:tasks.workunit.client.1.vm07.stdout:0/774: creat d0/d14/d5f/d76/d2f/d31/f106 x:0 0 0 2026-03-10T12:38:13.762 INFO:tasks.workunit.client.1.vm07.stdout:2/607: rename d0/d42/d26/d38/d4f/d62/c63 to d0/d29/d64/db5/cd0 0 2026-03-10T12:38:13.764 INFO:tasks.workunit.client.1.vm07.stdout:2/608: dread d0/d42/d4e/d77/f89 [0,4194304] 0 2026-03-10T12:38:13.772 INFO:tasks.workunit.client.1.vm07.stdout:7/642: creat d0/d61/db4/d8a/fd8 x:0 0 0 2026-03-10T12:38:13.775 INFO:tasks.workunit.client.1.vm07.stdout:5/724: dwrite d0/d22/d18/d19/d2e/da9/fb5 [0,4194304] 0 2026-03-10T12:38:13.794 INFO:tasks.workunit.client.0.vm00.stdout:7/641: write da/d41/d7b/f83 [703209,10999] 0 2026-03-10T12:38:13.799 INFO:tasks.workunit.client.0.vm00.stdout:6/596: rename d2/d16/f2a to d2/da/fd9 0 
2026-03-10T12:38:13.802 INFO:tasks.workunit.client.1.vm07.stdout:4/794: rmdir d0/d4/d5/da 39 2026-03-10T12:38:13.803 INFO:tasks.workunit.client.1.vm07.stdout:4/795: chown d0/d4/d5/d34/lde 184397471 1 2026-03-10T12:38:13.807 INFO:tasks.workunit.client.1.vm07.stdout:1/671: creat d9/d2d/d80/fdf x:0 0 0 2026-03-10T12:38:13.808 INFO:tasks.workunit.client.1.vm07.stdout:1/672: write d9/df/d29/f8b [1382777,109752] 0 2026-03-10T12:38:13.810 INFO:tasks.workunit.client.0.vm00.stdout:0/758: link d3/db/c23 d3/d7/db0/dc4/cf6 0 2026-03-10T12:38:13.811 INFO:tasks.workunit.client.0.vm00.stdout:7/642: mknod da/d1b/d40/ce2 0 2026-03-10T12:38:13.811 INFO:tasks.workunit.client.0.vm00.stdout:1/903: unlink da/d12/f99 0 2026-03-10T12:38:13.814 INFO:tasks.workunit.client.1.vm07.stdout:6/655: creat d1/d4/d6/d16/fd1 x:0 0 0 2026-03-10T12:38:13.820 INFO:tasks.workunit.client.0.vm00.stdout:0/759: dwrite d3/d7/d4c/d5b/d38/db3/de2/f68 [0,4194304] 0 2026-03-10T12:38:13.824 INFO:tasks.workunit.client.0.vm00.stdout:9/907: getdents d0/d3d/d43/d114/d139 0 2026-03-10T12:38:13.825 INFO:tasks.workunit.client.1.vm07.stdout:0/775: creat d0/d14/d5f/d3b/dbc/d8d/f107 x:0 0 0 2026-03-10T12:38:13.826 INFO:tasks.workunit.client.0.vm00.stdout:7/643: creat da/d41/d7b/d9d/dba/fe3 x:0 0 0 2026-03-10T12:38:13.826 INFO:tasks.workunit.client.0.vm00.stdout:0/760: rmdir d3/db/d77 39 2026-03-10T12:38:13.837 INFO:tasks.workunit.client.0.vm00.stdout:7/644: unlink da/d26/d37/d56/f9a 0 2026-03-10T12:38:13.841 INFO:tasks.workunit.client.0.vm00.stdout:2/899: dwrite d4/d53/d9e/f60 [0,4194304] 0 2026-03-10T12:38:13.844 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.843+0000 7f6121223700 1 -- 192.168.123.100:0/3350238934 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f611c072330 msgr2=0x7f611c0770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:13.844 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.843+0000 7f6121223700 1 --2- 192.168.123.100:0/3350238934 
>> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f611c072330 0x7f611c0770b0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f611400d3f0 tx=0x7f611400d700 comp rx=0 tx=0).stop 2026-03-10T12:38:13.844 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.843+0000 7f6121223700 1 -- 192.168.123.100:0/3350238934 shutdown_connections 2026-03-10T12:38:13.844 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.843+0000 7f6121223700 1 --2- 192.168.123.100:0/3350238934 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f611c072330 0x7f611c0770b0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:13.844 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.843+0000 7f6121223700 1 --2- 192.168.123.100:0/3350238934 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f611c071950 0x7f611c071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:13.844 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.843+0000 7f6121223700 1 -- 192.168.123.100:0/3350238934 >> 192.168.123.100:0/3350238934 conn(0x7f611c06d1a0 msgr2=0x7f611c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:13.845 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.843+0000 7f6121223700 1 -- 192.168.123.100:0/3350238934 shutdown_connections 2026-03-10T12:38:13.845 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.843+0000 7f6121223700 1 -- 192.168.123.100:0/3350238934 wait complete. 
2026-03-10T12:38:13.845 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.844+0000 7f6121223700 1 Processor -- start 2026-03-10T12:38:13.845 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.844+0000 7f6121223700 1 -- start start 2026-03-10T12:38:13.845 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.844+0000 7f6121223700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f611c071950 0x7f611c131350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:13.845 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.844+0000 7f6121223700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f611c131890 0x7f611c07f4f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:13.845 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.844+0000 7f6121223700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f611c131d90 con 0x7f611c071950 2026-03-10T12:38:13.845 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.844+0000 7f6121223700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f611c131ed0 con 0x7f611c131890 2026-03-10T12:38:13.847 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.846+0000 7f611b7fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f611c131890 0x7f611c07f4f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:13.847 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.846+0000 7f611b7fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f611c131890 0x7f611c07f4f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:32768/0 (socket says 192.168.123.100:32768) 2026-03-10T12:38:13.847 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.846+0000 7f611b7fe700 1 -- 192.168.123.100:0/4055734810 learned_addr learned my addr 192.168.123.100:0/4055734810 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:38:13.847 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.846+0000 7f611bfff700 1 --2- 192.168.123.100:0/4055734810 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f611c071950 0x7f611c131350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:13.847 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.846+0000 7f611bfff700 1 -- 192.168.123.100:0/4055734810 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f611c131890 msgr2=0x7f611c07f4f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:13.847 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.846+0000 7f611bfff700 1 --2- 192.168.123.100:0/4055734810 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f611c131890 0x7f611c07f4f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:13.847 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.846+0000 7f611bfff700 1 -- 192.168.123.100:0/4055734810 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6114007ed0 con 0x7f611c071950 2026-03-10T12:38:13.847 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.847+0000 7f611bfff700 1 --2- 192.168.123.100:0/4055734810 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f611c071950 0x7f611c131350 secure :-1 s=READY pgs=342 cs=0 l=1 rev1=1 crypto rx=0x7f610c00b700 tx=0x7f610c00bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:38:13.848 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.847+0000 7f61197fa700 1 -- 192.168.123.100:0/4055734810 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f610c010820 con 0x7f611c071950 2026-03-10T12:38:13.848 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.847+0000 7f6121223700 1 -- 192.168.123.100:0/4055734810 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f611c07fa90 con 0x7f611c071950 2026-03-10T12:38:13.848 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.847+0000 7f6121223700 1 -- 192.168.123.100:0/4055734810 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f611c07ff90 con 0x7f611c071950 2026-03-10T12:38:13.848 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.848+0000 7f6102ffd700 1 -- 192.168.123.100:0/4055734810 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f611c04ea50 con 0x7f611c071950 2026-03-10T12:38:13.849 INFO:tasks.workunit.client.0.vm00.stdout:4/894: getdents df/d1f/d22/d26/d65 0 2026-03-10T12:38:13.850 INFO:tasks.workunit.client.0.vm00.stdout:1/904: mkdir da/d21/db3/d59/d120/d72/d121/d12c 0 2026-03-10T12:38:13.854 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.853+0000 7f61197fa700 1 -- 192.168.123.100:0/4055734810 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f610c010e60 con 0x7f611c071950 2026-03-10T12:38:13.854 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.853+0000 7f61197fa700 1 -- 192.168.123.100:0/4055734810 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f610c00f5d0 con 0x7f611c071950 2026-03-10T12:38:13.854 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.853+0000 7f61197fa700 1 -- 192.168.123.100:0/4055734810 
<== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 22) v1 ==== 50027+0+0 (secure 0 0 0) 0x7f610c00f7f0 con 0x7f611c071950 2026-03-10T12:38:13.854 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.853+0000 7f61197fa700 1 --2- 192.168.123.100:0/4055734810 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f610403db70 0x7f6104040020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:13.854 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.854+0000 7f61197fa700 1 -- 192.168.123.100:0/4055734810 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f610c053a30 con 0x7f611c071950 2026-03-10T12:38:13.856 INFO:tasks.workunit.client.0.vm00.stdout:8/824: dwrite d0/d5c/f42 [0,4194304] 0 2026-03-10T12:38:13.859 INFO:tasks.workunit.client.0.vm00.stdout:2/900: symlink d4/d6/d93/dc6/l120 0 2026-03-10T12:38:13.859 INFO:tasks.workunit.client.0.vm00.stdout:1/905: rmdir da/d21/d27/d6a 39 2026-03-10T12:38:13.860 INFO:tasks.workunit.client.0.vm00.stdout:8/825: fdatasync d0/d93/d17/d48/ffc 0 2026-03-10T12:38:13.865 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.861+0000 7f61197fa700 1 -- 192.168.123.100:0/4055734810 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f610c00e570 con 0x7f611c071950 2026-03-10T12:38:13.865 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.865+0000 7f611b7fe700 1 --2- 192.168.123.100:0/4055734810 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f610403db70 0x7f6104040020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:13.865 INFO:tasks.workunit.client.0.vm00.stdout:7/645: creat da/d47/fe4 x:0 0 0 2026-03-10T12:38:13.867 
INFO:tasks.workunit.client.0.vm00.stdout:7/646: read da/d25/f2b [6074814,45817] 0 2026-03-10T12:38:13.869 INFO:tasks.workunit.client.1.vm07.stdout:1/673: rename d9/c2a to d9/df/d79/ce0 0 2026-03-10T12:38:13.869 INFO:tasks.workunit.client.1.vm07.stdout:1/674: chown d9/df/d29/f82 11 1 2026-03-10T12:38:13.870 INFO:tasks.workunit.client.1.vm07.stdout:8/657: fdatasync d1/d3/d40/f8c 0 2026-03-10T12:38:13.879 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:13.872+0000 7f611b7fe700 1 --2- 192.168.123.100:0/4055734810 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f610403db70 0x7f6104040020 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f6114000f80 tx=0x7f611400db40 comp rx=0 tx=0).ready entity=mgr.24461 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:13.879 INFO:tasks.workunit.client.1.vm07.stdout:3/698: mknod dc/d18/d99/d9c/cec 0 2026-03-10T12:38:13.883 INFO:tasks.workunit.client.0.vm00.stdout:5/939: dwrite d1f/d26/d2b/d35/f68 [0,4194304] 0 2026-03-10T12:38:13.886 INFO:tasks.workunit.client.1.vm07.stdout:9/763: write d5/f65 [5063034,49232] 0 2026-03-10T12:38:13.890 INFO:tasks.workunit.client.1.vm07.stdout:9/764: dwrite d5/d13/d2c/de6/dce/ff9 [0,4194304] 0 2026-03-10T12:38:13.893 INFO:tasks.workunit.client.0.vm00.stdout:4/895: sync 2026-03-10T12:38:13.894 INFO:tasks.workunit.client.0.vm00.stdout:6/597: dread d2/d16/f23 [0,4194304] 0 2026-03-10T12:38:13.894 INFO:tasks.workunit.client.1.vm07.stdout:7/643: symlink d0/d57/ld9 0 2026-03-10T12:38:13.896 INFO:tasks.workunit.client.1.vm07.stdout:0/776: truncate d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dc8/fc7 1198884 0 2026-03-10T12:38:13.897 INFO:tasks.workunit.client.0.vm00.stdout:7/647: creat da/d1b/fe5 x:0 0 0 2026-03-10T12:38:13.904 INFO:tasks.workunit.client.1.vm07.stdout:5/725: truncate d0/d22/d18/d19/d21/d54/dcb/de8/ffe 826884 0 2026-03-10T12:38:13.909 INFO:tasks.workunit.client.0.vm00.stdout:3/894: dwrite dd/d27/d2c/f7d [0,4194304] 0 
2026-03-10T12:38:13.909 INFO:tasks.workunit.client.1.vm07.stdout:5/726: fdatasync d0/d22/d18/d19/d2e/da9/fb5 0 2026-03-10T12:38:13.910 INFO:tasks.workunit.client.0.vm00.stdout:6/598: dwrite d2/d42/fd4 [0,4194304] 0 2026-03-10T12:38:13.915 INFO:tasks.workunit.client.0.vm00.stdout:4/896: dwrite df/d1f/d36/dc6/df1/f108 [0,4194304] 0 2026-03-10T12:38:13.919 INFO:tasks.workunit.client.0.vm00.stdout:9/908: rename d0/d7f/d88/fa8 to d0/d7f/db8/dc4/d106/f145 0 2026-03-10T12:38:13.926 INFO:tasks.workunit.client.0.vm00.stdout:4/897: readlink df/d1f/d36/d3a/le3 0 2026-03-10T12:38:13.926 INFO:tasks.workunit.client.0.vm00.stdout:7/648: dread - da/d41/d48/fd4 zero size 2026-03-10T12:38:13.931 INFO:tasks.workunit.client.0.vm00.stdout:6/599: creat d2/da/fda x:0 0 0 2026-03-10T12:38:13.932 INFO:tasks.workunit.client.0.vm00.stdout:6/600: write d2/da/fcc [1009664,10874] 0 2026-03-10T12:38:13.934 INFO:tasks.workunit.client.1.vm07.stdout:4/796: unlink d0/d4/d10/d3c/d2b/d54/de1/ca9 0 2026-03-10T12:38:13.939 INFO:tasks.workunit.client.0.vm00.stdout:1/906: rename da/d24/l3f to da/d24/d5a/d71/l12d 0 2026-03-10T12:38:13.942 INFO:tasks.workunit.client.0.vm00.stdout:7/649: dread f1 [0,4194304] 0 2026-03-10T12:38:13.948 INFO:tasks.workunit.client.1.vm07.stdout:6/656: rename d1/d4/d6/d16/f5f to d1/d4/d6/d16/d1a/d33/fd2 0 2026-03-10T12:38:13.954 INFO:tasks.workunit.client.1.vm07.stdout:1/675: chown d9/df/d29/d2b/d31/l3b 6450316 1 2026-03-10T12:38:13.956 INFO:tasks.workunit.client.0.vm00.stdout:6/601: mknod d2/d14/d7a/db9/cdb 0 2026-03-10T12:38:13.961 INFO:tasks.workunit.client.0.vm00.stdout:2/901: dwrite d4/dd/d63/fd4 [0,4194304] 0 2026-03-10T12:38:13.965 INFO:tasks.workunit.client.1.vm07.stdout:7/644: creat d0/d47/da0/fda x:0 0 0 2026-03-10T12:38:13.965 INFO:tasks.workunit.client.0.vm00.stdout:7/650: chown da/d3f/d60/lb2 1 1 2026-03-10T12:38:13.965 INFO:tasks.workunit.client.0.vm00.stdout:2/902: readlink d4/d6/d93/dc6/d11e/l110 0 2026-03-10T12:38:13.969 
INFO:tasks.workunit.client.1.vm07.stdout:0/777: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/f108 x:0 0 0 2026-03-10T12:38:13.971 INFO:tasks.workunit.client.0.vm00.stdout:1/907: creat da/d24/d28/d67/f12e x:0 0 0 2026-03-10T12:38:13.973 INFO:tasks.workunit.client.0.vm00.stdout:6/602: creat d2/da/dc/d2f/fdc x:0 0 0 2026-03-10T12:38:13.974 INFO:tasks.workunit.client.1.vm07.stdout:0/778: dwrite d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dc8/ff2 [0,4194304] 0 2026-03-10T12:38:13.977 INFO:tasks.workunit.client.1.vm07.stdout:0/779: write d0/d14/d5f/d3b/f5b [2127908,74893] 0 2026-03-10T12:38:13.988 INFO:tasks.workunit.client.1.vm07.stdout:1/676: chown d9/d2d/d4f/d5a/f93 711535 1 2026-03-10T12:38:13.990 INFO:tasks.workunit.client.1.vm07.stdout:3/699: symlink dc/d18/led 0 2026-03-10T12:38:13.992 INFO:tasks.workunit.client.0.vm00.stdout:6/603: symlink d2/d16/ldd 0 2026-03-10T12:38:13.999 INFO:tasks.workunit.client.1.vm07.stdout:7/645: creat d0/d61/fdb x:0 0 0 2026-03-10T12:38:13.999 INFO:tasks.workunit.client.1.vm07.stdout:5/727: symlink d0/d22/d18/d19/l100 0 2026-03-10T12:38:14.000 INFO:tasks.workunit.client.1.vm07.stdout:6/657: dread d1/d4/d6/d16/f50 [0,4194304] 0 2026-03-10T12:38:14.001 INFO:tasks.workunit.client.0.vm00.stdout:8/826: write d0/dd/d38/d81/f88 [862540,26996] 0 2026-03-10T12:38:14.003 INFO:tasks.workunit.client.1.vm07.stdout:4/797: creat d0/d4/d10/d114/f117 x:0 0 0 2026-03-10T12:38:14.004 INFO:tasks.workunit.client.1.vm07.stdout:1/677: dread - d9/df/d29/d6b/fcc zero size 2026-03-10T12:38:14.006 INFO:tasks.workunit.client.1.vm07.stdout:3/700: creat dc/dd/d1f/dac/fee x:0 0 0 2026-03-10T12:38:14.007 INFO:tasks.workunit.client.1.vm07.stdout:0/780: dread d0/d14/d5f/d76/d2f/d31/d4f/d9d/dd4/ffb [0,4194304] 0 2026-03-10T12:38:14.008 INFO:tasks.workunit.client.1.vm07.stdout:0/781: stat d0/d14/d5f/d76/d2f/d31/d79/d85/lbd 0 2026-03-10T12:38:14.010 INFO:tasks.workunit.client.1.vm07.stdout:7/646: fsync d0/f56 0 2026-03-10T12:38:14.020 
INFO:tasks.workunit.client.1.vm07.stdout:6/658: creat d1/d4/d6/d16/d49/fd3 x:0 0 0 2026-03-10T12:38:14.020 INFO:tasks.workunit.client.0.vm00.stdout:7/651: getdents da/d41/d48 0 2026-03-10T12:38:14.020 INFO:tasks.workunit.client.0.vm00.stdout:6/604: rmdir d2/d42/d80/d9d/d9e 0 2026-03-10T12:38:14.024 INFO:tasks.workunit.client.0.vm00.stdout:8/827: creat d0/dd/d38/f101 x:0 0 0 2026-03-10T12:38:14.027 INFO:tasks.workunit.client.1.vm07.stdout:1/678: mkdir d9/df/dc2/de1 0 2026-03-10T12:38:14.032 INFO:tasks.workunit.client.0.vm00.stdout:6/605: symlink d2/d16/d29/d31/d34/lde 0 2026-03-10T12:38:14.032 INFO:tasks.workunit.client.1.vm07.stdout:3/701: mkdir dc/d18/d99/da3/def 0 2026-03-10T12:38:14.036 INFO:tasks.workunit.client.1.vm07.stdout:1/679: dwrite d9/df/f26 [4194304,4194304] 0 2026-03-10T12:38:14.036 INFO:tasks.workunit.client.1.vm07.stdout:0/782: unlink d0/d14/d5f/d3b/dbc/lf5 0 2026-03-10T12:38:14.040 INFO:tasks.workunit.client.1.vm07.stdout:7/647: readlink d0/l2d 0 2026-03-10T12:38:14.050 INFO:tasks.workunit.client.0.vm00.stdout:6/606: rename d2/d14/l95 to d2/da/dc/d94/ldf 0 2026-03-10T12:38:14.051 INFO:tasks.workunit.client.0.vm00.stdout:6/607: symlink d2/d14/dbb/dd6/le0 0 2026-03-10T12:38:14.051 INFO:tasks.workunit.client.0.vm00.stdout:8/828: creat d0/d93/d2d/d49/f102 x:0 0 0 2026-03-10T12:38:14.051 INFO:tasks.workunit.client.0.vm00.stdout:8/829: write d0/d93/d36/d5b/fdb [283306,20389] 0 2026-03-10T12:38:14.051 INFO:tasks.workunit.client.1.vm07.stdout:6/659: symlink d1/d4/d9b/ld4 0 2026-03-10T12:38:14.051 INFO:tasks.workunit.client.1.vm07.stdout:6/660: chown d1/d4/f11 580 1 2026-03-10T12:38:14.051 INFO:tasks.workunit.client.1.vm07.stdout:3/702: rmdir dc/dd/d43/d76/d95/dde 39 2026-03-10T12:38:14.051 INFO:tasks.workunit.client.1.vm07.stdout:3/703: chown dc/dd/d1f/l23 2207 1 2026-03-10T12:38:14.054 INFO:tasks.workunit.client.1.vm07.stdout:1/680: dwrite d9/df/d55/fce [0,4194304] 0 2026-03-10T12:38:14.056 INFO:tasks.workunit.client.1.vm07.stdout:0/783: dwrite 
d0/d14/d5f/d3b/f4b [0,4194304] 0 2026-03-10T12:38:14.057 INFO:tasks.workunit.client.1.vm07.stdout:0/784: stat d0/d14/d5f/d76/d2f 0 2026-03-10T12:38:14.060 INFO:tasks.workunit.client.0.vm00.stdout:6/608: sync 2026-03-10T12:38:14.066 INFO:tasks.workunit.client.0.vm00.stdout:2/903: dread d4/dd/f62 [0,4194304] 0 2026-03-10T12:38:14.079 INFO:tasks.workunit.client.1.vm07.stdout:3/704: mkdir dc/dd/d28/d7a/d8e/df0 0 2026-03-10T12:38:14.082 INFO:tasks.workunit.client.1.vm07.stdout:1/681: rename d9/df/d29/d2b/d3d to d9/d2d/de2 0 2026-03-10T12:38:14.086 INFO:tasks.workunit.client.1.vm07.stdout:5/728: getdents d0/d22/d18/d3e/d5d 0 2026-03-10T12:38:14.089 INFO:tasks.workunit.client.1.vm07.stdout:5/729: mknod d0/d22/d18/d19/d36/c101 0 2026-03-10T12:38:14.092 INFO:tasks.workunit.client.1.vm07.stdout:5/730: dread - d0/d22/d18/d3e/d5d/db6/fe4 zero size 2026-03-10T12:38:14.094 INFO:tasks.workunit.client.1.vm07.stdout:5/731: dread d0/d22/d18/fb4 [0,4194304] 0 2026-03-10T12:38:14.097 INFO:tasks.workunit.client.1.vm07.stdout:0/785: link d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/l7d d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/l109 0 2026-03-10T12:38:14.100 INFO:tasks.workunit.client.1.vm07.stdout:0/786: mknod d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/c10a 0 2026-03-10T12:38:14.101 INFO:tasks.workunit.client.0.vm00.stdout:3/895: dread dd/d3d/fe3 [0,4194304] 0 2026-03-10T12:38:14.103 INFO:tasks.workunit.client.1.vm07.stdout:0/787: fdatasync d0/d14/d5f/d76/d2f/d31/d4f/d9d/fda 0 2026-03-10T12:38:14.103 INFO:tasks.workunit.client.1.vm07.stdout:0/788: write d0/d14/f19 [5227175,105104] 0 2026-03-10T12:38:14.104 INFO:tasks.workunit.client.1.vm07.stdout:0/789: read d0/d14/d5f/d3b/f4b [2317680,53969] 0 2026-03-10T12:38:14.106 INFO:tasks.workunit.client.1.vm07.stdout:0/790: creat d0/d14/d5f/d76/d2f/d31/df0/f10b x:0 0 0 2026-03-10T12:38:14.114 INFO:tasks.workunit.client.1.vm07.stdout:0/791: unlink d0/d14/d5f/d41/ff1 0 2026-03-10T12:38:14.114 INFO:tasks.workunit.client.1.vm07.stdout:0/792: write d0/d14/d7c/fde 
[938256,89832] 0
2026-03-10T12:38:14.114 INFO:tasks.workunit.client.1.vm07.stdout:0/793: chown d0/d14/d5f/d41/d6a 50439992 1
2026-03-10T12:38:14.114 INFO:tasks.workunit.client.1.vm07.stdout:7/648: dread d0/d52/f5d [0,4194304] 0
2026-03-10T12:38:14.114 INFO:tasks.workunit.client.1.vm07.stdout:7/649: chown d0/f3f 27 1
2026-03-10T12:38:14.114 INFO:tasks.workunit.client.1.vm07.stdout:7/650: stat d0/d61/db4/d8a/d9d/fb1 0
2026-03-10T12:38:14.121 INFO:tasks.workunit.client.0.vm00.stdout:2/904: rename d4/dd/db9 to d4/d6/d121 0
2026-03-10T12:38:14.123 INFO:tasks.workunit.client.1.vm07.stdout:7/651: creat d0/d61/db4/fdc x:0 0 0
2026-03-10T12:38:14.124 INFO:tasks.workunit.client.0.vm00.stdout:5/940: write d1f/d26/fe9 [125367,43411] 0
2026-03-10T12:38:14.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.128+0000 7f6102ffd700 1 -- 192.168.123.100:0/4055734810 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f611c061960 con 0x7f611c071950
2026-03-10T12:38:14.130 INFO:tasks.workunit.client.0.vm00.stdout:1/908: getdents da/d21/db3/d59/da6/d8b 0
2026-03-10T12:38:14.132 INFO:tasks.workunit.client.0.vm00.stdout:2/905: dread d4/dd/da7/fd2 [0,4194304] 0
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout:{
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: "mon": {
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": {
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: "osd": {
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: "mds": {
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: "overall": {
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 12,
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout: }
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stdout:}
2026-03-10T12:38:14.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.129+0000 7f61197fa700 1 -- 192.168.123.100:0/4055734810 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f610c014070 con 0x7f611c071950
2026-03-10T12:38:14.133 INFO:tasks.workunit.client.1.vm07.stdout:7/652: mknod d0/d57/d62/cdd 0
2026-03-10T12:38:14.134 INFO:tasks.workunit.client.1.vm07.stdout:7/653: chown d0/f7b 10644940 1
2026-03-10T12:38:14.137 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.134+0000 7f6121223700 1 -- 192.168.123.100:0/4055734810 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f610403db70 msgr2=0x7f6104040020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:38:14.137 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.134+0000 7f6121223700 1 --2- 192.168.123.100:0/4055734810 >>
[v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f610403db70 0x7f6104040020 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f6114000f80 tx=0x7f611400db40 comp rx=0 tx=0).stop 2026-03-10T12:38:14.137 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.134+0000 7f6121223700 1 -- 192.168.123.100:0/4055734810 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f611c071950 msgr2=0x7f611c131350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:14.137 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.134+0000 7f6121223700 1 --2- 192.168.123.100:0/4055734810 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f611c071950 0x7f611c131350 secure :-1 s=READY pgs=342 cs=0 l=1 rev1=1 crypto rx=0x7f610c00b700 tx=0x7f610c00bac0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.137 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.134+0000 7f6121223700 1 -- 192.168.123.100:0/4055734810 shutdown_connections 2026-03-10T12:38:14.137 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.134+0000 7f6121223700 1 --2- 192.168.123.100:0/4055734810 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f610403db70 0x7f6104040020 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.137 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.134+0000 7f6121223700 1 --2- 192.168.123.100:0/4055734810 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f611c071950 0x7f611c131350 unknown :-1 s=CLOSED pgs=342 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.137 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.135+0000 7f6121223700 1 --2- 192.168.123.100:0/4055734810 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f611c131890 0x7f611c07f4f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:38:14.137 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.135+0000 7f6121223700 1 -- 192.168.123.100:0/4055734810 >> 192.168.123.100:0/4055734810 conn(0x7f611c06d1a0 msgr2=0x7f611c076460 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:14.137 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.135+0000 7f6121223700 1 -- 192.168.123.100:0/4055734810 shutdown_connections 2026-03-10T12:38:14.137 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.135+0000 7f6121223700 1 -- 192.168.123.100:0/4055734810 wait complete. 2026-03-10T12:38:14.141 INFO:tasks.workunit.client.1.vm07.stdout:7/654: mkdir d0/d47/dde 0 2026-03-10T12:38:14.144 INFO:tasks.workunit.client.0.vm00.stdout:3/896: dread - dd/d18/d13/d1d/fc9 zero size 2026-03-10T12:38:14.145 INFO:tasks.workunit.client.1.vm07.stdout:7/655: rename d0/ca6 to d0/d47/dde/cdf 0 2026-03-10T12:38:14.146 INFO:tasks.workunit.client.1.vm07.stdout:1/682: sync 2026-03-10T12:38:14.159 INFO:tasks.workunit.client.0.vm00.stdout:9/909: dwrite d0/d3d/d43/d53/f66 [0,4194304] 0 2026-03-10T12:38:14.164 INFO:tasks.workunit.client.0.vm00.stdout:7/652: getdents da/d26/d37/d56 0 2026-03-10T12:38:14.165 INFO:tasks.workunit.client.0.vm00.stdout:7/653: symlink da/d3f/d71/le6 0 2026-03-10T12:38:14.166 INFO:tasks.workunit.client.0.vm00.stdout:7/654: readlink da/d25/d2c/d82/d68/l3b 0 2026-03-10T12:38:14.182 INFO:tasks.workunit.client.0.vm00.stdout:5/941: mkdir d1f/d6a/d94/dc3/de7/d14a 0 2026-03-10T12:38:14.183 INFO:tasks.workunit.client.0.vm00.stdout:8/830: write d0/d93/d60/f98 [846616,47984] 0 2026-03-10T12:38:14.188 INFO:tasks.workunit.client.0.vm00.stdout:8/831: write d0/d93/d17/ff9 [984128,86013] 0 2026-03-10T12:38:14.194 INFO:tasks.workunit.client.0.vm00.stdout:4/898: dwrite df/d1f/d22/f3c [4194304,4194304] 0 2026-03-10T12:38:14.197 INFO:tasks.workunit.client.0.vm00.stdout:0/761: dwrite d3/d7/d4c/d5b/d38/db3/de2/fad [0,4194304] 0 2026-03-10T12:38:14.201 
INFO:tasks.workunit.client.0.vm00.stdout:0/762: dread - d3/d7/d3c/d4b/f79 zero size 2026-03-10T12:38:14.220 INFO:tasks.workunit.client.0.vm00.stdout:0/763: rename d3/d7/d4c/d5b/d38/db3/de2/fd4 to d3/db/d77/ff7 0 2026-03-10T12:38:14.221 INFO:tasks.workunit.client.1.vm07.stdout:1/683: getdents d9/df/d79 0 2026-03-10T12:38:14.225 INFO:tasks.workunit.client.0.vm00.stdout:0/764: truncate d3/d7/d3c/f19 848301 0 2026-03-10T12:38:14.226 INFO:tasks.workunit.client.0.vm00.stdout:7/655: creat da/d25/d2e/d4c/fe7 x:0 0 0 2026-03-10T12:38:14.227 INFO:tasks.workunit.client.1.vm07.stdout:1/684: mkdir d9/d2d/d4f/d75/de3 0 2026-03-10T12:38:14.230 INFO:tasks.workunit.client.0.vm00.stdout:0/765: creat d3/d7/d4c/d5b/d38/d44/d5a/ff8 x:0 0 0 2026-03-10T12:38:14.231 INFO:tasks.workunit.client.1.vm07.stdout:1/685: symlink d9/df/d29/d2b/d92/le4 0 2026-03-10T12:38:14.233 INFO:tasks.workunit.client.0.vm00.stdout:7/656: creat da/d26/d50/d73/fe8 x:0 0 0 2026-03-10T12:38:14.234 INFO:tasks.workunit.client.1.vm07.stdout:1/686: fdatasync d9/f1f 0 2026-03-10T12:38:14.234 INFO:tasks.workunit.client.0.vm00.stdout:3/897: truncate dd/d18/d13/f9e 730293 0 2026-03-10T12:38:14.235 INFO:tasks.workunit.client.0.vm00.stdout:3/898: write dd/d64/fb9 [3321279,82935] 0 2026-03-10T12:38:14.241 INFO:tasks.workunit.client.0.vm00.stdout:3/899: dwrite dd/d18/d13/f6b [4194304,4194304] 0 2026-03-10T12:38:14.246 INFO:tasks.workunit.client.0.vm00.stdout:3/900: truncate dd/d3d/d8a/de0/d55/dfd/f120 1036248 0 2026-03-10T12:38:14.248 INFO:tasks.workunit.client.0.vm00.stdout:0/766: read d3/db/d24/d25/f7d [1985155,25713] 0 2026-03-10T12:38:14.249 INFO:tasks.workunit.client.0.vm00.stdout:0/767: mkdir d3/d7/d4c/d5b/d38/d44/df9 0 2026-03-10T12:38:14.251 INFO:tasks.workunit.client.0.vm00.stdout:0/768: fsync d3/d40/fec 0 2026-03-10T12:38:14.253 INFO:tasks.workunit.client.0.vm00.stdout:0/769: rename d3/d7/d4c/d5b/d38/l4f to d3/d7/d4c/d5b/dc5/lfa 0 2026-03-10T12:38:14.255 INFO:tasks.workunit.client.0.vm00.stdout:0/770: creat 
d3/d7/d4c/d9d/ffb x:0 0 0 2026-03-10T12:38:14.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.256+0000 7fe663c6c700 1 -- 192.168.123.100:0/2584611648 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe65c072360 msgr2=0x7fe65c0770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:14.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.256+0000 7fe663c6c700 1 --2- 192.168.123.100:0/2584611648 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe65c072360 0x7fe65c0770e0 secure :-1 s=READY pgs=343 cs=0 l=1 rev1=1 crypto rx=0x7fe654009230 tx=0x7fe654009260 comp rx=0 tx=0).stop 2026-03-10T12:38:14.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.256+0000 7fe663c6c700 1 -- 192.168.123.100:0/2584611648 shutdown_connections 2026-03-10T12:38:14.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.256+0000 7fe663c6c700 1 --2- 192.168.123.100:0/2584611648 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe65c072360 0x7fe65c0770e0 unknown :-1 s=CLOSED pgs=343 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.256+0000 7fe663c6c700 1 --2- 192.168.123.100:0/2584611648 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe65c071980 0x7fe65c071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.256+0000 7fe663c6c700 1 -- 192.168.123.100:0/2584611648 >> 192.168.123.100:0/2584611648 conn(0x7fe65c06d1a0 msgr2=0x7fe65c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:14.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.256+0000 7fe663c6c700 1 -- 192.168.123.100:0/2584611648 shutdown_connections 2026-03-10T12:38:14.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.256+0000 7fe663c6c700 1 -- 
192.168.123.100:0/2584611648 wait complete. 2026-03-10T12:38:14.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.257+0000 7fe663c6c700 1 Processor -- start 2026-03-10T12:38:14.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.257+0000 7fe663c6c700 1 -- start start 2026-03-10T12:38:14.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.257+0000 7fe663c6c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe65c071980 0x7fe65c082490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:14.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.257+0000 7fe663c6c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe65c0829d0 0x7fe65c082e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:14.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.257+0000 7fe663c6c700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe65c083e40 con 0x7fe65c0829d0 2026-03-10T12:38:14.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.257+0000 7fe663c6c700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe65c1b2a90 con 0x7fe65c071980 2026-03-10T12:38:14.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.257+0000 7fe661a08700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe65c071980 0x7fe65c082490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:14.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.257+0000 7fe661a08700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe65c071980 0x7fe65c082490 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:32780/0 (socket says 192.168.123.100:32780) 2026-03-10T12:38:14.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.257+0000 7fe661a08700 1 -- 192.168.123.100:0/4276140805 learned_addr learned my addr 192.168.123.100:0/4276140805 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:38:14.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.258+0000 7fe661207700 1 --2- 192.168.123.100:0/4276140805 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe65c0829d0 0x7fe65c082e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:14.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.258+0000 7fe661a08700 1 -- 192.168.123.100:0/4276140805 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe65c0829d0 msgr2=0x7fe65c082e40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:14.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.258+0000 7fe661a08700 1 --2- 192.168.123.100:0/4276140805 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe65c0829d0 0x7fe65c082e40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.259 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.258+0000 7fe661a08700 1 -- 192.168.123.100:0/4276140805 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe654008ee0 con 0x7fe65c071980 2026-03-10T12:38:14.260 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.260+0000 7fe661a08700 1 --2- 192.168.123.100:0/4276140805 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe65c071980 0x7fe65c082490 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fe65800c8a0 tx=0x7fe65800cc60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:14.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.260+0000 7fe652ffd700 1 -- 192.168.123.100:0/4276140805 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe65800cea0 con 0x7fe65c071980 2026-03-10T12:38:14.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.260+0000 7fe663c6c700 1 -- 192.168.123.100:0/4276140805 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe65c1b2c90 con 0x7fe65c071980 2026-03-10T12:38:14.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.260+0000 7fe663c6c700 1 -- 192.168.123.100:0/4276140805 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe65c1b3190 con 0x7fe65c071980 2026-03-10T12:38:14.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.261+0000 7fe652ffd700 1 -- 192.168.123.100:0/4276140805 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe658004830 con 0x7fe65c071980 2026-03-10T12:38:14.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.261+0000 7fe663c6c700 1 -- 192.168.123.100:0/4276140805 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe65c07c8b0 con 0x7fe65c071980 2026-03-10T12:38:14.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.261+0000 7fe652ffd700 1 -- 192.168.123.100:0/4276140805 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe6580056a0 con 0x7fe65c071980 2026-03-10T12:38:14.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.261+0000 7fe652ffd700 1 -- 192.168.123.100:0/4276140805 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 23) v1 ==== 50327+0+0 (secure 0 0 0) 0x7fe658007770 con 0x7fe65c071980 2026-03-10T12:38:14.262 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.262+0000 7fe652ffd700 1 --2- 192.168.123.100:0/4276140805 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fe64803dea0 0x7fe648040350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:14.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.262+0000 7fe652ffd700 1 -- 192.168.123.100:0/4276140805 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5448+0+0 (secure 0 0 0) 0x7fe658058d00 con 0x7fe65c071980 2026-03-10T12:38:14.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.262+0000 7fe661207700 1 --2- 192.168.123.100:0/4276140805 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fe64803dea0 0x7fe648040350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:14.263 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.263+0000 7fe661207700 1 --2- 192.168.123.100:0/4276140805 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fe64803dea0 0x7fe648040350 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fe65400ec60 tx=0x7fe654010040 comp rx=0 tx=0).ready entity=mgr.24461 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:14.265 INFO:tasks.workunit.client.0.vm00.stdout:0/771: dread d3/d7/d4c/f96 [0,4194304] 0 2026-03-10T12:38:14.265 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.265+0000 7fe652ffd700 1 -- 192.168.123.100:0/4276140805 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fe658017d70 con 0x7fe65c071980 2026-03-10T12:38:14.283 INFO:tasks.workunit.client.0.vm00.stdout:9/910: write d0/fdc [8529485,39408] 0 2026-03-10T12:38:14.284 
INFO:tasks.workunit.client.0.vm00.stdout:5/942: write d1f/d26/d2b/d35/f41 [365636,34136] 0 2026-03-10T12:38:14.289 INFO:tasks.workunit.client.0.vm00.stdout:8/832: read d0/dd/fbc [272088,24256] 0 2026-03-10T12:38:14.306 INFO:tasks.workunit.client.0.vm00.stdout:1/909: creat da/d21/db3/d59/da6/d8b/d98/f12f x:0 0 0 2026-03-10T12:38:14.307 INFO:tasks.workunit.client.0.vm00.stdout:1/910: write da/d24/d28/fdd [38615,17541] 0 2026-03-10T12:38:14.309 INFO:tasks.workunit.client.0.vm00.stdout:2/906: dread d4/d6/f9c [0,4194304] 0 2026-03-10T12:38:14.311 INFO:tasks.workunit.client.1.vm07.stdout:2/609: write d0/d42/f1b [318487,30524] 0 2026-03-10T12:38:14.322 INFO:tasks.workunit.client.1.vm07.stdout:2/610: unlink d0/d29/d64/f78 0 2026-03-10T12:38:14.323 INFO:tasks.workunit.client.1.vm07.stdout:2/611: read - d0/d42/d4e/daf/fcf zero size 2026-03-10T12:38:14.327 INFO:tasks.workunit.client.1.vm07.stdout:8/658: dwrite d1/d3/d6/faf [0,4194304] 0 2026-03-10T12:38:14.328 INFO:tasks.workunit.client.1.vm07.stdout:2/612: dwrite d0/d80/d93/fce [0,4194304] 0 2026-03-10T12:38:14.329 INFO:tasks.workunit.client.1.vm07.stdout:8/659: chown d1/d3/d6/faf 251 1 2026-03-10T12:38:14.340 INFO:tasks.workunit.client.1.vm07.stdout:8/660: mkdir d1/d3/d6/d50/d70/dd4/dd7 0 2026-03-10T12:38:14.340 INFO:tasks.workunit.client.0.vm00.stdout:3/901: truncate dd/d27/d2c/fb1 2193486 0 2026-03-10T12:38:14.340 INFO:tasks.workunit.client.1.vm07.stdout:8/661: readlink d1/la0 0 2026-03-10T12:38:14.349 INFO:tasks.workunit.client.1.vm07.stdout:8/662: rename d1/d3/c7a to d1/d3/d11/d87/cd8 0 2026-03-10T12:38:14.356 INFO:tasks.workunit.client.1.vm07.stdout:8/663: dread d1/d3/d11/f35 [0,4194304] 0 2026-03-10T12:38:14.356 INFO:tasks.workunit.client.1.vm07.stdout:8/664: read - d1/d3/d40/f5a zero size 2026-03-10T12:38:14.361 INFO:tasks.workunit.client.0.vm00.stdout:5/943: dread d1f/d26/d2b/d35/d78/d7f/fb9 [0,4194304] 0 2026-03-10T12:38:14.365 INFO:tasks.workunit.client.1.vm07.stdout:8/665: rename d1/d3/d6/d50/f5e to d1/d3/d18/fd9 
0 2026-03-10T12:38:14.366 INFO:tasks.workunit.client.1.vm07.stdout:8/666: truncate d1/d3/d11/f86 5090970 0 2026-03-10T12:38:14.368 INFO:tasks.workunit.client.1.vm07.stdout:8/667: rename d1/d3/d18/f2e to d1/d3/d6c/fda 0 2026-03-10T12:38:14.368 INFO:tasks.workunit.client.1.vm07.stdout:2/613: sync 2026-03-10T12:38:14.371 INFO:tasks.workunit.client.1.vm07.stdout:8/668: creat d1/d3/d6/d54/dd2/fdb x:0 0 0 2026-03-10T12:38:14.371 INFO:tasks.workunit.client.0.vm00.stdout:8/833: symlink d0/d93/d36/d51/l103 0 2026-03-10T12:38:14.380 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:14 vm00.local ceph-mon[50686]: mgrmap e22: vm07.kfawlb(active, since 1.16355s) 2026-03-10T12:38:14.382 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:14 vm00.local ceph-mon[50686]: from='client.24465 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:38:14.382 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:14 vm00.local ceph-mon[50686]: pgmap v3: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-10T12:38:14.382 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:14 vm00.local ceph-mon[50686]: from='client.24491 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:38:14.385 INFO:tasks.workunit.client.1.vm07.stdout:2/614: dread d0/d29/d64/d74/d88/f58 [0,4194304] 0 2026-03-10T12:38:14.386 INFO:tasks.workunit.client.1.vm07.stdout:8/669: dread d1/d3/d40/f5b [0,4194304] 0 2026-03-10T12:38:14.389 INFO:tasks.workunit.client.1.vm07.stdout:2/615: rename d0/d42/d1f/d90/c5e to d0/d29/d64/d74/d88/cd1 0 2026-03-10T12:38:14.390 INFO:tasks.workunit.client.1.vm07.stdout:8/670: mknod d1/d3/d11/cdc 0 2026-03-10T12:38:14.392 INFO:tasks.workunit.client.1.vm07.stdout:2/616: creat d0/d29/d64/fd2 x:0 0 0 2026-03-10T12:38:14.395 INFO:tasks.workunit.client.1.vm07.stdout:8/671: getdents d1/d3/d6/d54 0 2026-03-10T12:38:14.397 
INFO:tasks.workunit.client.1.vm07.stdout:9/765: write d5/d16/d23/d26/f46 [222390,114423] 0 2026-03-10T12:38:14.398 INFO:tasks.workunit.client.1.vm07.stdout:9/766: chown d5/d13/d22/l3f 8367533 1 2026-03-10T12:38:14.399 INFO:tasks.workunit.client.1.vm07.stdout:8/672: symlink d1/d3/d40/ldd 0 2026-03-10T12:38:14.399 INFO:tasks.workunit.client.1.vm07.stdout:8/673: stat d1/d3/db2/dcd 0 2026-03-10T12:38:14.404 INFO:tasks.workunit.client.1.vm07.stdout:9/767: mkdir d5/d13/d6c/da4/d102 0 2026-03-10T12:38:14.404 INFO:tasks.workunit.client.1.vm07.stdout:2/617: dread d0/f4a [0,4194304] 0 2026-03-10T12:38:14.404 INFO:tasks.workunit.client.0.vm00.stdout:9/911: write d0/d3d/d59/d4e/dba/d1e/d2b/f5f [3963600,26667] 0 2026-03-10T12:38:14.407 INFO:tasks.workunit.client.0.vm00.stdout:4/899: dwrite df/f3d [0,4194304] 0 2026-03-10T12:38:14.408 INFO:tasks.workunit.client.0.vm00.stdout:9/912: dwrite d0/d3d/d59/d4e/f70 [0,4194304] 0 2026-03-10T12:38:14.414 INFO:tasks.workunit.client.1.vm07.stdout:8/674: mkdir d1/d3/d6c/dde 0 2026-03-10T12:38:14.416 INFO:tasks.workunit.client.1.vm07.stdout:2/618: creat d0/d42/d26/d38/d4f/d62/fd3 x:0 0 0 2026-03-10T12:38:14.418 INFO:tasks.workunit.client.0.vm00.stdout:2/907: fsync d4/d6/d93/dc6/fd1 0 2026-03-10T12:38:14.421 INFO:tasks.workunit.client.1.vm07.stdout:8/675: mknod d1/d3/d6/d50/cdf 0 2026-03-10T12:38:14.422 INFO:tasks.workunit.client.1.vm07.stdout:9/768: link d5/d13/d2c/l3c d5/d16/d23/d26/d68/l103 0 2026-03-10T12:38:14.424 INFO:tasks.workunit.client.1.vm07.stdout:8/676: rmdir d1/d3/d11/d87 39 2026-03-10T12:38:14.424 INFO:tasks.workunit.client.1.vm07.stdout:8/677: chown d1/d3/d11/cdc 315628 1 2026-03-10T12:38:14.437 INFO:tasks.workunit.client.1.vm07.stdout:9/769: creat d5/d13/d6c/f104 x:0 0 0 2026-03-10T12:38:14.446 INFO:tasks.workunit.client.0.vm00.stdout:8/834: symlink d0/dd/d38/d81/l104 0 2026-03-10T12:38:14.466 INFO:tasks.workunit.client.1.vm07.stdout:4/798: write d0/d4/d5/da/fee [316683,91555] 0 2026-03-10T12:38:14.466 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.466+0000 7fe663c6c700 1 -- 192.168.123.100:0/4276140805 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fe65c1b2e20 con 0x7fe65c071980 2026-03-10T12:38:14.467 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.466+0000 7fe652ffd700 1 -- 192.168.123.100:0/4276140805 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1870 (secure 0 0 0) 0x7fe65c1b2e20 con 0x7fe65c071980 2026-03-10T12:38:14.468 INFO:tasks.workunit.client.1.vm07.stdout:4/799: mknod d0/d4/d10/c118 0 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:e13 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:epoch 13 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:37:36.646083+0000 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 
2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307} 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:failed 2026-03-10T12:38:14.469 INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:38:14.470 INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:38:14.470 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:38:14.470 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:38:14.470 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:38:14.470 INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:38:14.470 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:38:14.470 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1 2026-03-10T12:38:14.470 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} 
state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:38:14.470 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:38:14.470 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:38:14.470 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:38:14.470 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons: 2026-03-10T12:38:14.470 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:38:14.470 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:38:14.470 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:38:14.471 INFO:tasks.workunit.client.0.vm00.stdout:6/609: dwrite d2/da/dc/f25 [0,4194304] 0 2026-03-10T12:38:14.472 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.472+0000 7fe650ff9700 1 -- 192.168.123.100:0/4276140805 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fe64803dea0 msgr2=0x7fe648040350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:14.472 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.472+0000 7fe650ff9700 1 --2- 192.168.123.100:0/4276140805 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fe64803dea0 0x7fe648040350 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fe65400ec60 tx=0x7fe654010040 comp rx=0 tx=0).stop 2026-03-10T12:38:14.472 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.472+0000 7fe650ff9700 1 -- 192.168.123.100:0/4276140805 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe65c071980 msgr2=0x7fe65c082490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:14.472 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.472+0000 7fe650ff9700 1 --2- 192.168.123.100:0/4276140805 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe65c071980 0x7fe65c082490 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fe65800c8a0 tx=0x7fe65800cc60 comp rx=0 tx=0).stop 2026-03-10T12:38:14.472 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.472+0000 7fe650ff9700 1 -- 192.168.123.100:0/4276140805 shutdown_connections 2026-03-10T12:38:14.472 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.472+0000 7fe650ff9700 1 --2- 192.168.123.100:0/4276140805 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fe64803dea0 0x7fe648040350 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.472 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.472+0000 7fe650ff9700 1 --2- 192.168.123.100:0/4276140805 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe65c071980 0x7fe65c082490 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.472 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.472+0000 7fe650ff9700 1 --2- 192.168.123.100:0/4276140805 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe65c0829d0 0x7fe65c082e40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.473 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.472+0000 7fe650ff9700 1 -- 192.168.123.100:0/4276140805 >> 192.168.123.100:0/4276140805 conn(0x7fe65c06d1a0 msgr2=0x7fe65c076460 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T12:38:14.473 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.472+0000 7fe650ff9700 1 -- 192.168.123.100:0/4276140805 shutdown_connections 2026-03-10T12:38:14.473 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.472+0000 7fe650ff9700 1 -- 192.168.123.100:0/4276140805 wait complete. 2026-03-10T12:38:14.473 INFO:tasks.workunit.client.1.vm07.stdout:4/800: read d0/d4/d10/d3c/d2b/d2d/f65 [2688971,81748] 0 2026-03-10T12:38:14.476 INFO:tasks.workunit.client.1.vm07.stdout:6/661: dwrite d1/d4/d6/f2a [0,4194304] 0 2026-03-10T12:38:14.477 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 13 2026-03-10T12:38:14.479 INFO:tasks.workunit.client.0.vm00.stdout:4/900: dread f3 [0,4194304] 0 2026-03-10T12:38:14.480 INFO:tasks.workunit.client.0.vm00.stdout:4/901: write df/d1f/d22/d26/d65/d91/db9/fea [2219551,105902] 0 2026-03-10T12:38:14.481 INFO:tasks.workunit.client.0.vm00.stdout:4/902: stat c7 0 2026-03-10T12:38:14.490 INFO:tasks.workunit.client.1.vm07.stdout:3/705: dwrite dc/d18/d2d/f80 [0,4194304] 0 2026-03-10T12:38:14.491 INFO:tasks.workunit.client.1.vm07.stdout:4/801: mkdir d0/d4/d5/d78/dc5/df7/d119 0 2026-03-10T12:38:14.492 INFO:tasks.workunit.client.1.vm07.stdout:6/662: rmdir d1/d4/d6/d46 39 2026-03-10T12:38:14.495 INFO:tasks.workunit.client.0.vm00.stdout:9/913: rmdir d0/d3d/d59/d4e/dba 39 2026-03-10T12:38:14.500 INFO:tasks.workunit.client.0.vm00.stdout:3/902: write dd/d2a/f78 [5397725,99136] 0 2026-03-10T12:38:14.501 INFO:tasks.workunit.client.0.vm00.stdout:2/908: symlink d4/d6/dca/l122 0 2026-03-10T12:38:14.505 INFO:tasks.workunit.client.0.vm00.stdout:5/944: dwrite d1f/d26/d2b/f5c [0,4194304] 0 2026-03-10T12:38:14.507 INFO:tasks.workunit.client.0.vm00.stdout:5/945: write d1f/d26/d2b/d37/db2/f142 [893276,70280] 0 2026-03-10T12:38:14.508 INFO:tasks.workunit.client.1.vm07.stdout:3/706: mkdir dc/dd/d43/df1 0 2026-03-10T12:38:14.516 INFO:tasks.workunit.client.1.vm07.stdout:5/732: dwrite d0/d22/d18/d19/d2e/f62 
[0,4194304] 0 2026-03-10T12:38:14.518 INFO:tasks.workunit.client.1.vm07.stdout:6/663: mkdir d1/d4/d6/d53/da3/dd5 0 2026-03-10T12:38:14.525 INFO:tasks.workunit.client.1.vm07.stdout:3/707: rename dc/d18/lb2 to dc/dd/d43/d76/d95/da0/lf2 0 2026-03-10T12:38:14.532 INFO:tasks.workunit.client.1.vm07.stdout:6/664: mkdir d1/d4/d6/d53/d66/dd6 0 2026-03-10T12:38:14.534 INFO:tasks.workunit.client.1.vm07.stdout:3/708: symlink dc/dd/d43/d76/daf/lf3 0 2026-03-10T12:38:14.555 INFO:tasks.workunit.client.0.vm00.stdout:8/835: creat d0/dd/d38/f105 x:0 0 0 2026-03-10T12:38:14.560 INFO:tasks.workunit.client.1.vm07.stdout:0/794: dwrite d0/d14/d5f/d76/f78 [0,4194304] 0 2026-03-10T12:38:14.564 INFO:tasks.workunit.client.0.vm00.stdout:1/911: truncate da/d21/f74 4624128 0 2026-03-10T12:38:14.564 INFO:tasks.workunit.client.1.vm07.stdout:0/795: dwrite d0/d14/d5f/d76/f30 [4194304,4194304] 0 2026-03-10T12:38:14.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:14 vm07.local ceph-mon[58582]: mgrmap e22: vm07.kfawlb(active, since 1.16355s) 2026-03-10T12:38:14.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:14 vm07.local ceph-mon[58582]: from='client.24465 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:38:14.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:14 vm07.local ceph-mon[58582]: pgmap v3: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-10T12:38:14.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:14 vm07.local ceph-mon[58582]: from='client.24491 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:38:14.575 INFO:tasks.workunit.client.0.vm00.stdout:9/914: chown d0/d3d/d59/d4e/dba/d1e/d27/d115/lac 61001 1 2026-03-10T12:38:14.577 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.575+0000 7fdaa78ba700 1 -- 192.168.123.100:0/484458066 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7fdaa0071980 msgr2=0x7fdaa0071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.575+0000 7fdaa78ba700 1 --2- 192.168.123.100:0/484458066 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdaa0071980 0x7fdaa0071d90 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fda9c00bc70 tx=0x7fda9c00bf80 comp rx=0 tx=0).stop 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.575+0000 7fdaa78ba700 1 -- 192.168.123.100:0/484458066 shutdown_connections 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.575+0000 7fdaa78ba700 1 --2- 192.168.123.100:0/484458066 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdaa0072360 0x7fdaa00770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.575+0000 7fdaa78ba700 1 --2- 192.168.123.100:0/484458066 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdaa0071980 0x7fdaa0071d90 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.575+0000 7fdaa78ba700 1 -- 192.168.123.100:0/484458066 >> 192.168.123.100:0/484458066 conn(0x7fdaa006d1a0 msgr2=0x7fdaa006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.576+0000 7fdaa78ba700 1 -- 192.168.123.100:0/484458066 shutdown_connections 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.576+0000 7fdaa78ba700 1 -- 192.168.123.100:0/484458066 wait complete. 
2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.576+0000 7fdaa78ba700 1 Processor -- start 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.576+0000 7fdaa78ba700 1 -- start start 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.576+0000 7fdaa78ba700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdaa0072360 0x7fdaa0082520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.576+0000 7fdaa78ba700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdaa0082a60 0x7fdaa0082ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.576+0000 7fdaa78ba700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdaa012dd80 con 0x7fdaa0082a60 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.576+0000 7fdaa78ba700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdaa012def0 con 0x7fdaa0072360 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.577+0000 7fdaa4e55700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdaa0082a60 0x7fdaa0082ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.577+0000 7fdaa4e55700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdaa0082a60 0x7fdaa0082ed0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:47452/0 (socket says 192.168.123.100:47452) 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.577+0000 7fdaa4e55700 1 -- 192.168.123.100:0/1108704450 learned_addr learned my addr 192.168.123.100:0/1108704450 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.577+0000 7fdaa5656700 1 --2- 192.168.123.100:0/1108704450 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdaa0072360 0x7fdaa0082520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.577+0000 7fdaa5656700 1 -- 192.168.123.100:0/1108704450 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdaa0082a60 msgr2=0x7fdaa0082ed0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.577+0000 7fdaa5656700 1 --2- 192.168.123.100:0/1108704450 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdaa0082a60 0x7fdaa0082ed0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.577+0000 7fdaa5656700 1 -- 192.168.123.100:0/1108704450 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fda9c00b920 con 0x7fdaa0072360 2026-03-10T12:38:14.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.578+0000 7fdaa5656700 1 --2- 192.168.123.100:0/1108704450 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdaa0072360 0x7fdaa0082520 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7fda9c003ab0 tx=0x7fda9c0045a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:38:14.580 INFO:tasks.workunit.client.0.vm00.stdout:3/903: mkdir dd/d3d/d8a/de0/d55/dfd/d125/d2b/d128 0 2026-03-10T12:38:14.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.580+0000 7fda967fc700 1 -- 192.168.123.100:0/1108704450 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda9c010030 con 0x7fdaa0072360 2026-03-10T12:38:14.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.580+0000 7fda967fc700 1 -- 192.168.123.100:0/1108704450 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fda9c00dd60 con 0x7fdaa0072360 2026-03-10T12:38:14.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.580+0000 7fda967fc700 1 -- 192.168.123.100:0/1108704450 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda9c014880 con 0x7fdaa0072360 2026-03-10T12:38:14.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.580+0000 7fdaa78ba700 1 -- 192.168.123.100:0/1108704450 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdaa012e0c0 con 0x7fdaa0072360 2026-03-10T12:38:14.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.580+0000 7fdaa78ba700 1 -- 192.168.123.100:0/1108704450 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdaa012e5b0 con 0x7fdaa0072360 2026-03-10T12:38:14.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.581+0000 7fda8bfff700 1 -- 192.168.123.100:0/1108704450 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdaa004ea50 con 0x7fdaa0072360 2026-03-10T12:38:14.581 INFO:tasks.workunit.client.0.vm00.stdout:8/836: creat d0/d93/d36/d7d/f106 x:0 0 0 2026-03-10T12:38:14.581 INFO:tasks.workunit.client.0.vm00.stdout:1/912: dread - da/d12/d91/fb5 zero size 2026-03-10T12:38:14.583 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.582+0000 7fda967fc700 1 -- 192.168.123.100:0/1108704450 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 23) v1 ==== 50327+0+0 (secure 0 0 0) 0x7fda9c01dc60 con 0x7fdaa0072360 2026-03-10T12:38:14.584 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.582+0000 7fda967fc700 1 --2- 192.168.123.100:0/1108704450 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fda8c03def0 0x7fda8c0403a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:14.584 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.582+0000 7fda967fc700 1 -- 192.168.123.100:0/1108704450 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5448+0+0 (secure 0 0 0) 0x7fda9c02b030 con 0x7fdaa0072360 2026-03-10T12:38:14.584 INFO:tasks.workunit.client.0.vm00.stdout:3/904: symlink dd/d2a/l129 0 2026-03-10T12:38:14.584 INFO:tasks.workunit.client.0.vm00.stdout:9/915: creat d0/d3d/d43/d53/d126/f146 x:0 0 0 2026-03-10T12:38:14.585 INFO:tasks.workunit.client.0.vm00.stdout:4/903: creat df/d1f/d22/dcb/def/f128 x:0 0 0 2026-03-10T12:38:14.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.585+0000 7fda967fc700 1 -- 192.168.123.100:0/1108704450 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fda9c01cbd0 con 0x7fdaa0072360 2026-03-10T12:38:14.588 INFO:tasks.workunit.client.0.vm00.stdout:1/913: fsync da/d21/d27/fe8 0 2026-03-10T12:38:14.590 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.589+0000 7fdaa4e55700 1 --2- 192.168.123.100:0/1108704450 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fda8c03def0 0x7fda8c0403a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T12:38:14.595 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.594+0000 7fdaa4e55700 1 --2- 192.168.123.100:0/1108704450 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fda8c03def0 0x7fda8c0403a0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fda980078a0 tx=0x7fda98007d00 comp rx=0 tx=0).ready entity=mgr.24461 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:14.614 INFO:tasks.workunit.client.1.vm07.stdout:0/796: dread d0/d14/d7c/fad [0,4194304] 0 2026-03-10T12:38:14.616 INFO:tasks.workunit.client.1.vm07.stdout:0/797: read d0/d14/d5f/d3b/fcb [379714,86418] 0 2026-03-10T12:38:14.619 INFO:tasks.workunit.client.1.vm07.stdout:0/798: rename d0/d14/d5f/d3b/dbc/d8d to d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dd9/d10c 0 2026-03-10T12:38:14.622 INFO:tasks.workunit.client.1.vm07.stdout:0/799: creat d0/d14/d5f/d76/d2f/d31/f10d x:0 0 0 2026-03-10T12:38:14.622 INFO:tasks.workunit.client.0.vm00.stdout:4/904: dread df/f12 [0,4194304] 0 2026-03-10T12:38:14.636 INFO:tasks.workunit.client.1.vm07.stdout:7/656: dwrite d0/d61/db4/d8a/d9d/fb1 [0,4194304] 0 2026-03-10T12:38:14.641 INFO:tasks.workunit.client.1.vm07.stdout:6/665: fdatasync d1/d4/d6/f2a 0 2026-03-10T12:38:14.641 INFO:tasks.workunit.client.1.vm07.stdout:6/666: readlink d1/d4/d6/d16/d1a/d2c/lcb 0 2026-03-10T12:38:14.642 INFO:tasks.workunit.client.0.vm00.stdout:3/905: unlink dd/d64/d93/ce5 0 2026-03-10T12:38:14.645 INFO:tasks.workunit.client.0.vm00.stdout:3/906: dread dd/d27/d2c/f89 [0,4194304] 0 2026-03-10T12:38:14.653 INFO:tasks.workunit.client.0.vm00.stdout:5/946: creat d1f/f14b x:0 0 0 2026-03-10T12:38:14.655 INFO:tasks.workunit.client.1.vm07.stdout:7/657: dread d0/f20 [0,4194304] 0 2026-03-10T12:38:14.659 INFO:tasks.workunit.client.1.vm07.stdout:7/658: dwrite d0/d61/db4/fc4 [0,4194304] 0 2026-03-10T12:38:14.660 INFO:tasks.workunit.client.1.vm07.stdout:7/659: dread d0/d52/f5d [0,4194304] 0 2026-03-10T12:38:14.676 
INFO:tasks.workunit.client.0.vm00.stdout:1/914: chown da/d21/d27/d6a/f6d 4464016 1 2026-03-10T12:38:14.692 INFO:tasks.workunit.client.0.vm00.stdout:3/907: creat dd/d2a/da2/f12a x:0 0 0 2026-03-10T12:38:14.694 INFO:tasks.workunit.client.0.vm00.stdout:5/947: rmdir d1f/d26/d2b 39 2026-03-10T12:38:14.705 INFO:tasks.workunit.client.0.vm00.stdout:4/905: dread df/d1f/d22/d26/d65/d91/db9/fea [0,4194304] 0 2026-03-10T12:38:14.709 INFO:tasks.workunit.client.0.vm00.stdout:3/908: mknod dd/d64/c12b 0 2026-03-10T12:38:14.713 INFO:tasks.workunit.client.0.vm00.stdout:5/948: creat d1f/d6a/f14c x:0 0 0 2026-03-10T12:38:14.724 INFO:tasks.workunit.client.0.vm00.stdout:4/906: rmdir df/d1f/d36 39 2026-03-10T12:38:14.725 INFO:tasks.workunit.client.0.vm00.stdout:4/907: chown df/d1f/d22/d26/ff0 0 1 2026-03-10T12:38:14.747 INFO:tasks.workunit.client.0.vm00.stdout:5/949: dwrite d1f/d26/d2e/d58/d10c/d123/d72/fa0 [0,4194304] 0 2026-03-10T12:38:14.758 INFO:tasks.workunit.client.0.vm00.stdout:4/908: rename df/d1f/d36/d3a/fe1 to df/d63/d94/f129 0 2026-03-10T12:38:14.761 INFO:tasks.workunit.client.0.vm00.stdout:4/909: dwrite df/d1f/d22/d26/dab/f89 [0,4194304] 0 2026-03-10T12:38:14.772 INFO:tasks.workunit.client.0.vm00.stdout:5/950: symlink d1f/d26/d2b/d37/dcc/l14d 0 2026-03-10T12:38:14.777 INFO:tasks.workunit.client.0.vm00.stdout:4/910: symlink df/d1f/d36/d3a/d41/de4/l12a 0 2026-03-10T12:38:14.782 INFO:tasks.workunit.client.0.vm00.stdout:4/911: mkdir df/d1f/d22/d26/dab/d73/dda/d12b 0 2026-03-10T12:38:14.788 INFO:tasks.workunit.client.0.vm00.stdout:4/912: dread f9 [4194304,4194304] 0 2026-03-10T12:38:14.788 INFO:tasks.workunit.client.0.vm00.stdout:2/909: write d4/d6/d121/d6d/faa [1242038,47611] 0 2026-03-10T12:38:14.795 INFO:tasks.workunit.client.0.vm00.stdout:5/951: rename d1f/d6a/d94/dc3/de7/d14a to d1f/d26/d2b/d37/d14e 0 2026-03-10T12:38:14.796 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.795+0000 7fda8bfff700 1 -- 192.168.123.100:0/1108704450 --> 
[v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fdaa0061ce0 con 0x7fda8c03def0 2026-03-10T12:38:14.797 INFO:tasks.workunit.client.0.vm00.stdout:4/913: creat df/d1f/d36/d3a/d41/de4/f12c x:0 0 0 2026-03-10T12:38:14.803 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:38:14.803 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T12:38:14.803 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true, 2026-03-10T12:38:14.803 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T12:38:14.803 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [], 2026-03-10T12:38:14.803 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "1/23 daemons upgraded", 2026-03-10T12:38:14.803 INFO:teuthology.orchestra.run.vm00.stdout: "message": "", 2026-03-10T12:38:14.803 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false 2026-03-10T12:38:14.803 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:38:14.803 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.800+0000 7fda967fc700 1 -- 192.168.123.100:0/1108704450 <== mgr.24461 v2:192.168.123.107:6828/3729807627 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+318 (secure 0 0 0) 0x7fdaa0061ce0 con 0x7fda8c03def0 2026-03-10T12:38:14.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.804+0000 7fdaa78ba700 1 -- 192.168.123.100:0/1108704450 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fda8c03def0 msgr2=0x7fda8c0403a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:14.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.804+0000 7fdaa78ba700 1 --2- 192.168.123.100:0/1108704450 >> 
[v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fda8c03def0 0x7fda8c0403a0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fda980078a0 tx=0x7fda98007d00 comp rx=0 tx=0).stop 2026-03-10T12:38:14.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.804+0000 7fdaa78ba700 1 -- 192.168.123.100:0/1108704450 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdaa0072360 msgr2=0x7fdaa0082520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:14.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.804+0000 7fdaa78ba700 1 --2- 192.168.123.100:0/1108704450 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdaa0072360 0x7fdaa0082520 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7fda9c003ab0 tx=0x7fda9c0045a0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.804+0000 7fdaa78ba700 1 -- 192.168.123.100:0/1108704450 shutdown_connections 2026-03-10T12:38:14.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.804+0000 7fdaa78ba700 1 --2- 192.168.123.100:0/1108704450 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7fda8c03def0 0x7fda8c0403a0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.804+0000 7fdaa78ba700 1 --2- 192.168.123.100:0/1108704450 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdaa0072360 0x7fdaa0082520 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.804+0000 7fdaa78ba700 1 --2- 192.168.123.100:0/1108704450 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdaa0082a60 0x7fdaa0082ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:38:14.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.804+0000 7fdaa78ba700 1 -- 192.168.123.100:0/1108704450 >> 192.168.123.100:0/1108704450 conn(0x7fdaa006d1a0 msgr2=0x7fdaa006e0c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:14.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.805+0000 7fdaa78ba700 1 -- 192.168.123.100:0/1108704450 shutdown_connections 2026-03-10T12:38:14.805 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.805+0000 7fdaa78ba700 1 -- 192.168.123.100:0/1108704450 wait complete. 2026-03-10T12:38:14.820 INFO:tasks.workunit.client.0.vm00.stdout:7/657: getdents da/d26/d50/d73 0 2026-03-10T12:38:14.823 INFO:tasks.workunit.client.1.vm07.stdout:1/687: dwrite d9/d2d/de2/f4c [0,4194304] 0 2026-03-10T12:38:14.825 INFO:tasks.workunit.client.0.vm00.stdout:5/952: mkdir d1f/d26/d6f/d14f 0 2026-03-10T12:38:14.825 INFO:tasks.workunit.client.0.vm00.stdout:5/953: chown d1f/d26/d2b/d35/d78/d99/daf 1577 1 2026-03-10T12:38:14.832 INFO:tasks.workunit.client.0.vm00.stdout:9/916: write d0/fc9 [1457168,108000] 0 2026-03-10T12:38:14.832 INFO:tasks.workunit.client.0.vm00.stdout:0/772: write d3/d7/d4c/d5b/d38/db3/fca [109719,55054] 0 2026-03-10T12:38:14.833 INFO:tasks.workunit.client.0.vm00.stdout:8/837: dwrite d0/d93/d17/d48/f4c [0,4194304] 0 2026-03-10T12:38:14.845 INFO:tasks.workunit.client.1.vm07.stdout:1/688: truncate d9/df/f58 98328 0 2026-03-10T12:38:14.845 INFO:tasks.workunit.client.0.vm00.stdout:1/915: dwrite da/d21/db3/d59/da6/da4/dda/dc0/dfe/d10e/f11b [0,4194304] 0 2026-03-10T12:38:14.845 INFO:tasks.workunit.client.1.vm07.stdout:1/689: chown d9/f1a 3641 1 2026-03-10T12:38:14.849 INFO:tasks.workunit.client.1.vm07.stdout:1/690: dwrite d9/d2d/d4f/d75/f83 [0,4194304] 0 2026-03-10T12:38:14.852 INFO:tasks.workunit.client.0.vm00.stdout:3/909: dwrite dd/d3d/d8a/de0/d55/dfd/d125/fa0 [0,4194304] 0 2026-03-10T12:38:14.855 INFO:tasks.workunit.client.0.vm00.stdout:3/910: write dd/d2a/da2/de1/d45/f75 
[2965005,41095] 0 2026-03-10T12:38:14.866 INFO:tasks.workunit.client.0.vm00.stdout:7/658: rename da/d47/d87/cbd to da/d41/ce9 0 2026-03-10T12:38:14.869 INFO:tasks.workunit.client.0.vm00.stdout:0/773: dwrite d3/d22/fde [0,4194304] 0 2026-03-10T12:38:14.879 INFO:tasks.workunit.client.0.vm00.stdout:2/910: dwrite d4/d6/d2d/d31/fcc [0,4194304] 0 2026-03-10T12:38:14.883 INFO:tasks.workunit.client.0.vm00.stdout:4/914: creat df/d8a/f12d x:0 0 0 2026-03-10T12:38:14.887 INFO:tasks.workunit.client.0.vm00.stdout:4/915: chown df/f1c 36065333 1 2026-03-10T12:38:14.887 INFO:tasks.workunit.client.0.vm00.stdout:4/916: chown df/d1f/l11d 994905287 1 2026-03-10T12:38:14.887 INFO:tasks.workunit.client.0.vm00.stdout:7/659: link da/d25/l6b da/d47/d87/lea 0 2026-03-10T12:38:14.887 INFO:tasks.workunit.client.0.vm00.stdout:0/774: rename d3/d7/d4c/d5b/f2b to d3/d22/d3a/deb/ffc 0 2026-03-10T12:38:14.888 INFO:tasks.workunit.client.0.vm00.stdout:9/917: mkdir d0/d3d/df2/d147 0 2026-03-10T12:38:14.890 INFO:tasks.workunit.client.0.vm00.stdout:7/660: fsync da/d26/d37/f4a 0 2026-03-10T12:38:14.891 INFO:tasks.workunit.client.0.vm00.stdout:7/661: chown da/d26/d37/ce0 0 1 2026-03-10T12:38:14.891 INFO:tasks.workunit.client.0.vm00.stdout:7/662: readlink da/d1b/l6d 0 2026-03-10T12:38:14.892 INFO:tasks.workunit.client.0.vm00.stdout:7/663: fdatasync da/d41/d48/d81/fcc 0 2026-03-10T12:38:14.892 INFO:tasks.workunit.client.0.vm00.stdout:7/664: chown da/d1b/l3c 15072 1 2026-03-10T12:38:14.896 INFO:tasks.workunit.client.0.vm00.stdout:7/665: mknod da/ceb 0 2026-03-10T12:38:14.897 INFO:tasks.workunit.client.0.vm00.stdout:0/775: rename d3/d7/l41 to d3/d7/lfd 0 2026-03-10T12:38:14.898 INFO:tasks.workunit.client.0.vm00.stdout:7/666: dread da/f10 [0,4194304] 0 2026-03-10T12:38:14.903 INFO:tasks.workunit.client.0.vm00.stdout:7/667: rename da/d26/d37/d56/cb9 to da/d25/d2e/cec 0 2026-03-10T12:38:14.909 INFO:tasks.workunit.client.0.vm00.stdout:7/668: creat da/d26/d37/d56/fed x:0 0 0 2026-03-10T12:38:14.909 
INFO:tasks.workunit.client.0.vm00.stdout:2/911: rmdir d4/d53/d76/d9b/d107 39 2026-03-10T12:38:14.909 INFO:tasks.workunit.client.0.vm00.stdout:7/669: truncate da/d41/d48/fae 1408523 0 2026-03-10T12:38:14.909 INFO:tasks.workunit.client.0.vm00.stdout:7/670: dread - da/d47/fe1 zero size 2026-03-10T12:38:14.910 INFO:tasks.workunit.client.0.vm00.stdout:7/671: write da/d26/f97 [2024344,77527] 0 2026-03-10T12:38:14.917 INFO:tasks.workunit.client.0.vm00.stdout:9/918: fdatasync d0/d3d/d59/d4e/dba/d1e/d27/d115/fdd 0 2026-03-10T12:38:14.917 INFO:tasks.workunit.client.0.vm00.stdout:9/919: write d0/d3d/d59/d4e/dba/d19/d50/f13a [410659,20474] 0 2026-03-10T12:38:14.927 INFO:tasks.workunit.client.0.vm00.stdout:7/672: rmdir da/d3f/dd1 39 2026-03-10T12:38:14.934 INFO:tasks.workunit.client.0.vm00.stdout:5/954: write f11 [4873572,114876] 0 2026-03-10T12:38:14.936 INFO:tasks.workunit.client.0.vm00.stdout:7/673: rename da/d25/l6b to da/d41/d7b/d9d/dba/lee 0 2026-03-10T12:38:14.943 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.942+0000 7f9dd116c700 1 -- 192.168.123.100:0/398640334 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dcc071950 msgr2=0x7f9dcc071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:14.943 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.942+0000 7f9dd116c700 1 --2- 192.168.123.100:0/398640334 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dcc071950 0x7f9dcc071d60 secure :-1 s=READY pgs=344 cs=0 l=1 rev1=1 crypto rx=0x7f9dbc00bc70 tx=0x7f9dbc00bf80 comp rx=0 tx=0).stop 2026-03-10T12:38:14.943 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.942+0000 7f9dd116c700 1 -- 192.168.123.100:0/398640334 shutdown_connections 2026-03-10T12:38:14.943 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.942+0000 7f9dd116c700 1 --2- 192.168.123.100:0/398640334 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9dcc072330 0x7f9dcc0770b0 unknown :-1 
s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.943 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.942+0000 7f9dd116c700 1 --2- 192.168.123.100:0/398640334 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dcc071950 0x7f9dcc071d60 unknown :-1 s=CLOSED pgs=344 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.943 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.942+0000 7f9dd116c700 1 -- 192.168.123.100:0/398640334 >> 192.168.123.100:0/398640334 conn(0x7f9dcc06d1a0 msgr2=0x7f9dcc06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:14.943 INFO:tasks.workunit.client.0.vm00.stdout:1/916: unlink da/d21/db3/d59/da6/d8b/cc7 0 2026-03-10T12:38:14.943 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.942+0000 7f9dd116c700 1 -- 192.168.123.100:0/398640334 shutdown_connections 2026-03-10T12:38:14.943 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.942+0000 7f9dd116c700 1 -- 192.168.123.100:0/398640334 wait complete. 
2026-03-10T12:38:14.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.942+0000 7f9dd116c700 1 Processor -- start 2026-03-10T12:38:14.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.942+0000 7f9dd116c700 1 -- start start 2026-03-10T12:38:14.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.942+0000 7f9dd116c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dcc072330 0x7f9dcc0824c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:14.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.942+0000 7f9dd116c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9dcc082a00 0x7f9dcc082e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:14.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.943+0000 7f9dd116c700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9dcc1b2a90 con 0x7f9dcc072330 2026-03-10T12:38:14.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.943+0000 7f9dd116c700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9dcc1b2bd0 con 0x7f9dcc082a00 2026-03-10T12:38:14.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.943+0000 7f9dcb7fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9dcc082a00 0x7f9dcc082e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:14.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.943+0000 7f9dcb7fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9dcc082a00 0x7f9dcc082e70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:32828/0 (socket says 192.168.123.100:32828) 2026-03-10T12:38:14.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.943+0000 7f9dcb7fe700 1 -- 192.168.123.100:0/1075449107 learned_addr learned my addr 192.168.123.100:0/1075449107 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:38:14.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.944+0000 7f9dcb7fe700 1 -- 192.168.123.100:0/1075449107 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dcc072330 msgr2=0x7f9dcc0824c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:14.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.944+0000 7f9dcb7fe700 1 --2- 192.168.123.100:0/1075449107 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dcc072330 0x7f9dcc0824c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:14.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.944+0000 7f9dcb7fe700 1 -- 192.168.123.100:0/1075449107 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9dbc00b920 con 0x7f9dcc082a00 2026-03-10T12:38:14.945 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.944+0000 7f9dcb7fe700 1 --2- 192.168.123.100:0/1075449107 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9dcc082a00 0x7f9dcc082e70 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f9dc400ea00 tx=0x7f9dc400edc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:14.945 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.945+0000 7f9dc97fa700 1 -- 192.168.123.100:0/1075449107 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9dc4004d60 con 0x7f9dcc082a00 2026-03-10T12:38:14.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.945+0000 7f9dd116c700 1 -- 
192.168.123.100:0/1075449107 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9dcc1b2d10 con 0x7f9dcc082a00 2026-03-10T12:38:14.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.945+0000 7f9dd116c700 1 -- 192.168.123.100:0/1075449107 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9dcc1b3200 con 0x7f9dcc082a00 2026-03-10T12:38:14.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.945+0000 7f9dc97fa700 1 -- 192.168.123.100:0/1075449107 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9dc4013070 con 0x7f9dcc082a00 2026-03-10T12:38:14.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.945+0000 7f9dc97fa700 1 -- 192.168.123.100:0/1075449107 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9dc4008780 con 0x7f9dcc082a00 2026-03-10T12:38:14.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.946+0000 7f9dc97fa700 1 -- 192.168.123.100:0/1075449107 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 23) v1 ==== 50327+0+0 (secure 0 0 0) 0x7f9dc400b7f0 con 0x7f9dcc082a00 2026-03-10T12:38:14.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.946+0000 7f9dc97fa700 1 --2- 192.168.123.100:0/1075449107 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f9db403dea0 0x7f9db4040350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:14.947 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.946+0000 7f9dc97fa700 1 -- 192.168.123.100:0/1075449107 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5448+0+0 (secure 0 0 0) 0x7f9dc4015070 con 0x7f9dcc082a00 2026-03-10T12:38:14.947 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.947+0000 7f9dd116c700 1 -- 192.168.123.100:0/1075449107 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9db8005320 con 0x7f9dcc082a00 2026-03-10T12:38:14.952 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.950+0000 7f9dc97fa700 1 -- 192.168.123.100:0/1075449107 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f9dc400f450 con 0x7f9dcc082a00 2026-03-10T12:38:14.952 INFO:tasks.workunit.client.1.vm07.stdout:1/691: dread d9/df/f4a [0,4194304] 0 2026-03-10T12:38:14.952 INFO:tasks.workunit.client.0.vm00.stdout:7/674: symlink da/d25/lef 0 2026-03-10T12:38:14.954 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.952+0000 7f9dcbfff700 1 --2- 192.168.123.100:0/1075449107 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f9db403dea0 0x7f9db4040350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:14.955 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:14.955+0000 7f9dcbfff700 1 --2- 192.168.123.100:0/1075449107 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f9db403dea0 0x7f9db4040350 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f9dbc00bc70 tx=0x7f9dbc00d330 comp rx=0 tx=0).ready entity=mgr.24461 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:14.957 INFO:tasks.workunit.client.0.vm00.stdout:7/675: dwrite da/d25/d2e/d4c/fe7 [0,4194304] 0 2026-03-10T12:38:14.959 INFO:tasks.workunit.client.0.vm00.stdout:3/911: mkdir dd/d18/d13/d99/dd9/d12c 0 2026-03-10T12:38:14.966 INFO:tasks.workunit.client.0.vm00.stdout:3/912: dwrite dd/d3d/f11e [0,4194304] 0 2026-03-10T12:38:14.969 INFO:tasks.workunit.client.0.vm00.stdout:8/838: dwrite d0/d93/d2d/f55 [0,4194304] 0 2026-03-10T12:38:14.971 INFO:tasks.workunit.client.1.vm07.stdout:1/692: link 
d9/d2d/de2/cae d9/df/d29/d2b/d31/d91/ce5 0 2026-03-10T12:38:14.977 INFO:tasks.workunit.client.0.vm00.stdout:2/912: rmdir d4/d10f/de1 39 2026-03-10T12:38:14.977 INFO:tasks.workunit.client.1.vm07.stdout:1/693: dread d9/df/f4a [0,4194304] 0 2026-03-10T12:38:14.981 INFO:tasks.workunit.client.0.vm00.stdout:7/676: mknod da/d25/d2e/cf0 0 2026-03-10T12:38:14.988 INFO:tasks.workunit.client.0.vm00.stdout:7/677: creat da/d47/ff1 x:0 0 0 2026-03-10T12:38:14.998 INFO:tasks.workunit.client.0.vm00.stdout:4/917: symlink df/d1f/d22/d26/l12e 0 2026-03-10T12:38:14.998 INFO:tasks.workunit.client.0.vm00.stdout:1/917: truncate da/d21/d27/fa0 2181822 0 2026-03-10T12:38:14.998 INFO:tasks.workunit.client.0.vm00.stdout:3/913: rmdir dd/d3d/d8a/de0/de4/dac 39 2026-03-10T12:38:15.008 INFO:tasks.workunit.client.0.vm00.stdout:3/914: mkdir dd/d3d/d8a/de0/d55/dfd/d125/d2b/d11c/d12d 0 2026-03-10T12:38:15.022 INFO:tasks.workunit.client.0.vm00.stdout:5/955: dread d1f/d26/d2b/d37/f4c [0,4194304] 0 2026-03-10T12:38:15.022 INFO:tasks.workunit.client.0.vm00.stdout:9/920: getdents d0/d3d/d59/d4e/dba/d1e/d2b 0 2026-03-10T12:38:15.022 INFO:tasks.workunit.client.0.vm00.stdout:1/918: dread da/d24/d5a/f7c [0,4194304] 0 2026-03-10T12:38:15.024 INFO:tasks.workunit.client.0.vm00.stdout:1/919: read da/d21/db3/d59/d120/ff8 [3939947,3914] 0 2026-03-10T12:38:15.025 INFO:tasks.workunit.client.0.vm00.stdout:5/956: link d1f/d26/d2b/f44 d1f/d26/d2e/d58/d10c/f150 0 2026-03-10T12:38:15.030 INFO:tasks.workunit.client.1.vm07.stdout:1/694: dread d9/df/f11 [0,4194304] 0 2026-03-10T12:38:15.032 INFO:tasks.workunit.client.1.vm07.stdout:1/695: dread - d9/df/d29/d2b/d31/fc6 zero size 2026-03-10T12:38:15.033 INFO:tasks.workunit.client.0.vm00.stdout:1/920: symlink da/d21/db3/d59/da6/da4/dda/dc0/dfe/l130 0 2026-03-10T12:38:15.043 INFO:tasks.workunit.client.0.vm00.stdout:3/915: mknod dd/d3d/d115/c12e 0 2026-03-10T12:38:15.043 INFO:tasks.workunit.client.0.vm00.stdout:3/916: chown dd/d27/d2c/lb7 175241 1 2026-03-10T12:38:15.044 
INFO:tasks.workunit.client.0.vm00.stdout:1/921: dwrite da/d21/db3/d59/da6/d8b/d98/f12f [0,4194304] 0 2026-03-10T12:38:15.050 INFO:tasks.workunit.client.0.vm00.stdout:8/839: read d0/dd/d38/d81/df3/f70 [2112446,21469] 0 2026-03-10T12:38:15.055 INFO:tasks.workunit.client.0.vm00.stdout:5/957: read d1f/d26/d2e/f10b [426368,114112] 0 2026-03-10T12:38:15.077 INFO:tasks.workunit.client.0.vm00.stdout:3/917: rename dd/d64/d93/l111 to dd/d2a/da2/db4/l12f 0 2026-03-10T12:38:15.078 INFO:tasks.workunit.client.0.vm00.stdout:8/840: rmdir d0/d93/d2d 39 2026-03-10T12:38:15.083 INFO:tasks.workunit.client.0.vm00.stdout:2/913: sync 2026-03-10T12:38:15.083 INFO:tasks.workunit.client.0.vm00.stdout:4/918: sync 2026-03-10T12:38:15.083 INFO:tasks.workunit.client.1.vm07.stdout:1/696: sync 2026-03-10T12:38:15.083 INFO:tasks.workunit.client.0.vm00.stdout:4/919: readlink df/d1f/d36/d3a/d41/lc9 0 2026-03-10T12:38:15.087 INFO:tasks.workunit.client.1.vm07.stdout:1/697: creat d9/df/d29/d6b/fe6 x:0 0 0 2026-03-10T12:38:15.088 INFO:tasks.workunit.client.0.vm00.stdout:2/914: dread d4/d53/d76/d9b/dad/d8e/f103 [0,4194304] 0 2026-03-10T12:38:15.088 INFO:tasks.workunit.client.0.vm00.stdout:2/915: stat d4/d53/d76/d9b/c117 0 2026-03-10T12:38:15.092 INFO:tasks.workunit.client.0.vm00.stdout:5/958: creat d1f/d6a/d94/dc3/de7/f151 x:0 0 0 2026-03-10T12:38:15.094 INFO:tasks.workunit.client.0.vm00.stdout:1/922: rename da/d24/f45 to da/d21/d27/d118/f131 0 2026-03-10T12:38:15.101 INFO:tasks.workunit.client.0.vm00.stdout:8/841: stat d0/d93/d2d/d49/f102 0 2026-03-10T12:38:15.102 INFO:tasks.workunit.client.0.vm00.stdout:8/842: chown d0/d93/d17/l3b 148814163 1 2026-03-10T12:38:15.126 INFO:tasks.workunit.client.1.vm07.stdout:2/619: truncate d0/d29/d64/d74/f8e 1030011 0 2026-03-10T12:38:15.126 INFO:tasks.workunit.client.1.vm07.stdout:2/620: write d0/f1d [3107294,98506] 0 2026-03-10T12:38:15.131 INFO:tasks.workunit.client.1.vm07.stdout:8/678: dwrite d1/d3/d6c/fc5 [0,4194304] 0 2026-03-10T12:38:15.132 
INFO:tasks.workunit.client.1.vm07.stdout:9/770: dwrite d5/d13/d6c/fdf [0,4194304] 0 2026-03-10T12:38:15.144 INFO:tasks.workunit.client.0.vm00.stdout:5/959: rename d1f/d26/d2b/cd5 to d1f/d39/c152 0 2026-03-10T12:38:15.149 INFO:tasks.workunit.client.0.vm00.stdout:6/610: write d2/d16/d29/d31/d88/d92/fba [570779,81845] 0 2026-03-10T12:38:15.159 INFO:tasks.workunit.client.1.vm07.stdout:9/771: dwrite d5/d16/d18/f1e [0,4194304] 0 2026-03-10T12:38:15.164 INFO:tasks.workunit.client.1.vm07.stdout:4/802: dwrite d0/d4/df2/df6/d46/d76/fa0 [4194304,4194304] 0 2026-03-10T12:38:15.167 INFO:tasks.workunit.client.0.vm00.stdout:6/611: creat d2/d14/d7a/fe1 x:0 0 0 2026-03-10T12:38:15.167 INFO:tasks.workunit.client.0.vm00.stdout:6/612: chown d2/d14/dc0 19 1 2026-03-10T12:38:15.170 INFO:tasks.workunit.client.1.vm07.stdout:8/679: creat d1/d3/d6c/dde/fe0 x:0 0 0 2026-03-10T12:38:15.171 INFO:tasks.workunit.client.1.vm07.stdout:9/772: sync 2026-03-10T12:38:15.173 INFO:tasks.workunit.client.0.vm00.stdout:8/843: mknod d0/dd/d38/d81/df3/d9b/c107 0 2026-03-10T12:38:15.174 INFO:tasks.workunit.client.0.vm00.stdout:6/613: unlink d2/d16/d29/d31/d34/la1 0 2026-03-10T12:38:15.183 INFO:tasks.workunit.client.0.vm00.stdout:6/614: getdents d2/d42/dae 0 2026-03-10T12:38:15.186 INFO:tasks.workunit.client.0.vm00.stdout:6/615: read d2/da/dc/d2f/fb4 [975144,938] 0 2026-03-10T12:38:15.187 INFO:tasks.workunit.client.0.vm00.stdout:6/616: write d2/d16/f41 [103700,31772] 0 2026-03-10T12:38:15.191 INFO:tasks.workunit.client.0.vm00.stdout:9/921: dwrite d0/d3d/d59/d4e/dba/d19/f95 [0,4194304] 0 2026-03-10T12:38:15.193 INFO:tasks.workunit.client.0.vm00.stdout:8/844: sync 2026-03-10T12:38:15.195 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:15 vm00.local ceph-mon[50686]: Deploying cephadm binary to vm07 2026-03-10T12:38:15.205 INFO:tasks.workunit.client.1.vm07.stdout:5/733: write d0/d22/d18/d19/d2e/d67/fa0 [5445280,123718] 0 2026-03-10T12:38:15.207 INFO:tasks.workunit.client.1.vm07.stdout:5/734: symlink 
d0/d22/l102 0 2026-03-10T12:38:15.231 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:15.230+0000 7f9dd116c700 1 -- 192.168.123.100:0/1075449107 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f9db8005190 con 0x7f9dcc082a00 2026-03-10T12:38:15.231 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:15.230+0000 7f9dc97fa700 1 -- 192.168.123.100:0/1075449107 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f9dc4018390 con 0x7f9dcc082a00 2026-03-10T12:38:15.231 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_OK 2026-03-10T12:38:15.235 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:15.234+0000 7f9db2ffd700 1 -- 192.168.123.100:0/1075449107 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f9db403dea0 msgr2=0x7f9db4040350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:15.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:15.234+0000 7f9db2ffd700 1 --2- 192.168.123.100:0/1075449107 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f9db403dea0 0x7f9db4040350 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f9dbc00bc70 tx=0x7f9dbc00d330 comp rx=0 tx=0).stop 2026-03-10T12:38:15.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:15.234+0000 7f9db2ffd700 1 -- 192.168.123.100:0/1075449107 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9dcc082a00 msgr2=0x7f9dcc082e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:15.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:15.234+0000 7f9db2ffd700 1 --2- 192.168.123.100:0/1075449107 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9dcc082a00 0x7f9dcc082e70 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f9dc400ea00 tx=0x7f9dc400edc0 
comp rx=0 tx=0).stop 2026-03-10T12:38:15.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:15.234+0000 7f9db2ffd700 1 -- 192.168.123.100:0/1075449107 shutdown_connections 2026-03-10T12:38:15.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:15.234+0000 7f9db2ffd700 1 --2- 192.168.123.100:0/1075449107 >> [v2:192.168.123.107:6828/3729807627,v1:192.168.123.107:6829/3729807627] conn(0x7f9db403dea0 0x7f9db4040350 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:15.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:15.234+0000 7f9db2ffd700 1 --2- 192.168.123.100:0/1075449107 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9dcc072330 0x7f9dcc0824c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:15.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:15.234+0000 7f9db2ffd700 1 --2- 192.168.123.100:0/1075449107 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9dcc082a00 0x7f9dcc082e70 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:15.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:15.234+0000 7f9db2ffd700 1 -- 192.168.123.100:0/1075449107 >> 192.168.123.100:0/1075449107 conn(0x7f9dcc06d1a0 msgr2=0x7f9dcc0705f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:15.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:15.234+0000 7f9db2ffd700 1 -- 192.168.123.100:0/1075449107 shutdown_connections 2026-03-10T12:38:15.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:15.234+0000 7f9db2ffd700 1 -- 192.168.123.100:0/1075449107 wait complete. 
2026-03-10T12:38:15.243 INFO:tasks.workunit.client.1.vm07.stdout:8/680: fsync d1/d3/d11/f77 0 2026-03-10T12:38:15.247 INFO:tasks.workunit.client.1.vm07.stdout:8/681: fsync d1/d3/d6c/f9b 0 2026-03-10T12:38:15.249 INFO:tasks.workunit.client.1.vm07.stdout:9/773: dread d5/d69/d93/d97/fa2 [0,4194304] 0 2026-03-10T12:38:15.249 INFO:tasks.workunit.client.1.vm07.stdout:9/774: chown d5/d13/d9d/df2 226555638 1 2026-03-10T12:38:15.251 INFO:tasks.workunit.client.1.vm07.stdout:6/667: rename d1/d4/d6/d53 to d1/dd7 0 2026-03-10T12:38:15.252 INFO:tasks.workunit.client.0.vm00.stdout:4/920: dwrite df/d1f/d36/faa [0,4194304] 0 2026-03-10T12:38:15.256 INFO:tasks.workunit.client.0.vm00.stdout:9/922: unlink d0/d3d/d59/d4e/dba/d1e/d2b/f47 0 2026-03-10T12:38:15.258 INFO:tasks.workunit.client.1.vm07.stdout:3/709: write dc/dd/f22 [1024952,59441] 0 2026-03-10T12:38:15.258 INFO:tasks.workunit.client.1.vm07.stdout:3/710: stat dc/dd/fbc 0 2026-03-10T12:38:15.261 INFO:tasks.workunit.client.1.vm07.stdout:7/660: write d0/d52/fa4 [184832,8009] 0 2026-03-10T12:38:15.261 INFO:tasks.workunit.client.1.vm07.stdout:7/661: fdatasync d0/d47/da0/fda 0 2026-03-10T12:38:15.262 INFO:tasks.workunit.client.1.vm07.stdout:7/662: stat d0/d61/db4/fad 0 2026-03-10T12:38:15.263 INFO:tasks.workunit.client.1.vm07.stdout:0/800: dwrite d0/f21 [0,4194304] 0 2026-03-10T12:38:15.264 INFO:tasks.workunit.client.0.vm00.stdout:1/923: creat da/d21/db3/d59/d120/d72/f132 x:0 0 0 2026-03-10T12:38:15.273 INFO:tasks.workunit.client.0.vm00.stdout:4/921: dread - df/d1f/d22/d26/d70/fbd zero size 2026-03-10T12:38:15.276 INFO:tasks.workunit.client.0.vm00.stdout:7/678: rmdir da/d47/d87 39 2026-03-10T12:38:15.277 INFO:tasks.workunit.client.0.vm00.stdout:4/922: read df/f20 [3345933,31683] 0 2026-03-10T12:38:15.280 INFO:tasks.workunit.client.0.vm00.stdout:5/960: link d1f/d26/d2b/fd0 d1f/d26/d2b/d35/d78/d99/daf/f153 0 2026-03-10T12:38:15.284 INFO:tasks.workunit.client.0.vm00.stdout:5/961: dwrite d1f/d26/d2b/d37/db2/f142 [0,4194304] 0 
2026-03-10T12:38:15.292 INFO:tasks.workunit.client.1.vm07.stdout:9/775: read d5/d13/d22/f9e [12278,101575] 0 2026-03-10T12:38:15.293 INFO:tasks.workunit.client.0.vm00.stdout:3/918: dwrite dd/d2a/da2/de1/d101/f123 [0,4194304] 0 2026-03-10T12:38:15.293 INFO:tasks.workunit.client.1.vm07.stdout:9/776: chown d5/d1f/fb9 2892974 1 2026-03-10T12:38:15.294 INFO:tasks.workunit.client.0.vm00.stdout:2/916: write d4/f67 [1456815,79507] 0 2026-03-10T12:38:15.297 INFO:tasks.workunit.client.1.vm07.stdout:6/668: fsync d1/d4/d6/f13 0 2026-03-10T12:38:15.299 INFO:tasks.workunit.client.0.vm00.stdout:8/845: creat d0/d5c/f108 x:0 0 0 2026-03-10T12:38:15.300 INFO:tasks.workunit.client.0.vm00.stdout:8/846: dread - d0/d46/d89/f91 zero size 2026-03-10T12:38:15.305 INFO:tasks.workunit.client.1.vm07.stdout:3/711: dread dc/dd/d43/d5c/f9d [0,4194304] 0 2026-03-10T12:38:15.309 INFO:tasks.workunit.client.0.vm00.stdout:5/962: dread d1f/d26/d2e/f8c [0,4194304] 0 2026-03-10T12:38:15.309 INFO:tasks.workunit.client.1.vm07.stdout:0/801: unlink d0/d14/d5f/d76/d2f/d31/d4f/d9d/fe5 0 2026-03-10T12:38:15.312 INFO:tasks.workunit.client.1.vm07.stdout:8/682: truncate d1/d3/d6c/fda 279715 0 2026-03-10T12:38:15.312 INFO:tasks.workunit.client.0.vm00.stdout:5/963: dwrite d1f/d26/d2e/d58/d10c/d123/f102 [0,4194304] 0 2026-03-10T12:38:15.314 INFO:tasks.workunit.client.0.vm00.stdout:3/919: symlink dd/d27/d2c/def/d118/l130 0 2026-03-10T12:38:15.318 INFO:tasks.workunit.client.1.vm07.stdout:9/777: creat d5/d16/d23/d26/d68/f105 x:0 0 0 2026-03-10T12:38:15.319 INFO:tasks.workunit.client.1.vm07.stdout:9/778: chown d5/d13/d9d 5 1 2026-03-10T12:38:15.319 INFO:tasks.workunit.client.0.vm00.stdout:2/917: symlink d4/d53/d76/dba/l123 0 2026-03-10T12:38:15.321 INFO:tasks.workunit.client.0.vm00.stdout:8/847: creat d0/d93/d36/d51/f109 x:0 0 0 2026-03-10T12:38:15.322 INFO:tasks.workunit.client.1.vm07.stdout:4/803: rename d0/d4/d10/d5f/l69 to d0/d4/d10/d3c/d2b/d2d/d9c/l11a 0 2026-03-10T12:38:15.324 
INFO:tasks.workunit.client.0.vm00.stdout:0/776: write d3/d22/f71 [1554755,122056] 0 2026-03-10T12:38:15.324 INFO:tasks.workunit.client.0.vm00.stdout:8/848: dwrite d0/dd/d38/d81/f88 [0,4194304] 0 2026-03-10T12:38:15.327 INFO:tasks.workunit.client.0.vm00.stdout:4/923: truncate df/f126 849114 0 2026-03-10T12:38:15.329 INFO:tasks.workunit.client.0.vm00.stdout:5/964: dread - d1f/d39/f10d zero size 2026-03-10T12:38:15.338 INFO:tasks.workunit.client.0.vm00.stdout:0/777: dread d3/d33/f4d [0,4194304] 0 2026-03-10T12:38:15.340 INFO:tasks.workunit.client.0.vm00.stdout:9/923: getdents d0/d3d/d59/d4e/dba/d1e/d2b 0 2026-03-10T12:38:15.341 INFO:tasks.workunit.client.0.vm00.stdout:8/849: dread d0/d93/d17/d48/fc7 [0,4194304] 0 2026-03-10T12:38:15.346 INFO:tasks.workunit.client.1.vm07.stdout:6/669: fdatasync d1/d4/d6/f41 0 2026-03-10T12:38:15.351 INFO:tasks.workunit.client.0.vm00.stdout:2/918: rename d4/d53/d9e/f111 to d4/d6/de7/f124 0 2026-03-10T12:38:15.356 INFO:tasks.workunit.client.0.vm00.stdout:9/924: rmdir d0/d7f 39 2026-03-10T12:38:15.366 INFO:tasks.workunit.client.0.vm00.stdout:8/850: creat d0/d46/d7e/f10a x:0 0 0 2026-03-10T12:38:15.366 INFO:tasks.workunit.client.0.vm00.stdout:8/851: readlink d0/d93/d17/d48/lab 0 2026-03-10T12:38:15.367 INFO:tasks.workunit.client.0.vm00.stdout:8/852: chown d0/d93/d17/fb2 29276423 1 2026-03-10T12:38:15.367 INFO:tasks.workunit.client.1.vm07.stdout:0/802: symlink d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/l10e 0 2026-03-10T12:38:15.367 INFO:tasks.workunit.client.0.vm00.stdout:8/853: stat d0/d93/d36/d5b/l99 0 2026-03-10T12:38:15.370 INFO:tasks.workunit.client.0.vm00.stdout:1/924: write da/d24/d28/d67/da2/f9c [166567,101496] 0 2026-03-10T12:38:15.373 INFO:tasks.workunit.client.1.vm07.stdout:1/698: dwrite d9/df/d29/d2b/d31/d91/faf [0,4194304] 0 2026-03-10T12:38:15.377 INFO:tasks.workunit.client.1.vm07.stdout:1/699: chown d9/df/d29/d2b/d31/d91/d59/fa4 15457 1 2026-03-10T12:38:15.377 INFO:tasks.workunit.client.1.vm07.stdout:1/700: dread - 
d9/df/d29/d2b/d31/fd8 zero size 2026-03-10T12:38:15.379 INFO:tasks.workunit.client.1.vm07.stdout:9/779: mknod d5/d13/d22/c106 0 2026-03-10T12:38:15.379 INFO:tasks.workunit.client.1.vm07.stdout:2/621: write d0/d42/f5f [953463,104390] 0 2026-03-10T12:38:15.381 INFO:tasks.workunit.client.0.vm00.stdout:4/924: link df/d8a/f12d df/d93/d9e/f12f 0 2026-03-10T12:38:15.383 INFO:tasks.workunit.client.1.vm07.stdout:2/622: dwrite d0/d29/d64/fd2 [0,4194304] 0 2026-03-10T12:38:15.387 INFO:tasks.workunit.client.1.vm07.stdout:5/735: rename d0/d22/d18/d19/d21/f37 to d0/d22/d18/d19/d2e/da9/f103 0 2026-03-10T12:38:15.387 INFO:tasks.workunit.client.0.vm00.stdout:3/920: dread dd/d3d/d65/fad [0,4194304] 0 2026-03-10T12:38:15.388 INFO:tasks.workunit.client.0.vm00.stdout:5/965: mknod d1f/d26/d6f/d14f/c154 0 2026-03-10T12:38:15.389 INFO:tasks.workunit.client.1.vm07.stdout:6/670: mkdir d1/dd7/da3/dd8 0 2026-03-10T12:38:15.391 INFO:tasks.workunit.client.1.vm07.stdout:7/663: symlink d0/d57/d62/d90/da1/le0 0 2026-03-10T12:38:15.394 INFO:tasks.workunit.client.1.vm07.stdout:0/803: creat d0/d14/d7c/f10f x:0 0 0 2026-03-10T12:38:15.395 INFO:tasks.workunit.client.1.vm07.stdout:0/804: write d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/fef [403258,111502] 0 2026-03-10T12:38:15.401 INFO:tasks.workunit.client.1.vm07.stdout:1/701: unlink d9/d2d/d80/fc0 0 2026-03-10T12:38:15.405 INFO:tasks.workunit.client.0.vm00.stdout:9/925: symlink d0/d3d/d59/d4e/dba/d1e/d85/de5/l148 0 2026-03-10T12:38:15.405 INFO:tasks.workunit.client.1.vm07.stdout:8/683: symlink d1/d3/le1 0 2026-03-10T12:38:15.410 INFO:tasks.workunit.client.0.vm00.stdout:6/617: dwrite d2/d16/f1e [0,4194304] 0 2026-03-10T12:38:15.411 INFO:tasks.workunit.client.0.vm00.stdout:9/926: dwrite d0/f21 [0,4194304] 0 2026-03-10T12:38:15.413 INFO:tasks.workunit.client.0.vm00.stdout:7/679: dwrite da/d47/fb7 [0,4194304] 0 2026-03-10T12:38:15.413 INFO:tasks.workunit.client.0.vm00.stdout:7/680: chown da/d26/d50/d73/lbe 100465 1 2026-03-10T12:38:15.418 
INFO:tasks.workunit.client.1.vm07.stdout:0/805: fsync d0/d14/d5f/d76/d2f/d31/d79/d85/fcf 0 2026-03-10T12:38:15.421 INFO:tasks.workunit.client.0.vm00.stdout:8/854: fsync d0/d93/f23 0 2026-03-10T12:38:15.421 INFO:tasks.workunit.client.0.vm00.stdout:8/855: fdatasync d0/f10 0 2026-03-10T12:38:15.422 INFO:tasks.workunit.client.1.vm07.stdout:5/736: dread d0/d22/d18/d19/d21/d3a/fa2 [0,4194304] 0 2026-03-10T12:38:15.422 INFO:tasks.workunit.client.1.vm07.stdout:5/737: stat d0/d22/d18/d19/d2e/f52 0 2026-03-10T12:38:15.428 INFO:tasks.workunit.client.0.vm00.stdout:4/925: write df/d1f/d22/d26/d65/da7/d10e/f113 [5049788,130091] 0 2026-03-10T12:38:15.429 INFO:tasks.workunit.client.0.vm00.stdout:4/926: write df/d6c/f124 [1956775,23356] 0 2026-03-10T12:38:15.436 INFO:tasks.workunit.client.0.vm00.stdout:5/966: symlink d1f/d26/d2b/d35/d78/d99/l155 0 2026-03-10T12:38:15.450 INFO:tasks.workunit.client.0.vm00.stdout:0/778: write d3/d7/d4c/f76 [2716020,115623] 0 2026-03-10T12:38:15.454 INFO:tasks.workunit.client.1.vm07.stdout:8/684: truncate d1/d3/d40/f5b 1457314 0 2026-03-10T12:38:15.455 INFO:tasks.workunit.client.0.vm00.stdout:9/927: creat d0/d3d/d125/f149 x:0 0 0 2026-03-10T12:38:15.471 INFO:tasks.workunit.client.1.vm07.stdout:1/702: truncate d9/d2d/de2/fbf 963521 0 2026-03-10T12:38:15.476 INFO:tasks.workunit.client.0.vm00.stdout:4/927: dread - df/d93/fee zero size 2026-03-10T12:38:15.479 INFO:tasks.workunit.client.1.vm07.stdout:2/623: link d0/d42/d1f/d20/cbd d0/d29/d64/d74/d88/cd4 0 2026-03-10T12:38:15.480 INFO:tasks.workunit.client.1.vm07.stdout:2/624: read d0/d29/d64/d74/d88/f58 [1921262,108533] 0 2026-03-10T12:38:15.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:15 vm00.local ceph-mon[50686]: Deploying cephadm binary to vm00 2026-03-10T12:38:15.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:15 vm00.local ceph-mon[50686]: pgmap v4: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-10T12:38:15.484 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:15 vm00.local ceph-mon[50686]: from='client.? 192.168.123.100:0/4055734810' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:38:15.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:15 vm00.local ceph-mon[50686]: mgrmap e23: vm07.kfawlb(active, since 2s) 2026-03-10T12:38:15.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:15 vm00.local ceph-mon[50686]: from='client.? 192.168.123.100:0/4276140805' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:38:15.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:15 vm00.local ceph-mon[50686]: from='client.24501 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:38:15.487 INFO:tasks.workunit.client.1.vm07.stdout:2/625: dread d0/d42/d26/f2e [0,4194304] 0 2026-03-10T12:38:15.501 INFO:tasks.workunit.client.1.vm07.stdout:6/671: mkdir d1/d4/d6/d46/d4d/dc7/dd9 0 2026-03-10T12:38:15.501 INFO:tasks.workunit.client.1.vm07.stdout:2/626: dwrite d0/d29/d64/f67 [0,4194304] 0 2026-03-10T12:38:15.502 INFO:tasks.workunit.client.1.vm07.stdout:9/780: getdents d5/d13/d9d 0 2026-03-10T12:38:15.505 INFO:tasks.workunit.client.1.vm07.stdout:5/738: link d0/d22/d18/d19/d21/d54/f9b d0/dbf/f104 0 2026-03-10T12:38:15.514 INFO:tasks.workunit.client.1.vm07.stdout:1/703: sync 2026-03-10T12:38:15.519 INFO:tasks.workunit.client.1.vm07.stdout:3/712: write dc/dd/d28/d3b/f9f [257113,37070] 0 2026-03-10T12:38:15.520 INFO:tasks.workunit.client.1.vm07.stdout:3/713: readlink dc/d18/d24/l66 0 2026-03-10T12:38:15.524 INFO:tasks.workunit.client.0.vm00.stdout:2/919: write d4/d6/d2d/d3a/d43/d85/fa3 [4332456,23412] 0 2026-03-10T12:38:15.525 INFO:tasks.workunit.client.1.vm07.stdout:3/714: creat dc/d18/de2/ff4 x:0 0 0 2026-03-10T12:38:15.525 INFO:tasks.workunit.client.0.vm00.stdout:3/921: write dd/d3d/d8a/ffb [986374,40682] 0 2026-03-10T12:38:15.526 
INFO:tasks.workunit.client.1.vm07.stdout:4/804: write d0/d8e/fc4 [600101,18749] 0 2026-03-10T12:38:15.526 INFO:tasks.workunit.client.0.vm00.stdout:1/925: dwrite da/d21/db3/d59/ff0 [0,4194304] 0 2026-03-10T12:38:15.527 INFO:tasks.workunit.client.1.vm07.stdout:3/715: symlink dc/dd/d1f/dac/lf5 0 2026-03-10T12:38:15.528 INFO:tasks.workunit.client.1.vm07.stdout:3/716: fdatasync dc/dd/d1f/f27 0 2026-03-10T12:38:15.531 INFO:tasks.workunit.client.1.vm07.stdout:3/717: dread dc/dd/d28/d3b/fc1 [4194304,4194304] 0 2026-03-10T12:38:15.533 INFO:tasks.workunit.client.1.vm07.stdout:3/718: truncate dc/dd/f85 2844299 0 2026-03-10T12:38:15.536 INFO:tasks.workunit.client.0.vm00.stdout:8/856: getdents d0/dd/dfe 0 2026-03-10T12:38:15.540 INFO:tasks.workunit.client.0.vm00.stdout:4/928: mknod df/d93/dbc/c130 0 2026-03-10T12:38:15.541 INFO:tasks.workunit.client.1.vm07.stdout:7/664: write d0/d52/f97 [721183,34197] 0 2026-03-10T12:38:15.541 INFO:tasks.workunit.client.0.vm00.stdout:6/618: write d2/da/dc/d2f/f3a [1768425,105490] 0 2026-03-10T12:38:15.542 INFO:tasks.workunit.client.0.vm00.stdout:7/681: write da/f10 [6641079,59551] 0 2026-03-10T12:38:15.543 INFO:tasks.workunit.client.1.vm07.stdout:1/704: sync 2026-03-10T12:38:15.543 INFO:tasks.workunit.client.1.vm07.stdout:4/805: sync 2026-03-10T12:38:15.548 INFO:tasks.workunit.client.1.vm07.stdout:4/806: creat d0/d4/d5/da/f11b x:0 0 0 2026-03-10T12:38:15.550 INFO:tasks.workunit.client.0.vm00.stdout:6/619: creat d2/d42/d9c/fe2 x:0 0 0 2026-03-10T12:38:15.551 INFO:tasks.workunit.client.0.vm00.stdout:5/967: mkdir d1f/d26/d2b/d35/d78/d99/daf/d13d/d156 0 2026-03-10T12:38:15.552 INFO:tasks.workunit.client.0.vm00.stdout:6/620: chown d2/d42/d80/lbc 91048 1 2026-03-10T12:38:15.553 INFO:tasks.workunit.client.1.vm07.stdout:1/705: creat d9/df/fe7 x:0 0 0 2026-03-10T12:38:15.554 INFO:tasks.workunit.client.0.vm00.stdout:6/621: mknod d2/d14/ce3 0 2026-03-10T12:38:15.556 INFO:tasks.workunit.client.1.vm07.stdout:4/807: symlink d0/d4/d10/l11c 0 
2026-03-10T12:38:15.557 INFO:tasks.workunit.client.1.vm07.stdout:4/808: chown d0/d4/d10/c118 16669818 1 2026-03-10T12:38:15.559 INFO:tasks.workunit.client.1.vm07.stdout:7/665: link d0/d61/l6b d0/d57/d62/le1 0 2026-03-10T12:38:15.559 INFO:tasks.workunit.client.1.vm07.stdout:7/666: chown d0/d52/lc9 1935 1 2026-03-10T12:38:15.559 INFO:tasks.workunit.client.1.vm07.stdout:1/706: dread - d9/df/d29/d2b/d31/f86 zero size 2026-03-10T12:38:15.562 INFO:tasks.workunit.client.0.vm00.stdout:3/922: symlink dd/d3d/d8a/de0/d55/l131 0 2026-03-10T12:38:15.563 INFO:tasks.workunit.client.0.vm00.stdout:3/923: chown dd/d18/d13/d99/da5/fcc 2637 1 2026-03-10T12:38:15.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:15 vm07.local ceph-mon[58582]: Deploying cephadm binary to vm07 2026-03-10T12:38:15.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:15 vm07.local ceph-mon[58582]: Deploying cephadm binary to vm00 2026-03-10T12:38:15.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:15 vm07.local ceph-mon[58582]: pgmap v4: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-10T12:38:15.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:15 vm07.local ceph-mon[58582]: from='client.? 192.168.123.100:0/4055734810' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:38:15.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:15 vm07.local ceph-mon[58582]: mgrmap e23: vm07.kfawlb(active, since 2s) 2026-03-10T12:38:15.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:15 vm07.local ceph-mon[58582]: from='client.? 
192.168.123.100:0/4276140805' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:38:15.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:15 vm07.local ceph-mon[58582]: from='client.24501 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:38:15.573 INFO:tasks.workunit.client.1.vm07.stdout:0/806: dwrite d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/fa4 [0,4194304] 0 2026-03-10T12:38:15.574 INFO:tasks.workunit.client.0.vm00.stdout:0/779: dwrite d3/db/d77/d82/fc2 [0,4194304] 0 2026-03-10T12:38:15.591 INFO:tasks.workunit.client.1.vm07.stdout:7/667: fdatasync d0/f3 0 2026-03-10T12:38:15.599 INFO:tasks.workunit.client.0.vm00.stdout:9/928: dwrite d0/d3d/d59/d74/f102 [0,4194304] 0 2026-03-10T12:38:15.605 INFO:tasks.workunit.client.0.vm00.stdout:8/857: symlink d0/dd/d38/l10b 0 2026-03-10T12:38:15.606 INFO:tasks.workunit.client.1.vm07.stdout:2/627: write d0/d42/d1f/d20/f3f [2270097,127046] 0 2026-03-10T12:38:15.607 INFO:tasks.workunit.client.1.vm07.stdout:6/672: dwrite d1/d4/f82 [0,4194304] 0 2026-03-10T12:38:15.607 INFO:tasks.workunit.client.0.vm00.stdout:5/968: symlink d1f/d26/d2b/d37/da4/l157 0 2026-03-10T12:38:15.611 INFO:tasks.workunit.client.0.vm00.stdout:5/969: dwrite d1f/d6a/f14c [0,4194304] 0 2026-03-10T12:38:15.613 INFO:tasks.workunit.client.1.vm07.stdout:8/685: dwrite d1/d3/d5d/f5f [0,4194304] 0 2026-03-10T12:38:15.614 INFO:tasks.workunit.client.0.vm00.stdout:6/622: readlink d2/d51/d70/lb7 0 2026-03-10T12:38:15.614 INFO:tasks.workunit.client.1.vm07.stdout:5/739: dwrite d0/d22/d18/d19/d2e/f59 [0,4194304] 0 2026-03-10T12:38:15.616 INFO:tasks.workunit.client.1.vm07.stdout:8/686: dread - d1/d3/d40/f5a zero size 2026-03-10T12:38:15.616 INFO:tasks.workunit.client.1.vm07.stdout:9/781: dwrite d5/d13/d57/d4f/d6a/f8a [0,4194304] 0 2026-03-10T12:38:15.621 INFO:tasks.workunit.client.1.vm07.stdout:1/707: dread d9/f6d [0,4194304] 0 2026-03-10T12:38:15.626 
INFO:tasks.workunit.client.0.vm00.stdout:8/858: truncate d0/d93/d2d/f75 1896808 0 2026-03-10T12:38:15.630 INFO:tasks.workunit.client.0.vm00.stdout:0/780: dread d3/d40/f7a [0,4194304] 0 2026-03-10T12:38:15.633 INFO:tasks.workunit.client.0.vm00.stdout:4/929: link df/d1f/d36/d3a/d41/f33 df/d1f/d22/dcb/f131 0 2026-03-10T12:38:15.634 INFO:tasks.workunit.client.1.vm07.stdout:3/719: write dc/dd/f19 [2360334,119435] 0 2026-03-10T12:38:15.639 INFO:tasks.workunit.client.0.vm00.stdout:6/623: mknod d2/da/ce4 0 2026-03-10T12:38:15.640 INFO:tasks.workunit.client.0.vm00.stdout:7/682: write da/d26/d37/f79 [1177552,48583] 0 2026-03-10T12:38:15.645 INFO:tasks.workunit.client.0.vm00.stdout:6/624: symlink d2/d42/d80/d89/le5 0 2026-03-10T12:38:15.649 INFO:tasks.workunit.client.0.vm00.stdout:6/625: write d2/d14/d7a/db9/f4a [4035714,37040] 0 2026-03-10T12:38:15.656 INFO:tasks.workunit.client.0.vm00.stdout:6/626: mkdir d2/d42/dae/de6 0 2026-03-10T12:38:15.660 INFO:tasks.workunit.client.0.vm00.stdout:5/970: rename d1f/d26/d2b/d35/d78/d99/dcd/cdd to d1f/d26/de3/c158 0 2026-03-10T12:38:15.662 INFO:tasks.workunit.client.0.vm00.stdout:6/627: mknod d2/d16/d29/d31/d88/d92/ce7 0 2026-03-10T12:38:15.663 INFO:tasks.workunit.client.0.vm00.stdout:6/628: readlink d2/d42/d80/d89/le5 0 2026-03-10T12:38:15.671 INFO:tasks.workunit.client.1.vm07.stdout:1/708: symlink d9/df/d29/d2b/d31/d91/le8 0 2026-03-10T12:38:15.694 INFO:tasks.workunit.client.1.vm07.stdout:3/720: mkdir dc/d18/de2/df6 0 2026-03-10T12:38:15.694 INFO:tasks.workunit.client.1.vm07.stdout:7/668: mkdir d0/d47/da0/dd4/de2 0 2026-03-10T12:38:15.694 INFO:tasks.workunit.client.0.vm00.stdout:2/920: creat d4/d10f/f125 x:0 0 0 2026-03-10T12:38:15.694 INFO:tasks.workunit.client.0.vm00.stdout:2/921: chown d4/d6/d2d/d3a/d43/le0 122604015 1 2026-03-10T12:38:15.694 INFO:tasks.workunit.client.0.vm00.stdout:4/930: rmdir df/d63/d94 39 2026-03-10T12:38:15.694 INFO:tasks.workunit.client.0.vm00.stdout:5/971: creat d1f/d26/d2e/d58/d10c/f159 x:0 0 0 
2026-03-10T12:38:15.694 INFO:tasks.workunit.client.0.vm00.stdout:2/922: unlink d4/c5 0 2026-03-10T12:38:15.694 INFO:tasks.workunit.client.0.vm00.stdout:9/929: creat d0/d7f/db8/f14a x:0 0 0 2026-03-10T12:38:15.694 INFO:tasks.workunit.client.0.vm00.stdout:6/629: creat d2/d16/d29/d31/d88/dd5/fe8 x:0 0 0 2026-03-10T12:38:15.694 INFO:tasks.workunit.client.0.vm00.stdout:4/931: creat df/f132 x:0 0 0 2026-03-10T12:38:15.694 INFO:tasks.workunit.client.0.vm00.stdout:4/932: chown df/c7f 13421 1 2026-03-10T12:38:15.694 INFO:tasks.workunit.client.0.vm00.stdout:2/923: creat d4/d53/d9e/d10a/f126 x:0 0 0 2026-03-10T12:38:15.694 INFO:tasks.workunit.client.0.vm00.stdout:0/781: rename d3/d22/d3a/deb/ff0 to d3/d7/d4c/d5b/ffe 0 2026-03-10T12:38:15.694 INFO:tasks.workunit.client.0.vm00.stdout:2/924: unlink d4/d53/d76/f8b 0 2026-03-10T12:38:15.695 INFO:tasks.workunit.client.0.vm00.stdout:9/930: dread d0/d3d/d59/d4e/dba/fb9 [0,4194304] 0 2026-03-10T12:38:15.696 INFO:tasks.workunit.client.0.vm00.stdout:4/933: creat df/d1f/d36/f133 x:0 0 0 2026-03-10T12:38:15.696 INFO:tasks.workunit.client.1.vm07.stdout:2/628: creat d0/d42/d4e/dab/fd5 x:0 0 0 2026-03-10T12:38:15.697 INFO:tasks.workunit.client.0.vm00.stdout:9/931: mknod d0/d3d/df2/c14b 0 2026-03-10T12:38:15.699 INFO:tasks.workunit.client.0.vm00.stdout:4/934: symlink df/d1f/d22/d26/dab/l134 0 2026-03-10T12:38:15.699 INFO:tasks.workunit.client.0.vm00.stdout:4/935: write df/d1f/d36/dc6/f11e [244174,23040] 0 2026-03-10T12:38:15.700 INFO:tasks.workunit.client.1.vm07.stdout:6/673: creat d1/dd7/d66/dd6/fda x:0 0 0 2026-03-10T12:38:15.700 INFO:tasks.workunit.client.1.vm07.stdout:6/674: chown d1/d4/d6/l32 84069 1 2026-03-10T12:38:15.701 INFO:tasks.workunit.client.0.vm00.stdout:4/936: creat df/d1f/d22/d26/dab/d73/dda/f135 x:0 0 0 2026-03-10T12:38:15.703 INFO:tasks.workunit.client.0.vm00.stdout:0/782: chown d3/d7/d3c/c94 12611 1 2026-03-10T12:38:15.710 INFO:tasks.workunit.client.1.vm07.stdout:1/709: fdatasync d9/d2d/d4f/f98 0 2026-03-10T12:38:15.710 
INFO:tasks.workunit.client.0.vm00.stdout:9/932: mkdir d0/d7f/db8/d14c 0 2026-03-10T12:38:15.710 INFO:tasks.workunit.client.0.vm00.stdout:9/933: dwrite d0/d3d/d59/d4e/dba/d1e/dcb/f141 [0,4194304] 0 2026-03-10T12:38:15.711 INFO:tasks.workunit.client.0.vm00.stdout:7/683: sync 2026-03-10T12:38:15.713 INFO:tasks.workunit.client.1.vm07.stdout:3/721: mknod dc/dd/d28/d3b/cf7 0 2026-03-10T12:38:15.730 INFO:tasks.workunit.client.0.vm00.stdout:4/937: dread df/d63/d77/f8d [0,4194304] 0 2026-03-10T12:38:15.753 INFO:tasks.workunit.client.0.vm00.stdout:7/684: rename da/d25/d2c/l31 to da/d3f/lf2 0 2026-03-10T12:38:15.753 INFO:tasks.workunit.client.1.vm07.stdout:3/722: rename dc/dd/d28/l2b to dc/dd/d28/dd0/lf8 0 2026-03-10T12:38:15.753 INFO:tasks.workunit.client.0.vm00.stdout:9/934: fdatasync d0/d7f/db8/dc4/d106/f145 0 2026-03-10T12:38:15.755 INFO:tasks.workunit.client.0.vm00.stdout:1/926: write da/d21/db3/d59/da6/f111 [548824,120341] 0 2026-03-10T12:38:15.756 INFO:tasks.workunit.client.0.vm00.stdout:1/927: stat da/d21/db3/d59/d120/dab/f12b 0 2026-03-10T12:38:15.759 INFO:tasks.workunit.client.1.vm07.stdout:4/809: write d0/d4/d5/fea [5058145,75268] 0 2026-03-10T12:38:15.759 INFO:tasks.workunit.client.1.vm07.stdout:1/710: symlink d9/d2d/d80/d8e/dc7/le9 0 2026-03-10T12:38:15.763 INFO:tasks.workunit.client.1.vm07.stdout:4/810: dwrite d0/d4/d5/da/fee [0,4194304] 0 2026-03-10T12:38:15.770 INFO:tasks.workunit.client.1.vm07.stdout:4/811: creat d0/d4/d10/d5f/d6d/f11d x:0 0 0 2026-03-10T12:38:15.771 INFO:tasks.workunit.client.0.vm00.stdout:7/685: creat da/d41/d7b/d9d/dc8/ff3 x:0 0 0 2026-03-10T12:38:15.775 INFO:tasks.workunit.client.1.vm07.stdout:3/723: symlink dc/dd/lf9 0 2026-03-10T12:38:15.775 INFO:tasks.workunit.client.1.vm07.stdout:1/711: mknod d9/df/d29/d2b/cea 0 2026-03-10T12:38:15.775 INFO:tasks.workunit.client.1.vm07.stdout:7/669: getdents d0/d47/da0 0 2026-03-10T12:38:15.775 INFO:tasks.workunit.client.0.vm00.stdout:3/924: dwrite dd/d64/fb2 [0,4194304] 0 2026-03-10T12:38:15.775 
INFO:tasks.workunit.client.0.vm00.stdout:6/630: dread d2/d16/f17 [0,4194304] 0 2026-03-10T12:38:15.782 INFO:tasks.workunit.client.0.vm00.stdout:1/928: creat da/d24/d5a/d71/dd4/f133 x:0 0 0 2026-03-10T12:38:15.782 INFO:tasks.workunit.client.1.vm07.stdout:1/712: dread d9/df/d29/d2b/d31/f7d [0,4194304] 0 2026-03-10T12:38:15.784 INFO:tasks.workunit.client.0.vm00.stdout:3/925: read - dd/d3d/d8a/de0/de4/dac/f122 zero size 2026-03-10T12:38:15.790 INFO:tasks.workunit.client.0.vm00.stdout:6/631: creat d2/d42/d80/d9d/fe9 x:0 0 0 2026-03-10T12:38:15.790 INFO:tasks.workunit.client.0.vm00.stdout:6/632: fsync d2/da/dc/d94/fc7 0 2026-03-10T12:38:15.794 INFO:tasks.workunit.client.1.vm07.stdout:1/713: mknod d9/df/d79/ceb 0 2026-03-10T12:38:15.795 INFO:tasks.workunit.client.0.vm00.stdout:3/926: truncate dd/d64/d93/ff7 644102 0 2026-03-10T12:38:15.797 INFO:tasks.workunit.client.0.vm00.stdout:3/927: write dd/d18/d13/d1d/dc6/d106/f9c [903708,22144] 0 2026-03-10T12:38:15.798 INFO:tasks.workunit.client.1.vm07.stdout:3/724: read dc/dd/d43/d5c/fa9 [3606098,1526] 0 2026-03-10T12:38:15.799 INFO:tasks.workunit.client.1.vm07.stdout:7/670: truncate d0/d61/d79/f8d 271675 0 2026-03-10T12:38:15.802 INFO:tasks.workunit.client.1.vm07.stdout:3/725: dwrite dc/dd/d43/d76/d95/fb6 [0,4194304] 0 2026-03-10T12:38:15.835 INFO:tasks.workunit.client.0.vm00.stdout:3/928: dread dd/d2a/da2/de1/d45/f75 [0,4194304] 0 2026-03-10T12:38:15.836 INFO:tasks.workunit.client.0.vm00.stdout:3/929: chown dd/d18/d13/c26 15 1 2026-03-10T12:38:15.838 INFO:tasks.workunit.client.0.vm00.stdout:3/930: truncate dd/d2a/da2/db4/fe8 568004 0 2026-03-10T12:38:15.842 INFO:tasks.workunit.client.0.vm00.stdout:3/931: mkdir dd/d27/d2c/d132 0 2026-03-10T12:38:15.844 INFO:tasks.workunit.client.0.vm00.stdout:8/859: write d0/d46/d7e/fd8 [231568,34946] 0 2026-03-10T12:38:15.845 INFO:tasks.workunit.client.0.vm00.stdout:3/932: symlink dd/d3d/d8a/de0/d55/dfd/d125/d2b/l133 0 2026-03-10T12:38:15.847 INFO:tasks.workunit.client.1.vm07.stdout:5/740: 
write d0/d22/d18/d19/d2e/d67/fe2 [665806,1566] 0 2026-03-10T12:38:15.852 INFO:tasks.workunit.client.0.vm00.stdout:8/860: mkdir d0/d58/d68/d10c 0 2026-03-10T12:38:15.852 INFO:tasks.workunit.client.1.vm07.stdout:9/782: dwrite d5/d1f/f3d [0,4194304] 0 2026-03-10T12:38:15.852 INFO:tasks.workunit.client.0.vm00.stdout:5/972: dwrite d1f/d6a/d94/fb3 [0,4194304] 0 2026-03-10T12:38:15.854 INFO:tasks.workunit.client.1.vm07.stdout:0/807: dwrite d0/d14/d5f/d76/f27 [0,4194304] 0 2026-03-10T12:38:15.855 INFO:tasks.workunit.client.0.vm00.stdout:3/933: symlink dd/d18/d13/d1d/dc6/dd5/l134 0 2026-03-10T12:38:15.856 INFO:tasks.workunit.client.1.vm07.stdout:8/687: dwrite d1/d3/d6/f24 [0,4194304] 0 2026-03-10T12:38:15.862 INFO:tasks.workunit.client.0.vm00.stdout:0/783: write d3/d7/d4c/dcc/dea/ff4 [383859,1858] 0 2026-03-10T12:38:15.863 INFO:tasks.workunit.client.0.vm00.stdout:0/784: write d3/d22/fde [3741442,101652] 0 2026-03-10T12:38:15.865 INFO:tasks.workunit.client.1.vm07.stdout:2/629: dwrite d0/d42/d1f/d20/f39 [0,4194304] 0 2026-03-10T12:38:15.870 INFO:tasks.workunit.client.1.vm07.stdout:6/675: write d1/dd7/d66/fab [351879,2408] 0 2026-03-10T12:38:15.879 INFO:tasks.workunit.client.0.vm00.stdout:5/973: creat d1f/d6a/d118/d8e/f15a x:0 0 0 2026-03-10T12:38:15.879 INFO:tasks.workunit.client.0.vm00.stdout:0/785: rmdir d3/db 39 2026-03-10T12:38:15.884 INFO:tasks.workunit.client.0.vm00.stdout:3/934: mknod dd/d2a/c135 0 2026-03-10T12:38:15.894 INFO:tasks.workunit.client.0.vm00.stdout:2/925: dwrite d4/d6/f22 [0,4194304] 0 2026-03-10T12:38:15.902 INFO:tasks.workunit.client.0.vm00.stdout:5/974: fdatasync d1f/d26/d2b/d35/f42 0 2026-03-10T12:38:15.916 INFO:tasks.workunit.client.0.vm00.stdout:5/975: creat d1f/d39/d133/f15b x:0 0 0 2026-03-10T12:38:15.918 INFO:tasks.workunit.client.0.vm00.stdout:2/926: link d4/d53/d76/d9b/dad/f50 d4/d6/d2d/d31/f127 0 2026-03-10T12:38:15.918 INFO:tasks.workunit.client.0.vm00.stdout:0/786: getdents d3/d7/d4c/d5b/d38/db3/de2 0 2026-03-10T12:38:15.918 
INFO:tasks.workunit.client.0.vm00.stdout:2/927: dread - d4/d53/d9e/d101/f118 zero size 2026-03-10T12:38:15.923 INFO:tasks.workunit.client.0.vm00.stdout:5/976: symlink d1f/d26/d2b/d35/d78/d99/dcd/d122/l15c 0 2026-03-10T12:38:15.938 INFO:tasks.workunit.client.1.vm07.stdout:4/812: write d0/d4/d5/da/fd4 [94244,19784] 0 2026-03-10T12:38:15.938 INFO:tasks.workunit.client.0.vm00.stdout:5/977: write d1f/d26/d2b/f5c [3187550,121064] 0 2026-03-10T12:38:15.938 INFO:tasks.workunit.client.0.vm00.stdout:5/978: symlink d1f/d26/d2e/d58/d6b/d113/l15d 0 2026-03-10T12:38:15.938 INFO:tasks.workunit.client.0.vm00.stdout:7/686: write da/d1b/f22 [1053099,68519] 0 2026-03-10T12:38:15.938 INFO:tasks.workunit.client.0.vm00.stdout:2/928: truncate d4/dd/ff 4511878 0 2026-03-10T12:38:15.939 INFO:tasks.workunit.client.0.vm00.stdout:5/979: creat d1f/d26/d2e/d58/d6b/de0/f15e x:0 0 0 2026-03-10T12:38:15.940 INFO:tasks.workunit.client.0.vm00.stdout:7/687: creat da/d26/d50/d73/d89/ff4 x:0 0 0 2026-03-10T12:38:15.940 INFO:tasks.workunit.client.0.vm00.stdout:2/929: mkdir d4/d53/d76/d9b/d107/d128 0 2026-03-10T12:38:15.990 INFO:tasks.workunit.client.0.vm00.stdout:0/787: getdents d3/d7/d4c/d5b 0 2026-03-10T12:38:15.992 INFO:tasks.workunit.client.0.vm00.stdout:3/935: read dd/d2a/da2/de1/d45/f47 [696777,114302] 0 2026-03-10T12:38:15.994 INFO:tasks.workunit.client.0.vm00.stdout:3/936: creat dd/d2a/da2/de1/f136 x:0 0 0 2026-03-10T12:38:15.996 INFO:tasks.workunit.client.0.vm00.stdout:3/937: rmdir dd/d2a/da2/d10e 0 2026-03-10T12:38:15.998 INFO:tasks.workunit.client.0.vm00.stdout:0/788: chown d3/db/d24/fb1 1060530819 1 2026-03-10T12:38:15.999 INFO:tasks.workunit.client.0.vm00.stdout:0/789: truncate d3/db/d77/d82/fce 4620149 0 2026-03-10T12:38:16.000 INFO:tasks.workunit.client.0.vm00.stdout:0/790: chown d3/d7/d4c/d5b/ffe 74413 1 2026-03-10T12:38:16.003 INFO:tasks.workunit.client.0.vm00.stdout:3/938: read dd/d18/d13/d99/da5/fd4 [2603295,129727] 0 2026-03-10T12:38:16.019 
INFO:tasks.workunit.client.0.vm00.stdout:2/930: mkdir d4/d6/d93/d129 0 2026-03-10T12:38:16.022 INFO:tasks.workunit.client.0.vm00.stdout:4/938: rename df/d1f/d36/d3a/c55 to df/d32/d64/c136 0 2026-03-10T12:38:16.024 INFO:tasks.workunit.client.0.vm00.stdout:4/939: truncate df/f85 3728457 0 2026-03-10T12:38:16.031 INFO:tasks.workunit.client.0.vm00.stdout:4/940: dread df/d1f/d36/d3a/d41/f47 [0,4194304] 0 2026-03-10T12:38:16.032 INFO:tasks.workunit.client.0.vm00.stdout:4/941: rmdir df/d6c/d90 39 2026-03-10T12:38:16.033 INFO:tasks.workunit.client.0.vm00.stdout:4/942: symlink df/l137 0 2026-03-10T12:38:16.034 INFO:tasks.workunit.client.0.vm00.stdout:4/943: chown df/d1f/d36/d3a/d41/df7 6262364 1 2026-03-10T12:38:16.039 INFO:tasks.workunit.client.0.vm00.stdout:9/935: dwrite d0/d3d/d59/fad [0,4194304] 0 2026-03-10T12:38:16.042 INFO:tasks.workunit.client.0.vm00.stdout:0/791: symlink d3/d7/d3c/lff 0 2026-03-10T12:38:16.043 INFO:tasks.workunit.client.0.vm00.stdout:9/936: truncate d0/d5/dc/f2a 3405128 0 2026-03-10T12:38:16.044 INFO:tasks.workunit.client.0.vm00.stdout:6/633: rename d2/da/f2c to d2/d9f/dce/fea 0 2026-03-10T12:38:16.045 INFO:tasks.workunit.client.1.vm07.stdout:1/714: dwrite d9/d2d/d80/d8e/fa0 [0,4194304] 0 2026-03-10T12:38:16.046 INFO:tasks.workunit.client.1.vm07.stdout:7/671: write d0/d47/dab/dae/fbd [465872,127520] 0 2026-03-10T12:38:16.047 INFO:tasks.workunit.client.0.vm00.stdout:6/634: dwrite d2/d14/d7a/db9/f4a [0,4194304] 0 2026-03-10T12:38:16.047 INFO:tasks.workunit.client.1.vm07.stdout:3/726: write f1 [5287626,78701] 0 2026-03-10T12:38:16.048 INFO:tasks.workunit.client.1.vm07.stdout:3/727: chown dc/dd/fc5 53640212 1 2026-03-10T12:38:16.063 INFO:tasks.workunit.client.1.vm07.stdout:9/783: rename d5/d16/d23/d26/l60 to d5/d13/d57/l107 0 2026-03-10T12:38:16.066 INFO:tasks.workunit.client.0.vm00.stdout:3/939: getdents dd/d27/d2c/def/d118 0 2026-03-10T12:38:16.068 INFO:tasks.workunit.client.0.vm00.stdout:7/688: rename da/d41/d7b/d9d/dc8/ff3 to da/d25/d2c/d82/ff5 0 
2026-03-10T12:38:16.073 INFO:tasks.workunit.client.0.vm00.stdout:8/861: dwrite d0/d93/d17/fb2 [0,4194304] 0 2026-03-10T12:38:16.074 INFO:tasks.workunit.client.0.vm00.stdout:8/862: read d0/dd/d38/d81/f88 [2600117,67074] 0 2026-03-10T12:38:16.075 INFO:tasks.workunit.client.0.vm00.stdout:8/863: dread - d0/d93/d36/d51/f109 zero size 2026-03-10T12:38:16.077 INFO:tasks.workunit.client.1.vm07.stdout:0/808: rmdir d0/d14/d5f/d76/d2f/d31/d4f 39 2026-03-10T12:38:16.083 INFO:tasks.workunit.client.1.vm07.stdout:8/688: symlink d1/d3/d6/d54/le2 0 2026-03-10T12:38:16.084 INFO:tasks.workunit.client.0.vm00.stdout:6/635: fdatasync d2/d16/d74/f7d 0 2026-03-10T12:38:16.084 INFO:tasks.workunit.client.0.vm00.stdout:6/636: chown d2/d14 315 1 2026-03-10T12:38:16.086 INFO:tasks.workunit.client.1.vm07.stdout:2/630: unlink d0/d42/d1f/f84 0 2026-03-10T12:38:16.091 INFO:tasks.workunit.client.1.vm07.stdout:9/784: dread d5/d16/d23/d26/f5c [0,4194304] 0 2026-03-10T12:38:16.092 INFO:tasks.workunit.client.0.vm00.stdout:8/864: symlink d0/d46/d89/l10d 0 2026-03-10T12:38:16.092 INFO:tasks.workunit.client.1.vm07.stdout:6/676: chown d1/l21 13 1 2026-03-10T12:38:16.093 INFO:tasks.workunit.client.0.vm00.stdout:8/865: write d0/d93/d36/d51/fe0 [108475,67598] 0 2026-03-10T12:38:16.093 INFO:tasks.workunit.client.0.vm00.stdout:7/689: fdatasync da/d41/f72 0 2026-03-10T12:38:16.093 INFO:tasks.workunit.client.0.vm00.stdout:8/866: stat d0/d93/d36/d7d 0 2026-03-10T12:38:16.094 INFO:tasks.workunit.client.0.vm00.stdout:7/690: fdatasync da/d26/d37/d56/fed 0 2026-03-10T12:38:16.097 INFO:tasks.workunit.client.1.vm07.stdout:1/715: truncate d9/d2d/d4f/d5a/f65 3076546 0 2026-03-10T12:38:16.098 INFO:tasks.workunit.client.0.vm00.stdout:0/792: getdents d3/d7/d4c/d5b/d38/d44 0 2026-03-10T12:38:16.100 INFO:tasks.workunit.client.0.vm00.stdout:0/793: creat d3/d7/d4c/d5b/d38/d44/f100 x:0 0 0 2026-03-10T12:38:16.101 INFO:tasks.workunit.client.1.vm07.stdout:8/689: creat d1/d3/d6c/fe3 x:0 0 0 2026-03-10T12:38:16.106 
INFO:tasks.workunit.client.1.vm07.stdout:9/785: mkdir d5/d13/d2c/de6/d64/d108 0 2026-03-10T12:38:16.106 INFO:tasks.workunit.client.1.vm07.stdout:6/677: fsync d1/d4/d6/d4e/d64/fa4 0 2026-03-10T12:38:16.110 INFO:tasks.workunit.client.0.vm00.stdout:5/980: dwrite d1f/d96/dbd/fc5 [0,4194304] 0 2026-03-10T12:38:16.111 INFO:tasks.workunit.client.1.vm07.stdout:1/716: rmdir d9/d2d/d4f/d75/d77/da7 39 2026-03-10T12:38:16.111 INFO:tasks.workunit.client.0.vm00.stdout:0/794: creat d3/d7/d3c/d74/f101 x:0 0 0 2026-03-10T12:38:16.112 INFO:tasks.workunit.client.1.vm07.stdout:3/728: symlink dc/dd/d43/d76/d95/lfa 0 2026-03-10T12:38:16.113 INFO:tasks.workunit.client.1.vm07.stdout:5/741: link d0/d22/d18/d19/d21/d54/dcb/db8/fca d0/d22/d18/d19/de5/f105 0 2026-03-10T12:38:16.113 INFO:tasks.workunit.client.1.vm07.stdout:5/742: chown d0/d22/d18/dc7 55599 1 2026-03-10T12:38:16.113 INFO:tasks.workunit.client.0.vm00.stdout:0/795: write d3/d7/d4c/d5b/d38/d44/d5a/ff8 [439818,104438] 0 2026-03-10T12:38:16.114 INFO:tasks.workunit.client.1.vm07.stdout:0/809: fsync d0/d14/d5f/d76/d2f/d31/d4f/fa7 0 2026-03-10T12:38:16.115 INFO:tasks.workunit.client.1.vm07.stdout:0/810: truncate d0/d14/d5f/f54 4288196 0 2026-03-10T12:38:16.116 INFO:tasks.workunit.client.1.vm07.stdout:0/811: write d0/d14/d5f/d41/d6a/f102 [211378,39488] 0 2026-03-10T12:38:16.116 INFO:tasks.workunit.client.1.vm07.stdout:8/690: creat d1/d3/d6/d50/d70/fe4 x:0 0 0 2026-03-10T12:38:16.117 INFO:tasks.workunit.client.1.vm07.stdout:9/786: dread - d5/d1f/d7d/fcc zero size 2026-03-10T12:38:16.117 INFO:tasks.workunit.client.0.vm00.stdout:5/981: dwrite d1f/d26/d2e/fb8 [0,4194304] 0 2026-03-10T12:38:16.119 INFO:tasks.workunit.client.0.vm00.stdout:5/982: creat d1f/d6a/d94/dc9/d106/f15f x:0 0 0 2026-03-10T12:38:16.123 INFO:tasks.workunit.client.0.vm00.stdout:5/983: symlink d1f/d26/d6f/l160 0 2026-03-10T12:38:16.128 INFO:tasks.workunit.client.1.vm07.stdout:9/787: dwrite d5/d13/d57/d3e/fa9 [4194304,4194304] 0 2026-03-10T12:38:16.131 
INFO:tasks.workunit.client.1.vm07.stdout:1/717: dread d9/f1a [0,4194304] 0 2026-03-10T12:38:16.136 INFO:tasks.workunit.client.1.vm07.stdout:5/743: creat d0/d22/d18/d19/d36/d75/d77/f106 x:0 0 0 2026-03-10T12:38:16.150 INFO:tasks.workunit.client.1.vm07.stdout:0/812: dread d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/faf [0,4194304] 0 2026-03-10T12:38:16.150 INFO:tasks.workunit.client.1.vm07.stdout:6/678: link d1/d4/d6/d46/d4d/fb d1/dd7/da3/fdb 0 2026-03-10T12:38:16.150 INFO:tasks.workunit.client.1.vm07.stdout:3/729: mknod dc/cfb 0 2026-03-10T12:38:16.150 INFO:tasks.workunit.client.1.vm07.stdout:1/718: chown d9/d2d/de2/c46 0 1 2026-03-10T12:38:16.150 INFO:tasks.workunit.client.1.vm07.stdout:1/719: dread - d9/df/d29/d2b/db1/fdc zero size 2026-03-10T12:38:16.150 INFO:tasks.workunit.client.1.vm07.stdout:0/813: fdatasync d0/d14/d5f/d41/fe8 0 2026-03-10T12:38:16.150 INFO:tasks.workunit.client.1.vm07.stdout:6/679: unlink d1/d4/d6/f2a 0 2026-03-10T12:38:16.152 INFO:tasks.workunit.client.1.vm07.stdout:3/730: stat dc/dd/d1f/d45/fea 0 2026-03-10T12:38:16.155 INFO:tasks.workunit.client.1.vm07.stdout:0/814: stat d0/d14/l25 0 2026-03-10T12:38:16.166 INFO:tasks.workunit.client.1.vm07.stdout:1/720: symlink d9/df/d29/d2b/d92/d9d/lec 0 2026-03-10T12:38:16.166 INFO:tasks.workunit.client.1.vm07.stdout:1/721: truncate d9/df/fe7 826840 0 2026-03-10T12:38:16.166 INFO:tasks.workunit.client.1.vm07.stdout:6/680: fsync d1/dd7/f5e 0 2026-03-10T12:38:16.166 INFO:tasks.workunit.client.1.vm07.stdout:2/631: dread d0/d42/d4e/d77/f6f [0,4194304] 0 2026-03-10T12:38:16.166 INFO:tasks.workunit.client.1.vm07.stdout:5/744: dread d0/d22/d18/d19/d21/f42 [4194304,4194304] 0 2026-03-10T12:38:16.167 INFO:tasks.workunit.client.0.vm00.stdout:0/796: sync 2026-03-10T12:38:16.168 INFO:tasks.workunit.client.1.vm07.stdout:8/691: rename d1/d3/d11/f3c to d1/d3/d11/d87/fe5 0 2026-03-10T12:38:16.169 INFO:tasks.workunit.client.1.vm07.stdout:2/632: dread d0/d42/d26/f2e [4194304,4194304] 0 2026-03-10T12:38:16.172 
INFO:tasks.workunit.client.1.vm07.stdout:3/731: fsync dc/dd/d43/d76/d95/da0/fa2 0 2026-03-10T12:38:16.186 INFO:tasks.workunit.client.1.vm07.stdout:9/788: dread d5/d69/ffe [0,4194304] 0 2026-03-10T12:38:16.187 INFO:tasks.workunit.client.1.vm07.stdout:9/789: chown d5/d16/d23 421109 1 2026-03-10T12:38:16.191 INFO:tasks.workunit.client.1.vm07.stdout:1/722: truncate d9/f1f 634196 0 2026-03-10T12:38:16.193 INFO:tasks.workunit.client.0.vm00.stdout:7/691: rmdir da/d41/d7b/d9d/dc8 39 2026-03-10T12:38:16.193 INFO:tasks.workunit.client.0.vm00.stdout:0/797: mkdir d3/d7/d4c/dcc/dea/d102 0 2026-03-10T12:38:16.200 INFO:tasks.workunit.client.1.vm07.stdout:0/815: dread d0/d14/d5f/d76/d2f/d31/d4f/f92 [0,4194304] 0 2026-03-10T12:38:16.204 INFO:tasks.workunit.client.1.vm07.stdout:0/816: dwrite d0/d14/d5f/d76/d2f/ffe [0,4194304] 0 2026-03-10T12:38:16.206 INFO:tasks.workunit.client.1.vm07.stdout:0/817: chown d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65 17049 1 2026-03-10T12:38:16.222 INFO:tasks.workunit.client.1.vm07.stdout:7/672: write d0/d61/db4/fad [260803,43697] 0 2026-03-10T12:38:16.230 INFO:tasks.workunit.client.1.vm07.stdout:4/813: truncate d0/d4/df2/df6/f50 3374686 0 2026-03-10T12:38:16.241 INFO:tasks.workunit.client.0.vm00.stdout:0/798: mkdir d3/db/d103 0 2026-03-10T12:38:16.242 INFO:tasks.workunit.client.1.vm07.stdout:6/681: mkdir d1/d4/d6/d46/d4d/dc7/dd9/ddc 0 2026-03-10T12:38:16.245 INFO:tasks.workunit.client.1.vm07.stdout:0/818: read d0/f1d [1922190,29060] 0 2026-03-10T12:38:16.246 INFO:tasks.workunit.client.0.vm00.stdout:7/692: getdents da/d1b 0 2026-03-10T12:38:16.250 INFO:tasks.workunit.client.1.vm07.stdout:8/692: symlink d1/d3/d6/d50/d70/dcf/le6 0 2026-03-10T12:38:16.254 INFO:tasks.workunit.client.1.vm07.stdout:8/693: dwrite d1/d3/d6/f24 [0,4194304] 0 2026-03-10T12:38:16.259 INFO:tasks.workunit.client.0.vm00.stdout:7/693: mknod da/d25/d2c/cf6 0 2026-03-10T12:38:16.261 INFO:tasks.workunit.client.1.vm07.stdout:3/732: link dc/dd/d28/d7a/f88 dc/d18/de2/df6/ffc 0 
2026-03-10T12:38:16.262 INFO:tasks.workunit.client.1.vm07.stdout:3/733: chown dc/dd/d28/d7a/cca 287538 1 2026-03-10T12:38:16.264 INFO:tasks.workunit.client.1.vm07.stdout:0/819: stat d0/d14/d5f/d76/d2f/d31/d4f/d9d/dd4/fd5 0 2026-03-10T12:38:16.265 INFO:tasks.workunit.client.1.vm07.stdout:1/723: link d9/df/d29/d2b/d31/d91/d59/f84 d9/df/d29/d2b/d92/fed 0 2026-03-10T12:38:16.265 INFO:tasks.workunit.client.0.vm00.stdout:7/694: creat da/d26/d37/d56/ff7 x:0 0 0 2026-03-10T12:38:16.265 INFO:tasks.workunit.client.1.vm07.stdout:1/724: chown d9/d2d/d4f/d5a 4 1 2026-03-10T12:38:16.274 INFO:tasks.workunit.client.1.vm07.stdout:9/790: write d5/d13/d6c/da4/fd0 [4077701,83846] 0 2026-03-10T12:38:16.276 INFO:tasks.workunit.client.1.vm07.stdout:2/633: dwrite d0/f8d [4194304,4194304] 0 2026-03-10T12:38:16.285 INFO:tasks.workunit.client.1.vm07.stdout:3/734: mknod dc/dd/d1f/d45/cfd 0 2026-03-10T12:38:16.291 INFO:tasks.workunit.client.1.vm07.stdout:5/745: rename d0/d22/d18/d19/d21/d3a/c64 to d0/c107 0 2026-03-10T12:38:16.301 INFO:tasks.workunit.client.0.vm00.stdout:7/695: mkdir da/d25/d2c/d82/d68/df8 0 2026-03-10T12:38:16.301 INFO:tasks.workunit.client.0.vm00.stdout:0/799: dwrite d3/d7/f9f [0,4194304] 0 2026-03-10T12:38:16.301 INFO:tasks.workunit.client.1.vm07.stdout:5/746: write d0/d22/d18/d19/ffb [708611,22211] 0 2026-03-10T12:38:16.301 INFO:tasks.workunit.client.1.vm07.stdout:0/820: fsync d0/d14/d5f/d76/d2f/d31/f4d 0 2026-03-10T12:38:16.301 INFO:tasks.workunit.client.1.vm07.stdout:7/673: dwrite d0/d67/f71 [0,4194304] 0 2026-03-10T12:38:16.308 INFO:tasks.workunit.client.0.vm00.stdout:0/800: mknod d3/d22/d3a/c104 0 2026-03-10T12:38:16.313 INFO:tasks.workunit.client.1.vm07.stdout:5/747: rmdir d0/d22/d18/d19/d21 39 2026-03-10T12:38:16.315 INFO:tasks.workunit.client.0.vm00.stdout:0/801: symlink d3/d7/d3c/d74/l105 0 2026-03-10T12:38:16.316 INFO:tasks.workunit.client.1.vm07.stdout:9/791: mknod d5/d13/d2c/de6/d64/d108/c109 0 2026-03-10T12:38:16.319 
INFO:tasks.workunit.client.1.vm07.stdout:6/682: rename d1/d4/d6/d16/d49/f7a to d1/d4/d6/d16/fdd 0 2026-03-10T12:38:16.323 INFO:tasks.workunit.client.1.vm07.stdout:2/634: rename d0/d42/d4e/f81 to d0/d80/d93/fd6 0 2026-03-10T12:38:16.336 INFO:tasks.workunit.client.0.vm00.stdout:0/802: mkdir d3/d7/d4c/dcc/dea/d102/d106 0 2026-03-10T12:38:16.336 INFO:tasks.workunit.client.0.vm00.stdout:0/803: readlink d3/d7/d4c/d5b/d38/d44/d5a/l5c 0 2026-03-10T12:38:16.336 INFO:tasks.workunit.client.1.vm07.stdout:9/792: mkdir d5/d1f/d5e/d10a 0 2026-03-10T12:38:16.337 INFO:tasks.workunit.client.1.vm07.stdout:9/793: truncate d5/d16/d23/d26/d68/fdc 411401 0 2026-03-10T12:38:16.337 INFO:tasks.workunit.client.1.vm07.stdout:3/735: rename dc/dd/lad to dc/d18/de2/df6/lfe 0 2026-03-10T12:38:16.337 INFO:tasks.workunit.client.1.vm07.stdout:5/748: getdents d0/d22/d18/d19/d36/d75/ddc 0 2026-03-10T12:38:16.337 INFO:tasks.workunit.client.1.vm07.stdout:7/674: dread d0/d61/d79/fba [0,4194304] 0 2026-03-10T12:38:16.337 INFO:tasks.workunit.client.1.vm07.stdout:7/675: chown d0/d47/lc1 683 1 2026-03-10T12:38:16.339 INFO:tasks.workunit.client.0.vm00.stdout:2/931: write d4/dd/d63/f83 [972050,56627] 0 2026-03-10T12:38:16.340 INFO:tasks.workunit.client.0.vm00.stdout:2/932: write d4/d6/d2d/d31/fcc [4817921,82961] 0 2026-03-10T12:38:16.344 INFO:tasks.workunit.client.0.vm00.stdout:4/944: write df/d1f/d22/d26/d65/d91/f50 [3162129,110356] 0 2026-03-10T12:38:16.345 INFO:tasks.workunit.client.1.vm07.stdout:0/821: dread d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/f9b [0,4194304] 0 2026-03-10T12:38:16.345 INFO:tasks.workunit.client.0.vm00.stdout:4/945: mkdir df/d1f/d22/dcb/d138 0 2026-03-10T12:38:16.351 INFO:tasks.workunit.client.1.vm07.stdout:6/683: rename d1/d4/d6/d4e/d64/fb1 to d1/d4/d6/d43/fde 0 2026-03-10T12:38:16.351 INFO:tasks.workunit.client.0.vm00.stdout:9/937: dwrite d0/d7f/d88/f113 [0,4194304] 0 2026-03-10T12:38:16.354 INFO:tasks.workunit.client.0.vm00.stdout:9/938: fdatasync d0/d3d/d59/d4e/dba/d19/d50/fbd 0 
2026-03-10T12:38:16.354 INFO:tasks.workunit.client.1.vm07.stdout:6/684: dwrite d1/d4/d6/d16/fbc [0,4194304] 0 2026-03-10T12:38:16.354 INFO:tasks.workunit.client.0.vm00.stdout:9/939: write d0/d3d/d59/f45 [993398,43212] 0 2026-03-10T12:38:16.355 INFO:tasks.workunit.client.0.vm00.stdout:9/940: write d0/d3d/d59/d74/f102 [3345575,109248] 0 2026-03-10T12:38:16.367 INFO:tasks.workunit.client.0.vm00.stdout:9/941: symlink d0/d5/d143/l14d 0 2026-03-10T12:38:16.368 INFO:tasks.workunit.client.1.vm07.stdout:4/814: dwrite d0/d4/d5/d78/dc5/df7/fb0 [0,4194304] 0 2026-03-10T12:38:16.370 INFO:tasks.workunit.client.1.vm07.stdout:4/815: write d0/d4/d5/fe8 [212197,66656] 0 2026-03-10T12:38:16.381 INFO:tasks.workunit.client.0.vm00.stdout:3/940: dwrite f7 [8388608,4194304] 0 2026-03-10T12:38:16.383 INFO:tasks.workunit.client.0.vm00.stdout:6/637: write d2/d51/f63 [1233324,14020] 0 2026-03-10T12:38:16.386 INFO:tasks.workunit.client.0.vm00.stdout:2/933: unlink d4/d53/l55 0 2026-03-10T12:38:16.390 INFO:tasks.workunit.client.0.vm00.stdout:3/941: unlink dd/d4e/faa 0 2026-03-10T12:38:16.394 INFO:tasks.workunit.client.1.vm07.stdout:8/694: dwrite d1/d3/d11/f46 [0,4194304] 0 2026-03-10T12:38:16.397 INFO:tasks.workunit.client.1.vm07.stdout:3/736: mknod dc/dd/d43/d76/d95/db8/cff 0 2026-03-10T12:38:16.397 INFO:tasks.workunit.client.1.vm07.stdout:3/737: fsync dc/dd/f9a 0 2026-03-10T12:38:16.404 INFO:tasks.workunit.client.0.vm00.stdout:2/934: creat d4/d53/f12a x:0 0 0 2026-03-10T12:38:16.408 INFO:tasks.workunit.client.0.vm00.stdout:2/935: dread d4/d6/d2d/d31/fcc [0,4194304] 0 2026-03-10T12:38:16.410 INFO:tasks.workunit.client.0.vm00.stdout:2/936: rmdir d4/d53/d76/d9b 39 2026-03-10T12:38:16.411 INFO:tasks.workunit.client.1.vm07.stdout:0/822: rename d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dc8/cd6 to d0/d14/d5f/c110 0 2026-03-10T12:38:16.411 INFO:tasks.workunit.client.0.vm00.stdout:0/804: fdatasync d3/d7/d4c/d5b/d38/d44/d5a/ff8 0 2026-03-10T12:38:16.412 INFO:tasks.workunit.client.1.vm07.stdout:3/738: 
mknod dc/d18/d99/da3/def/c100 0 2026-03-10T12:38:16.413 INFO:tasks.workunit.client.1.vm07.stdout:3/739: chown dc/dd/d28/d7a/d8e/ca8 241885079 1 2026-03-10T12:38:16.414 INFO:tasks.workunit.client.1.vm07.stdout:8/695: truncate d1/d3/d40/f7e 79306 0 2026-03-10T12:38:16.415 INFO:tasks.workunit.client.1.vm07.stdout:6/685: getdents d1/d4/d6/d16/d1a/d2c 0 2026-03-10T12:38:16.420 INFO:tasks.workunit.client.1.vm07.stdout:0/823: unlink d0/d14/d5f/d76/d2f/d31/d79/d9e/lac 0 2026-03-10T12:38:16.420 INFO:tasks.workunit.client.0.vm00.stdout:2/937: dwrite d4/d6/f30 [4194304,4194304] 0 2026-03-10T12:38:16.420 INFO:tasks.workunit.client.0.vm00.stdout:0/805: symlink d3/d7/d4c/d9d/l107 0 2026-03-10T12:38:16.420 INFO:tasks.workunit.client.0.vm00.stdout:5/984: write d1f/d26/d2b/d37/f9e [3485,25864] 0 2026-03-10T12:38:16.420 INFO:tasks.workunit.client.0.vm00.stdout:0/806: readlink d3/db/d77/lcf 0 2026-03-10T12:38:16.426 INFO:tasks.workunit.client.1.vm07.stdout:8/696: mkdir d1/d3/d6c/dde/de7 0 2026-03-10T12:38:16.427 INFO:tasks.workunit.client.0.vm00.stdout:2/938: rmdir d4/d53/d76 39 2026-03-10T12:38:16.427 INFO:tasks.workunit.client.0.vm00.stdout:0/807: unlink d3/d22/lee 0 2026-03-10T12:38:16.427 INFO:tasks.workunit.client.1.vm07.stdout:6/686: creat d1/d4/d6/d46/d4d/fdf x:0 0 0 2026-03-10T12:38:16.428 INFO:tasks.workunit.client.1.vm07.stdout:6/687: chown d1/d4/d6/d16/d49/cb3 3876 1 2026-03-10T12:38:16.429 INFO:tasks.workunit.client.1.vm07.stdout:6/688: readlink d1/d4/d6/d16/l2f 0 2026-03-10T12:38:16.429 INFO:tasks.workunit.client.1.vm07.stdout:8/697: unlink d1/d3/d6/fb7 0 2026-03-10T12:38:16.430 INFO:tasks.workunit.client.0.vm00.stdout:0/808: write d3/d40/f7a [2378104,45] 0 2026-03-10T12:38:16.435 INFO:tasks.workunit.client.1.vm07.stdout:6/689: fsync d1/d4/d6/d16/faf 0 2026-03-10T12:38:16.444 INFO:tasks.workunit.client.0.vm00.stdout:0/809: truncate f2 3566500 0 2026-03-10T12:38:16.444 INFO:tasks.workunit.client.0.vm00.stdout:7/696: truncate da/d25/d2e/d4c/fe7 302852 0 
2026-03-10T12:38:16.444 INFO:tasks.workunit.client.1.vm07.stdout:6/690: chown d1/d4/d6/d16/d1a/d6e 547 1 2026-03-10T12:38:16.444 INFO:tasks.workunit.client.1.vm07.stdout:8/698: mknod d1/d3/d11/ce8 0 2026-03-10T12:38:16.444 INFO:tasks.workunit.client.1.vm07.stdout:1/725: write d9/d2d/de2/fbf [1341632,19965] 0 2026-03-10T12:38:16.445 INFO:tasks.workunit.client.0.vm00.stdout:7/697: fdatasync da/d25/d2c/f4f 0 2026-03-10T12:38:16.449 INFO:tasks.workunit.client.1.vm07.stdout:6/691: mkdir d1/d4/d6/d16/d1a/d2c/de0 0 2026-03-10T12:38:16.449 INFO:tasks.workunit.client.1.vm07.stdout:0/824: getdents d0/d14/d5f/d3b/dbc 0 2026-03-10T12:38:16.450 INFO:tasks.workunit.client.0.vm00.stdout:0/810: fdatasync d3/d7/f10 0 2026-03-10T12:38:16.450 INFO:tasks.workunit.client.1.vm07.stdout:0/825: read - d0/d14/d5f/d76/d2f/d31/d79/d9e/f101 zero size 2026-03-10T12:38:16.453 INFO:tasks.workunit.client.0.vm00.stdout:2/939: creat d4/d53/d76/d9b/f12b x:0 0 0 2026-03-10T12:38:16.453 INFO:tasks.workunit.client.1.vm07.stdout:6/692: truncate d1/d4/d6/f8d 1603320 0 2026-03-10T12:38:16.456 INFO:tasks.workunit.client.1.vm07.stdout:6/693: dwrite d1/d4/d6/f30 [0,4194304] 0 2026-03-10T12:38:16.466 INFO:tasks.workunit.client.1.vm07.stdout:6/694: read d1/d4/d6/d43/d65/f7f [455271,124676] 0 2026-03-10T12:38:16.466 INFO:tasks.workunit.client.0.vm00.stdout:3/942: dread dd/d2a/da2/de1/d100/f8c [0,4194304] 0 2026-03-10T12:38:16.466 INFO:tasks.workunit.client.0.vm00.stdout:3/943: fsync dd/d2a/da2/de1/d38/f63 0 2026-03-10T12:38:16.466 INFO:tasks.workunit.client.0.vm00.stdout:4/946: write df/d1f/d22/d26/d65/d91/d101/fe6 [3270555,40330] 0 2026-03-10T12:38:16.466 INFO:tasks.workunit.client.1.vm07.stdout:0/826: symlink d0/d14/d5f/d76/d2f/d31/d4f/d9d/l111 0 2026-03-10T12:38:16.470 INFO:tasks.workunit.client.0.vm00.stdout:8/867: dwrite d0/d93/d2d/f75 [0,4194304] 0 2026-03-10T12:38:16.471 INFO:tasks.workunit.client.0.vm00.stdout:1/929: dread da/d21/db3/d59/d120/d72/f9a [0,4194304] 0 2026-03-10T12:38:16.474 
INFO:tasks.workunit.client.1.vm07.stdout:6/695: unlink d1/d4/d6/d16/d1a/d2c/f78 0 2026-03-10T12:38:16.477 INFO:tasks.workunit.client.1.vm07.stdout:0/827: write d0/d14/d7c/fad [55931,42326] 0 2026-03-10T12:38:16.478 INFO:tasks.workunit.client.1.vm07.stdout:0/828: fsync d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/fa4 0 2026-03-10T12:38:16.481 INFO:tasks.workunit.client.1.vm07.stdout:6/696: unlink d1/d4/d6/f7d 0 2026-03-10T12:38:16.481 INFO:tasks.workunit.client.1.vm07.stdout:6/697: fdatasync d1/d4/d9b/fc8 0 2026-03-10T12:38:16.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:16 vm00.local ceph-mon[50686]: from='client.? 192.168.123.100:0/1075449107' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:38:16.487 INFO:tasks.workunit.client.1.vm07.stdout:0/829: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/f112 x:0 0 0 2026-03-10T12:38:16.493 INFO:tasks.workunit.client.1.vm07.stdout:6/698: mkdir d1/d4/d9b/de1 0 2026-03-10T12:38:16.495 INFO:tasks.workunit.client.1.vm07.stdout:0/830: mknod d0/d14/d5f/d76/da1/c113 0 2026-03-10T12:38:16.497 INFO:tasks.workunit.client.0.vm00.stdout:8/868: rmdir d0/d93/d36/d51 39 2026-03-10T12:38:16.498 INFO:tasks.workunit.client.0.vm00.stdout:8/869: chown d0/d58/d68/l90 58924 1 2026-03-10T12:38:16.503 INFO:tasks.workunit.client.0.vm00.stdout:2/940: creat d4/d6/d2d/d3a/d43/dd5/f12c x:0 0 0 2026-03-10T12:38:16.503 INFO:tasks.workunit.client.1.vm07.stdout:4/816: sync 2026-03-10T12:38:16.503 INFO:tasks.workunit.client.1.vm07.stdout:1/726: sync 2026-03-10T12:38:16.503 INFO:tasks.workunit.client.1.vm07.stdout:4/817: chown d0/d4/d5/da/d95 2415 1 2026-03-10T12:38:16.508 INFO:tasks.workunit.client.0.vm00.stdout:5/985: dread f11 [0,4194304] 0 2026-03-10T12:38:16.518 INFO:tasks.workunit.client.0.vm00.stdout:8/870: chown d0/dd/l57 516657108 1 2026-03-10T12:38:16.519 INFO:tasks.workunit.client.0.vm00.stdout:6/638: write d2/f5e [2663521,111956] 0 2026-03-10T12:38:16.519 
INFO:tasks.workunit.client.1.vm07.stdout:2/635: write d0/d42/d26/d38/d4f/f65 [107786,85683] 0 2026-03-10T12:38:16.519 INFO:tasks.workunit.client.1.vm07.stdout:7/676: write d0/f3a [2902300,78744] 0 2026-03-10T12:38:16.520 INFO:tasks.workunit.client.0.vm00.stdout:6/639: dread - d2/da/dc/d94/fc7 zero size 2026-03-10T12:38:16.520 INFO:tasks.workunit.client.1.vm07.stdout:9/794: truncate d5/d13/d6c/fdf 2787748 0 2026-03-10T12:38:16.521 INFO:tasks.workunit.client.1.vm07.stdout:5/749: write d0/d22/d18/d19/d21/d54/f7d [1287396,1766] 0 2026-03-10T12:38:16.521 INFO:tasks.workunit.client.1.vm07.stdout:3/740: write dc/f17 [3533065,108681] 0 2026-03-10T12:38:16.523 INFO:tasks.workunit.client.1.vm07.stdout:1/727: dwrite d9/df/d29/d2b/d31/fd8 [0,4194304] 0 2026-03-10T12:38:16.524 INFO:tasks.workunit.client.1.vm07.stdout:0/831: truncate d0/d14/d7c/f90 1766439 0 2026-03-10T12:38:16.525 INFO:tasks.workunit.client.0.vm00.stdout:5/986: truncate d1f/d26/de3/db7/ff7 194546 0 2026-03-10T12:38:16.527 INFO:tasks.workunit.client.1.vm07.stdout:4/818: dwrite d0/d4/d10/d3c/fe5 [0,4194304] 0 2026-03-10T12:38:16.539 INFO:tasks.workunit.client.0.vm00.stdout:3/944: rename dd/d18/d13/d1d/dc6/d106/f9c to dd/d2a/da2/de1/f137 0 2026-03-10T12:38:16.539 INFO:tasks.workunit.client.0.vm00.stdout:2/941: mknod d4/d10f/c12d 0 2026-03-10T12:38:16.541 INFO:tasks.workunit.client.0.vm00.stdout:2/942: truncate d4/d53/d9e/d101/f118 133390 0 2026-03-10T12:38:16.541 INFO:tasks.workunit.client.0.vm00.stdout:5/987: mkdir d1f/d26/de3/db7/d161 0 2026-03-10T12:38:16.542 INFO:tasks.workunit.client.0.vm00.stdout:2/943: fsync d4/d10f/fce 0 2026-03-10T12:38:16.543 INFO:tasks.workunit.client.0.vm00.stdout:2/944: readlink d4/d6/dca/l122 0 2026-03-10T12:38:16.544 INFO:tasks.workunit.client.0.vm00.stdout:8/871: dwrite d0/f9d [0,4194304] 0 2026-03-10T12:38:16.545 INFO:tasks.workunit.client.0.vm00.stdout:8/872: stat d0/d46/fc6 0 2026-03-10T12:38:16.545 INFO:tasks.workunit.client.1.vm07.stdout:2/636: creat d0/d42/d26/d38/d4f/d5d/fd7 
x:0 0 0 2026-03-10T12:38:16.548 INFO:tasks.workunit.client.1.vm07.stdout:7/677: mknod d0/d57/dd6/d80/ce3 0 2026-03-10T12:38:16.557 INFO:tasks.workunit.client.1.vm07.stdout:0/832: write d0/d14/d5f/d76/d2f/d31/d4f/d9d/dd4/ffb [1328657,7379] 0 2026-03-10T12:38:16.560 INFO:tasks.workunit.client.0.vm00.stdout:8/873: creat d0/d93/d36/d5b/f10e x:0 0 0 2026-03-10T12:38:16.561 INFO:tasks.workunit.client.0.vm00.stdout:8/874: truncate d0/dd/d38/f100 723996 0 2026-03-10T12:38:16.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:16 vm07.local ceph-mon[58582]: from='client.? 192.168.123.100:0/1075449107' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:38:16.566 INFO:tasks.workunit.client.0.vm00.stdout:3/945: mkdir dd/d3d/d8a/de0/d55/d11a/d138 0 2026-03-10T12:38:16.573 INFO:tasks.workunit.client.1.vm07.stdout:8/699: write d1/f2 [142291,90683] 0 2026-03-10T12:38:16.573 INFO:tasks.workunit.client.0.vm00.stdout:8/875: truncate d0/d93/d36/d5b/f95 545790 0 2026-03-10T12:38:16.575 INFO:tasks.workunit.client.0.vm00.stdout:3/946: write dd/d3d/d8a/de0/d55/dfd/d125/d2b/d11c/f117 [932036,72286] 0 2026-03-10T12:38:16.576 INFO:tasks.workunit.client.1.vm07.stdout:8/700: dwrite d1/d3/f57 [0,4194304] 0 2026-03-10T12:38:16.582 INFO:tasks.workunit.client.1.vm07.stdout:1/728: dread d9/df/d29/d2b/f4e [4194304,4194304] 0 2026-03-10T12:38:16.585 INFO:tasks.workunit.client.0.vm00.stdout:3/947: mkdir dd/d139 0 2026-03-10T12:38:16.587 INFO:tasks.workunit.client.1.vm07.stdout:2/637: mknod d0/d42/d4e/dab/cd8 0 2026-03-10T12:38:16.588 INFO:tasks.workunit.client.0.vm00.stdout:3/948: dwrite dd/d64/f98 [0,4194304] 0 2026-03-10T12:38:16.592 INFO:tasks.workunit.client.1.vm07.stdout:2/638: dwrite d0/d42/d1f/d20/fa9 [0,4194304] 0 2026-03-10T12:38:16.596 INFO:tasks.workunit.client.0.vm00.stdout:3/949: mkdir dd/d139/d13a 0 2026-03-10T12:38:16.613 INFO:tasks.workunit.client.0.vm00.stdout:0/811: truncate d3/d22/fde 1860137 0 2026-03-10T12:38:16.614 
INFO:tasks.workunit.client.0.vm00.stdout:7/698: truncate da/d47/d87/fb3 128534 0 2026-03-10T12:38:16.614 INFO:tasks.workunit.client.1.vm07.stdout:6/699: getdents d1/d4/d9b 0 2026-03-10T12:38:16.623 INFO:tasks.workunit.client.1.vm07.stdout:9/795: dwrite d5/d69/d93/fd1 [0,4194304] 0 2026-03-10T12:38:16.628 INFO:tasks.workunit.client.1.vm07.stdout:0/833: truncate d0/d14/d7c/fba 199833 0 2026-03-10T12:38:16.633 INFO:tasks.workunit.client.1.vm07.stdout:8/701: mknod d1/d3/d40/d92/dba/ce9 0 2026-03-10T12:38:16.676 INFO:tasks.workunit.client.0.vm00.stdout:9/942: write d0/d5/f3b [2771903,130655] 0 2026-03-10T12:38:16.684 INFO:tasks.workunit.client.0.vm00.stdout:9/943: rename d0/d3d/d59/d4e/dba/d1e/d27/ce9 to d0/d3d/df2/c14e 0 2026-03-10T12:38:16.699 INFO:tasks.workunit.client.1.vm07.stdout:5/750: dread d0/d22/d18/fb4 [0,4194304] 0 2026-03-10T12:38:16.704 INFO:tasks.workunit.client.0.vm00.stdout:9/944: symlink d0/l14f 0 2026-03-10T12:38:16.715 INFO:tasks.workunit.client.0.vm00.stdout:9/945: creat d0/d3d/d43/f150 x:0 0 0 2026-03-10T12:38:16.716 INFO:tasks.workunit.client.0.vm00.stdout:9/946: write d0/d3d/d125/f149 [187941,123380] 0 2026-03-10T12:38:16.726 INFO:tasks.workunit.client.0.vm00.stdout:9/947: dwrite d0/d3d/d43/d53/fd1 [0,4194304] 0 2026-03-10T12:38:16.729 INFO:tasks.workunit.client.0.vm00.stdout:9/948: mkdir d0/d5/d151 0 2026-03-10T12:38:16.775 INFO:tasks.workunit.client.1.vm07.stdout:3/741: dwrite dc/d18/de2/df6/ffc [0,4194304] 0 2026-03-10T12:38:16.778 INFO:tasks.workunit.client.0.vm00.stdout:6/640: write d2/da/dc/f13 [1111956,27941] 0 2026-03-10T12:38:16.779 INFO:tasks.workunit.client.1.vm07.stdout:3/742: read dc/dd/d43/d76/d95/da0/fa2 [618759,13431] 0 2026-03-10T12:38:16.780 INFO:tasks.workunit.client.1.vm07.stdout:3/743: truncate dc/d18/de2/ff4 351363 0 2026-03-10T12:38:16.806 INFO:tasks.workunit.client.1.vm07.stdout:7/678: dwrite d0/f56 [0,4194304] 0 2026-03-10T12:38:16.822 INFO:tasks.workunit.client.0.vm00.stdout:7/699: dwrite da/d25/f5a [0,4194304] 0 
2026-03-10T12:38:16.827 INFO:tasks.workunit.client.0.vm00.stdout:4/947: write df/d93/dbc/fc3 [129493,92990] 0 2026-03-10T12:38:16.832 INFO:tasks.workunit.client.0.vm00.stdout:1/930: dwrite da/d12/d91/fb5 [0,4194304] 0 2026-03-10T12:38:16.845 INFO:tasks.workunit.client.0.vm00.stdout:6/641: rmdir d2/d16/d29/d31/d88/d92/daa/dc1 39 2026-03-10T12:38:16.847 INFO:tasks.workunit.client.0.vm00.stdout:1/931: creat da/d21/d39/f134 x:0 0 0 2026-03-10T12:38:16.851 INFO:tasks.workunit.client.1.vm07.stdout:6/700: mkdir d1/d4/d6/d16/d1a/d6e/de2 0 2026-03-10T12:38:16.851 INFO:tasks.workunit.client.0.vm00.stdout:5/988: dwrite f12 [0,4194304] 0 2026-03-10T12:38:16.861 INFO:tasks.workunit.client.0.vm00.stdout:2/945: dwrite d4/dd/fe6 [0,4194304] 0 2026-03-10T12:38:16.866 INFO:tasks.workunit.client.0.vm00.stdout:4/948: getdents df/d63/d77 0 2026-03-10T12:38:16.869 INFO:tasks.workunit.client.0.vm00.stdout:5/989: creat d1f/d26/d2b/d35/d78/f162 x:0 0 0 2026-03-10T12:38:16.881 INFO:tasks.workunit.client.0.vm00.stdout:8/876: dwrite d0/d93/d36/d5b/f65 [0,4194304] 0 2026-03-10T12:38:16.898 INFO:tasks.workunit.client.0.vm00.stdout:5/990: mkdir d1f/d26/d2b/d35/d78/d99/d163 0 2026-03-10T12:38:16.901 INFO:tasks.workunit.client.0.vm00.stdout:1/932: link da/d21/db3/d59/d120/d72/d7e/l12a da/d21/db3/d59/da6/da4/dda/dc0/dfe/l135 0 2026-03-10T12:38:16.901 INFO:tasks.workunit.client.0.vm00.stdout:8/877: mknod d0/dd/d38/d81/c10f 0 2026-03-10T12:38:16.905 INFO:tasks.workunit.client.0.vm00.stdout:5/991: chown d1f/d26/d2b/d35/d78/d99/daf/ff5 7946 1 2026-03-10T12:38:16.906 INFO:tasks.workunit.client.0.vm00.stdout:3/950: dwrite dd/d64/d93/ff7 [0,4194304] 0 2026-03-10T12:38:16.909 INFO:tasks.workunit.client.0.vm00.stdout:6/642: creat d2/d9f/feb x:0 0 0 2026-03-10T12:38:16.912 INFO:tasks.workunit.client.0.vm00.stdout:8/878: creat d0/dd/dfe/f110 x:0 0 0 2026-03-10T12:38:16.915 INFO:tasks.workunit.client.0.vm00.stdout:8/879: creat d0/dd/d38/f111 x:0 0 0 2026-03-10T12:38:16.915 
INFO:tasks.workunit.client.0.vm00.stdout:8/880: stat d0/d93/d36/d7d/le5 0 2026-03-10T12:38:16.915 INFO:tasks.workunit.client.0.vm00.stdout:8/881: stat d0/c66 0 2026-03-10T12:38:16.915 INFO:tasks.workunit.client.0.vm00.stdout:3/951: link dd/d18/d13/l70 dd/d2a/da2/de1/d101/l13b 0 2026-03-10T12:38:16.919 INFO:tasks.workunit.client.0.vm00.stdout:8/882: mknod d0/dd/d38/d81/df3/c112 0 2026-03-10T12:38:16.921 INFO:tasks.workunit.client.0.vm00.stdout:2/946: dread d4/d6/d121/f96 [0,4194304] 0 2026-03-10T12:38:16.921 INFO:tasks.workunit.client.0.vm00.stdout:5/992: dwrite d1f/d26/d2e/d58/d10c/f159 [0,4194304] 0 2026-03-10T12:38:16.922 INFO:tasks.workunit.client.1.vm07.stdout:8/702: truncate d1/f88 484605 0 2026-03-10T12:38:16.923 INFO:tasks.workunit.client.0.vm00.stdout:6/643: creat d2/d42/d80/d89/fec x:0 0 0 2026-03-10T12:38:16.923 INFO:tasks.workunit.client.1.vm07.stdout:3/744: dread - dc/dd/db5/fd3 zero size 2026-03-10T12:38:16.924 INFO:tasks.workunit.client.0.vm00.stdout:2/947: write d4/dd/da7/ffc [906322,56519] 0 2026-03-10T12:38:16.924 INFO:tasks.workunit.client.0.vm00.stdout:3/952: creat dd/d3d/d8a/de0/d55/d11a/f13c x:0 0 0 2026-03-10T12:38:16.925 INFO:tasks.workunit.client.0.vm00.stdout:8/883: mkdir d0/d93/d17/db1/d113 0 2026-03-10T12:38:16.926 INFO:tasks.workunit.client.0.vm00.stdout:5/993: mknod d1f/d26/d2b/d37/dcc/c164 0 2026-03-10T12:38:16.929 INFO:tasks.workunit.client.1.vm07.stdout:8/703: write d1/d3/d40/fd1 [169052,69552] 0 2026-03-10T12:38:16.933 INFO:tasks.workunit.client.0.vm00.stdout:2/948: chown d4/d78 4780207 1 2026-03-10T12:38:16.943 INFO:tasks.workunit.client.0.vm00.stdout:2/949: stat d4/d6/cb7 0 2026-03-10T12:38:16.943 INFO:tasks.workunit.client.0.vm00.stdout:6/644: truncate d2/da/dc/d2f/fb4 2135946 0 2026-03-10T12:38:16.943 INFO:tasks.workunit.client.0.vm00.stdout:5/994: rmdir d1f/d26/d6f 39 2026-03-10T12:38:16.943 INFO:tasks.workunit.client.0.vm00.stdout:8/884: creat d0/d93/d36/f114 x:0 0 0 2026-03-10T12:38:16.943 
INFO:tasks.workunit.client.0.vm00.stdout:3/953: getdents dd/d18/d13 0 2026-03-10T12:38:16.958 INFO:tasks.workunit.client.0.vm00.stdout:2/950: dread d4/d6/f4e [0,4194304] 0 2026-03-10T12:38:16.961 INFO:tasks.workunit.client.0.vm00.stdout:2/951: getdents d4/d53/d76/dba/de8 0 2026-03-10T12:38:16.962 INFO:tasks.workunit.client.0.vm00.stdout:2/952: mknod d4/d53/d76/dba/deb/c12e 0 2026-03-10T12:38:16.964 INFO:tasks.workunit.client.0.vm00.stdout:2/953: creat d4/d53/d76/dba/de8/f12f x:0 0 0 2026-03-10T12:38:16.969 INFO:tasks.workunit.client.0.vm00.stdout:3/954: dread dd/d18/f83 [0,4194304] 0 2026-03-10T12:38:16.975 INFO:tasks.workunit.client.0.vm00.stdout:1/933: sync 2026-03-10T12:38:16.975 INFO:tasks.workunit.client.0.vm00.stdout:6/645: sync 2026-03-10T12:38:16.978 INFO:tasks.workunit.client.0.vm00.stdout:1/934: truncate da/d12/d26/f69 2814799 0 2026-03-10T12:38:16.979 INFO:tasks.workunit.client.0.vm00.stdout:1/935: creat da/d12/db4/f136 x:0 0 0 2026-03-10T12:38:16.981 INFO:tasks.workunit.client.0.vm00.stdout:1/936: fdatasync da/d21/db3/fad 0 2026-03-10T12:38:16.981 INFO:tasks.workunit.client.0.vm00.stdout:9/949: write d0/d3d/d59/d4e/dba/d1e/d85/f11d [780701,87374] 0 2026-03-10T12:38:16.985 INFO:tasks.workunit.client.0.vm00.stdout:1/937: fsync da/fd5 0 2026-03-10T12:38:16.985 INFO:tasks.workunit.client.0.vm00.stdout:1/938: chown da/d12/d91/f108 3357087 1 2026-03-10T12:38:16.986 INFO:tasks.workunit.client.0.vm00.stdout:9/950: creat d0/d3d/d43/d53/d126/f152 x:0 0 0 2026-03-10T12:38:16.988 INFO:tasks.workunit.client.0.vm00.stdout:1/939: creat da/d24/d73/f137 x:0 0 0 2026-03-10T12:38:16.991 INFO:tasks.workunit.client.0.vm00.stdout:9/951: rename d0/d3d/d59/d4e/dba/f49 to d0/d3d/d59/d4e/dba/d1e/d2b/f153 0 2026-03-10T12:38:16.995 INFO:tasks.workunit.client.0.vm00.stdout:1/940: getdents da/d21/db3/d59/d120/d80 0 2026-03-10T12:38:16.996 INFO:tasks.workunit.client.0.vm00.stdout:6/646: read d2/da/dc/d2f/f56 [323938,13591] 0 2026-03-10T12:38:16.998 
INFO:tasks.workunit.client.0.vm00.stdout:1/941: rename da/d12/db4/cee to da/d24/d5a/dd9/c138 0 2026-03-10T12:38:17.000 INFO:tasks.workunit.client.0.vm00.stdout:6/647: write d2/f5e [3781913,93347] 0 2026-03-10T12:38:17.002 INFO:tasks.workunit.client.0.vm00.stdout:1/942: rename da/d21/db3/d59/da6/da4/lfc to da/d24/d28/d67/db0/l139 0 2026-03-10T12:38:17.004 INFO:tasks.workunit.client.0.vm00.stdout:1/943: mkdir da/d21/db3/d59/da6/d8b/d98/d13a 0 2026-03-10T12:38:17.005 INFO:tasks.workunit.client.0.vm00.stdout:1/944: dread - da/d21/d27/d6a/d94/fcf zero size 2026-03-10T12:38:17.006 INFO:tasks.workunit.client.0.vm00.stdout:9/952: read d0/d3d/d59/d4e/dba/d1e/d27/f28 [89680,49980] 0 2026-03-10T12:38:17.008 INFO:tasks.workunit.client.0.vm00.stdout:9/953: creat d0/d5/f154 x:0 0 0 2026-03-10T12:38:17.009 INFO:tasks.workunit.client.0.vm00.stdout:1/945: mkdir da/d21/db3/d59/da6/d8b/df3/d13b 0 2026-03-10T12:38:17.011 INFO:tasks.workunit.client.0.vm00.stdout:9/954: stat d0/d7f/db8/f11b 0 2026-03-10T12:38:17.018 INFO:tasks.workunit.client.0.vm00.stdout:1/946: rename da/d24/d73/f137 to da/d21/db3/d59/da6/f13c 0 2026-03-10T12:38:17.021 INFO:tasks.workunit.client.0.vm00.stdout:1/947: truncate da/f13 1781798 0 2026-03-10T12:38:17.023 INFO:tasks.workunit.client.0.vm00.stdout:1/948: readlink da/d12/l4b 0 2026-03-10T12:38:17.026 INFO:tasks.workunit.client.0.vm00.stdout:1/949: truncate f3 70590 0 2026-03-10T12:38:17.029 INFO:tasks.workunit.client.0.vm00.stdout:6/648: dread d2/d14/d7a/db9/f85 [0,4194304] 0 2026-03-10T12:38:17.030 INFO:tasks.workunit.client.0.vm00.stdout:1/950: dwrite da/d21/d27/d6a/f9e [0,4194304] 0 2026-03-10T12:38:17.034 INFO:tasks.workunit.client.0.vm00.stdout:1/951: mkdir da/d21/db3/d59/da6/da4/dda/d13d 0 2026-03-10T12:38:17.035 INFO:tasks.workunit.client.0.vm00.stdout:1/952: mkdir da/d21/db3/d59/da6/d8b/df3/d13e 0 2026-03-10T12:38:17.036 INFO:tasks.workunit.client.0.vm00.stdout:1/953: fdatasync da/d21/d39/f134 0 2026-03-10T12:38:17.039 
INFO:tasks.workunit.client.0.vm00.stdout:1/954: dwrite da/d24/d5a/d71/d10c/f127 [0,4194304] 0 2026-03-10T12:38:17.042 INFO:tasks.workunit.client.0.vm00.stdout:6/649: mkdir d2/da/dbf/ded 0 2026-03-10T12:38:17.044 INFO:tasks.workunit.client.0.vm00.stdout:1/955: creat da/d12/d26/dd2/f13f x:0 0 0 2026-03-10T12:38:17.045 INFO:tasks.workunit.client.0.vm00.stdout:1/956: mknod da/d21/d27/d6a/d94/db9/c140 0 2026-03-10T12:38:17.048 INFO:tasks.workunit.client.0.vm00.stdout:1/957: mknod da/d21/db3/d59/da6/d8b/df3/c141 0 2026-03-10T12:38:17.056 INFO:tasks.workunit.client.0.vm00.stdout:1/958: getdents da/d24/d28/d67 0 2026-03-10T12:38:17.056 INFO:tasks.workunit.client.0.vm00.stdout:4/949: write df/d32/d76/fc2 [121236,122000] 0 2026-03-10T12:38:17.057 INFO:tasks.workunit.client.0.vm00.stdout:4/950: chown df/f19 177 1 2026-03-10T12:38:17.058 INFO:tasks.workunit.client.0.vm00.stdout:4/951: chown df/d1f/d36/d3a/l123 0 1 2026-03-10T12:38:17.058 INFO:tasks.workunit.client.0.vm00.stdout:4/952: write df/d1f/d22/dcb/f131 [4803404,38466] 0 2026-03-10T12:38:17.064 INFO:tasks.workunit.client.0.vm00.stdout:1/959: symlink da/d21/db3/d59/da6/d8b/l142 0 2026-03-10T12:38:17.064 INFO:tasks.workunit.client.0.vm00.stdout:9/955: dread d0/f4 [4194304,4194304] 0 2026-03-10T12:38:17.074 INFO:tasks.workunit.client.1.vm07.stdout:7/679: mknod d0/d61/db4/d8a/d9d/ce4 0 2026-03-10T12:38:17.075 INFO:tasks.workunit.client.0.vm00.stdout:5/995: dwrite d1f/d26/d2e/d58/d6b/d86/fee [0,4194304] 0 2026-03-10T12:38:17.076 INFO:tasks.workunit.client.0.vm00.stdout:6/650: link d2/d16/d29/d31/d88/d92/daa/cc4 d2/d9f/dce/cee 0 2026-03-10T12:38:17.081 INFO:tasks.workunit.client.1.vm07.stdout:3/745: link dc/d18/de2/df6/ffc dc/dd/d43/d5c/f101 0 2026-03-10T12:38:17.081 INFO:tasks.workunit.client.1.vm07.stdout:6/701: rmdir d1/d4/d6/d16/d1a/d6e/de2 0 2026-03-10T12:38:17.085 INFO:tasks.workunit.client.0.vm00.stdout:8/885: write d0/dd/f9e [55524,123190] 0 2026-03-10T12:38:17.092 INFO:tasks.workunit.client.0.vm00.stdout:2/954: 
dwrite d4/d53/d76/d9b/dad/f5e [0,4194304] 0 2026-03-10T12:38:17.099 INFO:tasks.workunit.client.1.vm07.stdout:3/746: symlink dc/d18/d2d/l102 0 2026-03-10T12:38:17.099 INFO:tasks.workunit.client.1.vm07.stdout:6/702: mkdir d1/d4/d6/d46/d4d/dc7/dd9/de3 0 2026-03-10T12:38:17.099 INFO:tasks.workunit.client.1.vm07.stdout:8/704: getdents d1/d3/db2/dcd/db8 0 2026-03-10T12:38:17.102 INFO:tasks.workunit.client.0.vm00.stdout:3/955: dwrite dd/d18/d13/d99/da5/fcc [0,4194304] 0 2026-03-10T12:38:17.103 INFO:tasks.workunit.client.0.vm00.stdout:3/956: chown dd/d3d/d65/f90 36822 1 2026-03-10T12:38:17.103 INFO:tasks.workunit.client.0.vm00.stdout:3/957: fdatasync f7 0 2026-03-10T12:38:17.112 INFO:tasks.workunit.client.1.vm07.stdout:8/705: truncate d1/d3/db2/dcd/f7c 14029 0 2026-03-10T12:38:17.115 INFO:tasks.workunit.client.0.vm00.stdout:8/886: truncate d0/d93/d2d/f52 1931395 0 2026-03-10T12:38:17.115 INFO:tasks.workunit.client.0.vm00.stdout:8/887: stat d0/d58/fbf 0 2026-03-10T12:38:17.115 INFO:tasks.workunit.client.0.vm00.stdout:8/888: write d0/d93/d60/f98 [1242487,45113] 0 2026-03-10T12:38:17.115 INFO:tasks.workunit.client.0.vm00.stdout:8/889: chown d0/d93/d17 1404 1 2026-03-10T12:38:17.116 INFO:tasks.workunit.client.1.vm07.stdout:3/747: mknod dc/dd/d43/d76/c103 0 2026-03-10T12:38:17.121 INFO:tasks.workunit.client.0.vm00.stdout:9/956: dread d0/d5/f10d [0,4194304] 0 2026-03-10T12:38:17.121 INFO:tasks.workunit.client.0.vm00.stdout:9/957: chown d0/d3d/d59/d4e/dba/d1e/d27/d115/f87 2140948 1 2026-03-10T12:38:17.123 INFO:tasks.workunit.client.0.vm00.stdout:2/955: creat d4/d53/d76/d9b/d107/d128/f130 x:0 0 0 2026-03-10T12:38:17.124 INFO:tasks.workunit.client.0.vm00.stdout:3/958: creat dd/dea/f13d x:0 0 0 2026-03-10T12:38:17.124 INFO:tasks.workunit.client.0.vm00.stdout:2/956: chown d4/d53/d9e/d101/f118 172407235 1 2026-03-10T12:38:17.127 INFO:tasks.workunit.client.0.vm00.stdout:8/890: mknod d0/d93/d2d/d49/c115 0 2026-03-10T12:38:17.130 INFO:tasks.workunit.client.0.vm00.stdout:8/891: write 
d0/d93/d17/ff9 [1758354,130172] 0 2026-03-10T12:38:17.131 INFO:tasks.workunit.client.0.vm00.stdout:8/892: dwrite d0/d93/d2d/f55 [4194304,4194304] 0 2026-03-10T12:38:17.136 INFO:tasks.workunit.client.1.vm07.stdout:8/706: dread d1/d3/d6/f4f [0,4194304] 0 2026-03-10T12:38:17.141 INFO:tasks.workunit.client.0.vm00.stdout:2/957: symlink d4/d53/d76/dba/de8/l131 0 2026-03-10T12:38:17.141 INFO:tasks.workunit.client.0.vm00.stdout:9/958: mkdir d0/d3d/d59/d4e/d104/d12d/d155 0 2026-03-10T12:38:17.142 INFO:tasks.workunit.client.0.vm00.stdout:9/959: dread d0/d3d/d59/d4e/dba/d19/d50/fbd [0,4194304] 0 2026-03-10T12:38:17.143 INFO:tasks.workunit.client.0.vm00.stdout:9/960: chown d0/fdc 64383 1 2026-03-10T12:38:17.158 INFO:tasks.workunit.client.0.vm00.stdout:8/893: mknod d0/d93/d36/d5b/c116 0 2026-03-10T12:38:17.166 INFO:tasks.workunit.client.0.vm00.stdout:2/958: mkdir d4/d6/d2d/d3a/d43/dd5/d132 0 2026-03-10T12:38:17.172 INFO:tasks.workunit.client.0.vm00.stdout:8/894: rmdir d0/dd/dfe 39 2026-03-10T12:38:17.182 INFO:tasks.workunit.client.0.vm00.stdout:3/959: creat dd/d27/f13e x:0 0 0 2026-03-10T12:38:17.185 INFO:tasks.workunit.client.0.vm00.stdout:8/895: dwrite d0/d93/d36/d7d/ff5 [0,4194304] 0 2026-03-10T12:38:17.188 INFO:tasks.workunit.client.0.vm00.stdout:9/961: mknod d0/d7f/db8/c156 0 2026-03-10T12:38:17.203 INFO:tasks.workunit.client.0.vm00.stdout:4/953: write df/d1f/d22/d26/dab/f75 [1517188,106742] 0 2026-03-10T12:38:17.211 INFO:tasks.workunit.client.0.vm00.stdout:2/959: mknod d4/d10f/c133 0 2026-03-10T12:38:17.212 INFO:tasks.workunit.client.0.vm00.stdout:3/960: rmdir dd/d3d/d8a/de0/de4/dac 39 2026-03-10T12:38:17.212 INFO:tasks.workunit.client.0.vm00.stdout:9/962: creat d0/d7f/d88/f157 x:0 0 0 2026-03-10T12:38:17.212 INFO:tasks.workunit.client.0.vm00.stdout:5/996: dwrite d1f/d26/d2e/d58/d10c/d123/d72/ffa [0,4194304] 0 2026-03-10T12:38:17.212 INFO:tasks.workunit.client.0.vm00.stdout:1/960: dwrite da/d12/d91/f108 [0,4194304] 0 2026-03-10T12:38:17.216 
INFO:tasks.workunit.client.0.vm00.stdout:3/961: chown dd/d2a/da2/de1/d100 1 1 2026-03-10T12:38:17.217 INFO:tasks.workunit.client.0.vm00.stdout:3/962: chown dd/d2a/c116 9022334 1 2026-03-10T12:38:17.222 INFO:tasks.workunit.client.0.vm00.stdout:8/896: sync 2026-03-10T12:38:17.223 INFO:tasks.workunit.client.0.vm00.stdout:8/897: fdatasync d0/dd/d38/f3d 0 2026-03-10T12:38:17.223 INFO:tasks.workunit.client.0.vm00.stdout:9/963: fdatasync d0/d7f/db8/dc4/fca 0 2026-03-10T12:38:17.225 INFO:tasks.workunit.client.0.vm00.stdout:2/960: unlink d4/d6/c114 0 2026-03-10T12:38:17.236 INFO:tasks.workunit.client.0.vm00.stdout:1/961: symlink da/d21/d27/d118/l143 0 2026-03-10T12:38:17.236 INFO:tasks.workunit.client.0.vm00.stdout:5/997: unlink d1f/d26/d6f/c10e 0 2026-03-10T12:38:17.259 INFO:tasks.workunit.client.0.vm00.stdout:5/998: truncate d1f/d6a/d94/dc9/fff 489568 0 2026-03-10T12:38:17.263 INFO:tasks.workunit.client.0.vm00.stdout:4/954: rename df/d6c/d90/cc1 to df/d1f/d22/dcb/def/c139 0 2026-03-10T12:38:17.269 INFO:tasks.workunit.client.0.vm00.stdout:3/963: truncate dd/d2a/da2/de1/d38/f63 1558825 0 2026-03-10T12:38:17.272 INFO:tasks.workunit.client.0.vm00.stdout:1/962: link da/d21/db3/d59/d120/dab/f12b da/d21/f144 0 2026-03-10T12:38:17.273 INFO:tasks.workunit.client.0.vm00.stdout:1/963: write da/d12/f64 [760622,53003] 0 2026-03-10T12:38:17.288 INFO:tasks.workunit.client.0.vm00.stdout:8/898: rename d0/dd/d38/d81/df3/l71 to d0/d58/d68/l117 0 2026-03-10T12:38:17.290 INFO:tasks.workunit.client.0.vm00.stdout:3/964: creat dd/d4e/d5d/f13f x:0 0 0 2026-03-10T12:38:17.290 INFO:tasks.workunit.client.0.vm00.stdout:5/999: creat d1f/d26/d2b/d35/d78/d99/d163/f165 x:0 0 0 2026-03-10T12:38:17.293 INFO:tasks.workunit.client.0.vm00.stdout:0/812: write d3/d22/d3a/fd9 [741633,12014] 0 2026-03-10T12:38:17.293 INFO:tasks.workunit.client.0.vm00.stdout:2/961: dread d4/dd/f17 [0,4194304] 0 2026-03-10T12:38:17.295 INFO:tasks.workunit.client.0.vm00.stdout:0/813: mknod d3/db/da4/de7/c108 0 
2026-03-10T12:38:17.299 INFO:tasks.workunit.client.0.vm00.stdout:0/814: unlink d3/d7/d3c/f99 0 2026-03-10T12:38:17.300 INFO:tasks.workunit.client.0.vm00.stdout:0/815: write d3/d7/d4c/d5b/d38/db3/fca [726262,105229] 0 2026-03-10T12:38:17.319 INFO:tasks.workunit.client.0.vm00.stdout:9/964: link d0/d5/dc/f2a d0/d3d/d59/d4e/dba/d1e/dcb/f158 0 2026-03-10T12:38:17.321 INFO:tasks.workunit.client.0.vm00.stdout:1/964: rename da/d24/d5a/d71/d10c to da/d21/d39/d145 0 2026-03-10T12:38:17.328 INFO:tasks.workunit.client.0.vm00.stdout:9/965: symlink d0/d7f/d88/l159 0 2026-03-10T12:38:17.332 INFO:tasks.workunit.client.0.vm00.stdout:9/966: readlink d0/d3d/d59/d4e/dba/d1e/d27/d115/lac 0 2026-03-10T12:38:17.335 INFO:tasks.workunit.client.0.vm00.stdout:9/967: stat d0/d3d/d59/d4e/f7c 0 2026-03-10T12:38:17.335 INFO:tasks.workunit.client.0.vm00.stdout:8/899: fdatasync d0/d93/d17/f1d 0 2026-03-10T12:38:17.335 INFO:tasks.workunit.client.0.vm00.stdout:9/968: chown d0/f21 1593464 1 2026-03-10T12:38:17.341 INFO:tasks.workunit.client.0.vm00.stdout:1/965: mknod da/d21/d27/d6a/d94/c146 0 2026-03-10T12:38:17.348 INFO:tasks.workunit.client.0.vm00.stdout:2/962: rename d4/dd/la0 to d4/d78/l134 0 2026-03-10T12:38:17.362 INFO:tasks.workunit.client.0.vm00.stdout:9/969: unlink d0/d3d/d59/d4e/dba/d19/c8e 0 2026-03-10T12:38:17.371 INFO:tasks.workunit.client.0.vm00.stdout:4/955: truncate df/d1f/d36/d3a/d41/fc7 1849040 0 2026-03-10T12:38:17.377 INFO:tasks.workunit.client.0.vm00.stdout:3/965: rename dd/d27/f56 to dd/d3d/d8a/de0/de4/dac/f140 0 2026-03-10T12:38:17.377 INFO:tasks.workunit.client.0.vm00.stdout:9/970: dread d0/d3d/d59/d4e/dba/d1e/d2b/f5f [0,4194304] 0 2026-03-10T12:38:17.399 INFO:tasks.workunit.client.0.vm00.stdout:3/966: chown dd/d4e/c74 472 1 2026-03-10T12:38:17.399 INFO:tasks.workunit.client.0.vm00.stdout:9/971: symlink d0/d7f/d88/l15a 0 2026-03-10T12:38:17.400 INFO:tasks.workunit.client.0.vm00.stdout:3/967: write dd/f25 [5164305,25804] 0 2026-03-10T12:38:17.406 
INFO:tasks.workunit.client.0.vm00.stdout:2/963: link d4/f10d d4/d53/d68/f135 0 2026-03-10T12:38:17.406 INFO:tasks.workunit.client.0.vm00.stdout:8/900: link d0/d93/d36/l62 d0/d58/d68/l118 0 2026-03-10T12:38:17.408 INFO:tasks.workunit.client.0.vm00.stdout:8/901: fdatasync d0/d93/d17/fb2 0 2026-03-10T12:38:17.412 INFO:tasks.workunit.client.0.vm00.stdout:4/956: symlink df/d1f/l13a 0 2026-03-10T12:38:17.412 INFO:tasks.workunit.client.0.vm00.stdout:8/902: chown d0/d93/d17/da2/ca4 311772 1 2026-03-10T12:38:17.420 INFO:tasks.workunit.client.0.vm00.stdout:9/972: rmdir d0/d7f/db8/dc4 39 2026-03-10T12:38:17.439 INFO:tasks.workunit.client.0.vm00.stdout:9/973: sync 2026-03-10T12:38:17.444 INFO:tasks.workunit.client.0.vm00.stdout:1/966: rename da/d21/db3/d59/da6/da4/dda/dc0/dfe/d10e/l110 to da/d24/l147 0 2026-03-10T12:38:17.454 INFO:tasks.workunit.client.0.vm00.stdout:8/903: mknod d0/dd/d38/d81/df3/c119 0 2026-03-10T12:38:17.460 INFO:tasks.workunit.client.0.vm00.stdout:3/968: write dd/d3d/fe3 [2160379,9536] 0 2026-03-10T12:38:17.474 INFO:tasks.workunit.client.0.vm00.stdout:4/957: dwrite df/d1f/d36/f92 [0,4194304] 0 2026-03-10T12:38:17.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:17 vm00.local ceph-mon[50686]: pgmap v5: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-10T12:38:17.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:17 vm00.local ceph-mon[50686]: mgrmap e24: vm07.kfawlb(active, since 4s) 2026-03-10T12:38:17.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:17 vm00.local ceph-mon[50686]: [10/Mar/2026:12:38:16] ENGINE Bus STARTING 2026-03-10T12:38:17.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:17 vm00.local ceph-mon[50686]: [10/Mar/2026:12:38:16] ENGINE Serving on http://192.168.123.107:8765 2026-03-10T12:38:17.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:17 vm00.local ceph-mon[50686]: [10/Mar/2026:12:38:16] ENGINE Serving on https://192.168.123.107:7150 
2026-03-10T12:38:17.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:17 vm00.local ceph-mon[50686]: [10/Mar/2026:12:38:16] ENGINE Bus STARTED 2026-03-10T12:38:17.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:17 vm00.local ceph-mon[50686]: [10/Mar/2026:12:38:16] ENGINE Client ('192.168.123.107', 55800) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T12:38:17.515 INFO:tasks.workunit.client.0.vm00.stdout:4/958: dread df/d1f/d22/f4c [0,4194304] 0 2026-03-10T12:38:17.518 INFO:tasks.workunit.client.0.vm00.stdout:9/974: dwrite d0/d3d/d59/d4e/dba/d19/f109 [0,4194304] 0 2026-03-10T12:38:17.522 INFO:tasks.workunit.client.0.vm00.stdout:1/967: write da/d21/d39/f89 [2054999,30504] 0 2026-03-10T12:38:17.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:17 vm07.local ceph-mon[58582]: pgmap v5: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail 2026-03-10T12:38:17.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:17 vm07.local ceph-mon[58582]: mgrmap e24: vm07.kfawlb(active, since 4s) 2026-03-10T12:38:17.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:17 vm07.local ceph-mon[58582]: [10/Mar/2026:12:38:16] ENGINE Bus STARTING 2026-03-10T12:38:17.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:17 vm07.local ceph-mon[58582]: [10/Mar/2026:12:38:16] ENGINE Serving on http://192.168.123.107:8765 2026-03-10T12:38:17.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:17 vm07.local ceph-mon[58582]: [10/Mar/2026:12:38:16] ENGINE Serving on https://192.168.123.107:7150 2026-03-10T12:38:17.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:17 vm07.local ceph-mon[58582]: [10/Mar/2026:12:38:16] ENGINE Bus STARTED 2026-03-10T12:38:17.536 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:17 vm07.local ceph-mon[58582]: [10/Mar/2026:12:38:16] ENGINE Client ('192.168.123.107', 55800) lost — peer dropped 
the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T12:38:17.545 INFO:tasks.workunit.client.0.vm00.stdout:3/969: unlink dd/d18/d13/d1d/c41 0 2026-03-10T12:38:17.549 INFO:tasks.workunit.client.0.vm00.stdout:2/964: link d4/d6/ff2 d4/f136 0 2026-03-10T12:38:17.549 INFO:tasks.workunit.client.0.vm00.stdout:9/975: symlink d0/d3d/d59/d4e/dba/d19/l15b 0 2026-03-10T12:38:17.551 INFO:tasks.workunit.client.0.vm00.stdout:3/970: dread dd/d64/f98 [0,4194304] 0 2026-03-10T12:38:17.559 INFO:tasks.workunit.client.0.vm00.stdout:2/965: unlink d4/d6/d121/l54 0 2026-03-10T12:38:17.563 INFO:tasks.workunit.client.0.vm00.stdout:8/904: getdents d0/dd/d38 0 2026-03-10T12:38:17.564 INFO:tasks.workunit.client.0.vm00.stdout:8/905: write d0/d93/d60/ff8 [429757,54543] 0 2026-03-10T12:38:17.568 INFO:tasks.workunit.client.0.vm00.stdout:9/976: dread d0/d7f/db8/dc4/f6c [0,4194304] 0 2026-03-10T12:38:17.575 INFO:tasks.workunit.client.0.vm00.stdout:2/966: symlink d4/d6/d2d/d3a/d43/l137 0 2026-03-10T12:38:17.578 INFO:tasks.workunit.client.0.vm00.stdout:2/967: dread d4/d53/d9e/d101/f118 [0,4194304] 0 2026-03-10T12:38:17.579 INFO:tasks.workunit.client.0.vm00.stdout:3/971: link dd/d3d/d8a/de0/d55/c85 dd/d3d/d8a/de0/d55/dfd/d125/c141 0 2026-03-10T12:38:17.583 INFO:tasks.workunit.client.1.vm07.stdout:2/639: dwrite d0/d42/d1f/d90/fb2 [0,4194304] 0 2026-03-10T12:38:17.589 INFO:tasks.workunit.client.0.vm00.stdout:9/977: mkdir d0/d3d/d43/d15c 0 2026-03-10T12:38:17.590 INFO:tasks.workunit.client.1.vm07.stdout:2/640: creat d0/d42/d4e/d77/d70/fd9 x:0 0 0 2026-03-10T12:38:17.593 INFO:tasks.workunit.client.0.vm00.stdout:4/959: write df/d1f/d22/d26/dab/d73/f7a [1335153,26684] 0 2026-03-10T12:38:17.595 INFO:tasks.workunit.client.0.vm00.stdout:1/968: write da/d21/db3/f7a [246732,124462] 0 2026-03-10T12:38:17.600 INFO:tasks.workunit.client.0.vm00.stdout:9/978: truncate d0/d3d/d59/d4e/dba/d1e/d85/d98/fab 1548873 0 2026-03-10T12:38:17.600 
INFO:tasks.workunit.client.0.vm00.stdout:2/968: symlink d4/l138 0 2026-03-10T12:38:17.601 INFO:tasks.workunit.client.0.vm00.stdout:3/972: mkdir dd/d2a/da2/de1/d142 0 2026-03-10T12:38:17.601 INFO:tasks.workunit.client.0.vm00.stdout:2/969: chown d4/d6/d2d/d31 423604 1 2026-03-10T12:38:17.602 INFO:tasks.workunit.client.0.vm00.stdout:8/906: link d0/d93/d36/f39 d0/dd/d38/f11a 0 2026-03-10T12:38:17.603 INFO:tasks.workunit.client.0.vm00.stdout:2/970: chown d4/d6/d2d/d3a/c11f 15614 1 2026-03-10T12:38:17.607 INFO:tasks.workunit.client.0.vm00.stdout:3/973: dwrite dd/d2a/f78 [0,4194304] 0 2026-03-10T12:38:17.612 INFO:tasks.workunit.client.0.vm00.stdout:3/974: dwrite dd/d2a/da2/f12a [0,4194304] 0 2026-03-10T12:38:17.615 INFO:tasks.workunit.client.1.vm07.stdout:2/641: dread d0/d42/d26/f52 [0,4194304] 0 2026-03-10T12:38:17.616 INFO:tasks.workunit.client.1.vm07.stdout:2/642: write d0/d42/d1f/d20/f3f [8279905,91177] 0 2026-03-10T12:38:17.617 INFO:tasks.workunit.client.0.vm00.stdout:4/960: creat df/d63/d94/f13b x:0 0 0 2026-03-10T12:38:17.618 INFO:tasks.workunit.client.0.vm00.stdout:1/969: dread da/d12/d91/fb8 [0,4194304] 0 2026-03-10T12:38:17.623 INFO:tasks.workunit.client.1.vm07.stdout:2/643: stat d0/c37 0 2026-03-10T12:38:17.625 INFO:tasks.workunit.client.0.vm00.stdout:4/961: creat df/d1f/d36/d3a/d41/d111/f13c x:0 0 0 2026-03-10T12:38:17.625 INFO:tasks.workunit.client.0.vm00.stdout:8/907: creat d0/d58/d68/d10c/f11b x:0 0 0 2026-03-10T12:38:17.626 INFO:tasks.workunit.client.0.vm00.stdout:3/975: write dd/d27/d2c/f7d [231682,58389] 0 2026-03-10T12:38:17.632 INFO:tasks.workunit.client.0.vm00.stdout:4/962: dwrite df/d1f/d36/dc6/f11e [0,4194304] 0 2026-03-10T12:38:17.633 INFO:tasks.workunit.client.0.vm00.stdout:4/963: stat df/d32/d64 0 2026-03-10T12:38:17.636 INFO:tasks.workunit.client.1.vm07.stdout:2/644: link d0/l79 d0/d29/d64/d74/d75/db7/lda 0 2026-03-10T12:38:17.645 INFO:tasks.workunit.client.0.vm00.stdout:8/908: dread - d0/d46/fc6 zero size 2026-03-10T12:38:17.645 
INFO:tasks.workunit.client.0.vm00.stdout:8/909: read - d0/d46/d89/ff4 zero size 2026-03-10T12:38:17.648 INFO:tasks.workunit.client.0.vm00.stdout:8/910: dwrite d0/dd/f9a [4194304,4194304] 0 2026-03-10T12:38:17.652 INFO:tasks.workunit.client.0.vm00.stdout:1/970: creat da/d21/db3/d59/d120/d80/dd8/f148 x:0 0 0 2026-03-10T12:38:17.659 INFO:tasks.workunit.client.0.vm00.stdout:4/964: truncate df/d1f/d22/d26/d65/d91/d101/f7c 342515 0 2026-03-10T12:38:17.664 INFO:tasks.workunit.client.0.vm00.stdout:8/911: chown d0/d93/d36/d7d/cda 1 1 2026-03-10T12:38:17.665 INFO:tasks.workunit.client.0.vm00.stdout:8/912: write d0/d93/d36/d7d/f106 [520312,24717] 0 2026-03-10T12:38:17.665 INFO:tasks.workunit.client.0.vm00.stdout:8/913: chown d0 174902883 1 2026-03-10T12:38:17.666 INFO:tasks.workunit.client.0.vm00.stdout:8/914: chown d0/d93/d17/db1/dde/lea 181044729 1 2026-03-10T12:38:17.669 INFO:tasks.workunit.client.0.vm00.stdout:2/971: creat d4/d6/d121/d6d/f139 x:0 0 0 2026-03-10T12:38:17.669 INFO:tasks.workunit.client.1.vm07.stdout:2/645: dread d0/d42/d26/f3e [0,4194304] 0 2026-03-10T12:38:17.670 INFO:tasks.workunit.client.0.vm00.stdout:4/965: mkdir df/d1f/d22/dcb/d13d 0 2026-03-10T12:38:17.671 INFO:tasks.workunit.client.0.vm00.stdout:4/966: write df/d1f/d22/f7d [8112671,115657] 0 2026-03-10T12:38:17.673 INFO:tasks.workunit.client.0.vm00.stdout:9/979: link d0/d3d/d59/d4e/dba/d1e/d27/d115/l91 d0/d7f/db8/l15d 0 2026-03-10T12:38:17.674 INFO:tasks.workunit.client.0.vm00.stdout:8/915: symlink d0/d93/d2d/d49/l11c 0 2026-03-10T12:38:17.676 INFO:tasks.workunit.client.0.vm00.stdout:2/972: mknod d4/d6/d93/c13a 0 2026-03-10T12:38:17.680 INFO:tasks.workunit.client.0.vm00.stdout:1/971: symlink da/d21/db3/d59/d120/d72/l149 0 2026-03-10T12:38:17.681 INFO:tasks.workunit.client.0.vm00.stdout:2/973: dwrite d4/d53/d9e/d10a/f126 [0,4194304] 0 2026-03-10T12:38:17.688 INFO:tasks.workunit.client.0.vm00.stdout:8/916: readlink d0/d46/ldf 0 2026-03-10T12:38:17.688 INFO:tasks.workunit.client.0.vm00.stdout:4/967: 
creat df/d1f/d36/d3a/d41/df7/d112/f13e x:0 0 0 2026-03-10T12:38:17.694 INFO:tasks.workunit.client.0.vm00.stdout:9/980: chown d0/d3d/d59/d4e/dba/d1e/d27/d115/l91 5 1 2026-03-10T12:38:17.696 INFO:tasks.workunit.client.0.vm00.stdout:4/968: symlink df/d1f/d36/d3a/d41/df7/l13f 0 2026-03-10T12:38:17.700 INFO:tasks.workunit.client.0.vm00.stdout:1/972: creat da/d21/d39/d129/f14a x:0 0 0 2026-03-10T12:38:17.702 INFO:tasks.workunit.client.0.vm00.stdout:2/974: truncate d4/d6/f89 741169 0 2026-03-10T12:38:17.705 INFO:tasks.workunit.client.0.vm00.stdout:9/981: mkdir d0/d7f/db8/dc4/db0/dcc/d15e 0 2026-03-10T12:38:17.712 INFO:tasks.workunit.client.0.vm00.stdout:7/700: dwrite da/d26/f27 [4194304,4194304] 0 2026-03-10T12:38:17.721 INFO:tasks.workunit.client.0.vm00.stdout:7/701: rename da/d26/c34 to da/d26/d37/d56/ddf/cf9 0 2026-03-10T12:38:17.725 INFO:tasks.workunit.client.1.vm07.stdout:5/751: write d0/d22/d18/d19/d2e/d67/f94 [1138636,68418] 0 2026-03-10T12:38:17.729 INFO:tasks.workunit.client.0.vm00.stdout:3/976: write dd/d18/d13/d1d/dc6/d106/f104 [238167,52157] 0 2026-03-10T12:38:17.729 INFO:tasks.workunit.client.0.vm00.stdout:2/975: symlink d4/dd/d102/l13b 0 2026-03-10T12:38:17.730 INFO:tasks.workunit.client.1.vm07.stdout:9/796: creat d5/d69/d93/f10b x:0 0 0 2026-03-10T12:38:17.731 INFO:tasks.workunit.client.1.vm07.stdout:0/834: dwrite d0/f1c [4194304,4194304] 0 2026-03-10T12:38:17.731 INFO:tasks.workunit.client.1.vm07.stdout:9/797: chown d5/d13/d2c/de6/d64/fb7 103 1 2026-03-10T12:38:17.736 INFO:tasks.workunit.client.0.vm00.stdout:2/976: dwrite d4/d6/f22 [0,4194304] 0 2026-03-10T12:38:17.743 INFO:tasks.workunit.client.0.vm00.stdout:3/977: dwrite dd/d3d/d8a/de0/de4/dac/f122 [0,4194304] 0 2026-03-10T12:38:17.743 INFO:tasks.workunit.client.0.vm00.stdout:8/917: getdents d0/d58 0 2026-03-10T12:38:17.743 INFO:tasks.workunit.client.1.vm07.stdout:5/752: creat d0/d22/d18/d19/d21/d54/dcb/db8/dec/f108 x:0 0 0 2026-03-10T12:38:17.743 INFO:tasks.workunit.client.1.vm07.stdout:0/835: mkdir 
d0/d14/d5f/d76/d2f/d31/d4f/d9d/d114 0 2026-03-10T12:38:17.750 INFO:tasks.workunit.client.0.vm00.stdout:7/702: sync 2026-03-10T12:38:17.750 INFO:tasks.workunit.client.1.vm07.stdout:9/798: creat d5/d16/f10c x:0 0 0 2026-03-10T12:38:17.752 INFO:tasks.workunit.client.1.vm07.stdout:9/799: fsync d5/d16/d23/fb2 0 2026-03-10T12:38:17.756 INFO:tasks.workunit.client.0.vm00.stdout:2/977: creat d4/dd/da7/f13c x:0 0 0 2026-03-10T12:38:17.756 INFO:tasks.workunit.client.1.vm07.stdout:9/800: chown d5/d13/d2c/de6/d64/lbb 6 1 2026-03-10T12:38:17.757 INFO:tasks.workunit.client.1.vm07.stdout:9/801: readlink d5/d13/d22/l3f 0 2026-03-10T12:38:17.762 INFO:tasks.workunit.client.1.vm07.stdout:9/802: symlink d5/d16/d18/l10d 0 2026-03-10T12:38:17.765 INFO:tasks.workunit.client.1.vm07.stdout:9/803: creat d5/d13/d2c/de6/dce/f10e x:0 0 0 2026-03-10T12:38:17.765 INFO:tasks.workunit.client.1.vm07.stdout:9/804: chown d5/d13/d57/d4f/d6a 380 1 2026-03-10T12:38:17.774 INFO:tasks.workunit.client.0.vm00.stdout:8/918: link d0/dd/d38/d81/df3/d9b/c107 d0/d93/d36/d7d/c11d 0 2026-03-10T12:38:17.788 INFO:tasks.workunit.client.0.vm00.stdout:2/978: dread d4/d53/f7d [0,4194304] 0 2026-03-10T12:38:17.791 INFO:tasks.workunit.client.0.vm00.stdout:4/969: fsync df/d1f/d36/d3a/fdf 0 2026-03-10T12:38:17.791 INFO:tasks.workunit.client.0.vm00.stdout:9/982: write d0/d3d/d59/d4e/dba/f8d [369181,129781] 0 2026-03-10T12:38:17.806 INFO:tasks.workunit.client.0.vm00.stdout:4/970: rename df/d1f/d36 to df/d63/d94/d140 0 2026-03-10T12:38:17.809 INFO:tasks.workunit.client.0.vm00.stdout:9/983: rmdir d0/d3d/d59/d4e/dba/d1e/d85/d98 39 2026-03-10T12:38:17.822 INFO:tasks.workunit.client.0.vm00.stdout:9/984: sync 2026-03-10T12:38:17.825 INFO:tasks.workunit.client.0.vm00.stdout:3/978: dwrite dd/d3d/f50 [0,4194304] 0 2026-03-10T12:38:17.827 INFO:tasks.workunit.client.0.vm00.stdout:1/973: dwrite da/d24/f53 [0,4194304] 0 2026-03-10T12:38:17.837 INFO:tasks.workunit.client.0.vm00.stdout:2/979: link d4/d6/de7/fea d4/d6/de7/f13d 0 
2026-03-10T12:38:17.837 INFO:tasks.workunit.client.0.vm00.stdout:9/985: dwrite d0/d3d/d59/d4e/dba/d19/d50/fbd [0,4194304] 0 2026-03-10T12:38:17.839 INFO:tasks.workunit.client.0.vm00.stdout:8/919: dwrite d0/d46/f92 [0,4194304] 0 2026-03-10T12:38:17.845 INFO:tasks.workunit.client.1.vm07.stdout:7/680: write d0/fc [5621341,8443] 0 2026-03-10T12:38:17.848 INFO:tasks.workunit.client.1.vm07.stdout:6/703: write d1/d4/d6/d4e/d64/f6f [584696,60277] 0 2026-03-10T12:38:17.849 INFO:tasks.workunit.client.0.vm00.stdout:6/651: write d2/d16/d29/f64 [137325,45507] 0 2026-03-10T12:38:17.852 INFO:tasks.workunit.client.1.vm07.stdout:7/681: dwrite d0/d61/fdb [0,4194304] 0 2026-03-10T12:38:17.857 INFO:tasks.workunit.client.0.vm00.stdout:3/979: write dd/d2a/da2/db4/f107 [811161,47747] 0 2026-03-10T12:38:17.860 INFO:tasks.workunit.client.1.vm07.stdout:3/748: dwrite dc/dd/d28/d7a/f7f [0,4194304] 0 2026-03-10T12:38:17.864 INFO:tasks.workunit.client.1.vm07.stdout:6/704: creat d1/d4/d6/d16/d49/fe4 x:0 0 0 2026-03-10T12:38:17.866 INFO:tasks.workunit.client.1.vm07.stdout:4/819: mknod d0/d4/d10/d9a/c11e 0 2026-03-10T12:38:17.866 INFO:tasks.workunit.client.1.vm07.stdout:4/820: chown d0/d4/df2/df6/d46/d76/fa2 14406 1 2026-03-10T12:38:17.872 INFO:tasks.workunit.client.0.vm00.stdout:1/974: creat da/d21/db3/d59/d120/f14b x:0 0 0 2026-03-10T12:38:17.876 INFO:tasks.workunit.client.1.vm07.stdout:1/729: rmdir d9/df/dc2 39 2026-03-10T12:38:17.879 INFO:tasks.workunit.client.0.vm00.stdout:0/816: write d3/d7/d4c/d5b/d38/fa2 [1393360,123569] 0 2026-03-10T12:38:17.883 INFO:tasks.workunit.client.0.vm00.stdout:0/817: fdatasync d3/d7/d4c/d5b/d38/db3/de2/fc6 0 2026-03-10T12:38:17.897 INFO:tasks.workunit.client.0.vm00.stdout:8/920: dread d0/d93/d17/d48/f4c [0,4194304] 0 2026-03-10T12:38:17.899 INFO:tasks.workunit.client.0.vm00.stdout:8/921: truncate d0/dd/d38/f111 244525 0 2026-03-10T12:38:17.910 INFO:tasks.workunit.client.0.vm00.stdout:4/971: write df/d1f/d22/d26/d65/d91/fad [923082,44487] 0 2026-03-10T12:38:17.913 
INFO:tasks.workunit.client.0.vm00.stdout:0/818: symlink d3/d7/d4c/dcc/ded/l109 0 2026-03-10T12:38:17.920 INFO:tasks.workunit.client.0.vm00.stdout:0/819: fdatasync d3/d7/d4c/d5b/d38/d44/d5a/f7e 0 2026-03-10T12:38:17.927 INFO:tasks.workunit.client.0.vm00.stdout:9/986: creat d0/d3d/d59/d4e/dba/d19/d50/f15f x:0 0 0 2026-03-10T12:38:17.931 INFO:tasks.workunit.client.1.vm07.stdout:3/749: dwrite dc/dd/d43/d5c/fa9 [0,4194304] 0 2026-03-10T12:38:17.940 INFO:tasks.workunit.client.1.vm07.stdout:6/705: dread - d1/d4/d6/d16/d1a/d6e/fbe zero size 2026-03-10T12:38:17.940 INFO:tasks.workunit.client.0.vm00.stdout:3/980: mkdir dd/d64/d93/d143 0 2026-03-10T12:38:17.941 INFO:tasks.workunit.client.0.vm00.stdout:1/975: mknod da/d21/db3/d59/d120/d80/dd8/c14c 0 2026-03-10T12:38:17.959 INFO:tasks.workunit.client.1.vm07.stdout:1/730: creat d9/df/d29/d2b/d92/d9d/fee x:0 0 0 2026-03-10T12:38:17.963 INFO:tasks.workunit.client.0.vm00.stdout:9/987: truncate d0/d3d/d43/f54 5291121 0 2026-03-10T12:38:17.969 INFO:tasks.workunit.client.0.vm00.stdout:8/922: fsync d0/dd/f2b 0 2026-03-10T12:38:17.971 INFO:tasks.workunit.client.0.vm00.stdout:4/972: mknod df/d1f/c141 0 2026-03-10T12:38:17.974 INFO:tasks.workunit.client.0.vm00.stdout:0/820: unlink d3/d7/d4c/d5b/f56 0 2026-03-10T12:38:17.974 INFO:tasks.workunit.client.0.vm00.stdout:2/980: dwrite d4/d53/d68/fb1 [0,4194304] 0 2026-03-10T12:38:17.981 INFO:tasks.workunit.client.1.vm07.stdout:6/706: mknod d1/d4/d6/d16/d1a/d6e/ce5 0 2026-03-10T12:38:17.983 INFO:tasks.workunit.client.0.vm00.stdout:9/988: mkdir d0/d5/dc/d160 0 2026-03-10T12:38:17.983 INFO:tasks.workunit.client.1.vm07.stdout:1/731: chown d9/d2d/d4f/d75/d77/da7/lad 54604893 1 2026-03-10T12:38:17.984 INFO:tasks.workunit.client.1.vm07.stdout:6/707: dread - d1/d4/fc5 zero size 2026-03-10T12:38:17.987 INFO:tasks.workunit.client.0.vm00.stdout:3/981: fsync dd/d2a/da2/db4/fe8 0 2026-03-10T12:38:17.997 INFO:tasks.workunit.client.0.vm00.stdout:9/989: sync 2026-03-10T12:38:17.998 
INFO:tasks.workunit.client.0.vm00.stdout:3/982: mkdir dd/d27/d2c/def/d118/d144 0 2026-03-10T12:38:17.998 INFO:tasks.workunit.client.0.vm00.stdout:3/983: readlink dd/d27/d2c/def/lf5 0 2026-03-10T12:38:18.003 INFO:tasks.workunit.client.0.vm00.stdout:0/821: creat d3/d40/f10a x:0 0 0 2026-03-10T12:38:18.012 INFO:tasks.workunit.client.0.vm00.stdout:0/822: readlink d3/db/d24/ldd 0 2026-03-10T12:38:18.017 INFO:tasks.workunit.client.0.vm00.stdout:9/990: unlink d0/d3d/d59/d4e/dba/d1e/d27/d115/f100 0 2026-03-10T12:38:18.024 INFO:tasks.workunit.client.1.vm07.stdout:1/732: dread d9/d2d/d4f/d75/f83 [4194304,4194304] 0 2026-03-10T12:38:18.028 INFO:tasks.workunit.client.0.vm00.stdout:4/973: rename df/d63/d94/d140/d3a/d41/l27 to df/d1f/d22/d26/d65/d91/d101/l142 0 2026-03-10T12:38:18.029 INFO:tasks.workunit.client.1.vm07.stdout:3/750: dread dc/dd/d28/f46 [0,4194304] 0 2026-03-10T12:38:18.033 INFO:tasks.workunit.client.1.vm07.stdout:3/751: write dc/d18/de2/ff4 [860904,108749] 0 2026-03-10T12:38:18.035 INFO:tasks.workunit.client.0.vm00.stdout:3/984: dread dd/d18/d13/f6b [0,4194304] 0 2026-03-10T12:38:18.039 INFO:tasks.workunit.client.1.vm07.stdout:8/707: rename d1/d3/c55 to d1/d3/cea 0 2026-03-10T12:38:18.043 INFO:tasks.workunit.client.0.vm00.stdout:0/823: link d3/le d3/d7/d4c/d5b/d38/l10b 0 2026-03-10T12:38:18.043 INFO:tasks.workunit.client.0.vm00.stdout:9/991: symlink d0/d3d/d43/d15c/l161 0 2026-03-10T12:38:18.048 INFO:tasks.workunit.client.0.vm00.stdout:4/974: mkdir df/d63/d94/d140/d143 0 2026-03-10T12:38:18.052 INFO:tasks.workunit.client.0.vm00.stdout:4/975: chown df/d1f/d22/d26/f11a 3 1 2026-03-10T12:38:18.058 INFO:tasks.workunit.client.1.vm07.stdout:3/752: dread dc/dd/d1f/d45/f5e [0,4194304] 0 2026-03-10T12:38:18.058 INFO:tasks.workunit.client.1.vm07.stdout:3/753: chown dc/dd/d43/d76/d95/db8 6906 1 2026-03-10T12:38:18.067 INFO:tasks.workunit.client.1.vm07.stdout:3/754: getdents dc/d18/d24/d72 0 2026-03-10T12:38:18.069 INFO:tasks.workunit.client.1.vm07.stdout:2/646: write 
d0/d42/f2c [182714,40599] 0 2026-03-10T12:38:18.069 INFO:tasks.workunit.client.0.vm00.stdout:1/976: dwrite da/d21/db3/d59/d120/fdc [0,4194304] 0 2026-03-10T12:38:18.078 INFO:tasks.workunit.client.0.vm00.stdout:1/977: truncate da/d21/f144 478946 0 2026-03-10T12:38:18.079 INFO:tasks.workunit.client.1.vm07.stdout:5/753: dwrite d0/d22/d18/d19/d21/fd4 [0,4194304] 0 2026-03-10T12:38:18.080 INFO:tasks.workunit.client.1.vm07.stdout:5/754: stat d0/d22/dbc 0 2026-03-10T12:38:18.084 INFO:tasks.workunit.client.1.vm07.stdout:5/755: mkdir d0/d22/d109 0 2026-03-10T12:38:18.093 INFO:tasks.workunit.client.1.vm07.stdout:0/836: write d0/d14/d5f/fb3 [2074613,18103] 0 2026-03-10T12:38:18.093 INFO:tasks.workunit.client.1.vm07.stdout:5/756: dread - d0/d22/d18/d19/d72/ff1 zero size 2026-03-10T12:38:18.096 INFO:tasks.workunit.client.0.vm00.stdout:1/978: truncate da/d24/d28/fb1 4023763 0 2026-03-10T12:38:18.098 INFO:tasks.workunit.client.1.vm07.stdout:9/805: write d5/d16/d23/d26/f86 [316330,38752] 0 2026-03-10T12:38:18.099 INFO:tasks.workunit.client.1.vm07.stdout:0/837: chown d0/d14/d5f/d76/d2f/d31/d4f/d9d/fda 79687998 1 2026-03-10T12:38:18.100 INFO:tasks.workunit.client.0.vm00.stdout:1/979: rename da/d21/db3/d59/da6/d8b/d98/d13a to da/d21/db3/d59/d120/d11f/d14d 0 2026-03-10T12:38:18.101 INFO:tasks.workunit.client.1.vm07.stdout:9/806: creat d5/d13/d2c/de6/d64/f10f x:0 0 0 2026-03-10T12:38:18.102 INFO:tasks.workunit.client.0.vm00.stdout:1/980: rmdir da/d21/db3/d59/d120/d72/d7e 39 2026-03-10T12:38:18.105 INFO:tasks.workunit.client.0.vm00.stdout:7/703: write da/d25/d2e/d4c/f6e [1773505,16147] 0 2026-03-10T12:38:18.107 INFO:tasks.workunit.client.1.vm07.stdout:0/838: mkdir d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/d115 0 2026-03-10T12:38:18.108 INFO:tasks.workunit.client.1.vm07.stdout:0/839: readlink d0/d14/d5f/d76/da1/l7f 0 2026-03-10T12:38:18.111 INFO:tasks.workunit.client.1.vm07.stdout:9/807: link d5/d13/d57/fc9 d5/d13/d2c/de6/d64/f110 0 2026-03-10T12:38:18.114 
INFO:tasks.workunit.client.0.vm00.stdout:8/923: dwrite d0/d93/d36/d5b/f95 [0,4194304] 0 2026-03-10T12:38:18.114 INFO:tasks.workunit.client.0.vm00.stdout:7/704: symlink da/d26/d37/d56/lfa 0 2026-03-10T12:38:18.115 INFO:tasks.workunit.client.0.vm00.stdout:8/924: fdatasync d0/d93/d36/d7d/f106 0 2026-03-10T12:38:18.119 INFO:tasks.workunit.client.0.vm00.stdout:7/705: creat da/d26/d37/d56/ddf/ffb x:0 0 0 2026-03-10T12:38:18.123 INFO:tasks.workunit.client.0.vm00.stdout:3/985: write dd/d18/f83 [1408416,57768] 0 2026-03-10T12:38:18.123 INFO:tasks.workunit.client.0.vm00.stdout:3/986: write dd/d18/d13/d99/da5/fcc [3353579,40902] 0 2026-03-10T12:38:18.123 INFO:tasks.workunit.client.0.vm00.stdout:8/925: read d0/d93/d17/fb2 [1412147,59250] 0 2026-03-10T12:38:18.129 INFO:tasks.workunit.client.1.vm07.stdout:0/840: dread d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/fef [0,4194304] 0 2026-03-10T12:38:18.131 INFO:tasks.workunit.client.0.vm00.stdout:9/992: dwrite d0/d3d/d59/d4e/dba/d1e/d27/fae [0,4194304] 0 2026-03-10T12:38:18.134 INFO:tasks.workunit.client.0.vm00.stdout:8/926: creat d0/d58/d68/f11e x:0 0 0 2026-03-10T12:38:18.135 INFO:tasks.workunit.client.0.vm00.stdout:3/987: rename dd/d2a/da2/de1/d45/f47 to dd/d2a/da2/de1/d101/f145 0 2026-03-10T12:38:18.142 INFO:tasks.workunit.client.0.vm00.stdout:8/927: creat d0/f11f x:0 0 0 2026-03-10T12:38:18.143 INFO:tasks.workunit.client.0.vm00.stdout:3/988: dwrite dd/d2a/da2/f12a [0,4194304] 0 2026-03-10T12:38:18.150 INFO:tasks.workunit.client.0.vm00.stdout:9/993: dread d0/d7f/db8/dc4/db0/fbf [0,4194304] 0 2026-03-10T12:38:18.158 INFO:tasks.workunit.client.0.vm00.stdout:3/989: mknod dd/d139/d13a/c146 0 2026-03-10T12:38:18.161 INFO:tasks.workunit.client.0.vm00.stdout:4/976: write df/d32/d64/fd6 [689871,77733] 0 2026-03-10T12:38:18.163 INFO:tasks.workunit.client.0.vm00.stdout:4/977: fdatasync df/f3d 0 2026-03-10T12:38:18.166 INFO:tasks.workunit.client.0.vm00.stdout:3/990: stat dd/d2a/da2/de1/d101/l105 0 2026-03-10T12:38:18.166 
INFO:tasks.workunit.client.0.vm00.stdout:9/994: link d0/d7f/db8/dc4/f111 d0/d3d/d59/d4e/d104/d12d/d155/f162 0 2026-03-10T12:38:18.167 INFO:tasks.workunit.client.0.vm00.stdout:6/652: write d2/d16/f78 [974265,44058] 0 2026-03-10T12:38:18.173 INFO:tasks.workunit.client.0.vm00.stdout:3/991: dread - dd/d18/d13/d1d/ff6 zero size 2026-03-10T12:38:18.173 INFO:tasks.workunit.client.0.vm00.stdout:4/978: fsync df/fac 0 2026-03-10T12:38:18.173 INFO:tasks.workunit.client.0.vm00.stdout:9/995: mknod d0/d3d/d43/d15c/c163 0 2026-03-10T12:38:18.174 INFO:tasks.workunit.client.0.vm00.stdout:6/653: symlink d2/d16/d29/d31/d88/d92/lef 0 2026-03-10T12:38:18.176 INFO:tasks.workunit.client.0.vm00.stdout:3/992: dread - dd/d18/d13/d99/da5/fdf zero size 2026-03-10T12:38:18.177 INFO:tasks.workunit.client.0.vm00.stdout:7/706: dread da/d25/d2e/d4c/fe7 [0,4194304] 0 2026-03-10T12:38:18.183 INFO:tasks.workunit.client.0.vm00.stdout:6/654: mknod d2/d16/d29/d31/d88/cf0 0 2026-03-10T12:38:18.184 INFO:tasks.workunit.client.0.vm00.stdout:4/979: read - df/d32/d76/fe7 zero size 2026-03-10T12:38:18.186 INFO:tasks.workunit.client.0.vm00.stdout:6/655: creat d2/d16/d29/d31/d88/ff1 x:0 0 0 2026-03-10T12:38:18.186 INFO:tasks.workunit.client.1.vm07.stdout:2/647: dread d0/f4a [0,4194304] 0 2026-03-10T12:38:18.187 INFO:tasks.workunit.client.1.vm07.stdout:7/682: truncate d0/d57/dd6/d80/fbc 3090089 0 2026-03-10T12:38:18.191 INFO:tasks.workunit.client.0.vm00.stdout:3/993: mkdir dd/d3d/d8a/de0/d55/dfd/d125/d2b/d11c/d12d/d147 0 2026-03-10T12:38:18.192 INFO:tasks.workunit.client.1.vm07.stdout:4/821: write d0/d4/df2/df6/d46/d76/fae [920619,100292] 0 2026-03-10T12:38:18.197 INFO:tasks.workunit.client.1.vm07.stdout:7/683: dwrite d0/d61/db4/fc4 [0,4194304] 0 2026-03-10T12:38:18.199 INFO:tasks.workunit.client.0.vm00.stdout:0/824: write d3/db/d77/faa [3991293,125951] 0 2026-03-10T12:38:18.201 INFO:tasks.workunit.client.1.vm07.stdout:6/708: dwrite d1/d4/f11 [0,4194304] 0 2026-03-10T12:38:18.203 
INFO:tasks.workunit.client.0.vm00.stdout:6/656: read d2/d16/d29/d31/d88/d92/fba [295494,21795] 0 2026-03-10T12:38:18.213 INFO:tasks.workunit.client.0.vm00.stdout:0/825: chown d3/d7/f10 1 1 2026-03-10T12:38:18.213 INFO:tasks.workunit.client.0.vm00.stdout:0/826: mknod d3/d22/d3a/deb/c10c 0 2026-03-10T12:38:18.213 INFO:tasks.workunit.client.0.vm00.stdout:0/827: write d3/d7/d4c/d5b/d38/db3/fca [930644,59288] 0 2026-03-10T12:38:18.213 INFO:tasks.workunit.client.1.vm07.stdout:3/755: write dc/dd/d28/d3b/f70 [3996839,111745] 0 2026-03-10T12:38:18.213 INFO:tasks.workunit.client.1.vm07.stdout:2/648: truncate d0/d80/d93/fb6 298449 0 2026-03-10T12:38:18.213 INFO:tasks.workunit.client.1.vm07.stdout:5/757: write d0/d22/d18/d19/d2e/d67/dd9/fef [2981474,106809] 0 2026-03-10T12:38:18.213 INFO:tasks.workunit.client.1.vm07.stdout:4/822: creat d0/d4/df2/f11f x:0 0 0 2026-03-10T12:38:18.213 INFO:tasks.workunit.client.1.vm07.stdout:0/841: write d0/d14/d5f/d76/d2f/d31/d79/fdc [1206913,22952] 0 2026-03-10T12:38:18.213 INFO:tasks.workunit.client.1.vm07.stdout:8/708: dwrite d1/d3/f2d [4194304,4194304] 0 2026-03-10T12:38:18.214 INFO:tasks.workunit.client.1.vm07.stdout:5/758: write d0/d22/d18/d19/d2e/d67/fe2 [290080,53739] 0 2026-03-10T12:38:18.214 INFO:tasks.workunit.client.0.vm00.stdout:3/994: sync 2026-03-10T12:38:18.216 INFO:tasks.workunit.client.0.vm00.stdout:6/657: dwrite d2/d16/f23 [0,4194304] 0 2026-03-10T12:38:18.223 INFO:tasks.workunit.client.0.vm00.stdout:9/996: dread d0/d3d/d59/d4e/dba/d1e/dcb/f158 [0,4194304] 0 2026-03-10T12:38:18.225 INFO:tasks.workunit.client.0.vm00.stdout:0/828: chown d3/d7/d4c/d5b/dc5 10 1 2026-03-10T12:38:18.233 INFO:tasks.workunit.client.0.vm00.stdout:9/997: mknod d0/d3d/d59/d4e/dba/d1e/d27/c164 0 2026-03-10T12:38:18.238 INFO:tasks.workunit.client.1.vm07.stdout:6/709: rename d1/d4/d6/d16/d1a/f6a to d1/d4/d6/d16/d1a/d9d/fe6 0 2026-03-10T12:38:18.239 INFO:tasks.workunit.client.1.vm07.stdout:0/842: mknod d0/c116 0 2026-03-10T12:38:18.240 
INFO:tasks.workunit.client.0.vm00.stdout:3/995: getdents dd/d3d/d8a/de0/d55/dfd/d125/d2b/d11c 0 2026-03-10T12:38:18.241 INFO:tasks.workunit.client.0.vm00.stdout:9/998: chown d0/d3d/d125/c140 251 1 2026-03-10T12:38:18.247 INFO:tasks.workunit.client.1.vm07.stdout:5/759: creat d0/d22/d18/d19/d21/dc2/ded/f10a x:0 0 0 2026-03-10T12:38:18.247 INFO:tasks.workunit.client.1.vm07.stdout:5/760: read d0/f9 [1139228,46421] 0 2026-03-10T12:38:18.250 INFO:tasks.workunit.client.1.vm07.stdout:0/843: rename d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dd9/d10c/f107 to d0/d14/d5f/d76/d2f/d31/df0/f117 0 2026-03-10T12:38:18.253 INFO:tasks.workunit.client.1.vm07.stdout:5/761: rename d0/d22/d18/dc7 to d0/d22/d18/d3e/d5d/d10b 0 2026-03-10T12:38:18.253 INFO:tasks.workunit.client.1.vm07.stdout:0/844: stat d0/d14/d7c 0 2026-03-10T12:38:18.253 INFO:tasks.workunit.client.1.vm07.stdout:6/710: unlink d1/d4/d44/cc4 0 2026-03-10T12:38:18.253 INFO:tasks.workunit.client.1.vm07.stdout:5/762: stat d0/d22/d18/d19/d72/dcc 0 2026-03-10T12:38:18.264 INFO:tasks.workunit.client.0.vm00.stdout:9/999: rename d0/d3d/d59/d4e/d104/l138 to d0/d3d/d59/d4e/dba/d1e/dcb/l165 0 2026-03-10T12:38:18.264 INFO:tasks.workunit.client.0.vm00.stdout:3/996: getdents dd/d64 0 2026-03-10T12:38:18.264 INFO:tasks.workunit.client.1.vm07.stdout:6/711: mknod d1/d4/d6/d43/ce7 0 2026-03-10T12:38:18.265 INFO:tasks.workunit.client.1.vm07.stdout:5/763: rename d0/d22/la4 to d0/d22/d18/d3e/d53/d9e/l10c 0 2026-03-10T12:38:18.267 INFO:tasks.workunit.client.1.vm07.stdout:5/764: fdatasync d0/d22/d18/d3e/d5d/db6/fe4 0 2026-03-10T12:38:18.270 INFO:tasks.workunit.client.1.vm07.stdout:5/765: read d0/d22/d18/d19/d21/fd4 [1179676,121803] 0 2026-03-10T12:38:18.271 INFO:tasks.workunit.client.1.vm07.stdout:5/766: chown d0/d22/f16 0 1 2026-03-10T12:38:18.272 INFO:tasks.workunit.client.0.vm00.stdout:3/997: creat dd/d2a/da2/de1/d142/f148 x:0 0 0 2026-03-10T12:38:18.278 INFO:tasks.workunit.client.1.vm07.stdout:6/712: mknod d1/d4/d6/d46/d4d/ce8 0 
2026-03-10T12:38:18.282 INFO:tasks.workunit.client.1.vm07.stdout:5/767: write d0/f9 [662028,123198] 0 2026-03-10T12:38:18.288 INFO:tasks.workunit.client.0.vm00.stdout:2/981: dread d4/d53/d68/f69 [0,4194304] 0 2026-03-10T12:38:18.303 INFO:tasks.workunit.client.1.vm07.stdout:1/733: dread d9/df/d29/d2b/d31/d91/fa9 [0,4194304] 0 2026-03-10T12:38:18.308 INFO:tasks.workunit.client.1.vm07.stdout:1/734: getdents d9/df/d55/d9f 0 2026-03-10T12:38:18.311 INFO:tasks.workunit.client.1.vm07.stdout:1/735: dread - d9/d2d/d4f/d75/fda zero size 2026-03-10T12:38:18.313 INFO:tasks.workunit.client.0.vm00.stdout:2/982: link d4/c7f d4/d6/d2d/c13e 0 2026-03-10T12:38:18.318 INFO:tasks.workunit.client.1.vm07.stdout:1/736: rename d9/df/fe7 to d9/d2d/d4f/dde/fef 0 2026-03-10T12:38:18.319 INFO:tasks.workunit.client.1.vm07.stdout:1/737: fsync d9/df/d55/fce 0 2026-03-10T12:38:18.320 INFO:tasks.workunit.client.1.vm07.stdout:9/808: dread d5/f91 [0,4194304] 0 2026-03-10T12:38:18.329 INFO:tasks.workunit.client.1.vm07.stdout:1/738: fdatasync d9/f1a 0 2026-03-10T12:38:18.353 INFO:tasks.workunit.client.0.vm00.stdout:7/707: dread da/d26/d50/d73/fce [0,4194304] 0 2026-03-10T12:38:18.353 INFO:tasks.workunit.client.0.vm00.stdout:0/829: write d3/d33/f4d [2487216,67547] 0 2026-03-10T12:38:18.354 INFO:tasks.workunit.client.0.vm00.stdout:6/658: write d2/d16/f17 [3422212,48916] 0 2026-03-10T12:38:18.354 INFO:tasks.workunit.client.0.vm00.stdout:0/830: stat d3/d40/f4e 0 2026-03-10T12:38:18.354 INFO:tasks.workunit.client.0.vm00.stdout:6/659: chown d2/d42/d80/d9d/fe9 0 1 2026-03-10T12:38:18.354 INFO:tasks.workunit.client.1.vm07.stdout:6/713: dread d1/d4/d6/d16/d1a/f8e [0,4194304] 0 2026-03-10T12:38:18.354 INFO:tasks.workunit.client.1.vm07.stdout:6/714: readlink d1/d4/d6/d46/d4d/l75 0 2026-03-10T12:38:18.354 INFO:tasks.workunit.client.1.vm07.stdout:6/715: chown d1/d4/d6/d16/d49/fd3 19 1 2026-03-10T12:38:18.354 INFO:tasks.workunit.client.1.vm07.stdout:3/756: dwrite dc/dd/d1f/d45/fea [4194304,4194304] 0 
2026-03-10T12:38:18.354 INFO:tasks.workunit.client.1.vm07.stdout:4/823: dwrite d0/d4/d5/da/f44 [0,4194304] 0 2026-03-10T12:38:18.354 INFO:tasks.workunit.client.1.vm07.stdout:2/649: truncate d0/d42/d26/d38/d4f/d62/fba 7185891 0 2026-03-10T12:38:18.354 INFO:tasks.workunit.client.1.vm07.stdout:9/809: fdatasync d5/d13/d57/d4f/d6a/fba 0 2026-03-10T12:38:18.354 INFO:tasks.workunit.client.1.vm07.stdout:4/824: write d0/d4/d5/da/f11b [332418,26890] 0 2026-03-10T12:38:18.354 INFO:tasks.workunit.client.1.vm07.stdout:8/709: dwrite d1/d3/d11/fbd [0,4194304] 0 2026-03-10T12:38:18.354 INFO:tasks.workunit.client.1.vm07.stdout:0/845: write d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/dd0/fe4 [227296,91912] 0 2026-03-10T12:38:18.373 INFO:tasks.workunit.client.1.vm07.stdout:3/757: mknod dc/dd/db5/c104 0 2026-03-10T12:38:18.386 INFO:tasks.workunit.client.0.vm00.stdout:6/660: rmdir d2/da 39 2026-03-10T12:38:18.386 INFO:tasks.workunit.client.1.vm07.stdout:3/758: chown dc/dd/d43/d76/d95/dd8 454 1 2026-03-10T12:38:18.386 INFO:tasks.workunit.client.1.vm07.stdout:4/825: dread - d0/d4/df2/f108 zero size 2026-03-10T12:38:18.386 INFO:tasks.workunit.client.1.vm07.stdout:2/650: creat d0/d42/d1f/d20/fdb x:0 0 0 2026-03-10T12:38:18.386 INFO:tasks.workunit.client.1.vm07.stdout:3/759: dread - dc/d18/fd4 zero size 2026-03-10T12:38:18.386 INFO:tasks.workunit.client.1.vm07.stdout:4/826: symlink d0/d4/d5/d34/l120 0 2026-03-10T12:38:18.388 INFO:tasks.workunit.client.1.vm07.stdout:4/827: dwrite d0/d4/d5/fea [0,4194304] 0 2026-03-10T12:38:18.390 INFO:tasks.workunit.client.1.vm07.stdout:4/828: chown d0/d4/d10/d9a/l6a 71007823 1 2026-03-10T12:38:18.394 INFO:tasks.workunit.client.1.vm07.stdout:0/846: rename d0/f1d to d0/d14/d5f/d76/f118 0 2026-03-10T12:38:18.394 INFO:tasks.workunit.client.0.vm00.stdout:1/981: dread da/d21/d27/fe8 [0,4194304] 0 2026-03-10T12:38:18.400 INFO:tasks.workunit.client.1.vm07.stdout:0/847: rmdir d0 39 2026-03-10T12:38:18.400 INFO:tasks.workunit.client.1.vm07.stdout:4/829: stat 
d0/d5c/d7c/fc0 0 2026-03-10T12:38:18.401 INFO:tasks.workunit.client.0.vm00.stdout:1/982: mknod da/d21/db3/d59/d120/dab/c14e 0 2026-03-10T12:38:18.401 INFO:tasks.workunit.client.0.vm00.stdout:6/661: creat d2/d9f/dce/ff2 x:0 0 0 2026-03-10T12:38:18.402 INFO:tasks.workunit.client.0.vm00.stdout:8/928: dread d0/f11 [0,4194304] 0 2026-03-10T12:38:18.412 INFO:tasks.workunit.client.0.vm00.stdout:8/929: getdents d0/d93/d36/d7d 0 2026-03-10T12:38:18.413 INFO:tasks.workunit.client.1.vm07.stdout:0/848: mkdir d0/d14/d5f/d41/d6a/d74/d119 0 2026-03-10T12:38:18.421 INFO:tasks.workunit.client.1.vm07.stdout:4/830: getdents d0/d4/d10/d3c/d2b/d54 0 2026-03-10T12:38:18.423 INFO:tasks.workunit.client.0.vm00.stdout:2/983: dread d4/d6/d2d/d3a/dd3/fbe [4194304,4194304] 0 2026-03-10T12:38:18.424 INFO:tasks.workunit.client.1.vm07.stdout:0/849: symlink d0/d14/d5f/d41/d6a/d74/d119/l11a 0 2026-03-10T12:38:18.424 INFO:tasks.workunit.client.1.vm07.stdout:4/831: fsync d0/d4/d5/d78/dc5/df7/db2/dd5/f115 0 2026-03-10T12:38:18.426 INFO:tasks.workunit.client.1.vm07.stdout:4/832: readlink d0/d5c/d7c/ld7 0 2026-03-10T12:38:18.428 INFO:tasks.workunit.client.1.vm07.stdout:4/833: creat d0/d4/d5/da/d95/f121 x:0 0 0 2026-03-10T12:38:18.428 INFO:tasks.workunit.client.1.vm07.stdout:4/834: chown d0/d4/d5/da/d95/le0 129889 1 2026-03-10T12:38:18.429 INFO:tasks.workunit.client.1.vm07.stdout:5/768: sync 2026-03-10T12:38:18.451 INFO:tasks.workunit.client.1.vm07.stdout:9/810: sync 2026-03-10T12:38:18.451 INFO:tasks.workunit.client.1.vm07.stdout:5/769: sync 2026-03-10T12:38:18.459 INFO:tasks.workunit.client.0.vm00.stdout:1/983: dread da/d24/d73/fb6 [0,4194304] 0 2026-03-10T12:38:18.459 INFO:tasks.workunit.client.1.vm07.stdout:4/835: rename d0/d4/d5/f75 to d0/d4/d5/d78/dc5/df7/db2/dd5/f122 0 2026-03-10T12:38:18.465 INFO:tasks.workunit.client.1.vm07.stdout:7/684: dread d0/d57/d62/f84 [0,4194304] 0 2026-03-10T12:38:18.481 INFO:tasks.workunit.client.1.vm07.stdout:5/770: fdatasync d0/d22/d18/d19/d2e/d67/fc8 0 
2026-03-10T12:38:18.494 INFO:tasks.workunit.client.1.vm07.stdout:4/836: truncate d0/d4/d5/da/d66/fa8 631693 0 2026-03-10T12:38:18.495 INFO:tasks.workunit.client.1.vm07.stdout:4/837: write d0/d4/d5/d34/f94 [1618876,92861] 0 2026-03-10T12:38:18.506 INFO:tasks.workunit.client.1.vm07.stdout:7/685: mkdir d0/d57/dd6/de5 0 2026-03-10T12:38:18.529 INFO:tasks.workunit.client.1.vm07.stdout:6/716: write d1/d4/d6/d43/f90 [315911,15772] 0 2026-03-10T12:38:18.530 INFO:tasks.workunit.client.0.vm00.stdout:4/980: truncate df/d6c/f124 1187055 0 2026-03-10T12:38:18.531 INFO:tasks.workunit.client.1.vm07.stdout:3/760: write dc/dd/d28/d3b/fc1 [908596,45775] 0 2026-03-10T12:38:18.534 INFO:tasks.workunit.client.1.vm07.stdout:1/739: dwrite d9/fe [0,4194304] 0 2026-03-10T12:38:18.537 INFO:tasks.workunit.client.1.vm07.stdout:8/710: dwrite d1/d3/d11/d87/fc1 [0,4194304] 0 2026-03-10T12:38:18.537 INFO:tasks.workunit.client.0.vm00.stdout:3/998: write dd/d18/d13/d99/da5/fd4 [5216481,114905] 0 2026-03-10T12:38:18.542 INFO:tasks.workunit.client.1.vm07.stdout:9/811: truncate d5/d13/d2c/de6/d64/f110 45306 0 2026-03-10T12:38:18.546 INFO:tasks.workunit.client.0.vm00.stdout:4/981: write df/f3d [364104,79928] 0 2026-03-10T12:38:18.546 INFO:tasks.workunit.client.0.vm00.stdout:7/708: dwrite da/d41/f72 [0,4194304] 0 2026-03-10T12:38:18.549 INFO:tasks.workunit.client.1.vm07.stdout:3/761: fdatasync dc/dd/d43/d5c/f101 0 2026-03-10T12:38:18.550 INFO:tasks.workunit.client.1.vm07.stdout:6/717: creat d1/d4/d6/d16/d1a/d99/fe9 x:0 0 0 2026-03-10T12:38:18.551 INFO:tasks.workunit.client.1.vm07.stdout:6/718: chown d1/d4/d6/d43/ccd 563238 1 2026-03-10T12:38:18.552 INFO:tasks.workunit.client.0.vm00.stdout:0/831: dwrite d3/d7/d4c/d9d/fd1 [0,4194304] 0 2026-03-10T12:38:18.564 INFO:tasks.workunit.client.0.vm00.stdout:8/930: write d0/d46/d89/fb6 [3036269,13092] 0 2026-03-10T12:38:18.565 INFO:tasks.workunit.client.0.vm00.stdout:8/931: chown d0/d93/d17/d48/l4f 27 1 2026-03-10T12:38:18.572 
INFO:tasks.workunit.client.1.vm07.stdout:4/838: dread d0/d4/d10/d5f/fb6 [0,4194304] 0 2026-03-10T12:38:18.573 INFO:tasks.workunit.client.0.vm00.stdout:4/982: creat df/d1f/d22/d26/d65/d91/f144 x:0 0 0 2026-03-10T12:38:18.575 INFO:tasks.workunit.client.0.vm00.stdout:2/984: dwrite d4/d6/f89 [0,4194304] 0 2026-03-10T12:38:18.582 INFO:tasks.workunit.client.0.vm00.stdout:0/832: unlink d3/d22/da5/lb7 0 2026-03-10T12:38:18.583 INFO:tasks.workunit.client.0.vm00.stdout:4/983: creat df/d32/d64/f145 x:0 0 0 2026-03-10T12:38:18.584 INFO:tasks.workunit.client.0.vm00.stdout:0/833: stat d3/db/f97 0 2026-03-10T12:38:18.587 INFO:tasks.workunit.client.1.vm07.stdout:1/740: symlink d9/df/d55/lf0 0 2026-03-10T12:38:18.592 INFO:tasks.workunit.client.0.vm00.stdout:0/834: readlink d3/d7/ld0 0 2026-03-10T12:38:18.592 INFO:tasks.workunit.client.1.vm07.stdout:4/839: rename d0/d4/d10/d9a/l6a to d0/d4/d10/d114/l123 0 2026-03-10T12:38:18.592 INFO:tasks.workunit.client.1.vm07.stdout:2/651: truncate d0/d29/d64/d74/d88/f51 2395154 0 2026-03-10T12:38:18.592 INFO:tasks.workunit.client.1.vm07.stdout:2/652: stat d0/d80 0 2026-03-10T12:38:18.592 INFO:tasks.workunit.client.1.vm07.stdout:7/686: creat d0/fe6 x:0 0 0 2026-03-10T12:38:18.592 INFO:tasks.workunit.client.1.vm07.stdout:6/719: creat d1/d4/d6/d96/fea x:0 0 0 2026-03-10T12:38:18.593 INFO:tasks.workunit.client.1.vm07.stdout:1/741: rename d9/d2d/d80/d8e/lbd to d9/d2d/d80/d8e/dc7/lf1 0 2026-03-10T12:38:18.595 INFO:tasks.workunit.client.1.vm07.stdout:2/653: mknod d0/d42/d4e/d77/cdc 0 2026-03-10T12:38:18.595 INFO:tasks.workunit.client.1.vm07.stdout:7/687: fdatasync d0/f4f 0 2026-03-10T12:38:18.598 INFO:tasks.workunit.client.0.vm00.stdout:4/984: creat df/d63/f146 x:0 0 0 2026-03-10T12:38:18.600 INFO:tasks.workunit.client.0.vm00.stdout:4/985: unlink df/d1f/l11d 0 2026-03-10T12:38:18.607 INFO:tasks.workunit.client.1.vm07.stdout:1/742: unlink d9/f1a 0 2026-03-10T12:38:18.615 INFO:tasks.workunit.client.1.vm07.stdout:7/688: creat d0/d57/d62/d90/da1/fe7 x:0 0 
0 2026-03-10T12:38:18.625 INFO:tasks.workunit.client.0.vm00.stdout:0/835: link d3/d7/d4c/d5b/d38/l51 d3/d7/d4c/d5b/d38/d44/df9/l10d 0 2026-03-10T12:38:18.632 INFO:tasks.workunit.client.0.vm00.stdout:0/836: dread - d3/d7/db0/ff2 zero size 2026-03-10T12:38:18.638 INFO:tasks.workunit.client.0.vm00.stdout:4/986: dread df/f42 [4194304,4194304] 0 2026-03-10T12:38:18.640 INFO:tasks.workunit.client.1.vm07.stdout:1/743: creat d9/df/dc2/ff2 x:0 0 0 2026-03-10T12:38:18.640 INFO:tasks.workunit.client.1.vm07.stdout:3/762: dread dc/dd/d28/d3b/fa5 [8388608,4194304] 0 2026-03-10T12:38:18.646 INFO:tasks.workunit.client.1.vm07.stdout:4/840: rename d0/d5c/d7c to d0/d4/d10/d9a/d124 0 2026-03-10T12:38:18.646 INFO:tasks.workunit.client.1.vm07.stdout:2/654: dread d0/f15 [0,4194304] 0 2026-03-10T12:38:18.646 INFO:tasks.workunit.client.1.vm07.stdout:4/841: chown d0/fa1 14587617 1 2026-03-10T12:38:18.651 INFO:tasks.workunit.client.1.vm07.stdout:0/850: write d0/d14/d5f/d76/d2f/d31/d79/d85/fb5 [276630,41054] 0 2026-03-10T12:38:18.651 INFO:tasks.workunit.client.1.vm07.stdout:2/655: dwrite d0/d42/d1f/d20/f3f [8388608,4194304] 0 2026-03-10T12:38:18.667 INFO:tasks.workunit.client.0.vm00.stdout:4/987: sync 2026-03-10T12:38:18.668 INFO:tasks.workunit.client.0.vm00.stdout:4/988: chown df/d8a/f12d 1453968 1 2026-03-10T12:38:18.673 INFO:tasks.workunit.client.1.vm07.stdout:5/771: dwrite d0/d22/d18/f95 [0,4194304] 0 2026-03-10T12:38:18.678 INFO:tasks.workunit.client.0.vm00.stdout:4/989: truncate df/d63/d94/d140/d3a/f119 485214 0 2026-03-10T12:38:18.691 INFO:tasks.workunit.client.1.vm07.stdout:3/763: mkdir dc/dd/d43/d76/d95/dde/d105 0 2026-03-10T12:38:18.692 INFO:tasks.workunit.client.1.vm07.stdout:7/689: getdents d0/d61/d79/db5 0 2026-03-10T12:38:18.692 INFO:tasks.workunit.client.1.vm07.stdout:7/690: chown d0/c36 0 1 2026-03-10T12:38:18.695 INFO:tasks.workunit.client.0.vm00.stdout:4/990: symlink df/d1f/d22/d26/l147 0 2026-03-10T12:38:18.699 INFO:tasks.workunit.client.0.vm00.stdout:1/984: dwrite 
da/d21/d27/d6a/d94/db9/f103 [0,4194304] 0 2026-03-10T12:38:18.701 INFO:tasks.workunit.client.1.vm07.stdout:4/842: creat d0/d4/d5/f125 x:0 0 0 2026-03-10T12:38:18.705 INFO:tasks.workunit.client.0.vm00.stdout:4/991: fsync df/d63/d77/f9d 0 2026-03-10T12:38:18.705 INFO:tasks.workunit.client.0.vm00.stdout:4/992: truncate df/f132 863603 0 2026-03-10T12:38:18.706 INFO:tasks.workunit.client.0.vm00.stdout:4/993: stat df/d1f/d22/f5a 0 2026-03-10T12:38:18.707 INFO:tasks.workunit.client.0.vm00.stdout:4/994: truncate df/d1f/d22/d26/d65/da7/d10e/f113 9207637 0 2026-03-10T12:38:18.712 INFO:tasks.workunit.client.0.vm00.stdout:4/995: rmdir df/d63/d77 39 2026-03-10T12:38:18.715 INFO:tasks.workunit.client.1.vm07.stdout:4/843: symlink d0/d4/d5/d78/dc5/df7/db2/l126 0 2026-03-10T12:38:18.718 INFO:tasks.workunit.client.1.vm07.stdout:7/691: unlink d0/c7d 0 2026-03-10T12:38:18.721 INFO:tasks.workunit.client.1.vm07.stdout:7/692: truncate d0/d57/f9f 2737220 0 2026-03-10T12:38:18.722 INFO:tasks.workunit.client.1.vm07.stdout:3/764: sync 2026-03-10T12:38:18.722 INFO:tasks.workunit.client.1.vm07.stdout:3/765: write dc/dd/d43/feb [332350,3673] 0 2026-03-10T12:38:18.723 INFO:tasks.workunit.client.1.vm07.stdout:3/766: chown dc/dd/d1f/dac/cb9 0 1 2026-03-10T12:38:18.727 INFO:tasks.workunit.client.1.vm07.stdout:4/844: getdents d0/d4/d5/d78 0 2026-03-10T12:38:18.742 INFO:tasks.workunit.client.1.vm07.stdout:4/845: unlink d0/d4/d10/d5f/l7e 0 2026-03-10T12:38:18.742 INFO:tasks.workunit.client.1.vm07.stdout:8/711: write d1/fc [1270629,125261] 0 2026-03-10T12:38:18.745 INFO:tasks.workunit.client.1.vm07.stdout:9/812: dwrite d5/d16/d18/f20 [4194304,4194304] 0 2026-03-10T12:38:18.759 INFO:tasks.workunit.client.0.vm00.stdout:7/709: dwrite da/d25/d2c/f30 [0,4194304] 0 2026-03-10T12:38:18.768 INFO:tasks.workunit.client.0.vm00.stdout:0/837: dread d3/d33/f4d [0,4194304] 0 2026-03-10T12:38:18.772 INFO:tasks.workunit.client.0.vm00.stdout:0/838: read d3/d7/f70 [5187992,53444] 0 2026-03-10T12:38:18.772 
INFO:tasks.workunit.client.1.vm07.stdout:6/720: dwrite d1/d4/d6/d43/d88/d97/fa2 [0,4194304] 0 2026-03-10T12:38:18.793 INFO:tasks.workunit.client.1.vm07.stdout:8/712: truncate d1/d3/d40/d92/db6/fad 13736 0 2026-03-10T12:38:18.798 INFO:tasks.workunit.client.1.vm07.stdout:9/813: mknod d5/d13/d2c/de6/d64/c111 0 2026-03-10T12:38:18.809 INFO:tasks.workunit.client.0.vm00.stdout:8/932: truncate d0/d46/d89/fb6 1846055 0 2026-03-10T12:38:18.810 INFO:tasks.workunit.client.1.vm07.stdout:1/744: write d9/d2d/d4f/d75/d77/da7/fcd [202233,564] 0 2026-03-10T12:38:18.812 INFO:tasks.workunit.client.0.vm00.stdout:0/839: read d3/d7/d4c/d5b/f9b [2506119,48858] 0 2026-03-10T12:38:18.812 INFO:tasks.workunit.client.0.vm00.stdout:7/710: creat da/d26/d37/ffc x:0 0 0 2026-03-10T12:38:18.813 INFO:tasks.workunit.client.0.vm00.stdout:7/711: fsync da/d26/f97 0 2026-03-10T12:38:18.814 INFO:tasks.workunit.client.0.vm00.stdout:7/712: stat da/d25/l5b 0 2026-03-10T12:38:18.815 INFO:tasks.workunit.client.0.vm00.stdout:7/713: chown da/d25/d2c/f4f 3003 1 2026-03-10T12:38:18.817 INFO:tasks.workunit.client.1.vm07.stdout:6/721: mkdir d1/dd7/d66/dd6/deb 0 2026-03-10T12:38:18.818 INFO:tasks.workunit.client.1.vm07.stdout:0/851: dwrite d0/d14/d5f/d3b/dbc/fbe [0,4194304] 0 2026-03-10T12:38:18.820 INFO:tasks.workunit.client.0.vm00.stdout:2/985: dwrite d4/d6/d2d/d31/f127 [0,4194304] 0 2026-03-10T12:38:18.825 INFO:tasks.workunit.client.1.vm07.stdout:6/722: dread d1/d4/d6/d16/fbc [0,4194304] 0 2026-03-10T12:38:18.825 INFO:tasks.workunit.client.1.vm07.stdout:5/772: dwrite d0/d22/d18/d3e/d53/faa [0,4194304] 0 2026-03-10T12:38:18.835 INFO:tasks.workunit.client.0.vm00.stdout:0/840: unlink d3/d7/f15 0 2026-03-10T12:38:18.841 INFO:tasks.workunit.client.0.vm00.stdout:0/841: truncate d3/d40/f10a 798173 0 2026-03-10T12:38:18.845 INFO:tasks.workunit.client.1.vm07.stdout:8/713: creat d1/d3/d40/d92/dba/feb x:0 0 0 2026-03-10T12:38:18.857 INFO:tasks.workunit.client.0.vm00.stdout:2/986: creat d4/d6/de7/d11d/f13f x:0 0 0 
2026-03-10T12:38:18.860 INFO:tasks.workunit.client.0.vm00.stdout:8/933: truncate d0/d58/d68/f74 3767371 0 2026-03-10T12:38:18.868 INFO:tasks.workunit.client.0.vm00.stdout:2/987: mkdir d4/d53/d76/dba/deb/d140 0 2026-03-10T12:38:18.869 INFO:tasks.workunit.client.0.vm00.stdout:4/996: write df/d63/d77/f9d [101147,66590] 0 2026-03-10T12:38:18.871 INFO:tasks.workunit.client.1.vm07.stdout:3/767: truncate dc/f17 914630 0 2026-03-10T12:38:18.871 INFO:tasks.workunit.client.1.vm07.stdout:7/693: write d0/d47/f59 [862026,627] 0 2026-03-10T12:38:18.880 INFO:tasks.workunit.client.0.vm00.stdout:8/934: sync 2026-03-10T12:38:18.881 INFO:tasks.workunit.client.0.vm00.stdout:0/842: dread d3/d22/f55 [0,4194304] 0 2026-03-10T12:38:18.883 INFO:tasks.workunit.client.1.vm07.stdout:0/852: unlink d0/d14/d5f/d41/d6a/l95 0 2026-03-10T12:38:18.886 INFO:tasks.workunit.client.1.vm07.stdout:4/846: write d0/d4/d10/f36 [8561121,5181] 0 2026-03-10T12:38:18.886 INFO:tasks.workunit.client.0.vm00.stdout:8/935: link d0/d46/fc6 d0/dd/d38/d81/f120 0 2026-03-10T12:38:18.890 INFO:tasks.workunit.client.0.vm00.stdout:8/936: read - d0/d58/d68/d10c/f11b zero size 2026-03-10T12:38:18.891 INFO:tasks.workunit.client.1.vm07.stdout:1/745: dread d9/df/d29/d2b/d31/d91/d59/f84 [0,4194304] 0 2026-03-10T12:38:18.891 INFO:tasks.workunit.client.0.vm00.stdout:4/997: dread df/f85 [0,4194304] 0 2026-03-10T12:38:18.900 INFO:tasks.workunit.client.1.vm07.stdout:8/714: rmdir d1/d3/db2/dcd 39 2026-03-10T12:38:18.918 INFO:tasks.workunit.client.0.vm00.stdout:2/988: dread d4/f67 [0,4194304] 0 2026-03-10T12:38:18.928 INFO:tasks.workunit.client.0.vm00.stdout:6/662: rename d2/d51/d70/ccb to d2/d42/dae/cf3 0 2026-03-10T12:38:18.931 INFO:tasks.workunit.client.0.vm00.stdout:8/937: dwrite d0/d93/d36/d7d/fb0 [0,4194304] 0 2026-03-10T12:38:18.933 INFO:tasks.workunit.client.0.vm00.stdout:3/999: rename dd/d2a/l57 to dd/d2a/da2/de1/d142/l149 0 2026-03-10T12:38:18.935 INFO:tasks.workunit.client.1.vm07.stdout:9/814: write d5/d13/d6c/fb6 
[183166,3249] 0 2026-03-10T12:38:18.935 INFO:tasks.workunit.client.0.vm00.stdout:8/938: dread - d0/d93/d2d/d49/ff0 zero size 2026-03-10T12:38:18.940 INFO:tasks.workunit.client.0.vm00.stdout:1/985: rename da/d12/d26/f69 to da/d21/d39/d129/f14f 0 2026-03-10T12:38:18.940 INFO:tasks.workunit.client.0.vm00.stdout:8/939: stat d0/dd/c37 0 2026-03-10T12:38:18.945 INFO:tasks.workunit.client.1.vm07.stdout:2/656: truncate d0/d29/d64/d74/d88/f51 367931 0 2026-03-10T12:38:18.953 INFO:tasks.workunit.client.0.vm00.stdout:0/843: write d3/d7/d4c/f96 [2541812,81926] 0 2026-03-10T12:38:18.956 INFO:tasks.workunit.client.0.vm00.stdout:1/986: mknod da/d21/db3/d59/da6/da4/dda/dc0/c150 0 2026-03-10T12:38:18.961 INFO:tasks.workunit.client.0.vm00.stdout:1/987: dread - da/d21/db3/d59/d120/d72/d7e/f124 zero size 2026-03-10T12:38:18.962 INFO:tasks.workunit.client.0.vm00.stdout:2/989: dwrite d4/d78/df9/f108 [0,4194304] 0 2026-03-10T12:38:18.971 INFO:tasks.workunit.client.0.vm00.stdout:1/988: rmdir da/d21/db3/d59/d120/d11f 39 2026-03-10T12:38:18.974 INFO:tasks.workunit.client.0.vm00.stdout:1/989: chown da/d21/db3/d59/d120/d72/d7e/c8d 261048 1 2026-03-10T12:38:18.979 INFO:tasks.workunit.client.0.vm00.stdout:4/998: dread df/d63/d77/fe8 [0,4194304] 0 2026-03-10T12:38:18.982 INFO:tasks.workunit.client.0.vm00.stdout:7/714: rename da/d26/d50/d73/d89 to da/d47/dfd 0 2026-03-10T12:38:18.982 INFO:tasks.workunit.client.0.vm00.stdout:4/999: creat df/d63/d94/d140/dc6/f148 x:0 0 0 2026-03-10T12:38:18.984 INFO:tasks.workunit.client.0.vm00.stdout:8/940: rename d0/d58/d68/l118 to d0/d93/d17/da2/l121 0 2026-03-10T12:38:18.994 INFO:tasks.workunit.client.0.vm00.stdout:6/663: dread d2/d42/fd4 [0,4194304] 0 2026-03-10T12:38:19.005 INFO:tasks.workunit.client.0.vm00.stdout:0/844: rename d3/db/d77/d82 to d3/d7/db0/dc4/dd5/d10e 0 2026-03-10T12:38:19.016 INFO:tasks.workunit.client.0.vm00.stdout:1/990: sync 2026-03-10T12:38:19.018 INFO:tasks.workunit.client.0.vm00.stdout:8/941: rename d0/d58/d68/l117 to d0/dd/l122 0 
2026-03-10T12:38:19.019 INFO:tasks.workunit.client.0.vm00.stdout:6/664: fdatasync d2/d14/d7a/db9/f85 0 2026-03-10T12:38:19.019 INFO:tasks.workunit.client.0.vm00.stdout:8/942: stat d0/d93/d17/da2/fc1 0 2026-03-10T12:38:19.020 INFO:tasks.workunit.client.0.vm00.stdout:1/991: mknod da/d12/d126/c151 0 2026-03-10T12:38:19.023 INFO:tasks.workunit.client.0.vm00.stdout:1/992: chown da/d24/c7b 316453 1 2026-03-10T12:38:19.026 INFO:tasks.workunit.client.0.vm00.stdout:1/993: readlink da/d21/db3/d59/da6/da4/dda/dc0/dfe/l135 0 2026-03-10T12:38:19.028 INFO:tasks.workunit.client.0.vm00.stdout:1/994: dread - da/d21/d39/f92 zero size 2026-03-10T12:38:19.029 INFO:tasks.workunit.client.0.vm00.stdout:8/943: dread d0/d93/d36/d5b/f95 [0,4194304] 0 2026-03-10T12:38:19.032 INFO:tasks.workunit.client.0.vm00.stdout:1/995: symlink da/d21/d27/d6a/d94/l152 0 2026-03-10T12:38:19.039 INFO:tasks.workunit.client.0.vm00.stdout:2/990: dwrite d4/dd/f10 [4194304,4194304] 0 2026-03-10T12:38:19.045 INFO:tasks.workunit.client.0.vm00.stdout:2/991: chown d4/dd/da7 23921 1 2026-03-10T12:38:19.047 INFO:tasks.workunit.client.0.vm00.stdout:7/715: dwrite da/d25/d2e/f9c [0,4194304] 0 2026-03-10T12:38:19.051 INFO:tasks.workunit.client.0.vm00.stdout:8/944: fdatasync d0/dd/d38/ff2 0 2026-03-10T12:38:19.053 INFO:tasks.workunit.client.0.vm00.stdout:0/845: truncate d3/d7/d4c/d5b/f57 1001744 0 2026-03-10T12:38:19.070 INFO:tasks.workunit.client.0.vm00.stdout:0/846: fdatasync d3/d22/f83 0 2026-03-10T12:38:19.070 INFO:tasks.workunit.client.0.vm00.stdout:2/992: rename d4/d53/d9e/c11b to d4/d6/d2d/d31/c141 0 2026-03-10T12:38:19.070 INFO:tasks.workunit.client.0.vm00.stdout:7/716: rename da/d41 to da/d41/d7b/d9d/dc8/dfe 22 2026-03-10T12:38:19.074 INFO:tasks.workunit.client.0.vm00.stdout:8/945: truncate d0/d93/d17/fb2 2692259 0 2026-03-10T12:38:19.074 INFO:tasks.workunit.client.0.vm00.stdout:2/993: rmdir d4/d6/d2d/d3a/d43/dd5 39 2026-03-10T12:38:19.075 INFO:tasks.workunit.client.0.vm00.stdout:2/994: chown d4/d53/d76/d9b/dad/l52 
119759 1 2026-03-10T12:38:19.076 INFO:tasks.workunit.client.0.vm00.stdout:2/995: readlink d4/d78/la5 0 2026-03-10T12:38:19.089 INFO:tasks.workunit.client.1.vm07.stdout:0/853: mkdir d0/d14/d5f/d76/d2f/d31/d4f/da8/d11b 0 2026-03-10T12:38:19.090 INFO:tasks.workunit.client.0.vm00.stdout:7/717: symlink da/d26/d50/d73/lff 0 2026-03-10T12:38:19.094 INFO:tasks.workunit.client.0.vm00.stdout:8/946: mkdir d0/d5c/d123 0 2026-03-10T12:38:19.094 INFO:tasks.workunit.client.0.vm00.stdout:6/665: getdents d2/d51/d70 0 2026-03-10T12:38:19.099 INFO:tasks.workunit.client.0.vm00.stdout:8/947: rename d0/d93/d2d/d49/ff0 to d0/d5c/f124 0 2026-03-10T12:38:19.099 INFO:tasks.workunit.client.0.vm00.stdout:8/948: stat d0/d93/d36/d5b 0 2026-03-10T12:38:19.102 INFO:tasks.workunit.client.1.vm07.stdout:7/694: read d0/d61/f69 [118440,79815] 0 2026-03-10T12:38:19.103 INFO:tasks.workunit.client.0.vm00.stdout:8/949: mknod d0/d46/d89/c125 0 2026-03-10T12:38:19.109 INFO:tasks.workunit.client.1.vm07.stdout:4/847: mkdir d0/d4/d5/d34/d127 0 2026-03-10T12:38:19.110 INFO:tasks.workunit.client.0.vm00.stdout:8/950: mkdir d0/d93/d17/db1/d113/d126 0 2026-03-10T12:38:19.110 INFO:tasks.workunit.client.0.vm00.stdout:7/718: creat da/d3f/d71/f100 x:0 0 0 2026-03-10T12:38:19.110 INFO:tasks.workunit.client.0.vm00.stdout:6/666: chown d2/da/dbf/ded 5 1 2026-03-10T12:38:19.120 INFO:tasks.workunit.client.0.vm00.stdout:8/951: fdatasync d0/dd/d38/f11a 0 2026-03-10T12:38:19.125 INFO:tasks.workunit.client.0.vm00.stdout:8/952: rmdir d0/dd/d38/d81 39 2026-03-10T12:38:19.128 INFO:tasks.workunit.client.1.vm07.stdout:9/815: mknod d5/d13/d2c/de6/d64/d108/c112 0 2026-03-10T12:38:19.129 INFO:tasks.workunit.client.0.vm00.stdout:7/719: mkdir da/d25/d2c/d82/d101 0 2026-03-10T12:38:19.129 INFO:tasks.workunit.client.0.vm00.stdout:8/953: mknod d0/d93/d36/db8/c127 0 2026-03-10T12:38:19.138 INFO:tasks.workunit.client.1.vm07.stdout:5/773: write d0/d22/d18/d19/d21/d54/f9b [2683136,113763] 0 2026-03-10T12:38:19.138 
INFO:tasks.workunit.client.0.vm00.stdout:6/667: creat d2/da/dc/d2f/ff4 x:0 0 0 2026-03-10T12:38:19.139 INFO:tasks.workunit.client.0.vm00.stdout:0/847: write d3/d7/db0/ff2 [914033,5058] 0 2026-03-10T12:38:19.140 INFO:tasks.workunit.client.1.vm07.stdout:3/768: getdents dc/dd/d43/d76/d95/dde/d105 0 2026-03-10T12:38:19.144 INFO:tasks.workunit.client.1.vm07.stdout:0/854: mknod d0/d14/d5f/d41/d6a/d9a/df9/c11c 0 2026-03-10T12:38:19.149 INFO:tasks.workunit.client.1.vm07.stdout:1/746: mknod d9/cf3 0 2026-03-10T12:38:19.149 INFO:tasks.workunit.client.1.vm07.stdout:8/715: symlink d1/d3/d11/lec 0 2026-03-10T12:38:19.149 INFO:tasks.workunit.client.0.vm00.stdout:7/720: symlink da/d41/d48/l102 0 2026-03-10T12:38:19.149 INFO:tasks.workunit.client.0.vm00.stdout:0/848: truncate d3/d7/d3c/d74/f78 1178486 0 2026-03-10T12:38:19.150 INFO:tasks.workunit.client.0.vm00.stdout:1/996: dwrite da/d24/d28/d67/f52 [0,4194304] 0 2026-03-10T12:38:19.154 INFO:tasks.workunit.client.0.vm00.stdout:7/721: write da/d25/d2c/d82/d68/fcd [3754833,98317] 0 2026-03-10T12:38:19.154 INFO:tasks.workunit.client.1.vm07.stdout:4/848: creat d0/d4/d10/d3c/d2b/d54/f128 x:0 0 0 2026-03-10T12:38:19.157 INFO:tasks.workunit.client.0.vm00.stdout:7/722: fsync da/d26/d37/f79 0 2026-03-10T12:38:19.158 INFO:tasks.workunit.client.0.vm00.stdout:2/996: dwrite d4/d6/de7/f124 [0,4194304] 0 2026-03-10T12:38:19.163 INFO:tasks.workunit.client.0.vm00.stdout:8/954: rename d0/d93/c35 to d0/d46/c128 0 2026-03-10T12:38:19.164 INFO:tasks.workunit.client.0.vm00.stdout:2/997: write d4/d6/d121/d6d/f139 [279025,17606] 0 2026-03-10T12:38:19.165 INFO:tasks.workunit.client.0.vm00.stdout:8/955: read d0/d93/d17/d48/f4c [1298721,128336] 0 2026-03-10T12:38:19.165 INFO:tasks.workunit.client.1.vm07.stdout:3/769: truncate dc/dd/d28/d7a/fba 301440 0 2026-03-10T12:38:19.168 INFO:tasks.workunit.client.0.vm00.stdout:1/997: rename da/fc to da/d21/db3/d59/da6/da4/dda/dc0/dfe/f153 0 2026-03-10T12:38:19.168 INFO:tasks.workunit.client.0.vm00.stdout:2/998: 
fdatasync d4/d6/d2d/d31/f127 0 2026-03-10T12:38:19.172 INFO:tasks.workunit.client.1.vm07.stdout:0/855: fsync d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dc8/fb8 0 2026-03-10T12:38:19.172 INFO:tasks.workunit.client.0.vm00.stdout:8/956: unlink d0/d93/d2d/d49/fae 0 2026-03-10T12:38:19.172 INFO:tasks.workunit.client.0.vm00.stdout:2/999: creat d4/d6/de7/f142 x:0 0 0 2026-03-10T12:38:19.173 INFO:tasks.workunit.client.1.vm07.stdout:1/747: read d9/df/f4a [673089,73967] 0 2026-03-10T12:38:19.173 INFO:tasks.workunit.client.0.vm00.stdout:7/723: fsync da/d1b/d40/fca 0 2026-03-10T12:38:19.176 INFO:tasks.workunit.client.1.vm07.stdout:6/723: getdents d1/d4/d4a 0 2026-03-10T12:38:19.180 INFO:tasks.workunit.client.1.vm07.stdout:7/695: rename d0/c15 to d0/d47/dde/ce8 0 2026-03-10T12:38:19.187 INFO:tasks.workunit.client.0.vm00.stdout:7/724: stat da/d25/d2e/d4c/f92 0 2026-03-10T12:38:19.195 INFO:tasks.workunit.client.1.vm07.stdout:8/716: truncate d1/d3/d11/f43 1653307 0 2026-03-10T12:38:19.201 INFO:tasks.workunit.client.0.vm00.stdout:7/725: rename da/d26/d50/la7 to da/d47/d87/l103 0 2026-03-10T12:38:19.208 INFO:tasks.workunit.client.1.vm07.stdout:0/856: readlink d0/d14/d5f/d76/d2f/d31/d4f/da8/de2/lf8 0 2026-03-10T12:38:19.209 INFO:tasks.workunit.client.1.vm07.stdout:0/857: chown d0/d14/ldd 80919 1 2026-03-10T12:38:19.209 INFO:tasks.workunit.client.1.vm07.stdout:0/858: fsync d0/d14/d5f/fb3 0 2026-03-10T12:38:19.214 INFO:tasks.workunit.client.0.vm00.stdout:7/726: symlink da/d3f/dd1/l104 0 2026-03-10T12:38:19.220 INFO:tasks.workunit.client.1.vm07.stdout:6/724: mknod d1/d4/d6/d46/d4d/dc7/dd9/cec 0 2026-03-10T12:38:19.222 INFO:tasks.workunit.client.1.vm07.stdout:3/770: rename dc/d18/d24/f3f to dc/dd/d1f/d45/dbf/f106 0 2026-03-10T12:38:19.242 INFO:tasks.workunit.client.0.vm00.stdout:6/668: write d2/d42/dae/fc8 [219577,107955] 0 2026-03-10T12:38:19.244 INFO:tasks.workunit.client.0.vm00.stdout:6/669: fdatasync d2/d42/d80/d9d/fca 0 2026-03-10T12:38:19.244 
INFO:tasks.workunit.client.0.vm00.stdout:6/670: dread - d2/da/f77 zero size 2026-03-10T12:38:19.244 INFO:tasks.workunit.client.0.vm00.stdout:6/671: stat d2/da/dc/d2f/f56 0 2026-03-10T12:38:19.246 INFO:tasks.workunit.client.0.vm00.stdout:7/727: link da/d41/d48/d81/la2 da/d41/d7b/d9d/l105 0 2026-03-10T12:38:19.255 INFO:tasks.workunit.client.1.vm07.stdout:2/657: truncate d0/d42/f2c 1292785 0 2026-03-10T12:38:19.264 INFO:tasks.workunit.client.0.vm00.stdout:0/849: dwrite d3/d40/fec [4194304,4194304] 0 2026-03-10T12:38:19.265 INFO:tasks.workunit.client.1.vm07.stdout:9/816: dwrite d5/d13/d57/d4f/f88 [0,4194304] 0 2026-03-10T12:38:19.272 INFO:tasks.workunit.client.0.vm00.stdout:6/672: symlink d2/d42/d80/d89/lf5 0 2026-03-10T12:38:19.280 INFO:tasks.workunit.client.1.vm07.stdout:5/774: write d0/d22/d18/d19/de5/f105 [2369949,92851] 0 2026-03-10T12:38:19.281 INFO:tasks.workunit.client.1.vm07.stdout:1/748: write d9/df/d29/d2b/d30/fd0 [552872,61540] 0 2026-03-10T12:38:19.286 INFO:tasks.workunit.client.0.vm00.stdout:7/728: dread - da/d41/d7b/d9d/fa8 zero size 2026-03-10T12:38:19.295 INFO:tasks.workunit.client.1.vm07.stdout:0/859: mkdir d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/d115/d11d 0 2026-03-10T12:38:19.301 INFO:tasks.workunit.client.0.vm00.stdout:0/850: mknod d3/db/da4/c10f 0 2026-03-10T12:38:19.309 INFO:tasks.workunit.client.0.vm00.stdout:0/851: symlink d3/d7/db0/dc4/dd5/l110 0 2026-03-10T12:38:19.313 INFO:tasks.workunit.client.1.vm07.stdout:2/658: stat d0/d42/c30 0 2026-03-10T12:38:19.319 INFO:tasks.workunit.client.1.vm07.stdout:3/771: mknod dc/dd/c107 0 2026-03-10T12:38:19.321 INFO:tasks.workunit.client.0.vm00.stdout:7/729: sync 2026-03-10T12:38:19.324 INFO:tasks.workunit.client.1.vm07.stdout:7/696: link d0/f27 d0/d57/d62/d90/da1/fe9 0 2026-03-10T12:38:19.333 INFO:tasks.workunit.client.0.vm00.stdout:6/673: rmdir d2/da/dbf/db2 0 2026-03-10T12:38:19.336 INFO:tasks.workunit.client.0.vm00.stdout:8/957: write d0/d93/d17/db1/fee [243601,126447] 0 2026-03-10T12:38:19.336 
INFO:tasks.workunit.client.0.vm00.stdout:1/998: truncate da/d21/db3/d59/d120/fdc 2442726 0 2026-03-10T12:38:19.338 INFO:tasks.workunit.client.0.vm00.stdout:0/852: dread d3/db/da4/fa7 [0,4194304] 0 2026-03-10T12:38:19.338 INFO:tasks.workunit.client.0.vm00.stdout:0/853: stat d3/d7/db0/dc4/ce8 0 2026-03-10T12:38:19.341 INFO:tasks.workunit.client.0.vm00.stdout:8/958: mkdir d0/d93/d2d/d49/d129 0 2026-03-10T12:38:19.345 INFO:tasks.workunit.client.0.vm00.stdout:1/999: creat da/d21/db3/d59/d120/d72/d121/d12c/f154 x:0 0 0 2026-03-10T12:38:19.348 INFO:tasks.workunit.client.1.vm07.stdout:4/849: getdents d0/d4/df2/df6/d46 0 2026-03-10T12:38:19.350 INFO:tasks.workunit.client.1.vm07.stdout:8/717: rename d1/d3/d11/f46 to d1/d3/d40/d92/fed 0 2026-03-10T12:38:19.353 INFO:tasks.workunit.client.0.vm00.stdout:0/854: symlink d3/d7/db0/dc4/dd5/d10e/l111 0 2026-03-10T12:38:19.363 INFO:tasks.workunit.client.1.vm07.stdout:9/817: creat d5/d13/d6c/d89/f113 x:0 0 0 2026-03-10T12:38:19.369 INFO:tasks.workunit.client.1.vm07.stdout:0/860: dread d0/d14/d5f/d41/f55 [0,4194304] 0 2026-03-10T12:38:19.377 INFO:tasks.workunit.client.1.vm07.stdout:3/772: fdatasync dc/dd/d28/f46 0 2026-03-10T12:38:19.378 INFO:tasks.workunit.client.0.vm00.stdout:0/855: dread d3/d7/d4c/d5b/d38/f8b [0,4194304] 0 2026-03-10T12:38:19.383 INFO:tasks.workunit.client.0.vm00.stdout:0/856: dwrite d3/d7/d4c/d5b/d38/fa2 [0,4194304] 0 2026-03-10T12:38:19.383 INFO:tasks.workunit.client.1.vm07.stdout:8/718: dread d1/d3/d5d/f5f [0,4194304] 0 2026-03-10T12:38:19.384 INFO:tasks.workunit.client.0.vm00.stdout:0/857: chown d3/db/d77/f9e 16 1 2026-03-10T12:38:19.386 INFO:tasks.workunit.client.1.vm07.stdout:5/775: creat d0/d22/d18/d19/de5/f10d x:0 0 0 2026-03-10T12:38:19.389 INFO:tasks.workunit.client.1.vm07.stdout:1/749: mkdir d9/df/dc9/df4 0 2026-03-10T12:38:19.397 INFO:tasks.workunit.client.0.vm00.stdout:0/858: chown d3/d7/d4c/d5b/d38/l10b 5 1 2026-03-10T12:38:19.398 INFO:tasks.workunit.client.1.vm07.stdout:2/659: mkdir 
d0/d42/d26/d38/d4f/dad/ddd 0 2026-03-10T12:38:19.398 INFO:tasks.workunit.client.1.vm07.stdout:6/725: getdents d1/d4/d6/d16 0 2026-03-10T12:38:19.399 INFO:tasks.workunit.client.1.vm07.stdout:4/850: creat d0/d4/d5/d78/dc5/df7/db2/f129 x:0 0 0 2026-03-10T12:38:19.399 INFO:tasks.workunit.client.1.vm07.stdout:6/726: chown d1/d4/d9b 2764471 1 2026-03-10T12:38:19.401 INFO:tasks.workunit.client.1.vm07.stdout:0/861: mknod d0/d14/d5f/d76/d2f/d31/d4f/d9d/dd4/c11e 0 2026-03-10T12:38:19.409 INFO:tasks.workunit.client.0.vm00.stdout:6/674: rename d2/d42/dae to d2/d9f/df6 0 2026-03-10T12:38:19.409 INFO:tasks.workunit.client.1.vm07.stdout:5/776: creat d0/d22/d18/d19/d21/d54/f10e x:0 0 0 2026-03-10T12:38:19.414 INFO:tasks.workunit.client.1.vm07.stdout:2/660: creat d0/d42/d4e/daf/fde x:0 0 0 2026-03-10T12:38:19.415 INFO:tasks.workunit.client.0.vm00.stdout:8/959: rename d0/d93/d36 to d0/dd/dfe/d12a 0 2026-03-10T12:38:19.425 INFO:tasks.workunit.client.1.vm07.stdout:6/727: symlink d1/d4/d6/d16/d1a/d33/led 0 2026-03-10T12:38:19.438 INFO:tasks.workunit.client.1.vm07.stdout:0/862: read - d0/d14/d5f/d76/d2f/d31/d4f/fe1 zero size 2026-03-10T12:38:19.438 INFO:tasks.workunit.client.1.vm07.stdout:9/818: link d5/d13/l5b d5/d13/d9d/l114 0 2026-03-10T12:38:19.438 INFO:tasks.workunit.client.1.vm07.stdout:4/851: mknod d0/d4/d10/d3c/d2b/d54/de1/c12a 0 2026-03-10T12:38:19.438 INFO:tasks.workunit.client.0.vm00.stdout:7/730: dwrite da/d41/f4b [0,4194304] 0 2026-03-10T12:38:19.438 INFO:tasks.workunit.client.0.vm00.stdout:6/675: symlink d2/d16/lf7 0 2026-03-10T12:38:19.438 INFO:tasks.workunit.client.0.vm00.stdout:6/676: chown d2/d16/d29/d31/fd8 5 1 2026-03-10T12:38:19.438 INFO:tasks.workunit.client.0.vm00.stdout:6/677: dread - d2/d14/d7a/db9/f6c zero size 2026-03-10T12:38:19.438 INFO:tasks.workunit.client.0.vm00.stdout:7/731: rename da/d26/d37/f79 to da/d47/dfd/f106 0 2026-03-10T12:38:19.438 INFO:tasks.workunit.client.0.vm00.stdout:7/732: fsync da/d47/fe4 0 2026-03-10T12:38:19.438 
INFO:tasks.workunit.client.0.vm00.stdout:6/678: creat d2/d42/d80/d9d/ff8 x:0 0 0 2026-03-10T12:38:19.438 INFO:tasks.workunit.client.0.vm00.stdout:7/733: dread - da/d26/d37/d56/ddf/ffb zero size 2026-03-10T12:38:19.440 INFO:tasks.workunit.client.1.vm07.stdout:8/719: creat d1/d3/d40/fee x:0 0 0 2026-03-10T12:38:19.440 INFO:tasks.workunit.client.1.vm07.stdout:1/750: sync 2026-03-10T12:38:19.441 INFO:tasks.workunit.client.1.vm07.stdout:1/751: chown d9/f6d 10817460 1 2026-03-10T12:38:19.443 INFO:tasks.workunit.client.0.vm00.stdout:7/734: creat da/d3f/d60/f107 x:0 0 0 2026-03-10T12:38:19.450 INFO:tasks.workunit.client.1.vm07.stdout:4/852: mkdir d0/d4/d5/d78/dc5/df7/db2/dd5/d12b 0 2026-03-10T12:38:19.456 INFO:tasks.workunit.client.1.vm07.stdout:6/728: symlink d1/d4/d6/d16/d1a/d2c/de0/lee 0 2026-03-10T12:38:19.456 INFO:tasks.workunit.client.1.vm07.stdout:5/777: rename d0/d22/d18/d3e/d53/fee to d0/d22/d18/d19/d21/f10f 0 2026-03-10T12:38:19.458 INFO:tasks.workunit.client.1.vm07.stdout:1/752: read d9/df/d29/f82 [2199346,28112] 0 2026-03-10T12:38:19.461 INFO:tasks.workunit.client.1.vm07.stdout:2/661: dread d0/f4 [0,4194304] 0 2026-03-10T12:38:19.466 INFO:tasks.workunit.client.1.vm07.stdout:9/819: symlink d5/d13/d6c/d89/dac/l115 0 2026-03-10T12:38:19.473 INFO:tasks.workunit.client.1.vm07.stdout:4/853: mkdir d0/d4/d5/d78/dc5/d12c 0 2026-03-10T12:38:19.474 INFO:tasks.workunit.client.0.vm00.stdout:8/960: write d0/d93/d17/fa6 [221534,100706] 0 2026-03-10T12:38:19.477 INFO:tasks.workunit.client.1.vm07.stdout:2/662: truncate d0/d42/d1f/d20/fa0 617839 0 2026-03-10T12:38:19.488 INFO:tasks.workunit.client.1.vm07.stdout:4/854: creat d0/d4/d10/d3c/d2b/f12d x:0 0 0 2026-03-10T12:38:19.490 INFO:tasks.workunit.client.1.vm07.stdout:8/720: getdents d1 0 2026-03-10T12:38:19.493 INFO:tasks.workunit.client.1.vm07.stdout:2/663: write d0/d29/d64/d74/f9e [1437681,80919] 0 2026-03-10T12:38:19.502 INFO:tasks.workunit.client.1.vm07.stdout:1/753: dread d9/df/d29/d2b/d31/f72 [0,4194304] 0 
2026-03-10T12:38:19.510 INFO:tasks.workunit.client.0.vm00.stdout:8/961: chown d0/d5c/f124 94 1 2026-03-10T12:38:19.513 INFO:tasks.workunit.client.1.vm07.stdout:6/729: rmdir d1/d4/d6/d16/d49/db7 0 2026-03-10T12:38:19.514 INFO:tasks.workunit.client.1.vm07.stdout:7/697: truncate d0/d52/f97 232373 0 2026-03-10T12:38:19.514 INFO:tasks.workunit.client.1.vm07.stdout:3/773: write dc/d18/f34 [907170,87903] 0 2026-03-10T12:38:19.514 INFO:tasks.workunit.client.0.vm00.stdout:0/859: write d3/d7/d4c/d9d/ff3 [305285,124126] 0 2026-03-10T12:38:19.516 INFO:tasks.workunit.client.0.vm00.stdout:0/860: fsync d3/d7/d4c/d5b/d38/db3/fca 0 2026-03-10T12:38:19.518 INFO:tasks.workunit.client.1.vm07.stdout:0/863: truncate d0/d14/d5f/d76/f78 2822674 0 2026-03-10T12:38:19.518 INFO:tasks.workunit.client.0.vm00.stdout:6/679: dwrite d2/d42/d80/fbd [0,4194304] 0 2026-03-10T12:38:19.526 INFO:tasks.workunit.client.1.vm07.stdout:0/864: dwrite d0/d14/d5f/d41/d6a/f102 [0,4194304] 0 2026-03-10T12:38:19.527 INFO:tasks.workunit.client.0.vm00.stdout:8/962: unlink d0/dd/dfe/d12a/d7d/fb0 0 2026-03-10T12:38:19.528 INFO:tasks.workunit.client.0.vm00.stdout:8/963: chown d0/d58/d68/d10c/f11b 20972541 1 2026-03-10T12:38:19.530 INFO:tasks.workunit.client.1.vm07.stdout:7/698: sync 2026-03-10T12:38:19.537 INFO:tasks.workunit.client.0.vm00.stdout:8/964: dwrite d0/d93/d17/ff9 [0,4194304] 0 2026-03-10T12:38:19.537 INFO:tasks.workunit.client.1.vm07.stdout:4/855: mkdir d0/d4/df2/df6/d46/d12e 0 2026-03-10T12:38:19.539 INFO:tasks.workunit.client.1.vm07.stdout:4/856: truncate d0/d4/d10/d114/f117 766233 0 2026-03-10T12:38:19.545 INFO:tasks.workunit.client.0.vm00.stdout:8/965: chown d0/d46/d7e/f8a 899167 1 2026-03-10T12:38:19.545 INFO:tasks.workunit.client.0.vm00.stdout:7/735: dwrite da/d1b/d40/fca [0,4194304] 0 2026-03-10T12:38:19.566 INFO:tasks.workunit.client.0.vm00.stdout:8/966: dwrite d0/d93/d17/d48/fc7 [0,4194304] 0 2026-03-10T12:38:19.569 INFO:tasks.workunit.client.1.vm07.stdout:9/820: dwrite d5/d13/d2c/de6/f43 
[0,4194304] 0
2026-03-10T12:38:19.576 INFO:tasks.workunit.client.0.vm00.stdout:8/967: stat d0/dd/dfe/d12a/db8/c127 0
2026-03-10T12:38:19.581 INFO:tasks.workunit.client.1.vm07.stdout:9/821: dwrite d5/d13/d6c/da4/fd0 [0,4194304] 0
2026-03-10T12:38:19.586 INFO:tasks.workunit.client.0.vm00.stdout:6/680: rmdir d2/da/dc/d94 39
2026-03-10T12:38:19.590 INFO:tasks.workunit.client.0.vm00.stdout:6/681: truncate d2/d16/d29/d31/d88/dd5/fe8 275290 0
2026-03-10T12:38:19.593 INFO:tasks.workunit.client.0.vm00.stdout:7/736: mkdir da/d26/d37/d56/ddf/d108 0
2026-03-10T12:38:19.594 INFO:tasks.workunit.client.0.vm00.stdout:6/682: write d2/d14/d7a/fe1 [28650,6603] 0
2026-03-10T12:38:19.596 INFO:tasks.workunit.client.0.vm00.stdout:8/968: creat d0/dd/f12b x:0 0 0
2026-03-10T12:38:19.596 INFO:tasks.workunit.client.0.vm00.stdout:8/969: stat d0/d46/d7e/f8a 0
2026-03-10T12:38:19.598 INFO:tasks.workunit.client.0.vm00.stdout:7/737: symlink da/d25/d2c/d82/d68/l109 0
2026-03-10T12:38:19.603 INFO:tasks.workunit.client.1.vm07.stdout:8/721: dread d1/d3/d11/f86 [0,4194304] 0
2026-03-10T12:38:19.611 INFO:tasks.workunit.client.1.vm07.stdout:0/865: mknod d0/d14/d5f/d76/d2f/d31/d4f/da8/c11f 0
2026-03-10T12:38:19.611 INFO:tasks.workunit.client.1.vm07.stdout:5/778: link d0/l7 d0/d22/d18/l110 0
2026-03-10T12:38:19.611 INFO:tasks.workunit.client.0.vm00.stdout:8/970: creat d0/d46/f12c x:0 0
2026-03-10T12:38:19.611 INFO:tasks.workunit.client.1.vm07.stdout:0/866: stat d0/d14/d7c/c82 0
2026-03-10T12:38:19.612 INFO:tasks.workunit.client.1.vm07.stdout:4/857: rmdir d0/d4/d10/d5f/d6d 39
2026-03-10T12:38:19.612 INFO:tasks.workunit.client.1.vm07.stdout:2/664: symlink d0/d29/d64/db5/dbb/dca/ldf 0
2026-03-10T12:38:19.621 INFO:tasks.workunit.client.1.vm07.stdout:2/665: dwrite d0/d80/d93/fce [0,4194304] 0
2026-03-10T12:38:19.640 INFO:tasks.workunit.client.0.vm00.stdout:8/971: read d0/dd/d38/f111 [65297,73130] 0
2026-03-10T12:38:19.648 INFO:tasks.workunit.client.1.vm07.stdout:1/754: write d9/df/d55/d9f/fb3 [17809,128157] 0
2026-03-10T12:38:19.648 INFO:tasks.workunit.client.0.vm00.stdout:7/738: dread - da/d3f/d71/f95 zero size
2026-03-10T12:38:19.648 INFO:tasks.workunit.client.0.vm00.stdout:0/861: write d3/db/d24/fb1 [174319,4263] 0
2026-03-10T12:38:19.650 INFO:tasks.workunit.client.0.vm00.stdout:7/739: write da/d26/f27 [8814597,42681] 0
2026-03-10T12:38:19.658 INFO:tasks.workunit.client.1.vm07.stdout:9/822: rename d5/d16/d23/l7c to d5/d13/d57/d3e/l116 0
2026-03-10T12:38:19.663 INFO:tasks.workunit.client.0.vm00.stdout:7/740: rmdir da/d41/d7b/d9d/dc8 39
2026-03-10T12:38:19.663 INFO:tasks.workunit.client.0.vm00.stdout:7/741: chown da/d26/d37/l9e 3 1
2026-03-10T12:38:19.679 INFO:tasks.workunit.client.0.vm00.stdout:7/742: creat da/d26/d37/d56/ddf/d108/f10a x:0 0 0
2026-03-10T12:38:19.682 INFO:tasks.workunit.client.1.vm07.stdout:0/867: truncate d0/d14/f37 2812714 0
2026-03-10T12:38:19.683 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:19 vm00.local ceph-mon[50686]: pgmap v6: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail
2026-03-10T12:38:19.683 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:19 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb'
2026-03-10T12:38:19.683 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:19 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb'
2026-03-10T12:38:19.683 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:19 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb'
2026-03-10T12:38:19.683 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:19 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb'
2026-03-10T12:38:19.686 INFO:tasks.workunit.client.1.vm07.stdout:2/666: symlink d0/d5b/le0 0
2026-03-10T12:38:19.695 INFO:tasks.workunit.client.0.vm00.stdout:7/743: truncate da/d26/d37/f4a 1686129 0
2026-03-10T12:38:19.695 INFO:tasks.workunit.client.1.vm07.stdout:9/823: truncate d5/d16/d23/d26/d68/fa0 1922751 0
2026-03-10T12:38:19.695 INFO:tasks.workunit.client.1.vm07.stdout:1/755: fsync d9/df/d29/f49 0
2026-03-10T12:38:19.695 INFO:tasks.workunit.client.1.vm07.stdout:3/774: link dc/dd/d1f/dac/fd7 dc/d18/d99/da3/def/f108 0
2026-03-10T12:38:19.695 INFO:tasks.workunit.client.1.vm07.stdout:7/699: link d0/l2d d0/d57/dd6/d80/lea 0
2026-03-10T12:38:19.698 INFO:tasks.workunit.client.0.vm00.stdout:7/744: readlink da/d26/d37/l9e 0
2026-03-10T12:38:19.701 INFO:tasks.workunit.client.1.vm07.stdout:7/700: dwrite d0/d52/fa4 [0,4194304] 0
2026-03-10T12:38:19.708 INFO:tasks.workunit.client.1.vm07.stdout:3/775: dread dc/dd/d43/d5c/f9d [0,4194304] 0
2026-03-10T12:38:19.711 INFO:tasks.workunit.client.0.vm00.stdout:7/745: truncate da/d3f/d60/fb1 924643 0
2026-03-10T12:38:19.711 INFO:tasks.workunit.client.0.vm00.stdout:7/746: chown f0 3 1
2026-03-10T12:38:19.722 INFO:tasks.workunit.client.0.vm00.stdout:7/747: creat da/d25/d2c/d82/f10b x:0 0 0
2026-03-10T12:38:19.732 INFO:tasks.workunit.client.1.vm07.stdout:6/730: dwrite d1/d4/d4a/f56 [0,4194304] 0
2026-03-10T12:38:19.744 INFO:tasks.workunit.client.0.vm00.stdout:6/683: write d2/da/fd9 [2891699,120213] 0
2026-03-10T12:38:19.745 INFO:tasks.workunit.client.1.vm07.stdout:7/701: creat d0/d61/db4/d8a/feb x:0 0 0
2026-03-10T12:38:19.748 INFO:tasks.workunit.client.0.vm00.stdout:6/684: unlink d2/d16/d29/d31/fd8 0
2026-03-10T12:38:19.751 INFO:tasks.workunit.client.0.vm00.stdout:7/748: link da/d26/d37/d56/lfa da/d25/d2e/d4c/l10c 0
2026-03-10T12:38:19.752 INFO:tasks.workunit.client.1.vm07.stdout:0/868: mkdir d0/d14/d5f/d76/d2f/d31/d120 0
2026-03-10T12:38:19.755 INFO:tasks.workunit.client.0.vm00.stdout:6/685: unlink d2/da/c8c 0
2026-03-10T12:38:19.756 INFO:tasks.workunit.client.1.vm07.stdout:4/858: link d0/d4/d10/d3c/d2b/l111 d0/d4/d10/d9a/db9/l12f 0
2026-03-10T12:38:19.759 INFO:tasks.workunit.client.0.vm00.stdout:7/749: fsync da/d3f/d71/f8c 0
2026-03-10T12:38:19.761 INFO:tasks.workunit.client.1.vm07.stdout:2/667: mkdir d0/de1 0
2026-03-10T12:38:19.764 INFO:tasks.workunit.client.0.vm00.stdout:8/972: write d0/f9d [4387486,12848] 0
2026-03-10T12:38:19.765 INFO:tasks.workunit.client.0.vm00.stdout:8/973: stat d0/d46/d7e/lf6 0
2026-03-10T12:38:19.771 INFO:tasks.workunit.client.0.vm00.stdout:7/750: symlink da/d25/d2c/d82/d68/l10d 0
2026-03-10T12:38:19.772 INFO:tasks.workunit.client.0.vm00.stdout:8/974: mknod d0/d46/d7e/c12d 0
2026-03-10T12:38:19.773 INFO:tasks.workunit.client.0.vm00.stdout:8/975: symlink d0/d93/d17/db1/l12e 0
2026-03-10T12:38:19.779 INFO:tasks.workunit.client.1.vm07.stdout:3/776: mknod dc/dd/d43/d76/c109 0
2026-03-10T12:38:19.781 INFO:tasks.workunit.client.1.vm07.stdout:4/859: creat d0/d4/df2/df6/d46/d76/f130 x:0 0 0
2026-03-10T12:38:19.786 INFO:tasks.workunit.client.0.vm00.stdout:7/751: dread da/d41/d48/fae [0,4194304] 0
2026-03-10T12:38:19.787 INFO:tasks.workunit.client.0.vm00.stdout:7/752: write da/d25/f5a [3370051,69972] 0
2026-03-10T12:38:19.788 INFO:tasks.workunit.client.1.vm07.stdout:8/722: link d1/d3/d6/lc0 d1/d3/db2/lef 0
2026-03-10T12:38:19.795 INFO:tasks.workunit.client.1.vm07.stdout:9/824: rename d5/d13/d22/l3f to d5/l117 0
2026-03-10T12:38:19.796 INFO:tasks.workunit.client.0.vm00.stdout:8/976: dread d0/d58/d68/f74 [0,4194304] 0
2026-03-10T12:38:19.798 INFO:tasks.workunit.client.0.vm00.stdout:8/977: rename d0/d93/fcc to d0/d46/d89/f12f 0
2026-03-10T12:38:19.799 INFO:tasks.workunit.client.0.vm00.stdout:8/978: symlink d0/dd/dfe/d12a/d5b/l130 0
2026-03-10T12:38:19.800 INFO:tasks.workunit.client.0.vm00.stdout:8/979: chown d0/d46/d7e/lf6 70 1
2026-03-10T12:38:19.802 INFO:tasks.workunit.client.0.vm00.stdout:8/980: symlink d0/d93/d17/db1/d113/d126/l131 0
2026-03-10T12:38:19.802 INFO:tasks.workunit.client.1.vm07.stdout:6/731: creat d1/d4/d9b/de1/fef x:0 0 0
2026-03-10T12:38:19.803 INFO:tasks.workunit.client.0.vm00.stdout:8/981: mknod d0/d93/d17/db1/d113/c132 0
2026-03-10T12:38:19.807 INFO:tasks.workunit.client.1.vm07.stdout:5/779: dwrite d0/d22/d18/d3e/d53/d9e/f8c [0,4194304] 0
2026-03-10T12:38:19.808 INFO:tasks.workunit.client.0.vm00.stdout:0/862: dwrite d3/d7/d4c/d5b/f2a [0,4194304] 0
2026-03-10T12:38:19.811 INFO:tasks.workunit.client.1.vm07.stdout:5/780: stat d0/d22/d18/d19/d2e/da9/fb5 0
2026-03-10T12:38:19.822 INFO:tasks.workunit.client.0.vm00.stdout:7/753: creat da/d25/d2c/d82/d68/f10e x:0 0 0
2026-03-10T12:38:19.829 INFO:tasks.workunit.client.1.vm07.stdout:3/777: creat dc/dd/d28/d7a/d8e/f10a x:0 0 0
2026-03-10T12:38:19.829 INFO:tasks.workunit.client.1.vm07.stdout:3/778: chown dc/dd/d28/c89 2548800 1
2026-03-10T12:38:19.837 INFO:tasks.workunit.client.0.vm00.stdout:0/863: fsync d3/d22/f55 0
2026-03-10T12:38:19.838 INFO:tasks.workunit.client.1.vm07.stdout:4/860: creat d0/d4/d10/d3c/d2b/d2d/d9c/f131 x:0 0 0
2026-03-10T12:38:19.842 INFO:tasks.workunit.client.1.vm07.stdout:2/668: fsync d0/d42/d26/f48 0
2026-03-10T12:38:19.853 INFO:tasks.workunit.client.1.vm07.stdout:1/756: rename d9/f6d to d9/d2d/d4f/d75/de3/ff5 0
2026-03-10T12:38:19.858 INFO:tasks.workunit.client.0.vm00.stdout:7/754: mknod da/d25/d2c/d82/d68/df8/c10f 0
2026-03-10T12:38:19.858 INFO:tasks.workunit.client.0.vm00.stdout:7/755: chown da/d41/d7b/d9d/dba/fe3 5496484 1
2026-03-10T12:38:19.858 INFO:tasks.workunit.client.1.vm07.stdout:9/825: creat d5/d13/d2c/f118 x:0 0 0
2026-03-10T12:38:19.866 INFO:tasks.workunit.client.1.vm07.stdout:6/732: fdatasync d1/d4/d6/d16/d1a/f9f 0
2026-03-10T12:38:19.872 INFO:tasks.workunit.client.0.vm00.stdout:8/982: dwrite d0/dd/dfe/d12a/f41 [4194304,4194304] 0
2026-03-10T12:38:19.874 INFO:tasks.workunit.client.1.vm07.stdout:3/779: sync
2026-03-10T12:38:19.885 INFO:tasks.workunit.client.1.vm07.stdout:5/781: creat d0/d22/d18/d3e/d5d/db6/f111 x:0 0 0
2026-03-10T12:38:19.893 INFO:tasks.workunit.client.0.vm00.stdout:0/864: rmdir d3/d7/d4c/dcc/dea/d102/d106 0
2026-03-10T12:38:19.910 INFO:tasks.workunit.client.1.vm07.stdout:4/861: rmdir d0/d4/d10/d114 39
2026-03-10T12:38:19.912 INFO:tasks.workunit.client.1.vm07.stdout:7/702: write d0/d61/db4/f54 [586373,92724] 0
2026-03-10T12:38:19.914 INFO:tasks.workunit.client.1.vm07.stdout:2/669: dread d0/f73 [0,4194304] 0
2026-03-10T12:38:19.915 INFO:tasks.workunit.client.1.vm07.stdout:2/670: chown d0/d80/d93 153445 1
2026-03-10T12:38:19.916 INFO:tasks.workunit.client.0.vm00.stdout:0/865: rmdir d3/db/da4/de7 39
2026-03-10T12:38:19.919 INFO:tasks.workunit.client.1.vm07.stdout:8/723: dwrite d1/d3/d40/d92/dba/fc3 [0,4194304] 0
2026-03-10T12:38:19.921 INFO:tasks.workunit.client.1.vm07.stdout:0/869: rename d0/d14/d5f/d3b/lab to d0/d14/d5f/d76/d2f/d31/df0/l121 0
2026-03-10T12:38:19.924 INFO:tasks.workunit.client.0.vm00.stdout:6/686: dwrite d2/da/dc/f45 [0,4194304] 0
2026-03-10T12:38:19.926 INFO:tasks.workunit.client.0.vm00.stdout:8/983: dwrite d0/d46/d7e/fd6 [0,4194304] 0
2026-03-10T12:38:19.928 INFO:tasks.workunit.client.1.vm07.stdout:8/724: dwrite d1/d3/d40/fee [0,4194304] 0
2026-03-10T12:38:19.935 INFO:tasks.workunit.client.0.vm00.stdout:0/866: mkdir d3/db/d24/d25/d112 0
2026-03-10T12:38:19.935 INFO:tasks.workunit.client.0.vm00.stdout:8/984: getdents d0/d5c/d123 0
2026-03-10T12:38:19.941 INFO:tasks.workunit.client.0.vm00.stdout:6/687: fdatasync d2/d16/d29/f54 0
2026-03-10T12:38:19.943 INFO:tasks.workunit.client.0.vm00.stdout:8/985: creat d0/d93/d17/f133 x:0 0 0
2026-03-10T12:38:19.943 INFO:tasks.workunit.client.0.vm00.stdout:8/986: stat d0/lc4 0
2026-03-10T12:38:19.943 INFO:tasks.workunit.client.0.vm00.stdout:0/867: rmdir d3/d7/d4c/d9d 39
2026-03-10T12:38:19.964 INFO:tasks.workunit.client.0.vm00.stdout:6/688: getdents d2/d14/dbb/dd6 0
2026-03-10T12:38:19.964 INFO:tasks.workunit.client.0.vm00.stdout:6/689: stat d2/da/dbf 0
2026-03-10T12:38:19.965 INFO:tasks.workunit.client.0.vm00.stdout:6/690: dread - d2/d42/d80/d9d/fca zero size
2026-03-10T12:38:19.970 INFO:tasks.workunit.client.0.vm00.stdout:0/868: dread d3/f9c [0,4194304] 0
2026-03-10T12:38:19.976 INFO:tasks.workunit.client.0.vm00.stdout:6/691: creat d2/da/dbf/ded/ff9 x:0 0 0
2026-03-10T12:38:19.979 INFO:tasks.workunit.client.0.vm00.stdout:0/869: symlink d3/d7/l113 0
2026-03-10T12:38:19.982 INFO:tasks.workunit.client.0.vm00.stdout:8/987: dread d0/dd/fbc [0,4194304] 0
2026-03-10T12:38:19.992 INFO:tasks.workunit.client.0.vm00.stdout:8/988: unlink d0/d93/l2c 0
2026-03-10T12:38:19.995 INFO:tasks.workunit.client.0.vm00.stdout:0/870: dread d3/d7/db0/ff2 [0,4194304] 0
2026-03-10T12:38:19.995 INFO:tasks.workunit.client.1.vm07.stdout:6/733: fdatasync d1/d4/d6/d43/d65/f7f 0
2026-03-10T12:38:19.998 INFO:tasks.workunit.client.0.vm00.stdout:8/989: creat d0/d93/d17/da2/f134 x:0 0 0
2026-03-10T12:38:20.002 INFO:tasks.workunit.client.1.vm07.stdout:5/782: truncate d0/d22/f89 4686931 0
2026-03-10T12:38:20.002 INFO:tasks.workunit.client.1.vm07.stdout:3/780: read - dc/dd/fbc zero size
2026-03-10T12:38:20.002 INFO:tasks.workunit.client.0.vm00.stdout:8/990: creat d0/d93/d17/da2/f135 x:0 0 0
2026-03-10T12:38:20.004 INFO:tasks.workunit.client.1.vm07.stdout:4/862: truncate d0/d4/d10/d9a/db9/fef 2084019 0
2026-03-10T12:38:20.006 INFO:tasks.workunit.client.1.vm07.stdout:1/757: mknod d9/d2d/d4f/d75/cf6 0
2026-03-10T12:38:20.007 INFO:tasks.workunit.client.0.vm00.stdout:7/756: truncate da/f10 6798140 0
2026-03-10T12:38:20.009 INFO:tasks.workunit.client.0.vm00.stdout:7/757: write da/f17 [939091,16263] 0
2026-03-10T12:38:20.011 INFO:tasks.workunit.client.0.vm00.stdout:7/758: chown da/d26/d37/d56/ddf/d108 6969 1
2026-03-10T12:38:20.015 INFO:tasks.workunit.client.0.vm00.stdout:8/991: write d0/d58/fbf [647495,2313] 0
2026-03-10T12:38:20.016 INFO:tasks.workunit.client.0.vm00.stdout:0/871: rename d3/d7/d4c/d5b/f2a to d3/d7/d4c/d5b/d38/d44/df9/f114 0
2026-03-10T12:38:20.017 INFO:tasks.workunit.client.0.vm00.stdout:0/872: write d3/d40/f10a [606967,110913] 0
2026-03-10T12:38:20.018 INFO:tasks.workunit.client.0.vm00.stdout:8/992: creat d0/d93/d2d/f136 x:0 0 0
2026-03-10T12:38:20.018 INFO:tasks.workunit.client.0.vm00.stdout:0/873: stat d3/d7/d4c/dcc 0
2026-03-10T12:38:20.020 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:19 vm07.local ceph-mon[58582]: pgmap v6: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail
2026-03-10T12:38:20.020 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:19 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb'
2026-03-10T12:38:20.020 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:19 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb'
2026-03-10T12:38:20.020 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:19 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb'
2026-03-10T12:38:20.020 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:19 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb'
2026-03-10T12:38:20.021 INFO:tasks.workunit.client.0.vm00.stdout:8/993: creat d0/d5c/f137 x:0 0 0
2026-03-10T12:38:20.030 INFO:tasks.workunit.client.0.vm00.stdout:6/692: write d2/da/dc/f45 [4432773,18844] 0
2026-03-10T12:38:20.034 INFO:tasks.workunit.client.0.vm00.stdout:8/994: write d0/d93/d2d/dc8/ffb [827954,104591] 0
2026-03-10T12:38:20.046 INFO:tasks.workunit.client.1.vm07.stdout:0/870: truncate d0/d14/d5f/d76/d2f/d31/d4f/d9d/fda 91876 0
2026-03-10T12:38:20.050 INFO:tasks.workunit.client.0.vm00.stdout:0/874: dread d3/d7/d4c/d5b/d38/db3/fca [0,4194304] 0
2026-03-10T12:38:20.053 INFO:tasks.workunit.client.0.vm00.stdout:8/995: dwrite d0/d46/d89/f12f [0,4194304] 0
2026-03-10T12:38:20.062 INFO:tasks.workunit.client.0.vm00.stdout:8/996: fsync d0/dd/d38/f3d 0
2026-03-10T12:38:20.065 INFO:tasks.workunit.client.0.vm00.stdout:8/997: unlink d0/d93/d17/db1/l12e 0
2026-03-10T12:38:20.065 INFO:tasks.workunit.client.0.vm00.stdout:7/759: rename da/d3f/d71/f100 to da/d3f/d60/f110 0
2026-03-10T12:38:20.065 INFO:tasks.workunit.client.0.vm00.stdout:8/998: stat d0/dd/dfe/d12a/d7d/ff5 0
2026-03-10T12:38:20.068 INFO:tasks.workunit.client.0.vm00.stdout:6/693: truncate d2/d16/d74/f62 4465474 0
2026-03-10T12:38:20.072 INFO:tasks.workunit.client.1.vm07.stdout:6/734: creat d1/d4/d6/d16/ff0 x:0 0 0
2026-03-10T12:38:20.072 INFO:tasks.workunit.client.1.vm07.stdout:7/703: write d0/d57/dd6/d80/fd2 [793859,113708] 0
2026-03-10T12:38:20.073 INFO:tasks.workunit.client.1.vm07.stdout:3/781: creat dc/d18/d2d/f10b x:0 0 0
2026-03-10T12:38:20.074 INFO:tasks.workunit.client.0.vm00.stdout:8/999: symlink d0/d93/d17/db1/d113/d126/l138 0
2026-03-10T12:38:20.074 INFO:tasks.workunit.client.1.vm07.stdout:2/671: link d0/d29/d64/f67 d0/d42/d1f/d20/fe2 0
2026-03-10T12:38:20.075 INFO:tasks.workunit.client.0.vm00.stdout:0/875: symlink d3/d7/d4c/d5b/l115 0
2026-03-10T12:38:20.078 INFO:tasks.workunit.client.0.vm00.stdout:7/760: creat da/d41/f111 x:0 0 0
2026-03-10T12:38:20.078 INFO:tasks.workunit.client.1.vm07.stdout:9/826: creat d5/f119 x:0 0 0
2026-03-10T12:38:20.079 INFO:tasks.workunit.client.0.vm00.stdout:0/876: symlink d3/d7/d4c/dcc/l116 0
2026-03-10T12:38:20.079 INFO:tasks.workunit.client.1.vm07.stdout:0/871: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/f122 x:0 0 0
2026-03-10T12:38:20.079 INFO:tasks.workunit.client.0.vm00.stdout:7/761: chown da/d1b/d40/cc1 787 1
2026-03-10T12:38:20.089 INFO:tasks.workunit.client.1.vm07.stdout:7/704: dwrite d0/d61/db4/fad [0,4194304] 0
2026-03-10T12:38:20.094 INFO:tasks.workunit.client.1.vm07.stdout:5/783: symlink d0/d22/d18/d19/d21/dc2/df0/l112 0
2026-03-10T12:38:20.095 INFO:tasks.workunit.client.1.vm07.stdout:5/784: stat d0/d22/d18/d19/d72/dcc 0
2026-03-10T12:38:20.098 INFO:tasks.workunit.client.0.vm00.stdout:6/694: sync
2026-03-10T12:38:20.109 INFO:tasks.workunit.client.0.vm00.stdout:0/877: fdatasync d3/d7/d4c/d9d/ffb 0
2026-03-10T12:38:20.113 INFO:tasks.workunit.client.0.vm00.stdout:0/878: write d3/d7/db0/dc4/dd5/d10e/fce [947518,12994] 0
2026-03-10T12:38:20.116 INFO:tasks.workunit.client.0.vm00.stdout:0/879: sync
2026-03-10T12:38:20.123 INFO:tasks.workunit.client.1.vm07.stdout:9/827: mknod d5/d1f/d75/c11a 0
2026-03-10T12:38:20.123 INFO:tasks.workunit.client.1.vm07.stdout:0/872: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dd9/f123 x:0 0 0
2026-03-10T12:38:20.137 INFO:tasks.workunit.client.1.vm07.stdout:1/758: dwrite d9/df/dc2/f57 [4194304,4194304] 0
2026-03-10T12:38:20.138 INFO:tasks.workunit.client.1.vm07.stdout:1/759: stat d9/df/dc9 0
2026-03-10T12:38:20.150 INFO:tasks.workunit.client.1.vm07.stdout:4/863: rmdir d0/d4/d5/d78/dc5/df7/d119 0
2026-03-10T12:38:20.151 INFO:tasks.workunit.client.0.vm00.stdout:0/880: unlink d3/db/d24/d25/l67 0
2026-03-10T12:38:20.152 INFO:tasks.workunit.client.0.vm00.stdout:7/762: creat da/d41/d48/f112 x:0 0 0
2026-03-10T12:38:20.153 INFO:tasks.workunit.client.1.vm07.stdout:8/725: link d1/d3/le1 d1/d3/db2/dcd/db8/lf0 0
2026-03-10T12:38:20.153 INFO:tasks.workunit.client.1.vm07.stdout:2/672: dwrite d0/d42/f1b [0,4194304] 0
2026-03-10T12:38:20.156 INFO:tasks.workunit.client.1.vm07.stdout:6/735: dwrite d1/d4/d6/d43/d65/f86 [0,4194304] 0
2026-03-10T12:38:20.164 INFO:tasks.workunit.client.1.vm07.stdout:9/828: readlink d5/d13/d57/d3e/l116 0
2026-03-10T12:38:20.164 INFO:tasks.workunit.client.1.vm07.stdout:9/829: chown d5/d13/d2c/de6/f56 0 1
2026-03-10T12:38:20.169 INFO:tasks.workunit.client.1.vm07.stdout:0/873: rename d0/d14/d5f/d41/d6a/f102 to d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/f124 0
2026-03-10T12:38:20.171 INFO:tasks.workunit.client.1.vm07.stdout:3/782: link dc/d18/c2e dc/d18/d99/da3/c10c 0
2026-03-10T12:38:20.172 INFO:tasks.workunit.client.0.vm00.stdout:0/881: truncate d3/db/d24/d25/fb8 6535763 0
2026-03-10T12:38:20.175 INFO:tasks.workunit.client.0.vm00.stdout:6/695: getdents d2/d9f/dce 0
2026-03-10T12:38:20.176 INFO:tasks.workunit.client.0.vm00.stdout:7/763: dread da/d3f/d60/fb1 [0,4194304] 0
2026-03-10T12:38:20.182 INFO:tasks.workunit.client.1.vm07.stdout:4/864: truncate d0/d4/df2/df6/f27 605768 0
2026-03-10T12:38:20.182 INFO:tasks.workunit.client.1.vm07.stdout:0/874: dwrite d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dd9/f123 [0,4194304] 0
2026-03-10T12:38:20.190 INFO:tasks.workunit.client.0.vm00.stdout:0/882: dread d3/d7/d3c/f72 [0,4194304] 0
2026-03-10T12:38:20.190 INFO:tasks.workunit.client.1.vm07.stdout:8/726: mkdir d1/d3/d40/d92/dba/df1 0
2026-03-10T12:38:20.191 INFO:tasks.workunit.client.1.vm07.stdout:2/673: symlink d0/d29/d64/d74/d75/le3 0
2026-03-10T12:38:20.227 INFO:tasks.workunit.client.0.vm00.stdout:0/883: dread d3/d7/d4c/d5b/d38/f93 [0,4194304] 0
2026-03-10T12:38:20.239 INFO:tasks.workunit.client.0.vm00.stdout:0/884: symlink d3/d22/d3a/deb/l117 0
2026-03-10T12:38:20.244 INFO:tasks.workunit.client.1.vm07.stdout:5/785: creat d0/d22/d18/d19/d21/f113 x:0 0 0
2026-03-10T12:38:20.265 INFO:tasks.workunit.client.0.vm00.stdout:0/885: mkdir d3/d22/da5/d118 0
2026-03-10T12:38:20.268 INFO:tasks.workunit.client.0.vm00.stdout:6/696: dwrite d2/da/dc/f27 [0,4194304] 0
2026-03-10T12:38:20.276 INFO:tasks.workunit.client.0.vm00.stdout:6/697: chown d2/d16/d29/d31/d88/d92/fb6 2459 1
2026-03-10T12:38:20.284 INFO:tasks.workunit.client.0.vm00.stdout:0/886: creat d3/d22/d3a/deb/f119 x:0 0 0
2026-03-10T12:38:20.287 INFO:tasks.workunit.client.1.vm07.stdout:8/727: rename d1/d3/db2/dcd/db8/lf0 to d1/d3/d6/d50/d70/dd4/lf2 0
2026-03-10T12:38:20.288 INFO:tasks.workunit.client.0.vm00.stdout:7/764: write da/d26/d37/d56/f6c [970685,26651] 0
2026-03-10T12:38:20.294 INFO:tasks.workunit.client.1.vm07.stdout:8/728: chown d1/d3/d40/d92/dba 70 1
2026-03-10T12:38:20.305 INFO:tasks.workunit.client.1.vm07.stdout:3/783: truncate dc/dd/d1f/dac/fd7 1287991 0
2026-03-10T12:38:20.305 INFO:tasks.workunit.client.1.vm07.stdout:6/736: creat d1/d4/d6/d46/d4d/dc7/dd9/ddc/ff1 x:0 0 0
2026-03-10T12:38:20.306 INFO:tasks.workunit.client.0.vm00.stdout:0/887: symlink d3/d7/d3c/l11a 0
2026-03-10T12:38:20.306 INFO:tasks.workunit.client.0.vm00.stdout:7/765: dread da/d3f/f93 [0,4194304] 0
2026-03-10T12:38:20.310 INFO:tasks.workunit.client.1.vm07.stdout:7/705: getdents d0/d61/db4 0
2026-03-10T12:38:20.311 INFO:tasks.workunit.client.1.vm07.stdout:5/786: rmdir d0/d22/d18/d19 39
2026-03-10T12:38:20.319 INFO:tasks.workunit.client.0.vm00.stdout:0/888: dwrite d3/d7/db0/ff2 [0,4194304] 0
2026-03-10T12:38:20.321 INFO:tasks.workunit.client.1.vm07.stdout:9/830: dwrite d5/d13/d6c/d7a/fe5 [0,4194304] 0
2026-03-10T12:38:20.323 INFO:tasks.workunit.client.1.vm07.stdout:1/760: dwrite d9/d2d/d4f/d75/de3/ff5 [0,4194304] 0
2026-03-10T12:38:20.324 INFO:tasks.workunit.client.1.vm07.stdout:4/865: dwrite d0/d4/df2/df6/f87 [0,4194304] 0
2026-03-10T12:38:20.328 INFO:tasks.workunit.client.0.vm00.stdout:6/698: write d2/da/fda [74575,70544] 0
2026-03-10T12:38:20.333 INFO:tasks.workunit.client.0.vm00.stdout:7/766: rmdir da/d26/d50 39
2026-03-10T12:38:20.337 INFO:tasks.workunit.client.1.vm07.stdout:8/729: fsync d1/d3/d6/d50/fc8 0
2026-03-10T12:38:20.342 INFO:tasks.workunit.client.1.vm07.stdout:3/784: symlink dc/dd/d28/d7a/d8e/l10d 0
2026-03-10T12:38:20.344 INFO:tasks.workunit.client.0.vm00.stdout:6/699: read d2/d9f/df6/fc8 [180757,24526] 0
2026-03-10T12:38:20.346 INFO:tasks.workunit.client.1.vm07.stdout:1/761: mknod d9/d2d/d4f/d75/de3/cf7 0
2026-03-10T12:38:20.347 INFO:tasks.workunit.client.1.vm07.stdout:1/762: readlink d9/d2d/d4f/d75/l8a 0
2026-03-10T12:38:20.348 INFO:tasks.workunit.client.1.vm07.stdout:1/763: chown d9/d2d/d4f/d75/fab 316 1
2026-03-10T12:38:20.350 INFO:tasks.workunit.client.1.vm07.stdout:9/831: rename d5/d13/d2c/de6/d64/f70 to d5/d13/d2c/de6/d76/f11b 0
2026-03-10T12:38:20.361 INFO:tasks.workunit.client.1.vm07.stdout:0/875: creat d0/d14/d5f/d76/d2f/d31/d79/f125 x:0 0 0
2026-03-10T12:38:20.363 INFO:tasks.workunit.client.1.vm07.stdout:7/706: dread d0/d57/dd6/d80/fac [0,4194304] 0
2026-03-10T12:38:20.374 INFO:tasks.workunit.client.0.vm00.stdout:7/767: fsync da/d25/d2e/d4c/fe7 0
2026-03-10T12:38:20.391 INFO:tasks.workunit.client.0.vm00.stdout:7/768: mknod da/d47/c113 0
2026-03-10T12:38:20.391 INFO:tasks.workunit.client.0.vm00.stdout:7/769: chown da/d25/d2c/d82/d68/df8/c10f 1 1
2026-03-10T12:38:20.391 INFO:tasks.workunit.client.1.vm07.stdout:8/730: fsync d1/d3/d6/faf 0
2026-03-10T12:38:20.392 INFO:tasks.workunit.client.1.vm07.stdout:4/866: chown d0/d4/d10/f4b 7936 1
2026-03-10T12:38:20.393 INFO:tasks.workunit.client.1.vm07.stdout:4/867: write d0/d4/df2/df6/d46/d76/fae [779990,14076] 0
2026-03-10T12:38:20.393 INFO:tasks.workunit.client.1.vm07.stdout:1/764: symlink d9/d2d/d80/lf8 0
2026-03-10T12:38:20.394 INFO:tasks.workunit.client.1.vm07.stdout:3/785: rename dc/dd/db5/f5a to dc/d18/d2d/de5/f10e 0
2026-03-10T12:38:20.400 INFO:tasks.workunit.client.0.vm00.stdout:7/770: mkdir da/d41/d48/d114 0
2026-03-10T12:38:20.410 INFO:tasks.workunit.client.1.vm07.stdout:1/765: creat d9/d2d/d4f/d75/de3/ff9 x:0 0 0
2026-03-10T12:38:20.418 INFO:tasks.workunit.client.1.vm07.stdout:7/707: symlink d0/d57/lec 0
2026-03-10T12:38:20.419 INFO:tasks.workunit.client.1.vm07.stdout:3/786: mknod dc/dd/d43/d76/d95/da0/c10f 0
2026-03-10T12:38:20.419 INFO:tasks.workunit.client.1.vm07.stdout:2/674: link d0/d29/d64/d74/f8e d0/fe4 0
2026-03-10T12:38:20.419 INFO:tasks.workunit.client.1.vm07.stdout:2/675: chown d0/d42/d1f/d90 1836407 1
2026-03-10T12:38:20.424 INFO:tasks.workunit.client.1.vm07.stdout:0/876: sync
2026-03-10T12:38:20.428 INFO:tasks.workunit.client.1.vm07.stdout:7/708: truncate d0/d61/f69 1611382 0
2026-03-10T12:38:20.432 INFO:tasks.workunit.client.1.vm07.stdout:2/676: symlink d0/d42/d1f/le5 0
2026-03-10T12:38:20.433 INFO:tasks.workunit.client.1.vm07.stdout:2/677: read d0/d42/d26/f2e [3691817,63296] 0
2026-03-10T12:38:20.442 INFO:tasks.workunit.client.1.vm07.stdout:9/832: getdents d5/d69/d93 0
2026-03-10T12:38:20.445 INFO:tasks.workunit.client.1.vm07.stdout:6/737: write d1/d4/d6/f91 [278081,66581] 0
2026-03-10T12:38:20.445 INFO:tasks.workunit.client.0.vm00.stdout:0/889: write d3/d7/d4c/d5b/f57 [1587549,48078] 0
2026-03-10T12:38:20.448 INFO:tasks.workunit.client.0.vm00.stdout:6/700: dwrite d2/d42/d80/d89/fa5 [0,4194304] 0
2026-03-10T12:38:20.451 INFO:tasks.workunit.client.1.vm07.stdout:5/787: dwrite d0/d22/d18/d19/d36/fc1 [0,4194304] 0
2026-03-10T12:38:20.471 INFO:tasks.workunit.client.1.vm07.stdout:8/731: dwrite d1/f6b [0,4194304] 0
2026-03-10T12:38:20.473 INFO:tasks.workunit.client.1.vm07.stdout:8/732: write d1/d3/d40/fd1 [541892,100243] 0
2026-03-10T12:38:20.476 INFO:tasks.workunit.client.0.vm00.stdout:7/771: write da/d47/dfd/fa9 [381535,55476] 0
2026-03-10T12:38:20.478 INFO:tasks.workunit.client.1.vm07.stdout:4/868: dwrite d0/d4/df2/df6/d46/f85 [0,4194304] 0
2026-03-10T12:38:20.484 INFO:tasks.workunit.client.1.vm07.stdout:0/877: symlink d0/d14/d5f/d76/d2f/d31/d79/d9e/l126 0
2026-03-10T12:38:20.493 INFO:tasks.workunit.client.1.vm07.stdout:3/787: symlink dc/dd/d43/d76/l110 0
2026-03-10T12:38:20.493 INFO:tasks.workunit.client.1.vm07.stdout:3/788: read - dc/d18/fdd zero size
2026-03-10T12:38:20.502 INFO:tasks.workunit.client.1.vm07.stdout:0/878: dread d0/d14/d5f/d76/d2f/d31/d79/d85/fb5 [0,4194304] 0
2026-03-10T12:38:20.506 INFO:tasks.workunit.client.1.vm07.stdout:2/678: mknod d0/d42/d26/d38/d4f/ce6 0
2026-03-10T12:38:20.506 INFO:tasks.workunit.client.1.vm07.stdout:2/679: chown d0/d29/d64/d6c 57 1
2026-03-10T12:38:20.506 INFO:tasks.workunit.client.1.vm07.stdout:2/680: stat d0/d29/fb3 0
2026-03-10T12:38:20.506 INFO:tasks.workunit.client.1.vm07.stdout:1/766: link d9/d2d/d4f/d75/d77/lb4 d9/d2d/d4f/lfa 0
2026-03-10T12:38:20.516 INFO:tasks.workunit.client.1.vm07.stdout:5/788: dread - d0/d22/d18/d19/d72/ffc zero size
2026-03-10T12:38:20.528 INFO:tasks.workunit.client.1.vm07.stdout:8/733: read d1/d3/f1d [489861,40194] 0
2026-03-10T12:38:20.528 INFO:tasks.workunit.client.1.vm07.stdout:8/734: chown d1/d3/d40/d92 786038 1
2026-03-10T12:38:20.529 INFO:tasks.workunit.client.1.vm07.stdout:8/735: stat d1/d3/d6/d54/l91 0
2026-03-10T12:38:20.529 INFO:tasks.workunit.client.1.vm07.stdout:8/736: readlink d1/d3/d6c/lb1 0
2026-03-10T12:38:20.529 INFO:tasks.workunit.client.1.vm07.stdout:8/737: stat d1/d3/f2d 0
2026-03-10T12:38:20.531 INFO:tasks.workunit.client.1.vm07.stdout:7/709: dread d0/d61/db4/f9e [0,4194304] 0
2026-03-10T12:38:20.531 INFO:tasks.workunit.client.1.vm07.stdout:7/710: dread - d0/d61/db4/fdc zero size
2026-03-10T12:38:20.532 INFO:tasks.workunit.client.1.vm07.stdout:7/711: readlink d0/d47/l49 0
2026-03-10T12:38:20.534 INFO:tasks.workunit.client.1.vm07.stdout:3/789: rmdir dc/dd/d28/dd0 39
2026-03-10T12:38:20.546 INFO:tasks.workunit.client.1.vm07.stdout:0/879: creat d0/d14/d5f/d41/d6a/d9a/f127 x:0 0 0
2026-03-10T12:38:20.553 INFO:tasks.workunit.client.1.vm07.stdout:9/833: write d5/d13/d57/d4f/d6a/f8e [4625174,4644] 0
2026-03-10T12:38:20.555 INFO:tasks.workunit.client.0.vm00.stdout:0/890: dwrite d3/d22/f46 [4194304,4194304] 0
2026-03-10T12:38:20.565 INFO:tasks.workunit.client.0.vm00.stdout:6/701: truncate d2/d16/d74/f4d 277589 0
2026-03-10T12:38:20.570 INFO:tasks.workunit.client.1.vm07.stdout:6/738: truncate d1/d4/d6/f8d 132813 0
2026-03-10T12:38:20.571 INFO:tasks.workunit.client.0.vm00.stdout:7/772: dwrite da/d26/d37/f96 [0,4194304] 0
2026-03-10T12:38:20.571 INFO:tasks.workunit.client.0.vm00.stdout:0/891: dread d3/d40/f7a [0,4194304] 0
2026-03-10T12:38:20.574 INFO:tasks.workunit.client.0.vm00.stdout:7/773: chown da/d26/d37/d56 179 1
2026-03-10T12:38:20.578 INFO:tasks.workunit.client.0.vm00.stdout:6/702: mknod d2/d16/d29/d31/d88/cfa 0
2026-03-10T12:38:20.581 INFO:tasks.workunit.client.1.vm07.stdout:4/869: symlink d0/d4/d5/d78/dc5/df7/db2/dd5/d12b/l132 0
2026-03-10T12:38:20.587 INFO:tasks.workunit.client.1.vm07.stdout:1/767: dwrite d9/df/d29/d2b/d31/d91/d59/fa4 [0,4194304] 0
2026-03-10T12:38:20.588 INFO:tasks.workunit.client.1.vm07.stdout:5/789: dwrite d0/d22/d18/fb4 [4194304,4194304] 0
2026-03-10T12:38:20.591 INFO:tasks.workunit.client.0.vm00.stdout:0/892: dwrite d3/d7/f9f [4194304,4194304] 0
2026-03-10T12:38:20.601 INFO:tasks.workunit.client.0.vm00.stdout:7/774: write da/d47/dfd/f106 [1743775,55948] 0
2026-03-10T12:38:20.601 INFO:tasks.workunit.client.0.vm00.stdout:7/775: chown da/d41/d48/d81/lad 845226151 1
2026-03-10T12:38:20.603 INFO:tasks.workunit.client.0.vm00.stdout:7/776: readlink da/d25/d2c/d82/d68/ld9 0
2026-03-10T12:38:20.610 INFO:tasks.workunit.client.1.vm07.stdout:8/738: mkdir d1/d3/d6/d54/dd2/df3 0
2026-03-10T12:38:20.629 INFO:tasks.workunit.client.0.vm00.stdout:6/703: symlink d2/d51/lfb 0
2026-03-10T12:38:20.633 INFO:tasks.workunit.client.0.vm00.stdout:6/704: dread d2/da/fda [0,4194304] 0
2026-03-10T12:38:20.639 INFO:tasks.workunit.client.0.vm00.stdout:7/777: truncate f1 5388401 0
2026-03-10T12:38:20.646 INFO:tasks.workunit.client.0.vm00.stdout:0/893: creat d3/db/f11b x:0 0 0
2026-03-10T12:38:20.648 INFO:tasks.workunit.client.1.vm07.stdout:2/681: mknod d0/d42/ce7 0
2026-03-10T12:38:20.650 INFO:tasks.workunit.client.0.vm00.stdout:6/705: creat d2/d9f/dce/ffc x:0 0 0
2026-03-10T12:38:20.650 INFO:tasks.workunit.client.0.vm00.stdout:7/778: mknod da/d25/d2c/d82/d68/df8/c115 0
2026-03-10T12:38:20.651 INFO:tasks.workunit.client.0.vm00.stdout:6/706: read - d2/d14/d7a/db9/f6c zero size
2026-03-10T12:38:20.654 INFO:tasks.workunit.client.0.vm00.stdout:7/779: creat da/d1b/d40/f116 x:0 0 0
2026-03-10T12:38:20.654 INFO:tasks.workunit.client.1.vm07.stdout:6/739: unlink d1/d4/fc5 0
2026-03-10T12:38:20.654 INFO:tasks.workunit.client.1.vm07.stdout:4/870: creat d0/d4/df2/df6/d46/d76/f133 x:0 0 0
2026-03-10T12:38:20.655 INFO:tasks.workunit.client.0.vm00.stdout:6/707: truncate d2/d16/f23 4348100 0
2026-03-10T12:38:20.660 INFO:tasks.workunit.client.0.vm00.stdout:6/708: rmdir d2/d16/d29/d31/d88/d92/daa 39
2026-03-10T12:38:20.664 INFO:tasks.workunit.client.1.vm07.stdout:5/790: creat d0/d22/d18/d19/d21/d54/f114 x:0 0 0
2026-03-10T12:38:20.664 INFO:tasks.workunit.client.1.vm07.stdout:0/880: mkdir d0/d14/d5f/d76/d2f/d31/d4f/d60/d128 0
2026-03-10T12:38:20.664 INFO:tasks.workunit.client.0.vm00.stdout:7/780: dread - da/d41/d48/fd4 zero size
2026-03-10T12:38:20.664 INFO:tasks.workunit.client.0.vm00.stdout:6/709: mkdir d2/d42/d80/dfd 0
2026-03-10T12:38:20.665 INFO:tasks.workunit.client.1.vm07.stdout:9/834: mknod d5/d13/c11c 0
2026-03-10T12:38:20.667 INFO:tasks.workunit.client.0.vm00.stdout:7/781: mknod da/d41/d7b/d9d/c117 0
2026-03-10T12:38:20.667 INFO:tasks.workunit.client.1.vm07.stdout:2/682: creat d0/d42/d26/d7d/fe8 x:0 0 0
2026-03-10T12:38:20.669 INFO:tasks.workunit.client.0.vm00.stdout:0/894: sync
2026-03-10T12:38:20.671 INFO:tasks.workunit.client.1.vm07.stdout:4/871: dwrite d0/d4/d10/d114/f117 [0,4194304] 0
2026-03-10T12:38:20.676 INFO:tasks.workunit.client.0.vm00.stdout:6/710: creat d2/da/dc/d94/ffe x:0 0 0
2026-03-10T12:38:20.676 INFO:tasks.workunit.client.1.vm07.stdout:5/791: rmdir d0/d22/d18/d3e/d5d/db6 39
2026-03-10T12:38:20.677 INFO:tasks.workunit.client.1.vm07.stdout:5/792: stat d0/d22/d18/d19/d21/dc2/df0 0
2026-03-10T12:38:20.678 INFO:tasks.workunit.client.1.vm07.stdout:5/793: read d0/d22/d18/d19/d2e/d67/fa0 [844659,11208] 0
2026-03-10T12:38:20.685 INFO:tasks.workunit.client.0.vm00.stdout:0/895: fdatasync d3/db/d77/f8a 0
2026-03-10T12:38:20.688 INFO:tasks.workunit.client.0.vm00.stdout:7/782: truncate da/d41/d48/fbc 278439 0
2026-03-10T12:38:20.693 INFO:tasks.workunit.client.1.vm07.stdout:8/739: creat d1/d3/d6c/dde/de7/ff4 x:0 0 0
2026-03-10T12:38:20.699 INFO:tasks.workunit.client.1.vm07.stdout:3/790: rename dc/d18/d99/da3/c10c to dc/dd/d28/c111 0
2026-03-10T12:38:20.715 INFO:tasks.workunit.client.0.vm00.stdout:6/711: rmdir d2/d9f/dce 39
2026-03-10T12:38:20.723 INFO:tasks.workunit.client.1.vm07.stdout:9/835: dread d5/d13/d57/d4f/d6a/f8e [0,4194304] 0
2026-03-10T12:38:20.724 INFO:tasks.workunit.client.0.vm00.stdout:0/896: dread d3/db/fbc [0,4194304] 0
2026-03-10T12:38:20.726 INFO:tasks.workunit.client.1.vm07.stdout:7/712: write d0/d61/db4/f4b [2942078,95679] 0
2026-03-10T12:38:20.726 INFO:tasks.workunit.client.1.vm07.stdout:7/713: stat d0/d57/d62/d90/da1/le0 0
2026-03-10T12:38:20.727 INFO:tasks.workunit.client.0.vm00.stdout:0/897: readlink d3/d7/d4c/d5b/dc5/lfa 0
2026-03-10T12:38:20.727 INFO:tasks.workunit.client.1.vm07.stdout:2/683: creat d0/d5b/d98/fe9 x:0 0 0
2026-03-10T12:38:20.736 INFO:tasks.workunit.client.0.vm00.stdout:0/898: write d3/d7/d4c/f96 [1093093,21537] 0
2026-03-10T12:38:20.739 INFO:tasks.workunit.client.0.vm00.stdout:0/899: write d3/db/d77/faa [1996889,54978] 0
2026-03-10T12:38:20.743 INFO:tasks.workunit.client.0.vm00.stdout:0/900: stat d3/d7/d4c/d5b/d38/db3/de2/fad 0
2026-03-10T12:38:20.748 INFO:tasks.workunit.client.1.vm07.stdout:6/740: creat d1/d4/d6/d43/d88/dc3/ff2 x:0 0 0
2026-03-10T12:38:20.757 INFO:tasks.workunit.client.0.vm00.stdout:7/783: rename da/d1b/d40/f116 to da/d25/d2c/d82/d101/f118 0
2026-03-10T12:38:20.763 INFO:tasks.workunit.client.0.vm00.stdout:6/712: symlink d2/d16/d29/d31/d88/d92/lff 0
2026-03-10T12:38:20.763 INFO:tasks.workunit.client.0.vm00.stdout:6/713: chown d2/d16/f1e 137900 1
2026-03-10T12:38:20.764 INFO:tasks.workunit.client.0.vm00.stdout:6/714: fdatasync d2/da/dc/d94/ffe 0
2026-03-10T12:38:20.765 INFO:tasks.workunit.client.1.vm07.stdout:3/791: creat dc/dd/d1f/f112 x:0 0 0
2026-03-10T12:38:20.781 INFO:tasks.workunit.client.1.vm07.stdout:1/768: dread d9/f61 [0,4194304] 0
2026-03-10T12:38:20.793 INFO:tasks.workunit.client.1.vm07.stdout:9/836: mknod d5/d13/d2c/c11d 0
2026-03-10T12:38:20.801 INFO:tasks.workunit.client.1.vm07.stdout:9/837: dread d5/f1a [0,4194304] 0
2026-03-10T12:38:20.809 INFO:tasks.workunit.client.1.vm07.stdout:0/881: mknod d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/c129 0
2026-03-10T12:38:20.821 INFO:tasks.workunit.client.0.vm00.stdout:7/784: creat da/d3f/d60/f119 x:0 0 0
2026-03-10T12:38:20.822 INFO:tasks.workunit.client.1.vm07.stdout:7/714: truncate d0/f1e 1365388 0
2026-03-10T12:38:20.822 INFO:tasks.workunit.client.1.vm07.stdout:7/715: readlink d0/d57/dd6/l86 0
2026-03-10T12:38:20.831 INFO:tasks.workunit.client.0.vm00.stdout:6/715: symlink d2/d42/d80/d9d/l100 0
2026-03-10T12:38:20.840 INFO:tasks.workunit.client.1.vm07.stdout:4/872: symlink d0/d4/d10/l134 0
2026-03-10T12:38:20.844 INFO:tasks.workunit.client.0.vm00.stdout:6/716: sync
2026-03-10T12:38:20.854 INFO:tasks.workunit.client.1.vm07.stdout:6/741: mknod d1/d4/d71/cf3 0
2026-03-10T12:38:20.854 INFO:tasks.workunit.client.1.vm07.stdout:6/742: fsync d1/d4/f82 0
2026-03-10T12:38:20.857 INFO:tasks.workunit.client.1.vm07.stdout:8/740: dwrite d1/f3f [0,4194304] 0
2026-03-10T12:38:20.862 INFO:tasks.workunit.client.1.vm07.stdout:8/741: dwrite d1/f6b [4194304,4194304] 0
2026-03-10T12:38:20.873 INFO:tasks.workunit.client.1.vm07.stdout:5/794: mknod d0/d22/d18/d30/c115 0
2026-03-10T12:38:20.884 INFO:tasks.workunit.client.0.vm00.stdout:7/785: link da/d25/d2e/cf0 da/d47/d87/c11a 0
2026-03-10T12:38:20.886 INFO:tasks.workunit.client.1.vm07.stdout:3/792: mknod dc/dd/d28/d7a/c113 0
2026-03-10T12:38:20.894 INFO:tasks.workunit.client.0.vm00.stdout:6/717: creat d2/d16/d74/f101 x:0 0 0
2026-03-10T12:38:20.902 INFO:tasks.workunit.client.1.vm07.stdout:1/769: symlink d9/d2d/d4f/dde/lfb 0
2026-03-10T12:38:20.909 INFO:tasks.workunit.client.0.vm00.stdout:0/901: write d3/d7/d4c/d5b/d38/f8b [819724,6049] 0
2026-03-10T12:38:20.913 INFO:tasks.workunit.client.1.vm07.stdout:2/684: dwrite d0/d42/d1f/d20/f2b [0,4194304] 0
2026-03-10T12:38:20.914 INFO:tasks.workunit.client.1.vm07.stdout:2/685: chown d0/d42/d4e/d77/d70 116 1
2026-03-10T12:38:20.920 INFO:tasks.workunit.client.0.vm00.stdout:6/718: dread - d2/d9f/dce/ff2 zero size
2026-03-10T12:38:20.924 INFO:tasks.workunit.client.0.vm00.stdout:0/902: rename d3/d7/d4c/d5b/d38/d44/df9/f114 to d3/d7/d4c/dcc/dea/f11c 0
2026-03-10T12:38:20.929 INFO:tasks.workunit.client.1.vm07.stdout:9/838: symlink d5/d1f/d75/l11e 0
2026-03-10T12:38:20.931 INFO:tasks.workunit.client.0.vm00.stdout:7/786: symlink da/d25/d2e/d4c/l11b 0
2026-03-10T12:38:20.955 INFO:tasks.workunit.client.0.vm00.stdout:0/903: dread d3/d40/d65/f92 [0,4194304] 0
2026-03-10T12:38:20.965 INFO:tasks.workunit.client.1.vm07.stdout:4/873: creat d0/d4/d10/d3c/d2b/d54/f135 x:0 0 0
2026-03-10T12:38:20.966 INFO:tasks.workunit.client.0.vm00.stdout:0/904: dread d3/db/d24/fb1 [0,4194304] 0
2026-03-10T12:38:20.970 INFO:tasks.workunit.client.0.vm00.stdout:0/905: sync
2026-03-10T12:38:20.975 INFO:tasks.workunit.client.1.vm07.stdout:6/743: creat d1/d4/d6/d43/d88/d97/ff4 x:0 0 0
2026-03-10T12:38:20.987 INFO:tasks.workunit.client.1.vm07.stdout:0/882: dwrite d0/d14/d5f/d3b/dbc/fb6 [0,4194304] 0
2026-03-10T12:38:20.991 INFO:tasks.workunit.client.0.vm00.stdout:7/787: write da/f16 [480671,107894] 0
2026-03-10T12:38:20.996 INFO:tasks.workunit.client.0.vm00.stdout:6/719: dwrite d2/da/f77 [0,4194304] 0
2026-03-10T12:38:21.011 INFO:tasks.workunit.client.0.vm00.stdout:7/788: creat da/d26/d37/d61/f11c x:0 0 0
2026-03-10T12:38:21.071 INFO:tasks.workunit.client.0.vm00.stdout:7/789: chown da/d25/d2e/cec 46838 1
2026-03-10T12:38:21.118 INFO:tasks.workunit.client.1.vm07.stdout:5/795: truncate d0/d22/f93 1375184 0
2026-03-10T12:38:21.121 INFO:tasks.workunit.client.1.vm07.stdout:1/770: mknod d9/df/dc9/cfc 0
2026-03-10T12:38:21.122 INFO:tasks.workunit.client.1.vm07.stdout:3/793: dread dc/dd/d43/d5c/f65 [0,4194304] 0
2026-03-10T12:38:21.127 INFO:tasks.workunit.client.1.vm07.stdout:2/686: chown d0/f8d 9024 1
2026-03-10T12:38:21.135 INFO:tasks.workunit.client.0.vm00.stdout:6/720: write d2/d14/d7a/db9/f9b [560474,17907] 0
2026-03-10T12:38:21.136 INFO:tasks.workunit.client.1.vm07.stdout:4/874: creat d0/d4/d5/d78/dc5/df7/db2/dd5/d12b/f136 x:0 0 0
2026-03-10T12:38:21.136 INFO:tasks.workunit.client.1.vm07.stdout:0/883: mknod d0/d14/d5f/d41/d86/c12a 0
2026-03-10T12:38:21.136 INFO:tasks.workunit.client.1.vm07.stdout:5/796: mknod d0/d22/d18/d19/d21/d54/dcb/de8/c116 0
2026-03-10T12:38:21.137 INFO:tasks.workunit.client.1.vm07.stdout:5/797: stat d0/d22/d18/d19/d2e/f59 0
2026-03-10T12:38:21.138 INFO:tasks.workunit.client.1.vm07.stdout:1/771: creat d9/df/d29/d2b/d30/ffd x:0 0 0
2026-03-10T12:38:21.139 INFO:tasks.workunit.client.1.vm07.stdout:3/794: creat dc/dd/d1f/dc7/f114 x:0 0 0 2026-03-10T12:38:21.139 INFO:tasks.workunit.client.1.vm07.stdout:3/795: chown dc/dd/d1f/c32 6 1 2026-03-10T12:38:21.140 INFO:tasks.workunit.client.0.vm00.stdout:0/906: dwrite d3/d7/d4c/d5b/f88 [0,4194304] 0 2026-03-10T12:38:21.141 INFO:tasks.workunit.client.0.vm00.stdout:7/790: dwrite da/d1b/f39 [0,4194304] 0 2026-03-10T12:38:21.143 INFO:tasks.workunit.client.0.vm00.stdout:6/721: truncate d2/d14/d7a/db9/f6c 206887 0 2026-03-10T12:38:21.150 INFO:tasks.workunit.client.1.vm07.stdout:9/839: unlink d5/d13/d2c/de6/l96 0 2026-03-10T12:38:21.151 INFO:tasks.workunit.client.1.vm07.stdout:7/716: creat d0/d57/d62/d90/fed x:0 0 0 2026-03-10T12:38:21.153 INFO:tasks.workunit.client.0.vm00.stdout:0/907: mkdir d3/d7/d4c/d5b/d38/db3/d11d 0 2026-03-10T12:38:21.153 INFO:tasks.workunit.client.0.vm00.stdout:0/908: dread - d3/d7/d3c/d4b/f79 zero size 2026-03-10T12:38:21.154 INFO:tasks.workunit.client.1.vm07.stdout:7/717: dwrite d0/d61/db4/f54 [0,4194304] 0 2026-03-10T12:38:21.157 INFO:tasks.workunit.client.1.vm07.stdout:4/875: symlink d0/d4/d10/d9a/db9/l137 0 2026-03-10T12:38:21.164 INFO:tasks.workunit.client.0.vm00.stdout:0/909: fsync d3/d7/db0/dc4/dd5/d10e/fe9 0 2026-03-10T12:38:21.166 INFO:tasks.workunit.client.1.vm07.stdout:0/884: truncate d0/d14/d5f/d3b/f5b 3589067 0 2026-03-10T12:38:21.168 INFO:tasks.workunit.client.0.vm00.stdout:7/791: truncate da/d41/d48/fbc 24326 0 2026-03-10T12:38:21.169 INFO:tasks.workunit.client.1.vm07.stdout:1/772: mkdir d9/d2d/d4f/d75/d77/da7/dfe 0 2026-03-10T12:38:21.172 INFO:tasks.workunit.client.1.vm07.stdout:2/687: dread d0/d42/d26/d7d/fc8 [0,4194304] 0 2026-03-10T12:38:21.180 INFO:tasks.workunit.client.0.vm00.stdout:7/792: creat da/d25/d2c/d82/d101/f11d x:0 0 0 2026-03-10T12:38:21.180 INFO:tasks.workunit.client.0.vm00.stdout:7/793: dread - da/d3f/d60/f119 zero size 2026-03-10T12:38:21.180 INFO:tasks.workunit.client.0.vm00.stdout:7/794: chown 
da/d1b/d40/fca 10055686 1 2026-03-10T12:38:21.180 INFO:tasks.workunit.client.1.vm07.stdout:3/796: creat dc/dd/db5/f115 x:0 0 0 2026-03-10T12:38:21.180 INFO:tasks.workunit.client.1.vm07.stdout:3/797: readlink dc/dd/d1f/d45/l58 0 2026-03-10T12:38:21.180 INFO:tasks.workunit.client.1.vm07.stdout:9/840: dread - d5/d16/d23/d26/d68/f105 zero size 2026-03-10T12:38:21.180 INFO:tasks.workunit.client.1.vm07.stdout:9/841: read d5/d13/d6c/da4/fd0 [395984,80612] 0 2026-03-10T12:38:21.185 INFO:tasks.workunit.client.0.vm00.stdout:7/795: creat da/d26/d37/d56/ddf/d108/f11e x:0 0 0 2026-03-10T12:38:21.186 INFO:tasks.workunit.client.0.vm00.stdout:6/722: dread d2/da/dc/f25 [0,4194304] 0 2026-03-10T12:38:21.187 INFO:tasks.workunit.client.1.vm07.stdout:6/744: mkdir d1/d4/d6/d16/d1a/d99/df5 0 2026-03-10T12:38:21.190 INFO:tasks.workunit.client.1.vm07.stdout:4/876: mknod d0/d4/d5/d78/dc5/df7/db2/dd5/c138 0 2026-03-10T12:38:21.191 INFO:tasks.workunit.client.0.vm00.stdout:6/723: readlink d2/da/dc/d94/ldf 0 2026-03-10T12:38:21.195 INFO:tasks.workunit.client.1.vm07.stdout:8/742: link d1/d3/d6/d54/fa8 d1/d3/d6/d50/ff5 0 2026-03-10T12:38:21.195 INFO:tasks.workunit.client.1.vm07.stdout:2/688: creat d0/d42/d26/d7d/fea x:0 0 0 2026-03-10T12:38:21.195 INFO:tasks.workunit.client.0.vm00.stdout:6/724: chown d2/d14/cb5 347 1 2026-03-10T12:38:21.195 INFO:tasks.workunit.client.0.vm00.stdout:6/725: write d2/da/f77 [4456327,72716] 0 2026-03-10T12:38:21.195 INFO:tasks.workunit.client.0.vm00.stdout:6/726: read - d2/d42/d9c/fe2 zero size 2026-03-10T12:38:21.195 INFO:tasks.workunit.client.1.vm07.stdout:0/885: dread d0/d14/d5f/d41/d6a/d74/fb9 [0,4194304] 0 2026-03-10T12:38:21.197 INFO:tasks.workunit.client.0.vm00.stdout:6/727: mkdir d2/d16/d74/d102 0 2026-03-10T12:38:21.207 INFO:tasks.workunit.client.1.vm07.stdout:7/718: rename d0/d47/dde/cdf to d0/d67/cee 0 2026-03-10T12:38:21.210 INFO:tasks.workunit.client.1.vm07.stdout:5/798: creat d0/d22/d18/f117 x:0 0 0 2026-03-10T12:38:21.216 
INFO:tasks.workunit.client.1.vm07.stdout:1/773: mkdir d9/dff 0 2026-03-10T12:38:21.217 INFO:tasks.workunit.client.1.vm07.stdout:0/886: fsync d0/d14/d5f/d76/d2f/d31/d4f/d9d/dd4/fd5 0 2026-03-10T12:38:21.218 INFO:tasks.workunit.client.1.vm07.stdout:9/842: dread d5/d13/d57/fa7 [0,4194304] 0 2026-03-10T12:38:21.223 INFO:tasks.workunit.client.1.vm07.stdout:2/689: rename d0/d42/d4e/d77/cdc to d0/d29/d64/db5/ceb 0 2026-03-10T12:38:21.224 INFO:tasks.workunit.client.1.vm07.stdout:4/877: link d0/d4/d10/d3c/d2b/d2d/d9c/f131 d0/d4/d10/d3c/d2b/d54/f139 0 2026-03-10T12:38:21.228 INFO:tasks.workunit.client.1.vm07.stdout:7/719: creat d0/d47/da0/fef x:0 0 0 2026-03-10T12:38:21.238 INFO:tasks.workunit.client.1.vm07.stdout:9/843: creat d5/d16/dd7/f11f x:0 0 0 2026-03-10T12:38:21.238 INFO:tasks.workunit.client.1.vm07.stdout:9/844: chown d5/d13/l15 2179 1 2026-03-10T12:38:21.250 INFO:tasks.workunit.client.1.vm07.stdout:2/690: creat d0/d5b/fec x:0 0 0 2026-03-10T12:38:21.252 INFO:tasks.workunit.client.0.vm00.stdout:7/796: rmdir da/d25/d2c/d82/d101 39 2026-03-10T12:38:21.253 INFO:tasks.workunit.client.0.vm00.stdout:0/910: write d3/d7/d3c/d74/f78 [2203955,74864] 0 2026-03-10T12:38:21.255 INFO:tasks.workunit.client.0.vm00.stdout:6/728: truncate d2/d14/d7a/db9/f4a 4039106 0 2026-03-10T12:38:21.256 INFO:tasks.workunit.client.1.vm07.stdout:3/798: write dc/d18/d24/f37 [2763142,129319] 0 2026-03-10T12:38:21.259 INFO:tasks.workunit.client.1.vm07.stdout:9/845: mkdir d5/d13/d2c/de6/dce/d120 0 2026-03-10T12:38:21.260 INFO:tasks.workunit.client.1.vm07.stdout:9/846: dread d5/d13/d22/f32 [0,4194304] 0 2026-03-10T12:38:21.263 INFO:tasks.workunit.client.0.vm00.stdout:0/911: write d3/d7/d4c/dcc/dea/f11c [5225544,102612] 0 2026-03-10T12:38:21.264 INFO:tasks.workunit.client.0.vm00.stdout:6/729: readlink d2/l7 0 2026-03-10T12:38:21.266 INFO:tasks.workunit.client.1.vm07.stdout:6/745: getdents d1/d4/d6/d43/d65 0 2026-03-10T12:38:21.270 INFO:tasks.workunit.client.1.vm07.stdout:8/743: write d1/f19 
[5451315,71522] 0 2026-03-10T12:38:21.273 INFO:tasks.workunit.client.1.vm07.stdout:5/799: write d0/d22/d18/d19/d21/d3a/ff7 [828494,64439] 0 2026-03-10T12:38:21.277 INFO:tasks.workunit.client.1.vm07.stdout:0/887: dwrite d0/d14/d5f/d76/d2f/d31/d4f/f70 [0,4194304] 0 2026-03-10T12:38:21.282 INFO:tasks.workunit.client.0.vm00.stdout:6/730: truncate d2/da/dc/d2f/fdc 942347 0 2026-03-10T12:38:21.283 INFO:tasks.workunit.client.1.vm07.stdout:1/774: creat d9/d2d/d4f/d75/d77/f100 x:0 0 0 2026-03-10T12:38:21.283 INFO:tasks.workunit.client.1.vm07.stdout:1/775: readlink d9/df/ld3 0 2026-03-10T12:38:21.287 INFO:tasks.workunit.client.1.vm07.stdout:9/847: dread - d5/d1f/d7d/ffb zero size 2026-03-10T12:38:21.287 INFO:tasks.workunit.client.0.vm00.stdout:0/912: creat d3/d40/f11e x:0 0 0 2026-03-10T12:38:21.287 INFO:tasks.workunit.client.1.vm07.stdout:9/848: write d5/d13/d6c/fb6 [790520,58643] 0 2026-03-10T12:38:21.290 INFO:tasks.workunit.client.0.vm00.stdout:7/797: getdents da/d25/d2c/d82/d68 0 2026-03-10T12:38:21.291 INFO:tasks.workunit.client.1.vm07.stdout:6/746: creat d1/d4/d6/d16/d1a/d2c/de0/ff6 x:0 0 0 2026-03-10T12:38:21.291 INFO:tasks.workunit.client.0.vm00.stdout:6/731: mkdir d2/d42/d103 0 2026-03-10T12:38:21.293 INFO:tasks.workunit.client.0.vm00.stdout:0/913: rename d3/d7/d3c to d3/d7/d3c/d11f 22 2026-03-10T12:38:21.299 INFO:tasks.workunit.client.1.vm07.stdout:4/878: dwrite d0/fa1 [0,4194304] 0 2026-03-10T12:38:21.305 INFO:tasks.workunit.client.0.vm00.stdout:7/798: unlink da/d47/fe1 0 2026-03-10T12:38:21.310 INFO:tasks.workunit.client.1.vm07.stdout:0/888: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/dd0/f12b x:0 0 0 2026-03-10T12:38:21.313 INFO:tasks.workunit.client.1.vm07.stdout:7/720: rmdir d0/d57/dd6/de5 0 2026-03-10T12:38:21.330 INFO:tasks.workunit.client.1.vm07.stdout:2/691: write d0/d42/d4e/d77/d70/f96 [1836188,119736] 0 2026-03-10T12:38:21.333 INFO:tasks.workunit.client.1.vm07.stdout:3/799: write dc/d18/d24/f55 [4375569,20872] 0 2026-03-10T12:38:21.338 
INFO:tasks.workunit.client.1.vm07.stdout:5/800: dwrite d0/d22/d18/d3e/fa5 [0,4194304] 0 2026-03-10T12:38:21.340 INFO:tasks.workunit.client.0.vm00.stdout:0/914: write d3/d22/f42 [1018922,93064] 0 2026-03-10T12:38:21.355 INFO:tasks.workunit.client.1.vm07.stdout:4/879: rmdir d0/d4/d5 39 2026-03-10T12:38:21.358 INFO:tasks.workunit.client.0.vm00.stdout:6/732: truncate d2/d42/fd4 3409822 0 2026-03-10T12:38:21.358 INFO:tasks.workunit.client.1.vm07.stdout:7/721: creat d0/d47/dde/ff0 x:0 0 0 2026-03-10T12:38:21.365 INFO:tasks.workunit.client.0.vm00.stdout:0/915: mkdir d3/d7/d4c/d5b/d120 0 2026-03-10T12:38:21.368 INFO:tasks.workunit.client.1.vm07.stdout:2/692: creat d0/d42/d26/d38/d4f/dad/fed x:0 0 0 2026-03-10T12:38:21.369 INFO:tasks.workunit.client.0.vm00.stdout:0/916: dwrite d3/d7/d3c/d74/f101 [0,4194304] 0 2026-03-10T12:38:21.370 INFO:tasks.workunit.client.0.vm00.stdout:6/733: dread d2/d16/f41 [0,4194304] 0 2026-03-10T12:38:21.375 INFO:tasks.workunit.client.1.vm07.stdout:3/800: truncate dc/f94 479582 0 2026-03-10T12:38:21.387 INFO:tasks.workunit.client.0.vm00.stdout:7/799: dwrite da/d1b/d40/f7d [0,4194304] 0 2026-03-10T12:38:21.388 INFO:tasks.workunit.client.1.vm07.stdout:1/776: write d9/df/d29/d2b/d30/fa8 [72582,80567] 0 2026-03-10T12:38:21.391 INFO:tasks.workunit.client.0.vm00.stdout:0/917: creat d3/d7/db0/dc4/f121 x:0 0 0 2026-03-10T12:38:21.394 INFO:tasks.workunit.client.1.vm07.stdout:4/880: truncate d0/d4/d10/d3c/f6c 2324756 0 2026-03-10T12:38:21.396 INFO:tasks.workunit.client.0.vm00.stdout:7/800: dread da/d3f/d60/f88 [0,4194304] 0 2026-03-10T12:38:21.398 INFO:tasks.workunit.client.0.vm00.stdout:0/918: read d3/d7/d3c/d74/f78 [1695429,122064] 0 2026-03-10T12:38:21.417 INFO:tasks.workunit.client.0.vm00.stdout:6/734: write d2/d16/d29/f54 [720371,30512] 0 2026-03-10T12:38:21.417 INFO:tasks.workunit.client.1.vm07.stdout:9/849: rename d5/d13/d2c/ff5 to d5/f121 0 2026-03-10T12:38:21.421 INFO:tasks.workunit.client.1.vm07.stdout:6/747: link d1/dd7/l72 d1/d4/d6/d16/d1a/lf7 0 
2026-03-10T12:38:21.427 INFO:tasks.workunit.client.0.vm00.stdout:6/735: dread d2/d42/d80/fbd [0,4194304] 0 2026-03-10T12:38:21.427 INFO:tasks.workunit.client.1.vm07.stdout:8/744: getdents d1/d3/d6c 0 2026-03-10T12:38:21.427 INFO:tasks.workunit.client.1.vm07.stdout:2/693: fsync d0/d29/d64/d74/d88/f58 0 2026-03-10T12:38:21.428 INFO:tasks.workunit.client.1.vm07.stdout:2/694: chown d0/f15 194 1 2026-03-10T12:38:21.428 INFO:tasks.workunit.client.1.vm07.stdout:3/801: mkdir dc/dd/d1f/dc7/dc9/d116 0 2026-03-10T12:38:21.428 INFO:tasks.workunit.client.1.vm07.stdout:1/777: mkdir d9/df/d29/d2b/d30/d101 0 2026-03-10T12:38:21.433 INFO:tasks.workunit.client.1.vm07.stdout:1/778: dwrite d9/df/d29/d2b/d31/fd8 [4194304,4194304] 0 2026-03-10T12:38:21.448 INFO:tasks.workunit.client.0.vm00.stdout:7/801: unlink da/d26/d37/d61/cd5 0 2026-03-10T12:38:21.453 INFO:tasks.workunit.client.1.vm07.stdout:4/881: mkdir d0/d4/df2/df6/d46/d13a 0 2026-03-10T12:38:21.454 INFO:tasks.workunit.client.1.vm07.stdout:4/882: stat d0/d4/d10/d3c/d2b 0 2026-03-10T12:38:21.456 INFO:tasks.workunit.client.1.vm07.stdout:7/722: creat d0/d47/dab/dae/ff1 x:0 0 0 2026-03-10T12:38:21.467 INFO:tasks.workunit.client.0.vm00.stdout:6/736: dread - d2/da/dc/d83/fb0 zero size 2026-03-10T12:38:21.469 INFO:tasks.workunit.client.1.vm07.stdout:0/889: rename d0/d14/d5f/d76/d2f/db2 to d0/d14/d5f/d76/d2f/df4/d12c 0 2026-03-10T12:38:21.477 INFO:tasks.workunit.client.0.vm00.stdout:0/919: creat d3/d7/d4c/d5b/d38/f122 x:0 0 0 2026-03-10T12:38:21.484 INFO:tasks.workunit.client.1.vm07.stdout:6/748: creat d1/d4/d6/d46/d4d/dc7/ff8 x:0 0 0 2026-03-10T12:38:21.486 INFO:tasks.workunit.client.0.vm00.stdout:7/802: dread da/d41/d7b/d9d/fc2 [0,4194304] 0 2026-03-10T12:38:21.487 INFO:tasks.workunit.client.0.vm00.stdout:7/803: truncate da/d47/fe4 625266 0 2026-03-10T12:38:21.488 INFO:tasks.workunit.client.0.vm00.stdout:7/804: truncate da/d3f/d60/f110 1039969 0 2026-03-10T12:38:21.491 INFO:tasks.workunit.client.0.vm00.stdout:6/737: rename 
d2/da/dc/d94/fc7 to d2/d16/d29/d31/d34/f104 0 2026-03-10T12:38:21.493 INFO:tasks.workunit.client.1.vm07.stdout:9/850: write d5/d16/d23/d26/f46 [5241355,124812] 0 2026-03-10T12:38:21.494 INFO:tasks.workunit.client.1.vm07.stdout:2/695: creat d0/d5b/d98/fee x:0 0 0 2026-03-10T12:38:21.494 INFO:tasks.workunit.client.1.vm07.stdout:8/745: write d1/d3/d40/fb0 [388649,91104] 0 2026-03-10T12:38:21.498 INFO:tasks.workunit.client.1.vm07.stdout:8/746: chown d1/d3/d6/d50/f80 4 1 2026-03-10T12:38:21.526 INFO:tasks.workunit.client.1.vm07.stdout:1/779: write d9/df/d29/fd4 [931567,24222] 0 2026-03-10T12:38:21.535 INFO:tasks.workunit.client.0.vm00.stdout:0/920: creat d3/d40/d65/f123 x:0 0 0 2026-03-10T12:38:21.546 INFO:tasks.workunit.client.0.vm00.stdout:6/738: symlink d2/d42/d9c/l105 0 2026-03-10T12:38:21.548 INFO:tasks.workunit.client.1.vm07.stdout:7/723: dread d0/f70 [0,4194304] 0 2026-03-10T12:38:21.550 INFO:tasks.workunit.client.0.vm00.stdout:6/739: dwrite d2/da/dc/d2f/ff4 [0,4194304] 0 2026-03-10T12:38:21.559 INFO:tasks.workunit.client.1.vm07.stdout:6/749: rename d1/d4/c1c to d1/d4/d6/d16/d1a/d2c/de0/cf9 0 2026-03-10T12:38:21.560 INFO:tasks.workunit.client.0.vm00.stdout:7/805: write da/d25/f2b [2562257,109102] 0 2026-03-10T12:38:21.573 INFO:tasks.workunit.client.1.vm07.stdout:9/851: creat d5/d13/d2c/de6/d74/f122 x:0 0 0 2026-03-10T12:38:21.576 INFO:tasks.workunit.client.1.vm07.stdout:5/801: link d0/cb d0/d22/d18/d19/d21/dc2/df0/c118 0 2026-03-10T12:38:21.579 INFO:tasks.workunit.client.1.vm07.stdout:9/852: read d5/d16/d23/d26/f86 [230905,117056] 0 2026-03-10T12:38:21.580 INFO:tasks.workunit.client.0.vm00.stdout:6/740: creat d2/d42/d80/d9d/f106 x:0 0 0 2026-03-10T12:38:21.581 INFO:tasks.workunit.client.0.vm00.stdout:6/741: stat d2/d42/d80/d9d/fe9 0 2026-03-10T12:38:21.582 INFO:tasks.workunit.client.1.vm07.stdout:4/883: mknod d0/d4/d5/d8f/c13b 0 2026-03-10T12:38:21.584 INFO:tasks.workunit.client.1.vm07.stdout:4/884: write d0/d4/d10/d114/f117 [2381050,90868] 0 
2026-03-10T12:38:21.589 INFO:tasks.workunit.client.0.vm00.stdout:7/806: fdatasync da/d25/d2e/d4c/fe7 0 2026-03-10T12:38:21.602 INFO:tasks.workunit.client.1.vm07.stdout:3/802: truncate dc/d18/f34 1355581 0 2026-03-10T12:38:21.603 INFO:tasks.workunit.client.1.vm07.stdout:6/750: write d1/d4/d6/d16/d1a/f9f [942647,77427] 0 2026-03-10T12:38:21.605 INFO:tasks.workunit.client.1.vm07.stdout:6/751: chown d1/d4/d6/d43/f90 3453 1 2026-03-10T12:38:21.605 INFO:tasks.workunit.client.1.vm07.stdout:2/696: dwrite d0/d29/d64/d74/d75/fa5 [0,4194304] 0 2026-03-10T12:38:21.609 INFO:tasks.workunit.client.1.vm07.stdout:0/890: dwrite d0/d14/d5f/f54 [0,4194304] 0 2026-03-10T12:38:21.616 INFO:tasks.workunit.client.0.vm00.stdout:0/921: link d3/d7/d4c/d5b/dc5/lfa d3/d7/d4c/d5b/dc5/l124 0 2026-03-10T12:38:21.618 INFO:tasks.workunit.client.1.vm07.stdout:9/853: symlink d5/d16/d23/d26/l123 0 2026-03-10T12:38:21.618 INFO:tasks.workunit.client.1.vm07.stdout:1/780: symlink d9/df/dc2/de1/l102 0 2026-03-10T12:38:21.622 INFO:tasks.workunit.client.1.vm07.stdout:7/724: mknod d0/cf2 0 2026-03-10T12:38:21.630 INFO:tasks.workunit.client.0.vm00.stdout:0/922: mknod d3/d7/d4c/d5b/d38/d44/df9/c125 0 2026-03-10T12:38:21.634 INFO:tasks.workunit.client.0.vm00.stdout:7/807: creat da/d41/d48/d114/f11f x:0 0 0 2026-03-10T12:38:21.638 INFO:tasks.workunit.client.1.vm07.stdout:4/885: creat d0/d4/d5/d78/dc5/df7/db2/dd5/f13c x:0 0 0 2026-03-10T12:38:21.639 INFO:tasks.workunit.client.0.vm00.stdout:6/742: link d2/da/f82 d2/d16/d29/d31/d34/f107 0 2026-03-10T12:38:21.642 INFO:tasks.workunit.client.1.vm07.stdout:8/747: link d1/d3/d18/l33 d1/d3/d6/d54/dd2/lf6 0 2026-03-10T12:38:21.642 INFO:tasks.workunit.client.0.vm00.stdout:6/743: truncate d2/d16/d29/d31/d34/f104 993149 0 2026-03-10T12:38:21.643 INFO:tasks.workunit.client.0.vm00.stdout:0/923: unlink d3/d7/d4c/d5b/d38/d44/df9/l10d 0 2026-03-10T12:38:21.643 INFO:tasks.workunit.client.0.vm00.stdout:0/924: chown d3/db/d77/cef 103231 1 2026-03-10T12:38:21.644 
INFO:tasks.workunit.client.1.vm07.stdout:3/803: truncate dc/dd/d1f/dac/fee 1047919 0 2026-03-10T12:38:21.646 INFO:tasks.workunit.client.0.vm00.stdout:6/744: mknod d2/da/dbf/ded/c108 0 2026-03-10T12:38:21.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:21 vm00.local ceph-mon[50686]: pgmap v7: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 43 MiB/s rd, 106 MiB/s wr, 265 op/s 2026-03-10T12:38:21.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:21 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:21.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:21 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:21.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:21 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:21.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:21 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:21.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:21 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:38:21.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:21 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:38:21.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:21 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:21.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:21 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:21.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:21 vm00.local ceph-mon[50686]: from='mgr.24461 
192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:38:21.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:21 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:38:21.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:21 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:38:21.649 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:21 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:38:21.724 INFO:tasks.workunit.client.1.vm07.stdout:6/752: dread - d1/d4/d6/d16/fdd zero size 2026-03-10T12:38:21.743 INFO:tasks.workunit.client.1.vm07.stdout:2/697: creat d0/d29/d64/d6c/fef x:0 0 0 2026-03-10T12:38:21.744 INFO:tasks.workunit.client.1.vm07.stdout:2/698: chown d0/d5b/d98/fe9 389 1 2026-03-10T12:38:21.745 INFO:tasks.workunit.client.1.vm07.stdout:2/699: write d0/d42/f1b [4010686,15590] 0 2026-03-10T12:38:21.761 INFO:tasks.workunit.client.1.vm07.stdout:0/891: fsync d0/d14/d5f/d41/d6a/d74/fb9 0 2026-03-10T12:38:21.776 INFO:tasks.workunit.client.1.vm07.stdout:9/854: fdatasync d5/d16/d23/fee 0 2026-03-10T12:38:21.799 INFO:tasks.workunit.client.0.vm00.stdout:7/808: dwrite da/d26/d37/d56/fed [0,4194304] 0 2026-03-10T12:38:21.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:21 vm07.local ceph-mon[58582]: pgmap v7: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 43 MiB/s rd, 106 MiB/s wr, 265 op/s 2026-03-10T12:38:21.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:21 vm07.local ceph-mon[58582]: from='mgr.24461 ' 
entity='mgr.vm07.kfawlb' 2026-03-10T12:38:21.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:21 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:21.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:21 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:21.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:21 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:21.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:21 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:38:21.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:21 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:38:21.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:21 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:21.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:21 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:21.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:21 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:38:21.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:21 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:38:21.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:21 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' 
entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:38:21.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:21 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:38:21.835 INFO:tasks.workunit.client.0.vm00.stdout:7/809: symlink da/d26/d37/d56/ddf/l120 0 2026-03-10T12:38:21.853 INFO:tasks.workunit.client.0.vm00.stdout:7/810: dwrite da/d25/d2c/d82/ff5 [0,4194304] 0 2026-03-10T12:38:21.854 INFO:tasks.workunit.client.0.vm00.stdout:7/811: read - da/d26/d37/d56/ddf/d108/f10a zero size 2026-03-10T12:38:21.893 INFO:tasks.workunit.client.1.vm07.stdout:2/700: unlink d0/d5b/cbe 0 2026-03-10T12:38:21.894 INFO:tasks.workunit.client.0.vm00.stdout:7/812: getdents da/d1b 0 2026-03-10T12:38:21.895 INFO:tasks.workunit.client.0.vm00.stdout:7/813: stat da/d26/d37/d56 0 2026-03-10T12:38:21.898 INFO:tasks.workunit.client.1.vm07.stdout:0/892: fsync d0/d14/d5f/d76/d2f/d31/d4f/d9d/dd4/fd5 0 2026-03-10T12:38:21.915 INFO:tasks.workunit.client.0.vm00.stdout:7/814: dread da/d1b/d40/f44 [0,4194304] 0 2026-03-10T12:38:21.926 INFO:tasks.workunit.client.1.vm07.stdout:4/886: fsync d0/d4/df2/df6/d46/d76/fa2 0 2026-03-10T12:38:21.929 INFO:tasks.workunit.client.1.vm07.stdout:8/748: mknod d1/d3/d40/d92/cf7 0 2026-03-10T12:38:21.932 INFO:tasks.workunit.client.1.vm07.stdout:3/804: truncate dc/dd/d28/d3b/f4d 3740717 0 2026-03-10T12:38:21.998 INFO:tasks.workunit.client.1.vm07.stdout:5/802: dwrite d0/d22/f93 [0,4194304] 0 2026-03-10T12:38:22.013 INFO:tasks.workunit.client.1.vm07.stdout:1/781: dwrite d9/d2d/d4f/d5a/f6e [0,4194304] 0 2026-03-10T12:38:22.013 INFO:tasks.workunit.client.1.vm07.stdout:7/725: write d0/d61/db4/d8a/fbe [256940,61162] 0 2026-03-10T12:38:22.015 INFO:tasks.workunit.client.1.vm07.stdout:2/701: unlink d0/d42/d4e/dab/cd8 0 2026-03-10T12:38:22.029 INFO:tasks.workunit.client.1.vm07.stdout:4/887: creat 
d0/d8e/f13d x:0 0 0 2026-03-10T12:38:22.033 INFO:tasks.workunit.client.1.vm07.stdout:6/753: write d1/d4/d6/d4e/fa1 [994769,93458] 0 2026-03-10T12:38:22.034 INFO:tasks.workunit.client.0.vm00.stdout:0/925: rename d3/d7/d4c to d3/d7/db0/dc4/de5/d126 0 2026-03-10T12:38:22.046 INFO:tasks.workunit.client.1.vm07.stdout:5/803: dread d0/d22/d18/d19/de5/f105 [0,4194304] 0 2026-03-10T12:38:22.087 INFO:tasks.workunit.client.0.vm00.stdout:6/745: mknod d2/d16/d29/c109 0 2026-03-10T12:38:22.087 INFO:tasks.workunit.client.0.vm00.stdout:6/746: chown d2/d16/f20 59613384 1 2026-03-10T12:38:22.087 INFO:tasks.workunit.client.0.vm00.stdout:6/747: mkdir d2/da/dc/d2f/d10a 0 2026-03-10T12:38:22.087 INFO:tasks.workunit.client.0.vm00.stdout:0/926: link d3/db/d24/d25/f3f d3/d7/db0/f127 0 2026-03-10T12:38:22.087 INFO:tasks.workunit.client.0.vm00.stdout:6/748: creat d2/d16/d29/d31/d88/d92/f10b x:0 0 0 2026-03-10T12:38:22.087 INFO:tasks.workunit.client.1.vm07.stdout:1/782: dwrite d9/d2d/de2/fbf [0,4194304] 0 2026-03-10T12:38:22.087 INFO:tasks.workunit.client.1.vm07.stdout:2/702: truncate d0/d29/d64/d6c/f71 1836171 0 2026-03-10T12:38:22.087 INFO:tasks.workunit.client.1.vm07.stdout:0/893: mknod d0/d14/d5f/d76/d2f/d31/df0/d105/c12d 0 2026-03-10T12:38:22.087 INFO:tasks.workunit.client.1.vm07.stdout:6/754: dread - d1/d4/d6/d46/d4d/fdf zero size 2026-03-10T12:38:22.087 INFO:tasks.workunit.client.1.vm07.stdout:5/804: mknod d0/d22/d18/d3e/d5d/d10b/c119 0 2026-03-10T12:38:22.089 INFO:tasks.workunit.client.0.vm00.stdout:6/749: truncate d2/da/dc/d2f/f56 1952063 0 2026-03-10T12:38:22.090 INFO:tasks.workunit.client.1.vm07.stdout:2/703: truncate d0/f4 3331351 0 2026-03-10T12:38:22.103 INFO:tasks.workunit.client.1.vm07.stdout:2/704: dwrite d0/d42/d26/d7d/fe8 [0,4194304] 0 2026-03-10T12:38:22.108 INFO:tasks.workunit.client.1.vm07.stdout:1/783: dread d9/d2d/d4f/dde/fef [0,4194304] 0 2026-03-10T12:38:22.122 INFO:tasks.workunit.client.1.vm07.stdout:2/705: dwrite d0/d5b/d98/fe9 [0,4194304] 0 2026-03-10T12:38:22.122 
INFO:tasks.workunit.client.1.vm07.stdout:4/888: creat d0/d4/d5/d34/d127/f13e x:0 0 0 2026-03-10T12:38:22.136 INFO:tasks.workunit.client.1.vm07.stdout:6/755: rename d1/d4/d6/cb6 to d1/dd7/d66/cfa 0 2026-03-10T12:38:22.138 INFO:tasks.workunit.client.1.vm07.stdout:8/749: link d1/d3/d6/l17 d1/d3/d6/lf8 0 2026-03-10T12:38:22.140 INFO:tasks.workunit.client.1.vm07.stdout:4/889: symlink d0/d4/d10/d9a/d124/l13f 0 2026-03-10T12:38:22.141 INFO:tasks.workunit.client.1.vm07.stdout:7/726: link d0/d57/f9f d0/d57/ff3 0 2026-03-10T12:38:22.146 INFO:tasks.workunit.client.1.vm07.stdout:2/706: link d0/d45/lb1 d0/d29/d64/d74/d75/lf0 0 2026-03-10T12:38:22.149 INFO:tasks.workunit.client.1.vm07.stdout:4/890: mknod d0/d4/d5/da/c140 0 2026-03-10T12:38:22.151 INFO:tasks.workunit.client.1.vm07.stdout:7/727: mkdir d0/d61/db4/df4 0 2026-03-10T12:38:22.160 INFO:tasks.workunit.client.1.vm07.stdout:7/728: dread - d0/d61/db4/d8a/fd8 zero size 2026-03-10T12:38:22.160 INFO:tasks.workunit.client.1.vm07.stdout:4/891: mknod d0/d4/d10/d3c/d2b/d54/de1/c141 0 2026-03-10T12:38:22.160 INFO:tasks.workunit.client.1.vm07.stdout:7/729: unlink d0/f10 0 2026-03-10T12:38:22.165 INFO:tasks.workunit.client.1.vm07.stdout:4/892: read d0/d4/d10/d3c/d2b/f60 [2358892,104486] 0 2026-03-10T12:38:22.166 INFO:tasks.workunit.client.1.vm07.stdout:4/893: rename d0/d4/d5/d78 to d0/d4/d5/d78/dc5/df7/d142 22 2026-03-10T12:38:22.171 INFO:tasks.workunit.client.1.vm07.stdout:4/894: truncate d0/d4/d10/d9a/d124/fb4 1073233 0 2026-03-10T12:38:22.173 INFO:tasks.workunit.client.1.vm07.stdout:4/895: chown d0/d4/fb8 4872041 1 2026-03-10T12:38:22.198 INFO:tasks.workunit.client.1.vm07.stdout:3/805: sync 2026-03-10T12:38:22.198 INFO:tasks.workunit.client.1.vm07.stdout:6/756: sync 2026-03-10T12:38:22.207 INFO:tasks.workunit.client.0.vm00.stdout:7/815: dwrite da/d41/fa0 [0,4194304] 0 2026-03-10T12:38:22.209 INFO:tasks.workunit.client.0.vm00.stdout:7/816: write da/d25/d2e/f9c [3741779,1588] 0 2026-03-10T12:38:22.230 
INFO:tasks.workunit.client.1.vm07.stdout:6/757: fsync d1/d4/d6/d16/fbc 0 2026-03-10T12:38:22.236 INFO:tasks.workunit.client.1.vm07.stdout:3/806: link dc/dd/f19 dc/dd/d28/d7a/f117 0 2026-03-10T12:38:22.244 INFO:tasks.workunit.client.0.vm00.stdout:7/817: rmdir da/d25/d2e 39 2026-03-10T12:38:22.246 INFO:tasks.workunit.client.1.vm07.stdout:3/807: sync 2026-03-10T12:38:22.247 INFO:tasks.workunit.client.0.vm00.stdout:7/818: readlink da/d26/d50/d73/lff 0 2026-03-10T12:38:22.292 INFO:tasks.workunit.client.0.vm00.stdout:7/819: dread da/d1b/d40/f74 [0,4194304] 0 2026-03-10T12:38:22.293 INFO:tasks.workunit.client.0.vm00.stdout:7/820: chown da/d1b/f39 5 1 2026-03-10T12:38:22.297 INFO:tasks.workunit.client.0.vm00.stdout:7/821: creat da/d41/d7b/f121 x:0 0 0 2026-03-10T12:38:22.299 INFO:tasks.workunit.client.0.vm00.stdout:7/822: mkdir da/d25/d2c/d122 0 2026-03-10T12:38:22.309 INFO:tasks.workunit.client.0.vm00.stdout:7/823: rename da/d26/d50/d73/fd7 to da/d25/d2e/f123 0 2026-03-10T12:38:22.317 INFO:tasks.workunit.client.0.vm00.stdout:7/824: getdents da/d1b/d40 0 2026-03-10T12:38:22.320 INFO:tasks.workunit.client.0.vm00.stdout:7/825: write da/d25/d2c/d82/d68/fcd [3356264,7142] 0 2026-03-10T12:38:22.325 INFO:tasks.workunit.client.0.vm00.stdout:7/826: mkdir da/d25/d2c/d82/d68/d124 0 2026-03-10T12:38:22.325 INFO:tasks.workunit.client.0.vm00.stdout:7/827: chown da/d26/d37/f6f 100305 1 2026-03-10T12:38:22.326 INFO:tasks.workunit.client.0.vm00.stdout:7/828: chown da/d3f/d71/le6 409928 1 2026-03-10T12:38:22.326 INFO:tasks.workunit.client.0.vm00.stdout:7/829: chown da/d25/f2b 493540681 1 2026-03-10T12:38:22.329 INFO:tasks.workunit.client.1.vm07.stdout:9/855: dwrite d5/f121 [0,4194304] 0 2026-03-10T12:38:22.341 INFO:tasks.workunit.client.1.vm07.stdout:9/856: creat d5/d1f/d5e/d6b/de0/f124 x:0 0 0 2026-03-10T12:38:22.354 INFO:tasks.workunit.client.1.vm07.stdout:9/857: creat d5/d13/d2c/de6/dce/d120/f125 x:0 0 0 2026-03-10T12:38:22.355 INFO:tasks.workunit.client.0.vm00.stdout:0/927: dwrite 
d3/db/d24/fb1 [0,4194304] 0 2026-03-10T12:38:22.359 INFO:tasks.workunit.client.1.vm07.stdout:9/858: write d5/d1f/d5e/d6b/de0/f124 [187925,55562] 0 2026-03-10T12:38:22.363 INFO:tasks.workunit.client.1.vm07.stdout:0/894: dwrite d0/d14/d7c/f10f [0,4194304] 0 2026-03-10T12:38:22.372 INFO:tasks.workunit.client.0.vm00.stdout:0/928: rmdir d3/d7/db0/dc4/de5/d126/d5b 39 2026-03-10T12:38:22.375 INFO:tasks.workunit.client.0.vm00.stdout:6/750: fsync d2/da/dc/d2f/f56 0 2026-03-10T12:38:22.375 INFO:tasks.workunit.client.0.vm00.stdout:6/751: write d2/da/f77 [475518,28375] 0 2026-03-10T12:38:22.376 INFO:tasks.workunit.client.1.vm07.stdout:9/859: creat d5/d13/d6c/da4/d102/f126 x:0 0 0 2026-03-10T12:38:22.382 INFO:tasks.workunit.client.1.vm07.stdout:5/805: write d0/d22/d18/d19/d21/fbd [605307,82533] 0 2026-03-10T12:38:22.395 INFO:tasks.workunit.client.0.vm00.stdout:0/929: rmdir d3/d40/d65 39 2026-03-10T12:38:22.395 INFO:tasks.workunit.client.1.vm07.stdout:1/784: dwrite d9/df/d29/d2b/d30/f38 [0,4194304] 0 2026-03-10T12:38:22.395 INFO:tasks.workunit.client.1.vm07.stdout:8/750: dwrite d1/d3/d6c/f9b [0,4194304] 0 2026-03-10T12:38:22.397 INFO:tasks.workunit.client.0.vm00.stdout:6/752: mkdir d2/d42/d80/d9d/d10c 0 2026-03-10T12:38:22.401 INFO:tasks.workunit.client.1.vm07.stdout:2/707: dwrite d0/d42/d1f/fbf [0,4194304] 0 2026-03-10T12:38:22.404 INFO:tasks.workunit.client.1.vm07.stdout:7/730: write d0/d57/d62/f75 [1155,116715] 0 2026-03-10T12:38:22.408 INFO:tasks.workunit.client.1.vm07.stdout:4/896: dwrite d0/d4/d10/d9a/db9/f10a [0,4194304] 0 2026-03-10T12:38:22.409 INFO:tasks.workunit.client.1.vm07.stdout:4/897: stat d0/d4/d10/d9a/db9/l137 0 2026-03-10T12:38:22.410 INFO:tasks.workunit.client.1.vm07.stdout:4/898: dread - d0/d4/d5/d78/dc5/df7/db2/dd5/d12b/f136 zero size 2026-03-10T12:38:22.412 INFO:tasks.workunit.client.0.vm00.stdout:6/753: symlink d2/d16/d29/d31/d34/l10d 0 2026-03-10T12:38:22.416 INFO:tasks.workunit.client.0.vm00.stdout:7/830: dread da/d26/d37/fc4 [4194304,4194304] 0 
2026-03-10T12:38:22.417 INFO:tasks.workunit.client.1.vm07.stdout:8/751: dread - d1/d3/d11/f9f zero size 2026-03-10T12:38:22.425 INFO:tasks.workunit.client.1.vm07.stdout:6/758: dwrite d1/d4/d6/d16/d1a/d6e/fbe [0,4194304] 0 2026-03-10T12:38:22.425 INFO:tasks.workunit.client.1.vm07.stdout:6/759: chown d1/fc9 124774414 1 2026-03-10T12:38:22.430 INFO:tasks.workunit.client.0.vm00.stdout:6/754: creat d2/da/dc/d83/f10e x:0 0 0 2026-03-10T12:38:22.437 INFO:tasks.workunit.client.1.vm07.stdout:3/808: write dc/dd/d1f/dac/fd7 [2278795,119790] 0 2026-03-10T12:38:22.438 INFO:tasks.workunit.client.1.vm07.stdout:2/708: write d0/d42/f22 [3333121,77940] 0 2026-03-10T12:38:22.446 INFO:tasks.workunit.client.0.vm00.stdout:7/831: symlink da/d1b/l125 0 2026-03-10T12:38:22.448 INFO:tasks.workunit.client.1.vm07.stdout:7/731: dread d0/d61/d79/f8d [0,4194304] 0 2026-03-10T12:38:22.455 INFO:tasks.workunit.client.1.vm07.stdout:5/806: dread d0/d22/d18/d3e/d53/d9e/f76 [0,4194304] 0 2026-03-10T12:38:22.467 INFO:tasks.workunit.client.1.vm07.stdout:6/760: creat d1/d4/d6/d46/d4d/dc7/dd9/ffb x:0 0 0 2026-03-10T12:38:22.467 INFO:tasks.workunit.client.1.vm07.stdout:6/761: chown d1/d4/d6/d46/d4d/l75 125 1 2026-03-10T12:38:22.467 INFO:tasks.workunit.client.1.vm07.stdout:7/732: write d0/d52/fc0 [990861,80819] 0 2026-03-10T12:38:22.467 INFO:tasks.workunit.client.0.vm00.stdout:7/832: dread da/d1b/d40/f44 [0,4194304] 0 2026-03-10T12:38:22.467 INFO:tasks.workunit.client.0.vm00.stdout:7/833: write da/d47/dfd/fa9 [269230,106887] 0 2026-03-10T12:38:22.467 INFO:tasks.workunit.client.0.vm00.stdout:7/834: unlink da/d25/d2c/c3a 0 2026-03-10T12:38:22.467 INFO:tasks.workunit.client.0.vm00.stdout:7/835: readlink da/d41/d48/l59 0 2026-03-10T12:38:22.468 INFO:tasks.workunit.client.1.vm07.stdout:5/807: fsync d0/d22/d18/d19/d2e/d67/fa0 0 2026-03-10T12:38:22.470 INFO:tasks.workunit.client.0.vm00.stdout:7/836: creat da/d25/d2e/d4c/f126 x:0 0 0 2026-03-10T12:38:22.471 INFO:tasks.workunit.client.1.vm07.stdout:2/709: creat 
d0/de1/ff1 x:0 0 0 2026-03-10T12:38:22.471 INFO:tasks.workunit.client.1.vm07.stdout:7/733: mkdir d0/d47/dde/df5 0 2026-03-10T12:38:22.472 INFO:tasks.workunit.client.1.vm07.stdout:1/785: getdents d9/df/d29/d2b/d31/d91 0 2026-03-10T12:38:22.475 INFO:tasks.workunit.client.0.vm00.stdout:7/837: link da/d1b/d40/fca da/d1b/f127 0 2026-03-10T12:38:22.476 INFO:tasks.workunit.client.1.vm07.stdout:2/710: mkdir d0/de1/df2 0 2026-03-10T12:38:22.477 INFO:tasks.workunit.client.1.vm07.stdout:1/786: stat d9/df/l90 0 2026-03-10T12:38:22.481 INFO:tasks.workunit.client.1.vm07.stdout:4/899: getdents d0/d4/df2 0 2026-03-10T12:38:22.484 INFO:tasks.workunit.client.1.vm07.stdout:4/900: chown d0/d4/df2/df6/d46/d76/fae 66180919 1 2026-03-10T12:38:22.484 INFO:tasks.workunit.client.1.vm07.stdout:6/762: creat d1/d4/d6/ffc x:0 0 0 2026-03-10T12:38:22.488 INFO:tasks.workunit.client.1.vm07.stdout:7/734: unlink d0/d61/db4/d8a/d9d/fb7 0 2026-03-10T12:38:22.495 INFO:tasks.workunit.client.1.vm07.stdout:8/752: sync 2026-03-10T12:38:22.495 INFO:tasks.workunit.client.0.vm00.stdout:6/755: sync 2026-03-10T12:38:22.501 INFO:tasks.workunit.client.1.vm07.stdout:6/763: creat d1/d4/d6/d16/d1a/d2c/ffd x:0 0 0 2026-03-10T12:38:22.502 INFO:tasks.workunit.client.0.vm00.stdout:7/838: mkdir da/d25/d2e/d4c/d128 0 2026-03-10T12:38:22.503 INFO:tasks.workunit.client.1.vm07.stdout:7/735: creat d0/d47/dde/ff6 x:0 0 0 2026-03-10T12:38:22.505 INFO:tasks.workunit.client.1.vm07.stdout:1/787: mkdir d9/dff/d103 0 2026-03-10T12:38:22.508 INFO:tasks.workunit.client.1.vm07.stdout:1/788: rmdir d9/df/d55 39 2026-03-10T12:38:22.509 INFO:tasks.workunit.client.1.vm07.stdout:1/789: fsync d9/df/d29/d2b/d30/fd0 0 2026-03-10T12:38:22.510 INFO:tasks.workunit.client.0.vm00.stdout:6/756: creat d2/d16/d29/d31/d88/d92/daa/f10f x:0 0 0 2026-03-10T12:38:22.510 INFO:tasks.workunit.client.1.vm07.stdout:4/901: link d0/d4/d10/d3c/d2b/d54/c10c d0/d4/d5/d78/dc5/c143 0 2026-03-10T12:38:22.514 INFO:tasks.workunit.client.1.vm07.stdout:1/790: creat 
d9/d2d/d4f/d5a/f104 x:0 0 0 2026-03-10T12:38:22.516 INFO:tasks.workunit.client.0.vm00.stdout:6/757: creat d2/d14/d7a/f110 x:0 0 0 2026-03-10T12:38:22.518 INFO:tasks.workunit.client.1.vm07.stdout:9/860: rename d5/d13/d2c/de6/dce/d120 to d5/d13/d2c/de6/d64/d108/d127 0 2026-03-10T12:38:22.521 INFO:tasks.workunit.client.1.vm07.stdout:4/902: mkdir d0/d144 0 2026-03-10T12:38:22.524 INFO:tasks.workunit.client.1.vm07.stdout:4/903: mknod d0/d4/d10/d114/c145 0 2026-03-10T12:38:22.524 INFO:tasks.workunit.client.1.vm07.stdout:9/861: mkdir d5/d1f/d5e/d10a/d128 0 2026-03-10T12:38:22.526 INFO:tasks.workunit.client.1.vm07.stdout:4/904: symlink d0/d5c/l146 0 2026-03-10T12:38:22.527 INFO:tasks.workunit.client.1.vm07.stdout:9/862: truncate d5/d69/ffe 3795766 0 2026-03-10T12:38:22.531 INFO:tasks.workunit.client.1.vm07.stdout:4/905: rename d0/d4/d10/d9a/d124/cfd to d0/d4/d10/d3c/d2b/d54/de1/c147 0 2026-03-10T12:38:22.531 INFO:tasks.workunit.client.1.vm07.stdout:4/906: write d0/d4/d10/d114/f117 [1002114,62984] 0 2026-03-10T12:38:22.532 INFO:tasks.workunit.client.1.vm07.stdout:4/907: chown d0/d4/df2/df6/d46/f85 1869608 1 2026-03-10T12:38:22.534 INFO:tasks.workunit.client.1.vm07.stdout:4/908: dread - d0/d4/d5/ffc zero size 2026-03-10T12:38:22.535 INFO:tasks.workunit.client.1.vm07.stdout:8/753: sync 2026-03-10T12:38:22.535 INFO:tasks.workunit.client.1.vm07.stdout:7/736: sync 2026-03-10T12:38:22.535 INFO:tasks.workunit.client.1.vm07.stdout:6/764: sync 2026-03-10T12:38:22.543 INFO:tasks.workunit.client.1.vm07.stdout:0/895: dwrite d0/d14/f19 [4194304,4194304] 0 2026-03-10T12:38:22.550 INFO:tasks.workunit.client.1.vm07.stdout:6/765: read d1/d4/d6/f8d [102692,40320] 0 2026-03-10T12:38:22.552 INFO:tasks.workunit.client.1.vm07.stdout:8/754: fdatasync d1/d3/d6c/fda 0 2026-03-10T12:38:22.571 INFO:tasks.workunit.client.1.vm07.stdout:8/755: readlink d1/l37 0 2026-03-10T12:38:22.583 INFO:tasks.workunit.client.0.vm00.stdout:0/930: dwrite d3/d7/db0/dc4/de5/d126/d9d/ffb [0,4194304] 0 
2026-03-10T12:38:22.583 INFO:tasks.workunit.client.0.vm00.stdout:0/931: fsync d3/d40/f10a 0 2026-03-10T12:38:22.593 INFO:tasks.workunit.client.1.vm07.stdout:3/809: dwrite dc/dd/fc5 [0,4194304] 0 2026-03-10T12:38:22.594 INFO:tasks.workunit.client.1.vm07.stdout:3/810: write dc/dd/d28/d3b/f70 [172933,114633] 0 2026-03-10T12:38:22.609 INFO:tasks.workunit.client.1.vm07.stdout:5/808: write d0/d22/d18/d19/d2e/f62 [205663,105370] 0 2026-03-10T12:38:22.617 INFO:tasks.workunit.client.1.vm07.stdout:2/711: dwrite d0/f15 [4194304,4194304] 0 2026-03-10T12:38:22.630 INFO:tasks.workunit.client.0.vm00.stdout:7/839: dwrite da/d41/d7b/d9d/dba/fe3 [0,4194304] 0 2026-03-10T12:38:22.635 INFO:tasks.workunit.client.0.vm00.stdout:7/840: read - da/d3f/d60/f119 zero size 2026-03-10T12:38:22.640 INFO:tasks.workunit.client.0.vm00.stdout:7/841: dwrite da/d41/d7b/d9d/dba/fe3 [0,4194304] 0 2026-03-10T12:38:22.642 INFO:tasks.workunit.client.0.vm00.stdout:0/932: symlink d3/d7/db0/dc4/de5/d126/d5b/d38/db3/de2/l128 0 2026-03-10T12:38:22.642 INFO:tasks.workunit.client.1.vm07.stdout:1/791: write d9/df/f15 [2100722,33178] 0 2026-03-10T12:38:22.645 INFO:tasks.workunit.client.1.vm07.stdout:3/811: dread dc/f17 [0,4194304] 0 2026-03-10T12:38:22.647 INFO:tasks.workunit.client.1.vm07.stdout:3/812: read - dc/dd/d28/d7a/d8e/f10a zero size 2026-03-10T12:38:22.648 INFO:tasks.workunit.client.0.vm00.stdout:6/758: chown d2/d51/lfb 60 1 2026-03-10T12:38:22.650 INFO:tasks.workunit.client.1.vm07.stdout:9/863: write d5/d1f/d75/fbc [2022440,32925] 0 2026-03-10T12:38:22.651 INFO:tasks.workunit.client.1.vm07.stdout:9/864: chown d5/d13/d6c/d89/dac 8 1 2026-03-10T12:38:22.653 INFO:tasks.workunit.client.1.vm07.stdout:4/909: write d0/d4/d10/d3c/d2b/d54/de1/f91 [596505,50325] 0 2026-03-10T12:38:22.656 INFO:tasks.workunit.client.1.vm07.stdout:8/756: unlink d1/d3/d6/lf8 0 2026-03-10T12:38:22.657 INFO:tasks.workunit.client.1.vm07.stdout:0/896: write d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/faf [2442218,73597] 0 2026-03-10T12:38:22.657 
INFO:tasks.workunit.client.1.vm07.stdout:4/910: dread - d0/d4/d5/da/d95/f121 zero size 2026-03-10T12:38:22.658 INFO:tasks.workunit.client.1.vm07.stdout:6/766: write d1/d4/d6/d16/fdd [633787,96907] 0 2026-03-10T12:38:22.660 INFO:tasks.workunit.client.0.vm00.stdout:7/842: mknod da/c129 0 2026-03-10T12:38:22.661 INFO:tasks.workunit.client.1.vm07.stdout:1/792: creat d9/df/d29/d2b/d92/d9d/f105 x:0 0 0 2026-03-10T12:38:22.662 INFO:tasks.workunit.client.0.vm00.stdout:0/933: dwrite d3/d33/f4d [0,4194304] 0 2026-03-10T12:38:22.662 INFO:tasks.workunit.client.1.vm07.stdout:5/809: mknod d0/d22/d18/d19/d21/d54/dcb/c11a 0 2026-03-10T12:38:22.663 INFO:tasks.workunit.client.1.vm07.stdout:7/737: write d0/d61/f69 [2407991,15460] 0 2026-03-10T12:38:22.664 INFO:tasks.workunit.client.1.vm07.stdout:9/865: dwrite d5/d13/d2c/de6/d64/d108/d127/f125 [0,4194304] 0 2026-03-10T12:38:22.670 INFO:tasks.workunit.client.0.vm00.stdout:6/759: creat d2/d16/d29/f111 x:0 0 0 2026-03-10T12:38:22.670 INFO:tasks.workunit.client.1.vm07.stdout:6/767: write d1/d4/d6/ffc [698995,70563] 0 2026-03-10T12:38:22.670 INFO:tasks.workunit.client.0.vm00.stdout:6/760: readlink d2/d16/ldd 0 2026-03-10T12:38:22.671 INFO:tasks.workunit.client.1.vm07.stdout:9/866: chown d5/d16/d23/lb8 3 1 2026-03-10T12:38:22.677 INFO:tasks.workunit.client.1.vm07.stdout:0/897: unlink d0/d14/ldd 0 2026-03-10T12:38:22.678 INFO:tasks.workunit.client.0.vm00.stdout:0/934: dread d3/d7/f70 [4194304,4194304] 0 2026-03-10T12:38:22.682 INFO:tasks.workunit.client.1.vm07.stdout:6/768: dread d1/d4/d6/d16/d1a/d33/fd2 [0,4194304] 0 2026-03-10T12:38:22.688 INFO:tasks.workunit.client.1.vm07.stdout:4/911: fdatasync d0/d4/df2/f108 0 2026-03-10T12:38:22.689 INFO:tasks.workunit.client.1.vm07.stdout:0/898: dwrite d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/dd0/f12b [0,4194304] 0 2026-03-10T12:38:22.693 INFO:tasks.workunit.client.1.vm07.stdout:0/899: write d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/faf [1532338,50923] 0 2026-03-10T12:38:22.704 
INFO:tasks.workunit.client.0.vm00.stdout:0/935: mknod d3/d7/db0/dc4/de5/d126/c129 0 2026-03-10T12:38:22.713 INFO:tasks.workunit.client.1.vm07.stdout:7/738: dread d0/d47/f58 [0,4194304] 0 2026-03-10T12:38:22.713 INFO:tasks.workunit.client.0.vm00.stdout:0/936: chown d3/d7/db0/dc4/de5/d126/d5b/d38/d44 895 1 2026-03-10T12:38:22.713 INFO:tasks.workunit.client.0.vm00.stdout:0/937: write d3/d7/db0/dc4/f121 [15001,79735] 0 2026-03-10T12:38:22.714 INFO:tasks.workunit.client.0.vm00.stdout:6/761: mknod d2/d51/c112 0 2026-03-10T12:38:22.722 INFO:tasks.workunit.client.0.vm00.stdout:0/938: creat d3/d7/db0/dc4/de5/d126/d5b/d38/d44/df9/f12a x:0 0 0 2026-03-10T12:38:22.724 INFO:tasks.workunit.client.1.vm07.stdout:5/810: fsync d0/d22/d18/d19/d2e/d67/fa0 0 2026-03-10T12:38:22.728 INFO:tasks.workunit.client.0.vm00.stdout:0/939: truncate d3/d22/d3a/fd9 1364416 0 2026-03-10T12:38:22.730 INFO:tasks.workunit.client.1.vm07.stdout:8/757: mknod d1/d3/db2/cf9 0 2026-03-10T12:38:22.735 INFO:tasks.workunit.client.0.vm00.stdout:6/762: unlink d2/d16/f23 0 2026-03-10T12:38:22.737 INFO:tasks.workunit.client.0.vm00.stdout:0/940: dread d3/db/d77/ff7 [0,4194304] 0 2026-03-10T12:38:22.746 INFO:tasks.workunit.client.0.vm00.stdout:6/763: fsync d2/d16/d29/d31/d88/d92/daa/dc1/fc9 0 2026-03-10T12:38:22.749 INFO:tasks.workunit.client.0.vm00.stdout:0/941: readlink d3/d7/db0/dc4/de5/d126/d5b/l32 0 2026-03-10T12:38:22.751 INFO:tasks.workunit.client.1.vm07.stdout:4/912: mkdir d0/d4/d5/d78/dc5/df7/db2/dd5/d12b/d148 0 2026-03-10T12:38:22.756 INFO:tasks.workunit.client.0.vm00.stdout:6/764: mknod d2/d42/d80/d89/c113 0 2026-03-10T12:38:22.757 INFO:tasks.workunit.client.1.vm07.stdout:7/739: creat d0/d67/ff7 x:0 0 0 2026-03-10T12:38:22.759 INFO:tasks.workunit.client.1.vm07.stdout:8/758: write d1/d3/d18/fd9 [3721166,11127] 0 2026-03-10T12:38:22.759 INFO:tasks.workunit.client.0.vm00.stdout:0/942: creat d3/d7/db0/dc4/de5/d126/dcc/f12b x:0 0 0 2026-03-10T12:38:22.762 INFO:tasks.workunit.client.1.vm07.stdout:4/913: creat 
d0/d4/d5/d34/d127/f149 x:0 0 0 2026-03-10T12:38:22.763 INFO:tasks.workunit.client.1.vm07.stdout:4/914: stat d0/d4/d5/da/fcb 0 2026-03-10T12:38:22.769 INFO:tasks.workunit.client.1.vm07.stdout:5/811: mknod d0/d22/d18/d3e/c11b 0 2026-03-10T12:38:22.775 INFO:tasks.workunit.client.1.vm07.stdout:8/759: rename d1/d3/d11/f86 to d1/d3/d6/d54/ffa 0 2026-03-10T12:38:22.776 INFO:tasks.workunit.client.1.vm07.stdout:8/760: chown d1/d3/d6/d50/f56 8277 1 2026-03-10T12:38:22.777 INFO:tasks.workunit.client.1.vm07.stdout:4/915: creat d0/d4/d5/d34/f14a x:0 0 0 2026-03-10T12:38:22.782 INFO:tasks.workunit.client.0.vm00.stdout:0/943: rename d3/d7/d3c to d3/d7/db0/dc4/de5/d126/d5b/d38/d12c 0 2026-03-10T12:38:22.784 INFO:tasks.workunit.client.0.vm00.stdout:0/944: write d3/db/d77/faa [4860357,121005] 0 2026-03-10T12:38:22.792 INFO:tasks.workunit.client.1.vm07.stdout:6/769: getdents d1/d4/d9b/de1 0 2026-03-10T12:38:22.798 INFO:tasks.workunit.client.1.vm07.stdout:8/761: truncate d1/d3/d6/d50/fc8 323851 0 2026-03-10T12:38:22.803 INFO:tasks.workunit.client.0.vm00.stdout:7/843: write f9 [4364124,98312] 0 2026-03-10T12:38:22.804 INFO:tasks.workunit.client.1.vm07.stdout:9/867: write d5/d13/d2c/de6/d64/d108/d127/f125 [5017345,42929] 0 2026-03-10T12:38:22.804 INFO:tasks.workunit.client.0.vm00.stdout:7/844: chown da/d26/cdb 589921 1 2026-03-10T12:38:22.804 INFO:tasks.workunit.client.0.vm00.stdout:7/845: stat da/d47 0 2026-03-10T12:38:22.805 INFO:tasks.workunit.client.0.vm00.stdout:7/846: chown da/d3f/dd1/cda 0 1 2026-03-10T12:38:22.807 INFO:tasks.workunit.client.1.vm07.stdout:2/712: dwrite d0/d29/d64/d74/d88/f51 [0,4194304] 0 2026-03-10T12:38:22.809 INFO:tasks.workunit.client.1.vm07.stdout:2/713: dwrite d0/d29/d64/d74/d75/fa5 [4194304,4194304] 0 2026-03-10T12:38:22.814 INFO:tasks.workunit.client.1.vm07.stdout:3/813: write dc/dd/d43/f61 [491877,22305] 0 2026-03-10T12:38:22.814 INFO:tasks.workunit.client.1.vm07.stdout:1/793: write d9/d2d/d4f/d5a/fdd [55311,125002] 0 2026-03-10T12:38:22.815 
INFO:tasks.workunit.client.1.vm07.stdout:3/814: dread - dc/d18/fdd zero size 2026-03-10T12:38:22.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:22 vm07.local ceph-mon[58582]: Updating vm00:/etc/ceph/ceph.conf 2026-03-10T12:38:22.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:22 vm07.local ceph-mon[58582]: Updating vm07:/etc/ceph/ceph.conf 2026-03-10T12:38:22.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:22 vm07.local ceph-mon[58582]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:38:22.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:22 vm07.local ceph-mon[58582]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:38:22.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:22 vm07.local ceph-mon[58582]: Standby manager daemon vm00.nescmq started 2026-03-10T12:38:22.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:22 vm07.local ceph-mon[58582]: from='mgr.? 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm00.nescmq/crt"}]: dispatch 2026-03-10T12:38:22.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:22 vm07.local ceph-mon[58582]: from='mgr.? 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T12:38:22.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:22 vm07.local ceph-mon[58582]: from='mgr.? 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm00.nescmq/key"}]: dispatch 2026-03-10T12:38:22.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:22 vm07.local ceph-mon[58582]: from='mgr.? 
192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T12:38:22.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:22 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:22.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:22 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:22.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:22 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:22.816 INFO:tasks.workunit.client.1.vm07.stdout:1/794: write d9/d2d/d4f/d75/d77/da7/fcd [290732,50278] 0 2026-03-10T12:38:22.823 INFO:tasks.workunit.client.1.vm07.stdout:0/900: dwrite d0/d14/d5f/d76/d2f/d31/d79/d85/fcf [0,4194304] 0 2026-03-10T12:38:22.825 INFO:tasks.workunit.client.1.vm07.stdout:4/916: mkdir d0/d4/d10/d114/d14b 0 2026-03-10T12:38:22.837 INFO:tasks.workunit.client.0.vm00.stdout:7/847: rmdir da/d47/dfd 39 2026-03-10T12:38:22.837 INFO:tasks.workunit.client.0.vm00.stdout:7/848: chown da/d26/d37/d61 51111 1 2026-03-10T12:38:22.842 INFO:tasks.workunit.client.1.vm07.stdout:8/762: mkdir d1/d3/d6/d50/d70/dfb 0 2026-03-10T12:38:22.846 INFO:tasks.workunit.client.0.vm00.stdout:7/849: mkdir da/d26/d37/d56/ddf/d108/d12a 0 2026-03-10T12:38:22.847 INFO:tasks.workunit.client.0.vm00.stdout:7/850: chown da/d25/d2c/d82/f10b 230 1 2026-03-10T12:38:22.848 INFO:tasks.workunit.client.0.vm00.stdout:7/851: chown da/d25/d2c/d82/d68/cd3 932915 1 2026-03-10T12:38:22.856 INFO:tasks.workunit.client.0.vm00.stdout:7/852: fdatasync da/d3f/d60/f110 0 2026-03-10T12:38:22.865 INFO:tasks.workunit.client.1.vm07.stdout:7/740: write d0/d52/fb9 [735667,101543] 0 2026-03-10T12:38:22.870 INFO:tasks.workunit.client.0.vm00.stdout:7/853: rename da/d3f/d60/f119 to da/d47/dfd/f12b 0 2026-03-10T12:38:22.873 INFO:tasks.workunit.client.1.vm07.stdout:0/901: rename d0/d14/d5f/d76/d2f/l4a to 
d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/dd0/l12e 0 2026-03-10T12:38:22.878 INFO:tasks.workunit.client.1.vm07.stdout:1/795: dread d9/df/d29/d6b/fa1 [0,4194304] 0 2026-03-10T12:38:22.884 INFO:tasks.workunit.client.0.vm00.stdout:7/854: getdents da/d26/d37/d56/ddf/d108/d12a 0 2026-03-10T12:38:22.884 INFO:tasks.workunit.client.1.vm07.stdout:4/917: dread d0/d4/d5/d8f/fdd [0,4194304] 0 2026-03-10T12:38:22.884 INFO:tasks.workunit.client.1.vm07.stdout:6/770: symlink d1/dd7/da3/dd8/lfe 0 2026-03-10T12:38:22.884 INFO:tasks.workunit.client.1.vm07.stdout:9/868: link d5/d13/d2c/de6/d64/f10f d5/d13/d6c/d89/dac/f129 0 2026-03-10T12:38:22.885 INFO:tasks.workunit.client.1.vm07.stdout:9/869: fsync d5/d13/d2c/de6/d64/d108/d127/f125 0 2026-03-10T12:38:22.887 INFO:tasks.workunit.client.1.vm07.stdout:9/870: dread d5/d1f/d5e/d6b/de0/f124 [0,4194304] 0 2026-03-10T12:38:22.889 INFO:tasks.workunit.client.0.vm00.stdout:7/855: mknod da/d1b/d40/c12c 0 2026-03-10T12:38:22.890 INFO:tasks.workunit.client.1.vm07.stdout:7/741: mknod d0/d47/dab/cf8 0 2026-03-10T12:38:22.891 INFO:tasks.workunit.client.0.vm00.stdout:7/856: mknod da/d41/d48/d81/c12d 0 2026-03-10T12:38:22.892 INFO:tasks.workunit.client.0.vm00.stdout:7/857: truncate da/d3f/d71/f95 493070 0 2026-03-10T12:38:22.893 INFO:tasks.workunit.client.0.vm00.stdout:7/858: dread da/d41/d7b/d9d/fc2 [0,4194304] 0 2026-03-10T12:38:22.894 INFO:tasks.workunit.client.1.vm07.stdout:5/812: write d0/d22/d18/f4c [249383,22455] 0 2026-03-10T12:38:22.899 INFO:tasks.workunit.client.0.vm00.stdout:0/945: dwrite d3/d22/f54 [0,4194304] 0 2026-03-10T12:38:22.900 INFO:tasks.workunit.client.0.vm00.stdout:7/859: mkdir da/d41/d7b/d9d/dc8/d12e 0 2026-03-10T12:38:22.900 INFO:tasks.workunit.client.0.vm00.stdout:7/860: dread - da/d26/d37/ffc zero size 2026-03-10T12:38:22.913 INFO:tasks.workunit.client.0.vm00.stdout:7/861: dread da/f35 [0,4194304] 0 2026-03-10T12:38:22.914 INFO:tasks.workunit.client.1.vm07.stdout:0/902: mknod d0/d14/d5f/d76/d2f/df4/d12c/c12f 0 
2026-03-10T12:38:22.916 INFO:tasks.workunit.client.1.vm07.stdout:3/815: write dc/d18/d99/da3/fd2 [5179860,21789] 0 2026-03-10T12:38:22.934 INFO:tasks.workunit.client.1.vm07.stdout:9/871: fdatasync d5/d13/d57/d4f/d6a/fba 0 2026-03-10T12:38:22.941 INFO:tasks.workunit.client.1.vm07.stdout:8/763: write d1/d3/d11/f35 [607225,590] 0 2026-03-10T12:38:22.943 INFO:tasks.workunit.client.1.vm07.stdout:2/714: dwrite d0/d80/d93/fd6 [0,4194304] 0 2026-03-10T12:38:22.945 INFO:tasks.workunit.client.1.vm07.stdout:8/764: dread d1/f88 [0,4194304] 0 2026-03-10T12:38:22.946 INFO:tasks.workunit.client.0.vm00.stdout:7/862: creat da/d25/d2c/d82/d101/f12f x:0 0 0 2026-03-10T12:38:22.947 INFO:tasks.workunit.client.0.vm00.stdout:7/863: write da/d41/fa0 [3328645,38214] 0 2026-03-10T12:38:22.947 INFO:tasks.workunit.client.0.vm00.stdout:7/864: fdatasync da/f16 0 2026-03-10T12:38:22.960 INFO:tasks.workunit.client.1.vm07.stdout:5/813: readlink d0/d22/d18/d3e/d53/d9e/l60 0 2026-03-10T12:38:22.964 INFO:tasks.workunit.client.1.vm07.stdout:5/814: dwrite d0/d22/d18/d19/d21/f113 [0,4194304] 0 2026-03-10T12:38:22.972 INFO:tasks.workunit.client.1.vm07.stdout:0/903: creat d0/d14/d5f/d41/d6a/d9a/f130 x:0 0 0 2026-03-10T12:38:22.974 INFO:tasks.workunit.client.1.vm07.stdout:1/796: write d9/df/d29/fd5 [119509,13006] 0 2026-03-10T12:38:22.978 INFO:tasks.workunit.client.0.vm00.stdout:0/946: write d3/d22/fde [1101773,34819] 0 2026-03-10T12:38:22.979 INFO:tasks.workunit.client.1.vm07.stdout:6/771: truncate d1/f3d 2170602 0 2026-03-10T12:38:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:22 vm00.local ceph-mon[50686]: Updating vm00:/etc/ceph/ceph.conf 2026-03-10T12:38:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:22 vm00.local ceph-mon[50686]: Updating vm07:/etc/ceph/ceph.conf 2026-03-10T12:38:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:22 vm00.local ceph-mon[50686]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 
2026-03-10T12:38:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:22 vm00.local ceph-mon[50686]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:38:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:22 vm00.local ceph-mon[50686]: Standby manager daemon vm00.nescmq started 2026-03-10T12:38:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:22 vm00.local ceph-mon[50686]: from='mgr.? 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm00.nescmq/crt"}]: dispatch 2026-03-10T12:38:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:22 vm00.local ceph-mon[50686]: from='mgr.? 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T12:38:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:22 vm00.local ceph-mon[50686]: from='mgr.? 192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm00.nescmq/key"}]: dispatch 2026-03-10T12:38:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:22 vm00.local ceph-mon[50686]: from='mgr.? 
192.168.123.100:0/2' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T12:38:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:22 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:22 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:22.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:22 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:22.987 INFO:tasks.workunit.client.0.vm00.stdout:0/947: creat d3/d7/f12d x:0 0 0 2026-03-10T12:38:22.988 INFO:tasks.workunit.client.0.vm00.stdout:0/948: chown d3/d7/db0/dc4/de5/d126/d5b/d38/db3/fbe 121898716 1 2026-03-10T12:38:22.990 INFO:tasks.workunit.client.1.vm07.stdout:8/765: read - d1/d3/fb5 zero size 2026-03-10T12:38:22.991 INFO:tasks.workunit.client.1.vm07.stdout:8/766: write d1/d3/d6c/dde/de7/ff4 [762635,84547] 0 2026-03-10T12:38:22.995 INFO:tasks.workunit.client.0.vm00.stdout:7/865: truncate da/d26/f97 4954803 0 2026-03-10T12:38:22.995 INFO:tasks.workunit.client.0.vm00.stdout:7/866: chown da/d3f/dd1 353 1 2026-03-10T12:38:22.996 INFO:tasks.workunit.client.0.vm00.stdout:0/949: chown d3/d22/d3a/deb/ffc 9 1 2026-03-10T12:38:22.997 INFO:tasks.workunit.client.0.vm00.stdout:7/867: truncate da/d25/d2c/d82/d101/f118 774540 0 2026-03-10T12:38:22.999 INFO:tasks.workunit.client.1.vm07.stdout:7/742: dwrite d0/f4f [0,4194304] 0 2026-03-10T12:38:23.003 INFO:tasks.workunit.client.1.vm07.stdout:5/815: symlink d0/d22/d18/d19/d21/d54/dcb/de8/l11c 0 2026-03-10T12:38:23.003 INFO:tasks.workunit.client.1.vm07.stdout:5/816: fsync d0/d22/f50 0 2026-03-10T12:38:23.007 INFO:tasks.workunit.client.0.vm00.stdout:0/950: fsync d3/f9c 0 2026-03-10T12:38:23.008 INFO:tasks.workunit.client.0.vm00.stdout:7/868: dread da/f13 [0,4194304] 0 2026-03-10T12:38:23.015 INFO:tasks.workunit.client.1.vm07.stdout:3/816: 
write dc/dd/d28/f46 [759879,40805] 0 2026-03-10T12:38:23.017 INFO:tasks.workunit.client.1.vm07.stdout:1/797: dread d9/f52 [0,4194304] 0 2026-03-10T12:38:23.019 INFO:tasks.workunit.client.1.vm07.stdout:4/918: write d0/d4/d10/d9a/db9/fef [358122,1897] 0 2026-03-10T12:38:23.023 INFO:tasks.workunit.client.0.vm00.stdout:0/951: creat d3/d7/db0/dc4/de5/d126/d9d/f12e x:0 0 0 2026-03-10T12:38:23.027 INFO:tasks.workunit.client.1.vm07.stdout:9/872: dwrite d5/d13/d9d/f100 [0,4194304] 0 2026-03-10T12:38:23.035 INFO:tasks.workunit.client.1.vm07.stdout:6/772: creat d1/d4/d6/d16/d1a/d9d/fff x:0 0 0 2026-03-10T12:38:23.041 INFO:tasks.workunit.client.1.vm07.stdout:2/715: truncate d0/d29/d64/d74/f8e 729789 0 2026-03-10T12:38:23.044 INFO:tasks.workunit.client.1.vm07.stdout:5/817: rmdir d0/d22/d18/d30 39 2026-03-10T12:38:23.050 INFO:tasks.workunit.client.0.vm00.stdout:7/869: dread da/d26/d37/f6f [0,4194304] 0 2026-03-10T12:38:23.051 INFO:tasks.workunit.client.0.vm00.stdout:6/765: rename d2/d16/d29/c109 to d2/d51/d70/c114 0 2026-03-10T12:38:23.051 INFO:tasks.workunit.client.0.vm00.stdout:7/870: write da/d25/d2e/d4c/f6e [1950430,83601] 0 2026-03-10T12:38:23.052 INFO:tasks.workunit.client.0.vm00.stdout:6/766: write d2/d9f/dce/ffc [894493,105760] 0 2026-03-10T12:38:23.055 INFO:tasks.workunit.client.0.vm00.stdout:6/767: dread d2/da/dc/f25 [0,4194304] 0 2026-03-10T12:38:23.058 INFO:tasks.workunit.client.1.vm07.stdout:7/743: dread d0/f7b [0,4194304] 0 2026-03-10T12:38:23.063 INFO:tasks.workunit.client.1.vm07.stdout:9/873: dread d5/d13/d6c/fb6 [0,4194304] 0 2026-03-10T12:38:23.072 INFO:tasks.workunit.client.0.vm00.stdout:7/871: unlink da/d26/d37/d56/ddf/d108/f10a 0 2026-03-10T12:38:23.073 INFO:tasks.workunit.client.1.vm07.stdout:3/817: dread dc/d18/d2d/de5/f10e [0,4194304] 0 2026-03-10T12:38:23.075 INFO:tasks.workunit.client.1.vm07.stdout:0/904: write d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/f112 [513742,27246] 0 2026-03-10T12:38:23.077 INFO:tasks.workunit.client.0.vm00.stdout:6/768: mkdir 
d2/d14/d115 0 2026-03-10T12:38:23.078 INFO:tasks.workunit.client.0.vm00.stdout:6/769: chown d2/d42/f71 350095879 1 2026-03-10T12:38:23.083 INFO:tasks.workunit.client.0.vm00.stdout:6/770: dread d2/d16/d29/f64 [0,4194304] 0 2026-03-10T12:38:23.084 INFO:tasks.workunit.client.0.vm00.stdout:7/872: read da/d25/d2c/d82/d68/f38 [515387,125218] 0 2026-03-10T12:38:23.084 INFO:tasks.workunit.client.1.vm07.stdout:3/818: dread dc/d18/d99/da3/fb1 [4194304,4194304] 0 2026-03-10T12:38:23.085 INFO:tasks.workunit.client.1.vm07.stdout:3/819: stat dc/dd/f22 0 2026-03-10T12:38:23.091 INFO:tasks.workunit.client.0.vm00.stdout:0/952: write d3/d22/f71 [1365403,42500] 0 2026-03-10T12:38:23.098 INFO:tasks.workunit.client.0.vm00.stdout:7/873: symlink da/d26/d37/d56/l130 0 2026-03-10T12:38:23.098 INFO:tasks.workunit.client.0.vm00.stdout:7/874: stat da/d1b/d40/lb4 0 2026-03-10T12:38:23.103 INFO:tasks.workunit.client.0.vm00.stdout:7/875: dwrite da/d25/d2c/f30 [0,4194304] 0 2026-03-10T12:38:23.104 INFO:tasks.workunit.client.0.vm00.stdout:0/953: rmdir d3/d7/db0/dc4/dd5/d10e 39 2026-03-10T12:38:23.105 INFO:tasks.workunit.client.1.vm07.stdout:4/919: dwrite d0/d4/df2/df6/d46/f56 [4194304,4194304] 0 2026-03-10T12:38:23.113 INFO:tasks.workunit.client.0.vm00.stdout:7/876: dwrite da/d47/dfd/f106 [0,4194304] 0 2026-03-10T12:38:23.115 INFO:tasks.workunit.client.1.vm07.stdout:5/818: creat d0/d22/d18/d19/d21/dc2/f11d x:0 0 0 2026-03-10T12:38:23.116 INFO:tasks.workunit.client.1.vm07.stdout:5/819: stat d0/d22/d18/d19/d72/ff1 0 2026-03-10T12:38:23.121 INFO:tasks.workunit.client.0.vm00.stdout:0/954: dread d3/db/f16 [0,4194304] 0 2026-03-10T12:38:23.133 INFO:tasks.workunit.client.0.vm00.stdout:0/955: write d3/d7/db0/dc4/de5/d126/d5b/d38/f8b [4510224,86035] 0 2026-03-10T12:38:23.133 INFO:tasks.workunit.client.0.vm00.stdout:0/956: readlink d3/d7/db0/dc4/de5/d126/d5b/d38/d44/l69 0 2026-03-10T12:38:23.133 INFO:tasks.workunit.client.1.vm07.stdout:1/798: symlink d9/d2d/dd7/l106 0 2026-03-10T12:38:23.133 
INFO:tasks.workunit.client.1.vm07.stdout:7/744: symlink d0/d47/da0/lf9 0 2026-03-10T12:38:23.133 INFO:tasks.workunit.client.1.vm07.stdout:7/745: read d0/f7b [3268184,85318] 0 2026-03-10T12:38:23.134 INFO:tasks.workunit.client.0.vm00.stdout:6/771: write d2/d16/f41 [145365,34104] 0 2026-03-10T12:38:23.135 INFO:tasks.workunit.client.1.vm07.stdout:9/874: mkdir d5/d13/d2c/de6/d64/d108/d12a 0 2026-03-10T12:38:23.137 INFO:tasks.workunit.client.0.vm00.stdout:7/877: creat da/d41/d7b/f131 x:0 0 0 2026-03-10T12:38:23.137 INFO:tasks.workunit.client.0.vm00.stdout:7/878: stat da/d26/d37/d56/f6c 0 2026-03-10T12:38:23.138 INFO:tasks.workunit.client.1.vm07.stdout:0/905: symlink d0/d14/d5f/d3b/l131 0 2026-03-10T12:38:23.138 INFO:tasks.workunit.client.1.vm07.stdout:0/906: fdatasync d0/d14/f19 0 2026-03-10T12:38:23.143 INFO:tasks.workunit.client.1.vm07.stdout:6/773: truncate d1/d4/d6/d16/d49/fd3 836380 0 2026-03-10T12:38:23.151 INFO:tasks.workunit.client.0.vm00.stdout:0/957: creat d3/d40/d65/f12f x:0 0 0 2026-03-10T12:38:23.151 INFO:tasks.workunit.client.0.vm00.stdout:0/958: stat d3/d7/db0/dc4/de5/d126/d5b/dc5/l124 0 2026-03-10T12:38:23.151 INFO:tasks.workunit.client.1.vm07.stdout:8/767: link d1/d3/d6c/dde/fe0 d1/d3/d18/d8e/ffc 0 2026-03-10T12:38:23.151 INFO:tasks.workunit.client.1.vm07.stdout:4/920: truncate d0/d4/d10/d5f/f63 323775 0 2026-03-10T12:38:23.159 INFO:tasks.workunit.client.1.vm07.stdout:3/820: dread dc/dd/f19 [0,4194304] 0 2026-03-10T12:38:23.160 INFO:tasks.workunit.client.0.vm00.stdout:7/879: rename da/d47/ff1 to da/d26/f132 0 2026-03-10T12:38:23.172 INFO:tasks.workunit.client.0.vm00.stdout:0/959: link d3/db/d24/d25/c66 d3/d7/db0/dc4/de5/d126/d5b/dc5/c130 0 2026-03-10T12:38:23.178 INFO:tasks.workunit.client.0.vm00.stdout:6/772: write d2/d16/f20 [864406,28303] 0 2026-03-10T12:38:23.182 INFO:tasks.workunit.client.1.vm07.stdout:5/820: dwrite d0/d22/d18/d3e/df6/ff8 [0,4194304] 0 2026-03-10T12:38:23.190 INFO:tasks.workunit.client.1.vm07.stdout:7/746: dwrite d0/fc 
[4194304,4194304] 0 2026-03-10T12:38:23.198 INFO:tasks.workunit.client.0.vm00.stdout:7/880: truncate da/d26/d37/fd6 4944412 0 2026-03-10T12:38:23.204 INFO:tasks.workunit.client.1.vm07.stdout:9/875: truncate d5/d16/f10c 981715 0 2026-03-10T12:38:23.204 INFO:tasks.workunit.client.1.vm07.stdout:0/907: chown d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/l109 159 1 2026-03-10T12:38:23.204 INFO:tasks.workunit.client.0.vm00.stdout:0/960: unlink d3/db/d24/d25/c98 0 2026-03-10T12:38:23.204 INFO:tasks.workunit.client.0.vm00.stdout:7/881: chown da/d1b/f39 2 1 2026-03-10T12:38:23.204 INFO:tasks.workunit.client.0.vm00.stdout:7/882: readlink da/d1b/l125 0 2026-03-10T12:38:23.210 INFO:tasks.workunit.client.1.vm07.stdout:2/716: creat d0/ff3 x:0 0 0 2026-03-10T12:38:23.212 INFO:tasks.workunit.client.1.vm07.stdout:2/717: dwrite d0/d80/d93/fce [0,4194304] 0 2026-03-10T12:38:23.213 INFO:tasks.workunit.client.1.vm07.stdout:2/718: readlink d0/la2 0 2026-03-10T12:38:23.229 INFO:tasks.workunit.client.1.vm07.stdout:4/921: rmdir d0/d4/d5 39 2026-03-10T12:38:23.231 INFO:tasks.workunit.client.0.vm00.stdout:0/961: mknod d3/d7/db0/dc4/de5/d126/d5b/c131 0 2026-03-10T12:38:23.232 INFO:tasks.workunit.client.1.vm07.stdout:3/821: truncate dc/dd/d28/d7a/d8e/fb0 4327143 0 2026-03-10T12:38:23.235 INFO:tasks.workunit.client.0.vm00.stdout:6/773: getdents d2/d42/d80/d89 0 2026-03-10T12:38:23.236 INFO:tasks.workunit.client.0.vm00.stdout:6/774: chown d2/d42/d80/dfd 31 1 2026-03-10T12:38:23.238 INFO:tasks.workunit.client.0.vm00.stdout:7/883: link da/d26/d50/c69 da/d26/d37/d56/ddf/c133 0 2026-03-10T12:38:23.244 INFO:tasks.workunit.client.1.vm07.stdout:9/876: creat d5/d69/d93/f12b x:0 0 0 2026-03-10T12:38:23.246 INFO:tasks.workunit.client.0.vm00.stdout:6/775: write d2/d16/d29/d31/d88/d92/fba [297173,60257] 0 2026-03-10T12:38:23.254 INFO:tasks.workunit.client.1.vm07.stdout:7/747: dread d0/d47/f73 [0,4194304] 0 2026-03-10T12:38:23.254 INFO:tasks.workunit.client.1.vm07.stdout:6/774: dwrite d1/d4/d6/d16/d1a/f29 
[4194304,4194304] 0 2026-03-10T12:38:23.270 INFO:tasks.workunit.client.1.vm07.stdout:2/719: mkdir d0/d29/d64/d74/df4 0 2026-03-10T12:38:23.271 INFO:tasks.workunit.client.1.vm07.stdout:2/720: truncate d0/ff3 18895 0 2026-03-10T12:38:23.272 INFO:tasks.workunit.client.0.vm00.stdout:7/884: symlink da/d25/d2e/d4c/l134 0 2026-03-10T12:38:23.272 INFO:tasks.workunit.client.1.vm07.stdout:8/768: creat d1/d3/d6/d54/dd2/df3/ffd x:0 0 0 2026-03-10T12:38:23.272 INFO:tasks.workunit.client.0.vm00.stdout:0/962: dwrite d3/db/d77/f8a [0,4194304] 0 2026-03-10T12:38:23.273 INFO:tasks.workunit.client.1.vm07.stdout:8/769: chown d1/d3/d6c/fce 12581 1 2026-03-10T12:38:23.280 INFO:tasks.workunit.client.1.vm07.stdout:9/877: creat d5/d1f/f12c x:0 0 0 2026-03-10T12:38:23.284 INFO:tasks.workunit.client.1.vm07.stdout:0/908: fdatasync d0/d14/f37 0 2026-03-10T12:38:23.289 INFO:tasks.workunit.client.1.vm07.stdout:6/775: rmdir d1/dd7/da3/dd8 39 2026-03-10T12:38:23.290 INFO:tasks.workunit.client.0.vm00.stdout:7/885: truncate da/d25/d2c/d82/d68/f38 1555865 0 2026-03-10T12:38:23.292 INFO:tasks.workunit.client.0.vm00.stdout:0/963: creat d3/d7/f132 x:0 0 0 2026-03-10T12:38:23.293 INFO:tasks.workunit.client.0.vm00.stdout:0/964: readlink d3/d7/db0/dc4/de5/d126/dcc/ded/l109 0 2026-03-10T12:38:23.296 INFO:tasks.workunit.client.1.vm07.stdout:3/822: mknod dc/dd/d1f/d6f/dcf/c118 0 2026-03-10T12:38:23.301 INFO:tasks.workunit.client.0.vm00.stdout:7/886: truncate da/d26/d37/f4a 986998 0 2026-03-10T12:38:23.302 INFO:tasks.workunit.client.1.vm07.stdout:3/823: readlink dc/l38 0 2026-03-10T12:38:23.302 INFO:tasks.workunit.client.1.vm07.stdout:1/799: creat d9/df/d55/f107 x:0 0 0 2026-03-10T12:38:23.302 INFO:tasks.workunit.client.1.vm07.stdout:0/909: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/dd0/f132 x:0 0 0 2026-03-10T12:38:23.307 INFO:tasks.workunit.client.0.vm00.stdout:7/887: chown da/d47/d87/fb3 23 1 2026-03-10T12:38:23.309 INFO:tasks.workunit.client.1.vm07.stdout:1/800: dread d9/df/f26 [4194304,4194304] 0 
2026-03-10T12:38:23.309 INFO:tasks.workunit.client.0.vm00.stdout:6/776: write d2/da/dc/d2f/f4f [7650,37853] 0 2026-03-10T12:38:23.314 INFO:tasks.workunit.client.1.vm07.stdout:8/770: dwrite d1/d3/fb5 [0,4194304] 0 2026-03-10T12:38:23.316 INFO:tasks.workunit.client.1.vm07.stdout:8/771: chown d1/d3/d40/fd1 4162613 1 2026-03-10T12:38:23.316 INFO:tasks.workunit.client.1.vm07.stdout:8/772: fdatasync d1/d3/d40/d92/dba/feb 0 2026-03-10T12:38:23.319 INFO:tasks.workunit.client.1.vm07.stdout:7/748: fsync d0/d52/f97 0 2026-03-10T12:38:23.328 INFO:tasks.workunit.client.0.vm00.stdout:7/888: symlink da/d26/d37/d56/ddf/d108/l135 0 2026-03-10T12:38:23.329 INFO:tasks.workunit.client.1.vm07.stdout:2/721: creat d0/dcd/ff5 x:0 0 0 2026-03-10T12:38:23.329 INFO:tasks.workunit.client.0.vm00.stdout:0/965: write d3/d7/db0/dc4/de5/d126/d5b/d38/db3/de2/f68 [4205233,87133] 0 2026-03-10T12:38:23.331 INFO:tasks.workunit.client.0.vm00.stdout:0/966: write d3/d40/f10a [100669,12119] 0 2026-03-10T12:38:23.335 INFO:tasks.workunit.client.0.vm00.stdout:7/889: readlink da/d41/d48/lc5 0 2026-03-10T12:38:23.341 INFO:tasks.workunit.client.1.vm07.stdout:5/821: creat d0/d22/d18/d30/f11e x:0 0 0 2026-03-10T12:38:23.341 INFO:tasks.workunit.client.0.vm00.stdout:7/890: write da/d41/f4b [724204,111562] 0 2026-03-10T12:38:23.341 INFO:tasks.workunit.client.0.vm00.stdout:0/967: fsync d3/f9c 0 2026-03-10T12:38:23.341 INFO:tasks.workunit.client.1.vm07.stdout:3/824: dread dc/dd/d28/d7a/d8e/f9b [0,4194304] 0 2026-03-10T12:38:23.346 INFO:tasks.workunit.client.1.vm07.stdout:1/801: mknod d9/df/d79/c108 0 2026-03-10T12:38:23.347 INFO:tasks.workunit.client.1.vm07.stdout:1/802: truncate d9/d2d/d4f/d75/d77/f100 237535 0 2026-03-10T12:38:23.350 INFO:tasks.workunit.client.1.vm07.stdout:7/749: creat d0/d47/dab/ffa x:0 0 0 2026-03-10T12:38:23.352 INFO:tasks.workunit.client.1.vm07.stdout:2/722: fdatasync d0/d42/d26/d38/d4f/f65 0 2026-03-10T12:38:23.354 INFO:tasks.workunit.client.1.vm07.stdout:5/822: readlink d0/d22/dbc/la7 0 
2026-03-10T12:38:23.355 INFO:tasks.workunit.client.1.vm07.stdout:9/878: link d5/d13/d2c/de6/d74/le1 d5/d13/d22/l12d 0 2026-03-10T12:38:23.358 INFO:tasks.workunit.client.0.vm00.stdout:0/968: dread d3/f4 [0,4194304] 0 2026-03-10T12:38:23.358 INFO:tasks.workunit.client.1.vm07.stdout:9/879: dwrite d5/d13/d2c/de6/f43 [0,4194304] 0 2026-03-10T12:38:23.358 INFO:tasks.workunit.client.0.vm00.stdout:0/969: fsync d3/d22/f71 0 2026-03-10T12:38:23.360 INFO:tasks.workunit.client.1.vm07.stdout:3/825: mknod dc/dd/d1f/d45/dbf/c119 0 2026-03-10T12:38:23.360 INFO:tasks.workunit.client.1.vm07.stdout:3/826: dread - dc/dd/d1f/f112 zero size 2026-03-10T12:38:23.364 INFO:tasks.workunit.client.0.vm00.stdout:7/891: sync 2026-03-10T12:38:23.364 INFO:tasks.workunit.client.0.vm00.stdout:0/970: chown d3/d7/db0/dc4/de5/d126/d5b/d38/d12c/d74/l105 130 1 2026-03-10T12:38:23.366 INFO:tasks.workunit.client.1.vm07.stdout:7/750: truncate d0/f20 864389 0 2026-03-10T12:38:23.372 INFO:tasks.workunit.client.1.vm07.stdout:8/773: dread d1/d3/d6/d54/fa1 [0,4194304] 0 2026-03-10T12:38:23.376 INFO:tasks.workunit.client.0.vm00.stdout:0/971: dread d3/d7/db0/dc4/de5/d126/d5b/f57 [0,4194304] 0 2026-03-10T12:38:23.376 INFO:tasks.workunit.client.1.vm07.stdout:4/922: getdents d0/d4/d5/d78/dc5/df7/db2/dd5/d12b 0 2026-03-10T12:38:23.376 INFO:tasks.workunit.client.1.vm07.stdout:4/923: write d0/d8e/f13d [419182,18374] 0 2026-03-10T12:38:23.387 INFO:tasks.workunit.client.1.vm07.stdout:4/924: dread d0/d4/df2/df6/fcd [0,4194304] 0 2026-03-10T12:38:23.388 INFO:tasks.workunit.client.1.vm07.stdout:4/925: fsync d0/d4/d5/d78/dc5/df7/db2/dd5/f13c 0 2026-03-10T12:38:23.392 INFO:tasks.workunit.client.1.vm07.stdout:6/776: write d1/d4/d6/d4e/d64/f6f [1553827,6408] 0 2026-03-10T12:38:23.392 INFO:tasks.workunit.client.1.vm07.stdout:6/777: chown d1/fc9 28 1 2026-03-10T12:38:23.393 INFO:tasks.workunit.client.1.vm07.stdout:6/778: write d1/d4/d6/d43/f90 [1037384,1636] 0 2026-03-10T12:38:23.397 INFO:tasks.workunit.client.0.vm00.stdout:6/777: 
dwrite d2/d16/d29/d31/d88/d92/daa/dc1/fc9 [0,4194304] 0 2026-03-10T12:38:23.400 INFO:tasks.workunit.client.1.vm07.stdout:9/880: mknod d5/d13/d6c/da4/d102/c12e 0 2026-03-10T12:38:23.404 INFO:tasks.workunit.client.0.vm00.stdout:0/972: dread d3/d7/db0/dc4/de5/d126/d5b/d38/db3/fe3 [0,4194304] 0 2026-03-10T12:38:23.411 INFO:tasks.workunit.client.0.vm00.stdout:0/973: dwrite d3/db/d77/f8a [0,4194304] 0 2026-03-10T12:38:23.413 INFO:tasks.workunit.client.0.vm00.stdout:6/778: fsync d2/d14/d7a/db9/f6c 0 2026-03-10T12:38:23.419 INFO:tasks.workunit.client.1.vm07.stdout:5/823: mkdir d0/d22/d18/d3e/d11f 0 2026-03-10T12:38:23.428 INFO:tasks.workunit.client.1.vm07.stdout:8/774: creat d1/d3/d6/d50/d70/dfb/ffe x:0 0 0 2026-03-10T12:38:23.428 INFO:tasks.workunit.client.1.vm07.stdout:8/775: readlink d1/d3/db2/lbc 0 2026-03-10T12:38:23.431 INFO:tasks.workunit.client.0.vm00.stdout:6/779: link d2/da/dc/f45 d2/da/dc/d2f/d10a/f116 0 2026-03-10T12:38:23.440 INFO:tasks.workunit.client.1.vm07.stdout:2/723: link d0/d42/d26/c6b d0/d42/d4e/cf6 0 2026-03-10T12:38:23.440 INFO:tasks.workunit.client.1.vm07.stdout:4/926: unlink d0/d4/d10/d5f/d6d/f71 0 2026-03-10T12:38:23.440 INFO:tasks.workunit.client.0.vm00.stdout:0/974: link d3/d7/db0/dc4/de5/d126/dcc/f12b d3/d7/db0/dc4/de5/d126/dcc/dea/d102/f133 0 2026-03-10T12:38:23.440 INFO:tasks.workunit.client.0.vm00.stdout:6/780: creat d2/d16/d29/d31/d88/d92/daa/dc1/f117 x:0 0 0 2026-03-10T12:38:23.444 INFO:tasks.workunit.client.0.vm00.stdout:0/975: unlink d3/d7/db0/dc4/de5/d126/d5b/d38/db3/fe3 0 2026-03-10T12:38:23.445 INFO:tasks.workunit.client.1.vm07.stdout:5/824: mknod d0/d22/d18/d19/d2e/d67/c120 0 2026-03-10T12:38:23.446 INFO:tasks.workunit.client.1.vm07.stdout:0/910: write d0/d14/d5f/d76/d2f/d31/d79/dcc/fe3 [701320,80476] 0 2026-03-10T12:38:23.451 INFO:tasks.workunit.client.0.vm00.stdout:6/781: mkdir d2/da/dbf/ded/d118 0 2026-03-10T12:38:23.453 INFO:tasks.workunit.client.0.vm00.stdout:7/892: dwrite da/d47/dfd/fac [0,4194304] 0 2026-03-10T12:38:23.466 
INFO:tasks.workunit.client.0.vm00.stdout:0/976: mkdir d3/d7/db0/dc4/de5/d126/d5b/d38/db3/d134 0 2026-03-10T12:38:23.468 INFO:tasks.workunit.client.1.vm07.stdout:5/825: dread d0/f9 [0,4194304] 0 2026-03-10T12:38:23.470 INFO:tasks.workunit.client.1.vm07.stdout:1/803: getdents d9/d2d/d80/d8e/dc7 0 2026-03-10T12:38:23.474 INFO:tasks.workunit.client.0.vm00.stdout:6/782: mkdir d2/da/dc/d83/d119 0 2026-03-10T12:38:23.474 INFO:tasks.workunit.client.0.vm00.stdout:7/893: dread da/f35 [0,4194304] 0 2026-03-10T12:38:23.478 INFO:tasks.workunit.client.1.vm07.stdout:2/724: mkdir d0/d42/d1f/d20/df7 0 2026-03-10T12:38:23.478 INFO:tasks.workunit.client.1.vm07.stdout:6/779: write d1/d4/d6/d43/d65/f9c [668865,100957] 0 2026-03-10T12:38:23.482 INFO:tasks.workunit.client.1.vm07.stdout:9/881: write d5/fb [101489,111684] 0 2026-03-10T12:38:23.486 INFO:tasks.workunit.client.0.vm00.stdout:6/783: mkdir d2/d9f/df6/d11a 0 2026-03-10T12:38:23.489 INFO:tasks.workunit.client.1.vm07.stdout:3/827: truncate dc/d18/d99/da3/fd2 2691623 0 2026-03-10T12:38:23.490 INFO:tasks.workunit.client.1.vm07.stdout:7/751: truncate d0/d47/f59 5617034 0 2026-03-10T12:38:23.490 INFO:tasks.workunit.client.0.vm00.stdout:6/784: dread d2/da/dc/d2f/ff4 [0,4194304] 0 2026-03-10T12:38:23.496 INFO:tasks.workunit.client.0.vm00.stdout:0/977: write d3/d7/db0/dc4/de5/d126/d5b/d38/d12c/d4b/f79 [516085,107876] 0 2026-03-10T12:38:23.498 INFO:tasks.workunit.client.1.vm07.stdout:0/911: rmdir d0/d14/d5f/d76/d2f/d31/d79/dcc 39 2026-03-10T12:38:23.498 INFO:tasks.workunit.client.0.vm00.stdout:0/978: stat d3/d7/db0/dc4/de5/d126/d5b/d38/d12c/l20 0 2026-03-10T12:38:23.504 INFO:tasks.workunit.client.1.vm07.stdout:4/927: dwrite d0/d4/d5/d78/dc5/df7/fb0 [0,4194304] 0 2026-03-10T12:38:23.509 INFO:tasks.workunit.client.1.vm07.stdout:1/804: mknod d9/d2d/d80/d8e/c109 0 2026-03-10T12:38:23.510 INFO:tasks.workunit.client.1.vm07.stdout:8/776: symlink d1/d3/d40/d92/dba/df1/lff 0 2026-03-10T12:38:23.514 INFO:tasks.workunit.client.1.vm07.stdout:2/725: 
fsync d0/f44 0 2026-03-10T12:38:23.518 INFO:tasks.workunit.client.1.vm07.stdout:9/882: creat d5/d1f/d75/f12f x:0 0 0 2026-03-10T12:38:23.523 INFO:tasks.workunit.client.0.vm00.stdout:6/785: read d2/d16/f1e [835135,72702] 0 2026-03-10T12:38:23.524 INFO:tasks.workunit.client.1.vm07.stdout:5/826: dwrite d0/d22/d18/d19/d36/d75/fdb [0,4194304] 0 2026-03-10T12:38:23.527 INFO:tasks.workunit.client.0.vm00.stdout:0/979: chown d3/d7/db0/dc4/de5/d126/d5b/la6 14531 1 2026-03-10T12:38:23.539 INFO:tasks.workunit.client.1.vm07.stdout:7/752: readlink d0/d52/l55 0 2026-03-10T12:38:23.545 INFO:tasks.workunit.client.1.vm07.stdout:7/753: read d0/d61/d79/f8d [47471,72340] 0 2026-03-10T12:38:23.545 INFO:tasks.workunit.client.1.vm07.stdout:0/912: truncate d0/d14/d5f/d76/d2f/d31/d79/d85/fff 423126 0 2026-03-10T12:38:23.545 INFO:tasks.workunit.client.0.vm00.stdout:6/786: mknod d2/da/dbf/c11b 0 2026-03-10T12:38:23.545 INFO:tasks.workunit.client.0.vm00.stdout:0/980: creat d3/db/d24/d25/f135 x:0 0 0 2026-03-10T12:38:23.545 INFO:tasks.workunit.client.0.vm00.stdout:7/894: getdents da/d3f/d71 0 2026-03-10T12:38:23.545 INFO:tasks.workunit.client.0.vm00.stdout:0/981: write d3/d7/db0/dc4/de5/d126/d5b/d38/db3/de2/f68 [1475938,78293] 0 2026-03-10T12:38:23.549 INFO:tasks.workunit.client.1.vm07.stdout:9/883: dread - d5/d13/d9b/fec zero size 2026-03-10T12:38:23.563 INFO:tasks.workunit.client.1.vm07.stdout:5/827: symlink d0/d22/d18/d19/d2e/l121 0 2026-03-10T12:38:23.563 INFO:tasks.workunit.client.1.vm07.stdout:3/828: mkdir dc/dd/d1f/dc7/dc9/d116/d11a 0 2026-03-10T12:38:23.563 INFO:tasks.workunit.client.1.vm07.stdout:7/754: fsync d0/d61/db4/f53 0 2026-03-10T12:38:23.563 INFO:tasks.workunit.client.1.vm07.stdout:5/828: read d0/d22/d18/d19/d2e/d67/fa0 [3175550,66034] 0 2026-03-10T12:38:23.569 INFO:tasks.workunit.client.1.vm07.stdout:1/805: mknod d9/df/dc9/df4/c10a 0 2026-03-10T12:38:23.569 INFO:tasks.workunit.client.0.vm00.stdout:7/895: rename da/d41/d7b/d9d/dba/lee to da/d41/d7b/d9d/dc8/dd0/l136 0 
2026-03-10T12:38:23.571 INFO:tasks.workunit.client.1.vm07.stdout:6/780: link d1/d4/d9b/lb0 d1/d4/d6/d43/l100 0 2026-03-10T12:38:23.572 INFO:tasks.workunit.client.0.vm00.stdout:7/896: rmdir da/d1b/d40 39 2026-03-10T12:38:23.573 INFO:tasks.workunit.client.1.vm07.stdout:9/884: stat d5/d16/f8f 0 2026-03-10T12:38:23.575 INFO:tasks.workunit.client.0.vm00.stdout:7/897: mknod da/d3f/d71/c137 0 2026-03-10T12:38:23.576 INFO:tasks.workunit.client.1.vm07.stdout:3/829: rename dc/dd/d43/d76/c109 to dc/dd/d43/c11b 0 2026-03-10T12:38:23.577 INFO:tasks.workunit.client.0.vm00.stdout:7/898: stat da/l1f 0 2026-03-10T12:38:23.579 INFO:tasks.workunit.client.1.vm07.stdout:8/777: link d1/d3/d6/d54/dd2/fdb d1/d3/d11/d87/f100 0 2026-03-10T12:38:23.581 INFO:tasks.workunit.client.1.vm07.stdout:8/778: write d1/f3f [4798233,124635] 0 2026-03-10T12:38:23.581 INFO:tasks.workunit.client.1.vm07.stdout:8/779: chown d1/d3/d5d/fd5 33307125 1 2026-03-10T12:38:23.581 INFO:tasks.workunit.client.1.vm07.stdout:8/780: chown d1/d3/db2/dcd 10982 1 2026-03-10T12:38:23.582 INFO:tasks.workunit.client.0.vm00.stdout:7/899: creat da/d41/d48/d81/f138 x:0 0 0 2026-03-10T12:38:23.583 INFO:tasks.workunit.client.0.vm00.stdout:7/900: stat da/d3f/dd1 0 2026-03-10T12:38:23.583 INFO:tasks.workunit.client.1.vm07.stdout:1/806: mknod d9/df/d55/d9f/c10b 0 2026-03-10T12:38:23.583 INFO:tasks.workunit.client.0.vm00.stdout:7/901: dread - da/d41/d7b/fb0 zero size 2026-03-10T12:38:23.584 INFO:tasks.workunit.client.1.vm07.stdout:1/807: write d9/d2d/d4f/d75/de3/ff5 [1284352,45086] 0 2026-03-10T12:38:23.585 INFO:tasks.workunit.client.1.vm07.stdout:1/808: chown d9/df/d29/d2b/d92/db6/cc3 1402442 1 2026-03-10T12:38:23.592 INFO:tasks.workunit.client.1.vm07.stdout:8/781: symlink d1/d3/db2/dcd/db8/l101 0 2026-03-10T12:38:23.618 INFO:tasks.workunit.client.1.vm07.stdout:1/809: rmdir d9/df/d29/d2b/db1 39 2026-03-10T12:38:23.618 INFO:tasks.workunit.client.1.vm07.stdout:7/755: link d0/d47/f8e d0/d57/dd6/d80/ffb 0 2026-03-10T12:38:23.618 
INFO:tasks.workunit.client.1.vm07.stdout:8/782: rmdir d1/d3/d6/d54/dd2/df3 39 2026-03-10T12:38:23.618 INFO:tasks.workunit.client.1.vm07.stdout:1/810: mknod d9/df/d55/d9f/c10c 0 2026-03-10T12:38:23.618 INFO:tasks.workunit.client.1.vm07.stdout:1/811: stat d9/df/d29/d2b/d92/d9d/f105 0 2026-03-10T12:38:23.618 INFO:tasks.workunit.client.1.vm07.stdout:1/812: creat d9/df/d29/d2b/d92/f10d x:0 0 0 2026-03-10T12:38:23.618 INFO:tasks.workunit.client.1.vm07.stdout:1/813: creat d9/d2d/de2/dc8/f10e x:0 0 0 2026-03-10T12:38:23.631 INFO:tasks.workunit.client.0.vm00.stdout:6/787: sync 2026-03-10T12:38:23.632 INFO:tasks.workunit.client.0.vm00.stdout:6/788: stat d2/d16/d29/d31/d88/ff1 0 2026-03-10T12:38:23.633 INFO:tasks.workunit.client.1.vm07.stdout:4/928: sync 2026-03-10T12:38:23.635 INFO:tasks.workunit.client.1.vm07.stdout:6/781: dread d1/d4/d6/f41 [0,4194304] 0 2026-03-10T12:38:23.639 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:23 vm07.local ceph-mon[58582]: Updating vm07:/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:38:23.639 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:23 vm07.local ceph-mon[58582]: Updating vm00:/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:38:23.639 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:23 vm07.local ceph-mon[58582]: pgmap v8: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 33 MiB/s rd, 82 MiB/s wr, 204 op/s 2026-03-10T12:38:23.639 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:23 vm07.local ceph-mon[58582]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.client.admin.keyring 2026-03-10T12:38:23.639 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:23 vm07.local ceph-mon[58582]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.client.admin.keyring 2026-03-10T12:38:23.639 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:23 vm07.local ceph-mon[58582]: mgrmap e25: vm07.kfawlb(active, since 10s), standbys: 
vm00.nescmq 2026-03-10T12:38:23.639 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:23 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:23.640 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:23 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mgr metadata", "who": "vm00.nescmq", "id": "vm00.nescmq"}]: dispatch 2026-03-10T12:38:23.640 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:23 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:23.640 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:23 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:23.640 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:23 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:23.642 INFO:tasks.workunit.client.0.vm00.stdout:6/789: dwrite d2/d16/d29/d31/d88/d92/daa/dc1/f117 [0,4194304] 0 2026-03-10T12:38:23.643 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:23 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:23.643 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:23 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:23.643 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:23 vm07.local ceph-mon[58582]: Reconfiguring prometheus.vm00 (dependencies changed)... 
2026-03-10T12:38:23.643 INFO:tasks.workunit.client.0.vm00.stdout:6/790: read - d2/d16/d29/d31/d88/d92/f10b zero size 2026-03-10T12:38:23.646 INFO:tasks.workunit.client.1.vm07.stdout:6/782: fdatasync d1/d4/f3b 0 2026-03-10T12:38:23.646 INFO:tasks.workunit.client.1.vm07.stdout:6/783: fdatasync d1/d4/d6/d16/fdd 0 2026-03-10T12:38:23.647 INFO:tasks.workunit.client.1.vm07.stdout:6/784: write d1/d4/d6/d16/ff0 [773069,124383] 0 2026-03-10T12:38:23.651 INFO:tasks.workunit.client.0.vm00.stdout:6/791: creat d2/d14/d7a/f11c x:0 0 0 2026-03-10T12:38:23.660 INFO:tasks.workunit.client.1.vm07.stdout:0/913: dwrite d0/d14/d5f/d76/d2f/d31/d4f/fa7 [0,4194304] 0 2026-03-10T12:38:23.660 INFO:tasks.workunit.client.1.vm07.stdout:2/726: dwrite d0/d42/d4e/d77/f89 [0,4194304] 0 2026-03-10T12:38:23.661 INFO:tasks.workunit.client.0.vm00.stdout:0/982: dwrite d3/d7/db0/dc4/de5/d126/d5b/d38/d44/d5a/ff8 [0,4194304] 0 2026-03-10T12:38:23.663 INFO:tasks.workunit.client.1.vm07.stdout:2/727: chown d0/d45/la4 1299761307 1 2026-03-10T12:38:23.664 INFO:tasks.workunit.client.1.vm07.stdout:0/914: chown d0/d14/d5f/d76/d2f/d31/d79/d9e/l126 2028647361 1 2026-03-10T12:38:23.685 INFO:tasks.workunit.client.1.vm07.stdout:5/829: write d0/d22/d18/f20 [3897273,125801] 0 2026-03-10T12:38:23.686 INFO:tasks.workunit.client.1.vm07.stdout:2/728: dread d0/d42/d26/d38/d4f/d62/fba [0,4194304] 0 2026-03-10T12:38:23.689 INFO:tasks.workunit.client.1.vm07.stdout:2/729: dread d0/f15 [4194304,4194304] 0 2026-03-10T12:38:23.690 INFO:tasks.workunit.client.1.vm07.stdout:3/830: write dc/d18/f36 [3957923,51045] 0 2026-03-10T12:38:23.702 INFO:tasks.workunit.client.1.vm07.stdout:9/885: truncate d5/d13/d9d/f100 3389314 0 2026-03-10T12:38:23.702 INFO:tasks.workunit.client.1.vm07.stdout:9/886: chown d5/d16/d18/cc7 9 1 2026-03-10T12:38:23.703 INFO:tasks.workunit.client.0.vm00.stdout:7/902: dwrite da/d1b/fe5 [0,4194304] 0 2026-03-10T12:38:23.708 INFO:tasks.workunit.client.0.vm00.stdout:6/792: fsync d2/d42/d80/d9d/fca 0 
2026-03-10T12:38:23.712 INFO:tasks.workunit.client.1.vm07.stdout:8/783: dwrite d1/f68 [4194304,4194304] 0 2026-03-10T12:38:23.716 INFO:tasks.workunit.client.1.vm07.stdout:1/814: write d9/d2d/fcb [3498965,29295] 0 2026-03-10T12:38:23.718 INFO:tasks.workunit.client.1.vm07.stdout:1/815: write d9/fe [3538045,44403] 0 2026-03-10T12:38:23.718 INFO:tasks.workunit.client.1.vm07.stdout:7/756: dwrite d0/d57/d62/d90/fcc [0,4194304] 0 2026-03-10T12:38:23.723 INFO:tasks.workunit.client.0.vm00.stdout:0/983: mknod d3/d7/db0/dc4/de5/d126/d5b/dc5/c136 0 2026-03-10T12:38:23.729 INFO:tasks.workunit.client.0.vm00.stdout:7/903: dread f9 [0,4194304] 0 2026-03-10T12:38:23.732 INFO:tasks.workunit.client.0.vm00.stdout:0/984: symlink d3/db/d24/d25/l137 0 2026-03-10T12:38:23.741 INFO:tasks.workunit.client.0.vm00.stdout:7/904: creat da/d26/d37/d61/f139 x:0 0 0 2026-03-10T12:38:23.745 INFO:tasks.workunit.client.0.vm00.stdout:7/905: dwrite da/d41/f4b [0,4194304] 0 2026-03-10T12:38:23.745 INFO:tasks.workunit.client.1.vm07.stdout:8/784: sync 2026-03-10T12:38:23.748 INFO:tasks.workunit.client.1.vm07.stdout:7/757: fdatasync d0/d61/d79/fba 0 2026-03-10T12:38:23.757 INFO:tasks.workunit.client.0.vm00.stdout:0/985: mkdir d3/d7/db0/dc4/dd5/d138 0 2026-03-10T12:38:23.757 INFO:tasks.workunit.client.0.vm00.stdout:6/793: link d2/d16/d29/f4c d2/d16/d74/f11d 0 2026-03-10T12:38:23.757 INFO:tasks.workunit.client.1.vm07.stdout:1/816: rename d9/d2d/d80/d8e/dc7/le9 to d9/d2d/d80/l10f 0 2026-03-10T12:38:23.757 INFO:tasks.workunit.client.1.vm07.stdout:0/915: mkdir d0/d14/d5f/d41/d133 0 2026-03-10T12:38:23.762 INFO:tasks.workunit.client.0.vm00.stdout:6/794: rmdir d2/d16/d29/d31/d88/d92/daa 39 2026-03-10T12:38:23.763 INFO:tasks.workunit.client.1.vm07.stdout:6/785: getdents d1/d4/d6/d16/d1a/d9d/db2 0 2026-03-10T12:38:23.763 INFO:tasks.workunit.client.0.vm00.stdout:7/906: mknod da/d26/d37/d56/c13a 0 2026-03-10T12:38:23.767 INFO:tasks.workunit.client.0.vm00.stdout:0/986: creat d3/d7/db0/dc4/de5/d126/d5b/d38/db3/d134/f139 
x:0 0 0 2026-03-10T12:38:23.770 INFO:tasks.workunit.client.1.vm07.stdout:1/817: rename d9/df/d29/d2b/d92/db6/lc5 to d9/df/d29/d2b/l110 0 2026-03-10T12:38:23.774 INFO:tasks.workunit.client.0.vm00.stdout:6/795: read d2/d16/f78 [722844,15617] 0 2026-03-10T12:38:23.774 INFO:tasks.workunit.client.1.vm07.stdout:6/786: mknod d1/d4/d6/d16/c101 0 2026-03-10T12:38:23.776 INFO:tasks.workunit.client.0.vm00.stdout:6/796: read - d2/d14/d7a/f11c zero size 2026-03-10T12:38:23.778 INFO:tasks.workunit.client.1.vm07.stdout:7/758: getdents d0/d61/db4 0 2026-03-10T12:38:23.779 INFO:tasks.workunit.client.1.vm07.stdout:6/787: read d1/d4/d6/d43/d65/f86 [3323778,125676] 0 2026-03-10T12:38:23.780 INFO:tasks.workunit.client.1.vm07.stdout:6/788: write d1/d4/d6/d4e/d64/f6f [4663242,42318] 0 2026-03-10T12:38:23.784 INFO:tasks.workunit.client.0.vm00.stdout:0/987: rmdir d3/d7/db0/dc4/de5/d126/dcc/dea 39 2026-03-10T12:38:23.785 INFO:tasks.workunit.client.1.vm07.stdout:1/818: symlink d9/df/d79/l111 0 2026-03-10T12:38:23.790 INFO:tasks.workunit.client.0.vm00.stdout:7/907: creat da/d1b/f13b x:0 0 0 2026-03-10T12:38:23.797 INFO:tasks.workunit.client.1.vm07.stdout:4/929: dwrite d0/d8e/fb5 [0,4194304] 0 2026-03-10T12:38:23.807 INFO:tasks.workunit.client.0.vm00.stdout:7/908: symlink da/d25/d2c/d82/d68/df8/l13c 0 2026-03-10T12:38:23.807 INFO:tasks.workunit.client.0.vm00.stdout:7/909: chown da/d3f/d60 146 1 2026-03-10T12:38:23.817 INFO:tasks.workunit.client.1.vm07.stdout:5/830: dwrite d0/d22/d18/d19/d2e/f88 [0,4194304] 0 2026-03-10T12:38:23.821 INFO:tasks.workunit.client.0.vm00.stdout:7/910: creat da/d41/d7b/d9d/dba/f13d x:0 0 0 2026-03-10T12:38:23.821 INFO:tasks.workunit.client.1.vm07.stdout:6/789: symlink d1/d4/d6/d4e/l102 0 2026-03-10T12:38:23.821 INFO:tasks.workunit.client.0.vm00.stdout:6/797: link d2/d42/d80/fbd d2/da/dbf/ded/d118/f11e 0 2026-03-10T12:38:23.821 INFO:tasks.workunit.client.0.vm00.stdout:6/798: stat d2/da/ce4 0 2026-03-10T12:38:23.825 INFO:tasks.workunit.client.1.vm07.stdout:1/819: read 
d9/d2d/d4f/d5a/f65 [2360339,103863] 0 2026-03-10T12:38:23.828 INFO:tasks.workunit.client.1.vm07.stdout:4/930: creat d0/d4/d10/d3c/d2b/d2d/da7/f14c x:0 0 0 2026-03-10T12:38:23.830 INFO:tasks.workunit.client.1.vm07.stdout:5/831: fdatasync d0/d22/f27 0 2026-03-10T12:38:23.831 INFO:tasks.workunit.client.1.vm07.stdout:5/832: dread - d0/d22/d18/d30/f11e zero size 2026-03-10T12:38:23.832 INFO:tasks.workunit.client.1.vm07.stdout:7/759: creat d0/d61/db4/d8a/d9d/ffc x:0 0 0 2026-03-10T12:38:23.833 INFO:tasks.workunit.client.1.vm07.stdout:7/760: dread - d0/d61/db4/fdc zero size 2026-03-10T12:38:23.833 INFO:tasks.workunit.client.1.vm07.stdout:7/761: stat d0/d47/c65 0 2026-03-10T12:38:23.837 INFO:tasks.workunit.client.1.vm07.stdout:6/790: symlink d1/d4/d6/d43/d65/l103 0 2026-03-10T12:38:23.841 INFO:tasks.workunit.client.0.vm00.stdout:0/988: link d3/d7/db0/dc4/de5/d126/d5b/d38/db3/fcd d3/db/f13a 0 2026-03-10T12:38:23.841 INFO:tasks.workunit.client.1.vm07.stdout:1/820: creat d9/d2d/d80/d8e/f112 x:0 0 0 2026-03-10T12:38:23.841 INFO:tasks.workunit.client.1.vm07.stdout:4/931: truncate d0/d4/d5/da/f6e 4298058 0 2026-03-10T12:38:23.841 INFO:tasks.workunit.client.0.vm00.stdout:7/911: creat da/d41/d7b/d9d/f13e x:0 0 0 2026-03-10T12:38:23.847 INFO:tasks.workunit.client.1.vm07.stdout:5/833: dwrite d0/d22/d18/d3e/d5d/db6/fe4 [0,4194304] 0 2026-03-10T12:38:23.853 INFO:tasks.workunit.client.0.vm00.stdout:7/912: dwrite da/d3f/d60/f110 [0,4194304] 0 2026-03-10T12:38:23.853 INFO:tasks.workunit.client.1.vm07.stdout:7/762: mknod d0/d47/dab/dae/cfd 0 2026-03-10T12:38:23.853 INFO:tasks.workunit.client.1.vm07.stdout:7/763: stat d0/d47/da0/fda 0 2026-03-10T12:38:23.853 INFO:tasks.workunit.client.1.vm07.stdout:7/764: stat d0/d61/d79/f83 0 2026-03-10T12:38:23.855 INFO:tasks.workunit.client.0.vm00.stdout:6/799: link d2/f5e d2/d14/d7a/f11f 0 2026-03-10T12:38:23.857 INFO:tasks.workunit.client.0.vm00.stdout:0/989: mkdir d3/d7/db0/dc4/de5/d126/d5b/d13b 0 2026-03-10T12:38:23.857 
INFO:tasks.workunit.client.1.vm07.stdout:4/932: mknod d0/d4/d5/d78/dc5/df7/db2/c14d 0 2026-03-10T12:38:23.864 INFO:tasks.workunit.client.0.vm00.stdout:7/913: mkdir da/d25/d2c/d82/d68/d124/d13f 0 2026-03-10T12:38:23.865 INFO:tasks.workunit.client.0.vm00.stdout:7/914: fsync da/d26/d37/d56/ddf/d108/f11e 0 2026-03-10T12:38:23.866 INFO:tasks.workunit.client.0.vm00.stdout:6/800: symlink d2/d42/d103/l120 0 2026-03-10T12:38:23.867 INFO:tasks.workunit.client.0.vm00.stdout:6/801: chown d2/d42/d80/d89/fb8 6612 1 2026-03-10T12:38:23.867 INFO:tasks.workunit.client.1.vm07.stdout:2/730: write d0/d42/d4e/daf/fcf [833257,56148] 0 2026-03-10T12:38:23.871 INFO:tasks.workunit.client.1.vm07.stdout:3/831: dwrite dc/d18/d24/f3e [0,4194304] 0 2026-03-10T12:38:23.873 INFO:tasks.workunit.client.1.vm07.stdout:3/832: write dc/d18/d24/f49 [1493360,30386] 0 2026-03-10T12:38:23.875 INFO:tasks.workunit.client.0.vm00.stdout:0/990: creat d3/db/da4/de7/f13c x:0 0 0 2026-03-10T12:38:23.880 INFO:tasks.workunit.client.0.vm00.stdout:0/991: dwrite d3/d40/f11e [0,4194304] 0 2026-03-10T12:38:23.881 INFO:tasks.workunit.client.1.vm07.stdout:9/887: dwrite d5/d1f/d5e/d6b/fae [0,4194304] 0 2026-03-10T12:38:23.891 INFO:tasks.workunit.client.0.vm00.stdout:0/992: dread - d3/d7/db0/dc4/de5/d126/d5b/d38/db3/de2/fc6 zero size 2026-03-10T12:38:23.893 INFO:tasks.workunit.client.0.vm00.stdout:0/993: chown d3/d7/db0/dc4/de5/d126/d9d/f12e 1037247221 1 2026-03-10T12:38:23.895 INFO:tasks.workunit.client.1.vm07.stdout:8/785: write d1/d3/d11/f43 [742625,7763] 0 2026-03-10T12:38:23.903 INFO:tasks.workunit.client.0.vm00.stdout:7/915: dwrite da/d26/d37/fc4 [0,4194304] 0 2026-03-10T12:38:23.903 INFO:tasks.workunit.client.1.vm07.stdout:7/765: symlink d0/d57/d62/lfe 0 2026-03-10T12:38:23.903 INFO:tasks.workunit.client.1.vm07.stdout:0/916: dwrite d0/d14/d5f/d76/d2f/d31/d79/ffd [0,4194304] 0 2026-03-10T12:38:23.903 INFO:tasks.workunit.client.1.vm07.stdout:0/917: chown d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/d115 2 1 
2026-03-10T12:38:23.905 INFO:tasks.workunit.client.1.vm07.stdout:3/833: sync 2026-03-10T12:38:23.916 INFO:tasks.workunit.client.0.vm00.stdout:7/916: creat da/d41/d48/d81/f140 x:0 0 0 2026-03-10T12:38:23.934 INFO:tasks.workunit.client.0.vm00.stdout:0/994: rename d3/d7/db0/dc4/de5/d126/d9d/ff3 to d3/d7/db0/dc4/de5/d126/dcc/dea/d102/f13d 0 2026-03-10T12:38:23.938 INFO:tasks.workunit.client.1.vm07.stdout:6/791: dread d1/d4/d6/d16/d1a/f9f [0,4194304] 0 2026-03-10T12:38:23.941 INFO:tasks.workunit.client.0.vm00.stdout:0/995: mkdir d3/db/da4/de7/d13e 0 2026-03-10T12:38:23.944 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:23 vm00.local ceph-mon[50686]: Updating vm07:/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:38:23.945 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:23 vm00.local ceph-mon[50686]: Updating vm00:/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:38:23.945 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:23 vm00.local ceph-mon[50686]: pgmap v8: 65 pgs: 65 active+clean; 3.6 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 33 MiB/s rd, 82 MiB/s wr, 204 op/s 2026-03-10T12:38:23.945 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:23 vm00.local ceph-mon[50686]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.client.admin.keyring 2026-03-10T12:38:23.945 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:23 vm00.local ceph-mon[50686]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.client.admin.keyring 2026-03-10T12:38:23.945 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:23 vm00.local ceph-mon[50686]: mgrmap e25: vm07.kfawlb(active, since 10s), standbys: vm00.nescmq 2026-03-10T12:38:23.945 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:23 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:23.945 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:23 vm00.local ceph-mon[50686]: from='mgr.24461 
192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mgr metadata", "who": "vm00.nescmq", "id": "vm00.nescmq"}]: dispatch 2026-03-10T12:38:23.945 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:23 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:23.945 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:23 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:23.945 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:23 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:23.945 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:23 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:23.945 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:23 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:23.945 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:23 vm00.local ceph-mon[50686]: Reconfiguring prometheus.vm00 (dependencies changed)... 
2026-03-10T12:38:23.946 INFO:tasks.workunit.client.0.vm00.stdout:0/996: chown d3/db/c23 16190 1 2026-03-10T12:38:23.947 INFO:tasks.workunit.client.0.vm00.stdout:0/997: truncate d3/d7/f12d 850310 0 2026-03-10T12:38:23.969 INFO:tasks.workunit.client.1.vm07.stdout:9/888: rmdir d5/d16/d23/d26 39 2026-03-10T12:38:23.972 INFO:tasks.workunit.client.1.vm07.stdout:7/766: mknod d0/d61/d79/cff 0 2026-03-10T12:38:23.974 INFO:tasks.workunit.client.1.vm07.stdout:0/918: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/f134 x:0 0 0 2026-03-10T12:38:23.980 INFO:tasks.workunit.client.1.vm07.stdout:3/834: creat dc/dd/d28/d7a/f11c x:0 0 0 2026-03-10T12:38:23.980 INFO:tasks.workunit.client.1.vm07.stdout:1/821: getdents d9/df/d29/d2b/d31/d91/d59 0 2026-03-10T12:38:23.980 INFO:tasks.workunit.client.1.vm07.stdout:5/834: getdents d0/d22/d18/d3e 0 2026-03-10T12:38:23.981 INFO:tasks.workunit.client.1.vm07.stdout:8/786: symlink d1/d3/d40/l102 0 2026-03-10T12:38:23.982 INFO:tasks.workunit.client.1.vm07.stdout:7/767: unlink d0/d61/l94 0 2026-03-10T12:38:23.985 INFO:tasks.workunit.client.1.vm07.stdout:3/835: creat dc/dd/d1f/d6f/dcf/f11d x:0 0 0 2026-03-10T12:38:23.986 INFO:tasks.workunit.client.1.vm07.stdout:1/822: creat d9/df/d29/d2b/d31/d91/d59/f113 x:0 0 0 2026-03-10T12:38:23.988 INFO:tasks.workunit.client.1.vm07.stdout:7/768: mknod d0/d61/d79/db5/c100 0 2026-03-10T12:38:23.989 INFO:tasks.workunit.client.1.vm07.stdout:7/769: chown d0/d52/c60 682610 1 2026-03-10T12:38:23.990 INFO:tasks.workunit.client.1.vm07.stdout:3/836: fdatasync dc/dd/d28/d3b/f4d 0 2026-03-10T12:38:23.991 INFO:tasks.workunit.client.1.vm07.stdout:1/823: symlink d9/df/d29/d2b/d30/l114 0 2026-03-10T12:38:23.992 INFO:tasks.workunit.client.1.vm07.stdout:6/792: link d1/d4/d6/d46/d4d/dc7/dd9/cec d1/d4/d6/d46/d4d/dc7/c104 0 2026-03-10T12:38:23.994 INFO:tasks.workunit.client.1.vm07.stdout:3/837: dwrite dc/dd/db5/f115 [0,4194304] 0 2026-03-10T12:38:23.997 INFO:tasks.workunit.client.1.vm07.stdout:3/838: dwrite dc/dd/d43/d76/d95/fb6 
[4194304,4194304] 0 2026-03-10T12:38:24.009 INFO:tasks.workunit.client.1.vm07.stdout:7/770: creat d0/d67/f101 x:0 0 0 2026-03-10T12:38:24.009 INFO:tasks.workunit.client.1.vm07.stdout:7/771: chown d0/d61/d79/f8d 624 1 2026-03-10T12:38:24.018 INFO:tasks.workunit.client.1.vm07.stdout:6/793: creat d1/d4/d6/d43/f105 x:0 0 0 2026-03-10T12:38:24.026 INFO:tasks.workunit.client.1.vm07.stdout:3/839: fdatasync dc/d18/d2d/f80 0 2026-03-10T12:38:24.026 INFO:tasks.workunit.client.1.vm07.stdout:7/772: chown d0/l1c 0 1 2026-03-10T12:38:24.026 INFO:tasks.workunit.client.1.vm07.stdout:6/794: write d1/d4/f19 [3910486,44205] 0 2026-03-10T12:38:24.026 INFO:tasks.workunit.client.1.vm07.stdout:3/840: dread dc/dd/d28/d7a/f117 [0,4194304] 0 2026-03-10T12:38:24.026 INFO:tasks.workunit.client.1.vm07.stdout:8/787: getdents d1/d3/d40/d92/dba/df1 0 2026-03-10T12:38:24.026 INFO:tasks.workunit.client.1.vm07.stdout:5/835: link d0/d22/d18/d19/d21/d54/c96 d0/c122 0 2026-03-10T12:38:24.026 INFO:tasks.workunit.client.1.vm07.stdout:3/841: symlink dc/d18/de2/df6/l11e 0 2026-03-10T12:38:24.032 INFO:tasks.workunit.client.1.vm07.stdout:8/788: symlink d1/d3/d6/d50/d70/dd4/l103 0 2026-03-10T12:38:24.037 INFO:tasks.workunit.client.1.vm07.stdout:5/836: dwrite d0/d22/d18/d19/d2e/d67/dd9/fef [0,4194304] 0 2026-03-10T12:38:24.040 INFO:tasks.workunit.client.1.vm07.stdout:3/842: creat dc/dd/d1f/dc7/dc9/d116/f11f x:0 0 0 2026-03-10T12:38:24.046 INFO:tasks.workunit.client.1.vm07.stdout:3/843: write dc/dd/d28/d7a/d8e/f10a [249535,61524] 0 2026-03-10T12:38:24.049 INFO:tasks.workunit.client.1.vm07.stdout:3/844: rename dc/dd/d28/d7a/f7f to dc/f120 0 2026-03-10T12:38:24.049 INFO:tasks.workunit.client.1.vm07.stdout:3/845: readlink dc/dd/d43/d76/d95/dde/le9 0 2026-03-10T12:38:24.051 INFO:tasks.workunit.client.1.vm07.stdout:3/846: mkdir dc/dd/d43/d76/d95/d121 0 2026-03-10T12:38:24.100 INFO:tasks.workunit.client.1.vm07.stdout:7/773: getdents d0/d47/dab/dae 0 2026-03-10T12:38:24.101 
INFO:tasks.workunit.client.1.vm07.stdout:7/774: chown d0/d57/dd6/d80 87 1 2026-03-10T12:38:24.101 INFO:tasks.workunit.client.1.vm07.stdout:7/775: dread - d0/d61/db4/f7a zero size 2026-03-10T12:38:24.109 INFO:tasks.workunit.client.0.vm00.stdout:6/802: write d2/d16/f17 [4921132,13840] 0 2026-03-10T12:38:24.110 INFO:tasks.workunit.client.1.vm07.stdout:4/933: write d0/d4/d10/d9a/d124/f100 [137605,105750] 0 2026-03-10T12:38:24.110 INFO:tasks.workunit.client.0.vm00.stdout:6/803: write d2/d14/d7a/db9/f9b [629349,97153] 0 2026-03-10T12:38:24.118 INFO:tasks.workunit.client.1.vm07.stdout:4/934: truncate d0/d4/d10/d9a/f113 337732 0 2026-03-10T12:38:24.120 INFO:tasks.workunit.client.0.vm00.stdout:6/804: link d2/d14/d7a/db9/f9b d2/da/dc/d94/f121 0 2026-03-10T12:38:24.121 INFO:tasks.workunit.client.1.vm07.stdout:4/935: write d0/d4/d5/f125 [511118,107060] 0 2026-03-10T12:38:24.121 INFO:tasks.workunit.client.1.vm07.stdout:4/936: fdatasync d0/d4/df2/df6/d46/d76/f130 0 2026-03-10T12:38:24.124 INFO:tasks.workunit.client.0.vm00.stdout:7/917: dwrite da/d47/fb7 [0,4194304] 0 2026-03-10T12:38:24.125 INFO:tasks.workunit.client.1.vm07.stdout:4/937: creat d0/d4/d10/d3c/d2b/d2d/da7/f14e x:0 0 0 2026-03-10T12:38:24.127 INFO:tasks.workunit.client.0.vm00.stdout:0/998: write d3/db/d24/d25/f7d [387551,58001] 0 2026-03-10T12:38:24.136 INFO:tasks.workunit.client.1.vm07.stdout:4/938: mkdir d0/d4/d10/d114/d14b/d14f 0 2026-03-10T12:38:24.137 INFO:tasks.workunit.client.1.vm07.stdout:2/731: write d0/d42/d26/f52 [4969954,124555] 0 2026-03-10T12:38:24.138 INFO:tasks.workunit.client.0.vm00.stdout:6/805: getdents d2/d14/dbb 0 2026-03-10T12:38:24.140 INFO:tasks.workunit.client.1.vm07.stdout:2/732: truncate d0/d29/d64/fd2 108963 0 2026-03-10T12:38:24.140 INFO:tasks.workunit.client.0.vm00.stdout:7/918: dwrite da/d25/d2c/d82/d68/f10e [0,4194304] 0 2026-03-10T12:38:24.145 INFO:tasks.workunit.client.1.vm07.stdout:4/939: symlink d0/d4/d5/d78/dc5/d12c/l150 0 2026-03-10T12:38:24.146 
INFO:tasks.workunit.client.1.vm07.stdout:9/889: write d5/f91 [2645985,100194] 0 2026-03-10T12:38:24.146 INFO:tasks.workunit.client.1.vm07.stdout:2/733: creat d0/d42/d26/d38/d4f/d5d/ff8 x:0 0 0 2026-03-10T12:38:24.152 INFO:tasks.workunit.client.0.vm00.stdout:0/999: link d3/d40/d65/fc0 d3/d7/db0/dc4/de5/d126/d5b/d38/db3/f13f 0 2026-03-10T12:38:24.156 INFO:tasks.workunit.client.1.vm07.stdout:0/919: dwrite d0/d14/d5f/d76/d2f/d31/d4f/d60/f89 [0,4194304] 0 2026-03-10T12:38:24.158 INFO:tasks.workunit.client.1.vm07.stdout:0/920: fdatasync d0/d14/d5f/d76/d2f/d31/d4f/d9d/dd4/fd5 0 2026-03-10T12:38:24.161 INFO:tasks.workunit.client.1.vm07.stdout:9/890: creat d5/d69/f130 x:0 0 0 2026-03-10T12:38:24.161 INFO:tasks.workunit.client.1.vm07.stdout:2/734: mkdir d0/d29/d64/db5/dbb/df9 0 2026-03-10T12:38:24.161 INFO:tasks.workunit.client.1.vm07.stdout:9/891: chown d5/d13/d57/d4f 619470 1 2026-03-10T12:38:24.164 INFO:tasks.workunit.client.1.vm07.stdout:1/824: dwrite d9/df/f11 [0,4194304] 0 2026-03-10T12:38:24.171 INFO:tasks.workunit.client.0.vm00.stdout:7/919: mknod da/d26/d37/d56/ddf/d108/d12a/c141 0 2026-03-10T12:38:24.178 INFO:tasks.workunit.client.1.vm07.stdout:6/795: write d1/d4/f5a [1022759,96970] 0 2026-03-10T12:38:24.178 INFO:tasks.workunit.client.0.vm00.stdout:6/806: rename d2/d16/d29/d31/d88/d92/daa/dc1/fc9 to d2/d16/d29/d31/d88/d92/daa/dc1/f122 0 2026-03-10T12:38:24.178 INFO:tasks.workunit.client.1.vm07.stdout:6/796: chown d1/d4/d4a 71902 1 2026-03-10T12:38:24.181 INFO:tasks.workunit.client.1.vm07.stdout:0/921: rmdir d0/d14/d5f/d76/d2f/d31/d4f/d60 39 2026-03-10T12:38:24.181 INFO:tasks.workunit.client.1.vm07.stdout:2/735: symlink d0/d42/d1f/lfa 0 2026-03-10T12:38:24.182 INFO:tasks.workunit.client.1.vm07.stdout:2/736: write d0/d42/d26/d7d/fe8 [2192869,22034] 0 2026-03-10T12:38:24.185 INFO:tasks.workunit.client.1.vm07.stdout:8/789: write d1/d3/f1f [5441911,75587] 0 2026-03-10T12:38:24.188 INFO:tasks.workunit.client.1.vm07.stdout:5/837: dwrite d0/d22/d18/d19/d21/f10f [0,4194304] 
0 2026-03-10T12:38:24.197 INFO:tasks.workunit.client.1.vm07.stdout:3/847: getdents dc/dd/d1f/dc7/dc9/d116 0 2026-03-10T12:38:24.200 INFO:tasks.workunit.client.1.vm07.stdout:7/776: write d0/d47/f9a [6110,16689] 0 2026-03-10T12:38:24.204 INFO:tasks.workunit.client.1.vm07.stdout:7/777: dread d0/d57/dd6/d80/fac [0,4194304] 0 2026-03-10T12:38:24.209 INFO:tasks.workunit.client.0.vm00.stdout:6/807: mkdir d2/d14/dc0/d123 0 2026-03-10T12:38:24.209 INFO:tasks.workunit.client.0.vm00.stdout:6/808: readlink d2/d42/l79 0 2026-03-10T12:38:24.210 INFO:tasks.workunit.client.0.vm00.stdout:7/920: link da/d3f/d71/le6 da/d41/d7b/d9d/dba/l142 0 2026-03-10T12:38:24.214 INFO:tasks.workunit.client.1.vm07.stdout:4/940: write d0/d4/d10/d3c/d2b/d54/f139 [130119,36241] 0 2026-03-10T12:38:24.214 INFO:tasks.workunit.client.0.vm00.stdout:6/809: mkdir d2/d16/d29/d31/d34/d124 0 2026-03-10T12:38:24.216 INFO:tasks.workunit.client.1.vm07.stdout:2/737: symlink d0/d42/d4e/daf/lfb 0 2026-03-10T12:38:24.220 INFO:tasks.workunit.client.1.vm07.stdout:2/738: dwrite d0/d29/d64/d74/f9e [4194304,4194304] 0 2026-03-10T12:38:24.247 INFO:tasks.workunit.client.0.vm00.stdout:6/810: dread d2/da/f11 [0,4194304] 0 2026-03-10T12:38:24.248 INFO:tasks.workunit.client.0.vm00.stdout:7/921: truncate da/d1b/d40/fca 2639684 0 2026-03-10T12:38:24.249 INFO:tasks.workunit.client.0.vm00.stdout:7/922: dread da/d3f/d60/f88 [0,4194304] 0 2026-03-10T12:38:24.250 INFO:tasks.workunit.client.1.vm07.stdout:1/825: mkdir d9/df/d55/d115 0 2026-03-10T12:38:24.258 INFO:tasks.workunit.client.1.vm07.stdout:9/892: write d5/fda [741991,125989] 0 2026-03-10T12:38:24.258 INFO:tasks.workunit.client.1.vm07.stdout:9/893: chown d5/d1f/d5e/d6b/l6d 54512 1 2026-03-10T12:38:24.267 INFO:tasks.workunit.client.1.vm07.stdout:8/790: dwrite d1/d3/f73 [0,4194304] 0 2026-03-10T12:38:24.269 INFO:tasks.workunit.client.1.vm07.stdout:5/838: write d0/d22/d18/d19/d21/d54/f7d [920346,7067] 0 2026-03-10T12:38:24.273 INFO:tasks.workunit.client.1.vm07.stdout:3/848: dwrite 
dc/dd/d43/d5c/f9d [0,4194304] 0 2026-03-10T12:38:24.277 INFO:tasks.workunit.client.1.vm07.stdout:6/797: dwrite d1/d4/d6/d4e/f8b [0,4194304] 0 2026-03-10T12:38:24.277 INFO:tasks.workunit.client.1.vm07.stdout:3/849: read dc/dd/d28/f46 [1128659,336] 0 2026-03-10T12:38:24.277 INFO:tasks.workunit.client.1.vm07.stdout:3/850: readlink dc/d18/d24/l63 0 2026-03-10T12:38:24.277 INFO:tasks.workunit.client.1.vm07.stdout:3/851: readlink la 0 2026-03-10T12:38:24.277 INFO:tasks.workunit.client.1.vm07.stdout:3/852: chown dc/d18/d99/da3/def/c100 37256 1 2026-03-10T12:38:24.279 INFO:tasks.workunit.client.1.vm07.stdout:5/839: read d0/d22/f16 [2668154,39844] 0 2026-03-10T12:38:24.283 INFO:tasks.workunit.client.0.vm00.stdout:7/923: rmdir da/d41/d7b/d9d/dc8 39 2026-03-10T12:38:24.284 INFO:tasks.workunit.client.0.vm00.stdout:7/924: fsync da/d3f/d60/f110 0 2026-03-10T12:38:24.285 INFO:tasks.workunit.client.1.vm07.stdout:0/922: mkdir d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/dd0/d135 0 2026-03-10T12:38:24.285 INFO:tasks.workunit.client.1.vm07.stdout:7/778: creat d0/d61/db4/df4/f102 x:0 0 0 2026-03-10T12:38:24.291 INFO:tasks.workunit.client.1.vm07.stdout:1/826: truncate d9/df/d29/d2b/d31/fc6 983096 0 2026-03-10T12:38:24.297 INFO:tasks.workunit.client.1.vm07.stdout:8/791: readlink d1/d3/d6/d50/d70/dd4/lf2 0 2026-03-10T12:38:24.298 INFO:tasks.workunit.client.1.vm07.stdout:3/853: mkdir dc/d18/d99/d122 0 2026-03-10T12:38:24.299 INFO:tasks.workunit.client.1.vm07.stdout:8/792: dwrite d1/d3/d40/d92/dba/feb [0,4194304] 0 2026-03-10T12:38:24.301 INFO:tasks.workunit.client.1.vm07.stdout:5/840: unlink d0/d22/d18/d19/d21/dc2/df0/l112 0 2026-03-10T12:38:24.301 INFO:tasks.workunit.client.1.vm07.stdout:8/793: chown d1/d3/d40/d92/f94 27090 1 2026-03-10T12:38:24.303 INFO:tasks.workunit.client.1.vm07.stdout:7/779: creat d0/d61/db4/f103 x:0 0 0 2026-03-10T12:38:24.304 INFO:tasks.workunit.client.1.vm07.stdout:7/780: dread - d0/d61/db4/fdc zero size 2026-03-10T12:38:24.307 
INFO:tasks.workunit.client.1.vm07.stdout:0/923: symlink d0/d14/d5f/d76/d2f/d31/d79/d9e/l136 0 2026-03-10T12:38:24.315 INFO:tasks.workunit.client.1.vm07.stdout:1/827: mknod d9/d2d/d4f/dde/c116 0 2026-03-10T12:38:24.315 INFO:tasks.workunit.client.1.vm07.stdout:5/841: rmdir d0/d22/d18/d19/d72/dcc 39 2026-03-10T12:38:24.315 INFO:tasks.workunit.client.1.vm07.stdout:5/842: chown d0/d22/d18/d19/d21/d54/dcb 117 1 2026-03-10T12:38:24.315 INFO:tasks.workunit.client.1.vm07.stdout:9/894: link d5/d69/d93/d97/fc3 d5/d1f/d5e/d6b/f131 0 2026-03-10T12:38:24.315 INFO:tasks.workunit.client.1.vm07.stdout:7/781: creat d0/d61/d79/f104 x:0 0 0 2026-03-10T12:38:24.315 INFO:tasks.workunit.client.1.vm07.stdout:7/782: chown d0/d61/d79/db5/c100 54 1 2026-03-10T12:38:24.315 INFO:tasks.workunit.client.1.vm07.stdout:6/798: rename d1/d4/d9b/de1 to d1/d106 0 2026-03-10T12:38:24.318 INFO:tasks.workunit.client.1.vm07.stdout:5/843: mknod d0/d22/d18/d3e/d53/c123 0 2026-03-10T12:38:24.319 INFO:tasks.workunit.client.1.vm07.stdout:1/828: rename d9/d2d/d80/d8e/c109 to d9/d2d/c117 0 2026-03-10T12:38:24.321 INFO:tasks.workunit.client.1.vm07.stdout:3/854: rmdir dc/dd/d43/d76/d95/dd8 0 2026-03-10T12:38:24.323 INFO:tasks.workunit.client.1.vm07.stdout:0/924: mkdir d0/d14/d5f/d76/d2f/d31/d79/dcc/d137 0 2026-03-10T12:38:24.327 INFO:tasks.workunit.client.1.vm07.stdout:1/829: rmdir d9/d2d/d80/d8e/dc7 39 2026-03-10T12:38:24.328 INFO:tasks.workunit.client.1.vm07.stdout:1/830: dwrite d9/d2d/d4f/d75/de3/ff9 [0,4194304] 0 2026-03-10T12:38:24.332 INFO:tasks.workunit.client.1.vm07.stdout:1/831: unlink d9/df/d55/d9f/c10b 0 2026-03-10T12:38:24.338 INFO:tasks.workunit.client.1.vm07.stdout:3/855: dread dc/dd/d28/d7a/fba [0,4194304] 0 2026-03-10T12:38:24.339 INFO:tasks.workunit.client.1.vm07.stdout:6/799: getdents d1/d4/d6/d46/d4d/dc7 0 2026-03-10T12:38:24.345 INFO:tasks.workunit.client.1.vm07.stdout:1/832: dread d9/df/d29/d2b/d31/d91/faf [0,4194304] 0 2026-03-10T12:38:24.349 INFO:tasks.workunit.client.1.vm07.stdout:6/800: 
dread - d1/d4/d6/d16/fd1 zero size 2026-03-10T12:38:24.349 INFO:tasks.workunit.client.1.vm07.stdout:6/801: readlink d1/d4/d6/d43/d65/l95 0 2026-03-10T12:38:24.349 INFO:tasks.workunit.client.1.vm07.stdout:6/802: chown d1/d4/d6/f60 2870799 1 2026-03-10T12:38:24.349 INFO:tasks.workunit.client.1.vm07.stdout:3/856: rename dc/d18/d99/d9c to dc/d18/d99/d123 0 2026-03-10T12:38:24.350 INFO:tasks.workunit.client.1.vm07.stdout:0/925: dread d0/d14/f36 [0,4194304] 0 2026-03-10T12:38:24.351 INFO:tasks.workunit.client.1.vm07.stdout:1/833: read d9/df/d29/d2b/d31/f7d [261595,83830] 0 2026-03-10T12:38:24.351 INFO:tasks.workunit.client.1.vm07.stdout:6/803: mkdir d1/d4/d6/d46/d4d/d107 0 2026-03-10T12:38:24.352 INFO:tasks.workunit.client.1.vm07.stdout:6/804: write d1/d4/f5a [256154,13147] 0 2026-03-10T12:38:24.353 INFO:tasks.workunit.client.1.vm07.stdout:6/805: rename d1/d4/d6/d46/d4d/dc7/dd9 to d1/d4/d6/d46/d4d/dc7/dd9/de3/d108 22 2026-03-10T12:38:24.355 INFO:tasks.workunit.client.1.vm07.stdout:0/926: mknod d0/d14/d5f/d76/da1/c138 0 2026-03-10T12:38:24.364 INFO:tasks.workunit.client.1.vm07.stdout:1/834: creat d9/df/dc2/de1/f118 x:0 0 0 2026-03-10T12:38:24.366 INFO:tasks.workunit.client.1.vm07.stdout:3/857: dread dc/dd/d1f/f30 [0,4194304] 0 2026-03-10T12:38:24.367 INFO:tasks.workunit.client.1.vm07.stdout:6/806: rename d1/d4/d6/d4e/d64/f6f to d1/d4/d6/d46/d4d/dc7/f109 0 2026-03-10T12:38:24.368 INFO:tasks.workunit.client.1.vm07.stdout:1/835: dwrite d9/d2d/d4f/d75/d77/da7/fcd [0,4194304] 0 2026-03-10T12:38:24.375 INFO:tasks.workunit.client.1.vm07.stdout:0/927: unlink d0/d14/d5f/d76/d2f/d31/d79/d85/fc6 0 2026-03-10T12:38:24.378 INFO:tasks.workunit.client.1.vm07.stdout:6/807: rmdir d1/d4/d6/d46/d4d 39 2026-03-10T12:38:24.378 INFO:tasks.workunit.client.1.vm07.stdout:6/808: readlink d1/d4/d4a/l52 0 2026-03-10T12:38:24.384 INFO:tasks.workunit.client.1.vm07.stdout:0/928: rename d0/d14/d5f/d76/l4c to d0/d14/d5f/d41/d6a/d9a/l139 0 2026-03-10T12:38:24.387 
INFO:tasks.workunit.client.1.vm07.stdout:6/809: symlink d1/d4/d44/l10a 0 2026-03-10T12:38:24.392 INFO:tasks.workunit.client.1.vm07.stdout:6/810: stat d1/dd7/l72 0 2026-03-10T12:38:24.392 INFO:tasks.workunit.client.1.vm07.stdout:4/941: write d0/d4/d10/f116 [417571,114643] 0 2026-03-10T12:38:24.394 INFO:tasks.workunit.client.1.vm07.stdout:4/942: write d0/d4/d10/d3c/d2b/d54/de1/f91 [932660,70981] 0 2026-03-10T12:38:24.396 INFO:tasks.workunit.client.0.vm00.stdout:7/925: write da/d1b/d40/f74 [2561073,43687] 0 2026-03-10T12:38:24.398 INFO:tasks.workunit.client.1.vm07.stdout:2/739: write d0/d42/d1f/f2f [2644376,37438] 0 2026-03-10T12:38:24.401 INFO:tasks.workunit.client.1.vm07.stdout:4/943: dwrite d0/d4/d5/da/d95/f121 [0,4194304] 0 2026-03-10T12:38:24.410 INFO:tasks.workunit.client.1.vm07.stdout:8/794: dwrite d1/d3/d6/d50/faa [0,4194304] 0 2026-03-10T12:38:24.419 INFO:tasks.workunit.client.1.vm07.stdout:9/895: dwrite d5/d13/d57/d4f/f58 [0,4194304] 0 2026-03-10T12:38:24.420 INFO:tasks.workunit.client.1.vm07.stdout:9/896: chown d5 447 1 2026-03-10T12:38:24.426 INFO:tasks.workunit.client.0.vm00.stdout:7/926: symlink da/d26/d37/d56/l143 0 2026-03-10T12:38:24.443 INFO:tasks.workunit.client.1.vm07.stdout:7/783: dwrite d0/d61/db4/fdc [0,4194304] 0 2026-03-10T12:38:24.444 INFO:tasks.workunit.client.1.vm07.stdout:1/836: rmdir d9/df/d29/d2b/d30/d101 0 2026-03-10T12:38:24.444 INFO:tasks.workunit.client.1.vm07.stdout:7/784: dwrite d0/d61/db4/f4b [8388608,4194304] 0 2026-03-10T12:38:24.444 INFO:tasks.workunit.client.1.vm07.stdout:5/844: dwrite d0/d22/d18/d3e/d5d/db6/fc4 [0,4194304] 0 2026-03-10T12:38:24.444 INFO:tasks.workunit.client.1.vm07.stdout:4/944: fsync d0/d4/df2/f11f 0 2026-03-10T12:38:24.445 INFO:tasks.workunit.client.0.vm00.stdout:7/927: creat da/f144 x:0 0 0 2026-03-10T12:38:24.446 INFO:tasks.workunit.client.0.vm00.stdout:7/928: dread - da/d25/d2e/d4c/f126 zero size 2026-03-10T12:38:24.446 INFO:tasks.workunit.client.0.vm00.stdout:7/929: stat da/d25/d2c/d82/d68/df8/c10f 0 
2026-03-10T12:38:24.452 INFO:tasks.workunit.client.1.vm07.stdout:6/811: getdents d1/d4/d4a 0 2026-03-10T12:38:24.454 INFO:tasks.workunit.client.1.vm07.stdout:5/845: dread d0/d22/d18/d19/d21/fbd [0,4194304] 0 2026-03-10T12:38:24.462 INFO:tasks.workunit.client.1.vm07.stdout:1/837: rmdir d9/d2d/d4f/d75/de3 39 2026-03-10T12:38:24.463 INFO:tasks.workunit.client.1.vm07.stdout:1/838: write d9/d2d/d4f/d5a/fdd [833255,34222] 0 2026-03-10T12:38:24.469 INFO:tasks.workunit.client.1.vm07.stdout:8/795: mkdir d1/d3/d40/d104 0 2026-03-10T12:38:24.470 INFO:tasks.workunit.client.1.vm07.stdout:8/796: write d1/d3/d40/d92/dba/fc3 [3897634,99653] 0 2026-03-10T12:38:24.473 INFO:tasks.workunit.client.1.vm07.stdout:5/846: creat d0/d22/d18/d19/d21/d54/dcb/db8/f124 x:0 0 0 2026-03-10T12:38:24.475 INFO:tasks.workunit.client.1.vm07.stdout:2/740: link d0/d42/d26/d38/d4f/d5d/l7c d0/dcd/lfc 0 2026-03-10T12:38:24.480 INFO:tasks.workunit.client.1.vm07.stdout:8/797: mkdir d1/d3/db2/dcd/d105 0 2026-03-10T12:38:24.480 INFO:tasks.workunit.client.1.vm07.stdout:7/785: rmdir d0/d47/da0/dd4/de2 0 2026-03-10T12:38:24.480 INFO:tasks.workunit.client.1.vm07.stdout:7/786: write d0/f4f [137098,51582] 0 2026-03-10T12:38:24.480 INFO:tasks.workunit.client.1.vm07.stdout:2/741: rmdir d0/d29 39 2026-03-10T12:38:24.482 INFO:tasks.workunit.client.1.vm07.stdout:8/798: dread d1/d3/d11/f77 [4194304,4194304] 0 2026-03-10T12:38:24.483 INFO:tasks.workunit.client.1.vm07.stdout:7/787: dwrite d0/d47/dde/ff6 [0,4194304] 0 2026-03-10T12:38:24.485 INFO:tasks.workunit.client.1.vm07.stdout:7/788: stat d0/c36 0 2026-03-10T12:38:24.486 INFO:tasks.workunit.client.1.vm07.stdout:2/742: symlink d0/d42/d4e/d77/lfd 0 2026-03-10T12:38:24.487 INFO:tasks.workunit.client.1.vm07.stdout:4/945: read d0/d4/d10/d3c/f68 [469805,95048] 0 2026-03-10T12:38:24.492 INFO:tasks.workunit.client.1.vm07.stdout:1/839: link d9/d2d/d4f/d75/cf6 d9/df/d55/c119 0 2026-03-10T12:38:24.494 INFO:tasks.workunit.client.1.vm07.stdout:7/789: getdents d0/d57 0 
2026-03-10T12:38:24.496 INFO:tasks.workunit.client.1.vm07.stdout:7/790: unlink d0/d61/db4/d8a/fd8 0 2026-03-10T12:38:24.498 INFO:tasks.workunit.client.1.vm07.stdout:7/791: dread d0/d61/db4/fdc [0,4194304] 0 2026-03-10T12:38:24.499 INFO:tasks.workunit.client.1.vm07.stdout:7/792: chown d0/d61/db4/d8a/d9d/ce4 102 1 2026-03-10T12:38:24.500 INFO:tasks.workunit.client.1.vm07.stdout:0/929: sync 2026-03-10T12:38:24.500 INFO:tasks.workunit.client.1.vm07.stdout:6/812: sync 2026-03-10T12:38:24.501 INFO:tasks.workunit.client.1.vm07.stdout:0/930: readlink d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/l10e 0 2026-03-10T12:38:24.505 INFO:tasks.workunit.client.1.vm07.stdout:6/813: dwrite d1/d4/f5a [0,4194304] 0 2026-03-10T12:38:24.514 INFO:tasks.workunit.client.1.vm07.stdout:0/931: dread - d0/d14/d5f/d41/d6a/fe0 zero size 2026-03-10T12:38:24.514 INFO:tasks.workunit.client.1.vm07.stdout:6/814: mkdir d1/d4/d6/d4e/d64/d10b 0 2026-03-10T12:38:24.517 INFO:tasks.workunit.client.1.vm07.stdout:0/932: truncate d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/fa4 3993971 0 2026-03-10T12:38:24.517 INFO:tasks.workunit.client.1.vm07.stdout:0/933: chown d0/d14/d5f/d3b/dbc/fb6 12310915 1 2026-03-10T12:38:24.518 INFO:tasks.workunit.client.1.vm07.stdout:6/815: rename d1/d4/d6/d46/d4d/fb to d1/d4/d6/d16/d1a/d9d/f10c 0 2026-03-10T12:38:24.521 INFO:tasks.workunit.client.1.vm07.stdout:6/816: chown d1/d4/d6/d46/d4d/ce8 1715 1 2026-03-10T12:38:24.521 INFO:tasks.workunit.client.1.vm07.stdout:0/934: dread - d0/d14/d5f/d76/d2f/d31/d4f/d9d/f104 zero size 2026-03-10T12:38:24.521 INFO:tasks.workunit.client.1.vm07.stdout:6/817: creat d1/d4/d6/d16/f10d x:0 0 0 2026-03-10T12:38:24.522 INFO:tasks.workunit.client.1.vm07.stdout:0/935: creat d0/d14/d5f/d76/f13a x:0 0 0 2026-03-10T12:38:24.527 INFO:tasks.workunit.client.1.vm07.stdout:0/936: dwrite d0/d14/d5f/d76/d2f/d31/d79/ffd [0,4194304] 0 2026-03-10T12:38:24.546 INFO:tasks.workunit.client.1.vm07.stdout:0/937: read d0/d14/d5f/d76/d2f/d31/d4f/d60/f75 [3076571,51838] 0 
2026-03-10T12:38:24.555 INFO:tasks.workunit.client.1.vm07.stdout:0/938: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/f13b x:0 0 0 2026-03-10T12:38:24.558 INFO:tasks.workunit.client.1.vm07.stdout:0/939: symlink d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/d115/d11d/l13c 0 2026-03-10T12:38:24.562 INFO:tasks.workunit.client.1.vm07.stdout:0/940: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/d115/d11d/f13d x:0 0 0 2026-03-10T12:38:24.566 INFO:tasks.workunit.client.1.vm07.stdout:0/941: dwrite d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/f112 [0,4194304] 0 2026-03-10T12:38:24.572 INFO:tasks.workunit.client.1.vm07.stdout:0/942: read d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/faf [527508,114420] 0 2026-03-10T12:38:24.579 INFO:tasks.workunit.client.1.vm07.stdout:3/858: dwrite dc/d18/d24/fe8 [0,4194304] 0 2026-03-10T12:38:24.582 INFO:tasks.workunit.client.1.vm07.stdout:3/859: write dc/d18/fdd [674985,60459] 0 2026-03-10T12:38:24.583 INFO:tasks.workunit.client.1.vm07.stdout:8/799: dread d1/d3/f16 [0,4194304] 0 2026-03-10T12:38:24.584 INFO:tasks.workunit.client.1.vm07.stdout:0/943: dread d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dc8/fc7 [0,4194304] 0 2026-03-10T12:38:24.589 INFO:tasks.workunit.client.1.vm07.stdout:3/860: fsync dc/d18/d2d/f80 0 2026-03-10T12:38:24.594 INFO:tasks.workunit.client.1.vm07.stdout:3/861: write dc/d18/d2d/f71 [1253217,25211] 0 2026-03-10T12:38:24.635 INFO:tasks.workunit.client.1.vm07.stdout:3/862: sync 2026-03-10T12:38:24.641 INFO:tasks.workunit.client.1.vm07.stdout:3/863: rename dc/dd/d28/l6b to dc/dd/d1f/d6f/l124 0 2026-03-10T12:38:24.647 INFO:tasks.workunit.client.1.vm07.stdout:3/864: rename dc/le0 to dc/d18/l125 0 2026-03-10T12:38:24.650 INFO:tasks.workunit.client.0.vm00.stdout:6/811: rename d2/d51/d70/c114 to d2/d14/dc0/c125 0 2026-03-10T12:38:24.650 INFO:tasks.workunit.client.1.vm07.stdout:3/865: chown dc/d18/d2d/de5/f10e 63672399 1 2026-03-10T12:38:24.651 INFO:tasks.workunit.client.1.vm07.stdout:9/897: write d5/d13/d57/d4f/d6a/fba [1549360,94206] 0 
2026-03-10T12:38:24.653 INFO:tasks.workunit.client.0.vm00.stdout:6/812: dwrite d2/d16/d29/d31/d88/dd5/fe8 [0,4194304] 0 2026-03-10T12:38:24.660 INFO:tasks.workunit.client.1.vm07.stdout:3/866: dread dc/dd/d28/d3b/f9f [0,4194304] 0 2026-03-10T12:38:24.660 INFO:tasks.workunit.client.1.vm07.stdout:3/867: readlink dc/d18/d2d/l102 0 2026-03-10T12:38:24.663 INFO:tasks.workunit.client.1.vm07.stdout:5/847: write d0/d22/d18/d3e/d5d/dcf/fd2 [1047609,50295] 0 2026-03-10T12:38:24.670 INFO:tasks.workunit.client.1.vm07.stdout:3/868: mkdir dc/dd/d1f/dc7/dc9/d126 0 2026-03-10T12:38:24.672 INFO:tasks.workunit.client.0.vm00.stdout:7/930: read f0 [1194728,18386] 0 2026-03-10T12:38:24.679 INFO:tasks.workunit.client.0.vm00.stdout:6/813: unlink d2/d14/d7a/db9/c9a 0 2026-03-10T12:38:24.680 INFO:tasks.workunit.client.1.vm07.stdout:2/743: write d0/f9c [113999,8939] 0 2026-03-10T12:38:24.682 INFO:tasks.workunit.client.1.vm07.stdout:4/946: write d0/d4/d10/d3c/d2b/d2d/f65 [398796,50990] 0 2026-03-10T12:38:24.685 INFO:tasks.workunit.client.1.vm07.stdout:1/840: dwrite d9/df/dc2/ff2 [0,4194304] 0 2026-03-10T12:38:24.691 INFO:tasks.workunit.client.1.vm07.stdout:7/793: dwrite d0/f5f [0,4194304] 0 2026-03-10T12:38:24.697 INFO:tasks.workunit.client.1.vm07.stdout:7/794: dwrite d0/fc [0,4194304] 0 2026-03-10T12:38:24.711 INFO:tasks.workunit.client.1.vm07.stdout:3/869: mknod dc/d18/de2/c127 0 2026-03-10T12:38:24.717 INFO:tasks.workunit.client.1.vm07.stdout:2/744: creat d0/d42/d4e/ffe x:0 0 0 2026-03-10T12:38:24.717 INFO:tasks.workunit.client.1.vm07.stdout:6/818: write d1/dd7/d66/dd6/fda [366838,74102] 0 2026-03-10T12:38:24.717 INFO:tasks.workunit.client.1.vm07.stdout:4/947: creat d0/d4/d10/d3c/d2b/d2d/d9c/f151 x:0 0 0 2026-03-10T12:38:24.717 INFO:tasks.workunit.client.1.vm07.stdout:2/745: dwrite d0/d42/d26/d7d/fea [0,4194304] 0 2026-03-10T12:38:24.719 INFO:tasks.workunit.client.0.vm00.stdout:6/814: dread d2/d14/f32 [0,4194304] 0 2026-03-10T12:38:24.720 INFO:tasks.workunit.client.1.vm07.stdout:7/795: 
readlink d0/d61/db4/d8a/l96 0 2026-03-10T12:38:24.721 INFO:tasks.workunit.client.1.vm07.stdout:7/796: truncate d0/d47/dde/ff0 9675 0 2026-03-10T12:38:24.723 INFO:tasks.workunit.client.1.vm07.stdout:0/944: write d0/d14/d5f/d76/d2f/d31/f5a [628399,7590] 0 2026-03-10T12:38:24.724 INFO:tasks.workunit.client.1.vm07.stdout:0/945: chown d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/f112 1 1 2026-03-10T12:38:24.724 INFO:tasks.workunit.client.0.vm00.stdout:6/815: creat d2/d9f/dce/f126 x:0 0 0 2026-03-10T12:38:24.725 INFO:tasks.workunit.client.1.vm07.stdout:3/870: readlink dc/dd/db5/l90 0 2026-03-10T12:38:24.726 INFO:tasks.workunit.client.0.vm00.stdout:6/816: symlink d2/d14/d7a/db9/l127 0 2026-03-10T12:38:24.726 INFO:tasks.workunit.client.0.vm00.stdout:6/817: dread - d2/d16/d74/f7d zero size 2026-03-10T12:38:24.728 INFO:tasks.workunit.client.1.vm07.stdout:4/948: mknod d0/d4/d10/d3c/d2b/d2d/c152 0 2026-03-10T12:38:24.734 INFO:tasks.workunit.client.0.vm00.stdout:6/818: creat d2/d16/d29/d31/d88/d92/daa/dc1/f128 x:0 0 0 2026-03-10T12:38:24.735 INFO:tasks.workunit.client.1.vm07.stdout:2/746: unlink d0/d80/fbc 0 2026-03-10T12:38:24.737 INFO:tasks.workunit.client.1.vm07.stdout:7/797: mkdir d0/d57/dd6/d80/d105 0 2026-03-10T12:38:24.737 INFO:tasks.workunit.client.0.vm00.stdout:6/819: dwrite d2/d16/f41 [0,4194304] 0 2026-03-10T12:38:24.738 INFO:tasks.workunit.client.1.vm07.stdout:5/848: link d0/d22/d18/d19/d72/c9a d0/d22/d18/d19/d21/dc2/c125 0 2026-03-10T12:38:24.742 INFO:tasks.workunit.client.1.vm07.stdout:1/841: creat d9/df/d29/f11a x:0 0 0 2026-03-10T12:38:24.744 INFO:tasks.workunit.client.1.vm07.stdout:7/798: unlink d0/l46 0 2026-03-10T12:38:24.751 INFO:tasks.workunit.client.1.vm07.stdout:5/849: mkdir d0/d22/d18/d19/d36/d75/d77/d126 0 2026-03-10T12:38:24.755 INFO:tasks.workunit.client.0.vm00.stdout:6/820: creat d2/da/dc/d83/d119/f129 x:0 0 0 2026-03-10T12:38:24.755 INFO:tasks.workunit.client.1.vm07.stdout:6/819: truncate d1/d4/d6/d16/d49/fd3 1385384 0 2026-03-10T12:38:24.755 
INFO:tasks.workunit.client.1.vm07.stdout:2/747: rmdir d0/d29/d64/d6c 39 2026-03-10T12:38:24.755 INFO:tasks.workunit.client.1.vm07.stdout:2/748: chown d0/d42/d4e/dab 6871416 1 2026-03-10T12:38:24.755 INFO:tasks.workunit.client.1.vm07.stdout:0/946: creat d0/f13e x:0 0 0 2026-03-10T12:38:24.757 INFO:tasks.workunit.client.0.vm00.stdout:6/821: creat d2/da/dc/d83/d119/f12a x:0 0 0 2026-03-10T12:38:24.757 INFO:tasks.workunit.client.1.vm07.stdout:5/850: truncate d0/d22/d18/d3e/d53/fa3 3571686 0 2026-03-10T12:38:24.761 INFO:tasks.workunit.client.1.vm07.stdout:2/749: fsync d0/f4a 0 2026-03-10T12:38:24.766 INFO:tasks.workunit.client.1.vm07.stdout:2/750: write d0/d42/d26/d7d/fc8 [18866,87372] 0 2026-03-10T12:38:24.769 INFO:tasks.workunit.client.1.vm07.stdout:6/820: dread d1/d4/d6/f80 [0,4194304] 0 2026-03-10T12:38:24.772 INFO:tasks.workunit.client.1.vm07.stdout:8/800: write d1/d3/d6c/dde/fe0 [256259,40592] 0 2026-03-10T12:38:24.774 INFO:tasks.workunit.client.1.vm07.stdout:5/851: dread d0/d22/d18/d19/d21/d54/dcb/f6a [0,4194304] 0 2026-03-10T12:38:24.776 INFO:tasks.workunit.client.0.vm00.stdout:6/822: mkdir d2/d51/d12b 0 2026-03-10T12:38:24.778 INFO:tasks.workunit.client.1.vm07.stdout:0/947: creat d0/d14/d5f/d76/d2f/d31/d4f/d9d/d114/f13f x:0 0 0 2026-03-10T12:38:24.778 INFO:tasks.workunit.client.1.vm07.stdout:7/799: link d0/d61/d79/f95 d0/d47/da0/dd4/f106 0 2026-03-10T12:38:24.782 INFO:tasks.workunit.client.1.vm07.stdout:9/898: dwrite d5/d1f/d7d/ffb [0,4194304] 0 2026-03-10T12:38:24.785 INFO:tasks.workunit.client.1.vm07.stdout:4/949: sync 2026-03-10T12:38:24.785 INFO:tasks.workunit.client.1.vm07.stdout:3/871: sync 2026-03-10T12:38:24.785 INFO:tasks.workunit.client.1.vm07.stdout:2/751: sync 2026-03-10T12:38:24.788 INFO:tasks.workunit.client.1.vm07.stdout:7/800: mkdir d0/d57/dd6/d107 0 2026-03-10T12:38:24.796 INFO:tasks.workunit.client.1.vm07.stdout:9/899: creat d5/d69/d93/f132 x:0 0 0 2026-03-10T12:38:24.796 INFO:tasks.workunit.client.1.vm07.stdout:6/821: symlink 
d1/dd7/da3/dd5/l10e 0 2026-03-10T12:38:24.796 INFO:tasks.workunit.client.1.vm07.stdout:2/752: truncate d0/d29/d64/d6c/fef 307565 0 2026-03-10T12:38:24.796 INFO:tasks.workunit.client.1.vm07.stdout:3/872: symlink dc/d18/de2/df6/l128 0 2026-03-10T12:38:24.796 INFO:tasks.workunit.client.1.vm07.stdout:0/948: mkdir d0/d14/d5f/d76/d2f/d31/d4f/d9d/d140 0 2026-03-10T12:38:24.796 INFO:tasks.workunit.client.1.vm07.stdout:7/801: mkdir d0/d57/d108 0 2026-03-10T12:38:24.797 INFO:tasks.workunit.client.1.vm07.stdout:9/900: dread d5/d16/d23/d26/d68/fdc [0,4194304] 0 2026-03-10T12:38:24.797 INFO:tasks.workunit.client.1.vm07.stdout:3/873: dwrite dc/dd/fc5 [0,4194304] 0 2026-03-10T12:38:24.804 INFO:tasks.workunit.client.1.vm07.stdout:3/874: dwrite dc/d18/de2/df6/ffc [8388608,4194304] 0 2026-03-10T12:38:24.817 INFO:tasks.workunit.client.0.vm00.stdout:6/823: unlink d2/d9f/df6/cf3 0 2026-03-10T12:38:24.818 INFO:tasks.workunit.client.1.vm07.stdout:6/822: symlink d1/d4/d6/d46/d4d/dc7/dd9/l10f 0 2026-03-10T12:38:24.832 INFO:tasks.workunit.client.1.vm07.stdout:3/875: unlink dc/dd/d28/d7a/d8e/f9b 0 2026-03-10T12:38:24.832 INFO:tasks.workunit.client.1.vm07.stdout:3/876: chown dc/d18/d99/d122 512129 1 2026-03-10T12:38:24.841 INFO:tasks.workunit.client.1.vm07.stdout:4/950: creat d0/d4/f153 x:0 0 0 2026-03-10T12:38:24.842 INFO:tasks.workunit.client.1.vm07.stdout:4/951: chown d0/d4/d10/d5f/l96 13485526 1 2026-03-10T12:38:24.842 INFO:tasks.workunit.client.1.vm07.stdout:4/952: write d0/d4/f153 [287270,48865] 0 2026-03-10T12:38:24.846 INFO:tasks.workunit.client.1.vm07.stdout:3/877: dread dc/d18/d24/f3e [4194304,4194304] 0 2026-03-10T12:38:24.847 INFO:tasks.workunit.client.1.vm07.stdout:2/753: mknod d0/d29/d64/db5/dbb/df9/cff 0 2026-03-10T12:38:24.854 INFO:tasks.workunit.client.1.vm07.stdout:2/754: chown d0/f18 15970 1 2026-03-10T12:38:24.858 INFO:tasks.workunit.client.1.vm07.stdout:9/901: dread d5/d13/d2c/de6/dce/ff9 [0,4194304] 0 2026-03-10T12:38:24.861 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 
10 12:38:24 vm00.local ceph-mon[50686]: Reconfiguring daemon prometheus.vm00 on vm00 2026-03-10T12:38:24.866 INFO:tasks.workunit.client.1.vm07.stdout:2/755: unlink d0/d42/d26/d38/d4f/f65 0 2026-03-10T12:38:24.869 INFO:tasks.workunit.client.1.vm07.stdout:2/756: dwrite d0/d42/f22 [0,4194304] 0 2026-03-10T12:38:24.881 INFO:tasks.workunit.client.1.vm07.stdout:0/949: getdents d0/d14/d7c 0 2026-03-10T12:38:24.881 INFO:tasks.workunit.client.1.vm07.stdout:0/950: dread - d0/d14/d5f/d41/fe8 zero size 2026-03-10T12:38:24.882 INFO:tasks.workunit.client.1.vm07.stdout:0/951: fsync d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/dd0/f132 0 2026-03-10T12:38:24.885 INFO:tasks.workunit.client.1.vm07.stdout:9/902: rename d5/d13/d22/ced to d5/d13/d22/c133 0 2026-03-10T12:38:24.887 INFO:tasks.workunit.client.1.vm07.stdout:4/953: link d0/d4/df2/df6/d46/f85 d0/d4/d10/d3c/d2b/d54/de1/f154 0 2026-03-10T12:38:24.891 INFO:tasks.workunit.client.0.vm00.stdout:7/931: write da/d41/d7b/f83 [1179426,121855] 0 2026-03-10T12:38:24.901 INFO:tasks.workunit.client.0.vm00.stdout:7/932: fdatasync da/d26/d37/d56/fbb 0 2026-03-10T12:38:24.902 INFO:tasks.workunit.client.0.vm00.stdout:7/933: creat da/d26/d37/dc7/f145 x:0 0 0 2026-03-10T12:38:24.904 INFO:tasks.workunit.client.1.vm07.stdout:1/842: dwrite d9/df/dc2/f7a [0,4194304] 0 2026-03-10T12:38:24.906 INFO:tasks.workunit.client.1.vm07.stdout:1/843: write d9/df/d29/d2b/d31/d91/d59/fa4 [4241771,8436] 0 2026-03-10T12:38:24.909 INFO:tasks.workunit.client.1.vm07.stdout:1/844: dwrite d9/df/d29/d2b/d31/fd8 [0,4194304] 0 2026-03-10T12:38:24.916 INFO:tasks.workunit.client.1.vm07.stdout:5/852: write d0/d22/d18/d19/d21/d54/dcb/f87 [714509,65040] 0 2026-03-10T12:38:24.919 INFO:tasks.workunit.client.1.vm07.stdout:8/801: dwrite d1/f79 [4194304,4194304] 0 2026-03-10T12:38:24.936 INFO:tasks.workunit.client.0.vm00.stdout:6/824: truncate d2/d16/f41 291350 0 2026-03-10T12:38:24.936 INFO:tasks.workunit.client.0.vm00.stdout:7/934: creat da/d47/f146 x:0 0 0 2026-03-10T12:38:24.939 
INFO:tasks.workunit.client.0.vm00.stdout:6/825: dread d2/d16/d29/d31/d88/d92/daa/dc1/f122 [0,4194304] 0 2026-03-10T12:38:24.940 INFO:tasks.workunit.client.1.vm07.stdout:6/823: write d1/d4/d6/d96/fea [788223,126468] 0 2026-03-10T12:38:24.942 INFO:tasks.workunit.client.0.vm00.stdout:7/935: creat da/d47/dfd/f147 x:0 0 0 2026-03-10T12:38:24.954 INFO:tasks.workunit.client.1.vm07.stdout:3/878: write dc/dd/f20 [3365212,100317] 0 2026-03-10T12:38:24.954 INFO:tasks.workunit.client.1.vm07.stdout:3/879: fsync dc/d18/d24/fe8 0 2026-03-10T12:38:24.955 INFO:tasks.workunit.client.1.vm07.stdout:5/853: symlink d0/d22/d18/d3e/d53/d9e/l127 0 2026-03-10T12:38:24.955 INFO:tasks.workunit.client.1.vm07.stdout:8/802: mknod d1/d3/d6/d50/d70/dcf/c106 0 2026-03-10T12:38:24.955 INFO:tasks.workunit.client.0.vm00.stdout:6/826: fdatasync d2/d16/d74/f6e 0 2026-03-10T12:38:24.955 INFO:tasks.workunit.client.0.vm00.stdout:6/827: write d2/da/dc/d94/ffe [409023,42531] 0 2026-03-10T12:38:24.955 INFO:tasks.workunit.client.0.vm00.stdout:7/936: mkdir da/d25/d2c/d82/d68/d124/d13f/d148 0 2026-03-10T12:38:24.955 INFO:tasks.workunit.client.0.vm00.stdout:6/828: mkdir d2/d14/dbb/d12c 0 2026-03-10T12:38:24.955 INFO:tasks.workunit.client.0.vm00.stdout:7/937: creat da/d26/d37/d56/f149 x:0 0 0 2026-03-10T12:38:24.955 INFO:tasks.workunit.client.0.vm00.stdout:7/938: dread - da/d41/d7b/d9d/fa8 zero size 2026-03-10T12:38:24.955 INFO:tasks.workunit.client.0.vm00.stdout:7/939: dread da/d25/d2e/d4c/f92 [0,4194304] 0 2026-03-10T12:38:24.955 INFO:tasks.workunit.client.1.vm07.stdout:2/757: creat d0/d42/d26/d38/d4f/dad/ddd/f100 x:0 0 0 2026-03-10T12:38:24.958 INFO:tasks.workunit.client.1.vm07.stdout:4/954: symlink d0/d4/d10/d3c/l155 0 2026-03-10T12:38:24.961 INFO:tasks.workunit.client.0.vm00.stdout:6/829: creat d2/d42/d80/dfd/f12d x:0 0 0 2026-03-10T12:38:24.963 INFO:tasks.workunit.client.1.vm07.stdout:3/880: mkdir dc/dd/d43/d76/d95/dde/d129 0 2026-03-10T12:38:24.964 INFO:tasks.workunit.client.0.vm00.stdout:7/940: dwrite 
da/d47/dfd/f12b [0,4194304] 0 2026-03-10T12:38:24.968 INFO:tasks.workunit.client.1.vm07.stdout:3/881: dread dc/dd/d28/d7a/d8e/f10a [0,4194304] 0 2026-03-10T12:38:24.973 INFO:tasks.workunit.client.1.vm07.stdout:5/854: dread d0/d22/d18/d3e/d53/d9e/f76 [0,4194304] 0 2026-03-10T12:38:24.990 INFO:tasks.workunit.client.0.vm00.stdout:6/830: mkdir d2/da/dc/d2f/d10a/d12e 0 2026-03-10T12:38:24.990 INFO:tasks.workunit.client.1.vm07.stdout:2/758: unlink d0/d42/d26/d38/d4f/d5d/c95 0 2026-03-10T12:38:24.992 INFO:tasks.workunit.client.1.vm07.stdout:4/955: creat d0/d4/d10/d3c/d2b/d54/f156 x:0 0 0 2026-03-10T12:38:24.997 INFO:tasks.workunit.client.0.vm00.stdout:7/941: fdatasync da/f17 0 2026-03-10T12:38:24.997 INFO:tasks.workunit.client.0.vm00.stdout:6/831: stat d2/d14/d7a/db9/l91 0 2026-03-10T12:38:24.998 INFO:tasks.workunit.client.0.vm00.stdout:7/942: readlink da/d25/d2c/d82/d68/l109 0 2026-03-10T12:38:24.998 INFO:tasks.workunit.client.0.vm00.stdout:6/832: read - d2/da/f6a zero size 2026-03-10T12:38:25.000 INFO:tasks.workunit.client.0.vm00.stdout:7/943: fdatasync da/d26/d37/fc4 0 2026-03-10T12:38:25.001 INFO:tasks.workunit.client.0.vm00.stdout:7/944: chown da/d25/d2c/d82/d101/f12f 77 1 2026-03-10T12:38:25.004 INFO:tasks.workunit.client.1.vm07.stdout:8/803: dread d1/f2 [0,4194304] 0 2026-03-10T12:38:25.008 INFO:tasks.workunit.client.0.vm00.stdout:7/945: dwrite da/d26/d37/d56/ddf/d108/f11e [0,4194304] 0 2026-03-10T12:38:25.018 INFO:tasks.workunit.client.0.vm00.stdout:7/946: dread da/d41/d7b/f83 [0,4194304] 0 2026-03-10T12:38:25.018 INFO:tasks.workunit.client.1.vm07.stdout:3/882: creat dc/d18/d99/d123/f12a x:0 0 0 2026-03-10T12:38:25.031 INFO:tasks.workunit.client.1.vm07.stdout:0/952: dwrite d0/d14/d5f/d76/f8a [0,4194304] 0 2026-03-10T12:38:25.038 INFO:tasks.workunit.client.1.vm07.stdout:2/759: rename d0/lc5 to d0/d42/d26/d38/d4f/dad/l101 0 2026-03-10T12:38:25.043 INFO:tasks.workunit.client.1.vm07.stdout:1/845: getdents d9/d2d/d80 0 2026-03-10T12:38:25.058 
INFO:tasks.workunit.client.1.vm07.stdout:8/804: creat d1/f107 x:0 0 0 2026-03-10T12:38:25.058 INFO:tasks.workunit.client.1.vm07.stdout:8/805: write d1/d3/f1f [4070801,26016] 0 2026-03-10T12:38:25.058 INFO:tasks.workunit.client.1.vm07.stdout:8/806: dwrite d1/d3/d18/fd9 [0,4194304] 0 2026-03-10T12:38:25.058 INFO:tasks.workunit.client.0.vm00.stdout:7/947: creat da/d41/d7b/d9d/dc8/d12e/f14a x:0 0 0 2026-03-10T12:38:25.058 INFO:tasks.workunit.client.0.vm00.stdout:7/948: readlink da/d25/d2e/l9b 0 2026-03-10T12:38:25.059 INFO:tasks.workunit.client.0.vm00.stdout:7/949: dread - da/d41/d48/d81/f138 zero size 2026-03-10T12:38:25.059 INFO:tasks.workunit.client.0.vm00.stdout:7/950: stat da/d3f/d71/f95 0 2026-03-10T12:38:25.059 INFO:tasks.workunit.client.0.vm00.stdout:6/833: sync 2026-03-10T12:38:25.059 INFO:tasks.workunit.client.1.vm07.stdout:7/802: creat d0/d57/d62/d90/f109 x:0 0 0 2026-03-10T12:38:25.060 INFO:tasks.workunit.client.1.vm07.stdout:0/953: symlink d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/l141 0 2026-03-10T12:38:25.064 INFO:tasks.workunit.client.1.vm07.stdout:0/954: dread d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/f112 [0,4194304] 0 2026-03-10T12:38:25.071 INFO:tasks.workunit.client.1.vm07.stdout:4/956: mknod d0/d4/d5/d78/dc5/df7/db2/dd5/d12b/d148/c157 0 2026-03-10T12:38:25.071 INFO:tasks.workunit.client.1.vm07.stdout:4/957: dread - d0/d4/df2/f11f zero size 2026-03-10T12:38:25.071 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:24 vm07.local ceph-mon[58582]: Reconfiguring daemon prometheus.vm00 on vm00 2026-03-10T12:38:25.071 INFO:tasks.workunit.client.1.vm07.stdout:2/760: read d0/d29/d64/d6c/fb9 [333719,90933] 0 2026-03-10T12:38:25.071 INFO:tasks.workunit.client.1.vm07.stdout:2/761: write d0/d29/d64/d74/d88/f51 [303581,30671] 0 2026-03-10T12:38:25.077 INFO:tasks.workunit.client.1.vm07.stdout:9/903: dwrite d5/d16/d18/f20 [8388608,4194304] 0 2026-03-10T12:38:25.078 INFO:tasks.workunit.client.1.vm07.stdout:1/846: mknod d9/d2d/d4f/d5a/c11b 0 2026-03-10T12:38:25.078 
INFO:tasks.workunit.client.1.vm07.stdout:1/847: chown d9/d2d 0 1 2026-03-10T12:38:25.086 INFO:tasks.workunit.client.0.vm00.stdout:7/951: rename da/d3f/dd1/l104 to da/d25/d2e/d4c/l14b 0 2026-03-10T12:38:25.091 INFO:tasks.workunit.client.1.vm07.stdout:7/803: mkdir d0/d67/d10a 0 2026-03-10T12:38:25.096 INFO:tasks.workunit.client.1.vm07.stdout:6/824: write d1/d4/d6/f30 [2957009,116934] 0 2026-03-10T12:38:25.099 INFO:tasks.workunit.client.0.vm00.stdout:7/952: chown da/l19 16 1 2026-03-10T12:38:25.102 INFO:tasks.workunit.client.1.vm07.stdout:5/855: dwrite d0/d22/d18/d19/d21/fd4 [0,4194304] 0 2026-03-10T12:38:25.103 INFO:tasks.workunit.client.1.vm07.stdout:0/955: rmdir d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dd9/d10c 39 2026-03-10T12:38:25.104 INFO:tasks.workunit.client.1.vm07.stdout:0/956: read - d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/f13b zero size 2026-03-10T12:38:25.109 INFO:tasks.workunit.client.1.vm07.stdout:0/957: dwrite d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/dd0/f132 [0,4194304] 0 2026-03-10T12:38:25.115 INFO:tasks.workunit.client.1.vm07.stdout:9/904: dread d5/d1f/d5e/d6b/de0/f124 [0,4194304] 0 2026-03-10T12:38:25.115 INFO:tasks.workunit.client.1.vm07.stdout:9/905: fdatasync d5/d69/f130 0 2026-03-10T12:38:25.117 INFO:tasks.workunit.client.1.vm07.stdout:1/848: mkdir d9/d2d/d4f/d5a/d11c 0 2026-03-10T12:38:25.126 INFO:tasks.workunit.client.1.vm07.stdout:6/825: mknod d1/d4/d6/d16/d1a/d9d/db2/c110 0 2026-03-10T12:38:25.128 INFO:tasks.workunit.client.0.vm00.stdout:7/953: getdents da/d26/d50/d73 0 2026-03-10T12:38:25.128 INFO:tasks.workunit.client.1.vm07.stdout:7/804: unlink d0/d61/db4/c99 0 2026-03-10T12:38:25.129 INFO:tasks.workunit.client.1.vm07.stdout:9/906: mkdir d5/d13/d6c/da4/d134 0 2026-03-10T12:38:25.130 INFO:tasks.workunit.client.1.vm07.stdout:1/849: rename d9/df/d29/d2b/d92/d9d/lc4 to d9/dff/d103/l11d 0 2026-03-10T12:38:25.131 INFO:tasks.workunit.client.1.vm07.stdout:3/883: link dc/d18/fa1 dc/dd/d28/d7a/d8e/f12b 0 2026-03-10T12:38:25.132 
INFO:tasks.workunit.client.1.vm07.stdout:2/762: link d0/d42/d26/d38/d4f/d62/l8c d0/d29/l102 0 2026-03-10T12:38:25.132 INFO:tasks.workunit.client.0.vm00.stdout:7/954: dread da/d41/d48/fbc [0,4194304] 0 2026-03-10T12:38:25.133 INFO:tasks.workunit.client.1.vm07.stdout:9/907: creat d5/d16/dd7/f135 x:0 0 0 2026-03-10T12:38:25.134 INFO:tasks.workunit.client.0.vm00.stdout:7/955: mkdir da/d25/d2c/d14c 0 2026-03-10T12:38:25.149 INFO:tasks.workunit.client.0.vm00.stdout:7/956: fdatasync da/d25/d2c/d82/d101/f118 0 2026-03-10T12:38:25.149 INFO:tasks.workunit.client.0.vm00.stdout:7/957: dwrite da/d41/d48/d81/f140 [0,4194304] 0 2026-03-10T12:38:25.150 INFO:tasks.workunit.client.0.vm00.stdout:7/958: creat da/d1b/d40/db6/f14d x:0 0 0 2026-03-10T12:38:25.150 INFO:tasks.workunit.client.1.vm07.stdout:1/850: rename d9/df/dc2/l78 to d9/df/d29/d2b/d30/l11e 0 2026-03-10T12:38:25.150 INFO:tasks.workunit.client.1.vm07.stdout:9/908: rename d5/d13 to d5/d13/d9d/df2/df4/d136 22 2026-03-10T12:38:25.150 INFO:tasks.workunit.client.1.vm07.stdout:9/909: fsync d5/d13/d57/d4f/d6a/fba 0 2026-03-10T12:38:25.150 INFO:tasks.workunit.client.1.vm07.stdout:3/884: mknod dc/dd/d28/d7a/d8e/c12c 0 2026-03-10T12:38:25.150 INFO:tasks.workunit.client.1.vm07.stdout:5/856: creat d0/d22/d18/d30/f128 x:0 0 0 2026-03-10T12:38:25.150 INFO:tasks.workunit.client.1.vm07.stdout:5/857: readlink d0/d22/dbc/la6 0 2026-03-10T12:38:25.150 INFO:tasks.workunit.client.1.vm07.stdout:2/763: dread d0/d42/d1f/d20/fa0 [0,4194304] 0 2026-03-10T12:38:25.150 INFO:tasks.workunit.client.1.vm07.stdout:0/958: getdents d0 0 2026-03-10T12:38:25.150 INFO:tasks.workunit.client.1.vm07.stdout:0/959: chown d0/d14/d5f/d76/d2f/d31/d4f/d9d/d140 7349 1 2026-03-10T12:38:25.150 INFO:tasks.workunit.client.1.vm07.stdout:2/764: creat d0/d42/d4e/d77/f103 x:0 0 0 2026-03-10T12:38:25.150 INFO:tasks.workunit.client.1.vm07.stdout:3/885: creat dc/dd/d1f/dac/de6/f12d x:0 0 0 2026-03-10T12:38:25.150 INFO:tasks.workunit.client.1.vm07.stdout:0/960: read - 
d0/d14/d5f/d76/d2f/d31/df0/f10b zero size 2026-03-10T12:38:25.150 INFO:tasks.workunit.client.1.vm07.stdout:3/886: write dc/dd/d43/f61 [1227480,7684] 0 2026-03-10T12:38:25.152 INFO:tasks.workunit.client.1.vm07.stdout:7/805: rmdir d0/d57/dd6/d80/d105 0 2026-03-10T12:38:25.174 INFO:tasks.workunit.client.0.vm00.stdout:7/959: dread f0 [0,4194304] 0 2026-03-10T12:38:25.175 INFO:tasks.workunit.client.0.vm00.stdout:7/960: read - da/d41/d7b/d9d/dba/f13d zero size 2026-03-10T12:38:25.176 INFO:tasks.workunit.client.0.vm00.stdout:6/834: write d2/da/dc/fd [5613072,74346] 0 2026-03-10T12:38:25.176 INFO:tasks.workunit.client.0.vm00.stdout:7/961: write da/d25/d2e/d4c/f6e [2290538,25308] 0 2026-03-10T12:38:25.186 INFO:tasks.workunit.client.0.vm00.stdout:6/835: dwrite d2/da/dc/d83/d119/f129 [0,4194304] 0 2026-03-10T12:38:25.187 INFO:tasks.workunit.client.0.vm00.stdout:7/962: dwrite da/d1b/d40/f7d [0,4194304] 0 2026-03-10T12:38:25.190 INFO:tasks.workunit.client.0.vm00.stdout:6/836: write d2/d16/f20 [4890260,76654] 0 2026-03-10T12:38:25.196 INFO:tasks.workunit.client.1.vm07.stdout:7/806: dread d0/f14 [0,4194304] 0 2026-03-10T12:38:25.199 INFO:tasks.workunit.client.1.vm07.stdout:7/807: creat d0/d47/dde/f10b x:0 0 0 2026-03-10T12:38:25.202 INFO:tasks.workunit.client.0.vm00.stdout:6/837: creat d2/d42/d80/dfd/f12f x:0 0 0 2026-03-10T12:38:25.203 INFO:tasks.workunit.client.1.vm07.stdout:7/808: getdents d0/d47/dab/dae 0 2026-03-10T12:38:25.203 INFO:tasks.workunit.client.0.vm00.stdout:7/963: mknod da/d26/c14e 0 2026-03-10T12:38:25.204 INFO:tasks.workunit.client.0.vm00.stdout:7/964: chown da/d25/d2c/d122 458 1 2026-03-10T12:38:25.205 INFO:tasks.workunit.client.0.vm00.stdout:6/838: mknod d2/da/dc/c130 0 2026-03-10T12:38:25.213 INFO:tasks.workunit.client.1.vm07.stdout:7/809: dread d0/d61/db4/f53 [0,4194304] 0 2026-03-10T12:38:25.216 INFO:tasks.workunit.client.1.vm07.stdout:0/961: sync 2026-03-10T12:38:25.219 INFO:tasks.workunit.client.1.vm07.stdout:0/962: link d0/d14/d5f/d41/d6a/d9a/df9/le6 
d0/d14/d5f/d76/l142 0 2026-03-10T12:38:25.220 INFO:tasks.workunit.client.1.vm07.stdout:0/963: chown d0/d14/d5f/d76/d2f/ffe 3 1 2026-03-10T12:38:25.225 INFO:tasks.workunit.client.0.vm00.stdout:7/965: sync 2026-03-10T12:38:25.227 INFO:tasks.workunit.client.0.vm00.stdout:7/966: truncate da/d3f/d60/f88 661701 0 2026-03-10T12:38:25.235 INFO:tasks.workunit.client.1.vm07.stdout:4/958: dwrite d0/d4/d10/f4b [0,4194304] 0 2026-03-10T12:38:25.247 INFO:tasks.workunit.client.1.vm07.stdout:8/807: dwrite d1/d3/d6/d50/fc8 [0,4194304] 0 2026-03-10T12:38:25.247 INFO:tasks.workunit.client.1.vm07.stdout:8/808: chown d1/d3/d6/d54/f7d 1465 1 2026-03-10T12:38:25.254 INFO:tasks.workunit.client.1.vm07.stdout:1/851: rename d9/df/dc2 to d9/df/d29/d2b/d31/d11f 0 2026-03-10T12:38:25.254 INFO:tasks.workunit.client.1.vm07.stdout:5/858: write d0/dbf/f104 [2289282,21711] 0 2026-03-10T12:38:25.254 INFO:tasks.workunit.client.1.vm07.stdout:6/826: write d1/d4/f11 [5843917,67477] 0 2026-03-10T12:38:25.254 INFO:tasks.workunit.client.1.vm07.stdout:6/827: chown d1/dd7/d66/dd6 0 1 2026-03-10T12:38:25.255 INFO:tasks.workunit.client.1.vm07.stdout:1/852: chown d9/df/d29/d2b/d31/d11f/de1 1023 1 2026-03-10T12:38:25.262 INFO:tasks.workunit.client.1.vm07.stdout:9/910: dwrite d5/d13/d2c/de6/d64/f110 [0,4194304] 0 2026-03-10T12:38:25.264 INFO:tasks.workunit.client.1.vm07.stdout:9/911: chown d5/d16/d23/d26/f5c 55321 1 2026-03-10T12:38:25.266 INFO:tasks.workunit.client.1.vm07.stdout:8/809: read d1/d3/d40/f7e [7358,3819] 0 2026-03-10T12:38:25.267 INFO:tasks.workunit.client.1.vm07.stdout:8/810: chown d1/d3/d5d/fd5 90 1 2026-03-10T12:38:25.273 INFO:tasks.workunit.client.1.vm07.stdout:5/859: chown d0/d22/d18/d19/d2e/d67/cac 391 1 2026-03-10T12:38:25.280 INFO:tasks.workunit.client.1.vm07.stdout:5/860: chown d0/d22/d18/d19/d72/fd8 23228 1 2026-03-10T12:38:25.280 INFO:tasks.workunit.client.1.vm07.stdout:2/765: write d0/d42/d26/f48 [842729,40194] 0 2026-03-10T12:38:25.282 INFO:tasks.workunit.client.1.vm07.stdout:5/861: stat 
d0/l7 0 2026-03-10T12:38:25.284 INFO:tasks.workunit.client.1.vm07.stdout:4/959: dread d0/d4/d5/d34/fa3 [0,4194304] 0 2026-03-10T12:38:25.288 INFO:tasks.workunit.client.1.vm07.stdout:4/960: mknod d0/d4/d10/d3c/d2b/d54/c158 0 2026-03-10T12:38:25.290 INFO:tasks.workunit.client.1.vm07.stdout:4/961: mknod d0/d4/d10/d114/c159 0 2026-03-10T12:38:25.291 INFO:tasks.workunit.client.1.vm07.stdout:9/912: link d5/d13/d57/l107 d5/d1f/d5e/d6b/de0/l137 0 2026-03-10T12:38:25.306 INFO:tasks.workunit.client.1.vm07.stdout:3/887: truncate dc/dd/d43/d5c/f101 11960860 0 2026-03-10T12:38:25.314 INFO:tasks.workunit.client.1.vm07.stdout:7/810: write d0/d61/d79/f8d [1274198,335] 0 2026-03-10T12:38:25.317 INFO:tasks.workunit.client.0.vm00.stdout:7/967: write da/d3f/d71/f8c [776256,92187] 0 2026-03-10T12:38:25.317 INFO:tasks.workunit.client.1.vm07.stdout:6/828: write d1/d4/d6/d46/d4d/fdf [924310,50745] 0 2026-03-10T12:38:25.318 INFO:tasks.workunit.client.0.vm00.stdout:7/968: chown da/d26/d37/d56/ddf/d108/f11e 2 1 2026-03-10T12:38:25.320 INFO:tasks.workunit.client.1.vm07.stdout:0/964: dwrite d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/faf [0,4194304] 0 2026-03-10T12:38:25.324 INFO:tasks.workunit.client.1.vm07.stdout:0/965: write d0/d14/d5f/d41/d6a/d9a/f130 [949478,106076] 0 2026-03-10T12:38:25.327 INFO:tasks.workunit.client.1.vm07.stdout:6/829: dwrite d1/d4/d6/d16/d1a/d2c/f59 [0,4194304] 0 2026-03-10T12:38:25.327 INFO:tasks.workunit.client.1.vm07.stdout:0/966: write d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dd9/f123 [1359844,37499] 0 2026-03-10T12:38:25.332 INFO:tasks.workunit.client.1.vm07.stdout:1/853: write d9/df/d29/d2b/d31/d91/f7f [2861098,60186] 0 2026-03-10T12:38:25.342 INFO:tasks.workunit.client.1.vm07.stdout:2/766: write d0/d42/d26/f5a [3693716,58479] 0 2026-03-10T12:38:25.343 INFO:tasks.workunit.client.1.vm07.stdout:8/811: dwrite d1/f2 [0,4194304] 0 2026-03-10T12:38:25.348 INFO:tasks.workunit.client.1.vm07.stdout:8/812: dwrite d1/d3/d11/f35 [0,4194304] 0 2026-03-10T12:38:25.356 
INFO:tasks.workunit.client.1.vm07.stdout:5/862: write d0/d22/d18/d19/d2e/d67/fa0 [715801,70963] 0 2026-03-10T12:38:25.357 INFO:tasks.workunit.client.1.vm07.stdout:5/863: fdatasync d0/d22/d18/d19/d21/d54/f114 0 2026-03-10T12:38:25.357 INFO:tasks.workunit.client.1.vm07.stdout:5/864: stat d0/d22/d18/d19/d21/d54/dcb/f6a 0 2026-03-10T12:38:25.361 INFO:tasks.workunit.client.0.vm00.stdout:7/969: fdatasync da/d1b/d40/fca 0 2026-03-10T12:38:25.365 INFO:tasks.workunit.client.1.vm07.stdout:9/913: write d5/d13/d6c/da4/fa6 [4616033,62270] 0 2026-03-10T12:38:25.366 INFO:tasks.workunit.client.1.vm07.stdout:0/967: symlink d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/l143 0 2026-03-10T12:38:25.370 INFO:tasks.workunit.client.1.vm07.stdout:1/854: truncate d9/d2d/d80/d8e/fa0 1459261 0 2026-03-10T12:38:25.371 INFO:tasks.workunit.client.1.vm07.stdout:1/855: chown d9/df/d29/l7b 10691 1 2026-03-10T12:38:25.371 INFO:tasks.workunit.client.1.vm07.stdout:1/856: read - d9/d2d/d4f/d75/fda zero size 2026-03-10T12:38:25.372 INFO:tasks.workunit.client.1.vm07.stdout:1/857: chown d9/d2d/d4f/f95 57217 1 2026-03-10T12:38:25.374 INFO:tasks.workunit.client.1.vm07.stdout:1/858: truncate d9/d2d/d4f/dde/fef 1603044 0 2026-03-10T12:38:25.374 INFO:tasks.workunit.client.1.vm07.stdout:4/962: dwrite d0/d4/d10/d9a/f113 [0,4194304] 0 2026-03-10T12:38:25.379 INFO:tasks.workunit.client.1.vm07.stdout:4/963: write d0/d4/d10/d9a/d124/fb4 [1232492,15435] 0 2026-03-10T12:38:25.385 INFO:tasks.workunit.client.1.vm07.stdout:2/767: chown d0/fe4 882700887 1 2026-03-10T12:38:25.388 INFO:tasks.workunit.client.0.vm00.stdout:6/839: chown d2/d51/f63 20912388 1 2026-03-10T12:38:25.394 INFO:tasks.workunit.client.0.vm00.stdout:6/840: stat d2/d16/d29/f111 0 2026-03-10T12:38:25.394 INFO:tasks.workunit.client.0.vm00.stdout:6/841: stat d2 0 2026-03-10T12:38:25.394 INFO:tasks.workunit.client.1.vm07.stdout:8/813: rename d1/d3/d40/f8c to d1/d3/d40/d92/db6/f108 0 2026-03-10T12:38:25.394 INFO:tasks.workunit.client.1.vm07.stdout:5/865: symlink 
d0/d22/d18/d19/d36/l129 0 2026-03-10T12:38:25.394 INFO:tasks.workunit.client.1.vm07.stdout:9/914: creat d5/d69/d93/f138 x:0 0 0 2026-03-10T12:38:25.394 INFO:tasks.workunit.client.1.vm07.stdout:0/968: mknod d0/d14/d5f/d41/d6a/d74/c144 0 2026-03-10T12:38:25.397 INFO:tasks.workunit.client.1.vm07.stdout:1/859: mknod d9/df/d29/d2b/d31/d91/c120 0 2026-03-10T12:38:25.397 INFO:tasks.workunit.client.1.vm07.stdout:1/860: stat d9/df/c6a 0 2026-03-10T12:38:25.398 INFO:tasks.workunit.client.0.vm00.stdout:6/842: rmdir d2/d9f/df6/d11a 0 2026-03-10T12:38:25.399 INFO:tasks.workunit.client.1.vm07.stdout:4/964: unlink d0/d4/d10/d3c/d2b/d54/de1/c141 0 2026-03-10T12:38:25.402 INFO:tasks.workunit.client.1.vm07.stdout:2/768: truncate d0/d42/d4e/d77/f6f 1430304 0 2026-03-10T12:38:25.403 INFO:tasks.workunit.client.1.vm07.stdout:7/811: getdents d0/d47/da0 0 2026-03-10T12:38:25.405 INFO:tasks.workunit.client.1.vm07.stdout:5/866: read - d0/d22/d18/d19/d2e/d67/ff4 zero size 2026-03-10T12:38:25.406 INFO:tasks.workunit.client.1.vm07.stdout:6/830: truncate d1/d4/f82 625521 0 2026-03-10T12:38:25.408 INFO:tasks.workunit.client.1.vm07.stdout:9/915: creat d5/d16/d23/d26/f139 x:0 0 0 2026-03-10T12:38:25.410 INFO:tasks.workunit.client.1.vm07.stdout:6/831: dwrite d1/d4/d6/d46/d4d/dc7/dd9/ffb [0,4194304] 0 2026-03-10T12:38:25.413 INFO:tasks.workunit.client.1.vm07.stdout:0/969: fsync d0/d14/d5f/d76/d2f/d31/d4f/d9d/f103 0 2026-03-10T12:38:25.417 INFO:tasks.workunit.client.1.vm07.stdout:2/769: mknod d0/d29/d64/db5/c104 0 2026-03-10T12:38:25.418 INFO:tasks.workunit.client.1.vm07.stdout:7/812: symlink d0/d61/db4/d8a/l10c 0 2026-03-10T12:38:25.418 INFO:tasks.workunit.client.1.vm07.stdout:8/814: symlink d1/d3/d40/l109 0 2026-03-10T12:38:25.419 INFO:tasks.workunit.client.1.vm07.stdout:8/815: readlink d1/d3/d6/d50/d70/dcf/le6 0 2026-03-10T12:38:25.423 INFO:tasks.workunit.client.1.vm07.stdout:7/813: dwrite d0/d57/d62/d90/f109 [0,4194304] 0 2026-03-10T12:38:25.425 INFO:tasks.workunit.client.1.vm07.stdout:8/816: 
dread d1/d3/d6/d50/faa [0,4194304] 0 2026-03-10T12:38:25.426 INFO:tasks.workunit.client.1.vm07.stdout:8/817: write d1/d3/f8 [252719,53920] 0 2026-03-10T12:38:25.446 INFO:tasks.workunit.client.1.vm07.stdout:0/970: rename d0/d14/d5f/d76/d93 to d0/d14/d5f/d76/d2f/d31/d79/d85/d145 0 2026-03-10T12:38:25.446 INFO:tasks.workunit.client.1.vm07.stdout:5/867: mknod d0/d22/d18/d3e/d11f/c12a 0 2026-03-10T12:38:25.451 INFO:tasks.workunit.client.1.vm07.stdout:7/814: stat d0/d47/f8e 0 2026-03-10T12:38:25.455 INFO:tasks.workunit.client.1.vm07.stdout:7/815: dwrite d0/d61/db4/f4b [0,4194304] 0 2026-03-10T12:38:25.459 INFO:tasks.workunit.client.1.vm07.stdout:1/861: link d9/df/d29/d6b/l9b d9/d2d/d4f/l121 0 2026-03-10T12:38:25.461 INFO:tasks.workunit.client.1.vm07.stdout:6/832: rename d1/d4/d6/d43/d88/d97/lb9 to d1/d4/d6/d16/d1a/d2c/de0/l111 0 2026-03-10T12:38:25.465 INFO:tasks.workunit.client.1.vm07.stdout:6/833: dwrite d1/d4/d6/d16/d1a/d2c/f59 [4194304,4194304] 0 2026-03-10T12:38:25.471 INFO:tasks.workunit.client.1.vm07.stdout:5/868: fsync d0/d22/d18/d19/d21/d54/dcb/f6a 0 2026-03-10T12:38:25.473 INFO:tasks.workunit.client.1.vm07.stdout:1/862: creat d9/d2d/d4f/dde/f122 x:0 0 0 2026-03-10T12:38:25.480 INFO:tasks.workunit.client.1.vm07.stdout:2/770: rename d0/d42/d1f/d20/d86 to d0/d29/d64/db5/dbb/dca/d105 0 2026-03-10T12:38:25.480 INFO:tasks.workunit.client.1.vm07.stdout:2/771: fdatasync d0/d42/d26/d7d/fea 0 2026-03-10T12:38:25.482 INFO:tasks.workunit.client.1.vm07.stdout:6/834: rmdir d1/dd7/d66/dd6 39 2026-03-10T12:38:25.485 INFO:tasks.workunit.client.1.vm07.stdout:8/818: getdents d1/d3/d5d 0 2026-03-10T12:38:25.488 INFO:tasks.workunit.client.1.vm07.stdout:9/916: getdents d5/d16/d18 0 2026-03-10T12:38:25.491 INFO:tasks.workunit.client.0.vm00.stdout:6/843: write d2/d51/d70/fab [23241,49552] 0 2026-03-10T12:38:25.492 INFO:tasks.workunit.client.1.vm07.stdout:7/816: fdatasync d0/f2f 0 2026-03-10T12:38:25.495 INFO:tasks.workunit.client.1.vm07.stdout:7/817: dwrite d0/d57/d62/d90/f109 
[0,4194304] 0 2026-03-10T12:38:25.503 INFO:tasks.workunit.client.1.vm07.stdout:5/869: creat d0/d22/d18/d3e/d11f/f12b x:0 0 0 2026-03-10T12:38:25.505 INFO:tasks.workunit.client.1.vm07.stdout:2/772: truncate d0/f1d 428099 0 2026-03-10T12:38:25.515 INFO:tasks.workunit.client.1.vm07.stdout:6/835: mkdir d1/d4/d6/d16/d1a/d6e/d112 0 2026-03-10T12:38:25.518 INFO:tasks.workunit.client.1.vm07.stdout:3/888: write dc/dd/d28/dd0/fdb [2723709,76737] 0 2026-03-10T12:38:25.520 INFO:tasks.workunit.client.1.vm07.stdout:8/819: creat d1/d3/d40/d92/dba/f10a x:0 0 0 2026-03-10T12:38:25.520 INFO:tasks.workunit.client.1.vm07.stdout:8/820: stat d1/d3/f73 0 2026-03-10T12:38:25.520 INFO:tasks.workunit.client.1.vm07.stdout:8/821: stat d1/f3f 0 2026-03-10T12:38:25.522 INFO:tasks.workunit.client.1.vm07.stdout:3/889: read dc/f17 [518559,74012] 0 2026-03-10T12:38:25.522 INFO:tasks.workunit.client.1.vm07.stdout:3/890: write dc/d18/d2d/f71 [606261,97189] 0 2026-03-10T12:38:25.523 INFO:tasks.workunit.client.1.vm07.stdout:3/891: chown f1 12242374 1 2026-03-10T12:38:25.526 INFO:tasks.workunit.client.1.vm07.stdout:9/917: creat d5/d16/d23/d26/f13a x:0 0 0 2026-03-10T12:38:25.532 INFO:tasks.workunit.client.1.vm07.stdout:7/818: mkdir d0/d47/d10d 0 2026-03-10T12:38:25.545 INFO:tasks.workunit.client.1.vm07.stdout:2/773: mknod d0/d29/d64/db5/dbb/df9/c106 0 2026-03-10T12:38:25.548 INFO:tasks.workunit.client.0.vm00.stdout:6/844: symlink d2/d16/d29/d31/d88/l131 0 2026-03-10T12:38:25.553 INFO:tasks.workunit.client.0.vm00.stdout:7/970: dwrite da/d41/f72 [0,4194304] 0 2026-03-10T12:38:25.557 INFO:tasks.workunit.client.1.vm07.stdout:8/822: dread - d1/d3/d18/fd0 zero size 2026-03-10T12:38:25.558 INFO:tasks.workunit.client.1.vm07.stdout:8/823: readlink d1/d3/d6/d50/d70/lb9 0 2026-03-10T12:38:25.560 INFO:tasks.workunit.client.0.vm00.stdout:7/971: dwrite da/d41/d7b/f121 [0,4194304] 0 2026-03-10T12:38:25.561 INFO:tasks.workunit.client.0.vm00.stdout:6/845: fsync d2/d9f/df6/fc8 0 2026-03-10T12:38:25.564 
INFO:tasks.workunit.client.0.vm00.stdout:7/972: chown da/d41/d48/fd4 253809 1 2026-03-10T12:38:25.566 INFO:tasks.workunit.client.0.vm00.stdout:7/973: dread - da/d41/d48/fd4 zero size 2026-03-10T12:38:25.568 INFO:tasks.workunit.client.1.vm07.stdout:9/918: sync 2026-03-10T12:38:25.569 INFO:tasks.workunit.client.1.vm07.stdout:9/919: chown d5/d13/d9b/fec 1893670697 1 2026-03-10T12:38:25.570 INFO:tasks.workunit.client.0.vm00.stdout:7/974: write da/d25/d2e/d4c/f126 [840292,91691] 0 2026-03-10T12:38:25.571 INFO:tasks.workunit.client.0.vm00.stdout:7/975: fdatasync da/d47/dfd/f147 0 2026-03-10T12:38:25.573 INFO:tasks.workunit.client.1.vm07.stdout:4/965: truncate d0/fa1 3665666 0 2026-03-10T12:38:25.579 INFO:tasks.workunit.client.1.vm07.stdout:3/892: dread dc/dd/d28/d7a/d8e/fb0 [4194304,4194304] 0 2026-03-10T12:38:25.584 INFO:tasks.workunit.client.1.vm07.stdout:7/819: creat d0/d47/dde/f10e x:0 0 0 2026-03-10T12:38:25.588 INFO:tasks.workunit.client.0.vm00.stdout:7/976: creat da/d25/d2e/f14f x:0 0 0 2026-03-10T12:38:25.589 INFO:tasks.workunit.client.1.vm07.stdout:2/774: creat d0/d29/d64/db5/dbb/dca/f107 x:0 0 0 2026-03-10T12:38:25.592 INFO:tasks.workunit.client.1.vm07.stdout:8/824: truncate d1/d3/d40/f4c 723226 0 2026-03-10T12:38:25.592 INFO:tasks.workunit.client.1.vm07.stdout:8/825: stat d1/d3/d6c/lae 0 2026-03-10T12:38:25.599 INFO:tasks.workunit.client.0.vm00.stdout:7/977: truncate da/f16 287834 0 2026-03-10T12:38:25.599 INFO:tasks.workunit.client.1.vm07.stdout:9/920: creat d5/d13/d2c/de6/d64/d108/d127/f13b x:0 0 0 2026-03-10T12:38:25.599 INFO:tasks.workunit.client.1.vm07.stdout:9/921: chown d5/d1f/c51 56334 1 2026-03-10T12:38:25.600 INFO:tasks.workunit.client.1.vm07.stdout:1/863: getdents d9/df/d29/d2b/d30 0 2026-03-10T12:38:25.601 INFO:tasks.workunit.client.1.vm07.stdout:3/893: sync 2026-03-10T12:38:25.602 INFO:tasks.workunit.client.1.vm07.stdout:3/894: stat dc/d18 0 2026-03-10T12:38:25.602 INFO:tasks.workunit.client.1.vm07.stdout:4/966: unlink d0/d4/c9 0 
2026-03-10T12:38:25.603 INFO:tasks.workunit.client.1.vm07.stdout:7/820: sync 2026-03-10T12:38:25.605 INFO:tasks.workunit.client.1.vm07.stdout:2/775: mkdir d0/d42/d26/d7d/d108 0 2026-03-10T12:38:25.606 INFO:tasks.workunit.client.1.vm07.stdout:7/821: dwrite d0/d61/db4/f4b [8388608,4194304] 0 2026-03-10T12:38:25.610 INFO:tasks.workunit.client.1.vm07.stdout:7/822: dwrite d0/d52/fb9 [0,4194304] 0 2026-03-10T12:38:25.611 INFO:tasks.workunit.client.1.vm07.stdout:7/823: write d0/d57/d62/d90/fed [804441,115890] 0 2026-03-10T12:38:25.616 INFO:tasks.workunit.client.1.vm07.stdout:8/826: fsync d1/d3/d18/d8e/fd6 0 2026-03-10T12:38:25.616 INFO:tasks.workunit.client.1.vm07.stdout:8/827: chown d1/d3/d6c/lb1 48952 1 2026-03-10T12:38:25.619 INFO:tasks.workunit.client.1.vm07.stdout:8/828: dwrite d1/d3/d6c/f9b [0,4194304] 0 2026-03-10T12:38:25.622 INFO:tasks.workunit.client.1.vm07.stdout:0/971: write d0/d14/d7c/f90 [1437283,99075] 0 2026-03-10T12:38:25.637 INFO:tasks.workunit.client.1.vm07.stdout:5/870: dwrite d0/d22/d18/d19/d21/fbd [0,4194304] 0 2026-03-10T12:38:25.639 INFO:tasks.workunit.client.1.vm07.stdout:9/922: creat d5/d1f/d75/f13c x:0 0 0 2026-03-10T12:38:25.649 INFO:tasks.workunit.client.0.vm00.stdout:7/978: rmdir da/d41 39 2026-03-10T12:38:25.649 INFO:tasks.workunit.client.0.vm00.stdout:7/979: write da/d1b/d40/db6/f14d [541478,96369] 0 2026-03-10T12:38:25.650 INFO:tasks.workunit.client.1.vm07.stdout:6/836: write d1/d4/d6/f41 [4724669,52637] 0 2026-03-10T12:38:25.654 INFO:tasks.workunit.client.1.vm07.stdout:2/776: chown d0/d29/d64/d74/d88/cd1 4 1 2026-03-10T12:38:25.658 INFO:tasks.workunit.client.1.vm07.stdout:7/824: creat d0/d57/dd6/d80/f10f x:0 0 0 2026-03-10T12:38:25.662 INFO:tasks.workunit.client.1.vm07.stdout:8/829: creat d1/d3/d6c/dde/f10b x:0 0 0 2026-03-10T12:38:25.674 INFO:tasks.workunit.client.1.vm07.stdout:0/972: symlink d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/d115/d11d/l146 0 2026-03-10T12:38:25.674 INFO:tasks.workunit.client.1.vm07.stdout:6/837: creat 
d1/d4/d6/d43/d65/f113 x:0 0 0 2026-03-10T12:38:25.674 INFO:tasks.workunit.client.1.vm07.stdout:2/777: fsync d0/fe4 0 2026-03-10T12:38:25.674 INFO:tasks.workunit.client.0.vm00.stdout:6/846: write d2/d42/d80/d9d/fe9 [42128,22189] 0 2026-03-10T12:38:25.674 INFO:tasks.workunit.client.0.vm00.stdout:7/980: getdents da/d3f/dd1 0 2026-03-10T12:38:25.674 INFO:tasks.workunit.client.0.vm00.stdout:6/847: chown d2/d14/f5d 3857927 1 2026-03-10T12:38:25.674 INFO:tasks.workunit.client.0.vm00.stdout:7/981: write da/d41/d7b/d9d/dc8/d12e/f14a [75778,129212] 0 2026-03-10T12:38:25.674 INFO:tasks.workunit.client.0.vm00.stdout:7/982: symlink da/d47/d87/l150 0 2026-03-10T12:38:25.684 INFO:tasks.workunit.client.1.vm07.stdout:5/871: dread d0/d22/d18/d19/d36/d75/fdb [0,4194304] 0 2026-03-10T12:38:25.685 INFO:tasks.workunit.client.1.vm07.stdout:5/872: chown d0/d22/d18/d19/d21/d54/dcb/fb3 29269330 1 2026-03-10T12:38:25.685 INFO:tasks.workunit.client.0.vm00.stdout:6/848: dread d2/d42/d80/d89/fb8 [0,4194304] 0 2026-03-10T12:38:25.696 INFO:tasks.workunit.client.1.vm07.stdout:6/838: creat d1/d4/d6/d16/d49/f114 x:0 0 0 2026-03-10T12:38:25.699 INFO:tasks.workunit.client.1.vm07.stdout:9/923: dread d5/d13/d57/d4f/d6a/f8a [0,4194304] 0 2026-03-10T12:38:25.700 INFO:tasks.workunit.client.0.vm00.stdout:6/849: read - d2/da/dbf/ded/ff9 zero size 2026-03-10T12:38:25.700 INFO:tasks.workunit.client.1.vm07.stdout:6/839: dwrite d1/d4/d6/f30 [0,4194304] 0 2026-03-10T12:38:25.700 INFO:tasks.workunit.client.0.vm00.stdout:6/850: chown d2/d9f/df6/fc8 58 1 2026-03-10T12:38:25.721 INFO:tasks.workunit.client.0.vm00.stdout:7/983: rename da/d26/f27 to da/d41/d7b/d9d/f151 0 2026-03-10T12:38:25.722 INFO:tasks.workunit.client.0.vm00.stdout:7/984: chown da/d41/d7b/d9d/fc2 19042016 1 2026-03-10T12:38:25.731 INFO:tasks.workunit.client.1.vm07.stdout:3/895: dwrite dc/dd/d28/f67 [0,4194304] 0 2026-03-10T12:38:25.731 INFO:tasks.workunit.client.1.vm07.stdout:1/864: dwrite d9/df/d29/d2b/d92/d9d/fee [0,4194304] 0 
2026-03-10T12:38:25.739 INFO:tasks.workunit.client.1.vm07.stdout:3/896: read dc/d18/f79 [50425,64551] 0 2026-03-10T12:38:25.747 INFO:tasks.workunit.client.1.vm07.stdout:1/865: sync 2026-03-10T12:38:25.752 INFO:tasks.workunit.client.1.vm07.stdout:0/973: write d0/f21 [1657297,67123] 0 2026-03-10T12:38:25.763 INFO:tasks.workunit.client.1.vm07.stdout:2/778: creat d0/d42/d4e/d77/f109 x:0 0 0 2026-03-10T12:38:25.764 INFO:tasks.workunit.client.1.vm07.stdout:2/779: chown d0/d42/d26 3 1 2026-03-10T12:38:25.767 INFO:tasks.workunit.client.1.vm07.stdout:7/825: truncate d0/d57/d62/f8b 3847188 0 2026-03-10T12:38:25.768 INFO:tasks.workunit.client.0.vm00.stdout:7/985: creat da/d26/d50/d73/f152 x:0 0 0 2026-03-10T12:38:25.768 INFO:tasks.workunit.client.1.vm07.stdout:7/826: dread - d0/d47/dab/ffa zero size 2026-03-10T12:38:25.769 INFO:tasks.workunit.client.0.vm00.stdout:6/851: dwrite d2/da/dc/d94/f121 [0,4194304] 0 2026-03-10T12:38:25.772 INFO:tasks.workunit.client.1.vm07.stdout:8/830: creat d1/d3/d6/d54/dd2/df3/f10c x:0 0 0 2026-03-10T12:38:25.779 INFO:tasks.workunit.client.1.vm07.stdout:5/873: truncate d0/f47 845023 0 2026-03-10T12:38:25.783 INFO:tasks.workunit.client.0.vm00.stdout:6/852: creat d2/d14/dbb/f132 x:0 0 0 2026-03-10T12:38:25.784 INFO:tasks.workunit.client.0.vm00.stdout:6/853: chown d2/da/dc/d94/ldf 53862 1 2026-03-10T12:38:25.788 INFO:tasks.workunit.client.1.vm07.stdout:9/924: dread - d5/d13/d6c/fd5 zero size 2026-03-10T12:38:25.793 INFO:tasks.workunit.client.1.vm07.stdout:6/840: truncate d1/fc9 377113 0 2026-03-10T12:38:25.793 INFO:tasks.workunit.client.1.vm07.stdout:4/967: getdents d0/d4/d5/d8f 0 2026-03-10T12:38:25.795 INFO:tasks.workunit.client.1.vm07.stdout:1/866: rename d9/df/d29/d2b/d92/db6 to d9/df/d29/d2b/d92/d123 0 2026-03-10T12:38:25.796 INFO:tasks.workunit.client.1.vm07.stdout:0/974: mknod d0/d14/d5f/d76/d2f/c147 0 2026-03-10T12:38:25.804 INFO:tasks.workunit.client.1.vm07.stdout:5/874: fsync d0/f1f 0 2026-03-10T12:38:25.805 
INFO:tasks.workunit.client.1.vm07.stdout:4/968: truncate d0/d4/d10/d3c/f68 716301 0 2026-03-10T12:38:25.806 INFO:tasks.workunit.client.1.vm07.stdout:4/969: write d0/d4/d10/d9a/d124/fb4 [3043796,48507] 0 2026-03-10T12:38:25.807 INFO:tasks.workunit.client.1.vm07.stdout:4/970: chown d0/d4/d10/d9a/d124/f100 70055144 1 2026-03-10T12:38:25.813 INFO:tasks.workunit.client.0.vm00.stdout:6/854: getdents d2/da/dbf/ded/d118 0 2026-03-10T12:38:25.813 INFO:tasks.workunit.client.1.vm07.stdout:3/897: symlink dc/dd/d43/d76/d95/l12e 0 2026-03-10T12:38:25.813 INFO:tasks.workunit.client.1.vm07.stdout:0/975: symlink d0/d14/d5f/d76/d2f/d31/df0/d105/l148 0 2026-03-10T12:38:25.814 INFO:tasks.workunit.client.1.vm07.stdout:2/780: creat d0/d42/d1f/d20/df7/f10a x:0 0 0 2026-03-10T12:38:25.825 INFO:tasks.workunit.client.1.vm07.stdout:6/841: creat d1/d4/d6/d16/d1a/d99/df5/f115 x:0 0 0 2026-03-10T12:38:25.831 INFO:tasks.workunit.client.1.vm07.stdout:4/971: rename d0/d4/d5/d8f/c13b to d0/d4/d5/d78/dc5/c15a 0 2026-03-10T12:38:25.834 INFO:tasks.workunit.client.1.vm07.stdout:4/972: dwrite d0/d4/df2/df6/d46/d76/fae [0,4194304] 0 2026-03-10T12:38:25.852 INFO:tasks.workunit.client.1.vm07.stdout:5/875: mkdir d0/d22/d18/d3e/d5d/d12c 0 2026-03-10T12:38:25.856 INFO:tasks.workunit.client.0.vm00.stdout:7/986: dwrite f0 [4194304,4194304] 0 2026-03-10T12:38:25.856 INFO:tasks.workunit.client.1.vm07.stdout:4/973: rename d0/d4/d5/c10e to d0/d4/d10/d3c/d2b/d2d/d9c/c15b 0 2026-03-10T12:38:25.870 INFO:tasks.workunit.client.1.vm07.stdout:5/876: mkdir d0/d22/d18/d3e/d5d/d10b/d12d 0 2026-03-10T12:38:25.872 INFO:tasks.workunit.client.1.vm07.stdout:4/974: truncate d0/d4/d5/d78/dc5/df7/f97 3187732 0 2026-03-10T12:38:25.876 INFO:tasks.workunit.client.0.vm00.stdout:7/987: dread da/d47/dfd/fa9 [0,4194304] 0 2026-03-10T12:38:25.877 INFO:tasks.workunit.client.1.vm07.stdout:0/976: link d0/d14/d5f/d76/d2f/d31/df0/f10b d0/f149 0 2026-03-10T12:38:25.880 INFO:tasks.workunit.client.1.vm07.stdout:9/925: dwrite d5/d16/da3/fb1 
[0,4194304] 0 2026-03-10T12:38:25.909 INFO:tasks.workunit.client.1.vm07.stdout:6/842: creat d1/d4/f116 x:0 0 0 2026-03-10T12:38:25.910 INFO:tasks.workunit.client.1.vm07.stdout:1/867: write d9/df/d29/d2b/d31/d91/faf [4714163,123589] 0 2026-03-10T12:38:25.912 INFO:tasks.workunit.client.1.vm07.stdout:7/827: write d0/d47/da0/dd4/f106 [1570683,23218] 0 2026-03-10T12:38:25.915 INFO:tasks.workunit.client.1.vm07.stdout:8/831: dwrite d1/d3/d6/d54/ffa [0,4194304] 0 2026-03-10T12:38:25.916 INFO:tasks.workunit.client.1.vm07.stdout:8/832: write d1/d3/d6/d50/d70/dfb/ffe [51449,14058] 0 2026-03-10T12:38:25.922 INFO:tasks.workunit.client.0.vm00.stdout:6/855: write d2/da/fda [769359,24667] 0 2026-03-10T12:38:25.939 INFO:tasks.workunit.client.1.vm07.stdout:2/781: write d0/d42/d1f/d20/f3f [12015326,2981] 0 2026-03-10T12:38:25.940 INFO:tasks.workunit.client.0.vm00.stdout:6/856: creat d2/da/dc/d94/f133 x:0 0 0 2026-03-10T12:38:25.944 INFO:tasks.workunit.client.0.vm00.stdout:6/857: symlink d2/da/dc/d83/l134 0 2026-03-10T12:38:25.945 INFO:tasks.workunit.client.0.vm00.stdout:6/858: write d2/da/dc/d83/d119/f129 [1569482,19725] 0 2026-03-10T12:38:25.946 INFO:tasks.workunit.client.0.vm00.stdout:6/859: chown d2/d14/dc0 1460179 1 2026-03-10T12:38:25.952 INFO:tasks.workunit.client.0.vm00.stdout:6/860: rename d2/da/dc/fd to d2/da/dbf/f135 0 2026-03-10T12:38:25.954 INFO:tasks.workunit.client.0.vm00.stdout:6/861: dread - d2/d16/d29/d31/d88/d92/fb6 zero size 2026-03-10T12:38:25.955 INFO:tasks.workunit.client.1.vm07.stdout:3/898: link dc/d18/l125 dc/d18/d2d/de5/l12f 0 2026-03-10T12:38:25.961 INFO:tasks.workunit.client.1.vm07.stdout:9/926: read d5/d13/d6c/d7a/f94 [432817,32122] 0 2026-03-10T12:38:25.962 INFO:tasks.workunit.client.1.vm07.stdout:9/927: chown d5/d1f/d5e/d10a 3865591 1 2026-03-10T12:38:25.963 INFO:tasks.workunit.client.1.vm07.stdout:3/899: dread dc/dd/db5/f115 [0,4194304] 0 2026-03-10T12:38:25.964 INFO:tasks.workunit.client.1.vm07.stdout:3/900: readlink la 0 2026-03-10T12:38:25.965 
INFO:tasks.workunit.client.1.vm07.stdout:8/833: creat d1/d3/d6/d50/d70/dd4/f10d x:0 0 0 2026-03-10T12:38:25.965 INFO:tasks.workunit.client.1.vm07.stdout:2/782: symlink d0/d42/d26/d38/d4f/d5d/l10b 0 2026-03-10T12:38:25.977 INFO:tasks.workunit.client.0.vm00.stdout:6/862: mknod d2/da/dc/c136 0 2026-03-10T12:38:25.981 INFO:tasks.workunit.client.1.vm07.stdout:1/868: dread d9/df/d55/fce [0,4194304] 0 2026-03-10T12:38:25.981 INFO:tasks.workunit.client.1.vm07.stdout:1/869: truncate d9/d2d/de2/fbf 4606885 0 2026-03-10T12:38:25.983 INFO:tasks.workunit.client.1.vm07.stdout:9/928: symlink d5/d13/d2c/de6/d74/l13d 0 2026-03-10T12:38:25.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:25 vm00.local ceph-mon[50686]: pgmap v9: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 46 MiB/s rd, 115 MiB/s wr, 302 op/s 2026-03-10T12:38:25.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:25 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:25.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:25 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:25.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:25 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T12:38:25.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:25 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:25.984 INFO:tasks.workunit.client.1.vm07.stdout:5/877: creat d0/f12e x:0 0 0 2026-03-10T12:38:25.990 INFO:tasks.workunit.client.1.vm07.stdout:1/870: write d9/d2d/d4f/dde/fef [2483062,2937] 0 2026-03-10T12:38:25.993 INFO:tasks.workunit.client.1.vm07.stdout:9/929: rename d5/fcd to d5/d13/d9d/f13e 0 2026-03-10T12:38:25.995 
INFO:tasks.workunit.client.1.vm07.stdout:7/828: link d0/l2e d0/d57/dd6/l110 0 2026-03-10T12:38:25.997 INFO:tasks.workunit.client.1.vm07.stdout:6/843: rmdir d1/d4/d6/d16/d1a/d6e/d112 0 2026-03-10T12:38:26.006 INFO:tasks.workunit.client.1.vm07.stdout:7/829: mkdir d0/d61/db4/df4/d111 0 2026-03-10T12:38:26.008 INFO:tasks.workunit.client.1.vm07.stdout:6/844: mknod d1/dd7/da3/dd8/c117 0 2026-03-10T12:38:26.025 INFO:tasks.workunit.client.1.vm07.stdout:0/977: dread - d0/d14/d5f/d76/d2f/d31/d4f/d9d/f104 zero size 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.0.vm00.stdout:7/988: write da/d3f/d60/fb1 [1124833,72007] 0 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.0.vm00.stdout:7/989: rmdir da/d25/d2c/d82/d68/d124 39 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.0.vm00.stdout:7/990: chown da/d25/d2e 367098616 1 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.0.vm00.stdout:7/991: mknod da/d25/d2c/d82/d101/c153 0 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.0.vm00.stdout:7/992: creat da/d25/d2c/d82/d68/d124/d13f/d148/f154 x:0 0 0 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.0.vm00.stdout:7/993: mknod da/d3f/d71/c155 0 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.0.vm00.stdout:7/994: chown da/d25/d2e/d4c/l7c 0 1 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.0.vm00.stdout:7/995: stat da/d25/d2c/d82/d68/df8/c10f 0 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.0.vm00.stdout:7/996: creat da/d26/d50/d73/f156 x:0 0 0 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.0.vm00.stdout:7/997: dread da/f10 [0,4194304] 0 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.1.vm07.stdout:0/978: read d0/d14/d5f/d76/d2f/d31/d4f/d9d/dd4/ffb [125468,68976] 0 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.1.vm07.stdout:2/783: dread d0/d29/d64/d6c/d94/fa7 [0,4194304] 0 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.1.vm07.stdout:2/784: chown d0/d29/d64/d74/d88 35207 1 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.1.vm07.stdout:0/979: 
fdatasync d0/d14/d5f/d76/d2f/d31/d4f/fc4 0 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.1.vm07.stdout:2/785: rename d0/d80/d93/fce to d0/d5b/f10c 0 2026-03-10T12:38:26.061 INFO:tasks.workunit.client.1.vm07.stdout:0/980: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/d115/d11d/f14a x:0 0 0 2026-03-10T12:38:26.062 INFO:tasks.workunit.client.1.vm07.stdout:2/786: stat d0/l11 0 2026-03-10T12:38:26.062 INFO:tasks.workunit.client.1.vm07.stdout:2/787: rename d0/d42/d4e/d77/lfd to d0/d29/l10d 0 2026-03-10T12:38:26.062 INFO:tasks.workunit.client.1.vm07.stdout:0/981: symlink d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dd9/l14b 0 2026-03-10T12:38:26.062 INFO:tasks.workunit.client.1.vm07.stdout:2/788: fsync d0/d42/d1f/d20/f39 0 2026-03-10T12:38:26.062 INFO:tasks.workunit.client.1.vm07.stdout:0/982: rename d0/d14/d5f/d3b/l80 to d0/d14/d5f/d76/d2f/d31/d79/dcc/d137/l14c 0 2026-03-10T12:38:26.062 INFO:tasks.workunit.client.0.vm00.stdout:7/998: dread da/d26/d37/fc4 [4194304,4194304] 0 2026-03-10T12:38:26.063 INFO:tasks.workunit.client.0.vm00.stdout:7/999: creat da/d47/dfd/f157 x:0 0 0 2026-03-10T12:38:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:25 vm07.local ceph-mon[58582]: pgmap v9: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 46 MiB/s rd, 115 MiB/s wr, 302 op/s 2026-03-10T12:38:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:25 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:25 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:25 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T12:38:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:25 vm07.local ceph-mon[58582]: from='mgr.24461 
192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:26.213 INFO:tasks.workunit.client.1.vm07.stdout:5/878: sync 2026-03-10T12:38:26.217 INFO:tasks.workunit.client.1.vm07.stdout:5/879: mknod d0/d22/d18/d19/d21/d54/dcb/de8/c12f 0 2026-03-10T12:38:26.259 INFO:tasks.workunit.client.1.vm07.stdout:5/880: read - d0/d22/d18/d19/de5/f10d zero size 2026-03-10T12:38:26.259 INFO:tasks.workunit.client.1.vm07.stdout:5/881: truncate d0/d22/d18/d3e/d53/fa3 3878649 0 2026-03-10T12:38:26.259 INFO:tasks.workunit.client.1.vm07.stdout:5/882: chown d0/d22/d18/d19/d21/d3a 856919767 1 2026-03-10T12:38:26.283 INFO:tasks.workunit.client.1.vm07.stdout:4/975: dwrite d0/d4/d5/da/d66/fa8 [0,4194304] 0 2026-03-10T12:38:26.284 INFO:tasks.workunit.client.1.vm07.stdout:4/976: chown d0/d144 58 1 2026-03-10T12:38:26.291 INFO:tasks.workunit.client.0.vm00.stdout:6/863: dwrite d2/da/dc/d2f/f56 [0,4194304] 0 2026-03-10T12:38:26.293 INFO:tasks.workunit.client.1.vm07.stdout:4/977: truncate d0/d4/d5/da/fd4 836876 0 2026-03-10T12:38:26.298 INFO:tasks.workunit.client.1.vm07.stdout:4/978: dread - d0/d4/d10/d5f/d6d/f11d zero size 2026-03-10T12:38:26.303 INFO:tasks.workunit.client.1.vm07.stdout:4/979: chown d0/d4/d5/d78/dc5/df7/db2/dd5/f115 421094566 1 2026-03-10T12:38:26.307 INFO:tasks.workunit.client.1.vm07.stdout:3/901: dwrite dc/dd/d43/d5c/fa9 [0,4194304] 0 2026-03-10T12:38:26.309 INFO:tasks.workunit.client.1.vm07.stdout:8/834: write d1/d3/d40/d92/db6/f108 [26227,25164] 0 2026-03-10T12:38:26.310 INFO:tasks.workunit.client.1.vm07.stdout:8/835: chown d1/d3/d6/d54/l9a 4 1 2026-03-10T12:38:26.312 INFO:tasks.workunit.client.1.vm07.stdout:9/930: write d5/d1f/f9f [178140,118463] 0 2026-03-10T12:38:26.315 INFO:tasks.workunit.client.1.vm07.stdout:9/931: truncate d5/d69/d93/f138 43756 0 2026-03-10T12:38:26.317 INFO:tasks.workunit.client.1.vm07.stdout:1/871: dwrite d9/d2d/d4f/f95 [0,4194304] 0 2026-03-10T12:38:26.323 
INFO:tasks.workunit.client.1.vm07.stdout:7/830: write d0/d61/db4/f53 [1449322,16225] 0 2026-03-10T12:38:26.328 INFO:tasks.workunit.client.1.vm07.stdout:6/845: dwrite d1/d4/d6/d16/d1a/d99/fa8 [4194304,4194304] 0 2026-03-10T12:38:26.332 INFO:tasks.workunit.client.1.vm07.stdout:3/902: truncate dc/dd/d1f/d45/fea 3322200 0 2026-03-10T12:38:26.334 INFO:tasks.workunit.client.0.vm00.stdout:6/864: link d2/d14/d7a/db9/f6c d2/d42/d9c/f137 0 2026-03-10T12:38:26.344 INFO:tasks.workunit.client.0.vm00.stdout:6/865: fdatasync d2/d42/d80/d9d/fe9 0 2026-03-10T12:38:26.344 INFO:tasks.workunit.client.1.vm07.stdout:9/932: unlink d5/d16/d23/d26/f139 0 2026-03-10T12:38:26.344 INFO:tasks.workunit.client.1.vm07.stdout:7/831: fdatasync d0/d61/f93 0 2026-03-10T12:38:26.344 INFO:tasks.workunit.client.1.vm07.stdout:3/903: symlink dc/d18/d24/d72/l130 0 2026-03-10T12:38:26.347 INFO:tasks.workunit.client.1.vm07.stdout:6/846: dread d1/d4/d6/d46/d4d/dc7/f109 [0,4194304] 0 2026-03-10T12:38:26.362 INFO:tasks.workunit.client.1.vm07.stdout:1/872: dread d9/df/d29/d2b/d31/d11f/f57 [0,4194304] 0 2026-03-10T12:38:26.362 INFO:tasks.workunit.client.1.vm07.stdout:9/933: creat d5/d13/d6c/d7a/f13f x:0 0 0 2026-03-10T12:38:26.362 INFO:tasks.workunit.client.1.vm07.stdout:1/873: dread - d9/df/d29/d2b/d92/d9d/f105 zero size 2026-03-10T12:38:26.364 INFO:tasks.workunit.client.1.vm07.stdout:7/832: chown d0/d57/d62/d90/da1/fe9 9609288 1 2026-03-10T12:38:26.366 INFO:tasks.workunit.client.1.vm07.stdout:3/904: rename dc/dd/f22 to dc/dd/d28/d7a/d8e/f131 0 2026-03-10T12:38:26.367 INFO:tasks.workunit.client.1.vm07.stdout:3/905: chown dc/dd/d1f/ce7 14277786 1 2026-03-10T12:38:26.370 INFO:tasks.workunit.client.1.vm07.stdout:2/789: dwrite d0/d29/d64/fd2 [0,4194304] 0 2026-03-10T12:38:26.375 INFO:tasks.workunit.client.1.vm07.stdout:0/983: dwrite d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/fa4 [0,4194304] 0 2026-03-10T12:38:26.376 INFO:tasks.workunit.client.1.vm07.stdout:5/883: write d0/d22/d18/d19/fa8 [499864,109159] 0 
2026-03-10T12:38:26.378 INFO:tasks.workunit.client.1.vm07.stdout:0/984: dread - d0/d14/d5f/d41/fe8 zero size 2026-03-10T12:38:26.379 INFO:tasks.workunit.client.1.vm07.stdout:0/985: readlink d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/l143 0 2026-03-10T12:38:26.380 INFO:tasks.workunit.client.1.vm07.stdout:0/986: chown d0/d14/d5f/d76/d2f/c147 11955424 1 2026-03-10T12:38:26.384 INFO:tasks.workunit.client.1.vm07.stdout:8/836: link d1/d3/f73 d1/d3/d6/d54/dd2/f10e 0 2026-03-10T12:38:26.384 INFO:tasks.workunit.client.1.vm07.stdout:1/874: mkdir d9/d2d/d124 0 2026-03-10T12:38:26.384 INFO:tasks.workunit.client.1.vm07.stdout:7/833: creat d0/d61/db4/df4/f112 x:0 0 0 2026-03-10T12:38:26.385 INFO:tasks.workunit.client.1.vm07.stdout:3/906: truncate dc/d18/fd4 133933 0 2026-03-10T12:38:26.386 INFO:tasks.workunit.client.1.vm07.stdout:7/834: write d0/d61/db4/d8a/fbe [10842,50342] 0 2026-03-10T12:38:26.390 INFO:tasks.workunit.client.1.vm07.stdout:6/847: mkdir d1/dd7/d66/d118 0 2026-03-10T12:38:26.394 INFO:tasks.workunit.client.1.vm07.stdout:3/907: readlink dc/dd/d1f/d6f/l124 0 2026-03-10T12:38:26.396 INFO:tasks.workunit.client.1.vm07.stdout:1/875: truncate d9/df/d29/d2b/f32 11677286 0 2026-03-10T12:38:26.405 INFO:tasks.workunit.client.1.vm07.stdout:7/835: creat d0/d47/dab/f113 x:0 0 0 2026-03-10T12:38:26.405 INFO:tasks.workunit.client.1.vm07.stdout:8/837: mknod d1/d3/db2/dcd/dc7/c10f 0 2026-03-10T12:38:26.416 INFO:tasks.workunit.client.1.vm07.stdout:0/987: rename d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/f134 to d0/d14/d5f/d76/d2f/d31/d4f/d9d/f14d 0 2026-03-10T12:38:26.435 INFO:tasks.workunit.client.1.vm07.stdout:8/838: dread - d1/d3/db2/dcd/fa4 zero size 2026-03-10T12:38:26.435 INFO:tasks.workunit.client.1.vm07.stdout:8/839: write d1/d3/d40/d92/dba/feb [5085133,118966] 0 2026-03-10T12:38:26.435 INFO:tasks.workunit.client.1.vm07.stdout:8/840: stat d1/d3/d6/d50/d70/dfb 0 2026-03-10T12:38:26.435 INFO:tasks.workunit.client.1.vm07.stdout:7/836: getdents d0/d47/dde/df5 0 2026-03-10T12:38:26.435 
INFO:tasks.workunit.client.1.vm07.stdout:6/848: rename d1/d4/d6/d46/d4d/dc7/dd9/ffb to d1/d4/d6/d4e/f119 0 2026-03-10T12:38:26.435 INFO:tasks.workunit.client.1.vm07.stdout:6/849: read d1/d4/f19 [2313202,32040] 0 2026-03-10T12:38:26.435 INFO:tasks.workunit.client.1.vm07.stdout:7/837: creat d0/d61/db4/df4/d111/f114 x:0 0 0 2026-03-10T12:38:26.436 INFO:tasks.workunit.client.1.vm07.stdout:7/838: fdatasync d0/d61/db4/df4/d111/f114 0 2026-03-10T12:38:26.437 INFO:tasks.workunit.client.1.vm07.stdout:3/908: sync 2026-03-10T12:38:26.441 INFO:tasks.workunit.client.1.vm07.stdout:6/850: rename d1/d4/f5a to d1/d4/d6/d16/d49/f11a 0 2026-03-10T12:38:26.442 INFO:tasks.workunit.client.1.vm07.stdout:7/839: readlink d0/d57/d62/ld7 0 2026-03-10T12:38:26.450 INFO:tasks.workunit.client.1.vm07.stdout:3/909: mkdir dc/dd/d1f/d45/d132 0 2026-03-10T12:38:26.452 INFO:tasks.workunit.client.1.vm07.stdout:7/840: rmdir d0/d57/d62 39 2026-03-10T12:38:26.453 INFO:tasks.workunit.client.1.vm07.stdout:7/841: fdatasync d0/d61/db4/f4b 0 2026-03-10T12:38:26.455 INFO:tasks.workunit.client.1.vm07.stdout:6/851: sync 2026-03-10T12:38:26.456 INFO:tasks.workunit.client.1.vm07.stdout:6/852: write d1/d4/d6/d16/f10d [771038,22140] 0 2026-03-10T12:38:26.456 INFO:tasks.workunit.client.1.vm07.stdout:6/853: chown d1/d4/l8a 9 1 2026-03-10T12:38:26.459 INFO:tasks.workunit.client.1.vm07.stdout:9/934: write d5/d13/f2b [4394755,114802] 0 2026-03-10T12:38:26.462 INFO:tasks.workunit.client.0.vm00.stdout:6/866: dwrite d2/da/f11 [0,4194304] 0 2026-03-10T12:38:26.465 INFO:tasks.workunit.client.1.vm07.stdout:2/790: dwrite d0/d5b/f76 [0,4194304] 0 2026-03-10T12:38:26.479 INFO:tasks.workunit.client.1.vm07.stdout:7/842: dread d0/f9b [0,4194304] 0 2026-03-10T12:38:26.480 INFO:tasks.workunit.client.1.vm07.stdout:5/884: write d0/f47 [38048,7197] 0 2026-03-10T12:38:26.480 INFO:tasks.workunit.client.1.vm07.stdout:9/935: creat d5/d13/d22/df0/f140 x:0 0 0 2026-03-10T12:38:26.480 INFO:tasks.workunit.client.1.vm07.stdout:6/854: symlink 
d1/d4/d6/l11b 0 2026-03-10T12:38:26.481 INFO:tasks.workunit.client.1.vm07.stdout:1/876: write d9/d2d/d4f/d5a/f65 [1963959,128657] 0 2026-03-10T12:38:26.481 INFO:tasks.workunit.client.1.vm07.stdout:2/791: creat d0/d42/d1f/f10e x:0 0 0 2026-03-10T12:38:26.482 INFO:tasks.workunit.client.1.vm07.stdout:2/792: chown d0/d42/d4e/daf/lfb 0 1 2026-03-10T12:38:26.482 INFO:tasks.workunit.client.1.vm07.stdout:4/980: write d0/d4/d5/da/fd4 [1140812,24577] 0 2026-03-10T12:38:26.485 INFO:tasks.workunit.client.1.vm07.stdout:0/988: dread d0/d14/d5f/d76/d2f/d31/d4f/d60/f89 [0,4194304] 0 2026-03-10T12:38:26.489 INFO:tasks.workunit.client.1.vm07.stdout:8/841: dread d1/d3/d6/d54/dd2/f10e [0,4194304] 0 2026-03-10T12:38:26.490 INFO:tasks.workunit.client.1.vm07.stdout:8/842: chown d1/d3/d11/cdc 487 1 2026-03-10T12:38:26.494 INFO:tasks.workunit.client.1.vm07.stdout:9/936: mkdir d5/d13/d9d/df2/d141 0 2026-03-10T12:38:26.497 INFO:tasks.workunit.client.1.vm07.stdout:1/877: dread d9/df/d29/d6b/fa1 [0,4194304] 0 2026-03-10T12:38:26.498 INFO:tasks.workunit.client.1.vm07.stdout:7/843: mkdir d0/d61/d115 0 2026-03-10T12:38:26.500 INFO:tasks.workunit.client.1.vm07.stdout:6/855: creat d1/d4/d6/d16/d1a/d6e/f11c x:0 0 0 2026-03-10T12:38:26.503 INFO:tasks.workunit.client.1.vm07.stdout:7/844: dwrite d0/d61/db4/f53 [0,4194304] 0 2026-03-10T12:38:26.507 INFO:tasks.workunit.client.1.vm07.stdout:2/793: fdatasync d0/d42/d1f/d90/fb2 0 2026-03-10T12:38:26.512 INFO:tasks.workunit.client.1.vm07.stdout:3/910: dwrite dc/d18/d24/f3e [0,4194304] 0 2026-03-10T12:38:26.519 INFO:tasks.workunit.client.1.vm07.stdout:3/911: chown dc/d18/d24 168 1 2026-03-10T12:38:26.519 INFO:tasks.workunit.client.1.vm07.stdout:9/937: truncate d5/d13/d2c/de6/f56 3578178 0 2026-03-10T12:38:26.519 INFO:tasks.workunit.client.1.vm07.stdout:1/878: sync 2026-03-10T12:38:26.534 INFO:tasks.workunit.client.0.vm00.stdout:6/867: creat d2/d16/d29/d31/d88/d92/f138 x:0 0 0 2026-03-10T12:38:26.535 INFO:tasks.workunit.client.1.vm07.stdout:7/845: dread - 
d0/d57/dd6/d80/f10f zero size 2026-03-10T12:38:26.535 INFO:tasks.workunit.client.1.vm07.stdout:6/856: symlink d1/d4/d6/d4e/l11d 0 2026-03-10T12:38:26.537 INFO:tasks.workunit.client.1.vm07.stdout:4/981: rename d0/d4/df2/df6/f93 to d0/d4/d5/f15c 0 2026-03-10T12:38:26.538 INFO:tasks.workunit.client.1.vm07.stdout:0/989: link d0/d14/d5f/d76/d2f/c147 d0/d14/d5f/d76/d2f/d31/d79/dcc/d137/c14e 0 2026-03-10T12:38:26.539 INFO:tasks.workunit.client.1.vm07.stdout:1/879: mknod d9/df/d29/d2b/c125 0 2026-03-10T12:38:26.542 INFO:tasks.workunit.client.1.vm07.stdout:3/912: rename dc/dd/d1f/l7c to dc/dd/d28/dd0/l133 0 2026-03-10T12:38:26.542 INFO:tasks.workunit.client.1.vm07.stdout:3/913: dread - dc/d18/d24/d72/fc2 zero size 2026-03-10T12:38:26.543 INFO:tasks.workunit.client.1.vm07.stdout:4/982: mkdir d0/d4/d5/da/d15d 0 2026-03-10T12:38:26.545 INFO:tasks.workunit.client.0.vm00.stdout:6/868: symlink d2/l139 0 2026-03-10T12:38:26.552 INFO:tasks.workunit.client.1.vm07.stdout:3/914: truncate dc/d18/d2d/f80 626706 0 2026-03-10T12:38:26.552 INFO:tasks.workunit.client.1.vm07.stdout:0/990: mknod d0/d14/d5f/d76/d2f/d31/d4f/c14f 0 2026-03-10T12:38:26.553 INFO:tasks.workunit.client.1.vm07.stdout:4/983: dread - d0/d4/d10/d3c/d2b/d2d/da7/fdb zero size 2026-03-10T12:38:26.554 INFO:tasks.workunit.client.1.vm07.stdout:1/880: sync 2026-03-10T12:38:26.555 INFO:tasks.workunit.client.0.vm00.stdout:6/869: mkdir d2/d14/dbb/d13a 0 2026-03-10T12:38:26.556 INFO:tasks.workunit.client.1.vm07.stdout:9/938: getdents d5/d69/d93/d97 0 2026-03-10T12:38:26.556 INFO:tasks.workunit.client.1.vm07.stdout:7/846: dread d0/d57/d62/f7e [0,4194304] 0 2026-03-10T12:38:26.557 INFO:tasks.workunit.client.1.vm07.stdout:9/939: write d5/d16/dd7/f135 [574622,76035] 0 2026-03-10T12:38:26.557 INFO:tasks.workunit.client.1.vm07.stdout:7/847: stat d0/d61/db4/d8a/d9d 0 2026-03-10T12:38:26.558 INFO:tasks.workunit.client.1.vm07.stdout:7/848: dread - d0/d61/d79/f104 zero size 2026-03-10T12:38:26.561 
INFO:tasks.workunit.client.1.vm07.stdout:0/991: stat d0/c116 0 2026-03-10T12:38:26.563 INFO:tasks.workunit.client.0.vm00.stdout:6/870: fdatasync d2/d51/f63 0 2026-03-10T12:38:26.563 INFO:tasks.workunit.client.1.vm07.stdout:4/984: symlink d0/d4/d5/da/d66/l15e 0 2026-03-10T12:38:26.565 INFO:tasks.workunit.client.1.vm07.stdout:9/940: fsync d5/d1f/d5e/d6b/de0/fef 0 2026-03-10T12:38:26.568 INFO:tasks.workunit.client.1.vm07.stdout:6/857: link d1/dd7/da3/dd8/c117 d1/dd7/d66/dd6/c11e 0 2026-03-10T12:38:26.575 INFO:tasks.workunit.client.1.vm07.stdout:2/794: dread d0/d42/d26/f52 [0,4194304] 0 2026-03-10T12:38:26.575 INFO:tasks.workunit.client.1.vm07.stdout:3/915: dread dc/dd/f19 [0,4194304] 0 2026-03-10T12:38:26.575 INFO:tasks.workunit.client.1.vm07.stdout:1/881: symlink d9/df/l126 0 2026-03-10T12:38:26.575 INFO:tasks.workunit.client.1.vm07.stdout:1/882: write d9/d2d/d4f/d75/d77/f100 [78940,42129] 0 2026-03-10T12:38:26.575 INFO:tasks.workunit.client.1.vm07.stdout:1/883: readlink d9/d2d/d80/lf8 0 2026-03-10T12:38:26.579 INFO:tasks.workunit.client.1.vm07.stdout:7/849: creat d0/d57/dd6/d107/f116 x:0 0 0 2026-03-10T12:38:26.580 INFO:tasks.workunit.client.1.vm07.stdout:7/850: dread - d0/d61/d79/f104 zero size 2026-03-10T12:38:26.582 INFO:tasks.workunit.client.1.vm07.stdout:2/795: symlink d0/d5b/l10f 0 2026-03-10T12:38:26.586 INFO:tasks.workunit.client.1.vm07.stdout:8/843: dwrite d1/d3/d11/f77 [0,4194304] 0 2026-03-10T12:38:26.594 INFO:tasks.workunit.client.1.vm07.stdout:3/916: fsync dc/d18/d24/f3a 0 2026-03-10T12:38:26.618 INFO:tasks.workunit.client.1.vm07.stdout:0/992: fsync d0/d14/d5f/d76/f118 0 2026-03-10T12:38:26.619 INFO:tasks.workunit.client.1.vm07.stdout:5/885: write d0/d22/d18/d19/d2e/da9/f103 [1610605,128938] 0 2026-03-10T12:38:26.619 INFO:tasks.workunit.client.1.vm07.stdout:1/884: unlink d9/df/f11 0 2026-03-10T12:38:26.619 INFO:tasks.workunit.client.1.vm07.stdout:0/993: dread d0/d14/f36 [0,4194304] 0 2026-03-10T12:38:26.619 
INFO:tasks.workunit.client.1.vm07.stdout:6/858: link d1/d4/d6/d16/d1a/d2c/de0/ff6 d1/d4/d71/f11f 0 2026-03-10T12:38:26.619 INFO:tasks.workunit.client.1.vm07.stdout:6/859: readlink d1/d4/d6/d4e/l11d 0 2026-03-10T12:38:26.619 INFO:tasks.workunit.client.1.vm07.stdout:2/796: creat d0/d29/d64/db5/dbb/df9/f110 x:0 0 0 2026-03-10T12:38:26.619 INFO:tasks.workunit.client.1.vm07.stdout:3/917: read dc/d18/d99/da3/fd2 [272361,94815] 0 2026-03-10T12:38:26.619 INFO:tasks.workunit.client.1.vm07.stdout:5/886: symlink d0/d22/d18/d3e/d5d/db6/l130 0 2026-03-10T12:38:26.619 INFO:tasks.workunit.client.1.vm07.stdout:5/887: stat d0/d22/d18/d19/d21/d54/dcb/de8/c12f 0 2026-03-10T12:38:26.619 INFO:tasks.workunit.client.1.vm07.stdout:9/941: creat d5/d13/d2c/de6/f142 x:0 0 0 2026-03-10T12:38:26.619 INFO:tasks.workunit.client.1.vm07.stdout:1/885: truncate d9/df/d29/d2b/d31/f7d 1231320 0 2026-03-10T12:38:26.619 INFO:tasks.workunit.client.1.vm07.stdout:9/942: fsync d5/d16/d18/f20 0 2026-03-10T12:38:26.619 INFO:tasks.workunit.client.1.vm07.stdout:1/886: write d9/df/d29/d2b/d92/f10d [22187,115718] 0 2026-03-10T12:38:26.619 INFO:tasks.workunit.client.1.vm07.stdout:9/943: chown d5/d13/d57/d4f/d6a/f101 510559655 1 2026-03-10T12:38:26.619 INFO:tasks.workunit.client.1.vm07.stdout:0/994: fsync d0/d14/d5f/d76/d2f/d31/f6f 0 2026-03-10T12:38:26.619 INFO:tasks.workunit.client.1.vm07.stdout:9/944: dwrite d5/d13/d2c/f44 [0,4194304] 0 2026-03-10T12:38:26.624 INFO:tasks.workunit.client.1.vm07.stdout:9/945: dwrite d5/d13/f2b [4194304,4194304] 0 2026-03-10T12:38:26.628 INFO:tasks.workunit.client.1.vm07.stdout:3/918: creat dc/d18/d99/da3/f134 x:0 0 0 2026-03-10T12:38:26.630 INFO:tasks.workunit.client.1.vm07.stdout:3/919: readlink dc/d18/d24/ldf 0 2026-03-10T12:38:26.631 INFO:tasks.workunit.client.1.vm07.stdout:1/887: creat d9/d2d/d4f/d5a/f127 x:0 0 0 2026-03-10T12:38:26.633 INFO:tasks.workunit.client.1.vm07.stdout:9/946: dwrite d5/d13/d2c/de6/d64/d108/d127/f125 [0,4194304] 0 2026-03-10T12:38:26.644 
INFO:tasks.workunit.client.1.vm07.stdout:8/844: creat d1/d3/d40/f110 x:0 0 0 2026-03-10T12:38:26.644 INFO:tasks.workunit.client.1.vm07.stdout:3/920: mkdir dc/dd/d1f/dc7/d135 0 2026-03-10T12:38:26.645 INFO:tasks.workunit.client.1.vm07.stdout:1/888: readlink d9/d2d/d4f/d75/d77/lb4 0 2026-03-10T12:38:26.646 INFO:tasks.workunit.client.1.vm07.stdout:1/889: chown d9/d2d/dd7/l106 63786 1 2026-03-10T12:38:26.673 INFO:tasks.workunit.client.1.vm07.stdout:9/947: truncate d5/d16/d23/fc8 4704114 0 2026-03-10T12:38:26.675 INFO:tasks.workunit.client.1.vm07.stdout:5/888: rename d0/d22/d18/d3e/d11f/c12a to d0/d22/d18/d19/d21/d54/dcb/c131 0 2026-03-10T12:38:26.676 INFO:tasks.workunit.client.1.vm07.stdout:8/845: mknod d1/d3/d40/d92/db6/c111 0 2026-03-10T12:38:26.676 INFO:tasks.workunit.client.1.vm07.stdout:8/846: stat d1/d3/f71 0 2026-03-10T12:38:26.678 INFO:tasks.workunit.client.1.vm07.stdout:3/921: creat dc/dd/d1f/d45/dbf/f136 x:0 0 0 2026-03-10T12:38:26.679 INFO:tasks.workunit.client.1.vm07.stdout:3/922: dread - dc/dd/d28/d7a/f11c zero size 2026-03-10T12:38:26.680 INFO:tasks.workunit.client.1.vm07.stdout:1/890: chown d9/df/d29/d2b/db1 578 1 2026-03-10T12:38:26.681 INFO:tasks.workunit.client.1.vm07.stdout:9/948: dread - d5/f119 zero size 2026-03-10T12:38:26.682 INFO:tasks.workunit.client.1.vm07.stdout:3/923: mknod dc/dd/d1f/dc7/dc9/d116/c137 0 2026-03-10T12:38:26.690 INFO:tasks.workunit.client.1.vm07.stdout:9/949: creat d5/d1f/d7d/f143 x:0 0 0 2026-03-10T12:38:26.691 INFO:tasks.workunit.client.1.vm07.stdout:9/950: write d5/d1f/d75/fbc [4926616,51968] 0 2026-03-10T12:38:26.694 INFO:tasks.workunit.client.1.vm07.stdout:3/924: rename dc/dd/d1f/d45/l58 to dc/dd/d1f/d45/d132/l138 0 2026-03-10T12:38:26.697 INFO:tasks.workunit.client.1.vm07.stdout:9/951: truncate d5/d1f/fb9 14459 0 2026-03-10T12:38:26.697 INFO:tasks.workunit.client.1.vm07.stdout:5/889: dread d0/d22/f16 [0,4194304] 0 2026-03-10T12:38:26.698 INFO:tasks.workunit.client.1.vm07.stdout:9/952: chown d5/d1f/d7d/f143 204494 1 
2026-03-10T12:38:26.704 INFO:tasks.workunit.client.1.vm07.stdout:9/953: dread - d5/d13/d2c/de6/d64/d108/d127/f13b zero size 2026-03-10T12:38:26.704 INFO:tasks.workunit.client.1.vm07.stdout:8/847: creat d1/d3/d40/f112 x:0 0 0 2026-03-10T12:38:26.704 INFO:tasks.workunit.client.1.vm07.stdout:5/890: rename d0/d22/d18/d19/d2e/f88 to d0/d22/d18/d3e/d5d/db6/f132 0 2026-03-10T12:38:26.704 INFO:tasks.workunit.client.1.vm07.stdout:8/848: fdatasync d1/d3/d6c/fe3 0 2026-03-10T12:38:26.704 INFO:tasks.workunit.client.1.vm07.stdout:8/849: chown d1/d3/d40/d104 62465 1 2026-03-10T12:38:26.706 INFO:tasks.workunit.client.1.vm07.stdout:9/954: creat d5/d16/d23/d26/d68/f144 x:0 0 0 2026-03-10T12:38:26.717 INFO:tasks.workunit.client.1.vm07.stdout:3/925: dread dc/d18/f36 [8388608,4194304] 0 2026-03-10T12:38:26.727 INFO:tasks.workunit.client.1.vm07.stdout:8/850: rename d1/d3/d11/c64 to d1/d3/d5d/c113 0 2026-03-10T12:38:26.733 INFO:tasks.workunit.client.1.vm07.stdout:9/955: sync 2026-03-10T12:38:26.733 INFO:tasks.workunit.client.1.vm07.stdout:9/956: chown d5/d1f/d75 10429 1 2026-03-10T12:38:26.739 INFO:tasks.workunit.client.1.vm07.stdout:3/926: link dc/dd/d28/d7a/d8e/f10a dc/dd/d1f/dc7/dc9/d116/d11a/f139 0 2026-03-10T12:38:26.748 INFO:tasks.workunit.client.1.vm07.stdout:9/957: truncate d5/d13/d2c/de6/ffa 729038 0 2026-03-10T12:38:26.751 INFO:tasks.workunit.client.1.vm07.stdout:3/927: fsync dc/d18/d2d/f10b 0 2026-03-10T12:38:26.753 INFO:tasks.workunit.client.1.vm07.stdout:9/958: dwrite d5/d13/d2c/de6/d74/f122 [0,4194304] 0 2026-03-10T12:38:26.759 INFO:tasks.workunit.client.1.vm07.stdout:3/928: creat dc/dd/d1f/d6f/f13a x:0 0 0 2026-03-10T12:38:26.767 INFO:tasks.workunit.client.1.vm07.stdout:9/959: dread d5/d13/f14 [0,4194304] 0 2026-03-10T12:38:26.768 INFO:tasks.workunit.client.1.vm07.stdout:9/960: chown d5/d13/d2c/de6/d64/lbb 12344281 1 2026-03-10T12:38:26.774 INFO:tasks.workunit.client.0.vm00.stdout:6/871: write d2/d14/f32 [63563,48633] 0 2026-03-10T12:38:26.784 
INFO:tasks.workunit.client.1.vm07.stdout:4/985: dwrite d0/d5c/fad [0,4194304] 0 2026-03-10T12:38:26.794 INFO:tasks.workunit.client.1.vm07.stdout:7/851: write d0/d57/d62/f7e [1409350,94592] 0 2026-03-10T12:38:26.795 INFO:tasks.workunit.client.1.vm07.stdout:7/852: write d0/d47/dde/ff6 [2292862,39292] 0 2026-03-10T12:38:26.798 INFO:tasks.workunit.client.1.vm07.stdout:2/797: dwrite d0/d42/d26/d38/f3d [0,4194304] 0 2026-03-10T12:38:26.805 INFO:tasks.workunit.client.1.vm07.stdout:6/860: write d1/fc9 [1111326,106224] 0 2026-03-10T12:38:26.808 INFO:tasks.workunit.client.1.vm07.stdout:6/861: fsync d1/d4/d6/d4e/d64/fa4 0 2026-03-10T12:38:26.809 INFO:tasks.workunit.client.1.vm07.stdout:6/862: dread d1/d4/d6/d16/d1a/d2c/f59 [4194304,4194304] 0 2026-03-10T12:38:26.814 INFO:tasks.workunit.client.1.vm07.stdout:0/995: write d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dc8/fc7 [482872,11882] 0 2026-03-10T12:38:26.815 INFO:tasks.workunit.client.1.vm07.stdout:7/853: dread d0/d52/f98 [0,4194304] 0 2026-03-10T12:38:26.816 INFO:tasks.workunit.client.1.vm07.stdout:0/996: chown d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/d8c/d65/dc8/fb8 992 1 2026-03-10T12:38:26.826 INFO:tasks.workunit.client.1.vm07.stdout:0/997: creat d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/dd0/d135/f150 x:0 0 0 2026-03-10T12:38:26.827 INFO:tasks.workunit.client.1.vm07.stdout:0/998: chown d0/d14/d5f/d76/da1/c138 2210 1 2026-03-10T12:38:26.831 INFO:tasks.workunit.client.1.vm07.stdout:7/854: getdents d0/d57/d62/d90/da1 0 2026-03-10T12:38:26.886 INFO:tasks.workunit.client.1.vm07.stdout:1/891: dwrite d9/fd [0,4194304] 0 2026-03-10T12:38:26.905 INFO:tasks.workunit.client.1.vm07.stdout:5/891: dwrite d0/d22/d18/d19/d21/d54/dcb/de8/ffe [0,4194304] 0 2026-03-10T12:38:26.908 INFO:tasks.workunit.client.1.vm07.stdout:5/892: readlink d0/d22/d18/d19/d72/dcc/ld6 0 2026-03-10T12:38:26.912 INFO:tasks.workunit.client.1.vm07.stdout:8/851: write d1/d3/d6/d54/dd2/fdb [96554,35302] 0 2026-03-10T12:38:26.913 INFO:tasks.workunit.client.1.vm07.stdout:8/852: 
write d1/d3/d40/d92/dba/fc3 [1585756,59693] 0 2026-03-10T12:38:26.914 INFO:tasks.workunit.client.1.vm07.stdout:8/853: write d1/d3/d40/d92/dba/f10a [585429,83238] 0 2026-03-10T12:38:26.918 INFO:tasks.workunit.client.1.vm07.stdout:8/854: stat d1/d3/d40/d92/db6/fcb 0 2026-03-10T12:38:26.918 INFO:tasks.workunit.client.1.vm07.stdout:8/855: creat d1/d3/db2/dcd/d105/f114 x:0 0 0 2026-03-10T12:38:26.918 INFO:tasks.workunit.client.1.vm07.stdout:8/856: link d1/d3/d40/d92/f94 d1/d3/d18/f115 0 2026-03-10T12:38:26.923 INFO:tasks.workunit.client.1.vm07.stdout:8/857: truncate d1/d3/d5d/fd5 1011608 0 2026-03-10T12:38:26.923 INFO:tasks.workunit.client.1.vm07.stdout:8/858: chown d1/d3/db2/dcd/d105 7377994 1 2026-03-10T12:38:26.929 INFO:tasks.workunit.client.1.vm07.stdout:9/961: dread d5/d13/d2c/de6/ffa [0,4194304] 0 2026-03-10T12:38:26.943 INFO:tasks.workunit.client.1.vm07.stdout:9/962: mknod d5/d13/d6c/d89/dac/c145 0 2026-03-10T12:38:26.943 INFO:tasks.workunit.client.1.vm07.stdout:9/963: truncate d5/d16/f10c 881588 0 2026-03-10T12:38:26.943 INFO:tasks.workunit.client.1.vm07.stdout:9/964: mknod d5/d16/dd7/c146 0 2026-03-10T12:38:26.944 INFO:tasks.workunit.client.1.vm07.stdout:8/859: dread d1/d3/d18/d8e/ffc [0,4194304] 0 2026-03-10T12:38:26.945 INFO:tasks.workunit.client.1.vm07.stdout:8/860: truncate d1/d3/d6c/dde/de7/ff4 1502700 0 2026-03-10T12:38:26.946 INFO:tasks.workunit.client.1.vm07.stdout:8/861: chown d1/d3/d40/f112 16 1 2026-03-10T12:38:26.946 INFO:tasks.workunit.client.1.vm07.stdout:8/862: chown d1/d3/la5 5939532 1 2026-03-10T12:38:26.947 INFO:tasks.workunit.client.1.vm07.stdout:8/863: mknod d1/d3/db2/dcd/dc7/c116 0 2026-03-10T12:38:26.956 INFO:tasks.workunit.client.1.vm07.stdout:5/893: sync 2026-03-10T12:38:26.967 INFO:tasks.workunit.client.0.vm00.stdout:6/872: dwrite d2/d14/f5d [0,4194304] 0 2026-03-10T12:38:26.977 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:26 vm00.local ceph-mon[50686]: from='mon.? -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T12:38:26.977 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:26 vm00.local ceph-mon[50686]: Upgrade: Need to upgrade myself (mgr.vm07.kfawlb) 2026-03-10T12:38:26.977 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:26 vm00.local ceph-mon[50686]: Upgrade: Need to upgrade myself (mgr.vm07.kfawlb) 2026-03-10T12:38:26.977 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:26 vm00.local ceph-mon[50686]: Upgrade: Updating mgr.vm00.nescmq 2026-03-10T12:38:26.977 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:26 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:26.977 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:26 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm00.nescmq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T12:38:26.977 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:26 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm00.nescmq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T12:38:26.977 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:26 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T12:38:26.977 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:26 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:38:26.977 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:26 vm00.local ceph-mon[50686]: Deploying daemon mgr.vm00.nescmq on vm00 2026-03-10T12:38:26.977 INFO:tasks.workunit.client.0.vm00.stdout:6/873: 
dwrite d2/d9f/dce/f126 [0,4194304] 0 2026-03-10T12:38:26.980 INFO:tasks.workunit.client.1.vm07.stdout:2/798: write d0/d42/d4e/dab/fd5 [1023173,12319] 0 2026-03-10T12:38:26.985 INFO:tasks.workunit.client.0.vm00.stdout:6/874: dread - d2/d16/d29/f84 zero size 2026-03-10T12:38:26.987 INFO:tasks.workunit.client.1.vm07.stdout:6/863: dwrite d1/d4/d6/d43/d88/dc3/ff2 [0,4194304] 0 2026-03-10T12:38:26.994 INFO:tasks.workunit.client.1.vm07.stdout:6/864: mkdir d1/d4/d6/d46/d4d/dc7/dd9/ddc/d120 0 2026-03-10T12:38:26.995 INFO:tasks.workunit.client.0.vm00.stdout:6/875: creat d2/d14/dbb/d12c/f13b x:0 0 0 2026-03-10T12:38:27.002 INFO:tasks.workunit.client.0.vm00.stdout:6/876: unlink d2/d14/d7a/f110 0 2026-03-10T12:38:27.007 INFO:tasks.workunit.client.0.vm00.stdout:6/877: readlink d2/d51/d70/l7c 0 2026-03-10T12:38:27.015 INFO:tasks.workunit.client.0.vm00.stdout:6/878: read - d2/d16/d29/f111 zero size 2026-03-10T12:38:27.015 INFO:tasks.workunit.client.0.vm00.stdout:6/879: fsync d2/d16/d29/d31/d88/d92/fba 0 2026-03-10T12:38:27.041 INFO:tasks.workunit.client.1.vm07.stdout:3/929: link dc/dd/d1f/d45/dbf/c119 dc/c13b 0 2026-03-10T12:38:27.042 INFO:tasks.workunit.client.1.vm07.stdout:3/930: fdatasync dc/d18/d99/da3/f134 0 2026-03-10T12:38:27.043 INFO:tasks.workunit.client.1.vm07.stdout:3/931: mknod dc/dd/db5/c13c 0 2026-03-10T12:38:27.045 INFO:tasks.workunit.client.1.vm07.stdout:3/932: dread dc/dd/d28/d7a/f117 [0,4194304] 0 2026-03-10T12:38:27.046 INFO:tasks.workunit.client.1.vm07.stdout:3/933: truncate dc/dd/d1f/f27 5046389 0 2026-03-10T12:38:27.047 INFO:tasks.workunit.client.1.vm07.stdout:3/934: write dc/d18/d24/fe8 [4586682,62268] 0 2026-03-10T12:38:27.054 INFO:tasks.workunit.client.1.vm07.stdout:4/986: creat d0/d4/d5/d78/dc5/df7/f15f x:0 0 0 2026-03-10T12:38:27.055 INFO:tasks.workunit.client.1.vm07.stdout:4/987: chown d0/d4/d10/f4b 0 1 2026-03-10T12:38:27.057 INFO:tasks.workunit.client.1.vm07.stdout:4/988: dread d0/d8e/f13d [0,4194304] 0 2026-03-10T12:38:27.058 
INFO:tasks.workunit.client.1.vm07.stdout:0/999: rename d0/d14/d5f/d41/d86 to d0/d14/d5f/d76/d2f/d31/d4f/d60/d87/dc9/d115/d151 0 2026-03-10T12:38:27.060 INFO:tasks.workunit.client.1.vm07.stdout:9/965: rmdir d5/d13/d6c/d7a 39 2026-03-10T12:38:27.062 INFO:tasks.workunit.client.1.vm07.stdout:9/966: creat d5/d13/d2c/de6/d74/f147 x:0 0 0 2026-03-10T12:38:27.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:26 vm07.local ceph-mon[58582]: from='mon.? -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T12:38:27.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:26 vm07.local ceph-mon[58582]: Upgrade: Need to upgrade myself (mgr.vm07.kfawlb) 2026-03-10T12:38:27.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:26 vm07.local ceph-mon[58582]: Upgrade: Need to upgrade myself (mgr.vm07.kfawlb) 2026-03-10T12:38:27.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:26 vm07.local ceph-mon[58582]: Upgrade: Updating mgr.vm00.nescmq 2026-03-10T12:38:27.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:26 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:27.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:26 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm00.nescmq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T12:38:27.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:26 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm00.nescmq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T12:38:27.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:26 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mgr services"}]: 
dispatch 2026-03-10T12:38:27.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:26 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:38:27.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:26 vm07.local ceph-mon[58582]: Deploying daemon mgr.vm00.nescmq on vm00 2026-03-10T12:38:27.066 INFO:tasks.workunit.client.1.vm07.stdout:9/967: creat d5/d69/d93/d97/f148 x:0 0 0 2026-03-10T12:38:27.066 INFO:tasks.workunit.client.1.vm07.stdout:7/855: mkdir d0/d117 0 2026-03-10T12:38:27.070 INFO:tasks.workunit.client.1.vm07.stdout:7/856: dwrite d0/d57/d62/f7e [0,4194304] 0 2026-03-10T12:38:27.072 INFO:tasks.workunit.client.1.vm07.stdout:9/968: truncate d5/d1f/fd3 269108 0 2026-03-10T12:38:27.074 INFO:tasks.workunit.client.1.vm07.stdout:5/894: symlink d0/l133 0 2026-03-10T12:38:27.075 INFO:tasks.workunit.client.1.vm07.stdout:9/969: mknod d5/d16/d23/d26/d68/c149 0 2026-03-10T12:38:27.076 INFO:tasks.workunit.client.1.vm07.stdout:9/970: chown d5/d13/d2c/de6/d76/f84 0 1 2026-03-10T12:38:27.078 INFO:tasks.workunit.client.1.vm07.stdout:5/895: creat d0/d22/d18/d3e/d53/d9e/f134 x:0 0 0 2026-03-10T12:38:27.083 INFO:tasks.workunit.client.1.vm07.stdout:3/935: dread dc/dd/d28/d3b/f70 [0,4194304] 0 2026-03-10T12:38:27.084 INFO:tasks.workunit.client.1.vm07.stdout:3/936: chown dc/dd/d28/d3b/f4c 116 1 2026-03-10T12:38:27.086 INFO:tasks.workunit.client.1.vm07.stdout:7/857: creat d0/d57/d62/d90/f118 x:0 0 0 2026-03-10T12:38:27.088 INFO:tasks.workunit.client.1.vm07.stdout:4/989: link d0/d4/d10/d3c/d2b/d54/de1/cca d0/d4/d10/d3c/d2b/c160 0 2026-03-10T12:38:27.089 INFO:tasks.workunit.client.1.vm07.stdout:7/858: truncate d0/f27 3970966 0 2026-03-10T12:38:27.089 INFO:tasks.workunit.client.1.vm07.stdout:4/990: mknod d0/d4/df2/df6/d46/d76/c161 0 2026-03-10T12:38:27.090 INFO:tasks.workunit.client.1.vm07.stdout:4/991: chown d0/d4/d5/da/f15 3978288 1 2026-03-10T12:38:27.093 
INFO:tasks.workunit.client.1.vm07.stdout:4/992: creat d0/d4/d10/d3c/d2b/f162 x:0 0 0 2026-03-10T12:38:27.094 INFO:tasks.workunit.client.1.vm07.stdout:4/993: unlink d0/d4/d10/d5f/fb6 0 2026-03-10T12:38:27.095 INFO:tasks.workunit.client.1.vm07.stdout:4/994: readlink d0/d4/d10/d5f/l8b 0 2026-03-10T12:38:27.102 INFO:tasks.workunit.client.1.vm07.stdout:1/892: dwrite d9/d2d/d80/fdf [0,4194304] 0 2026-03-10T12:38:27.102 INFO:tasks.workunit.client.1.vm07.stdout:1/893: chown d9/df/d29/d6b 3769946 1 2026-03-10T12:38:27.104 INFO:tasks.workunit.client.1.vm07.stdout:1/894: symlink d9/d2d/de2/l128 0 2026-03-10T12:38:27.110 INFO:tasks.workunit.client.1.vm07.stdout:1/895: unlink d9/df/d29/d2b/db1/fdc 0 2026-03-10T12:38:27.113 INFO:tasks.workunit.client.1.vm07.stdout:3/937: unlink dc/dd/l14 0 2026-03-10T12:38:27.118 INFO:tasks.workunit.client.1.vm07.stdout:8/864: write d1/d3/d40/fd1 [1291485,42384] 0 2026-03-10T12:38:27.119 INFO:tasks.workunit.client.1.vm07.stdout:8/865: truncate d1/d3/d6c/f9b 4973151 0 2026-03-10T12:38:27.123 INFO:tasks.workunit.client.1.vm07.stdout:2/799: truncate d0/d5b/d98/fe9 1924744 0 2026-03-10T12:38:27.126 INFO:tasks.workunit.client.1.vm07.stdout:9/971: rename d5/fda to d5/d13/d9d/df2/d141/f14a 0 2026-03-10T12:38:27.128 INFO:tasks.workunit.client.1.vm07.stdout:6/865: dwrite d1/f3d [0,4194304] 0 2026-03-10T12:38:27.130 INFO:tasks.workunit.client.1.vm07.stdout:8/866: rename d1/d3/d40/l109 to d1/d3/d40/d92/db6/l117 0 2026-03-10T12:38:27.147 INFO:tasks.workunit.client.1.vm07.stdout:2/800: dwrite d0/d42/d1f/fbf [0,4194304] 0 2026-03-10T12:38:27.147 INFO:tasks.workunit.client.1.vm07.stdout:6/866: dread - d1/d4/d6/d43/d88/d97/ff4 zero size 2026-03-10T12:38:27.147 INFO:tasks.workunit.client.1.vm07.stdout:6/867: fsync d1/d4/d6/d43/d88/dc3/ff2 0 2026-03-10T12:38:27.147 INFO:tasks.workunit.client.1.vm07.stdout:2/801: mkdir d0/de1/d111 0 2026-03-10T12:38:27.147 INFO:tasks.workunit.client.1.vm07.stdout:2/802: write d0/d42/d26/f5a [1093410,131044] 0 
2026-03-10T12:38:27.152 INFO:tasks.workunit.client.1.vm07.stdout:8/867: rename d1/f107 to d1/d3/d11/d87/f118 0 2026-03-10T12:38:27.158 INFO:tasks.workunit.client.1.vm07.stdout:8/868: dwrite d1/d3/d40/d92/fed [0,4194304] 0 2026-03-10T12:38:27.179 INFO:tasks.workunit.client.1.vm07.stdout:8/869: chown d1/d3/d40/f110 2769 1 2026-03-10T12:38:27.179 INFO:tasks.workunit.client.1.vm07.stdout:2/803: rename d0/d29/d64/d6c/d94/fb8 to d0/d29/d64/d6c/f112 0 2026-03-10T12:38:27.179 INFO:tasks.workunit.client.1.vm07.stdout:6/868: creat d1/d4/f121 x:0 0 0 2026-03-10T12:38:27.179 INFO:tasks.workunit.client.1.vm07.stdout:6/869: write d1/d4/d6/d16/d49/f114 [259862,23638] 0 2026-03-10T12:38:27.179 INFO:tasks.workunit.client.1.vm07.stdout:8/870: dread d1/d3/db2/dcd/f7c [0,4194304] 0 2026-03-10T12:38:27.179 INFO:tasks.workunit.client.1.vm07.stdout:2/804: symlink d0/de1/l113 0 2026-03-10T12:38:27.179 INFO:tasks.workunit.client.1.vm07.stdout:6/870: creat d1/d4/d6/d43/d65/f122 x:0 0 0 2026-03-10T12:38:27.179 INFO:tasks.workunit.client.1.vm07.stdout:2/805: chown d0/d29/d64/d74/d88 175406010 1 2026-03-10T12:38:27.179 INFO:tasks.workunit.client.1.vm07.stdout:6/871: mkdir d1/d4/d6/d46/d4d/d123 0 2026-03-10T12:38:27.179 INFO:tasks.workunit.client.1.vm07.stdout:2/806: rename d0/d42/d26/d38/d4f to d0/d29/d64/db5/dbb/d114 0 2026-03-10T12:38:27.179 INFO:tasks.workunit.client.1.vm07.stdout:6/872: creat d1/d4/d6/d4e/f124 x:0 0 0 2026-03-10T12:38:27.179 INFO:tasks.workunit.client.1.vm07.stdout:9/972: dread d5/d13/d6c/d7a/f94 [0,4194304] 0 2026-03-10T12:38:27.185 INFO:tasks.workunit.client.1.vm07.stdout:6/873: creat d1/d4/d6/d43/f125 x:0 0 0 2026-03-10T12:38:27.187 INFO:tasks.workunit.client.1.vm07.stdout:9/973: truncate d5/d13/d6c/d89/f113 443646 0 2026-03-10T12:38:27.189 INFO:tasks.workunit.client.1.vm07.stdout:6/874: rmdir d1/d4/d6/d96 39 2026-03-10T12:38:27.193 INFO:tasks.workunit.client.1.vm07.stdout:9/974: dread d5/d13/d6c/fb6 [0,4194304] 0 2026-03-10T12:38:27.218 
INFO:tasks.workunit.client.1.vm07.stdout:9/975: fdatasync d5/f8 0 2026-03-10T12:38:27.218 INFO:tasks.workunit.client.1.vm07.stdout:1/896: dread d9/d2d/d4f/d75/de3/ff5 [0,4194304] 0 2026-03-10T12:38:27.235 INFO:tasks.workunit.client.1.vm07.stdout:8/871: sync 2026-03-10T12:38:27.239 INFO:tasks.workunit.client.1.vm07.stdout:8/872: creat d1/d3/d40/d92/db6/f119 x:0 0 0 2026-03-10T12:38:27.246 INFO:tasks.workunit.client.0.vm00.stdout:6/880: dwrite d2/da/dc/d2f/fdc [0,4194304] 0 2026-03-10T12:38:27.248 INFO:tasks.workunit.client.0.vm00.stdout:6/881: dread - d2/da/f6a zero size 2026-03-10T12:38:27.248 INFO:tasks.workunit.client.1.vm07.stdout:5/896: write d0/f1f [8111247,24509] 0 2026-03-10T12:38:27.248 INFO:tasks.workunit.client.0.vm00.stdout:6/882: readlink d2/d16/d29/d31/d88/d92/lff 0 2026-03-10T12:38:27.249 INFO:tasks.workunit.client.1.vm07.stdout:5/897: stat d0/d22/d18/d3e/d5d/db6/fe4 0 2026-03-10T12:38:27.258 INFO:tasks.workunit.client.1.vm07.stdout:7/859: write d0/d57/ff3 [3189897,40589] 0 2026-03-10T12:38:27.259 INFO:tasks.workunit.client.1.vm07.stdout:4/995: write d0/d4/d5/da/f6e [1091350,11081] 0 2026-03-10T12:38:27.266 INFO:tasks.workunit.client.0.vm00.stdout:6/883: rename d2/l139 to d2/d9f/dce/l13c 0 2026-03-10T12:38:27.273 INFO:tasks.workunit.client.1.vm07.stdout:7/860: creat d0/d47/dde/df5/f119 x:0 0 0 2026-03-10T12:38:27.274 INFO:tasks.workunit.client.0.vm00.stdout:6/884: creat d2/d42/d80/f13d x:0 0 0 2026-03-10T12:38:27.274 INFO:tasks.workunit.client.1.vm07.stdout:5/898: rename d0/d22/d18/d19/d2e/d67/ff4 to d0/d22/d18/d19/f135 0 2026-03-10T12:38:27.276 INFO:tasks.workunit.client.1.vm07.stdout:7/861: creat d0/d61/d115/f11a x:0 0 0 2026-03-10T12:38:27.276 INFO:tasks.workunit.client.1.vm07.stdout:7/862: dread - d0/d47/dde/df5/f119 zero size 2026-03-10T12:38:27.277 INFO:tasks.workunit.client.1.vm07.stdout:7/863: truncate d0/f4f 4865596 0 2026-03-10T12:38:27.278 INFO:tasks.workunit.client.1.vm07.stdout:5/899: symlink d0/d22/d18/d19/de5/l136 0 
2026-03-10T12:38:27.279 INFO:tasks.workunit.client.1.vm07.stdout:5/900: write d0/f47 [1457841,122281] 0 2026-03-10T12:38:27.279 INFO:tasks.workunit.client.1.vm07.stdout:5/901: write d0/d22/d18/d19/d21/fd4 [5163859,40214] 0 2026-03-10T12:38:27.285 INFO:tasks.workunit.client.1.vm07.stdout:4/996: dread d0/d4/d10/d3c/d2b/f60 [0,4194304] 0 2026-03-10T12:38:27.288 INFO:tasks.workunit.client.1.vm07.stdout:5/902: dread d0/d22/d18/d19/d36/d75/d77/fd7 [0,4194304] 0 2026-03-10T12:38:27.291 INFO:tasks.workunit.client.1.vm07.stdout:4/997: symlink d0/d4/d5/da/d95/l163 0 2026-03-10T12:38:27.293 INFO:tasks.workunit.client.1.vm07.stdout:3/938: write dc/dd/fb7 [137032,80314] 0 2026-03-10T12:38:27.301 INFO:tasks.workunit.client.1.vm07.stdout:4/998: mknod d0/d4/d10/d114/c164 0 2026-03-10T12:38:27.302 INFO:tasks.workunit.client.1.vm07.stdout:4/999: readlink d0/d4/d10/d3c/l79 0 2026-03-10T12:38:27.304 INFO:tasks.workunit.client.1.vm07.stdout:5/903: mknod d0/d22/d18/d3e/d5d/d12c/c137 0 2026-03-10T12:38:27.307 INFO:tasks.workunit.client.1.vm07.stdout:3/939: dread dc/d18/d24/f3a [0,4194304] 0 2026-03-10T12:38:27.312 INFO:tasks.workunit.client.1.vm07.stdout:5/904: creat d0/d22/d18/d19/d72/dcc/f138 x:0 0 0 2026-03-10T12:38:27.314 INFO:tasks.workunit.client.1.vm07.stdout:3/940: dread dc/dd/d28/d3b/fa5 [0,4194304] 0 2026-03-10T12:38:27.322 INFO:tasks.workunit.client.1.vm07.stdout:5/905: dread d0/d22/d18/d3e/d53/d9e/f76 [0,4194304] 0 2026-03-10T12:38:27.322 INFO:tasks.workunit.client.1.vm07.stdout:5/906: write d0/d22/f50 [7924195,50046] 0 2026-03-10T12:38:27.324 INFO:tasks.workunit.client.1.vm07.stdout:2/807: truncate d0/d42/d4e/dab/fd5 689414 0 2026-03-10T12:38:27.326 INFO:tasks.workunit.client.1.vm07.stdout:5/907: rmdir d0/d22/dbc 39 2026-03-10T12:38:27.326 INFO:tasks.workunit.client.1.vm07.stdout:3/941: sync 2026-03-10T12:38:27.330 INFO:tasks.workunit.client.1.vm07.stdout:5/908: dwrite d0/d22/d18/d19/d21/fd4 [4194304,4194304] 0 2026-03-10T12:38:27.335 
INFO:tasks.workunit.client.1.vm07.stdout:6/875: dwrite d1/d4/d6/d16/fbc [0,4194304] 0 2026-03-10T12:38:27.337 INFO:tasks.workunit.client.1.vm07.stdout:6/876: chown d1/d4/d6/d16/d1a/d9d 638399189 1 2026-03-10T12:38:27.352 INFO:tasks.workunit.client.1.vm07.stdout:1/897: write d9/df/d55/d9f/fb3 [1935720,82848] 0 2026-03-10T12:38:27.353 INFO:tasks.workunit.client.1.vm07.stdout:1/898: chown d9/df/d55/d9f 15229297 1 2026-03-10T12:38:27.355 INFO:tasks.workunit.client.1.vm07.stdout:9/976: dwrite d5/d13/f67 [0,4194304] 0 2026-03-10T12:38:27.357 INFO:tasks.workunit.client.1.vm07.stdout:5/909: rmdir d0/d22/d18/d19/d36/d75/d77 39 2026-03-10T12:38:27.360 INFO:tasks.workunit.client.1.vm07.stdout:8/873: dwrite d1/d3/d6/d54/f72 [0,4194304] 0 2026-03-10T12:38:27.365 INFO:tasks.workunit.client.0.vm00.stdout:6/885: dwrite d2/d16/d74/f62 [0,4194304] 0 2026-03-10T12:38:27.366 INFO:tasks.workunit.client.1.vm07.stdout:6/877: truncate d1/fcf 641004 0 2026-03-10T12:38:27.367 INFO:tasks.workunit.client.1.vm07.stdout:8/874: dwrite d1/d3/f8 [0,4194304] 0 2026-03-10T12:38:27.371 INFO:tasks.workunit.client.1.vm07.stdout:2/808: mkdir d0/d29/d64/d74/df4/d115 0 2026-03-10T12:38:27.376 INFO:tasks.workunit.client.1.vm07.stdout:3/942: creat dc/dd/d1f/dc7/dc9/d116/d11a/f13d x:0 0 0 2026-03-10T12:38:27.376 INFO:tasks.workunit.client.1.vm07.stdout:5/910: stat d0/d22/d18/d19/d21/d54/dcb/db8/dec/lfa 0 2026-03-10T12:38:27.377 INFO:tasks.workunit.client.1.vm07.stdout:6/878: sync 2026-03-10T12:38:27.383 INFO:tasks.workunit.client.1.vm07.stdout:2/809: creat d0/d29/d64/db5/dbb/d114/dad/ddd/f116 x:0 0 0 2026-03-10T12:38:27.392 INFO:tasks.workunit.client.1.vm07.stdout:1/899: symlink d9/df/d29/d2b/d31/l129 0 2026-03-10T12:38:27.392 INFO:tasks.workunit.client.1.vm07.stdout:1/900: chown d9/df/d29/d2b/d92/d9d 67392 1 2026-03-10T12:38:27.393 INFO:tasks.workunit.client.1.vm07.stdout:8/875: dread d1/d3/d6c/f74 [0,4194304] 0 2026-03-10T12:38:27.395 INFO:tasks.workunit.client.1.vm07.stdout:7/864: write d0/f27 
[2050311,115882] 0 2026-03-10T12:38:27.396 INFO:tasks.workunit.client.1.vm07.stdout:6/879: dread - d1/dd7/fb5 zero size 2026-03-10T12:38:27.398 INFO:tasks.workunit.client.1.vm07.stdout:2/810: read d0/d42/d1f/d20/fa9 [3440712,85590] 0 2026-03-10T12:38:27.398 INFO:tasks.workunit.client.1.vm07.stdout:1/901: readlink d9/df/d29/d2b/d31/l53 0 2026-03-10T12:38:27.400 INFO:tasks.workunit.client.1.vm07.stdout:5/911: mknod d0/d22/d18/d19/d21/d54/dcb/c139 0 2026-03-10T12:38:27.409 INFO:tasks.workunit.client.1.vm07.stdout:6/880: fsync d1/d4/f3b 0 2026-03-10T12:38:27.409 INFO:tasks.workunit.client.1.vm07.stdout:9/977: rename d5/d16/f10c to d5/d13/f14b 0 2026-03-10T12:38:27.409 INFO:tasks.workunit.client.1.vm07.stdout:2/811: chown d0/d29/d64/db5/dbb/d114/d5d/l7c 1 1 2026-03-10T12:38:27.409 INFO:tasks.workunit.client.1.vm07.stdout:1/902: truncate d9/df/d29/d6b/fcc 147559 0 2026-03-10T12:38:27.409 INFO:tasks.workunit.client.1.vm07.stdout:2/812: dwrite d0/f9c [0,4194304] 0 2026-03-10T12:38:27.421 INFO:tasks.workunit.client.1.vm07.stdout:8/876: mkdir d1/d3/d11/d11a 0 2026-03-10T12:38:27.423 INFO:tasks.workunit.client.1.vm07.stdout:9/978: stat d5/d16/d23/d26/d68/fa0 0 2026-03-10T12:38:27.425 INFO:tasks.workunit.client.1.vm07.stdout:2/813: creat d0/d29/d64/d74/f117 x:0 0 0 2026-03-10T12:38:27.431 INFO:tasks.workunit.client.1.vm07.stdout:5/912: symlink d0/d22/l13a 0 2026-03-10T12:38:27.438 INFO:tasks.workunit.client.1.vm07.stdout:2/814: dread d0/d42/f22 [0,4194304] 0 2026-03-10T12:38:27.439 INFO:tasks.workunit.client.1.vm07.stdout:6/881: mknod d1/d4/d6/d46/d4d/d107/c126 0 2026-03-10T12:38:27.439 INFO:tasks.workunit.client.1.vm07.stdout:9/979: dread - d5/d13/d2c/de6/dce/f10e zero size 2026-03-10T12:38:27.439 INFO:tasks.workunit.client.1.vm07.stdout:2/815: creat d0/d29/d64/db5/f118 x:0 0 0 2026-03-10T12:38:27.440 INFO:tasks.workunit.client.1.vm07.stdout:6/882: dwrite d1/d4/d6/d46/d4d/dc7/f109 [0,4194304] 0 2026-03-10T12:38:27.442 INFO:tasks.workunit.client.1.vm07.stdout:7/865: sync 
2026-03-10T12:38:27.446 INFO:tasks.workunit.client.1.vm07.stdout:6/883: dread d1/d4/d6/d16/d1a/d99/fa8 [4194304,4194304] 0 2026-03-10T12:38:27.458 INFO:tasks.workunit.client.1.vm07.stdout:9/980: rename d5/d16/dd7/f135 to d5/d13/d9d/df2/df4/f14c 0 2026-03-10T12:38:27.459 INFO:tasks.workunit.client.1.vm07.stdout:1/903: sync 2026-03-10T12:38:27.459 INFO:tasks.workunit.client.1.vm07.stdout:2/816: sync 2026-03-10T12:38:27.460 INFO:tasks.workunit.client.1.vm07.stdout:8/877: sync 2026-03-10T12:38:27.462 INFO:tasks.workunit.client.1.vm07.stdout:6/884: creat d1/d4/d6/d46/d4d/d107/f127 x:0 0 0 2026-03-10T12:38:27.463 INFO:tasks.workunit.client.1.vm07.stdout:6/885: truncate d1/d4/d6/d16/d49/f11a 5051921 0 2026-03-10T12:38:27.466 INFO:tasks.workunit.client.1.vm07.stdout:6/886: dread d1/d4/d6/d16/d1a/d2c/f59 [4194304,4194304] 0 2026-03-10T12:38:27.472 INFO:tasks.workunit.client.1.vm07.stdout:8/878: creat d1/d3/d6/d54/f11b x:0 0 0 2026-03-10T12:38:27.474 INFO:tasks.workunit.client.1.vm07.stdout:6/887: symlink d1/d4/d9b/l128 0 2026-03-10T12:38:27.479 INFO:tasks.workunit.client.1.vm07.stdout:5/913: dread d0/d22/d18/d3e/d53/faa [0,4194304] 0 2026-03-10T12:38:27.483 INFO:tasks.workunit.client.1.vm07.stdout:1/904: mkdir d9/d2d/d80/d8e/dc7/d12a 0 2026-03-10T12:38:27.483 INFO:tasks.workunit.client.1.vm07.stdout:2/817: mkdir d0/de1/d111/d119 0 2026-03-10T12:38:27.484 INFO:tasks.workunit.client.1.vm07.stdout:1/905: dread d9/fd [0,4194304] 0 2026-03-10T12:38:27.485 INFO:tasks.workunit.client.1.vm07.stdout:2/818: truncate d0/d29/d64/d6c/d94/fa7 7942708 0 2026-03-10T12:38:27.486 INFO:tasks.workunit.client.1.vm07.stdout:5/914: sync 2026-03-10T12:38:27.487 INFO:tasks.workunit.client.1.vm07.stdout:8/879: mkdir d1/d3/d40/d92/dba/d11c 0 2026-03-10T12:38:27.490 INFO:tasks.workunit.client.1.vm07.stdout:6/888: creat d1/d4/d6/d16/d1a/d99/df5/f129 x:0 0 0 2026-03-10T12:38:27.493 INFO:tasks.workunit.client.1.vm07.stdout:8/880: dwrite d1/d3/d40/f112 [0,4194304] 0 2026-03-10T12:38:27.493 
INFO:tasks.workunit.client.1.vm07.stdout:1/906: write d9/d2d/d4f/d75/de3/ff5 [3831591,20441] 0 2026-03-10T12:38:27.497 INFO:tasks.workunit.client.1.vm07.stdout:8/881: chown d1/d3/d11/f35 60923 1 2026-03-10T12:38:27.498 INFO:tasks.workunit.client.1.vm07.stdout:1/907: write d9/df/d29/d2b/d31/d91/d59/fa4 [3226130,66185] 0 2026-03-10T12:38:27.499 INFO:tasks.workunit.client.1.vm07.stdout:8/882: sync 2026-03-10T12:38:27.508 INFO:tasks.workunit.client.1.vm07.stdout:5/915: dread d0/f9 [0,4194304] 0 2026-03-10T12:38:27.509 INFO:tasks.workunit.client.1.vm07.stdout:2/819: mknod d0/d29/d64/db5/dbb/dca/c11a 0 2026-03-10T12:38:27.518 INFO:tasks.workunit.client.1.vm07.stdout:3/943: dwrite dc/dd/d43/d5c/fd6 [0,4194304] 0 2026-03-10T12:38:27.520 INFO:tasks.workunit.client.1.vm07.stdout:1/908: creat d9/df/d29/d2b/db1/f12b x:0 0 0 2026-03-10T12:38:27.520 INFO:tasks.workunit.client.1.vm07.stdout:8/883: truncate d1/d3/d11/d87/fa2 562122 0 2026-03-10T12:38:27.535 INFO:tasks.workunit.client.1.vm07.stdout:1/909: mknod d9/df/d29/d2b/d31/d91/c12c 0 2026-03-10T12:38:27.536 INFO:tasks.workunit.client.1.vm07.stdout:8/884: mkdir d1/d3/d40/d92/db6/d11d 0 2026-03-10T12:38:27.539 INFO:tasks.workunit.client.1.vm07.stdout:6/889: rename d1/d4/d6/d4e/d64/cc6 to d1/d4/d6/c12a 0 2026-03-10T12:38:27.541 INFO:tasks.workunit.client.1.vm07.stdout:8/885: fsync d1/d3/d40/f112 0 2026-03-10T12:38:27.543 INFO:tasks.workunit.client.1.vm07.stdout:1/910: truncate d9/df/d29/d2b/d31/d91/d59/f84 283070 0 2026-03-10T12:38:27.546 INFO:tasks.workunit.client.1.vm07.stdout:1/911: dwrite d9/df/d29/d2b/d92/d9d/f105 [0,4194304] 0 2026-03-10T12:38:27.548 INFO:tasks.workunit.client.1.vm07.stdout:9/981: write d5/d69/d93/d97/fa2 [960076,23402] 0 2026-03-10T12:38:27.548 INFO:tasks.workunit.client.1.vm07.stdout:7/866: write d0/d47/f59 [3386254,46495] 0 2026-03-10T12:38:27.555 INFO:tasks.workunit.client.1.vm07.stdout:3/944: link dc/fc0 dc/d18/d99/f13e 0 2026-03-10T12:38:27.559 INFO:tasks.workunit.client.1.vm07.stdout:9/982: creat 
d5/d1f/d5e/d10a/f14d x:0 0 0 2026-03-10T12:38:27.561 INFO:tasks.workunit.client.1.vm07.stdout:7/867: sync 2026-03-10T12:38:27.562 INFO:tasks.workunit.client.1.vm07.stdout:5/916: dwrite d0/d22/d18/d19/d21/d3a/fa2 [0,4194304] 0 2026-03-10T12:38:27.573 INFO:tasks.workunit.client.1.vm07.stdout:8/886: unlink d1/d3/d11/l84 0 2026-03-10T12:38:27.573 INFO:tasks.workunit.client.1.vm07.stdout:3/945: creat dc/dd/d28/dd0/f13f x:0 0 0 2026-03-10T12:38:27.574 INFO:tasks.workunit.client.1.vm07.stdout:2/820: write d0/d42/d1f/d20/fdb [112018,62546] 0 2026-03-10T12:38:27.574 INFO:tasks.workunit.client.1.vm07.stdout:3/946: write dc/dd/d28/dd0/fdb [4494161,74350] 0 2026-03-10T12:38:27.580 INFO:tasks.workunit.client.1.vm07.stdout:3/947: mknod dc/dd/d1f/d45/dbf/c140 0 2026-03-10T12:38:27.586 INFO:tasks.workunit.client.1.vm07.stdout:2/821: creat d0/d29/d64/db5/dbb/d114/dad/ddd/f11b x:0 0 0 2026-03-10T12:38:27.586 INFO:tasks.workunit.client.1.vm07.stdout:7/868: link d0/d61/db4/cbb d0/d47/da0/c11b 0 2026-03-10T12:38:27.586 INFO:tasks.workunit.client.1.vm07.stdout:7/869: dread - d0/d61/db4/d8a/d9d/ffc zero size 2026-03-10T12:38:27.586 INFO:tasks.workunit.client.1.vm07.stdout:3/948: read dc/dd/d28/d7a/fab [934816,47497] 0 2026-03-10T12:38:27.589 INFO:tasks.workunit.client.1.vm07.stdout:5/917: link d0/d22/d18/fb4 d0/d22/d18/d19/d36/d75/f13b 0 2026-03-10T12:38:27.592 INFO:tasks.workunit.client.1.vm07.stdout:2/822: creat d0/d29/d64/db5/dbb/f11c x:0 0 0 2026-03-10T12:38:27.593 INFO:tasks.workunit.client.1.vm07.stdout:8/887: read d1/d3/d6/f81 [77424,37016] 0 2026-03-10T12:38:27.595 INFO:tasks.workunit.client.1.vm07.stdout:7/870: rename d0/d57/d108 to d0/d67/d11c 0 2026-03-10T12:38:27.595 INFO:tasks.workunit.client.1.vm07.stdout:7/871: chown d0/l33 246789747 1 2026-03-10T12:38:27.599 INFO:tasks.workunit.client.1.vm07.stdout:5/918: symlink d0/d22/d18/d19/d36/d75/ddc/l13c 0 2026-03-10T12:38:27.606 INFO:tasks.workunit.client.1.vm07.stdout:7/872: creat d0/d57/dd6/d80/f11d x:0 0 0 
2026-03-10T12:38:27.611 INFO:tasks.workunit.client.1.vm07.stdout:3/949: link dc/dd/d1f/dc7/dc9/d116/d11a/f13d dc/dd/d1f/d6f/dcf/f141 0 2026-03-10T12:38:27.611 INFO:tasks.workunit.client.1.vm07.stdout:3/950: readlink dc/dd/d28/d3b/le4 0 2026-03-10T12:38:27.612 INFO:tasks.workunit.client.1.vm07.stdout:6/890: write d1/d4/d6/f60 [2910577,2892] 0 2026-03-10T12:38:27.617 INFO:tasks.workunit.client.0.vm00.stdout:6/886: link d2/da/dc/c60 d2/d51/c13e 0 2026-03-10T12:38:27.618 INFO:tasks.workunit.client.1.vm07.stdout:2/823: rename d0/d29/d64/db5/dbb/d114/dad/l101 to d0/d5b/l11d 0 2026-03-10T12:38:27.618 INFO:tasks.workunit.client.0.vm00.stdout:6/887: chown d2/d16/d74/f5a 1047561209 1 2026-03-10T12:38:27.619 INFO:tasks.workunit.client.1.vm07.stdout:1/912: write d9/d2d/d4f/d75/fda [382680,71674] 0 2026-03-10T12:38:27.627 INFO:tasks.workunit.client.1.vm07.stdout:5/919: symlink d0/d22/d18/d19/l13d 0 2026-03-10T12:38:27.628 INFO:tasks.workunit.client.1.vm07.stdout:1/913: dread d9/df/f4a [0,4194304] 0 2026-03-10T12:38:27.628 INFO:tasks.workunit.client.1.vm07.stdout:5/920: chown d0/d22/d18/fb4 2 1 2026-03-10T12:38:27.638 INFO:tasks.workunit.client.1.vm07.stdout:5/921: dread d0/d22/d18/d19/d2e/da9/f103 [0,4194304] 0 2026-03-10T12:38:27.642 INFO:tasks.workunit.client.1.vm07.stdout:2/824: symlink d0/dcd/l11e 0 2026-03-10T12:38:27.642 INFO:tasks.workunit.client.1.vm07.stdout:9/983: write d5/d69/d93/d97/fc3 [359592,118372] 0 2026-03-10T12:38:27.649 INFO:tasks.workunit.client.1.vm07.stdout:3/951: rename dc/dd/d28/c111 to dc/dd/d1f/dac/de6/c142 0 2026-03-10T12:38:27.650 INFO:tasks.workunit.client.1.vm07.stdout:3/952: readlink dc/d18/d24/ldf 0 2026-03-10T12:38:27.652 INFO:tasks.workunit.client.1.vm07.stdout:5/922: chown d0/d22/d18/d19/d2e/d67/lc9 87848 1 2026-03-10T12:38:27.664 INFO:tasks.workunit.client.1.vm07.stdout:8/888: write d1/d3/d6c/fce [2059630,65621] 0 2026-03-10T12:38:27.678 INFO:tasks.workunit.client.1.vm07.stdout:6/891: rmdir d1/d4/d6/d46/d4d/d123 0 2026-03-10T12:38:27.681 
INFO:tasks.workunit.client.1.vm07.stdout:6/892: dwrite d1/d4/d6/d4e/fa1 [0,4194304] 0 2026-03-10T12:38:27.688 INFO:tasks.workunit.client.1.vm07.stdout:3/953: creat dc/dd/d28/d3b/f143 x:0 0 0 2026-03-10T12:38:27.691 INFO:tasks.workunit.client.0.vm00.stdout:6/888: write d2/d16/d74/f101 [191035,53197] 0 2026-03-10T12:38:27.697 INFO:tasks.workunit.client.0.vm00.stdout:6/889: symlink d2/da/dbf/ded/d118/l13f 0 2026-03-10T12:38:27.697 INFO:tasks.workunit.client.0.vm00.stdout:6/890: chown d2/d42/d103 6333005 1 2026-03-10T12:38:27.698 INFO:tasks.workunit.client.1.vm07.stdout:7/873: getdents d0/d61/d115 0 2026-03-10T12:38:27.699 INFO:tasks.workunit.client.1.vm07.stdout:2/825: dwrite d0/d42/d1f/d90/fb2 [0,4194304] 0 2026-03-10T12:38:27.699 INFO:tasks.workunit.client.1.vm07.stdout:7/874: stat d0/d61/db4/d8a/d9d/ce4 0 2026-03-10T12:38:27.703 INFO:tasks.workunit.client.1.vm07.stdout:2/826: dwrite d0/d42/d1f/f10e [0,4194304] 0 2026-03-10T12:38:27.715 INFO:tasks.workunit.client.1.vm07.stdout:3/954: rmdir dc/d18/de2 39 2026-03-10T12:38:27.715 INFO:tasks.workunit.client.1.vm07.stdout:3/955: fsync dc/d18/d2d/f71 0 2026-03-10T12:38:27.721 INFO:tasks.workunit.client.0.vm00.stdout:6/891: truncate d2/d42/fd4 715911 0 2026-03-10T12:38:27.731 INFO:tasks.workunit.client.0.vm00.stdout:6/892: write d2/d16/d29/d31/d88/dd5/fe8 [5030025,61325] 0 2026-03-10T12:38:27.731 INFO:tasks.workunit.client.1.vm07.stdout:9/984: write d5/d69/d93/d97/fe3 [4662859,102378] 0 2026-03-10T12:38:27.731 INFO:tasks.workunit.client.1.vm07.stdout:9/985: chown d5/d16/d23/d26/d68 13 1 2026-03-10T12:38:27.732 INFO:tasks.workunit.client.1.vm07.stdout:3/956: fdatasync dc/dd/d43/d5c/f101 0 2026-03-10T12:38:27.734 INFO:tasks.workunit.client.1.vm07.stdout:9/986: mknod d5/d1f/d7d/c14e 0 2026-03-10T12:38:27.734 INFO:tasks.workunit.client.1.vm07.stdout:5/923: getdents d0/d22/d18/d19/d21/d54/dcb 0 2026-03-10T12:38:27.736 INFO:tasks.workunit.client.1.vm07.stdout:2/827: link d0/d5b/d98/fee d0/de1/f11f 0 2026-03-10T12:38:27.736 
INFO:tasks.workunit.client.1.vm07.stdout:2/828: chown d0/d42/d26/d7d/fea 126869 1 2026-03-10T12:38:27.738 INFO:tasks.workunit.client.1.vm07.stdout:9/987: creat d5/d16/da3/f14f x:0 0 0 2026-03-10T12:38:27.739 INFO:tasks.workunit.client.1.vm07.stdout:9/988: dread - d5/d13/d2c/de6/d76/f84 zero size 2026-03-10T12:38:27.740 INFO:tasks.workunit.client.1.vm07.stdout:1/914: truncate d9/d2d/d4f/d5a/f65 211810 0 2026-03-10T12:38:27.742 INFO:tasks.workunit.client.1.vm07.stdout:2/829: creat d0/d29/d64/d74/d88/f120 x:0 0 0 2026-03-10T12:38:27.749 INFO:tasks.workunit.client.1.vm07.stdout:2/830: rename d0/d42/d1f to d0/d42/d1f/d121 22 2026-03-10T12:38:27.749 INFO:tasks.workunit.client.1.vm07.stdout:5/924: unlink d0/ff 0 2026-03-10T12:38:27.749 INFO:tasks.workunit.client.1.vm07.stdout:5/925: chown d0/d22/d18/d19/d21/d3a/f85 873 1 2026-03-10T12:38:27.749 INFO:tasks.workunit.client.1.vm07.stdout:2/831: chown d0/d5b/l11d 0 1 2026-03-10T12:38:27.749 INFO:tasks.workunit.client.1.vm07.stdout:3/957: getdents dc/d18/d2d 0 2026-03-10T12:38:27.749 INFO:tasks.workunit.client.1.vm07.stdout:2/832: rename d0/d42/d4e/dab to d0/d42/d26/d7d/d122 0 2026-03-10T12:38:27.751 INFO:tasks.workunit.client.1.vm07.stdout:3/958: rename dc/dd/d1f/d45/dbf to dc/dd/d28/d7a/d144 0 2026-03-10T12:38:27.752 INFO:tasks.workunit.client.1.vm07.stdout:3/959: chown dc/d18/led 0 1 2026-03-10T12:38:27.754 INFO:tasks.workunit.client.1.vm07.stdout:2/833: rmdir d0/d29/d64/db5 39 2026-03-10T12:38:27.756 INFO:tasks.workunit.client.1.vm07.stdout:3/960: unlink dc/d18/cd5 0 2026-03-10T12:38:27.757 INFO:tasks.workunit.client.1.vm07.stdout:1/915: link d9/df/d29/d2b/c37 d9/df/d55/c12d 0 2026-03-10T12:38:27.758 INFO:tasks.workunit.client.1.vm07.stdout:3/961: creat dc/dd/d43/f145 x:0 0 0 2026-03-10T12:38:27.761 INFO:tasks.workunit.client.1.vm07.stdout:1/916: rename d9/df/d29/d2b/d31/d91/d59/l81 to d9/d2d/d80/d8e/dc7/l12e 0 2026-03-10T12:38:27.769 INFO:tasks.workunit.client.1.vm07.stdout:3/962: dwrite dc/dd/d1f/d6f/f13a [0,4194304] 0 
2026-03-10T12:38:27.769 INFO:tasks.workunit.client.1.vm07.stdout:3/963: dwrite dc/dd/d28/f46 [0,4194304] 0 2026-03-10T12:38:27.771 INFO:tasks.workunit.client.1.vm07.stdout:9/989: sync 2026-03-10T12:38:27.775 INFO:tasks.workunit.client.1.vm07.stdout:9/990: dwrite d5/d13/d6c/da4/d102/f126 [0,4194304] 0 2026-03-10T12:38:27.788 INFO:tasks.workunit.client.1.vm07.stdout:2/834: getdents d0/d5b 0 2026-03-10T12:38:27.789 INFO:tasks.workunit.client.1.vm07.stdout:3/964: creat dc/dd/d1f/f146 x:0 0 0 2026-03-10T12:38:27.791 INFO:tasks.workunit.client.1.vm07.stdout:9/991: rmdir d5/d13 39 2026-03-10T12:38:27.792 INFO:tasks.workunit.client.1.vm07.stdout:3/965: symlink dc/d18/d99/da3/l147 0 2026-03-10T12:38:27.793 INFO:tasks.workunit.client.1.vm07.stdout:9/992: chown d5/d13/d6c/d7a/ff8 99176987 1 2026-03-10T12:38:27.794 INFO:tasks.workunit.client.1.vm07.stdout:2/835: mknod d0/d29/d64/db5/dbb/d114/dad/ddd/c123 0 2026-03-10T12:38:27.794 INFO:tasks.workunit.client.1.vm07.stdout:3/966: symlink dc/dd/d1f/l148 0 2026-03-10T12:38:27.795 INFO:tasks.workunit.client.1.vm07.stdout:9/993: mkdir d5/d13/d6c/d7a/d150 0 2026-03-10T12:38:27.811 INFO:tasks.workunit.client.1.vm07.stdout:1/917: dread d9/df/d29/f8b [0,4194304] 0 2026-03-10T12:38:27.815 INFO:tasks.workunit.client.1.vm07.stdout:1/918: unlink d9/df/d29/d2b/d30/cba 0 2026-03-10T12:38:27.819 INFO:tasks.workunit.client.1.vm07.stdout:3/967: dread dc/d18/d24/f3e [0,4194304] 0 2026-03-10T12:38:27.819 INFO:tasks.workunit.client.1.vm07.stdout:9/994: dread d5/d16/d23/d26/f46 [0,4194304] 0 2026-03-10T12:38:27.819 INFO:tasks.workunit.client.1.vm07.stdout:9/995: chown d5/d16/dd7/c146 798368 1 2026-03-10T12:38:27.822 INFO:tasks.workunit.client.1.vm07.stdout:1/919: rename d9/df/d29/d2b/d31/d91/ce5 to d9/df/c12f 0 2026-03-10T12:38:27.828 INFO:tasks.workunit.client.1.vm07.stdout:9/996: dread d5/d69/d93/d97/fd9 [0,4194304] 0 2026-03-10T12:38:27.830 INFO:tasks.workunit.client.1.vm07.stdout:1/920: dread d9/df/d29/d2b/d30/fd0 [0,4194304] 0 
2026-03-10T12:38:27.830 INFO:tasks.workunit.client.1.vm07.stdout:9/997: chown d5/d1f/d5e/d6b/l6d 296132 1 2026-03-10T12:38:27.832 INFO:tasks.workunit.client.1.vm07.stdout:1/921: read d9/df/d29/d2b/d31/d91/fa9 [914042,92098] 0 2026-03-10T12:38:27.836 INFO:tasks.workunit.client.1.vm07.stdout:1/922: mkdir d9/ddb/d130 0 2026-03-10T12:38:27.837 INFO:tasks.workunit.client.1.vm07.stdout:1/923: creat d9/df/d29/d2b/d92/d123/f131 x:0 0 0 2026-03-10T12:38:27.842 INFO:tasks.workunit.client.1.vm07.stdout:1/924: dwrite d9/d2d/d4f/dde/fef [0,4194304] 0 2026-03-10T12:38:27.850 INFO:tasks.workunit.client.1.vm07.stdout:6/893: write d1/d4/f82 [1259607,31706] 0 2026-03-10T12:38:27.850 INFO:tasks.workunit.client.1.vm07.stdout:7/875: truncate d0/f5f 210879 0 2026-03-10T12:38:27.850 INFO:tasks.workunit.client.1.vm07.stdout:8/889: write d1/d3/f2d [3451213,82871] 0 2026-03-10T12:38:27.858 INFO:tasks.workunit.client.1.vm07.stdout:5/926: dwrite d0/d22/d18/d3e/d53/faa [0,4194304] 0 2026-03-10T12:38:27.859 INFO:tasks.workunit.client.1.vm07.stdout:5/927: truncate d0/d22/d18/d3e/d53/d9e/f134 769925 0 2026-03-10T12:38:27.872 INFO:tasks.workunit.client.1.vm07.stdout:6/894: rename d1/d4/d6/d46/d4d/d107/f127 to d1/d4/d6/d16/d1a/d9d/db2/f12b 0 2026-03-10T12:38:27.880 INFO:tasks.workunit.client.1.vm07.stdout:7/876: fdatasync d0/d52/fc7 0 2026-03-10T12:38:27.883 INFO:tasks.workunit.client.1.vm07.stdout:6/895: mkdir d1/d4/d44/d12c 0 2026-03-10T12:38:27.884 INFO:tasks.workunit.client.1.vm07.stdout:8/890: rename d1/d3/d40/fee to d1/d3/d18/f11e 0 2026-03-10T12:38:27.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:27 vm00.local ceph-mon[50686]: pgmap v10: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 41 MiB/s rd, 102 MiB/s wr, 269 op/s 2026-03-10T12:38:27.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:27 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd blocklist ls", "format": 
"json"}]: dispatch 2026-03-10T12:38:27.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:27 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:27.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:27 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:27.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:27 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:27.887 INFO:tasks.workunit.client.0.vm00.stdout:6/893: unlink d2/d51/d70/fab 0 2026-03-10T12:38:27.891 INFO:tasks.workunit.client.0.vm00.stdout:6/894: read - d2/d42/d9c/fe2 zero size 2026-03-10T12:38:27.896 INFO:tasks.workunit.client.1.vm07.stdout:2/836: write d0/d42/f2c [2121147,43667] 0 2026-03-10T12:38:27.901 INFO:tasks.workunit.client.1.vm07.stdout:3/968: write dc/d18/d99/da3/fd2 [728807,81946] 0 2026-03-10T12:38:27.901 INFO:tasks.workunit.client.1.vm07.stdout:9/998: dwrite d5/d13/d6c/fdf [0,4194304] 0 2026-03-10T12:38:27.905 INFO:tasks.workunit.client.1.vm07.stdout:1/925: dread d9/d2d/d4f/f95 [0,4194304] 0 2026-03-10T12:38:27.913 INFO:tasks.workunit.client.0.vm00.stdout:6/895: mknod d2/d42/d80/c140 0 2026-03-10T12:38:27.917 INFO:tasks.workunit.client.1.vm07.stdout:7/877: dread - d0/d57/dd6/d80/fc3 zero size 2026-03-10T12:38:27.923 INFO:tasks.workunit.client.1.vm07.stdout:8/891: mknod d1/d3/db2/dcd/db8/c11f 0 2026-03-10T12:38:27.926 INFO:tasks.workunit.client.1.vm07.stdout:2/837: mkdir d0/de1/d111/d124 0 2026-03-10T12:38:27.929 INFO:tasks.workunit.client.1.vm07.stdout:2/838: write d0/d42/d4e/d77/f89 [2440513,13330] 0 2026-03-10T12:38:27.931 INFO:tasks.workunit.client.0.vm00.stdout:6/896: truncate d2/d16/d74/f5a 4169852 0 2026-03-10T12:38:27.934 INFO:tasks.workunit.client.1.vm07.stdout:9/999: mknod d5/d13/d9d/df2/d141/c151 0 2026-03-10T12:38:27.942 
INFO:tasks.workunit.client.1.vm07.stdout:7/878: creat d0/d61/db4/f11e x:0 0 0 2026-03-10T12:38:27.943 INFO:tasks.workunit.client.1.vm07.stdout:5/928: truncate d0/d22/d18/d3e/d53/faa 1174868 0 2026-03-10T12:38:27.945 INFO:tasks.workunit.client.1.vm07.stdout:8/892: creat d1/d3/d6/d54/f120 x:0 0 0 2026-03-10T12:38:27.946 INFO:tasks.workunit.client.1.vm07.stdout:8/893: read d1/d3/d6c/fa7 [3276265,58675] 0 2026-03-10T12:38:27.952 INFO:tasks.workunit.client.1.vm07.stdout:8/894: mknod d1/d3/d6c/c121 0 2026-03-10T12:38:27.952 INFO:tasks.workunit.client.1.vm07.stdout:8/895: fdatasync d1/d3/d40/d92/dba/fc3 0 2026-03-10T12:38:27.954 INFO:tasks.workunit.client.1.vm07.stdout:3/969: creat dc/dd/d43/d76/f149 x:0 0 0 2026-03-10T12:38:27.955 INFO:tasks.workunit.client.1.vm07.stdout:3/970: chown dc/d18/d99/da3/fd2 28702942 1 2026-03-10T12:38:27.955 INFO:tasks.workunit.client.1.vm07.stdout:3/971: chown dc/dd/d28/d7a/d144/f136 534 1 2026-03-10T12:38:27.957 INFO:tasks.workunit.client.1.vm07.stdout:2/839: link d0/d29/d64/db5/dbb/d114/d62/fd3 d0/d29/d64/d74/f125 0 2026-03-10T12:38:27.958 INFO:tasks.workunit.client.1.vm07.stdout:3/972: chown dc/dd/d1f/f27 0 1 2026-03-10T12:38:27.959 INFO:tasks.workunit.client.1.vm07.stdout:3/973: dread - dc/dd/d1f/dc7/dc9/d116/f11f zero size 2026-03-10T12:38:27.963 INFO:tasks.workunit.client.1.vm07.stdout:3/974: dwrite dc/d18/d99/d123/f12a [0,4194304] 0 2026-03-10T12:38:27.965 INFO:tasks.workunit.client.1.vm07.stdout:7/879: rename d0/f42 to d0/d57/d62/d90/f11f 0 2026-03-10T12:38:27.966 INFO:tasks.workunit.client.1.vm07.stdout:8/896: truncate d1/f48 4841347 0 2026-03-10T12:38:27.971 INFO:tasks.workunit.client.1.vm07.stdout:6/896: write d1/d4/d6/d4e/d64/fa4 [544732,49045] 0 2026-03-10T12:38:27.971 INFO:tasks.workunit.client.1.vm07.stdout:6/897: fsync d1/d4/d6/f30 0 2026-03-10T12:38:27.978 INFO:tasks.workunit.client.1.vm07.stdout:1/926: dwrite d9/df/d29/d2b/d31/d11f/fa6 [0,4194304] 0 2026-03-10T12:38:27.983 INFO:tasks.workunit.client.1.vm07.stdout:5/929: 
dwrite d0/d22/d18/d19/d36/d75/f13b [0,4194304] 0 2026-03-10T12:38:27.995 INFO:tasks.workunit.client.1.vm07.stdout:2/840: creat d0/d42/d4e/daf/f126 x:0 0 0 2026-03-10T12:38:27.995 INFO:tasks.workunit.client.1.vm07.stdout:2/841: chown d0/d29/d64/d74/df4 13 1 2026-03-10T12:38:27.996 INFO:tasks.workunit.client.0.vm00.stdout:6/897: dwrite d2/d16/d29/f54 [0,4194304] 0 2026-03-10T12:38:27.998 INFO:tasks.workunit.client.1.vm07.stdout:7/880: truncate d0/d47/f81 1948447 0 2026-03-10T12:38:28.000 INFO:tasks.workunit.client.1.vm07.stdout:6/898: fdatasync d1/d4/d6/d16/d1a/d99/fa8 0 2026-03-10T12:38:28.003 INFO:tasks.workunit.client.1.vm07.stdout:1/927: fsync d9/df/f24 0 2026-03-10T12:38:28.011 INFO:tasks.workunit.client.1.vm07.stdout:1/928: truncate d9/d2d/d4f/dde/f122 98882 0 2026-03-10T12:38:28.011 INFO:tasks.workunit.client.1.vm07.stdout:7/881: truncate d0/f7b 2362809 0 2026-03-10T12:38:28.011 INFO:tasks.workunit.client.1.vm07.stdout:7/882: chown d0/d61/db4/d8a/fbe 17452 1 2026-03-10T12:38:28.013 INFO:tasks.workunit.client.1.vm07.stdout:6/899: creat d1/d4/d71/f12d x:0 0 0 2026-03-10T12:38:28.014 INFO:tasks.workunit.client.1.vm07.stdout:3/975: dread dc/dd/d1f/d45/f68 [0,4194304] 0 2026-03-10T12:38:28.017 INFO:tasks.workunit.client.1.vm07.stdout:1/929: symlink d9/d2d/d4f/d75/d77/da7/l132 0 2026-03-10T12:38:28.019 INFO:tasks.workunit.client.1.vm07.stdout:2/842: truncate d0/d42/d26/d7d/d122/fd5 481067 0 2026-03-10T12:38:28.021 INFO:tasks.workunit.client.1.vm07.stdout:8/897: rename d1/d3/d6/d50/d70/dcf/le6 to d1/d3/d6/d50/l122 0 2026-03-10T12:38:28.024 INFO:tasks.workunit.client.1.vm07.stdout:3/976: mknod dc/dd/d43/d76/d95/dde/c14a 0 2026-03-10T12:38:28.025 INFO:tasks.workunit.client.0.vm00.stdout:6/898: dread d2/d42/fd4 [0,4194304] 0 2026-03-10T12:38:28.028 INFO:tasks.workunit.client.1.vm07.stdout:1/930: truncate d9/d2d/d4f/d75/fab 1396618 0 2026-03-10T12:38:28.029 INFO:tasks.workunit.client.1.vm07.stdout:2/843: rmdir d0/d42/d26 39 2026-03-10T12:38:28.032 
INFO:tasks.workunit.client.0.vm00.stdout:6/899: mknod d2/d14/dbb/c141 0 2026-03-10T12:38:28.033 INFO:tasks.workunit.client.1.vm07.stdout:7/883: creat d0/d57/d62/d90/dce/f120 x:0 0 0 2026-03-10T12:38:28.036 INFO:tasks.workunit.client.1.vm07.stdout:6/900: symlink d1/d4/d6/d4e/l12e 0 2026-03-10T12:38:28.037 INFO:tasks.workunit.client.0.vm00.stdout:6/900: dread - d2/d9f/feb zero size 2026-03-10T12:38:28.038 INFO:tasks.workunit.client.1.vm07.stdout:6/901: dread d1/dd7/d66/dd6/fda [0,4194304] 0 2026-03-10T12:38:28.045 INFO:tasks.workunit.client.1.vm07.stdout:2/844: creat d0/d29/d64/d74/f127 x:0 0 0 2026-03-10T12:38:28.045 INFO:tasks.workunit.client.1.vm07.stdout:3/977: dread dc/dd/d43/d5c/f101 [0,4194304] 0 2026-03-10T12:38:28.064 INFO:tasks.workunit.client.1.vm07.stdout:5/930: dread d0/d22/d18/d19/d2e/d67/fc8 [0,4194304] 0 2026-03-10T12:38:28.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:27 vm07.local ceph-mon[58582]: pgmap v10: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 41 MiB/s rd, 102 MiB/s wr, 269 op/s 2026-03-10T12:38:28.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:27 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:38:28.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:27 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:28.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:27 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:28.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:27 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:28.082 INFO:tasks.workunit.client.1.vm07.stdout:8/898: dwrite d1/d3/d6c/f74 [0,4194304] 0 2026-03-10T12:38:28.083 
INFO:tasks.workunit.client.1.vm07.stdout:1/931: dread d9/df/d29/d2b/d30/fa8 [0,4194304] 0 2026-03-10T12:38:28.083 INFO:tasks.workunit.client.1.vm07.stdout:5/931: dread d0/d22/d18/d19/d21/d54/dcb/de8/ffe [0,4194304] 0 2026-03-10T12:38:28.089 INFO:tasks.workunit.client.1.vm07.stdout:5/932: chown d0/d22/d18/d19/d21/f2f 81655293 1 2026-03-10T12:38:28.090 INFO:tasks.workunit.client.1.vm07.stdout:2/845: rmdir d0/de1/d111/d124 0 2026-03-10T12:38:28.092 INFO:tasks.workunit.client.0.vm00.stdout:6/901: link d2/d16/d74/c99 d2/d16/d29/d31/d88/d92/daa/c142 0 2026-03-10T12:38:28.104 INFO:tasks.workunit.client.1.vm07.stdout:6/902: dwrite d1/dd7/d66/fba [4194304,4194304] 0 2026-03-10T12:38:28.109 INFO:tasks.workunit.client.0.vm00.stdout:6/902: rename d2/d42/d9c to d2/d16/d29/d31/d88/d143 0 2026-03-10T12:38:28.114 INFO:tasks.workunit.client.0.vm00.stdout:6/903: write d2/d14/d7a/db9/f9b [4901375,6760] 0 2026-03-10T12:38:28.114 INFO:tasks.workunit.client.0.vm00.stdout:6/904: chown d2/da/dc/d2f/fdc 156773 1 2026-03-10T12:38:28.118 INFO:tasks.workunit.client.1.vm07.stdout:7/884: getdents d0/d47/da0 0 2026-03-10T12:38:28.121 INFO:tasks.workunit.client.0.vm00.stdout:6/905: unlink d2/da/dc/f40 0 2026-03-10T12:38:28.121 INFO:tasks.workunit.client.1.vm07.stdout:2/846: read d0/d42/f22 [3178531,97448] 0 2026-03-10T12:38:28.122 INFO:tasks.workunit.client.1.vm07.stdout:2/847: dread - d0/d29/d64/d74/d88/f120 zero size 2026-03-10T12:38:28.126 INFO:tasks.workunit.client.1.vm07.stdout:6/903: creat d1/d4/d6/d46/d4d/f12f x:0 0 0 2026-03-10T12:38:28.126 INFO:tasks.workunit.client.0.vm00.stdout:6/906: mknod d2/da/dc/d94/c144 0 2026-03-10T12:38:28.126 INFO:tasks.workunit.client.0.vm00.stdout:6/907: chown d2/d14/dbb/f132 176675411 1 2026-03-10T12:38:28.127 INFO:tasks.workunit.client.0.vm00.stdout:6/908: symlink d2/da/dc/d2f/d10a/d12e/l145 0 2026-03-10T12:38:28.130 INFO:tasks.workunit.client.1.vm07.stdout:7/885: creat d0/d57/dd6/d80/f121 x:0 0 0 2026-03-10T12:38:28.131 
INFO:tasks.workunit.client.1.vm07.stdout:3/978: link dc/dd/d28/d7a/d8e/f10a dc/dd/d43/d76/d95/dde/d129/f14b 0 2026-03-10T12:38:28.143 INFO:tasks.workunit.client.0.vm00.stdout:6/909: creat d2/da/dc/f146 x:0 0 0 2026-03-10T12:38:28.145 INFO:tasks.workunit.client.1.vm07.stdout:7/886: mkdir d0/d67/d11c/d122 0 2026-03-10T12:38:28.149 INFO:tasks.workunit.client.1.vm07.stdout:8/899: write d1/d3/d6/d50/d70/f7f [46961,71479] 0 2026-03-10T12:38:28.153 INFO:tasks.workunit.client.1.vm07.stdout:5/933: link d0/d22/d18/d19/d2e/l78 d0/d22/d18/d3e/d5d/d12c/l13e 0 2026-03-10T12:38:28.156 INFO:tasks.workunit.client.1.vm07.stdout:6/904: symlink d1/d4/d6/d96/l130 0 2026-03-10T12:38:28.159 INFO:tasks.workunit.client.1.vm07.stdout:7/887: rmdir d0/d57/d62/d90/dce 39 2026-03-10T12:38:28.161 INFO:tasks.workunit.client.1.vm07.stdout:8/900: mknod d1/d3/d6/d50/d70/dcf/c123 0 2026-03-10T12:38:28.162 INFO:tasks.workunit.client.0.vm00.stdout:6/910: write d2/d51/f63 [2102191,29066] 0 2026-03-10T12:38:28.174 INFO:tasks.workunit.client.0.vm00.stdout:6/911: mknod d2/d14/d7a/c147 0 2026-03-10T12:38:28.174 INFO:tasks.workunit.client.1.vm07.stdout:1/932: write d9/f1f [1196837,97290] 0 2026-03-10T12:38:28.176 INFO:tasks.workunit.client.1.vm07.stdout:2/848: link d0/d80/lc7 d0/d29/d64/db5/dbb/dca/l128 0 2026-03-10T12:38:28.176 INFO:tasks.workunit.client.1.vm07.stdout:1/933: read d9/d2d/d80/fdf [2177538,120508] 0 2026-03-10T12:38:28.181 INFO:tasks.workunit.client.1.vm07.stdout:3/979: write dc/d18/d2d/f80 [1345328,21904] 0 2026-03-10T12:38:28.192 INFO:tasks.workunit.client.1.vm07.stdout:5/934: creat d0/d22/d18/f13f x:0 0 0 2026-03-10T12:38:28.192 INFO:tasks.workunit.client.1.vm07.stdout:6/905: write d1/d4/d6/f80 [3442605,1032] 0 2026-03-10T12:38:28.194 INFO:tasks.workunit.client.1.vm07.stdout:7/888: creat d0/f123 x:0 0 0 2026-03-10T12:38:28.200 INFO:tasks.workunit.client.1.vm07.stdout:2/849: write d0/d42/d1f/d20/fa0 [1284963,105408] 0 2026-03-10T12:38:28.202 INFO:tasks.workunit.client.1.vm07.stdout:6/906: 
rename d1/d4/d6/d16/d1a/lad to d1/dd7/d66/dd6/deb/l131 0 2026-03-10T12:38:28.205 INFO:tasks.workunit.client.0.vm00.stdout:6/912: dread d2/d16/d74/f11d [0,4194304] 0 2026-03-10T12:38:28.205 INFO:tasks.workunit.client.0.vm00.stdout:6/913: dread - d2/d16/d29/d31/d88/d92/f138 zero size 2026-03-10T12:38:28.205 INFO:tasks.workunit.client.1.vm07.stdout:8/901: getdents d1/d3 0 2026-03-10T12:38:28.206 INFO:tasks.workunit.client.1.vm07.stdout:2/850: symlink d0/d42/d1f/d90/l129 0 2026-03-10T12:38:28.208 INFO:tasks.workunit.client.1.vm07.stdout:1/934: write d9/df/d55/f87 [124632,59480] 0 2026-03-10T12:38:28.213 INFO:tasks.workunit.client.0.vm00.stdout:6/914: creat d2/d16/f148 x:0 0 0 2026-03-10T12:38:28.216 INFO:tasks.workunit.client.1.vm07.stdout:3/980: write dc/d18/d2d/de5/f10e [1639186,7407] 0 2026-03-10T12:38:28.220 INFO:tasks.workunit.client.0.vm00.stdout:6/915: truncate d2/da/f6a 185579 0 2026-03-10T12:38:28.220 INFO:tasks.workunit.client.0.vm00.stdout:6/916: write d2/d16/f17 [5065682,72574] 0 2026-03-10T12:38:28.221 INFO:tasks.workunit.client.1.vm07.stdout:1/935: dread d9/df/d29/d2b/d31/d11f/ff2 [0,4194304] 0 2026-03-10T12:38:28.221 INFO:tasks.workunit.client.1.vm07.stdout:1/936: dread - d9/d2d/d4f/d5a/f127 zero size 2026-03-10T12:38:28.222 INFO:tasks.workunit.client.1.vm07.stdout:1/937: truncate d9/d2d/d4f/dde/f122 930388 0 2026-03-10T12:38:28.224 INFO:tasks.workunit.client.1.vm07.stdout:8/902: chown d1/f68 63368 1 2026-03-10T12:38:28.228 INFO:tasks.workunit.client.1.vm07.stdout:7/889: write d0/d47/f8e [3677093,54198] 0 2026-03-10T12:38:28.230 INFO:tasks.workunit.client.1.vm07.stdout:2/851: rename d0/d29/d64/d74/f117 to d0/d29/d64/db5/dbb/dca/f12a 0 2026-03-10T12:38:28.233 INFO:tasks.workunit.client.0.vm00.stdout:6/917: mkdir d2/da/dc/d94/d149 0 2026-03-10T12:38:28.234 INFO:tasks.workunit.client.1.vm07.stdout:5/935: creat d0/d22/d18/d19/d36/d75/f140 x:0 0 0 2026-03-10T12:38:28.237 INFO:tasks.workunit.client.0.vm00.stdout:6/918: creat d2/d14/d7a/f14a x:0 0 0 
2026-03-10T12:38:28.237 INFO:tasks.workunit.client.1.vm07.stdout:3/981: creat dc/d18/d2d/de5/f14c x:0 0 0 2026-03-10T12:38:28.240 INFO:tasks.workunit.client.0.vm00.stdout:6/919: creat d2/d9f/dce/f14b x:0 0 0 2026-03-10T12:38:28.241 INFO:tasks.workunit.client.1.vm07.stdout:7/890: symlink d0/d67/l124 0 2026-03-10T12:38:28.249 INFO:tasks.workunit.client.1.vm07.stdout:7/891: dread d0/d52/fb9 [0,4194304] 0 2026-03-10T12:38:28.253 INFO:tasks.workunit.client.1.vm07.stdout:2/852: dread d0/f2d [0,4194304] 0 2026-03-10T12:38:28.253 INFO:tasks.workunit.client.1.vm07.stdout:2/853: fsync d0/d42/d1f/d20/fa0 0 2026-03-10T12:38:28.254 INFO:tasks.workunit.client.1.vm07.stdout:6/907: truncate d1/d4/d6/d16/fbc 4835794 0 2026-03-10T12:38:28.258 INFO:tasks.workunit.client.0.vm00.stdout:6/920: symlink d2/d16/d29/d31/d34/l14c 0 2026-03-10T12:38:28.259 INFO:tasks.workunit.client.1.vm07.stdout:2/854: dwrite d0/d42/d1f/fbf [0,4194304] 0 2026-03-10T12:38:28.260 INFO:tasks.workunit.client.0.vm00.stdout:6/921: dread d2/d16/d29/d31/d88/d92/daa/dc1/f117 [0,4194304] 0 2026-03-10T12:38:28.266 INFO:tasks.workunit.client.1.vm07.stdout:2/855: dwrite d0/d42/d4e/ffe [0,4194304] 0 2026-03-10T12:38:28.276 INFO:tasks.workunit.client.1.vm07.stdout:1/938: rename d9/d2d/d4f/d75/d77/da7/dfe to d9/df/d29/d2b/d31/d11f/d133 0 2026-03-10T12:38:28.284 INFO:tasks.workunit.client.1.vm07.stdout:3/982: symlink dc/dd/d43/d5c/l14d 0 2026-03-10T12:38:28.284 INFO:tasks.workunit.client.1.vm07.stdout:3/983: stat dc/dd/fb7 0 2026-03-10T12:38:28.285 INFO:tasks.workunit.client.1.vm07.stdout:8/903: truncate d1/d3/d40/f4c 322123 0 2026-03-10T12:38:28.285 INFO:tasks.workunit.client.1.vm07.stdout:8/904: write d1/d3/d6/d50/f56 [2620646,26600] 0 2026-03-10T12:38:28.288 INFO:tasks.workunit.client.1.vm07.stdout:7/892: symlink d0/d61/d79/l125 0 2026-03-10T12:38:28.322 INFO:tasks.workunit.client.1.vm07.stdout:2/856: creat d0/d29/d64/db5/dbb/d114/dad/ddd/f12b x:0 0 0 2026-03-10T12:38:28.322 INFO:tasks.workunit.client.1.vm07.stdout:6/908: 
rename d1/d4/d6/d46/d4d/f12f to d1/d4/d6/d16/d1a/d6e/f132 0 2026-03-10T12:38:28.322 INFO:tasks.workunit.client.1.vm07.stdout:6/909: stat d1/d4/d6/d43/d65/f9c 0 2026-03-10T12:38:28.322 INFO:tasks.workunit.client.1.vm07.stdout:7/893: symlink d0/d47/dde/l126 0 2026-03-10T12:38:28.322 INFO:tasks.workunit.client.1.vm07.stdout:2/857: symlink d0/d29/d64/db5/l12c 0 2026-03-10T12:38:28.322 INFO:tasks.workunit.client.1.vm07.stdout:6/910: creat d1/dd7/f133 x:0 0 0 2026-03-10T12:38:28.322 INFO:tasks.workunit.client.1.vm07.stdout:7/894: rename d0/d61/db4/f11e to d0/d61/f127 0 2026-03-10T12:38:28.322 INFO:tasks.workunit.client.1.vm07.stdout:7/895: dwrite d0/d61/d115/f11a [0,4194304] 0 2026-03-10T12:38:28.331 INFO:tasks.workunit.client.0.vm00.stdout:6/922: sync 2026-03-10T12:38:28.337 INFO:tasks.workunit.client.0.vm00.stdout:6/923: dread d2/d16/d29/d31/d88/d92/daa/dc1/f122 [0,4194304] 0 2026-03-10T12:38:28.345 INFO:tasks.workunit.client.0.vm00.stdout:6/924: readlink d2/d51/d70/lb7 0 2026-03-10T12:38:28.347 INFO:tasks.workunit.client.0.vm00.stdout:6/925: creat d2/d9f/f14d x:0 0 0 2026-03-10T12:38:28.349 INFO:tasks.workunit.client.0.vm00.stdout:6/926: getdents d2/d9f/df6/de6 0 2026-03-10T12:38:28.353 INFO:tasks.workunit.client.0.vm00.stdout:6/927: read d2/da/fcc [624368,28272] 0 2026-03-10T12:38:28.356 INFO:tasks.workunit.client.1.vm07.stdout:5/936: dwrite d0/d22/d18/d30/f35 [0,4194304] 0 2026-03-10T12:38:28.357 INFO:tasks.workunit.client.1.vm07.stdout:5/937: chown d0/d22/d18/d19/l13d 6043057 1 2026-03-10T12:38:28.360 INFO:tasks.workunit.client.1.vm07.stdout:5/938: mknod d0/d22/d18/c141 0 2026-03-10T12:38:28.371 INFO:tasks.workunit.client.1.vm07.stdout:6/911: rename d1/d4/d6/d46 to d1/d4/d44/d134 0 2026-03-10T12:38:28.372 INFO:tasks.workunit.client.1.vm07.stdout:1/939: write d9/df/d29/d2b/d31/d91/fa9 [2447413,90394] 0 2026-03-10T12:38:28.372 INFO:tasks.workunit.client.1.vm07.stdout:1/940: chown d9/df/d29/d2b/d92/d123 1 1 2026-03-10T12:38:28.372 
INFO:tasks.workunit.client.1.vm07.stdout:6/912: chown d1/d4/d6/d16/d1a/d2c/de0/ff6 1265213 1 2026-03-10T12:38:28.375 INFO:tasks.workunit.client.1.vm07.stdout:8/905: write d1/d3/d6/f4f [83336,35393] 0 2026-03-10T12:38:28.379 INFO:tasks.workunit.client.1.vm07.stdout:1/941: rmdir d9/d2d/dd7 39 2026-03-10T12:38:28.379 INFO:tasks.workunit.client.1.vm07.stdout:1/942: fsync d9/fe 0 2026-03-10T12:38:28.382 INFO:tasks.workunit.client.1.vm07.stdout:7/896: rename d0/f20 to d0/d61/db4/df4/f128 0 2026-03-10T12:38:28.384 INFO:tasks.workunit.client.1.vm07.stdout:3/984: write dc/d18/fd4 [896001,90917] 0 2026-03-10T12:38:28.390 INFO:tasks.workunit.client.1.vm07.stdout:6/913: creat d1/d4/d44/d12c/f135 x:0 0 0 2026-03-10T12:38:28.393 INFO:tasks.workunit.client.1.vm07.stdout:2/858: dwrite d0/d42/d1f/d20/f39 [0,4194304] 0 2026-03-10T12:38:28.394 INFO:tasks.workunit.client.1.vm07.stdout:5/939: rename d0/d22/d18/d3e/d11f/f12b to d0/d22/d109/f142 0 2026-03-10T12:38:28.398 INFO:tasks.workunit.client.1.vm07.stdout:5/940: dwrite d0/d22/d18/d19/fa8 [0,4194304] 0 2026-03-10T12:38:28.407 INFO:tasks.workunit.client.1.vm07.stdout:8/906: getdents d1/d3/d40/d104 0 2026-03-10T12:38:28.409 INFO:tasks.workunit.client.1.vm07.stdout:3/985: fdatasync dc/dd/d43/d76/d95/da0/fa2 0 2026-03-10T12:38:28.410 INFO:tasks.workunit.client.1.vm07.stdout:6/914: mknod d1/dd7/da3/dd5/c136 0 2026-03-10T12:38:28.413 INFO:tasks.workunit.client.0.vm00.stdout:6/928: write d2/d16/d74/f59 [724744,81217] 0 2026-03-10T12:38:28.421 INFO:tasks.workunit.client.1.vm07.stdout:7/897: link d0/d57/d62/d90/fed d0/d47/dab/f129 0 2026-03-10T12:38:28.428 INFO:tasks.workunit.client.1.vm07.stdout:8/907: truncate d1/d3/d6/f24 4512758 0 2026-03-10T12:38:28.428 INFO:tasks.workunit.client.1.vm07.stdout:3/986: creat dc/d18/d99/da3/f14e x:0 0 0 2026-03-10T12:38:28.428 INFO:tasks.workunit.client.1.vm07.stdout:8/908: dwrite d1/d3/d40/d92/db6/f108 [0,4194304] 0 2026-03-10T12:38:28.428 INFO:tasks.workunit.client.1.vm07.stdout:8/909: fsync d1/d3/d6/f4f 
0 2026-03-10T12:38:28.441 INFO:tasks.workunit.client.0.vm00.stdout:6/929: link d2/d16/d74/f7d d2/d16/d29/d31/d88/d92/f14e 0 2026-03-10T12:38:28.445 INFO:tasks.workunit.client.1.vm07.stdout:5/941: dread d0/d22/d18/d3e/d5d/db6/fe4 [0,4194304] 0 2026-03-10T12:38:28.448 INFO:tasks.workunit.client.1.vm07.stdout:3/987: mknod dc/d18/d99/da3/def/c14f 0 2026-03-10T12:38:28.448 INFO:tasks.workunit.client.1.vm07.stdout:8/910: rmdir d1/d3/db2/dcd/db8 39 2026-03-10T12:38:28.450 INFO:tasks.workunit.client.1.vm07.stdout:3/988: write dc/dd/d1f/d45/f5e [3143288,42051] 0 2026-03-10T12:38:28.451 INFO:tasks.workunit.client.1.vm07.stdout:3/989: chown dc/dd/d1f/d45/f5e 115276241 1 2026-03-10T12:38:28.452 INFO:tasks.workunit.client.0.vm00.stdout:6/930: rmdir d2/d9f/df6/de6 0 2026-03-10T12:38:28.453 INFO:tasks.workunit.client.0.vm00.stdout:6/931: readlink d2/d51/l53 0 2026-03-10T12:38:28.453 INFO:tasks.workunit.client.1.vm07.stdout:1/943: link d9/dff/d103/l11d d9/d2d/d4f/d75/d77/l134 0 2026-03-10T12:38:28.454 INFO:tasks.workunit.client.1.vm07.stdout:1/944: chown d9/d2d/d4f/dde/c116 89802052 1 2026-03-10T12:38:28.460 INFO:tasks.workunit.client.1.vm07.stdout:3/990: rmdir dc/d18/d99/da3/def 39 2026-03-10T12:38:28.460 INFO:tasks.workunit.client.1.vm07.stdout:1/945: readlink d9/df/d29/d2b/d31/d91/l94 0 2026-03-10T12:38:28.460 INFO:tasks.workunit.client.1.vm07.stdout:8/911: rename d1/d3/d6c/fce to d1/d3/db2/f124 0 2026-03-10T12:38:28.460 INFO:tasks.workunit.client.1.vm07.stdout:5/942: link d0/d22/d18/d3e/d53/cc0 d0/d22/d18/d30/c143 0 2026-03-10T12:38:28.462 INFO:tasks.workunit.client.1.vm07.stdout:3/991: dread dc/d18/fdd [0,4194304] 0 2026-03-10T12:38:28.465 INFO:tasks.workunit.client.1.vm07.stdout:5/943: read d0/d22/d18/d19/d21/fa1 [18894,72117] 0 2026-03-10T12:38:28.466 INFO:tasks.workunit.client.1.vm07.stdout:3/992: fdatasync dc/d18/d24/fe3 0 2026-03-10T12:38:28.467 INFO:tasks.workunit.client.1.vm07.stdout:1/946: mknod d9/d2d/dd7/c135 0 2026-03-10T12:38:28.468 
INFO:tasks.workunit.client.1.vm07.stdout:5/944: rmdir d0/d22/d18/d3e/d53 39 2026-03-10T12:38:28.469 INFO:tasks.workunit.client.0.vm00.stdout:6/932: dread d2/da/fda [0,4194304] 0 2026-03-10T12:38:28.470 INFO:tasks.workunit.client.0.vm00.stdout:6/933: chown d2/d16/f148 123 1 2026-03-10T12:38:28.470 INFO:tasks.workunit.client.0.vm00.stdout:6/934: readlink d2/d16/lf7 0 2026-03-10T12:38:28.472 INFO:tasks.workunit.client.1.vm07.stdout:1/947: symlink d9/df/d29/d2b/d31/d11f/l136 0 2026-03-10T12:38:28.473 INFO:tasks.workunit.client.0.vm00.stdout:6/935: truncate d2/da/dc/d2f/f56 4627028 0 2026-03-10T12:38:28.473 INFO:tasks.workunit.client.0.vm00.stdout:6/936: dread - d2/d42/d80/d9d/fca zero size 2026-03-10T12:38:28.476 INFO:tasks.workunit.client.1.vm07.stdout:8/912: dread d1/d3/d40/d92/db6/f67 [0,4194304] 0 2026-03-10T12:38:28.477 INFO:tasks.workunit.client.1.vm07.stdout:5/945: truncate d0/d22/d18/d19/d21/d54/f10e 830231 0 2026-03-10T12:38:28.480 INFO:tasks.workunit.client.1.vm07.stdout:5/946: dwrite d0/f12e [0,4194304] 0 2026-03-10T12:38:28.482 INFO:tasks.workunit.client.0.vm00.stdout:6/937: truncate d2/d16/d29/f84 101727 0 2026-03-10T12:38:28.483 INFO:tasks.workunit.client.1.vm07.stdout:3/993: rename dc/d18/d2d/f10b to dc/d18/de2/df6/f150 0 2026-03-10T12:38:28.484 INFO:tasks.workunit.client.1.vm07.stdout:3/994: chown dc/dd/d43/d76/d95/db8/cff 0 1 2026-03-10T12:38:28.485 INFO:tasks.workunit.client.1.vm07.stdout:1/948: creat d9/ddb/f137 x:0 0 0 2026-03-10T12:38:28.486 INFO:tasks.workunit.client.1.vm07.stdout:8/913: fsync d1/d3/d6c/fe3 0 2026-03-10T12:38:28.488 INFO:tasks.workunit.client.0.vm00.stdout:6/938: sync 2026-03-10T12:38:28.489 INFO:tasks.workunit.client.0.vm00.stdout:6/939: sync 2026-03-10T12:38:28.495 INFO:tasks.workunit.client.0.vm00.stdout:6/940: creat d2/da/dc/d94/f14f x:0 0 0 2026-03-10T12:38:28.496 INFO:tasks.workunit.client.1.vm07.stdout:2/859: dwrite d0/d29/d64/d74/d75/fa5 [4194304,4194304] 0 2026-03-10T12:38:28.501 
INFO:tasks.workunit.client.1.vm07.stdout:6/915: write d1/d4/d6/d16/d1a/d2c/de0/ff6 [954208,4972] 0 2026-03-10T12:38:28.506 INFO:tasks.workunit.client.1.vm07.stdout:7/898: dwrite d0/d47/da0/fef [0,4194304] 0 2026-03-10T12:38:28.520 INFO:tasks.workunit.client.1.vm07.stdout:6/916: write d1/d4/d6/d16/fdd [749184,10595] 0 2026-03-10T12:38:28.530 INFO:tasks.workunit.client.1.vm07.stdout:7/899: dread d0/f56 [0,4194304] 0 2026-03-10T12:38:28.537 INFO:tasks.workunit.client.1.vm07.stdout:6/917: dwrite d1/d4/d6/d16/d1a/d6e/f132 [0,4194304] 0 2026-03-10T12:38:28.545 INFO:tasks.workunit.client.1.vm07.stdout:2/860: truncate d0/d42/d26/d7d/d122/fd5 481373 0 2026-03-10T12:38:28.549 INFO:tasks.workunit.client.1.vm07.stdout:7/900: creat d0/d67/d10a/f12a x:0 0 0 2026-03-10T12:38:28.552 INFO:tasks.workunit.client.1.vm07.stdout:7/901: dwrite d0/d61/d115/f11a [0,4194304] 0 2026-03-10T12:38:28.564 INFO:tasks.workunit.client.1.vm07.stdout:6/918: write d1/d4/d6/d4e/f119 [2431271,2923] 0 2026-03-10T12:38:28.568 INFO:tasks.workunit.client.1.vm07.stdout:6/919: dwrite d1/d4/d6/d43/f125 [0,4194304] 0 2026-03-10T12:38:28.570 INFO:tasks.workunit.client.1.vm07.stdout:6/920: chown d1/d4/d6/d43/d88/d97/ff4 58345104 1 2026-03-10T12:38:28.575 INFO:tasks.workunit.client.1.vm07.stdout:7/902: unlink d0/d52/la7 0 2026-03-10T12:38:28.588 INFO:tasks.workunit.client.1.vm07.stdout:2/861: dread d0/d42/f1b [0,4194304] 0 2026-03-10T12:38:28.592 INFO:tasks.workunit.client.1.vm07.stdout:6/921: fdatasync d1/d4/d6/d16/d49/fd3 0 2026-03-10T12:38:28.594 INFO:tasks.workunit.client.1.vm07.stdout:2/862: creat d0/d5b/f12d x:0 0 0 2026-03-10T12:38:28.597 INFO:tasks.workunit.client.1.vm07.stdout:5/947: dwrite d0/d22/d18/d19/de5/f10d [0,4194304] 0 2026-03-10T12:38:28.606 INFO:tasks.workunit.client.1.vm07.stdout:2/863: symlink d0/d29/d64/d6c/l12e 0 2026-03-10T12:38:28.607 INFO:tasks.workunit.client.1.vm07.stdout:2/864: chown d0/d42/d1f/d20 188039 1 2026-03-10T12:38:28.607 INFO:tasks.workunit.client.1.vm07.stdout:2/865: rename 
d0/d5b to d0/d5b/d12f 22 2026-03-10T12:38:28.620 INFO:tasks.workunit.client.1.vm07.stdout:3/995: write dc/dd/d28/d7a/d144/f106 [2729792,102346] 0 2026-03-10T12:38:28.622 INFO:tasks.workunit.client.1.vm07.stdout:2/866: symlink d0/d29/d64/db5/dbb/df9/l130 0 2026-03-10T12:38:28.623 INFO:tasks.workunit.client.1.vm07.stdout:6/922: rmdir d1/dd7/d66/d118 0 2026-03-10T12:38:28.628 INFO:tasks.workunit.client.1.vm07.stdout:2/867: dwrite d0/d42/d26/f5a [0,4194304] 0 2026-03-10T12:38:28.633 INFO:tasks.workunit.client.1.vm07.stdout:1/949: dwrite d9/d2d/d4f/d5a/f65 [0,4194304] 0 2026-03-10T12:38:28.639 INFO:tasks.workunit.client.1.vm07.stdout:6/923: unlink d1/d4/d6/d43/ccd 0 2026-03-10T12:38:28.651 INFO:tasks.workunit.client.1.vm07.stdout:8/914: truncate d1/d3/d6/d54/f72 2353894 0 2026-03-10T12:38:28.651 INFO:tasks.workunit.client.1.vm07.stdout:3/996: mkdir dc/dd/d151 0 2026-03-10T12:38:28.651 INFO:tasks.workunit.client.1.vm07.stdout:3/997: write dc/dd/d28/d3b/f4d [905986,52857] 0 2026-03-10T12:38:28.659 INFO:tasks.workunit.client.1.vm07.stdout:3/998: mknod dc/dd/d1f/d45/c152 0 2026-03-10T12:38:28.660 INFO:tasks.workunit.client.1.vm07.stdout:7/903: dread d0/d57/d62/fa9 [0,4194304] 0 2026-03-10T12:38:28.663 INFO:tasks.workunit.client.1.vm07.stdout:3/999: unlink dc/dd/d28/dd0/fdb 0 2026-03-10T12:38:28.664 INFO:tasks.workunit.client.1.vm07.stdout:1/950: rename d9/df/d55/f87 to d9/df/f138 0 2026-03-10T12:38:28.667 INFO:tasks.workunit.client.1.vm07.stdout:1/951: chown d9/dff/d103/l11d 61967 1 2026-03-10T12:38:28.669 INFO:tasks.workunit.client.1.vm07.stdout:8/915: getdents d1/d3/d6/d50 0 2026-03-10T12:38:28.669 INFO:tasks.workunit.client.1.vm07.stdout:8/916: dread - d1/d3/fbb zero size 2026-03-10T12:38:28.671 INFO:tasks.workunit.client.1.vm07.stdout:1/952: rename d9/df/d29/d2b/d31/fd8 to d9/df/d29/d2b/d31/d91/d59/f139 0 2026-03-10T12:38:28.673 INFO:tasks.workunit.client.1.vm07.stdout:7/904: getdents d0/d67/d11c 0 2026-03-10T12:38:28.674 INFO:tasks.workunit.client.1.vm07.stdout:7/905: 
dread - d0/d61/db4/f7a zero size 2026-03-10T12:38:28.677 INFO:tasks.workunit.client.1.vm07.stdout:7/906: dwrite d0/d47/dde/ff0 [0,4194304] 0 2026-03-10T12:38:28.686 INFO:tasks.workunit.client.1.vm07.stdout:1/953: creat d9/d2d/dd7/f13a x:0 0 0 2026-03-10T12:38:28.689 INFO:tasks.workunit.client.1.vm07.stdout:7/907: fsync d0/d61/d79/db5/fc2 0 2026-03-10T12:38:28.690 INFO:tasks.workunit.client.1.vm07.stdout:1/954: mkdir d9/df/d55/d13b 0 2026-03-10T12:38:28.691 INFO:tasks.workunit.client.1.vm07.stdout:7/908: mknod d0/d61/db4/c12b 0 2026-03-10T12:38:28.692 INFO:tasks.workunit.client.1.vm07.stdout:7/909: write d0/d61/d79/f8d [1204348,63563] 0 2026-03-10T12:38:28.695 INFO:tasks.workunit.client.1.vm07.stdout:7/910: dwrite d0/d61/db4/df4/f112 [0,4194304] 0 2026-03-10T12:38:28.698 INFO:tasks.workunit.client.1.vm07.stdout:1/955: truncate d9/df/d29/d2b/d31/fc6 157084 0 2026-03-10T12:38:28.700 INFO:tasks.workunit.client.1.vm07.stdout:1/956: creat d9/d2d/d4f/d5a/d11c/f13c x:0 0 0 2026-03-10T12:38:28.701 INFO:tasks.workunit.client.1.vm07.stdout:1/957: mknod d9/d2d/d4f/dde/c13d 0 2026-03-10T12:38:28.722 INFO:tasks.workunit.client.1.vm07.stdout:5/948: sync 2026-03-10T12:38:28.725 INFO:tasks.workunit.client.1.vm07.stdout:5/949: fdatasync d0/d22/d18/d19/d2e/f59 0 2026-03-10T12:38:28.727 INFO:tasks.workunit.client.1.vm07.stdout:5/950: symlink d0/d22/d18/d3e/df6/l144 0 2026-03-10T12:38:28.742 INFO:tasks.workunit.client.1.vm07.stdout:5/951: dread d0/d22/d18/d19/fa8 [0,4194304] 0 2026-03-10T12:38:28.742 INFO:tasks.workunit.client.1.vm07.stdout:5/952: chown d0/d22/f89 19751 1 2026-03-10T12:38:28.742 INFO:tasks.workunit.client.1.vm07.stdout:5/953: dread - d0/d22/d18/d30/f11e zero size 2026-03-10T12:38:28.742 INFO:tasks.workunit.client.1.vm07.stdout:5/954: mkdir d0/d22/d18/d19/d36/d75/ddc/d145 0 2026-03-10T12:38:28.742 INFO:tasks.workunit.client.1.vm07.stdout:5/955: fdatasync d0/d22/d18/d19/d21/d54/f7d 0 2026-03-10T12:38:28.769 INFO:tasks.workunit.client.0.vm00.stdout:6/941: rename 
d2/d14/dc0 to d2/d51/d12b/d150 0 2026-03-10T12:38:28.770 INFO:tasks.workunit.client.0.vm00.stdout:6/942: truncate d2/d9f/dce/f14b 740626 0 2026-03-10T12:38:28.776 INFO:tasks.workunit.client.1.vm07.stdout:6/924: dwrite d1/d4/d6/d16/d1a/d33/f92 [0,4194304] 0 2026-03-10T12:38:28.778 INFO:tasks.workunit.client.1.vm07.stdout:2/868: dwrite d0/f12 [4194304,4194304] 0 2026-03-10T12:38:28.779 INFO:tasks.workunit.client.1.vm07.stdout:2/869: fdatasync d0/f8d 0 2026-03-10T12:38:28.779 INFO:tasks.workunit.client.0.vm00.stdout:6/943: dwrite d2/da/dc/d94/f133 [0,4194304] 0 2026-03-10T12:38:28.779 INFO:tasks.workunit.client.1.vm07.stdout:2/870: truncate d0/d42/d4e/daf/f126 976070 0 2026-03-10T12:38:28.788 INFO:tasks.workunit.client.1.vm07.stdout:8/917: write d1/d3/d40/d92/db6/f67 [5835764,79159] 0 2026-03-10T12:38:28.792 INFO:tasks.workunit.client.1.vm07.stdout:7/911: dwrite d0/d57/dd6/d80/fc3 [0,4194304] 0 2026-03-10T12:38:28.801 INFO:tasks.workunit.client.1.vm07.stdout:1/958: getdents d9/d2d/d4f/dde 0 2026-03-10T12:38:28.801 INFO:tasks.workunit.client.1.vm07.stdout:1/959: stat d9/df/d29/d2b/d30/l34 0 2026-03-10T12:38:28.812 INFO:tasks.workunit.client.0.vm00.stdout:6/944: write d2/d16/f41 [474892,129445] 0 2026-03-10T12:38:28.822 INFO:tasks.workunit.client.0.vm00.stdout:6/945: chown d2/d42/d80 451367979 1 2026-03-10T12:38:28.822 INFO:tasks.workunit.client.1.vm07.stdout:2/871: rmdir d0/d29/d64/d6c/d94 39 2026-03-10T12:38:28.822 INFO:tasks.workunit.client.1.vm07.stdout:8/918: chown d1/d3/d11/c95 1120562929 1 2026-03-10T12:38:28.822 INFO:tasks.workunit.client.1.vm07.stdout:5/956: dwrite d0/d22/d18/d19/d21/dc2/ded/f10a [0,4194304] 0 2026-03-10T12:38:28.822 INFO:tasks.workunit.client.1.vm07.stdout:8/919: dwrite d1/d3/d6/d54/ffa [0,4194304] 0 2026-03-10T12:38:28.825 INFO:tasks.workunit.client.1.vm07.stdout:8/920: write d1/d3/d40/f110 [201024,83008] 0 2026-03-10T12:38:28.826 INFO:tasks.workunit.client.0.vm00.stdout:6/946: mkdir d2/da/dc/d2f/d151 0 2026-03-10T12:38:28.830 
INFO:tasks.workunit.client.1.vm07.stdout:2/872: rename d0/d29/d64/d74/d88/lb0 to d0/d42/d26/d7d/l131 0 2026-03-10T12:38:28.835 INFO:tasks.workunit.client.1.vm07.stdout:8/921: chown d1/fc 107 1 2026-03-10T12:38:28.841 INFO:tasks.workunit.client.1.vm07.stdout:8/922: mknod d1/d3/d6/d50/c125 0 2026-03-10T12:38:28.850 INFO:tasks.workunit.client.1.vm07.stdout:2/873: link d0/d29/d64/db5/dbb/dca/c11a d0/d29/d64/d74/d75/db7/c132 0 2026-03-10T12:38:28.850 INFO:tasks.workunit.client.1.vm07.stdout:5/957: rename d0/d22/d18/d19/l28 to d0/d22/d18/d3e/d5d/l146 0 2026-03-10T12:38:28.850 INFO:tasks.workunit.client.1.vm07.stdout:2/874: symlink d0/d42/d4e/d77/l133 0 2026-03-10T12:38:28.852 INFO:tasks.workunit.client.1.vm07.stdout:2/875: dwrite d0/d42/d26/d7d/fe8 [0,4194304] 0 2026-03-10T12:38:28.852 INFO:tasks.workunit.client.1.vm07.stdout:5/958: truncate d0/d22/d18/d19/d36/d75/ddc/fff 122162 0 2026-03-10T12:38:28.854 INFO:tasks.workunit.client.1.vm07.stdout:5/959: symlink d0/d22/d18/d3e/d5d/d10b/l147 0 2026-03-10T12:38:28.859 INFO:tasks.workunit.client.1.vm07.stdout:2/876: unlink d0/c7 0 2026-03-10T12:38:28.863 INFO:tasks.workunit.client.1.vm07.stdout:2/877: dread d0/d29/d64/d74/d88/f58 [0,4194304] 0 2026-03-10T12:38:28.865 INFO:tasks.workunit.client.1.vm07.stdout:8/923: sync 2026-03-10T12:38:28.867 INFO:tasks.workunit.client.1.vm07.stdout:2/878: unlink d0/d29/d64/d74/d88/f51 0 2026-03-10T12:38:28.872 INFO:tasks.workunit.client.1.vm07.stdout:8/924: rename d1/d3/d6c/f74 to d1/d3/d11/d11a/f126 0 2026-03-10T12:38:28.884 INFO:tasks.workunit.client.1.vm07.stdout:8/925: truncate d1/d3/d6c/fc9 438998 0 2026-03-10T12:38:28.885 INFO:tasks.workunit.client.1.vm07.stdout:8/926: readlink d1/d3/db2/lbc 0 2026-03-10T12:38:28.911 INFO:tasks.workunit.client.1.vm07.stdout:8/927: mkdir d1/d3/d6c/dde/d127 0 2026-03-10T12:38:28.918 INFO:tasks.workunit.client.1.vm07.stdout:8/928: getdents d1/d3 0 2026-03-10T12:38:28.950 INFO:tasks.workunit.client.1.vm07.stdout:6/925: write d1/d4/d6/d16/d1a/d33/f7b 
[342561,19174] 0 2026-03-10T12:38:28.952 INFO:tasks.workunit.client.1.vm07.stdout:6/926: mkdir d1/d106/d137 0 2026-03-10T12:38:28.954 INFO:tasks.workunit.client.1.vm07.stdout:6/927: creat d1/d4/d6/d96/f138 x:0 0 0 2026-03-10T12:38:28.955 INFO:tasks.workunit.client.1.vm07.stdout:6/928: dread - d1/d4/d71/f79 zero size 2026-03-10T12:38:28.957 INFO:tasks.workunit.client.1.vm07.stdout:6/929: creat d1/d4/d44/d134/d4d/dc7/f139 x:0 0 0 2026-03-10T12:38:28.959 INFO:tasks.workunit.client.1.vm07.stdout:1/960: write d9/d2d/d4f/d75/d77/da7/fcd [4705998,116049] 0 2026-03-10T12:38:28.961 INFO:tasks.workunit.client.1.vm07.stdout:7/912: dwrite d0/fe6 [0,4194304] 0 2026-03-10T12:38:28.964 INFO:tasks.workunit.client.1.vm07.stdout:6/930: truncate d1/d4/d6/f7c 2635032 0 2026-03-10T12:38:28.967 INFO:tasks.workunit.client.0.vm00.stdout:6/947: write d2/da/dc/d2f/f4f [1013673,53144] 0 2026-03-10T12:38:28.974 INFO:tasks.workunit.client.1.vm07.stdout:5/960: write d0/d22/f27 [671319,98541] 0 2026-03-10T12:38:28.982 INFO:tasks.workunit.client.1.vm07.stdout:2/879: truncate d0/f9c 2290247 0 2026-03-10T12:38:28.983 INFO:tasks.workunit.client.1.vm07.stdout:2/880: write d0/d42/d1f/fbf [1702934,8009] 0 2026-03-10T12:38:28.985 INFO:tasks.workunit.client.1.vm07.stdout:7/913: rename d0/d61/db4/d8a/feb to d0/d52/f12c 0 2026-03-10T12:38:28.988 INFO:tasks.workunit.client.0.vm00.stdout:6/948: creat d2/da/dc/d2f/d151/f152 x:0 0 0 2026-03-10T12:38:28.989 INFO:tasks.workunit.client.1.vm07.stdout:5/961: dread d0/d22/d18/d19/d21/d54/f7d [0,4194304] 0 2026-03-10T12:38:28.993 INFO:tasks.workunit.client.0.vm00.stdout:6/949: mknod d2/da/dbf/ded/d118/c153 0 2026-03-10T12:38:28.998 INFO:tasks.workunit.client.1.vm07.stdout:2/881: read - d0/d42/d4e/daf/fde zero size 2026-03-10T12:38:29.000 INFO:tasks.workunit.client.1.vm07.stdout:7/914: truncate d0/f56 452898 0 2026-03-10T12:38:29.000 INFO:tasks.workunit.client.1.vm07.stdout:2/882: stat d0/d29/d64/d74/d75/cc3 0 2026-03-10T12:38:29.003 
INFO:tasks.workunit.client.1.vm07.stdout:8/929: rename d1/d3/d40/d92/dba/f10a to d1/d3/d6/d50/d70/dd4/f128 0 2026-03-10T12:38:29.005 INFO:tasks.workunit.client.1.vm07.stdout:7/915: rmdir d0/d61/db4/d8a/d9d 39 2026-03-10T12:38:29.010 INFO:tasks.workunit.client.1.vm07.stdout:2/883: write d0/d5b/f10c [2984898,123281] 0 2026-03-10T12:38:29.013 INFO:tasks.workunit.client.1.vm07.stdout:8/930: creat d1/d3/d11/d87/f129 x:0 0 0 2026-03-10T12:38:29.015 INFO:tasks.workunit.client.1.vm07.stdout:1/961: getdents d9/df/dc9/df4 0 2026-03-10T12:38:29.018 INFO:tasks.workunit.client.1.vm07.stdout:2/884: creat d0/d29/d64/db5/dbb/d114/d5d/f134 x:0 0 0 2026-03-10T12:38:29.023 INFO:tasks.workunit.client.1.vm07.stdout:8/931: symlink d1/d3/d40/d92/db6/l12a 0 2026-03-10T12:38:29.023 INFO:tasks.workunit.client.1.vm07.stdout:8/932: chown d1/d3/d6/d54/f120 1006363 1 2026-03-10T12:38:29.024 INFO:tasks.workunit.client.1.vm07.stdout:6/931: write d1/d4/d44/d134/d4d/dc7/dd9/ddc/ff1 [581234,82208] 0 2026-03-10T12:38:29.027 INFO:tasks.workunit.client.1.vm07.stdout:5/962: link d0/d22/l45 d0/d22/d18/d3e/d5d/l148 0 2026-03-10T12:38:29.034 INFO:tasks.workunit.client.1.vm07.stdout:8/933: mknod d1/d3/d6/d54/dd2/df3/c12b 0 2026-03-10T12:38:29.034 INFO:tasks.workunit.client.1.vm07.stdout:8/934: dwrite d1/d3/f2d [4194304,4194304] 0 2026-03-10T12:38:29.035 INFO:tasks.workunit.client.1.vm07.stdout:6/932: chown d1/d4/d6/d16/fbc 6124767 1 2026-03-10T12:38:29.036 INFO:tasks.workunit.client.1.vm07.stdout:1/962: sync 2026-03-10T12:38:29.042 INFO:tasks.workunit.client.1.vm07.stdout:8/935: mknod d1/d3/d6c/dde/de7/c12c 0 2026-03-10T12:38:29.045 INFO:tasks.workunit.client.1.vm07.stdout:8/936: dwrite d1/d3/d11/f77 [0,4194304] 0 2026-03-10T12:38:29.050 INFO:tasks.workunit.client.1.vm07.stdout:8/937: dwrite d1/d3/d11/d87/f100 [0,4194304] 0 2026-03-10T12:38:29.058 INFO:tasks.workunit.client.1.vm07.stdout:6/933: mkdir d1/dd7/da3/dd5/d13a 0 2026-03-10T12:38:29.070 INFO:tasks.workunit.client.0.vm00.stdout:6/950: write 
d2/d14/dbb/fd7 [657626,64585] 0 2026-03-10T12:38:29.071 INFO:tasks.workunit.client.1.vm07.stdout:6/934: dread d1/d4/d6/d16/d49/f11a [0,4194304] 0 2026-03-10T12:38:29.072 INFO:tasks.workunit.client.1.vm07.stdout:2/885: link d0/d5b/fec d0/d29/d64/db5/dbb/d114/dad/f135 0 2026-03-10T12:38:29.077 INFO:tasks.workunit.client.0.vm00.stdout:6/951: creat d2/d16/d29/d31/d88/d143/f154 x:0 0 0 2026-03-10T12:38:29.093 INFO:tasks.workunit.client.1.vm07.stdout:6/935: mkdir d1/d4/d44/d134/d13b 0 2026-03-10T12:38:29.093 INFO:tasks.workunit.client.1.vm07.stdout:6/936: fdatasync d1/d4/d6/d43/d65/f122 0 2026-03-10T12:38:29.093 INFO:tasks.workunit.client.0.vm00.stdout:6/952: write d2/d14/dbb/d12c/f13b [344586,20921] 0 2026-03-10T12:38:29.093 INFO:tasks.workunit.client.0.vm00.stdout:6/953: unlink d2/d16/d29/d31/d88/d92/lff 0 2026-03-10T12:38:29.093 INFO:tasks.workunit.client.0.vm00.stdout:6/954: creat d2/da/dc/d2f/f155 x:0 0 0 2026-03-10T12:38:29.093 INFO:tasks.workunit.client.1.vm07.stdout:7/916: write d0/d57/dd6/d80/fd2 [1379597,82660] 0 2026-03-10T12:38:29.096 INFO:tasks.workunit.client.1.vm07.stdout:8/938: link d1/d3/db2/dcd/f7c d1/d3/d40/d92/dba/df1/f12d 0 2026-03-10T12:38:29.099 INFO:tasks.workunit.client.1.vm07.stdout:8/939: dwrite d1/d3/d6/d54/dd2/df3/f10c [0,4194304] 0 2026-03-10T12:38:29.101 INFO:tasks.workunit.client.1.vm07.stdout:6/937: dread - d1/d4/d6/d16/d1a/d2c/ffd zero size 2026-03-10T12:38:29.112 INFO:tasks.workunit.client.1.vm07.stdout:8/940: symlink d1/d3/d6c/l12e 0 2026-03-10T12:38:29.114 INFO:tasks.workunit.client.1.vm07.stdout:8/941: dread - d1/d3/d6/d50/d70/fe4 zero size 2026-03-10T12:38:29.116 INFO:tasks.workunit.client.1.vm07.stdout:6/938: sync 2026-03-10T12:38:29.116 INFO:tasks.workunit.client.1.vm07.stdout:6/939: stat d1/d4/d6/d16/d1a/d99/df5 0 2026-03-10T12:38:29.118 INFO:tasks.workunit.client.1.vm07.stdout:8/942: rename d1/d3/db2/dcd/dc7/c10f to d1/d3/db2/dcd/dc7/c12f 0 2026-03-10T12:38:29.119 INFO:tasks.workunit.client.1.vm07.stdout:6/940: symlink 
d1/d4/d44/d134/d4d/dc7/l13c 0 2026-03-10T12:38:29.121 INFO:tasks.workunit.client.1.vm07.stdout:8/943: mknod d1/d3/d6/d50/d70/dd4/c130 0 2026-03-10T12:38:29.123 INFO:tasks.workunit.client.1.vm07.stdout:6/941: dwrite d1/d4/d71/f11f [0,4194304] 0 2026-03-10T12:38:29.128 INFO:tasks.workunit.client.1.vm07.stdout:8/944: sync 2026-03-10T12:38:29.131 INFO:tasks.workunit.client.1.vm07.stdout:6/942: creat d1/d4/d9b/f13d x:0 0 0 2026-03-10T12:38:29.132 INFO:tasks.workunit.client.1.vm07.stdout:8/945: unlink d1/d3/d6/lc0 0 2026-03-10T12:38:29.156 INFO:tasks.workunit.client.1.vm07.stdout:5/963: write d0/d22/dbc/f8b [7926204,58496] 0 2026-03-10T12:38:29.158 INFO:tasks.workunit.client.1.vm07.stdout:5/964: creat d0/d22/d18/d3e/d5d/d12c/f149 x:0 0 0 2026-03-10T12:38:29.158 INFO:tasks.workunit.client.1.vm07.stdout:5/965: readlink d0/d22/d18/d19/d36/l51 0 2026-03-10T12:38:29.161 INFO:tasks.workunit.client.1.vm07.stdout:2/886: write d0/d42/d1f/d20/fe2 [2466317,10650] 0 2026-03-10T12:38:29.162 INFO:tasks.workunit.client.1.vm07.stdout:1/963: dwrite d9/df/f58 [0,4194304] 0 2026-03-10T12:38:29.166 INFO:tasks.workunit.client.1.vm07.stdout:5/966: rmdir d0/d22/d18/d19/d72 39 2026-03-10T12:38:29.177 INFO:tasks.workunit.client.1.vm07.stdout:2/887: dread d0/d42/f2c [0,4194304] 0 2026-03-10T12:38:29.181 INFO:tasks.workunit.client.1.vm07.stdout:2/888: dread d0/d42/d26/d7d/fc8 [0,4194304] 0 2026-03-10T12:38:29.189 INFO:tasks.workunit.client.1.vm07.stdout:1/964: fdatasync d9/df/d29/d2b/d31/f72 0 2026-03-10T12:38:29.192 INFO:tasks.workunit.client.1.vm07.stdout:5/967: creat d0/d22/d18/d19/d21/dc2/df0/f14a x:0 0 0 2026-03-10T12:38:29.194 INFO:tasks.workunit.client.1.vm07.stdout:2/889: mknod d0/d29/d64/db5/dbb/df9/c136 0 2026-03-10T12:38:29.195 INFO:tasks.workunit.client.1.vm07.stdout:2/890: write d0/d29/d64/db5/dbb/d114/dad/ddd/f116 [838387,27579] 0 2026-03-10T12:38:29.196 INFO:tasks.workunit.client.1.vm07.stdout:2/891: dread - d0/d5b/f12d zero size 2026-03-10T12:38:29.198 
INFO:tasks.workunit.client.1.vm07.stdout:7/917: dwrite d0/d57/dd6/d80/fac [0,4194304] 0 2026-03-10T12:38:29.211 INFO:tasks.workunit.client.1.vm07.stdout:5/968: mkdir d0/d22/d18/d19/d21/dc2/df0/d14b 0 2026-03-10T12:38:29.218 INFO:tasks.workunit.client.1.vm07.stdout:6/943: dwrite d1/d4/d6/d43/d88/d97/ff4 [0,4194304] 0 2026-03-10T12:38:29.223 INFO:tasks.workunit.client.1.vm07.stdout:8/946: dwrite d1/d3/d6c/fe3 [0,4194304] 0 2026-03-10T12:38:29.224 INFO:tasks.workunit.client.1.vm07.stdout:5/969: sync 2026-03-10T12:38:29.225 INFO:tasks.workunit.client.1.vm07.stdout:2/892: rename d0/d29/l10d to d0/d29/d64/db5/dbb/l137 0 2026-03-10T12:38:29.230 INFO:tasks.workunit.client.1.vm07.stdout:6/944: creat d1/d4/d44/d134/d4d/d107/f13e x:0 0 0 2026-03-10T12:38:29.230 INFO:tasks.workunit.client.1.vm07.stdout:8/947: readlink d1/l4d 0 2026-03-10T12:38:29.231 INFO:tasks.workunit.client.1.vm07.stdout:8/948: write d1/d3/f8 [3221956,121677] 0 2026-03-10T12:38:29.232 INFO:tasks.workunit.client.1.vm07.stdout:5/970: unlink d0/d22/d18/d19/d2e/da9/f103 0 2026-03-10T12:38:29.235 INFO:tasks.workunit.client.1.vm07.stdout:1/965: getdents d9/df/d29/d2b/d92/d123 0 2026-03-10T12:38:29.242 INFO:tasks.workunit.client.1.vm07.stdout:8/949: dread d1/d3/d6/d50/d70/dfb/ffe [0,4194304] 0 2026-03-10T12:38:29.244 INFO:tasks.workunit.client.1.vm07.stdout:6/945: truncate d1/d4/d6/d43/d88/d97/fa2 228668 0 2026-03-10T12:38:29.246 INFO:tasks.workunit.client.1.vm07.stdout:5/971: fsync d0/d22/d18/d3e/df6/ff8 0 2026-03-10T12:38:29.249 INFO:tasks.workunit.client.1.vm07.stdout:2/893: creat d0/d29/d64/f138 x:0 0 0 2026-03-10T12:38:29.250 INFO:tasks.workunit.client.1.vm07.stdout:8/950: symlink d1/d3/l131 0 2026-03-10T12:38:29.251 INFO:tasks.workunit.client.1.vm07.stdout:8/951: chown d1/d3/f16 208 1 2026-03-10T12:38:29.251 INFO:tasks.workunit.client.1.vm07.stdout:8/952: fsync d1/d3/d11/f43 0 2026-03-10T12:38:29.252 INFO:tasks.workunit.client.1.vm07.stdout:6/946: creat d1/d4/d44/d12c/f13f x:0 0 0 2026-03-10T12:38:29.255 
INFO:tasks.workunit.client.1.vm07.stdout:1/966: rename d9/d2d/d80/d8e/lbb to d9/d2d/d4f/d75/d77/da7/l13e 0 2026-03-10T12:38:29.258 INFO:tasks.workunit.client.1.vm07.stdout:2/894: rename d0/d5b/f10c to d0/d29/d64/db5/f139 0 2026-03-10T12:38:29.265 INFO:tasks.workunit.client.1.vm07.stdout:1/967: unlink d9/d2d/d4f/d75/de3/ff9 0 2026-03-10T12:38:29.265 INFO:tasks.workunit.client.1.vm07.stdout:5/972: creat d0/d22/d18/d19/d36/d75/f14c x:0 0 0 2026-03-10T12:38:29.265 INFO:tasks.workunit.client.1.vm07.stdout:2/895: creat d0/d80/d93/f13a x:0 0 0 2026-03-10T12:38:29.266 INFO:tasks.workunit.client.1.vm07.stdout:1/968: creat d9/df/d29/d2b/d92/f13f x:0 0 0 2026-03-10T12:38:29.266 INFO:tasks.workunit.client.1.vm07.stdout:6/947: dread d1/d4/d6/d16/d1a/d33/f3c [0,4194304] 0 2026-03-10T12:38:29.268 INFO:tasks.workunit.client.1.vm07.stdout:5/973: fsync d0/d22/d18/d19/d2e/da9/fb5 0 2026-03-10T12:38:29.269 INFO:tasks.workunit.client.1.vm07.stdout:8/953: rename d1/d3/d6/d54/f72 to d1/d3/d6c/dde/d127/f132 0 2026-03-10T12:38:29.272 INFO:tasks.workunit.client.1.vm07.stdout:5/974: symlink d0/d22/d18/d19/d2e/da9/l14d 0 2026-03-10T12:38:29.274 INFO:tasks.workunit.client.1.vm07.stdout:1/969: symlink d9/d2d/d124/l140 0 2026-03-10T12:38:29.275 INFO:tasks.workunit.client.1.vm07.stdout:8/954: mknod d1/d3/d6/d50/d70/dfb/c133 0 2026-03-10T12:38:29.278 INFO:tasks.workunit.client.1.vm07.stdout:8/955: mknod d1/d3/d40/d92/dba/c134 0 2026-03-10T12:38:29.279 INFO:tasks.workunit.client.1.vm07.stdout:8/956: readlink d1/d3/d40/ldd 0 2026-03-10T12:38:29.279 INFO:tasks.workunit.client.1.vm07.stdout:8/957: chown d1/d3/d6/l78 5 1 2026-03-10T12:38:29.282 INFO:tasks.workunit.client.1.vm07.stdout:8/958: symlink d1/d3/d40/d92/db6/d11d/l135 0 2026-03-10T12:38:29.285 INFO:tasks.workunit.client.1.vm07.stdout:8/959: dwrite d1/d3/d11/f35 [0,4194304] 0 2026-03-10T12:38:29.287 INFO:tasks.workunit.client.1.vm07.stdout:1/970: getdents d9/df/d29/d2b/d31/d11f 0 2026-03-10T12:38:29.289 
INFO:tasks.workunit.client.1.vm07.stdout:2/896: sync 2026-03-10T12:38:29.292 INFO:tasks.workunit.client.1.vm07.stdout:8/960: dread d1/d3/d6c/fe3 [0,4194304] 0 2026-03-10T12:38:29.293 INFO:tasks.workunit.client.1.vm07.stdout:8/961: read - d1/d3/db2/dcd/d105/f114 zero size 2026-03-10T12:38:29.300 INFO:tasks.workunit.client.1.vm07.stdout:1/971: mknod d9/df/d29/d2b/d92/d9d/c141 0 2026-03-10T12:38:29.300 INFO:tasks.workunit.client.1.vm07.stdout:2/897: mkdir d0/d80/d93/d13b 0 2026-03-10T12:38:29.301 INFO:tasks.workunit.client.1.vm07.stdout:2/898: truncate d0/d29/d64/f138 218381 0 2026-03-10T12:38:29.303 INFO:tasks.workunit.client.1.vm07.stdout:8/962: truncate d1/d3/db2/dcd/fa4 497710 0 2026-03-10T12:38:29.311 INFO:tasks.workunit.client.1.vm07.stdout:1/972: dread d9/df/d29/d2b/d31/d11f/f7a [0,4194304] 0 2026-03-10T12:38:29.318 INFO:tasks.workunit.client.1.vm07.stdout:2/899: truncate d0/de1/f11f 14214 0 2026-03-10T12:38:29.318 INFO:tasks.workunit.client.1.vm07.stdout:2/900: chown d0/d29/d64/d6c 0 1 2026-03-10T12:38:29.319 INFO:tasks.workunit.client.1.vm07.stdout:2/901: readlink d0/d42/d1f/le5 0 2026-03-10T12:38:29.322 INFO:tasks.workunit.client.1.vm07.stdout:1/973: truncate d9/df/d29/d2b/d31/f72 648186 0 2026-03-10T12:38:29.323 INFO:tasks.workunit.client.1.vm07.stdout:1/974: chown d9/d2d/d4f/d5a/f104 176297 1 2026-03-10T12:38:29.324 INFO:tasks.workunit.client.1.vm07.stdout:8/963: dread d1/f6b [4194304,4194304] 0 2026-03-10T12:38:29.329 INFO:tasks.workunit.client.1.vm07.stdout:7/918: write d0/d57/d62/f8b [2037087,31928] 0 2026-03-10T12:38:29.330 INFO:tasks.workunit.client.1.vm07.stdout:7/919: truncate d0/d57/dd6/d80/f10f 437168 0 2026-03-10T12:38:29.335 INFO:tasks.workunit.client.1.vm07.stdout:1/975: fdatasync d9/df/d29/d2b/d31/f7d 0 2026-03-10T12:38:29.340 INFO:tasks.workunit.client.1.vm07.stdout:8/964: dread d1/f2 [0,4194304] 0 2026-03-10T12:38:29.344 INFO:tasks.workunit.client.1.vm07.stdout:7/920: dread d0/f4f [0,4194304] 0 2026-03-10T12:38:29.345 
INFO:tasks.workunit.client.1.vm07.stdout:1/976: symlink d9/d2d/d124/l142 0 2026-03-10T12:38:29.350 INFO:tasks.workunit.client.1.vm07.stdout:8/965: fdatasync d1/d3/d6/f81 0 2026-03-10T12:38:29.354 INFO:tasks.workunit.client.1.vm07.stdout:1/977: symlink d9/df/d29/d2b/d31/d91/l143 0 2026-03-10T12:38:29.358 INFO:tasks.workunit.client.1.vm07.stdout:8/966: mknod d1/d3/d6/d54/c136 0 2026-03-10T12:38:29.365 INFO:tasks.workunit.client.1.vm07.stdout:7/921: fdatasync d0/d57/d62/f75 0 2026-03-10T12:38:29.368 INFO:tasks.workunit.client.1.vm07.stdout:6/948: truncate d1/d4/d6/d16/d1a/d33/f92 2651316 0 2026-03-10T12:38:29.368 INFO:tasks.workunit.client.1.vm07.stdout:5/975: write d0/d22/d18/d3e/d5d/db6/fe6 [154262,100079] 0 2026-03-10T12:38:29.370 INFO:tasks.workunit.client.1.vm07.stdout:1/978: dread - d9/df/d29/d2b/d30/ffd zero size 2026-03-10T12:38:29.372 INFO:tasks.workunit.client.1.vm07.stdout:6/949: dread d1/d4/d6/f8d [0,4194304] 0 2026-03-10T12:38:29.374 INFO:tasks.workunit.client.1.vm07.stdout:8/967: dread d1/d3/d5d/fd5 [0,4194304] 0 2026-03-10T12:38:29.375 INFO:tasks.workunit.client.1.vm07.stdout:6/950: truncate d1/d4/d44/d134/d4d/dc7/f139 33927 0 2026-03-10T12:38:29.377 INFO:tasks.workunit.client.1.vm07.stdout:7/922: rename d0/d47/dde/df5/f119 to d0/d47/dde/df5/f12d 0 2026-03-10T12:38:29.380 INFO:tasks.workunit.client.1.vm07.stdout:1/979: symlink d9/df/d29/d2b/d31/d11f/l144 0 2026-03-10T12:38:29.382 INFO:tasks.workunit.client.1.vm07.stdout:1/980: dread d9/d2d/d4f/dde/f122 [0,4194304] 0 2026-03-10T12:38:29.385 INFO:tasks.workunit.client.0.vm00.stdout:6/955: link d2/da/dc/d2f/l7f d2/d51/d70/l156 0 2026-03-10T12:38:29.385 INFO:tasks.workunit.client.1.vm07.stdout:5/976: mknod d0/d22/d18/d19/d36/d75/c14e 0 2026-03-10T12:38:29.393 INFO:tasks.workunit.client.1.vm07.stdout:7/923: creat d0/d117/f12e x:0 0 0 2026-03-10T12:38:29.394 INFO:tasks.workunit.client.1.vm07.stdout:7/924: write d0/d57/d62/d90/f118 [211404,126783] 0 2026-03-10T12:38:29.400 
INFO:tasks.workunit.client.1.vm07.stdout:2/902: dwrite d0/d29/d64/db5/dbb/d114/fc4 [0,4194304] 0 2026-03-10T12:38:29.406 INFO:tasks.workunit.client.0.vm00.stdout:6/956: rmdir d2/d16/d29/d31/d34/d124 0 2026-03-10T12:38:29.406 INFO:tasks.workunit.client.0.vm00.stdout:6/957: readlink d2/da/dc/d2f/d10a/d12e/l145 0 2026-03-10T12:38:29.407 INFO:tasks.workunit.client.1.vm07.stdout:8/968: dread d1/d3/f57 [0,4194304] 0 2026-03-10T12:38:29.407 INFO:tasks.workunit.client.0.vm00.stdout:6/958: dread - d2/d42/d80/dfd/f12f zero size 2026-03-10T12:38:29.409 INFO:tasks.workunit.client.1.vm07.stdout:2/903: dwrite d0/d29/d64/db5/dbb/d114/dad/ddd/f116 [0,4194304] 0 2026-03-10T12:38:29.416 INFO:tasks.workunit.client.1.vm07.stdout:1/981: symlink d9/d2d/l145 0 2026-03-10T12:38:29.417 INFO:tasks.workunit.client.0.vm00.stdout:6/959: mkdir d2/d14/dbb/d12c/d157 0 2026-03-10T12:38:29.418 INFO:tasks.workunit.client.0.vm00.stdout:6/960: write d2/d14/dbb/fd7 [659536,78508] 0 2026-03-10T12:38:29.418 INFO:tasks.workunit.client.1.vm07.stdout:7/925: mkdir d0/d61/db4/df4/d111/d12f 0 2026-03-10T12:38:29.418 INFO:tasks.workunit.client.1.vm07.stdout:6/951: write d1/d4/d6/d43/d65/f86 [1632396,21720] 0 2026-03-10T12:38:29.418 INFO:tasks.workunit.client.0.vm00.stdout:6/961: stat d2/d42/d80/d89/le5 0 2026-03-10T12:38:29.424 INFO:tasks.workunit.client.1.vm07.stdout:5/977: symlink d0/d22/d18/d19/d36/d75/l14f 0 2026-03-10T12:38:29.438 INFO:tasks.workunit.client.1.vm07.stdout:2/904: creat d0/d29/d64/db5/dbb/dca/d105/f13c x:0 0 0 2026-03-10T12:38:29.440 INFO:tasks.workunit.client.1.vm07.stdout:6/952: dread d1/d4/d6/f13 [0,4194304] 0 2026-03-10T12:38:29.443 INFO:tasks.workunit.client.0.vm00.stdout:6/962: creat d2/d16/d29/d31/f158 x:0 0 0 2026-03-10T12:38:29.443 INFO:tasks.workunit.client.0.vm00.stdout:6/963: chown d2/d42/d80/d9d 15 1 2026-03-10T12:38:29.445 INFO:tasks.workunit.client.1.vm07.stdout:1/982: write d9/d2d/d4f/f95 [3240184,59518] 0 2026-03-10T12:38:29.445 INFO:tasks.workunit.client.1.vm07.stdout:1/983: 
chown d9/l42 103181 1 2026-03-10T12:38:29.451 INFO:tasks.workunit.client.1.vm07.stdout:1/984: dwrite d9/df/d29/d2b/d31/d11f/de1/f118 [0,4194304] 0 2026-03-10T12:38:29.456 INFO:tasks.workunit.client.1.vm07.stdout:7/926: dwrite d0/d57/d62/d90/dce/f120 [0,4194304] 0 2026-03-10T12:38:29.462 INFO:tasks.workunit.client.1.vm07.stdout:5/978: creat d0/d22/d18/d19/d21/d54/dcb/db8/f150 x:0 0 0 2026-03-10T12:38:29.463 INFO:tasks.workunit.client.1.vm07.stdout:5/979: write d0/d22/d18/d3e/d5d/db6/fc4 [3850648,25807] 0 2026-03-10T12:38:29.474 INFO:tasks.workunit.client.0.vm00.stdout:6/964: rename d2/d42/l6b to d2/da/dc/l159 0 2026-03-10T12:38:29.486 INFO:tasks.workunit.client.0.vm00.stdout:6/965: dread - d2/d9f/dce/ff2 zero size 2026-03-10T12:38:29.486 INFO:tasks.workunit.client.1.vm07.stdout:6/953: rename d1/d4/d6/d16/d1a/d99/df5/f115 to d1/d4/d6/d43/d65/f140 0 2026-03-10T12:38:29.486 INFO:tasks.workunit.client.1.vm07.stdout:6/954: write d1/d4/d6/d43/f125 [4090407,93449] 0 2026-03-10T12:38:29.486 INFO:tasks.workunit.client.1.vm07.stdout:1/985: mkdir d9/df/d29/d2b/d92/d146 0 2026-03-10T12:38:29.486 INFO:tasks.workunit.client.1.vm07.stdout:7/927: fdatasync d0/d52/fc7 0 2026-03-10T12:38:29.486 INFO:tasks.workunit.client.1.vm07.stdout:1/986: chown d9/df/d29/d2b/d31/d11f/f7a 226 1 2026-03-10T12:38:29.486 INFO:tasks.workunit.client.1.vm07.stdout:8/969: link d1/d3/d6/l17 d1/d3/d11/d87/l137 0 2026-03-10T12:38:29.486 INFO:tasks.workunit.client.1.vm07.stdout:7/928: creat d0/d57/d62/d90/da1/f130 x:0 0 0 2026-03-10T12:38:29.486 INFO:tasks.workunit.client.1.vm07.stdout:2/905: link d0/d29/d64/d74/d88/f58 d0/d42/d26/d7d/d122/f13d 0 2026-03-10T12:38:29.486 INFO:tasks.workunit.client.1.vm07.stdout:2/906: stat d0/f73 0 2026-03-10T12:38:29.486 INFO:tasks.workunit.client.1.vm07.stdout:7/929: dread - d0/d61/db4/f7a zero size 2026-03-10T12:38:29.487 INFO:tasks.workunit.client.1.vm07.stdout:2/907: read d0/d42/d1f/d90/fb2 [3874435,7858] 0 2026-03-10T12:38:29.491 
INFO:tasks.workunit.client.1.vm07.stdout:1/987: creat d9/df/dc9/df4/f147 x:0 0 0 2026-03-10T12:38:29.492 INFO:tasks.workunit.client.1.vm07.stdout:7/930: chown d0/d47/da0/c11b 419 1 2026-03-10T12:38:29.496 INFO:tasks.workunit.client.1.vm07.stdout:1/988: dwrite d9/df/d29/d2b/db1/f12b [0,4194304] 0 2026-03-10T12:38:29.498 INFO:tasks.workunit.client.0.vm00.stdout:6/966: rmdir d2/d14/dbb/d12c/d157 0 2026-03-10T12:38:29.513 INFO:tasks.workunit.client.1.vm07.stdout:6/955: getdents d1/d4/d6/d43 0 2026-03-10T12:38:29.518 INFO:tasks.workunit.client.1.vm07.stdout:1/989: dread d9/d2d/fcb [0,4194304] 0 2026-03-10T12:38:29.526 INFO:tasks.workunit.client.1.vm07.stdout:6/956: dread d1/d4/d6/d16/d1a/d99/fa8 [0,4194304] 0 2026-03-10T12:38:29.527 INFO:tasks.workunit.client.1.vm07.stdout:6/957: fdatasync d1/fc9 0 2026-03-10T12:38:29.529 INFO:tasks.workunit.client.1.vm07.stdout:1/990: creat d9/df/d29/d2b/d31/d11f/f148 x:0 0 0 2026-03-10T12:38:29.531 INFO:tasks.workunit.client.1.vm07.stdout:7/931: rmdir d0/d61/db4/df4/d111/d12f 0 2026-03-10T12:38:29.536 INFO:tasks.workunit.client.1.vm07.stdout:5/980: truncate d0/d22/d18/d19/d21/dc2/ded/f10a 748508 0 2026-03-10T12:38:29.536 INFO:tasks.workunit.client.1.vm07.stdout:5/981: write d0/d22/d18/d30/f11e [873522,48347] 0 2026-03-10T12:38:29.538 INFO:tasks.workunit.client.1.vm07.stdout:1/991: rename d9/df/d29/d2b/d92/d123/cc3 to d9/df/d29/d2b/d31/d11f/c149 0 2026-03-10T12:38:29.541 INFO:tasks.workunit.client.1.vm07.stdout:7/932: creat d0/d47/dde/df5/f131 x:0 0 0 2026-03-10T12:38:29.542 INFO:tasks.workunit.client.1.vm07.stdout:7/933: truncate d0/d61/db4/d8a/fbe 328747 0 2026-03-10T12:38:29.542 INFO:tasks.workunit.client.1.vm07.stdout:7/934: fsync d0/d61/db4/f103 0 2026-03-10T12:38:29.547 INFO:tasks.workunit.client.1.vm07.stdout:2/908: write d0/d42/d4e/daf/fcf [815592,94844] 0 2026-03-10T12:38:29.548 INFO:tasks.workunit.client.1.vm07.stdout:2/909: read d0/d29/d64/db5/dbb/d114/dad/ddd/f116 [407171,10626] 0 2026-03-10T12:38:29.549 
INFO:tasks.workunit.client.1.vm07.stdout:2/910: readlink d0/d5b/le0 0 2026-03-10T12:38:29.550 INFO:tasks.workunit.client.1.vm07.stdout:2/911: write d0/d42/d26/d7d/fe8 [4443253,41589] 0 2026-03-10T12:38:29.551 INFO:tasks.workunit.client.1.vm07.stdout:2/912: write d0/d29/d64/f67 [1604042,3445] 0 2026-03-10T12:38:29.552 INFO:tasks.workunit.client.1.vm07.stdout:8/970: dwrite d1/d3/d6/d54/fa8 [0,4194304] 0 2026-03-10T12:38:29.553 INFO:tasks.workunit.client.1.vm07.stdout:5/982: chown d0/d22/d18/d19/d72/dcc/fe7 1 1 2026-03-10T12:38:29.561 INFO:tasks.workunit.client.0.vm00.stdout:6/967: mkdir d2/d16/d29/d31/d88/d15a 0 2026-03-10T12:38:29.562 INFO:tasks.workunit.client.1.vm07.stdout:1/992: rename d9/df/d55/d9f/c10c to d9/d2d/de2/dc8/c14a 0 2026-03-10T12:38:29.563 INFO:tasks.workunit.client.1.vm07.stdout:7/935: creat d0/d61/db4/df4/d111/f132 x:0 0 0 2026-03-10T12:38:29.570 INFO:tasks.workunit.client.1.vm07.stdout:6/958: write d1/d4/d6/f7c [947177,41095] 0 2026-03-10T12:38:29.574 INFO:tasks.workunit.client.0.vm00.stdout:6/968: mkdir d2/da/dc/d94/d149/d15b 0 2026-03-10T12:38:29.575 INFO:tasks.workunit.client.0.vm00.stdout:6/969: truncate d2/d16/f6d 2573022 0 2026-03-10T12:38:29.576 INFO:tasks.workunit.client.0.vm00.stdout:6/970: readlink d2/d42/d80/d89/l93 0 2026-03-10T12:38:29.577 INFO:tasks.workunit.client.1.vm07.stdout:5/983: unlink d0/d22/d18/d19/d21/d54/dcb/db8/fca 0 2026-03-10T12:38:29.580 INFO:tasks.workunit.client.1.vm07.stdout:5/984: dwrite d0/d22/f50 [0,4194304] 0 2026-03-10T12:38:29.585 INFO:tasks.workunit.client.1.vm07.stdout:1/993: rmdir d9/d2d/d4f/d5a/d11c 39 2026-03-10T12:38:29.596 INFO:tasks.workunit.client.1.vm07.stdout:6/959: rename d1/d4/d6/d4e/l11d to d1/d4/d44/d134/d4d/dc7/dd9/l141 0 2026-03-10T12:38:29.596 INFO:tasks.workunit.client.1.vm07.stdout:6/960: write d1/fc9 [1712945,36138] 0 2026-03-10T12:38:29.599 INFO:tasks.workunit.client.0.vm00.stdout:6/971: creat d2/d16/d29/d31/d88/d92/daa/f15c x:0 0 0 2026-03-10T12:38:29.599 
INFO:tasks.workunit.client.0.vm00.stdout:6/972: chown d2/da/dc/f25 11296 1 2026-03-10T12:38:29.600 INFO:tasks.workunit.client.1.vm07.stdout:2/913: mkdir d0/de1/df2/d13e 0 2026-03-10T12:38:29.602 INFO:tasks.workunit.client.0.vm00.stdout:6/973: creat d2/d16/d29/d31/d88/f15d x:0 0 0 2026-03-10T12:38:29.604 INFO:tasks.workunit.client.1.vm07.stdout:5/985: rmdir d0/d22/d18/d19/d72/dcc 39 2026-03-10T12:38:29.610 INFO:tasks.workunit.client.0.vm00.stdout:6/974: symlink d2/da/dc/d94/d149/d15b/l15e 0 2026-03-10T12:38:29.612 INFO:tasks.workunit.client.0.vm00.stdout:6/975: dread d2/da/dc/d94/f133 [0,4194304] 0 2026-03-10T12:38:29.615 INFO:tasks.workunit.client.1.vm07.stdout:8/971: write d1/d3/db2/dcd/fa4 [880639,23534] 0 2026-03-10T12:38:29.617 INFO:tasks.workunit.client.1.vm07.stdout:7/936: truncate d0/d57/dd6/d80/ffb 2702140 0 2026-03-10T12:38:29.622 INFO:tasks.workunit.client.1.vm07.stdout:6/961: dwrite d1/d4/d6/d16/d49/fd3 [0,4194304] 0 2026-03-10T12:38:29.630 INFO:tasks.workunit.client.0.vm00.stdout:6/976: mkdir d2/da/dc/d94/d15f 0 2026-03-10T12:38:29.634 INFO:tasks.workunit.client.1.vm07.stdout:8/972: symlink d1/d3/d6/d50/d70/dcf/l138 0 2026-03-10T12:38:29.640 INFO:tasks.workunit.client.0.vm00.stdout:6/977: mkdir d2/d42/d103/d160 0 2026-03-10T12:38:29.641 INFO:tasks.workunit.client.1.vm07.stdout:8/973: dread - d1/d3/d11/d87/fe5 zero size 2026-03-10T12:38:29.641 INFO:tasks.workunit.client.1.vm07.stdout:8/974: stat d1/d3/d11/ce8 0 2026-03-10T12:38:29.642 INFO:tasks.workunit.client.0.vm00.stdout:6/978: symlink d2/da/dc/d94/d15f/l161 0 2026-03-10T12:38:29.643 INFO:tasks.workunit.client.0.vm00.stdout:6/979: dread d2/d14/f32 [0,4194304] 0 2026-03-10T12:38:29.643 INFO:tasks.workunit.client.1.vm07.stdout:8/975: mknod d1/d3/db2/c139 0 2026-03-10T12:38:29.644 INFO:tasks.workunit.client.1.vm07.stdout:5/986: dread d0/d22/d18/f4c [4194304,4194304] 0 2026-03-10T12:38:29.645 INFO:tasks.workunit.client.1.vm07.stdout:8/976: creat d1/d3/d6/d50/d70/dfb/f13a x:0 0 0 2026-03-10T12:38:29.649 
INFO:tasks.workunit.client.1.vm07.stdout:8/977: unlink d1/d3/d6/d50/fc8 0 2026-03-10T12:38:29.650 INFO:tasks.workunit.client.1.vm07.stdout:8/978: truncate d1/d3/d6/faf 1529338 0 2026-03-10T12:38:29.652 INFO:tasks.workunit.client.1.vm07.stdout:8/979: mkdir d1/d3/d6c/dde/d127/d13b 0 2026-03-10T12:38:29.653 INFO:tasks.workunit.client.1.vm07.stdout:8/980: rename d1/d3/d40/d92/f94 to d1/d3/d40/d92/db6/f13c 0 2026-03-10T12:38:29.655 INFO:tasks.workunit.client.1.vm07.stdout:8/981: symlink d1/d3/d11/d11a/l13d 0 2026-03-10T12:38:29.655 INFO:tasks.workunit.client.1.vm07.stdout:8/982: chown d1/d3/d6/d54/ffa 172 1 2026-03-10T12:38:29.656 INFO:tasks.workunit.client.1.vm07.stdout:8/983: creat d1/d3/d40/d92/db6/f13e x:0 0 0 2026-03-10T12:38:29.658 INFO:tasks.workunit.client.0.vm00.stdout:6/980: symlink d2/d16/d29/d31/l162 0 2026-03-10T12:38:29.658 INFO:tasks.workunit.client.1.vm07.stdout:8/984: link d1/d3/d6c/dde/de7/ff4 d1/d3/d18/f13f 0 2026-03-10T12:38:29.660 INFO:tasks.workunit.client.0.vm00.stdout:6/981: mknod d2/d14/dbb/c163 0 2026-03-10T12:38:29.661 INFO:tasks.workunit.client.1.vm07.stdout:8/985: stat d1/d3/d6/d50/d70/dd4/l103 0 2026-03-10T12:38:29.661 INFO:tasks.workunit.client.1.vm07.stdout:8/986: dread - d1/d3/d6c/dde/f10b zero size 2026-03-10T12:38:29.663 INFO:tasks.workunit.client.1.vm07.stdout:8/987: unlink d1/l4d 0 2026-03-10T12:38:29.674 INFO:tasks.workunit.client.1.vm07.stdout:2/914: dwrite d0/d29/f32 [0,4194304] 0 2026-03-10T12:38:29.676 INFO:tasks.workunit.client.1.vm07.stdout:1/994: dwrite d9/df/d29/d2b/d31/d91/d59/f84 [0,4194304] 0 2026-03-10T12:38:29.679 INFO:tasks.workunit.client.1.vm07.stdout:6/962: write d1/d4/d6/d16/d1a/d2c/ffd [378474,94298] 0 2026-03-10T12:38:29.681 INFO:tasks.workunit.client.1.vm07.stdout:7/937: dwrite d0/d57/d62/f75 [0,4194304] 0 2026-03-10T12:38:29.681 INFO:tasks.workunit.client.1.vm07.stdout:7/938: stat d0/d61/db4/f54 0 2026-03-10T12:38:29.692 INFO:tasks.workunit.client.1.vm07.stdout:7/939: readlink d0/d61/d79/l125 0 
2026-03-10T12:38:29.709 INFO:tasks.workunit.client.1.vm07.stdout:8/988: getdents d1/d3/d40/d92/db6/d11d 0 2026-03-10T12:38:29.714 INFO:tasks.workunit.client.1.vm07.stdout:5/987: write d0/d22/d18/d19/d21/fa1 [1061658,83655] 0 2026-03-10T12:38:29.717 INFO:tasks.workunit.client.1.vm07.stdout:1/995: mkdir d9/d2d/de2/dc8/d14b 0 2026-03-10T12:38:29.719 INFO:tasks.workunit.client.0.vm00.stdout:6/982: getdents d2/d16/d29/d31/d34 0 2026-03-10T12:38:29.721 INFO:tasks.workunit.client.1.vm07.stdout:7/940: creat d0/d47/dab/f133 x:0 0 0 2026-03-10T12:38:29.726 INFO:tasks.workunit.client.0.vm00.stdout:6/983: mkdir d2/d16/d29/d31/d164 0 2026-03-10T12:38:29.726 INFO:tasks.workunit.client.1.vm07.stdout:2/915: mknod d0/d42/c13f 0 2026-03-10T12:38:29.728 INFO:tasks.workunit.client.0.vm00.stdout:6/984: truncate d2/d42/d80/d9d/f106 919177 0 2026-03-10T12:38:29.730 INFO:tasks.workunit.client.0.vm00.stdout:6/985: rmdir d2/d16/d29/d31/d88/d143 39 2026-03-10T12:38:29.733 INFO:tasks.workunit.client.1.vm07.stdout:2/916: unlink d0/d29/d64/d74/d75/db7/lda 0 2026-03-10T12:38:29.733 INFO:tasks.workunit.client.1.vm07.stdout:2/917: chown d0/d29/d64/d74/df4 3 1 2026-03-10T12:38:29.737 INFO:tasks.workunit.client.1.vm07.stdout:8/989: link d1/d3/d6/c6f d1/d3/d6/d50/d70/dd4/c140 0 2026-03-10T12:38:29.739 INFO:tasks.workunit.client.1.vm07.stdout:2/918: unlink d0/d29/d64/db5/dbb/dca/d105/f13c 0 2026-03-10T12:38:29.739 INFO:tasks.workunit.client.1.vm07.stdout:5/988: getdents d0/d22/d18/d3e/d5d/d12c 0 2026-03-10T12:38:29.743 INFO:tasks.workunit.client.1.vm07.stdout:8/990: dread d1/f88 [0,4194304] 0 2026-03-10T12:38:29.746 INFO:tasks.workunit.client.1.vm07.stdout:7/941: dread d0/d61/fdb [0,4194304] 0 2026-03-10T12:38:29.746 INFO:tasks.workunit.client.1.vm07.stdout:2/919: creat d0/d45/f140 x:0 0 0 2026-03-10T12:38:29.746 INFO:tasks.workunit.client.1.vm07.stdout:8/991: chown d1/d3/d11/l5c 62 1 2026-03-10T12:38:29.747 INFO:tasks.workunit.client.1.vm07.stdout:7/942: dread d0/f4f [0,4194304] 0 
2026-03-10T12:38:29.749 INFO:tasks.workunit.client.1.vm07.stdout:2/920: fsync d0/d29/d64/db5/dbb/d114/dad/ddd/f116 0 2026-03-10T12:38:29.749 INFO:tasks.workunit.client.1.vm07.stdout:5/989: mkdir d0/d22/d18/d3e/d5d/d10b/d12d/d151 0 2026-03-10T12:38:29.758 INFO:tasks.workunit.client.1.vm07.stdout:6/963: write d1/d4/d44/fd0 [967687,127471] 0 2026-03-10T12:38:29.762 INFO:tasks.workunit.client.1.vm07.stdout:1/996: write d9/df/d29/f70 [4805403,108695] 0 2026-03-10T12:38:29.765 INFO:tasks.workunit.client.0.vm00.stdout:6/986: truncate d2/d16/d29/f54 2417711 0 2026-03-10T12:38:29.767 INFO:tasks.workunit.client.0.vm00.stdout:6/987: write d2/d16/f20 [5806487,53561] 0 2026-03-10T12:38:29.771 INFO:tasks.workunit.client.1.vm07.stdout:6/964: dread d1/d4/d6/d16/d1a/d9d/f10c [0,4194304] 0 2026-03-10T12:38:29.771 INFO:tasks.workunit.client.1.vm07.stdout:2/921: unlink d0/d29/fb3 0 2026-03-10T12:38:29.773 INFO:tasks.workunit.client.1.vm07.stdout:5/990: creat d0/d22/d18/d19/d2e/f152 x:0 0 0 2026-03-10T12:38:29.782 INFO:tasks.workunit.client.0.vm00.stdout:6/988: mkdir d2/d9f/dce/d165 0 2026-03-10T12:38:29.782 INFO:tasks.workunit.client.1.vm07.stdout:7/943: dwrite d0/d57/d62/d90/da1/fe7 [0,4194304] 0 2026-03-10T12:38:29.783 INFO:tasks.workunit.client.1.vm07.stdout:5/991: dread d0/d22/d18/d19/d21/d54/dcb/f87 [0,4194304] 0 2026-03-10T12:38:29.783 INFO:tasks.workunit.client.1.vm07.stdout:8/992: mkdir d1/d3/d11/d141 0 2026-03-10T12:38:29.783 INFO:tasks.workunit.client.1.vm07.stdout:2/922: creat d0/de1/f141 x:0 0 0 2026-03-10T12:38:29.784 INFO:tasks.workunit.client.1.vm07.stdout:6/965: truncate d1/d4/d6/d16/d1a/d99/fa8 9178610 0 2026-03-10T12:38:29.797 INFO:tasks.workunit.client.1.vm07.stdout:7/944: chown d0/d57/dd6/d80/ffb 42273022 1 2026-03-10T12:38:29.798 INFO:tasks.workunit.client.1.vm07.stdout:2/923: mknod d0/d80/c142 0 2026-03-10T12:38:29.800 INFO:tasks.workunit.client.1.vm07.stdout:5/992: creat d0/d22/dbc/f153 x:0 0 0 2026-03-10T12:38:29.801 
INFO:tasks.workunit.client.1.vm07.stdout:2/924: dread d0/d42/d26/d7d/fc8 [0,4194304] 0 2026-03-10T12:38:29.801 INFO:tasks.workunit.client.0.vm00.stdout:6/989: dread d2/d16/f17 [0,4194304] 0 2026-03-10T12:38:29.802 INFO:tasks.workunit.client.0.vm00.stdout:6/990: dread - d2/d9f/dce/ff2 zero size 2026-03-10T12:38:29.802 INFO:tasks.workunit.client.0.vm00.stdout:6/991: chown d2/da/dbf/ded/d118 0 1 2026-03-10T12:38:29.804 INFO:tasks.workunit.client.1.vm07.stdout:1/997: creat d9/d2d/d80/d8e/f14c x:0 0 0 2026-03-10T12:38:29.806 INFO:tasks.workunit.client.1.vm07.stdout:6/966: dwrite d1/d4/d6/d16/d1a/f9f [0,4194304] 0 2026-03-10T12:38:29.808 INFO:tasks.workunit.client.1.vm07.stdout:8/993: mknod d1/d3/d6c/dde/d127/d13b/c142 0 2026-03-10T12:38:29.813 INFO:tasks.workunit.client.1.vm07.stdout:5/993: fdatasync d0/d22/d18/d19/d2e/d67/fe2 0 2026-03-10T12:38:29.815 INFO:tasks.workunit.client.1.vm07.stdout:2/925: creat d0/d29/d64/db5/dbb/d114/d5d/f143 x:0 0 0 2026-03-10T12:38:29.817 INFO:tasks.workunit.client.1.vm07.stdout:7/945: dread d0/d61/db4/f4b [0,4194304] 0 2026-03-10T12:38:29.818 INFO:tasks.workunit.client.1.vm07.stdout:7/946: fsync d0/d61/db4/df4/d111/f132 0 2026-03-10T12:38:29.830 INFO:tasks.workunit.client.1.vm07.stdout:6/967: mkdir d1/dd7/da3/dd5/d142 0 2026-03-10T12:38:29.831 INFO:tasks.workunit.client.1.vm07.stdout:6/968: write d1/d4/d6/d43/d88/dc3/ff2 [3543436,72334] 0 2026-03-10T12:38:29.832 INFO:tasks.workunit.client.1.vm07.stdout:8/994: mkdir d1/d3/d11/d87/d143 0 2026-03-10T12:38:29.834 INFO:tasks.workunit.client.0.vm00.stdout:6/992: mkdir d2/d16/d74/d102/d166 0 2026-03-10T12:38:29.844 INFO:tasks.workunit.client.0.vm00.stdout:6/993: read - d2/d9f/dce/ff2 zero size 2026-03-10T12:38:29.845 INFO:tasks.workunit.client.1.vm07.stdout:5/994: symlink d0/d22/d18/d19/d21/d54/dcb/de8/l154 0 2026-03-10T12:38:29.845 INFO:tasks.workunit.client.1.vm07.stdout:5/995: dread - d0/d22/d18/d19/d72/ffc zero size 2026-03-10T12:38:29.845 INFO:tasks.workunit.client.1.vm07.stdout:7/947: 
rename d0/d117 to d0/d57/d62/d90/da1/d134 0 2026-03-10T12:38:29.845 INFO:tasks.workunit.client.1.vm07.stdout:6/969: readlink d1/d4/d44/d134/d4d/dc7/dd9/l141 0 2026-03-10T12:38:29.845 INFO:tasks.workunit.client.1.vm07.stdout:7/948: mkdir d0/d57/d62/d90/dce/d135 0 2026-03-10T12:38:29.850 INFO:tasks.workunit.client.1.vm07.stdout:6/970: creat d1/dd7/da3/dd5/f143 x:0 0 0 2026-03-10T12:38:29.850 INFO:tasks.workunit.client.1.vm07.stdout:6/971: fsync d1/dd7/f133 0 2026-03-10T12:38:29.851 INFO:tasks.workunit.client.1.vm07.stdout:8/995: dread d1/d3/db2/f124 [0,4194304] 0 2026-03-10T12:38:29.853 INFO:tasks.workunit.client.0.vm00.stdout:6/994: rename d2/d42/d103/d160 to d2/d16/d29/d31/d88/d167 0 2026-03-10T12:38:29.855 INFO:tasks.workunit.client.1.vm07.stdout:1/998: getdents d9/df/d55/d9f 0 2026-03-10T12:38:29.856 INFO:tasks.workunit.client.1.vm07.stdout:1/999: readlink d9/df/d29/d2b/d31/l3b 0 2026-03-10T12:38:29.861 INFO:tasks.workunit.client.0.vm00.stdout:6/995: dwrite d2/d14/f5d [4194304,4194304] 0 2026-03-10T12:38:29.864 INFO:tasks.workunit.client.1.vm07.stdout:8/996: fsync d1/d3/fbb 0 2026-03-10T12:38:29.877 INFO:tasks.workunit.client.1.vm07.stdout:8/997: dread d1/d3/d40/d92/db6/fad [0,4194304] 0 2026-03-10T12:38:29.877 INFO:tasks.workunit.client.0.vm00.stdout:6/996: sync 2026-03-10T12:38:29.877 INFO:tasks.workunit.client.1.vm07.stdout:2/926: dwrite d0/d42/d26/f85 [0,4194304] 0 2026-03-10T12:38:29.878 INFO:tasks.workunit.client.0.vm00.stdout:6/997: fdatasync d2/da/dc/d94/f121 0 2026-03-10T12:38:29.878 INFO:tasks.workunit.client.0.vm00.stdout:6/998: read - d2/d42/d80/dfd/f12f zero size 2026-03-10T12:38:29.894 INFO:tasks.workunit.client.1.vm07.stdout:2/927: chown d0/d29/d64/d6c/d94 26 1 2026-03-10T12:38:29.894 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:29 vm00.local ceph-mon[50686]: pgmap v11: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 41 MiB/s rd, 102 MiB/s wr, 269 op/s 2026-03-10T12:38:29.898 
INFO:tasks.workunit.client.1.vm07.stdout:5/996: dread d0/d22/d18/fb4 [4194304,4194304] 0 2026-03-10T12:38:29.899 INFO:tasks.workunit.client.1.vm07.stdout:8/998: truncate d1/d3/d40/d92/db6/fad 44587 0 2026-03-10T12:38:29.901 INFO:tasks.workunit.client.0.vm00.stdout:6/999: creat d2/d16/d29/d31/d88/d15a/f168 x:0 0 0 2026-03-10T12:38:29.902 INFO:tasks.workunit.client.1.vm07.stdout:7/949: dwrite d0/d52/f97 [0,4194304] 0 2026-03-10T12:38:29.908 INFO:tasks.workunit.client.1.vm07.stdout:6/972: dwrite d1/d4/d71/f79 [0,4194304] 0 2026-03-10T12:38:29.915 INFO:tasks.workunit.client.0.vm00.stderr:+ rm -rf -- ./tmp.328fxcUIfq 2026-03-10T12:38:29.926 INFO:tasks.workunit.client.1.vm07.stdout:7/950: mknod d0/d57/d62/d90/da1/d134/c136 0 2026-03-10T12:38:29.926 INFO:tasks.workunit.client.1.vm07.stdout:5/997: symlink d0/l155 0 2026-03-10T12:38:29.929 INFO:tasks.workunit.client.1.vm07.stdout:7/951: creat d0/d57/dd6/d80/f137 x:0 0 0 2026-03-10T12:38:29.929 INFO:tasks.workunit.client.1.vm07.stdout:7/952: dread - d0/d61/d79/f104 zero size 2026-03-10T12:38:29.930 INFO:tasks.workunit.client.1.vm07.stdout:5/998: fdatasync d0/d22/d18/d3e/d5d/db6/f132 0 2026-03-10T12:38:29.931 INFO:tasks.workunit.client.1.vm07.stdout:6/973: creat d1/d4/f144 x:0 0 0 2026-03-10T12:38:29.932 INFO:tasks.workunit.client.1.vm07.stdout:8/999: link d1/d3/db2/dcd/db8/lbf d1/d3/d6c/dde/de7/l144 0 2026-03-10T12:38:29.935 INFO:tasks.workunit.client.1.vm07.stdout:6/974: dwrite d1/d4/d6/d4e/f124 [0,4194304] 0 2026-03-10T12:38:29.936 INFO:tasks.workunit.client.1.vm07.stdout:7/953: rename d0/d61/db4/d8a/l96 to d0/d57/d62/d90/dce/l138 0 2026-03-10T12:38:29.937 INFO:tasks.workunit.client.1.vm07.stdout:5/999: fsync d0/d22/d18/d19/d72/fd8 0 2026-03-10T12:38:29.939 INFO:tasks.workunit.client.1.vm07.stdout:6/975: mknod d1/d4/d6/d43/d88/d97/c145 0 2026-03-10T12:38:29.940 INFO:tasks.workunit.client.1.vm07.stdout:7/954: creat d0/d67/d11c/d122/f139 x:0 0 0 2026-03-10T12:38:29.942 INFO:tasks.workunit.client.1.vm07.stdout:7/955: read 
d0/f56 [162160,31091] 0 2026-03-10T12:38:29.947 INFO:tasks.workunit.client.1.vm07.stdout:6/976: getdents d1/d106/d137 0 2026-03-10T12:38:29.988 INFO:tasks.workunit.client.1.vm07.stdout:2/928: sync 2026-03-10T12:38:30.051 INFO:tasks.workunit.client.1.vm07.stdout:7/956: write d0/d47/f58 [320860,89373] 0 2026-03-10T12:38:30.053 INFO:tasks.workunit.client.1.vm07.stdout:7/957: link d0/d61/d79/f104 d0/d57/f13a 0 2026-03-10T12:38:30.056 INFO:tasks.workunit.client.1.vm07.stdout:6/977: truncate d1/d4/d6/f41 512158 0 2026-03-10T12:38:30.056 INFO:tasks.workunit.client.1.vm07.stdout:2/929: write d0/d42/f2c [845145,100906] 0 2026-03-10T12:38:30.056 INFO:tasks.workunit.client.1.vm07.stdout:6/978: stat d1/d4/d6/d16/d1a/d6e/f11c 0 2026-03-10T12:38:30.061 INFO:tasks.workunit.client.1.vm07.stdout:2/930: rmdir d0/d42/d1f/d90 39 2026-03-10T12:38:30.066 INFO:tasks.workunit.client.1.vm07.stdout:7/958: write d0/d47/dab/dae/fbd [477899,10957] 0 2026-03-10T12:38:30.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:29 vm07.local ceph-mon[58582]: pgmap v11: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 41 MiB/s rd, 102 MiB/s wr, 269 op/s 2026-03-10T12:38:30.067 INFO:tasks.workunit.client.1.vm07.stdout:7/959: mkdir d0/d47/da0/dd4/d13b 0 2026-03-10T12:38:30.068 INFO:tasks.workunit.client.1.vm07.stdout:6/979: write d1/d4/d6/f7e [725566,110129] 0 2026-03-10T12:38:30.074 INFO:tasks.workunit.client.1.vm07.stdout:7/960: unlink d0/f28 0 2026-03-10T12:38:30.074 INFO:tasks.workunit.client.1.vm07.stdout:7/961: chown d0/d61/db4 13 1 2026-03-10T12:38:30.076 INFO:tasks.workunit.client.1.vm07.stdout:6/980: fdatasync d1/d4/d6/d16/d1a/d99/fe9 0 2026-03-10T12:38:30.078 INFO:tasks.workunit.client.1.vm07.stdout:7/962: mkdir d0/d61/d13c 0 2026-03-10T12:38:30.081 INFO:tasks.workunit.client.1.vm07.stdout:6/981: unlink d1/d106/fef 0 2026-03-10T12:38:30.085 INFO:tasks.workunit.client.1.vm07.stdout:6/982: read d1/d4/d6/f91 [528937,18246] 0 2026-03-10T12:38:30.088 
INFO:tasks.workunit.client.1.vm07.stdout:6/983: symlink d1/d4/d6/d16/l146 0 2026-03-10T12:38:30.088 INFO:tasks.workunit.client.1.vm07.stdout:6/984: readlink d1/d4/d6/l32 0 2026-03-10T12:38:30.092 INFO:tasks.workunit.client.1.vm07.stdout:6/985: unlink d1/d4/d6/d96/fea 0 2026-03-10T12:38:30.093 INFO:tasks.workunit.client.1.vm07.stdout:6/986: creat d1/d4/d44/f147 x:0 0 0 2026-03-10T12:38:30.095 INFO:tasks.workunit.client.1.vm07.stdout:2/931: dread d0/d42/d26/d7d/fea [0,4194304] 0 2026-03-10T12:38:30.097 INFO:tasks.workunit.client.1.vm07.stdout:6/987: rename d1/dd7/d66/dd6/deb to d1/d4/d9b/d148 0 2026-03-10T12:38:30.098 INFO:tasks.workunit.client.1.vm07.stdout:2/932: dwrite d0/d42/d4e/d77/f103 [0,4194304] 0 2026-03-10T12:38:30.104 INFO:tasks.workunit.client.1.vm07.stdout:7/963: dwrite d0/d47/dab/f129 [0,4194304] 0 2026-03-10T12:38:30.105 INFO:tasks.workunit.client.1.vm07.stdout:7/964: fdatasync d0/d61/db4/f4b 0 2026-03-10T12:38:30.116 INFO:tasks.workunit.client.1.vm07.stdout:7/965: mkdir d0/d67/d11c/d13d 0 2026-03-10T12:38:30.121 INFO:tasks.workunit.client.1.vm07.stdout:7/966: fsync d0/f56 0 2026-03-10T12:38:30.122 INFO:tasks.workunit.client.1.vm07.stdout:6/988: getdents d1/d4/d6/d16/d1a/d99/df5 0 2026-03-10T12:38:30.124 INFO:tasks.workunit.client.1.vm07.stdout:6/989: fsync d1/d4/d44/d134/d4d/dc7/f139 0 2026-03-10T12:38:30.125 INFO:tasks.workunit.client.1.vm07.stdout:7/967: dwrite d0/d61/db4/f53 [0,4194304] 0 2026-03-10T12:38:30.126 INFO:tasks.workunit.client.1.vm07.stdout:7/968: stat d0/d52/f97 0 2026-03-10T12:38:30.144 INFO:tasks.workunit.client.1.vm07.stdout:7/969: write d0/d47/dde/f10e [315262,95705] 0 2026-03-10T12:38:30.144 INFO:tasks.workunit.client.1.vm07.stdout:7/970: mkdir d0/d61/db4/d8a/d13e 0 2026-03-10T12:38:30.204 INFO:tasks.workunit.client.1.vm07.stdout:2/933: truncate d0/d42/d26/f52 2654935 0 2026-03-10T12:38:30.204 INFO:tasks.workunit.client.1.vm07.stdout:7/971: dwrite d0/d57/d62/d90/fed [4194304,4194304] 0 2026-03-10T12:38:30.206 
INFO:tasks.workunit.client.1.vm07.stdout:7/972: symlink d0/d47/dab/dae/l13f 0 2026-03-10T12:38:30.207 INFO:tasks.workunit.client.1.vm07.stdout:7/973: truncate d0/d57/dd6/d80/f137 524630 0 2026-03-10T12:38:30.210 INFO:tasks.workunit.client.1.vm07.stdout:7/974: chown d0/d61/db4/d8a/d9d/ce4 1219 1 2026-03-10T12:38:30.212 INFO:tasks.workunit.client.1.vm07.stdout:7/975: readlink d0/l2d 0 2026-03-10T12:38:30.213 INFO:tasks.workunit.client.1.vm07.stdout:6/990: dwrite d1/d4/f11 [0,4194304] 0 2026-03-10T12:38:30.215 INFO:tasks.workunit.client.1.vm07.stdout:7/976: chown d0/d57/lec 1494491408 1 2026-03-10T12:38:30.215 INFO:tasks.workunit.client.1.vm07.stdout:6/991: chown d1/d4/d6/d16/d49/f67 7197 1 2026-03-10T12:38:30.216 INFO:tasks.workunit.client.1.vm07.stdout:7/977: fdatasync d0/d61/db4/df4/d111/f132 0 2026-03-10T12:38:30.218 INFO:tasks.workunit.client.1.vm07.stdout:7/978: stat d0/d67/c6e 0 2026-03-10T12:38:30.220 INFO:tasks.workunit.client.1.vm07.stdout:7/979: stat d0/f70 0 2026-03-10T12:38:30.223 INFO:tasks.workunit.client.1.vm07.stdout:6/992: dwrite d1/d4/d6/d4e/fa1 [0,4194304] 0 2026-03-10T12:38:30.225 INFO:tasks.workunit.client.1.vm07.stdout:7/980: truncate d0/d61/d79/f104 875826 0 2026-03-10T12:38:30.231 INFO:tasks.workunit.client.1.vm07.stdout:6/993: rmdir d1/d4/d9b/d148 39 2026-03-10T12:38:30.251 INFO:tasks.workunit.client.1.vm07.stdout:2/934: sync 2026-03-10T12:38:30.251 INFO:tasks.workunit.client.1.vm07.stdout:2/935: dread - d0/de1/f141 zero size 2026-03-10T12:38:30.252 INFO:tasks.workunit.client.1.vm07.stdout:2/936: fdatasync d0/d42/d4e/daf/f126 0 2026-03-10T12:38:30.252 INFO:tasks.workunit.client.1.vm07.stdout:2/937: write d0/d42/d26/d7d/f9a [1510874,86784] 0 2026-03-10T12:38:30.253 INFO:tasks.workunit.client.1.vm07.stdout:2/938: readlink d0/d5b/l11d 0 2026-03-10T12:38:30.253 INFO:tasks.workunit.client.1.vm07.stdout:2/939: write d0/d42/f2c [2112923,126927] 0 2026-03-10T12:38:30.260 INFO:tasks.workunit.client.1.vm07.stdout:2/940: rename d0/d42/c30 to 
d0/de1/d111/d119/c144 0 2026-03-10T12:38:30.279 INFO:tasks.workunit.client.1.vm07.stdout:2/941: dread d0/d29/d64/fd2 [0,4194304] 0 2026-03-10T12:38:30.282 INFO:tasks.workunit.client.1.vm07.stdout:2/942: mknod d0/d29/d64/db5/dbb/df9/c145 0 2026-03-10T12:38:30.288 INFO:tasks.workunit.client.1.vm07.stdout:7/981: dwrite d0/d61/d79/f104 [0,4194304] 0 2026-03-10T12:38:30.293 INFO:tasks.workunit.client.1.vm07.stdout:7/982: dwrite d0/d61/d115/f11a [0,4194304] 0 2026-03-10T12:38:30.296 INFO:tasks.workunit.client.1.vm07.stdout:2/943: rename d0/d29/d64/d74/d75/db7/c132 to d0/d42/d1f/c146 0 2026-03-10T12:38:30.298 INFO:tasks.workunit.client.1.vm07.stdout:7/983: rename d0/d57/d62/f8b to d0/d57/dd6/d80/f140 0 2026-03-10T12:38:30.299 INFO:tasks.workunit.client.1.vm07.stdout:2/944: creat d0/d29/d64/db5/f147 x:0 0 0 2026-03-10T12:38:30.301 INFO:tasks.workunit.client.1.vm07.stdout:2/945: mknod d0/d42/d26/d7d/d108/c148 0 2026-03-10T12:38:30.313 INFO:tasks.workunit.client.1.vm07.stdout:6/994: write d1/d4/d6/d16/faf [4328011,49885] 0 2026-03-10T12:38:30.315 INFO:tasks.workunit.client.1.vm07.stdout:2/946: creat d0/d42/d1f/dc0/f149 x:0 0 0 2026-03-10T12:38:30.318 INFO:tasks.workunit.client.1.vm07.stdout:6/995: read d1/d4/d6/f91 [296845,77880] 0 2026-03-10T12:38:30.319 INFO:tasks.workunit.client.1.vm07.stdout:6/996: chown d1/d4/d44/d134/d4d/dc7/f109 1 1 2026-03-10T12:38:30.320 INFO:tasks.workunit.client.1.vm07.stdout:2/947: mkdir d0/d29/d64/d74/df4/d14a 0 2026-03-10T12:38:30.320 INFO:tasks.workunit.client.1.vm07.stdout:2/948: readlink d0/d5b/le0 0 2026-03-10T12:38:30.323 INFO:tasks.workunit.client.1.vm07.stdout:2/949: creat d0/de1/df2/d13e/f14b x:0 0 0 2026-03-10T12:38:30.327 INFO:tasks.workunit.client.1.vm07.stdout:2/950: rename d0/d42/d1f/d20/df7/f10a to d0/de1/d111/f14c 0 2026-03-10T12:38:30.327 INFO:tasks.workunit.client.1.vm07.stdout:2/951: chown d0/dcd/ff5 29 1 2026-03-10T12:38:30.328 INFO:tasks.workunit.client.1.vm07.stdout:2/952: write d0/d42/d4e/d77/f89 [1132241,44825] 0 
2026-03-10T12:38:30.332 INFO:tasks.workunit.client.1.vm07.stdout:2/953: dwrite d0/d29/d64/d6c/d94/fa7 [0,4194304] 0 2026-03-10T12:38:30.341 INFO:tasks.workunit.client.1.vm07.stdout:2/954: creat d0/d29/d64/db5/dbb/dca/f14d x:0 0 0 2026-03-10T12:38:30.342 INFO:tasks.workunit.client.1.vm07.stdout:2/955: truncate d0/d29/d64/db5/dbb/d114/dad/ddd/f12b 408316 0 2026-03-10T12:38:30.346 INFO:tasks.workunit.client.1.vm07.stdout:6/997: dread d1/d4/f3b [0,4194304] 0 2026-03-10T12:38:30.359 INFO:tasks.workunit.client.1.vm07.stdout:6/998: mknod d1/d4/d44/d134/c149 0 2026-03-10T12:38:30.385 INFO:tasks.workunit.client.1.vm07.stdout:7/984: write d0/f70 [2458435,53808] 0 2026-03-10T12:38:30.389 INFO:tasks.workunit.client.1.vm07.stdout:7/985: dread - d0/d61/db4/df4/f102 zero size 2026-03-10T12:38:30.390 INFO:tasks.workunit.client.1.vm07.stdout:7/986: truncate d0/d47/dde/ff6 4600625 0 2026-03-10T12:38:30.392 INFO:tasks.workunit.client.1.vm07.stdout:2/956: write d0/d42/d4e/d77/d70/f8a [2431307,97975] 0 2026-03-10T12:38:30.393 INFO:tasks.workunit.client.1.vm07.stdout:2/957: readlink d0/d42/d1f/lfa 0 2026-03-10T12:38:30.398 INFO:tasks.workunit.client.1.vm07.stdout:7/987: readlink d0/d61/l6b 0 2026-03-10T12:38:30.400 INFO:tasks.workunit.client.1.vm07.stdout:2/958: rename d0/f8d to d0/de1/d111/f14e 0 2026-03-10T12:38:30.400 INFO:tasks.workunit.client.1.vm07.stdout:6/999: dread d1/d4/d6/d16/d1a/d2c/de0/ff6 [0,4194304] 0 2026-03-10T12:38:30.402 INFO:tasks.workunit.client.1.vm07.stdout:7/988: dread - d0/d57/d62/d90/fd5 zero size 2026-03-10T12:38:30.403 INFO:tasks.workunit.client.1.vm07.stdout:7/989: chown d0/d57/d62/d90/da1/fe7 31 1 2026-03-10T12:38:30.403 INFO:tasks.workunit.client.1.vm07.stdout:2/959: mkdir d0/d29/d64/d74/df4/d115/d14f 0 2026-03-10T12:38:30.404 INFO:tasks.workunit.client.1.vm07.stdout:7/990: creat d0/d67/d10a/f141 x:0 0 0 2026-03-10T12:38:30.405 INFO:tasks.workunit.client.1.vm07.stdout:2/960: fdatasync d0/d29/d64/d6c/fb9 0 2026-03-10T12:38:30.406 
INFO:tasks.workunit.client.1.vm07.stdout:7/991: dread - d0/d61/d79/f8f zero size 2026-03-10T12:38:30.408 INFO:tasks.workunit.client.1.vm07.stdout:2/961: mkdir d0/d29/d64/d74/d75/db7/d150 0 2026-03-10T12:38:30.408 INFO:tasks.workunit.client.1.vm07.stdout:2/962: chown d0/d29/d64/db5/dbb/df9/l130 4984 1 2026-03-10T12:38:30.412 INFO:tasks.workunit.client.1.vm07.stdout:7/992: symlink d0/d57/d62/d90/da1/l142 0 2026-03-10T12:38:30.416 INFO:tasks.workunit.client.1.vm07.stdout:7/993: unlink d0/f9b 0 2026-03-10T12:38:30.416 INFO:tasks.workunit.client.1.vm07.stdout:7/994: chown d0/d57/dd6/d80 97734 1 2026-03-10T12:38:30.417 INFO:tasks.workunit.client.1.vm07.stdout:7/995: truncate d0/d57/dd6/d80/f121 370583 0 2026-03-10T12:38:30.420 INFO:tasks.workunit.client.1.vm07.stdout:7/996: getdents d0/d57/dd6/d80 0 2026-03-10T12:38:30.423 INFO:tasks.workunit.client.1.vm07.stdout:7/997: symlink d0/d61/d13c/l143 0 2026-03-10T12:38:30.424 INFO:tasks.workunit.client.1.vm07.stdout:7/998: chown d0/d47/f8e 256 1 2026-03-10T12:38:30.426 INFO:tasks.workunit.client.1.vm07.stdout:2/963: write d0/f46 [448077,14940] 0 2026-03-10T12:38:30.436 INFO:tasks.workunit.client.1.vm07.stdout:2/964: dread d0/d42/d26/d38/f3d [4194304,4194304] 0 2026-03-10T12:38:30.442 INFO:tasks.workunit.client.1.vm07.stdout:7/999: dwrite d0/d57/d62/d90/fd5 [0,4194304] 0 2026-03-10T12:38:30.444 INFO:tasks.workunit.client.1.vm07.stdout:2/965: creat d0/d42/d26/d7d/d122/f151 x:0 0 0 2026-03-10T12:38:30.444 INFO:tasks.workunit.client.1.vm07.stdout:2/966: write d0/d45/f140 [395124,54264] 0 2026-03-10T12:38:30.451 INFO:tasks.workunit.client.1.vm07.stdout:2/967: unlink d0/d80/c142 0 2026-03-10T12:38:30.453 INFO:tasks.workunit.client.1.vm07.stdout:2/968: creat d0/de1/d111/d119/f152 x:0 0 0 2026-03-10T12:38:30.453 INFO:tasks.workunit.client.1.vm07.stdout:2/969: dread - d0/de1/df2/d13e/f14b zero size 2026-03-10T12:38:30.454 INFO:tasks.workunit.client.1.vm07.stdout:2/970: write d0/de1/d111/f14c [165425,38690] 0 2026-03-10T12:38:30.456 
INFO:tasks.workunit.client.1.vm07.stdout:2/971: creat d0/d29/d64/d74/d75/db7/f153 x:0 0 0 2026-03-10T12:38:30.457 INFO:tasks.workunit.client.1.vm07.stdout:2/972: mkdir d0/d29/d154 0 2026-03-10T12:38:30.457 INFO:tasks.workunit.client.1.vm07.stdout:2/973: fsync d0/d29/d64/db5/dbb/df9/f110 0 2026-03-10T12:38:30.460 INFO:tasks.workunit.client.1.vm07.stdout:2/974: link d0/d42/d1f/d20/c31 d0/d29/d64/db5/dbb/d114/c155 0 2026-03-10T12:38:30.462 INFO:tasks.workunit.client.1.vm07.stdout:2/975: rename d0/d29/d64/d74/df4/d115 to d0/d42/d4e/d156 0 2026-03-10T12:38:30.464 INFO:tasks.workunit.client.1.vm07.stdout:2/976: dread d0/d42/d26/f85 [0,4194304] 0 2026-03-10T12:38:30.465 INFO:tasks.workunit.client.1.vm07.stdout:2/977: chown d0/d29/d64/db5/dbb/d114/dad/ddd/f116 381238800 1 2026-03-10T12:38:30.467 INFO:tasks.workunit.client.1.vm07.stdout:2/978: mknod d0/d29/d64/d74/d75/c157 0 2026-03-10T12:38:30.469 INFO:tasks.workunit.client.1.vm07.stdout:2/979: mknod d0/d29/d154/c158 0 2026-03-10T12:38:30.470 INFO:tasks.workunit.client.1.vm07.stdout:2/980: read d0/d42/d1f/d20/fdb [162666,99353] 0 2026-03-10T12:38:30.471 INFO:tasks.workunit.client.1.vm07.stdout:2/981: dread - d0/d29/d64/db5/dbb/dca/f12a zero size 2026-03-10T12:38:30.475 INFO:tasks.workunit.client.1.vm07.stdout:2/982: link d0/dcd/ff5 d0/d29/d64/d74/d88/f159 0 2026-03-10T12:38:30.478 INFO:tasks.workunit.client.1.vm07.stdout:2/983: unlink d0/lb 0 2026-03-10T12:38:30.481 INFO:tasks.workunit.client.1.vm07.stdout:2/984: creat d0/d45/f15a x:0 0 0 2026-03-10T12:38:30.483 INFO:tasks.workunit.client.1.vm07.stdout:2/985: symlink d0/d42/d4e/d77/l15b 0 2026-03-10T12:38:30.483 INFO:tasks.workunit.client.1.vm07.stdout:2/986: chown d0/d29/d64/db5/dbb/df9/f110 13337253 1 2026-03-10T12:38:30.484 INFO:tasks.workunit.client.1.vm07.stdout:2/987: rmdir d0/d42/d4e/d77/d70 39 2026-03-10T12:38:30.486 INFO:tasks.workunit.client.1.vm07.stdout:2/988: creat d0/de1/df2/f15c x:0 0 0 2026-03-10T12:38:30.499 INFO:tasks.workunit.client.1.vm07.stdout:2/989: 
dwrite d0/d45/fa1 [0,4194304] 0 2026-03-10T12:38:30.500 INFO:tasks.workunit.client.1.vm07.stdout:2/990: chown d0/d29/d64/d6c/fef 1 1 2026-03-10T12:38:30.504 INFO:tasks.workunit.client.1.vm07.stdout:2/991: creat d0/d29/d64/db5/dbb/dca/d105/f15d x:0 0 0 2026-03-10T12:38:30.504 INFO:tasks.workunit.client.1.vm07.stdout:2/992: readlink d0/d42/d1f/d20/l47 0 2026-03-10T12:38:30.508 INFO:tasks.workunit.client.1.vm07.stdout:2/993: mkdir d0/d42/d1f/d20/df7/d15e 0 2026-03-10T12:38:30.536 INFO:tasks.workunit.client.1.vm07.stdout:2/994: write d0/f2d [1075136,26469] 0 2026-03-10T12:38:30.540 INFO:tasks.workunit.client.1.vm07.stdout:2/995: getdents d0/de1/d111 0 2026-03-10T12:38:30.541 INFO:tasks.workunit.client.1.vm07.stdout:2/996: truncate d0/d42/d4e/ffe 4527672 0 2026-03-10T12:38:30.549 INFO:tasks.workunit.client.1.vm07.stdout:2/997: dread d0/d42/d4e/ffe [0,4194304] 0 2026-03-10T12:38:30.551 INFO:tasks.workunit.client.1.vm07.stdout:2/998: unlink d0/f46 0 2026-03-10T12:38:30.552 INFO:tasks.workunit.client.1.vm07.stdout:2/999: fsync d0/d29/d64/d74/f127 0 2026-03-10T12:38:30.555 INFO:tasks.workunit.client.1.vm07.stderr:+ rm -rf -- ./tmp.40gfFcEho1 2026-03-10T12:38:31.667 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:31 vm00.local ceph-mon[50686]: pgmap v12: 65 pgs: 65 active+clean; 4.2 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 65 MiB/s rd, 158 MiB/s wr, 403 op/s 2026-03-10T12:38:31.667 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:31 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:31.667 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:31 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:31.667 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:31 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:31.667 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:31 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 
2026-03-10T12:38:31.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:31 vm07.local ceph-mon[58582]: pgmap v12: 65 pgs: 65 active+clean; 4.2 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 65 MiB/s rd, 158 MiB/s wr, 403 op/s 2026-03-10T12:38:31.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:31 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:31.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:31 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:31.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:31 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:31.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:31 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:33.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:33 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:33.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:33 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:33.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:33 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:38:33.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:33 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:38:33.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:33 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:33.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:33 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config 
dump", "format": "json"}]: dispatch 2026-03-10T12:38:33.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:33 vm00.local ceph-mon[50686]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mgr fail", "who": "vm07.kfawlb"}]: dispatch 2026-03-10T12:38:33.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:33 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mgr fail", "who": "vm07.kfawlb"}]: dispatch 2026-03-10T12:38:33.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:33 vm00.local ceph-mon[50686]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T12:38:33.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:33 vm00.local ceph-mon[50686]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' cmd='[{"prefix": "mgr fail", "who": "vm07.kfawlb"}]': finished 2026-03-10T12:38:33.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:33 vm00.local ceph-mon[50686]: mgrmap e26: vm00.nescmq(active, starting, since 0.00977453s) 2026-03-10T12:38:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:33 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:33 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:33 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:38:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:33 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:38:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:33 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' 2026-03-10T12:38:34.066 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:33 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:33 vm07.local ceph-mon[58582]: from='mgr.24461 192.168.123.107:0/2683299047' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mgr fail", "who": "vm07.kfawlb"}]: dispatch 2026-03-10T12:38:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:33 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "mgr fail", "who": "vm07.kfawlb"}]: dispatch 2026-03-10T12:38:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:33 vm07.local ceph-mon[58582]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T12:38:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:33 vm07.local ceph-mon[58582]: from='mgr.24461 ' entity='mgr.vm07.kfawlb' cmd='[{"prefix": "mgr fail", "who": "vm07.kfawlb"}]': finished 2026-03-10T12:38:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:33 vm07.local ceph-mon[58582]: mgrmap e26: vm00.nescmq(active, starting, since 0.00977453s) 2026-03-10T12:38:35.870 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: Active manager daemon vm00.nescmq restarted 2026-03-10T12:38:35.870 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: Activating manager daemon vm00.nescmq 2026-03-10T12:38:35.870 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.? 
192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm00.nescmq/crt"}]: dispatch 2026-03-10T12:38:35.870 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T12:38:35.870 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: mgrmap e27: vm00.nescmq(active, starting, since 0.0101066s) 2026-03-10T12:38:35.870 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm00.nescmq/crt"}]: dispatch 2026-03-10T12:38:35.870 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:38:35.870 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.wdwvcu"}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.lnokoe"}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.wznhgu"}]: dispatch 2026-03-10T12:38:35.968 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.rhzwnr"}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr metadata", "who": "vm00.nescmq", "id": "vm00.nescmq"}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 
192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm00.nescmq/key"}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: Manager daemon vm00.nescmq is now available 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 
2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/mirror_snapshot_schedule"}]: dispatch 2026-03-10T12:38:35.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:35 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/trash_purge_schedule"}]: dispatch 2026-03-10T12:38:36.280 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: Active manager daemon vm00.nescmq restarted 2026-03-10T12:38:36.280 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: Activating manager daemon vm00.nescmq 2026-03-10T12:38:36.280 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.? 
192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm00.nescmq/crt"}]: dispatch 2026-03-10T12:38:36.280 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T12:38:36.280 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: mgrmap e27: vm00.nescmq(active, starting, since 0.0101066s) 2026-03-10T12:38:36.280 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm00.nescmq/crt"}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.wdwvcu"}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.lnokoe"}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.wznhgu"}]: dispatch 2026-03-10T12:38:36.281 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.rhzwnr"}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr metadata", "who": "vm00.nescmq", "id": "vm00.nescmq"}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 
192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm00.nescmq/key"}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: Manager daemon vm00.nescmq is now available 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 
2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/mirror_snapshot_schedule"}]: dispatch 2026-03-10T12:38:36.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:35 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/trash_purge_schedule"}]: dispatch 2026-03-10T12:38:37.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:37 vm00.local ceph-mon[50686]: mgrmap e28: vm00.nescmq(active, since 1.01498s) 2026-03-10T12:38:37.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:37 vm00.local ceph-mon[50686]: pgmap v3: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail 2026-03-10T12:38:37.401 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:37 vm07.local ceph-mon[58582]: mgrmap e28: vm00.nescmq(active, since 1.01498s) 2026-03-10T12:38:37.401 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:37 vm07.local ceph-mon[58582]: pgmap v3: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail 2026-03-10T12:38:38.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:38 vm00.local ceph-mon[50686]: mgrmap e29: vm00.nescmq(active, since 2s) 2026-03-10T12:38:38.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:38 vm00.local ceph-mon[50686]: pgmap v4: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail 2026-03-10T12:38:38.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:38 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:38.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:38 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' 
entity='mgr.vm00.nescmq' 2026-03-10T12:38:38.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:38 vm07.local ceph-mon[58582]: mgrmap e29: vm00.nescmq(active, since 2s) 2026-03-10T12:38:38.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:38 vm07.local ceph-mon[58582]: pgmap v4: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail 2026-03-10T12:38:38.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:38 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:38.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:38 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:39.182 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:39 vm00.local ceph-mon[50686]: [10/Mar/2026:12:38:38] ENGINE Bus STARTING 2026-03-10T12:38:39.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:39 vm00.local ceph-mon[50686]: Standby manager daemon vm07.kfawlb started 2026-03-10T12:38:39.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:39 vm00.local ceph-mon[50686]: from='mgr.? 192.168.123.107:0/2762059859' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.kfawlb/crt"}]: dispatch 2026-03-10T12:38:39.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:39 vm00.local ceph-mon[50686]: from='mgr.? 192.168.123.107:0/2762059859' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T12:38:39.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:39 vm00.local ceph-mon[50686]: from='mgr.? 192.168.123.107:0/2762059859' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.kfawlb/key"}]: dispatch 2026-03-10T12:38:39.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:39 vm00.local ceph-mon[50686]: from='mgr.? 
192.168.123.107:0/2762059859' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T12:38:39.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:39 vm00.local ceph-mon[50686]: [10/Mar/2026:12:38:38] ENGINE Serving on http://192.168.123.100:8765 2026-03-10T12:38:39.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:39 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:39.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:39 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:39.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:39 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:38:39.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:39 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:39.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:39 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:39 vm07.local ceph-mon[58582]: [10/Mar/2026:12:38:38] ENGINE Bus STARTING 2026-03-10T12:38:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:39 vm07.local ceph-mon[58582]: Standby manager daemon vm07.kfawlb started 2026-03-10T12:38:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:39 vm07.local ceph-mon[58582]: from='mgr.? 192.168.123.107:0/2762059859' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.kfawlb/crt"}]: dispatch 2026-03-10T12:38:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:39 vm07.local ceph-mon[58582]: from='mgr.? 
192.168.123.107:0/2762059859' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T12:38:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:39 vm07.local ceph-mon[58582]: from='mgr.? 192.168.123.107:0/2762059859' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.kfawlb/key"}]: dispatch 2026-03-10T12:38:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:39 vm07.local ceph-mon[58582]: from='mgr.? 192.168.123.107:0/2762059859' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T12:38:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:39 vm07.local ceph-mon[58582]: [10/Mar/2026:12:38:38] ENGINE Serving on http://192.168.123.100:8765 2026-03-10T12:38:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:39 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:39 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:39 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:38:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:39 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:39.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:39 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:40.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:40 vm07.local ceph-mon[58582]: [10/Mar/2026:12:38:38] ENGINE Serving on https://192.168.123.100:7150 
2026-03-10T12:38:40.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:40 vm07.local ceph-mon[58582]: [10/Mar/2026:12:38:38] ENGINE Bus STARTED 2026-03-10T12:38:40.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:40 vm07.local ceph-mon[58582]: [10/Mar/2026:12:38:38] ENGINE Client ('192.168.123.100', 40490) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T12:38:40.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:40 vm07.local ceph-mon[58582]: pgmap v5: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail 2026-03-10T12:38:40.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:40 vm07.local ceph-mon[58582]: mgrmap e30: vm00.nescmq(active, since 4s), standbys: vm07.kfawlb 2026-03-10T12:38:40.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:40 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr metadata", "who": "vm07.kfawlb", "id": "vm07.kfawlb"}]: dispatch 2026-03-10T12:38:40.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:40 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:40.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:40 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:40.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:40 vm00.local ceph-mon[50686]: [10/Mar/2026:12:38:38] ENGINE Serving on https://192.168.123.100:7150 2026-03-10T12:38:40.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:40 vm00.local ceph-mon[50686]: [10/Mar/2026:12:38:38] ENGINE Bus STARTED 2026-03-10T12:38:40.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:40 vm00.local ceph-mon[50686]: [10/Mar/2026:12:38:38] ENGINE Client ('192.168.123.100', 40490) lost — peer dropped the TLS connection 
suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T12:38:40.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:40 vm00.local ceph-mon[50686]: pgmap v5: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail 2026-03-10T12:38:40.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:40 vm00.local ceph-mon[50686]: mgrmap e30: vm00.nescmq(active, since 4s), standbys: vm07.kfawlb 2026-03-10T12:38:40.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:40 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr metadata", "who": "vm07.kfawlb", "id": "vm07.kfawlb"}]: dispatch 2026-03-10T12:38:40.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:40 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:40.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:40 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:42.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:42 vm07.local ceph-mon[58582]: pgmap v6: 65 pgs: 65 active+clean; 3.7 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 35 KiB/s rd, 362 KiB/s wr, 51 op/s 2026-03-10T12:38:42.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:42 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:42.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:42 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:42.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:42 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:38:42.816 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:42 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:38:42.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:42 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:38:42.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:42 vm00.local ceph-mon[50686]: pgmap v6: 65 pgs: 65 active+clean; 3.7 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 35 KiB/s rd, 362 KiB/s wr, 51 op/s 2026-03-10T12:38:42.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:42 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:42.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:42 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:42.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:42 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:38:42.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:42 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:38:42.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:42 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:38:43.809 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:43 vm00.local ceph-mon[50686]: Updating vm00:/etc/ceph/ceph.conf 2026-03-10T12:38:43.809 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:43 vm00.local ceph-mon[50686]: Updating vm07:/etc/ceph/ceph.conf 2026-03-10T12:38:43.809 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:43 vm00.local ceph-mon[50686]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:38:43.809 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:43 vm00.local ceph-mon[50686]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:38:43.809 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:43 vm00.local ceph-mon[50686]: Updating vm00:/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:38:43.809 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:43 vm00.local ceph-mon[50686]: Updating vm07:/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:38:43.809 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:43 vm00.local ceph-mon[50686]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.client.admin.keyring 2026-03-10T12:38:43.809 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:43 vm00.local ceph-mon[50686]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.client.admin.keyring 2026-03-10T12:38:43.809 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:43 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:43.809 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:43 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:43.809 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:43 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:43.809 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:43 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:43.809 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:43 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:43 vm07.local ceph-mon[58582]: Updating vm00:/etc/ceph/ceph.conf 2026-03-10T12:38:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:43 vm07.local ceph-mon[58582]: Updating vm07:/etc/ceph/ceph.conf 2026-03-10T12:38:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:43 vm07.local ceph-mon[58582]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:38:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:43 vm07.local ceph-mon[58582]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:38:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:43 vm07.local ceph-mon[58582]: Updating vm00:/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:38:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:43 vm07.local ceph-mon[58582]: Updating vm07:/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:38:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:43 vm07.local ceph-mon[58582]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.client.admin.keyring 2026-03-10T12:38:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:43 vm07.local ceph-mon[58582]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.client.admin.keyring 2026-03-10T12:38:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:43 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:43 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 
12:38:43 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:43 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:44.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:43 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:44.830 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:44 vm07.local ceph-mon[58582]: Reconfiguring prometheus.vm00 (dependencies changed)... 2026-03-10T12:38:44.830 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:44 vm07.local ceph-mon[58582]: Reconfiguring daemon prometheus.vm00 on vm00 2026-03-10T12:38:44.830 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:44 vm07.local ceph-mon[58582]: pgmap v7: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 40 KiB/s rd, 789 KiB/s wr, 126 op/s 2026-03-10T12:38:44.830 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:44 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:44.830 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:44 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:44.830 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:44 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T12:38:44.830 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:44 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:44.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:44 vm00.local ceph-mon[50686]: Reconfiguring 
prometheus.vm00 (dependencies changed)... 2026-03-10T12:38:44.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:44 vm00.local ceph-mon[50686]: Reconfiguring daemon prometheus.vm00 on vm00 2026-03-10T12:38:44.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:44 vm00.local ceph-mon[50686]: pgmap v7: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 40 KiB/s rd, 789 KiB/s wr, 126 op/s 2026-03-10T12:38:44.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:44 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:44.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:44 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:44.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:44 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T12:38:44.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:44 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:45.334 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.332+0000 7f99d9fcc700 1 -- 192.168.123.100:0/2986395289 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f99d4072470 msgr2=0x7f99d410beb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:45.334 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.332+0000 7f99d9fcc700 1 --2- 192.168.123.100:0/2986395289 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f99d4072470 0x7f99d410beb0 secure :-1 s=READY pgs=355 cs=0 l=1 rev1=1 crypto rx=0x7f99cc00b3a0 tx=0x7f99cc00b6b0 comp rx=0 tx=0).stop 2026-03-10T12:38:45.334 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.334+0000 7f99d9fcc700 1 -- 192.168.123.100:0/2986395289 shutdown_connections 2026-03-10T12:38:45.335 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.334+0000 7f99d9fcc700 1 --2- 192.168.123.100:0/2986395289 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f99d4072470 0x7f99d410beb0 unknown :-1 s=CLOSED pgs=355 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:45.335 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.334+0000 7f99d9fcc700 1 --2- 192.168.123.100:0/2986395289 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99d4071a90 0x7f99d4071ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:45.335 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.334+0000 7f99d9fcc700 1 -- 192.168.123.100:0/2986395289 >> 192.168.123.100:0/2986395289 conn(0x7f99d406d1a0 msgr2=0x7f99d406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:45.336 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.335+0000 7f99d9fcc700 1 -- 192.168.123.100:0/2986395289 shutdown_connections 2026-03-10T12:38:45.336 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.335+0000 7f99d9fcc700 1 -- 192.168.123.100:0/2986395289 wait complete. 
2026-03-10T12:38:45.337 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.336+0000 7f99d9fcc700 1 Processor -- start 2026-03-10T12:38:45.337 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.337+0000 7f99d9fcc700 1 -- start start 2026-03-10T12:38:45.337 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.337+0000 7f99d9fcc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99d4071a90 0x7f99d4116ae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:45.337 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.337+0000 7f99d9fcc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f99d4117020 0x7f99d41b27e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:45.337 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.337+0000 7f99d9fcc700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f99d4117550 con 0x7f99d4117020 2026-03-10T12:38:45.338 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.337+0000 7f99d9fcc700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f99d41176c0 con 0x7f99d4071a90 2026-03-10T12:38:45.338 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.337+0000 7f99d37fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99d4071a90 0x7f99d4116ae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:45.338 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.337+0000 7f99d37fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99d4071a90 0x7f99d4116ae0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:44674/0 (socket says 192.168.123.100:44674) 2026-03-10T12:38:45.339 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.337+0000 7f99d37fe700 1 -- 192.168.123.100:0/2700110892 learned_addr learned my addr 192.168.123.100:0/2700110892 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:38:45.339 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.339+0000 7f99d37fe700 1 -- 192.168.123.100:0/2700110892 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f99d4117020 msgr2=0x7f99d41b27e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:38:45.339 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.339+0000 7f99d37fe700 1 --2- 192.168.123.100:0/2700110892 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f99d4117020 0x7f99d41b27e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:45.339 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.339+0000 7f99d37fe700 1 -- 192.168.123.100:0/2700110892 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f99cc00b050 con 0x7f99d4071a90 2026-03-10T12:38:45.340 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.340+0000 7f99d37fe700 1 --2- 192.168.123.100:0/2700110892 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99d4071a90 0x7f99d4116ae0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f99c400d8d0 tx=0x7f99c400dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:45.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.342+0000 7f99d0ff9700 1 -- 192.168.123.100:0/2700110892 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f99c4009940 con 0x7f99d4071a90 2026-03-10T12:38:45.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.342+0000 7f99d0ff9700 1 -- 
192.168.123.100:0/2700110892 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f99c4010460 con 0x7f99d4071a90 2026-03-10T12:38:45.343 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.343+0000 7f99d0ff9700 1 -- 192.168.123.100:0/2700110892 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f99c400f5d0 con 0x7f99d4071a90 2026-03-10T12:38:45.345 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.344+0000 7f99d9fcc700 1 -- 192.168.123.100:0/2700110892 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f99d41b2d80 con 0x7f99d4071a90 2026-03-10T12:38:45.345 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.344+0000 7f99d9fcc700 1 -- 192.168.123.100:0/2700110892 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f99d41b31a0 con 0x7f99d4071a90 2026-03-10T12:38:45.346 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.345+0000 7f99d9fcc700 1 -- 192.168.123.100:0/2700110892 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f99d4110c20 con 0x7f99d4071a90 2026-03-10T12:38:45.347 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.347+0000 7f99d0ff9700 1 -- 192.168.123.100:0/2700110892 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f99c4009aa0 con 0x7f99d4071a90 2026-03-10T12:38:45.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.347+0000 7f99d0ff9700 1 --2- 192.168.123.100:0/2700110892 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f99bc0774f0 0x7f99bc0799a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:45.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.347+0000 7f99d0ff9700 1 -- 
192.168.123.100:0/2700110892 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f99c4099680 con 0x7f99d4071a90 2026-03-10T12:38:45.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.348+0000 7f99d2ffd700 1 --2- 192.168.123.100:0/2700110892 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f99bc0774f0 0x7f99bc0799a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:45.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.348+0000 7f99d2ffd700 1 --2- 192.168.123.100:0/2700110892 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f99bc0774f0 0x7f99bc0799a0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f99cc00bb30 tx=0x7f99cc000f40 comp rx=0 tx=0).ready entity=mgr.24513 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:45.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.353+0000 7f99d0ff9700 1 -- 192.168.123.100:0/2700110892 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f99c4061b40 con 0x7f99d4071a90 2026-03-10T12:38:45.503 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.502+0000 7f99d9fcc700 1 -- 192.168.123.100:0/2700110892 --> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f99d4061190 con 0x7f99bc0774f0 2026-03-10T12:38:45.582 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.505+0000 7f99d0ff9700 1 -- 192.168.123.100:0/2700110892 <== mgr.24513 v2:192.168.123.100:6800/3276280342 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+368 (secure 0 0 0) 0x7f99d4061190 con 0x7f99bc0774f0 2026-03-10T12:38:45.583 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.508+0000 7f99ba7fc700 1 -- 192.168.123.100:0/2700110892 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f99bc0774f0 msgr2=0x7f99bc0799a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:45.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.508+0000 7f99ba7fc700 1 --2- 192.168.123.100:0/2700110892 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f99bc0774f0 0x7f99bc0799a0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f99cc00bb30 tx=0x7f99cc000f40 comp rx=0 tx=0).stop 2026-03-10T12:38:45.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.508+0000 7f99ba7fc700 1 -- 192.168.123.100:0/2700110892 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99d4071a90 msgr2=0x7f99d4116ae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:45.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.508+0000 7f99ba7fc700 1 --2- 192.168.123.100:0/2700110892 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99d4071a90 0x7f99d4116ae0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f99c400d8d0 tx=0x7f99c400dc90 comp rx=0 tx=0).stop 2026-03-10T12:38:45.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.508+0000 7f99ba7fc700 1 -- 192.168.123.100:0/2700110892 shutdown_connections 2026-03-10T12:38:45.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.508+0000 7f99ba7fc700 1 --2- 192.168.123.100:0/2700110892 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f99bc0774f0 0x7f99bc0799a0 secure :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f99cc00bb30 tx=0x7f99cc000f40 comp rx=0 tx=0).stop 2026-03-10T12:38:45.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.508+0000 7f99ba7fc700 1 --2- 192.168.123.100:0/2700110892 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99d4071a90 0x7f99d4116ae0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:45.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.508+0000 7f99ba7fc700 1 --2- 192.168.123.100:0/2700110892 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f99d4117020 0x7f99d41b27e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:45.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.508+0000 7f99ba7fc700 1 -- 192.168.123.100:0/2700110892 >> 192.168.123.100:0/2700110892 conn(0x7f99d406d1a0 msgr2=0x7f99d410b250 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:45.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.509+0000 7f99ba7fc700 1 -- 192.168.123.100:0/2700110892 shutdown_connections 2026-03-10T12:38:45.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.509+0000 7f99ba7fc700 1 -- 192.168.123.100:0/2700110892 wait complete. 
2026-03-10T12:38:45.583 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:38:45.872 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.871+0000 7f9155bff700 1 -- 192.168.123.100:0/3736130080 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9150071950 msgr2=0x7f9150071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:45.872 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.871+0000 7f9155bff700 1 --2- 192.168.123.100:0/3736130080 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9150071950 0x7f9150071d60 secure :-1 s=READY pgs=356 cs=0 l=1 rev1=1 crypto rx=0x7f9140008790 tx=0x7f9140008aa0 comp rx=0 tx=0).stop 2026-03-10T12:38:45.872 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.871+0000 7f9155bff700 1 -- 192.168.123.100:0/3736130080 shutdown_connections 2026-03-10T12:38:45.872 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.871+0000 7f9155bff700 1 --2- 192.168.123.100:0/3736130080 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9150072330 0x7f91500770b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:45.872 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.871+0000 7f9155bff700 1 --2- 192.168.123.100:0/3736130080 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9150071950 0x7f9150071d60 unknown :-1 s=CLOSED pgs=356 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:45.872 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.871+0000 7f9155bff700 1 -- 192.168.123.100:0/3736130080 >> 192.168.123.100:0/3736130080 conn(0x7f915006d1a0 msgr2=0x7f915006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:45.872 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.871+0000 7f9155bff700 1 -- 192.168.123.100:0/3736130080 shutdown_connections 2026-03-10T12:38:45.873 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.871+0000 7f9155bff700 1 -- 192.168.123.100:0/3736130080 wait complete. 2026-03-10T12:38:45.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.872+0000 7f9155bff700 1 Processor -- start 2026-03-10T12:38:45.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.872+0000 7f9155bff700 1 -- start start 2026-03-10T12:38:45.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.872+0000 7f9155bff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9150072330 0x7f9150082450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:45.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.872+0000 7f9155bff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9150082990 0x7f9150082e00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:45.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.872+0000 7f9155bff700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9150083e00 con 0x7f9150082990 2026-03-10T12:38:45.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.872+0000 7f9155bff700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f91501b2a90 con 0x7f9150072330 2026-03-10T12:38:45.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.872+0000 7f914ffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9150082990 0x7f9150082e00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:45.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.872+0000 7f914ffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9150082990 0x7f9150082e00 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:54856/0 (socket says 192.168.123.100:54856) 2026-03-10T12:38:45.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.872+0000 7f914ffff700 1 -- 192.168.123.100:0/3886582447 learned_addr learned my addr 192.168.123.100:0/3886582447 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:38:45.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.872+0000 7f9154bfd700 1 --2- 192.168.123.100:0/3886582447 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9150072330 0x7f9150082450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:45.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.873+0000 7f9154bfd700 1 -- 192.168.123.100:0/3886582447 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9150082990 msgr2=0x7f9150082e00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:45.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.873+0000 7f9154bfd700 1 --2- 192.168.123.100:0/3886582447 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9150082990 0x7f9150082e00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:45.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.873+0000 7f9154bfd700 1 -- 192.168.123.100:0/3886582447 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9140008440 con 0x7f9150072330 2026-03-10T12:38:45.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:45.873+0000 7f9154bfd700 1 --2- 192.168.123.100:0/3886582447 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9150072330 0x7f9150082450 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto 
rx=0x7f9140000c00 tx=0x7f914000fa80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:46.123 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.122+0000 7f914dffb700 1 -- 192.168.123.100:0/3886582447 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9140003fe0 con 0x7f9150072330 2026-03-10T12:38:46.123 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:46 vm00.local ceph-mon[50686]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T12:38:46.123 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:46 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.kfawlb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T12:38:46.123 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:46 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T12:38:46.123 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:46 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:38:46.123 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.123+0000 7f9155bff700 1 -- 192.168.123.100:0/3886582447 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f91501b2c30 con 0x7f9150072330 2026-03-10T12:38:46.124 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.123+0000 7f9155bff700 1 -- 192.168.123.100:0/3886582447 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f91501b30f0 con 0x7f9150072330 2026-03-10T12:38:46.124 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.123+0000 7f914dffb700 1 -- 192.168.123.100:0/3886582447 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9140003c40 con 0x7f9150072330 2026-03-10T12:38:46.124 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.124+0000 7f914dffb700 1 -- 192.168.123.100:0/3886582447 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f914000a7b0 con 0x7f9150072330 2026-03-10T12:38:46.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.124+0000 7f9155bff700 1 -- 192.168.123.100:0/3886582447 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f915004ea50 con 0x7f9150072330 2026-03-10T12:38:46.127 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.126+0000 7f914dffb700 1 -- 192.168.123.100:0/3886582447 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f914000a910 con 0x7f9150072330 2026-03-10T12:38:46.127 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.126+0000 7f914dffb700 1 --2- 192.168.123.100:0/3886582447 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f9138077450 0x7f9138079900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:46.127 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.127+0000 7f914ffff700 1 --2- 192.168.123.100:0/3886582447 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f9138077450 0x7f9138079900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:46.127 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.127+0000 7f914ffff700 1 --2- 192.168.123.100:0/3886582447 >> 
[v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f9138077450 0x7f9138079900 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f9148007910 tx=0x7f914800d040 comp rx=0 tx=0).ready entity=mgr.24513 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:46.137 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.136+0000 7f914dffb700 1 -- 192.168.123.100:0/3886582447 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f9140031360 con 0x7f9150072330 2026-03-10T12:38:46.146 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.145+0000 7f914dffb700 1 -- 192.168.123.100:0/3886582447 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f91400677e0 con 0x7f9150072330 2026-03-10T12:38:46.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.307+0000 7f9155bff700 1 -- 192.168.123.100:0/3886582447 --> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f91501b3340 con 0x7f9138077450 2026-03-10T12:38:46.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.308+0000 7f914dffb700 1 -- 192.168.123.100:0/3886582447 <== mgr.24513 v2:192.168.123.100:6800/3276280342 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+368 (secure 0 0 0) 0x7f91501b3340 con 0x7f9138077450 2026-03-10T12:38:46.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.310+0000 7f9155bff700 1 -- 192.168.123.100:0/3886582447 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f9138077450 msgr2=0x7f9138079900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:46.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.310+0000 7f9155bff700 1 --2- 192.168.123.100:0/3886582447 >> 
[v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f9138077450 0x7f9138079900 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f9148007910 tx=0x7f914800d040 comp rx=0 tx=0).stop 2026-03-10T12:38:46.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.311+0000 7f9155bff700 1 -- 192.168.123.100:0/3886582447 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9150072330 msgr2=0x7f9150082450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:46.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.311+0000 7f9155bff700 1 --2- 192.168.123.100:0/3886582447 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9150072330 0x7f9150082450 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f9140000c00 tx=0x7f914000fa80 comp rx=0 tx=0).stop 2026-03-10T12:38:46.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.311+0000 7f9155bff700 1 -- 192.168.123.100:0/3886582447 shutdown_connections 2026-03-10T12:38:46.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.311+0000 7f9155bff700 1 --2- 192.168.123.100:0/3886582447 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f9138077450 0x7f9138079900 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.311+0000 7f9155bff700 1 --2- 192.168.123.100:0/3886582447 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9150072330 0x7f9150082450 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.311+0000 7f9155bff700 1 --2- 192.168.123.100:0/3886582447 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9150082990 0x7f9150082e00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:38:46.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.311+0000 7f9155bff700 1 -- 192.168.123.100:0/3886582447 >> 192.168.123.100:0/3886582447 conn(0x7f915006d1a0 msgr2=0x7f91500705f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:46.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.311+0000 7f9155bff700 1 -- 192.168.123.100:0/3886582447 shutdown_connections 2026-03-10T12:38:46.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.311+0000 7f9155bff700 1 -- 192.168.123.100:0/3886582447 wait complete. 2026-03-10T12:38:46.335 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:46 vm07.local ceph-mon[58582]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T12:38:46.335 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:46 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.kfawlb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T12:38:46.335 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:46 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T12:38:46.335 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:46 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:38:46.396 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.396+0000 7fc5d5e23700 1 -- 192.168.123.100:0/2999912264 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5c80a3e40 msgr2=0x7fc5c80a4290 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:46.397 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.396+0000 7fc5d5e23700 1 --2- 
192.168.123.100:0/2999912264 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5c80a3e40 0x7fc5c80a4290 secure :-1 s=READY pgs=357 cs=0 l=1 rev1=1 crypto rx=0x7fc5d00669f0 tx=0x7fc5d00671e0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.397 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.396+0000 7fc5d5e23700 1 -- 192.168.123.100:0/2999912264 shutdown_connections 2026-03-10T12:38:46.397 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.396+0000 7fc5d5e23700 1 --2- 192.168.123.100:0/2999912264 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5c80a3e40 0x7fc5c80a4290 unknown :-1 s=CLOSED pgs=357 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.397 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.396+0000 7fc5d5e23700 1 --2- 192.168.123.100:0/2999912264 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5c80a5800 0x7fc5c80a5c10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.397 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.396+0000 7fc5d5e23700 1 -- 192.168.123.100:0/2999912264 >> 192.168.123.100:0/2999912264 conn(0x7fc5c809f7b0 msgr2=0x7fc5c80a1c00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:46.399 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.396+0000 7fc5d5e23700 1 -- 192.168.123.100:0/2999912264 shutdown_connections 2026-03-10T12:38:46.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.396+0000 7fc5d5e23700 1 -- 192.168.123.100:0/2999912264 wait complete. 
2026-03-10T12:38:46.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.396+0000 7fc5d5e23700 1 Processor -- start 2026-03-10T12:38:46.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.396+0000 7fc5d5e23700 1 -- start start 2026-03-10T12:38:46.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.396+0000 7fc5d5e23700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5c80a5800 0x7fc5c80cfd00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:46.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.397+0000 7fc5d5e23700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5c80d0240 0x7fc5c8010f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:46.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.397+0000 7fc5d5e23700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc5c80d0740 con 0x7fc5c80a5800 2026-03-10T12:38:46.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.397+0000 7fc5d5e23700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc5c80d08b0 con 0x7fc5c80d0240 2026-03-10T12:38:46.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.399+0000 7fc5d4e21700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5c80a5800 0x7fc5c80cfd00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:46.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.399+0000 7fc5d4e21700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5c80a5800 0x7fc5c80cfd00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:54876/0 (socket says 192.168.123.100:54876) 2026-03-10T12:38:46.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.399+0000 7fc5d4e21700 1 -- 192.168.123.100:0/1041299868 learned_addr learned my addr 192.168.123.100:0/1041299868 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:38:46.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.400+0000 7fc5d4e21700 1 -- 192.168.123.100:0/1041299868 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5c80d0240 msgr2=0x7fc5c8010f40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:46.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.400+0000 7fc5d4e21700 1 --2- 192.168.123.100:0/1041299868 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5c80d0240 0x7fc5c8010f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.400+0000 7fc5d4e21700 1 -- 192.168.123.100:0/1041299868 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc5d0067050 con 0x7fc5c80a5800 2026-03-10T12:38:46.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.401+0000 7fc5d4e21700 1 --2- 192.168.123.100:0/1041299868 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5c80a5800 0x7fc5c80cfd00 secure :-1 s=READY pgs=358 cs=0 l=1 rev1=1 crypto rx=0x7fc5c400b6e0 tx=0x7fc5c400baa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:46.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.401+0000 7fc5cdffb700 1 -- 192.168.123.100:0/1041299868 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc5c400f800 con 0x7fc5c80a5800 2026-03-10T12:38:46.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.401+0000 7fc5cdffb700 1 -- 
192.168.123.100:0/1041299868 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc5c400fe40 con 0x7fc5c80a5800 2026-03-10T12:38:46.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.401+0000 7fc5cdffb700 1 -- 192.168.123.100:0/1041299868 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc5c400d3e0 con 0x7fc5c80a5800 2026-03-10T12:38:46.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.401+0000 7fc5d5e23700 1 -- 192.168.123.100:0/1041299868 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc5c80114e0 con 0x7fc5c80a5800 2026-03-10T12:38:46.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.401+0000 7fc5d5e23700 1 -- 192.168.123.100:0/1041299868 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc5c8011a00 con 0x7fc5c80a5800 2026-03-10T12:38:46.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.402+0000 7fc5bb7fe700 1 -- 192.168.123.100:0/1041299868 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc5c8004f40 con 0x7fc5c80a5800 2026-03-10T12:38:46.409 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.408+0000 7fc5cdffb700 1 -- 192.168.123.100:0/1041299868 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7fc5c401e030 con 0x7fc5c80a5800 2026-03-10T12:38:46.409 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.408+0000 7fc5cdffb700 1 --2- 192.168.123.100:0/1041299868 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7fc5c0077680 0x7fc5c0079b30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:46.409 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.408+0000 7fc5cdffb700 1 -- 
192.168.123.100:0/1041299868 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7fc5c4099410 con 0x7fc5c80a5800 2026-03-10T12:38:46.409 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.409+0000 7fc5cffff700 1 --2- 192.168.123.100:0/1041299868 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7fc5c0077680 0x7fc5c0079b30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:46.410 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.410+0000 7fc5cffff700 1 --2- 192.168.123.100:0/1041299868 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7fc5c0077680 0x7fc5c0079b30 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fc5d004ed20 tx=0x7fc5d0070040 comp rx=0 tx=0).ready entity=mgr.24513 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:46.411 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.410+0000 7fc5cdffb700 1 -- 192.168.123.100:0/1041299868 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fc5c4061fa0 con 0x7fc5c80a5800 2026-03-10T12:38:46.589 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.588+0000 7fc5bb7fe700 1 -- 192.168.123.100:0/1041299868 --> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fc5c8003e60 con 0x7fc5c0077680 2026-03-10T12:38:46.594 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.594+0000 7fc5cdffb700 1 -- 192.168.123.100:0/1041299868 <== mgr.24513 v2:192.168.123.100:6800/3276280342 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7fc5c8003e60 con 0x7fc5c0077680 2026-03-10T12:38:46.594 INFO:teuthology.orchestra.run.vm00.stdout:NAME 
HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:38:46.594 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (5m) 7s ago 5m 25.2M - 0.25.0 c8568f914cd2 1897443e8fdf 2026-03-10T12:38:46.594 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (5m) 7s ago 5m 8434k - 18.2.0 dc2bc1663786 d9c35bbdf4cd 2026-03-10T12:38:46.594 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (5m) 9s ago 5m 11.1M - 18.2.0 dc2bc1663786 2a98961ae9ca 2026-03-10T12:38:46.594 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (5m) 7s ago 5m 7407k - 18.2.0 dc2bc1663786 4726e39e7eb0 2026-03-10T12:38:46.594 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (5m) 9s ago 5m 7402k - 18.2.0 dc2bc1663786 f917dac1f418 2026-03-10T12:38:46.594 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (4m) 7s ago 5m 88.5M - 9.4.7 954c08fa6188 16c12a6ce1fc 2026-03-10T12:38:46.594 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (3m) 7s ago 3m 156M - 18.2.0 dc2bc1663786 13dfd2469732 2026-03-10T12:38:46.594 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (3m) 7s ago 3m 16.5M - 18.2.0 dc2bc1663786 d65368ac6dfa 2026-03-10T12:38:46.594 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (3m) 9s ago 3m 16.4M - 18.2.0 dc2bc1663786 1b9425223bd2 2026-03-10T12:38:46.594 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (3m) 9s ago 3m 181M - 18.2.0 dc2bc1663786 9c2b36fc1ac1 2026-03-10T12:38:46.594 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:8443,9283,8765 running (19s) 7s ago 6m 592M - 19.2.3-678-ge911bdeb 654f31e6858e eaf11f86af53 2026-03-10T12:38:46.595 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (46s) 9s ago 5m 359M - 19.2.3-678-ge911bdeb 654f31e6858e ca47c92cac17 
2026-03-10T12:38:46.595 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (6m) 7s ago 6m 51.2M 2048M 18.2.0 dc2bc1663786 c8d836b38502 2026-03-10T12:38:46.595 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (5m) 9s ago 5m 43.4M 2048M 18.2.0 dc2bc1663786 7712955135fc 2026-03-10T12:38:46.595 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (5m) 7s ago 5m 14.8M - 1.5.0 0da6a335fe13 7a5c14d6ba46 2026-03-10T12:38:46.595 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (5m) 9s ago 5m 15.1M - 1.5.0 0da6a335fe13 2fac2415b763 2026-03-10T12:38:46.595 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (4m) 7s ago 4m 341M 4096M 18.2.0 dc2bc1663786 d5b05007694d 2026-03-10T12:38:46.595 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (4m) 7s ago 4m 377M 4096M 18.2.0 dc2bc1663786 5bc971fe4d49 2026-03-10T12:38:46.595 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (4m) 7s ago 4m 326M 4096M 18.2.0 dc2bc1663786 a5a89ccf847e 2026-03-10T12:38:46.595 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (4m) 9s ago 4m 428M 4096M 18.2.0 dc2bc1663786 0c6249fe3951 2026-03-10T12:38:46.595 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (4m) 9s ago 4m 378M 4096M 18.2.0 dc2bc1663786 5c66bfb63a83 2026-03-10T12:38:46.595 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (4m) 9s ago 4m 393M 4096M 18.2.0 dc2bc1663786 277bdfe4ec55 2026-03-10T12:38:46.595 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 starting - - - - 2026-03-10T12:38:46.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.596+0000 7fc5bb7fe700 1 -- 192.168.123.100:0/1041299868 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7fc5c0077680 msgr2=0x7fc5c0079b30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:46.597 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.596+0000 7fc5bb7fe700 1 --2- 192.168.123.100:0/1041299868 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7fc5c0077680 0x7fc5c0079b30 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fc5d004ed20 tx=0x7fc5d0070040 comp rx=0 tx=0).stop 2026-03-10T12:38:46.597 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.596+0000 7fc5bb7fe700 1 -- 192.168.123.100:0/1041299868 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5c80a5800 msgr2=0x7fc5c80cfd00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:46.597 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.596+0000 7fc5bb7fe700 1 --2- 192.168.123.100:0/1041299868 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5c80a5800 0x7fc5c80cfd00 secure :-1 s=READY pgs=358 cs=0 l=1 rev1=1 crypto rx=0x7fc5c400b6e0 tx=0x7fc5c400baa0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.597 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.596+0000 7fc5bb7fe700 1 -- 192.168.123.100:0/1041299868 shutdown_connections 2026-03-10T12:38:46.597 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.596+0000 7fc5bb7fe700 1 --2- 192.168.123.100:0/1041299868 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7fc5c0077680 0x7fc5c0079b30 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.597 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.596+0000 7fc5bb7fe700 1 --2- 192.168.123.100:0/1041299868 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5c80a5800 0x7fc5c80cfd00 unknown :-1 s=CLOSED pgs=358 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.597 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.596+0000 7fc5bb7fe700 1 --2- 192.168.123.100:0/1041299868 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7fc5c80d0240 0x7fc5c8010f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.597 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.596+0000 7fc5bb7fe700 1 -- 192.168.123.100:0/1041299868 >> 192.168.123.100:0/1041299868 conn(0x7fc5c809f7b0 msgr2=0x7fc5c80078d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:46.597 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.596+0000 7fc5bb7fe700 1 -- 192.168.123.100:0/1041299868 shutdown_connections 2026-03-10T12:38:46.597 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.597+0000 7fc5bb7fe700 1 -- 192.168.123.100:0/1041299868 wait complete. 2026-03-10T12:38:46.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.666+0000 7ff53ef4c700 1 -- 192.168.123.100:0/1645732546 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff5380fee80 msgr2=0x7ff5381012a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:46.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.666+0000 7ff53ef4c700 1 --2- 192.168.123.100:0/1645732546 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff5380fee80 0x7ff5381012a0 secure :-1 s=READY pgs=359 cs=0 l=1 rev1=1 crypto rx=0x7ff528009b50 tx=0x7ff528009e60 comp rx=0 tx=0).stop 2026-03-10T12:38:46.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.669+0000 7ff53ef4c700 1 -- 192.168.123.100:0/1645732546 shutdown_connections 2026-03-10T12:38:46.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.669+0000 7ff53ef4c700 1 --2- 192.168.123.100:0/1645732546 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5381017e0 0x7ff538103c60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.669+0000 7ff53ef4c700 1 --2- 192.168.123.100:0/1645732546 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff5380fee80 0x7ff5381012a0 unknown :-1 s=CLOSED pgs=359 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.669+0000 7ff53ef4c700 1 -- 192.168.123.100:0/1645732546 >> 192.168.123.100:0/1645732546 conn(0x7ff5380faa70 msgr2=0x7ff5380fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:46.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.669+0000 7ff53ef4c700 1 -- 192.168.123.100:0/1645732546 shutdown_connections 2026-03-10T12:38:46.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.669+0000 7ff53ef4c700 1 -- 192.168.123.100:0/1645732546 wait complete. 2026-03-10T12:38:46.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.670+0000 7ff53ef4c700 1 Processor -- start 2026-03-10T12:38:46.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.670+0000 7ff53ef4c700 1 -- start start 2026-03-10T12:38:46.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.670+0000 7ff53ef4c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff5380fee80 0x7ff53819c470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:46.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.670+0000 7ff53ef4c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5381017e0 0x7ff53819c9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:46.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.670+0000 7ff53ef4c700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff53819cfd0 con 0x7ff5380fee80 2026-03-10T12:38:46.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.670+0000 7ff53ef4c700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7ff53819d110 con 0x7ff5381017e0 2026-03-10T12:38:46.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.670+0000 7ff537fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5381017e0 0x7ff53819c9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:46.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.670+0000 7ff537fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5381017e0 0x7ff53819c9b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:44728/0 (socket says 192.168.123.100:44728) 2026-03-10T12:38:46.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.670+0000 7ff537fff700 1 -- 192.168.123.100:0/3222875509 learned_addr learned my addr 192.168.123.100:0/3222875509 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:38:46.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.671+0000 7ff53cce8700 1 --2- 192.168.123.100:0/3222875509 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff5380fee80 0x7ff53819c470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:46.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.671+0000 7ff537fff700 1 -- 192.168.123.100:0/3222875509 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff5380fee80 msgr2=0x7ff53819c470 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:46.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.671+0000 7ff537fff700 1 --2- 192.168.123.100:0/3222875509 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff5380fee80 0x7ff53819c470 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.671+0000 7ff537fff700 1 -- 192.168.123.100:0/3222875509 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff5280097e0 con 0x7ff5381017e0 2026-03-10T12:38:46.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.671+0000 7ff53cce8700 1 --2- 192.168.123.100:0/3222875509 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff5380fee80 0x7ff53819c470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T12:38:46.672 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.671+0000 7ff537fff700 1 --2- 192.168.123.100:0/3222875509 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5381017e0 0x7ff53819c9b0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7ff52c00eb10 tx=0x7ff52c00ee20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:46.672 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.671+0000 7ff535ffb700 1 -- 192.168.123.100:0/3222875509 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff52c00cc40 con 0x7ff5381017e0 2026-03-10T12:38:46.672 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.671+0000 7ff53ef4c700 1 -- 192.168.123.100:0/3222875509 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff5381a1bc0 con 0x7ff5381017e0 2026-03-10T12:38:46.672 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.672+0000 7ff535ffb700 1 -- 192.168.123.100:0/3222875509 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff52c00cda0 con 0x7ff5381017e0 2026-03-10T12:38:46.673 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.672+0000 7ff535ffb700 1 -- 
192.168.123.100:0/3222875509 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff52c018850 con 0x7ff5381017e0 2026-03-10T12:38:46.673 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.672+0000 7ff53ef4c700 1 -- 192.168.123.100:0/3222875509 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff5381a2110 con 0x7ff5381017e0 2026-03-10T12:38:46.673 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.673+0000 7ff53ef4c700 1 -- 192.168.123.100:0/3222875509 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff538196590 con 0x7ff5381017e0 2026-03-10T12:38:46.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.674+0000 7ff535ffb700 1 -- 192.168.123.100:0/3222875509 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7ff52c018a80 con 0x7ff5381017e0 2026-03-10T12:38:46.675 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.674+0000 7ff535ffb700 1 --2- 192.168.123.100:0/3222875509 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7ff520077610 0x7ff520079ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:46.675 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.674+0000 7ff535ffb700 1 -- 192.168.123.100:0/3222875509 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7ff52c026080 con 0x7ff5381017e0 2026-03-10T12:38:46.675 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.675+0000 7ff53cce8700 1 --2- 192.168.123.100:0/3222875509 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7ff520077610 0x7ff520079ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T12:38:46.675 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.675+0000 7ff53cce8700 1 --2- 192.168.123.100:0/3222875509 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7ff520077610 0x7ff520079ac0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7ff5280053b0 tx=0x7ff528005a90 comp rx=0 tx=0).ready entity=mgr.24513 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:46.677 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.676+0000 7ff535ffb700 1 -- 192.168.123.100:0/3222875509 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7ff52c0629e0 con 0x7ff5381017e0 2026-03-10T12:38:46.877 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.877+0000 7ff53ef4c700 1 -- 192.168.123.100:0/3222875509 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7ff53802d080 con 0x7ff5381017e0 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.877+0000 7ff535ffb700 1 -- 192.168.123.100:0/3222875509 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7ff52c062130 con 0x7ff5381017e0 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: }, 
2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 12, 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:38:46.878 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:38:46.881 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.881+0000 7ff53ef4c700 1 -- 192.168.123.100:0/3222875509 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7ff520077610 msgr2=0x7ff520079ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:46.881 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.881+0000 7ff53ef4c700 1 --2- 192.168.123.100:0/3222875509 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7ff520077610 0x7ff520079ac0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7ff5280053b0 tx=0x7ff528005a90 comp rx=0 tx=0).stop 2026-03-10T12:38:46.881 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.881+0000 7ff53ef4c700 1 -- 192.168.123.100:0/3222875509 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7ff5381017e0 msgr2=0x7ff53819c9b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:46.881 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.881+0000 7ff53ef4c700 1 --2- 192.168.123.100:0/3222875509 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5381017e0 0x7ff53819c9b0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7ff52c00eb10 tx=0x7ff52c00ee20 comp rx=0 tx=0).stop 2026-03-10T12:38:46.881 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.881+0000 7ff53ef4c700 1 -- 192.168.123.100:0/3222875509 shutdown_connections 2026-03-10T12:38:46.882 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.881+0000 7ff53ef4c700 1 --2- 192.168.123.100:0/3222875509 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7ff520077610 0x7ff520079ac0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.882 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.881+0000 7ff53ef4c700 1 --2- 192.168.123.100:0/3222875509 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff5380fee80 0x7ff53819c470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.882 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.881+0000 7ff53ef4c700 1 --2- 192.168.123.100:0/3222875509 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff5381017e0 0x7ff53819c9b0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.882 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.881+0000 7ff53ef4c700 1 -- 192.168.123.100:0/3222875509 >> 192.168.123.100:0/3222875509 conn(0x7ff5380faa70 msgr2=0x7ff5380fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:46.882 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.881+0000 7ff53ef4c700 1 -- 192.168.123.100:0/3222875509 shutdown_connections 
2026-03-10T12:38:46.882 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.881+0000 7ff53ef4c700 1 -- 192.168.123.100:0/3222875509 wait complete. 2026-03-10T12:38:46.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.973+0000 7f17090f0700 1 -- 192.168.123.100:0/2282856323 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1704072330 msgr2=0x7f17040770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:46.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.973+0000 7f17090f0700 1 --2- 192.168.123.100:0/2282856323 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1704072330 0x7f17040770b0 secure :-1 s=READY pgs=360 cs=0 l=1 rev1=1 crypto rx=0x7f16fc00b330 tx=0x7f16fc00b640 comp rx=0 tx=0).stop 2026-03-10T12:38:46.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.973+0000 7f17090f0700 1 -- 192.168.123.100:0/2282856323 shutdown_connections 2026-03-10T12:38:46.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.973+0000 7f17090f0700 1 --2- 192.168.123.100:0/2282856323 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1704072330 0x7f17040770b0 unknown :-1 s=CLOSED pgs=360 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.973+0000 7f17090f0700 1 --2- 192.168.123.100:0/2282856323 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1704071950 0x7f1704071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.973+0000 7f17090f0700 1 -- 192.168.123.100:0/2282856323 >> 192.168.123.100:0/2282856323 conn(0x7f170406d1a0 msgr2=0x7f170406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:46.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.973+0000 7f17090f0700 1 -- 192.168.123.100:0/2282856323 
shutdown_connections 2026-03-10T12:38:46.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.973+0000 7f17090f0700 1 -- 192.168.123.100:0/2282856323 wait complete. 2026-03-10T12:38:46.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.974+0000 7f17090f0700 1 Processor -- start 2026-03-10T12:38:46.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.974+0000 7f17090f0700 1 -- start start 2026-03-10T12:38:46.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.974+0000 7f17090f0700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1704071950 0x7f1704082500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:46.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.974+0000 7f17090f0700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1704082a40 0x7f1704082eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:46.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.974+0000 7f17090f0700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f170412dd80 con 0x7f1704082a40 2026-03-10T12:38:46.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.974+0000 7f17090f0700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f170412def0 con 0x7f1704071950 2026-03-10T12:38:46.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.975+0000 7f1703fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1704071950 0x7f1704082500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:46.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.975+0000 7f1703fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f1704071950 0x7f1704082500 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:44740/0 (socket says 192.168.123.100:44740) 2026-03-10T12:38:46.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.975+0000 7f1703fff700 1 -- 192.168.123.100:0/330122572 learned_addr learned my addr 192.168.123.100:0/330122572 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:38:46.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.975+0000 7f17037fe700 1 --2- 192.168.123.100:0/330122572 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1704082a40 0x7f1704082eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:46.976 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.976+0000 7f17037fe700 1 -- 192.168.123.100:0/330122572 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1704071950 msgr2=0x7f1704082500 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:46.976 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.976+0000 7f17037fe700 1 --2- 192.168.123.100:0/330122572 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1704071950 0x7f1704082500 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:46.976 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.976+0000 7f17037fe700 1 -- 192.168.123.100:0/330122572 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f16fc00b050 con 0x7f1704082a40 2026-03-10T12:38:46.976 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.976+0000 7f17037fe700 1 --2- 192.168.123.100:0/330122572 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1704082a40 0x7f1704082eb0 secure :-1 
s=READY pgs=361 cs=0 l=1 rev1=1 crypto rx=0x7f16fc00b330 tx=0x7f16fc0088b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:46.977 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.976+0000 7f17017fa700 1 -- 192.168.123.100:0/330122572 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f16fc00e040 con 0x7f1704082a40 2026-03-10T12:38:46.977 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.977+0000 7f17090f0700 1 -- 192.168.123.100:0/330122572 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f170412e110 con 0x7f1704082a40 2026-03-10T12:38:46.977 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.977+0000 7f17090f0700 1 -- 192.168.123.100:0/330122572 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f170412e600 con 0x7f1704082a40 2026-03-10T12:38:46.978 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.977+0000 7f17090f0700 1 -- 192.168.123.100:0/330122572 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f170404ea50 con 0x7f1704082a40 2026-03-10T12:38:46.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.978+0000 7f17017fa700 1 -- 192.168.123.100:0/330122572 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f16fc009d70 con 0x7f1704082a40 2026-03-10T12:38:46.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.978+0000 7f17017fa700 1 -- 192.168.123.100:0/330122572 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f16fc004400 con 0x7f1704082a40 2026-03-10T12:38:46.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.980+0000 7f17017fa700 1 -- 192.168.123.100:0/330122572 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 30) v1 ==== 
100066+0+0 (secure 0 0 0) 0x7f16fc008a50 con 0x7f1704082a40 2026-03-10T12:38:46.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.981+0000 7f17017fa700 1 --2- 192.168.123.100:0/330122572 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f16ec077680 0x7f16ec079b30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:46.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.981+0000 7f1703fff700 1 --2- 192.168.123.100:0/330122572 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f16ec077680 0x7f16ec079b30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:46.984 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.984+0000 7f1703fff700 1 --2- 192.168.123.100:0/330122572 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f16ec077680 0x7f16ec079b30 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f16f4005f00 tx=0x7f16f400c040 comp rx=0 tx=0).ready entity=mgr.24513 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:46.984 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.984+0000 7f17017fa700 1 -- 192.168.123.100:0/330122572 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f16fc068560 con 0x7f1704082a40 2026-03-10T12:38:46.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:46.994+0000 7f17017fa700 1 -- 192.168.123.100:0/330122572 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f16fc063f80 con 0x7f1704082a40 2026-03-10T12:38:47.237 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:47 vm00.local ceph-mon[50686]: Upgrade: Updating mgr.vm07.kfawlb 2026-03-10T12:38:47.237 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:47 vm00.local ceph-mon[50686]: Deploying daemon mgr.vm07.kfawlb on vm07 2026-03-10T12:38:47.237 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:47 vm00.local ceph-mon[50686]: pgmap v8: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 31 KiB/s rd, 614 KiB/s wr, 98 op/s 2026-03-10T12:38:47.237 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:47 vm00.local ceph-mon[50686]: from='client.24537 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:38:47.237 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:47 vm00.local ceph-mon[50686]: from='client.24541 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:38:47.237 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:47 vm00.local ceph-mon[50686]: from='client.14750 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:38:47.237 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:47 vm00.local ceph-mon[50686]: from='client.? 
192.168.123.100:0/3222875509' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:38:47.237 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:47 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:47.237 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:47 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:47.237 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:47 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:47.270 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.269+0000 7f17090f0700 1 -- 192.168.123.100:0/330122572 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f170412e8e0 con 0x7f1704082a40 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:e13 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:epoch 13 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 
2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:37:36.646083+0000 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307} 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:failed 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:38:47.275 
INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons: 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:38:47.275 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.271+0000 7f17017fa700 1 -- 192.168.123.100:0/330122572 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1870 (secure 0 0 0) 0x7f16fc015760 con 0x7f1704082a40 2026-03-10T12:38:47.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.277+0000 7f16eaffd700 1 -- 192.168.123.100:0/330122572 >> 
[v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f16ec077680 msgr2=0x7f16ec079b30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:47.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.277+0000 7f16eaffd700 1 --2- 192.168.123.100:0/330122572 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f16ec077680 0x7f16ec079b30 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f16f4005f00 tx=0x7f16f400c040 comp rx=0 tx=0).stop 2026-03-10T12:38:47.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.277+0000 7f16eaffd700 1 -- 192.168.123.100:0/330122572 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1704082a40 msgr2=0x7f1704082eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:47.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.277+0000 7f16eaffd700 1 --2- 192.168.123.100:0/330122572 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1704082a40 0x7f1704082eb0 secure :-1 s=READY pgs=361 cs=0 l=1 rev1=1 crypto rx=0x7f16fc00b330 tx=0x7f16fc0088b0 comp rx=0 tx=0).stop 2026-03-10T12:38:47.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.277+0000 7f16eaffd700 1 -- 192.168.123.100:0/330122572 shutdown_connections 2026-03-10T12:38:47.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.277+0000 7f16eaffd700 1 --2- 192.168.123.100:0/330122572 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f16ec077680 0x7f16ec079b30 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:47.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.277+0000 7f16eaffd700 1 --2- 192.168.123.100:0/330122572 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1704071950 0x7f1704082500 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:38:47.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.277+0000 7f16eaffd700 1 --2- 192.168.123.100:0/330122572 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1704082a40 0x7f1704082eb0 unknown :-1 s=CLOSED pgs=361 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:47.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.277+0000 7f16eaffd700 1 -- 192.168.123.100:0/330122572 >> 192.168.123.100:0/330122572 conn(0x7f170406d1a0 msgr2=0x7f17040764f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:47.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.277+0000 7f16eaffd700 1 -- 192.168.123.100:0/330122572 shutdown_connections 2026-03-10T12:38:47.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.278+0000 7f16eaffd700 1 -- 192.168.123.100:0/330122572 wait complete. 2026-03-10T12:38:47.289 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 13 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.441+0000 7fb936254700 1 -- 192.168.123.100:0/3646106467 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb930071980 msgr2=0x7fb930071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.441+0000 7fb936254700 1 --2- 192.168.123.100:0/3646106467 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb930071980 0x7fb930071d90 secure :-1 s=READY pgs=362 cs=0 l=1 rev1=1 crypto rx=0x7fb9200077e0 tx=0x7fb920007af0 comp rx=0 tx=0).stop 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.441+0000 7fb936254700 1 -- 192.168.123.100:0/3646106467 shutdown_connections 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.441+0000 7fb936254700 1 --2- 192.168.123.100:0/3646106467 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7fb930072360 0x7fb9300770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.441+0000 7fb936254700 1 --2- 192.168.123.100:0/3646106467 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb930071980 0x7fb930071d90 unknown :-1 s=CLOSED pgs=362 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.441+0000 7fb936254700 1 -- 192.168.123.100:0/3646106467 >> 192.168.123.100:0/3646106467 conn(0x7fb93006d1a0 msgr2=0x7fb93006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.442+0000 7fb936254700 1 -- 192.168.123.100:0/3646106467 shutdown_connections 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.442+0000 7fb936254700 1 -- 192.168.123.100:0/3646106467 wait complete. 
2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.442+0000 7fb936254700 1 Processor -- start 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.442+0000 7fb936254700 1 -- start start 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.442+0000 7fb936254700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb930072360 0x7fb930082550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.442+0000 7fb936254700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb930082a90 0x7fb930082f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.442+0000 7fb936254700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb93012dd80 con 0x7fb930082a90 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.442+0000 7fb936254700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb93012def0 con 0x7fb930072360 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.443+0000 7fb92effd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb930082a90 0x7fb930082f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.443+0000 7fb92effd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb930082a90 0x7fb930082f00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:54920/0 (socket says 192.168.123.100:54920) 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.443+0000 7fb92effd700 1 -- 192.168.123.100:0/1428600901 learned_addr learned my addr 192.168.123.100:0/1428600901 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.443+0000 7fb92effd700 1 -- 192.168.123.100:0/1428600901 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb930072360 msgr2=0x7fb930082550 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.443+0000 7fb92effd700 1 --2- 192.168.123.100:0/1428600901 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb930072360 0x7fb930082550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.443+0000 7fb92effd700 1 -- 192.168.123.100:0/1428600901 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb920007430 con 0x7fb930082a90 2026-03-10T12:38:47.444 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.443+0000 7fb92effd700 1 --2- 192.168.123.100:0/1428600901 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb930082a90 0x7fb930082f00 secure :-1 s=READY pgs=363 cs=0 l=1 rev1=1 crypto rx=0x7fb928007f00 tx=0x7fb92800d3b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:47.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.446+0000 7fb92cff9700 1 -- 192.168.123.100:0/1428600901 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb92800dcf0 con 0x7fb930082a90 2026-03-10T12:38:47.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.446+0000 7fb936254700 1 -- 
192.168.123.100:0/1428600901 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb93012e170 con 0x7fb930082a90 2026-03-10T12:38:47.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.446+0000 7fb936254700 1 -- 192.168.123.100:0/1428600901 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb93012e6c0 con 0x7fb930082a90 2026-03-10T12:38:47.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.446+0000 7fb92cff9700 1 -- 192.168.123.100:0/1428600901 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb92800f040 con 0x7fb930082a90 2026-03-10T12:38:47.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.446+0000 7fb92cff9700 1 -- 192.168.123.100:0/1428600901 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb9280127c0 con 0x7fb930082a90 2026-03-10T12:38:47.448 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.447+0000 7fb936254700 1 -- 192.168.123.100:0/1428600901 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb93004ea50 con 0x7fb930082a90 2026-03-10T12:38:47.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.449+0000 7fb92cff9700 1 -- 192.168.123.100:0/1428600901 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7fb928004ad0 con 0x7fb930082a90 2026-03-10T12:38:47.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.450+0000 7fb92cff9700 1 --2- 192.168.123.100:0/1428600901 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7fb918077780 0x7fb918079c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:47.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.450+0000 7fb92cff9700 1 -- 
192.168.123.100:0/1428600901 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7fb928098e90 con 0x7fb930082a90 2026-03-10T12:38:47.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.450+0000 7fb92f7fe700 1 --2- 192.168.123.100:0/1428600901 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7fb918077780 0x7fb918079c30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:47.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.451+0000 7fb92cff9700 1 -- 192.168.123.100:0/1428600901 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7fb928061a20 con 0x7fb930082a90 2026-03-10T12:38:47.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.452+0000 7fb92f7fe700 1 --2- 192.168.123.100:0/1428600901 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7fb918077780 0x7fb918079c30 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fb920007ef0 tx=0x7fb920005ba0 comp rx=0 tx=0).ready entity=mgr.24513 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:47.502 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:47 vm07.local ceph-mon[58582]: Upgrade: Updating mgr.vm07.kfawlb 2026-03-10T12:38:47.502 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:47 vm07.local ceph-mon[58582]: Deploying daemon mgr.vm07.kfawlb on vm07 2026-03-10T12:38:47.502 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:47 vm07.local ceph-mon[58582]: pgmap v8: 65 pgs: 65 active+clean; 3.0 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 31 KiB/s rd, 614 KiB/s wr, 98 op/s 2026-03-10T12:38:47.502 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:47 vm07.local ceph-mon[58582]: from='client.24537 -' entity='client.admin' 
cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:38:47.502 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:47 vm07.local ceph-mon[58582]: from='client.24541 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:38:47.502 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:47 vm07.local ceph-mon[58582]: from='client.14750 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:38:47.502 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:47 vm07.local ceph-mon[58582]: from='client.? 192.168.123.100:0/3222875509' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:38:47.502 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:47 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:47.502 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:47 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:47.502 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:47 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:47.692 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.692+0000 7fb936254700 1 -- 192.168.123.100:0/1428600901 --> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fb930061190 con 0x7fb918077780 2026-03-10T12:38:47.696 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:38:47.696 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T12:38:47.697 
INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true, 2026-03-10T12:38:47.697 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T12:38:47.697 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [], 2026-03-10T12:38:47.697 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "", 2026-03-10T12:38:47.697 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Currently upgrading mgr daemons", 2026-03-10T12:38:47.697 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false 2026-03-10T12:38:47.697 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:38:47.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.693+0000 7fb92cff9700 1 -- 192.168.123.100:0/1428600901 <== mgr.24513 v2:192.168.123.100:6800/3276280342 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+328 (secure 0 0 0) 0x7fb930061190 con 0x7fb918077780 2026-03-10T12:38:47.698 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.696+0000 7fb9167fc700 1 -- 192.168.123.100:0/1428600901 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7fb918077780 msgr2=0x7fb918079c30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:47.698 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.696+0000 7fb9167fc700 1 --2- 192.168.123.100:0/1428600901 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7fb918077780 0x7fb918079c30 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fb920007ef0 tx=0x7fb920005ba0 comp rx=0 tx=0).stop 2026-03-10T12:38:47.698 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.696+0000 7fb9167fc700 1 -- 192.168.123.100:0/1428600901 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb930082a90 msgr2=0x7fb930082f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:47.698 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.696+0000 7fb9167fc700 1 
--2- 192.168.123.100:0/1428600901 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb930082a90 0x7fb930082f00 secure :-1 s=READY pgs=363 cs=0 l=1 rev1=1 crypto rx=0x7fb928007f00 tx=0x7fb92800d3b0 comp rx=0 tx=0).stop 2026-03-10T12:38:47.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.699+0000 7fb9167fc700 1 -- 192.168.123.100:0/1428600901 shutdown_connections 2026-03-10T12:38:47.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.699+0000 7fb9167fc700 1 --2- 192.168.123.100:0/1428600901 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7fb918077780 0x7fb918079c30 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:47.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.699+0000 7fb9167fc700 1 --2- 192.168.123.100:0/1428600901 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb930072360 0x7fb930082550 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:47.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.699+0000 7fb9167fc700 1 --2- 192.168.123.100:0/1428600901 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb930082a90 0x7fb930082f00 unknown :-1 s=CLOSED pgs=363 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:47.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.699+0000 7fb9167fc700 1 -- 192.168.123.100:0/1428600901 >> 192.168.123.100:0/1428600901 conn(0x7fb93006d1a0 msgr2=0x7fb93006e090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:47.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.699+0000 7fb9167fc700 1 -- 192.168.123.100:0/1428600901 shutdown_connections 2026-03-10T12:38:47.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.699+0000 7fb9167fc700 1 -- 192.168.123.100:0/1428600901 wait complete. 
2026-03-10T12:38:47.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.828+0000 7f908fd14700 1 -- 192.168.123.100:0/3470202644 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9088072360 msgr2=0x7f90880770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:47.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.828+0000 7f908fd14700 1 --2- 192.168.123.100:0/3470202644 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9088072360 0x7f90880770e0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f9080008220 tx=0x7f9080008530 comp rx=0 tx=0).stop 2026-03-10T12:38:47.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.828+0000 7f908fd14700 1 -- 192.168.123.100:0/3470202644 shutdown_connections 2026-03-10T12:38:47.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.828+0000 7f908fd14700 1 --2- 192.168.123.100:0/3470202644 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9088072360 0x7f90880770e0 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:47.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.828+0000 7f908fd14700 1 --2- 192.168.123.100:0/3470202644 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9088071980 0x7f9088071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:47.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.828+0000 7f908fd14700 1 -- 192.168.123.100:0/3470202644 >> 192.168.123.100:0/3470202644 conn(0x7f908806d1a0 msgr2=0x7f908806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:47.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.828+0000 7f908fd14700 1 -- 192.168.123.100:0/3470202644 shutdown_connections 2026-03-10T12:38:47.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.828+0000 7f908fd14700 1 -- 192.168.123.100:0/3470202644 
wait complete. 2026-03-10T12:38:47.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.828+0000 7f908fd14700 1 Processor -- start 2026-03-10T12:38:47.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.828+0000 7f908fd14700 1 -- start start 2026-03-10T12:38:47.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.828+0000 7f908fd14700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9088071980 0x7f908807b320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:47.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.828+0000 7f908fd14700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f908807b860 0x7f90880808d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:47.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.828+0000 7f908fd14700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f908807bd60 con 0x7f9088071980 2026-03-10T12:38:47.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.828+0000 7f908fd14700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f908807bed0 con 0x7f908807b860 2026-03-10T12:38:47.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.829+0000 7f908d2af700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f908807b860 0x7f90880808d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:47.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.829+0000 7f908d2af700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f908807b860 0x7f90880808d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 
says I am v2:192.168.123.100:46418/0 (socket says 192.168.123.100:46418) 2026-03-10T12:38:47.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.829+0000 7f908d2af700 1 -- 192.168.123.100:0/694070323 learned_addr learned my addr 192.168.123.100:0/694070323 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:38:47.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.829+0000 7f908d2af700 1 -- 192.168.123.100:0/694070323 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9088071980 msgr2=0x7f908807b320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:47.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.829+0000 7f908d2af700 1 --2- 192.168.123.100:0/694070323 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9088071980 0x7f908807b320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:47.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.829+0000 7f908d2af700 1 -- 192.168.123.100:0/694070323 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9080007ed0 con 0x7f908807b860 2026-03-10T12:38:47.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.830+0000 7f908d2af700 1 --2- 192.168.123.100:0/694070323 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f908807b860 0x7f90880808d0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f9080000f80 tx=0x7f9080007d50 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:47.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.830+0000 7f907effd700 1 -- 192.168.123.100:0/694070323 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f908001c070 con 0x7f908807b860 2026-03-10T12:38:47.832 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.830+0000 7f908fd14700 1 
-- 192.168.123.100:0/694070323 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9088080e10 con 0x7f908807b860 2026-03-10T12:38:47.832 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.830+0000 7f908fd14700 1 -- 192.168.123.100:0/694070323 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f90880812d0 con 0x7f908807b860 2026-03-10T12:38:47.832 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.830+0000 7f907effd700 1 -- 192.168.123.100:0/694070323 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9080020410 con 0x7f908807b860 2026-03-10T12:38:47.832 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.830+0000 7f907effd700 1 -- 192.168.123.100:0/694070323 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f90800175c0 con 0x7f908807b860 2026-03-10T12:38:47.832 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.831+0000 7f908fd14700 1 -- 192.168.123.100:0/694070323 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f906c005320 con 0x7f908807b860 2026-03-10T12:38:47.837 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.836+0000 7f907effd700 1 -- 192.168.123.100:0/694070323 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f9080020580 con 0x7f908807b860 2026-03-10T12:38:47.837 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.836+0000 7f907effd700 1 --2- 192.168.123.100:0/694070323 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f9074077790 0x7f9074079c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:38:47.837 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.836+0000 7f907effd700 1 -- 192.168.123.100:0/694070323 
<== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5792+0+0 (secure 0 0 0) 0x7f908009c120 con 0x7f908807b860 2026-03-10T12:38:47.839 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.837+0000 7f908dab0700 1 --2- 192.168.123.100:0/694070323 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f9074077790 0x7f9074079c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:38:47.839 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.837+0000 7f908dab0700 1 --2- 192.168.123.100:0/694070323 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f9074077790 0x7f9074079c40 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f9084005950 tx=0x7f90840058e0 comp rx=0 tx=0).ready entity=mgr.24513 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:38:47.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:47.844+0000 7f907effd700 1 -- 192.168.123.100:0/694070323 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191525 (secure 0 0 0) 0x7f9080064cb0 con 0x7f908807b860 2026-03-10T12:38:48.106 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:48.105+0000 7f908fd14700 1 -- 192.168.123.100:0/694070323 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f906c005190 con 0x7f908807b860 2026-03-10T12:38:48.110 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_OK 2026-03-10T12:38:48.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:48.107+0000 7f907effd700 1 -- 192.168.123.100:0/694070323 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f9080064400 con 0x7f908807b860 2026-03-10T12:38:48.110 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:48.109+0000 7f907cff9700 1 -- 192.168.123.100:0/694070323 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f9074077790 msgr2=0x7f9074079c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:48.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:48.109+0000 7f907cff9700 1 --2- 192.168.123.100:0/694070323 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f9074077790 0x7f9074079c40 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f9084005950 tx=0x7f90840058e0 comp rx=0 tx=0).stop 2026-03-10T12:38:48.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:48.109+0000 7f907cff9700 1 -- 192.168.123.100:0/694070323 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f908807b860 msgr2=0x7f90880808d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:38:48.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:48.109+0000 7f907cff9700 1 --2- 192.168.123.100:0/694070323 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f908807b860 0x7f90880808d0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f9080000f80 tx=0x7f9080007d50 comp rx=0 tx=0).stop 2026-03-10T12:38:48.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:48.109+0000 7f907cff9700 1 -- 192.168.123.100:0/694070323 shutdown_connections 2026-03-10T12:38:48.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:48.109+0000 7f907cff9700 1 --2- 192.168.123.100:0/694070323 >> [v2:192.168.123.100:6800/3276280342,v1:192.168.123.100:6801/3276280342] conn(0x7f9074077790 0x7f9074079c40 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:48.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:48.109+0000 7f907cff9700 1 --2- 192.168.123.100:0/694070323 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7f9088071980 0x7f908807b320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:48.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:48.109+0000 7f907cff9700 1 --2- 192.168.123.100:0/694070323 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f908807b860 0x7f90880808d0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:38:48.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:48.109+0000 7f907cff9700 1 -- 192.168.123.100:0/694070323 >> 192.168.123.100:0/694070323 conn(0x7f908806d1a0 msgr2=0x7f90880763d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:38:48.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:48.110+0000 7f907cff9700 1 -- 192.168.123.100:0/694070323 shutdown_connections 2026-03-10T12:38:48.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:38:48.110+0000 7f907cff9700 1 -- 192.168.123.100:0/694070323 wait complete. 2026-03-10T12:38:48.320 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:48 vm07.local ceph-mon[58582]: pgmap v9: 65 pgs: 65 active+clean; 2.1 GiB data, 8.3 GiB used, 112 GiB / 120 GiB avail; 39 KiB/s rd, 1.1 MiB/s wr, 154 op/s 2026-03-10T12:38:48.320 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:48 vm07.local ceph-mon[58582]: from='client.? 192.168.123.100:0/330122572' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:38:48.320 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:48 vm07.local ceph-mon[58582]: from='client.? 
192.168.123.100:0/694070323' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:38:48.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:48 vm00.local ceph-mon[50686]: pgmap v9: 65 pgs: 65 active+clean; 2.1 GiB data, 8.3 GiB used, 112 GiB / 120 GiB avail; 39 KiB/s rd, 1.1 MiB/s wr, 154 op/s 2026-03-10T12:38:48.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:48 vm00.local ceph-mon[50686]: from='client.? 192.168.123.100:0/330122572' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:38:48.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:48 vm00.local ceph-mon[50686]: from='client.? 192.168.123.100:0/694070323' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:38:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:49 vm07.local ceph-mon[58582]: from='client.14764 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:38:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:49 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:49 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:49.482 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:49 vm00.local ceph-mon[50686]: from='client.14764 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:38:49.482 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:49 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:49.482 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:49 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:50.484 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:50 vm00.local ceph-mon[50686]: pgmap v10: 65 pgs: 65 active+clean; 2.1 GiB data, 8.3 GiB used, 112 GiB / 120 GiB avail; 36 KiB/s rd, 1011 KiB/s wr, 141 op/s 2026-03-10T12:38:50.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:50 vm07.local ceph-mon[58582]: pgmap v10: 65 pgs: 65 active+clean; 2.1 GiB data, 8.3 GiB used, 112 GiB / 120 GiB avail; 36 KiB/s rd, 1011 KiB/s wr, 141 op/s 2026-03-10T12:38:51.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:51 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:51.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:51 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:51.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:51 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:51.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:51 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:51.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:51 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:38:51.545 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:51 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:51.546 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:51 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:51.546 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:51 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:51.546 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:51 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:51.546 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:51 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:38:51.983 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-10T12:38:51.983 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1/tmp 2026-03-10T12:38:52.304 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:52 vm07.local ceph-mon[58582]: pgmap v11: 65 pgs: 65 active+clean; 2.1 GiB data, 8.3 GiB used, 112 GiB / 120 GiB avail; 36 KiB/s rd, 1011 KiB/s wr, 141 op/s 2026-03-10T12:38:52.304 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:52 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:52.304 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:52 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:52.304 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:52 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:38:52.304 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:52 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:38:52.305 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:52 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:52.305 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:52 vm07.local ceph-mon[58582]: from='mgr.24513 
192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:52.305 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:52 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:38:52.305 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:52 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:52.305 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:52 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm00.nescmq"}]: dispatch 2026-03-10T12:38:52.305 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:52 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm00.nescmq"}]': finished 2026-03-10T12:38:52.305 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:52 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm07.kfawlb"}]: dispatch 2026-03-10T12:38:52.305 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:52 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm07.kfawlb"}]': finished 2026-03-10T12:38:52.305 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:52 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T12:38:52.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:52 vm00.local ceph-mon[50686]: pgmap v11: 65 pgs: 65 active+clean; 2.1 GiB 
data, 8.3 GiB used, 112 GiB / 120 GiB avail; 36 KiB/s rd, 1011 KiB/s wr, 141 op/s 2026-03-10T12:38:52.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:52 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:52.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:52 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:52.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:52 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:38:52.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:52 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:38:52.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:52 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:52.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:52 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:38:52.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:52 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:38:52.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:52 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' 2026-03-10T12:38:52.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:52 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": 
"mgr.vm00.nescmq"}]: dispatch 2026-03-10T12:38:52.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:52 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm00.nescmq"}]': finished 2026-03-10T12:38:52.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:52 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm07.kfawlb"}]: dispatch 2026-03-10T12:38:52.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:52 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm07.kfawlb"}]': finished 2026-03-10T12:38:52.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:52 vm00.local ceph-mon[50686]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T12:38:53.403 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:53 vm00.local systemd[1]: Stopping Ceph mon.vm00 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 
2026-03-10T12:38:53.403 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:53 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00[50682]: 2026-03-10T12:38:53.266+0000 7f748b5ca700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm00 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T12:38:53.403 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:53 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00[50682]: 2026-03-10T12:38:53.266+0000 7f748b5ca700 -1 mon.vm00@0(leader) e2 *** Got Signal Terminated *** 2026-03-10T12:38:53.716 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:53 vm00.local podman[103130]: 2026-03-10 12:38:53.40308371 +0000 UTC m=+0.162885256 container died c8d836b38502143acd65e1297ec718326020ef5a02520a61c79d2f72d906ddd6 (image=quay.io/ceph/ceph:v18.2.0, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00, org.label-schema.name=CentOS Stream 8 Base Image, GIT_BRANCH=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, RELEASE=HEAD, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20231212, CEPH_POINT_RELEASE=-18.2.0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.schema-version=1.0, GIT_CLEAN=True, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux ) 2026-03-10T12:38:53.716 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:53 vm00.local podman[103130]: 2026-03-10 12:38:53.453303207 +0000 UTC m=+0.213104753 container remove c8d836b38502143acd65e1297ec718326020ef5a02520a61c79d2f72d906ddd6 (image=quay.io/ceph/ceph:v18.2.0, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00, GIT_BRANCH=HEAD, GIT_CLEAN=True, org.label-schema.build-date=20231212, 
org.label-schema.vendor=CentOS, RELEASE=HEAD, ceph=True, maintainer=Guillaume Abrioux , org.label-schema.schema-version=1.0, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, io.buildah.version=1.29.1, CEPH_POINT_RELEASE=-18.2.0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image) 2026-03-10T12:38:53.717 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:53 vm00.local bash[103130]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00 2026-03-10T12:38:53.717 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:53 vm00.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mon.vm00.service: Deactivated successfully. 2026-03-10T12:38:53.717 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:53 vm00.local systemd[1]: Stopped Ceph mon.vm00 for 1a52002a-1c7d-11f1-af82-51cdd81caea8. 2026-03-10T12:38:53.717 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:53 vm00.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mon.vm00.service: Consumed 6.719s CPU time. 2026-03-10T12:38:53.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:53 vm00.local systemd[1]: Starting Ceph mon.vm00 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 
2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:53 vm00.local podman[103249]: 2026-03-10 12:38:53.985322659 +0000 UTC m=+0.023072082 container create e8cc5980a849f57aa4de7b2f6a01cc8ebb89ffad73537fbb5e24554522e66a25 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local podman[103249]: 2026-03-10 12:38:54.027228206 +0000 UTC m=+0.064977629 container init e8cc5980a849f57aa4de7b2f6a01cc8ebb89ffad73537fbb5e24554522e66a25 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, 
CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223) 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local podman[103249]: 2026-03-10 12:38:54.030978404 +0000 UTC m=+0.068727827 container start e8cc5980a849f57aa4de7b2f6a01cc8ebb89ffad73537fbb5e24554522e66a25 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local bash[103249]: e8cc5980a849f57aa4de7b2f6a01cc8ebb89ffad73537fbb5e24554522e66a25 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local podman[103249]: 2026-03-10 12:38:53.973790806 +0000 UTC m=+0.011540239 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local systemd[1]: Started Ceph mon.vm00 for 1a52002a-1c7d-11f1-af82-51cdd81caea8. 
2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: set uid:gid to 167:167 (ceph:ceph) 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: pidfile_write: ignore empty --pid-file 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: load: jerasure load: lrc 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: RocksDB version: 7.9.2 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Git sha 0 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Compile date 2026-02-25 18:11:04 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: DB SUMMARY 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: DB Session ID: 9YM3Q8DVKLCKI5V55SDZ 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: CURRENT file: CURRENT 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: IDENTITY file: IDENTITY 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: MANIFEST file: MANIFEST-000015 size: 776 Bytes 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm00/store.db 
dir, Total Num: 1, files: 000023.sst 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm00/store.db: 000021.log size: 763516 ; 2026-03-10T12:38:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.error_if_exists: 0 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.create_if_missing: 0 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.paranoid_checks: 1 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.env: 0x55e1df43adc0 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.fs: PosixFileSystem 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.info_log: 0x55e1e17cd900 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_file_opening_threads: 16 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.statistics: (nil) 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 
10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.use_fsync: 0 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_log_file_size: 0 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.keep_log_file_num: 1000 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.recycle_log_file_num: 0 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.allow_fallocate: 1 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.allow_mmap_reads: 0 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.allow_mmap_writes: 0 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.use_direct_reads: 0 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.create_missing_column_families: 0 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.db_log_dir: 2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local 
ceph-mon[103263]: rocksdb: Options.wal_dir:
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.table_cache_numshardbits: 6
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.WAL_ttl_seconds: 0
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.WAL_size_limit_MB: 0
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.manifest_preallocation_size: 4194304
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.is_fd_close_on_exec: 1
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.advise_random_on_open: 1
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.db_write_buffer_size: 0
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.write_buffer_manager: 0x55e1e17d1900
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.access_hint_on_compaction_start: 1
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.random_access_max_buffer_size: 1048576
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.use_adaptive_mutex: 0
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.rate_limiter: (nil)
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.wal_recovery_mode: 2
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.enable_thread_tracking: 0
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.enable_pipelined_write: 0
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.unordered_write: 0
2026-03-10T12:38:54.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.allow_concurrent_memtable_write: 1
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.write_thread_max_yield_usec: 100
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.write_thread_slow_yield_usec: 3
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.row_cache: None
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.wal_filter: None
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.avoid_flush_during_recovery: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.allow_ingest_behind: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.two_write_queues: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.manual_wal_flush: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.wal_compression: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.atomic_flush: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.persist_stats_to_disk: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.write_dbid_to_manifest: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.log_readahead_size: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.file_checksum_gen_factory: Unknown
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.best_efforts_recovery: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_bgerror_resume_count: 2147483647
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.allow_data_in_errors: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.db_host_id: __hostname__
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.enforce_single_del_contracts: true
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_background_jobs: 2
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_background_compactions: -1
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_subcompactions: 1
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.avoid_flush_during_shutdown: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.writable_file_max_buffer_size: 1048576
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.delayed_write_rate : 16777216
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_total_wal_size: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.stats_dump_period_sec: 600
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.stats_persist_period_sec: 600
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.stats_history_buffer_size: 1048576
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_open_files: -1
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.bytes_per_sync: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.wal_bytes_per_sync: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.strict_bytes_per_sync: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compaction_readahead_size: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_background_flushes: -1
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Compression algorithms supported:
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: 	kZSTD supported: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: 	kXpressCompression supported: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: 	kBZip2Compression supported: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: 	kZSTDNotFinalCompression supported: 0
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: 	kLZ4Compression supported: 1
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: 	kZlibCompression supported: 1
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: 	kLZ4HCCompression supported: 1
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: 	kSnappyCompression supported: 1
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Fast CRC32 supported: Supported on x86
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: DMutex implementation: pthread_mutex_t
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm00/store.db/MANIFEST-000015
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
2026-03-10T12:38:54.486 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.comparator: leveldb.BytewiseComparator
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.merge_operator:
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compaction_filter: None
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compaction_filter_factory: None
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.sst_partitioner_factory: None
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.memtable_factory: SkipListFactory
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.table_factory: BlockBasedTable
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1e17cd580)
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: cache_index_and_filter_blocks: 1
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: cache_index_and_filter_blocks_with_high_priority: 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: pin_l0_filter_and_index_blocks_in_cache: 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: pin_top_level_index_and_filter: 1
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: index_type: 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: data_block_index_type: 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: index_shortening: 1
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: data_block_hash_table_util_ratio: 0.750000
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: checksum: 4
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: no_block_cache: 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_cache: 0x55e1e17f09b0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_cache_name: BinnedLRUCache
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_cache_options:
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: capacity : 536870912
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: num_shard_bits : 4
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: strict_capacity_limit : 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: high_pri_pool_ratio: 0.000
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_cache_compressed: (nil)
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: persistent_cache: (nil)
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_size: 4096
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_size_deviation: 10
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_restart_interval: 16
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: index_block_restart_interval: 1
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: metadata_block_size: 4096
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: partition_filters: 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: use_delta_encoding: 1
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: filter_policy: bloomfilter
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: whole_key_filtering: 1
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: verify_compression: 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: read_amp_bytes_per_bit: 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: format_version: 5
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: enable_index_compression: 1
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: block_align: 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: max_auto_readahead_size: 262144
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: prepopulate_block_cache: 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: initial_auto_readahead_size: 8192
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout: num_file_reads_for_auto_readahead: 2
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.write_buffer_size: 33554432
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_write_buffer_number: 2
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compression: NoCompression
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.bottommost_compression: Disabled
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.prefix_extractor: nullptr
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.num_levels: 7
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.min_write_buffer_number_to_merge: 1
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.bottommost_compression_opts.level: 32767
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.bottommost_compression_opts.strategy: 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.bottommost_compression_opts.enabled: false
2026-03-10T12:38:54.487 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compression_opts.window_bits: -14
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compression_opts.level: 32767
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compression_opts.strategy: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compression_opts.max_dict_bytes: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compression_opts.parallel_threads: 1
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compression_opts.enabled: false
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.level0_file_num_compaction_trigger: 4
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.level0_slowdown_writes_trigger: 20
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.level0_stop_writes_trigger: 36
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.target_file_size_base: 67108864
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.target_file_size_multiplier: 1
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_bytes_for_level_base: 268435456
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_sequential_skip_in_iterations: 8
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_compaction_bytes: 1677721600
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.arena_block_size: 1048576
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.disable_auto_compactions: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compaction_style: kCompactionStyleLevel
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compaction_options_universal.size_ratio: 1
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.inplace_update_support: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.inplace_update_num_locks: 10000
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.memtable_whole_key_filtering: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.memtable_huge_page_size: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.bloom_locality: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.max_successive_merges: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.optimize_filters_for_hits: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.paranoid_file_checks: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.force_consistency_checks: 1
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.report_bg_io_stats: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.ttl: 2592000
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.periodic_compaction_seconds: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.preclude_last_level_data_seconds: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.preserve_internal_time_seconds: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.enable_blob_files: false
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.min_blob_size: 0
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.blob_file_size: 268435456
2026-03-10T12:38:54.488 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.blob_compression_type: NoCompression
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.enable_blob_garbage_collection: false
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.blob_compaction_readahead_size: 0
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.blob_file_starting_level: 0
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm00/store.db/MANIFEST-000015 succeeded,manifest_file_number is 15, next_file_number is 25, last_sequence is 8506, log_number is 21,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 21
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 21
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2849e21e-e961-4f79-abd2-83e75de95a7e
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773146334076429, "job": 1, "event": "recovery_started", "wal_files": [21]}
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #21 mode 2
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773146334085162, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 26, "file_size": 642521, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8507, "largest_seqno": 8883, "table_properties": {"data_size": 639544, "index_size": 1424, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 517, "raw_key_size": 4650, "raw_average_key_size": 24, "raw_value_size": 635115, "raw_average_value_size": 3378, "num_data_blocks": 67, "num_entries": 188, "num_filter_entries": 188, "num_deletions": 2, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773146334, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2849e21e-e961-4f79-abd2-83e75de95a7e", "db_session_id": "9YM3Q8DVKLCKI5V55SDZ", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773146334085361, "job": 1, "event": "recovery_finished"}
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: [db/version_set.cc:5047] Creating manifest 28
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm00/store.db/000021.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55e1e17f2e00
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: DB pointer 0x55e1e1802000
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: rocksdb: [db/db_impl/db_impl.cc:1111]
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: ** DB Stats **
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Uptime(secs): 0.0 total, 0.0 interval
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: ** Compaction Stats [default] **
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: L0 1/0 627.46 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 87.7 0.01 0.00 1 0.007 0 0 0.0 0.0
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: L6 1/0 7.33 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Sum 2/0 7.94 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 87.7 0.01 0.00 1 0.007 0 0 0.0 0.0
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 87.7 0.01 0.00 1 0.007 0 0 0.0 0.0
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: ** Compaction Stats [default] **
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 87.7 0.01 0.00 1 0.007 0 0 0.0 0.0
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Uptime(secs): 0.0 total, 0.0 interval
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Flush(GB): cumulative 0.001, interval 0.001
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: AddFile(GB): cumulative 0.000, interval 0.000
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: AddFile(Total Files): cumulative 0, interval 0
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: AddFile(L0 Files): cumulative 0, interval 0
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: AddFile(Keys): cumulative 0, interval 0
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Cumulative compaction: 0.00 GB write, 30.62 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Interval compaction: 0.00 GB write, 30.62 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Block cache BinnedLRUCache@0x55e1e17f09b0#2 capacity: 512.00 MB usage: 2.17 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2.1e-05 secs_since: 0
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Block cache entry stats(count,size,portion): FilterBlock(1,0.59 KB,0.000113249%) IndexBlock(1,1.58 KB,0.000301003%) Misc(1,0.00 KB,0%)
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout: ** File Read Latency Histogram By Level [default] **
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: starting mon.vm00 rank 0 at public addrs [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] at bind addrs [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] mon_data /var/lib/ceph/mon/ceph-vm00 fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8
2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: mon.vm00@-1(???)
e2 preinit fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: mon.vm00@-1(???).mds e13 new map 2026-03-10T12:38:54.489 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: mon.vm00@-1(???).mds e13 print_map 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: e13 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: legacy client fscid: 1 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Filesystem 'cephfs' (1) 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: fs_name cephfs 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: epoch 13 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: modified 2026-03-10T12:37:36.646083+0000 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: tableserver 0 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: root 0 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: session_timeout 60 2026-03-10T12:38:54.490 
INFO:journalctl@ceph.mon.vm00.vm00.stdout: session_autoclose 300 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: max_file_size 1099511627776 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: max_xattr_size 65536 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: required_client_features {} 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: last_failure 0 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: last_failure_osd_epoch 0 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: max_mds 2 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: in 0,1 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: up {0=24313,1=24307} 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: failed 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: damaged 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: stopped 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: data_pools [3] 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: metadata_pool 2 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: inline_data disabled 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: balancer 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: bal_rank_mask -1 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: standby_count_wanted 1 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: qdb_cluster leader: 0 members: 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: 
[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: [mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: Standby daemons: 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: [mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout: [mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: mon.vm00@-1(???).osd e43 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: mon.vm00@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: mon.vm00@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: mon.vm00@-1(???).osd e43 crush map has features 
288514051259236352, adjusting msgr requires 2026-03-10T12:38:54.490 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: mon.vm00@-1(???).paxosservice(auth 1..21) refresh upgraded, format 0 -> 3 2026-03-10T12:38:55.044 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:38:55.044 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:38:55.044 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: mon.vm00 calling monitor election 2026-03-10T12:38:55.044 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: mon.vm00 is new leader, mons vm00,vm07 in quorum (ranks 0,1) 2026-03-10T12:38:55.044 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: monmap epoch 2 2026-03-10T12:38:55.044 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 2026-03-10T12:38:55.044 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: last_changed 2026-03-10T12:33:35.856080+0000 2026-03-10T12:38:55.044 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: created 2026-03-10T12:32:16.576749+0000 2026-03-10T12:38:55.044 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: min_mon_release 18 (reef) 2026-03-10T12:38:55.044 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: election_strategy: 1 2026-03-10T12:38:55.044 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: 0: 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] mon.vm00 2026-03-10T12:38:55.044 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: 1: [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] mon.vm07 2026-03-10T12:38:55.045 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:38:55.045 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T12:38:55.045 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: mgrmap e30: vm00.nescmq(active, since 19s), standbys: vm07.kfawlb 2026-03-10T12:38:55.045 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: overall HEALTH_OK 2026-03-10T12:38:55.045 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: from='mgr.24513 ' entity='' 2026-03-10T12:38:55.045 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:54 vm00.local ceph-mon[103263]: mgrmap e31: vm00.nescmq(active, since 19s), standbys: vm07.kfawlb 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: from='mgr.24513 192.168.123.100:0/3896762873' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: mon.vm00 calling monitor election 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: mon.vm00 is new leader, mons vm00,vm07 in quorum (ranks 0,1) 
2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: monmap epoch 2 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: last_changed 2026-03-10T12:33:35.856080+0000 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: created 2026-03-10T12:32:16.576749+0000 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: min_mon_release 18 (reef) 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: election_strategy: 1 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: 0: [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] mon.vm00 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: 1: [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] mon.vm07 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: mgrmap e30: vm00.nescmq(active, since 19s), standbys: vm07.kfawlb 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: overall HEALTH_OK 2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: from='mgr.24513 ' entity='' 
2026-03-10T12:38:55.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:54 vm07.local ceph-mon[58582]: mgrmap e31: vm00.nescmq(active, since 19s), standbys: vm07.kfawlb 2026-03-10T12:38:55.209 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-10T12:38:55.209 DEBUG:teuthology.orchestra.run.vm00:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-10T12:38:56.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:56 vm07.local ceph-mon[58582]: Standby manager daemon vm07.kfawlb restarted 2026-03-10T12:38:56.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:56 vm07.local ceph-mon[58582]: Standby manager daemon vm07.kfawlb started 2026-03-10T12:38:56.610 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:56 vm00.local ceph-mon[103263]: Standby manager daemon vm07.kfawlb restarted 2026-03-10T12:38:56.610 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:56 vm00.local ceph-mon[103263]: Standby manager daemon vm07.kfawlb started 2026-03-10T12:38:57.952 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:38:57 vm00.local ceph-mon[103263]: mgrmap e32: vm00.nescmq(active, since 21s), standbys: vm07.kfawlb 2026-03-10T12:38:57.973 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:38:57 vm07.local ceph-mon[58582]: mgrmap e32: vm00.nescmq(active, since 21s), standbys: vm07.kfawlb 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: Active manager daemon vm00.nescmq restarted 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: Activating manager daemon vm00.nescmq 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: osdmap e44: 6 total, 6 up, 6 in 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: mgrmap e33: vm00.nescmq(active, starting, since 0.00811916s), standbys: vm07.kfawlb 
2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.wdwvcu"}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.lnokoe"}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.wznhgu"}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.rhzwnr"}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr metadata", "who": "vm00.nescmq", "id": "vm00.nescmq"}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr metadata", "who": "vm07.kfawlb", 
"id": "vm07.kfawlb"}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:39:00.869 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:00 vm00.local ceph-mon[103263]: 
from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:39:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: Active manager daemon vm00.nescmq restarted 2026-03-10T12:39:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: Activating manager daemon vm00.nescmq 2026-03-10T12:39:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: osdmap e44: 6 total, 6 up, 6 in 2026-03-10T12:39:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: mgrmap e33: vm00.nescmq(active, starting, since 0.00811916s), standbys: vm07.kfawlb 2026-03-10T12:39:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:39:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:39:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.wdwvcu"}]: dispatch 2026-03-10T12:39:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.lnokoe"}]: dispatch 2026-03-10T12:39:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.wznhgu"}]: dispatch 
2026-03-10T12:39:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.rhzwnr"}]: dispatch 2026-03-10T12:39:01.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr metadata", "who": "vm00.nescmq", "id": "vm00.nescmq"}]: dispatch 2026-03-10T12:39:01.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr metadata", "who": "vm07.kfawlb", "id": "vm07.kfawlb"}]: dispatch 2026-03-10T12:39:01.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T12:39:01.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T12:39:01.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T12:39:01.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T12:39:01.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:39:01.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 
12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:39:01.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:39:01.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:39:01.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:00 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T12:39:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:01 vm07.local ceph-mon[58582]: Manager daemon vm00.nescmq is now available 2026-03-10T12:39:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:01 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:39:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:01 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:39:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:01 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/mirror_snapshot_schedule"}]: dispatch 2026-03-10T12:39:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:01 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config 
rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/trash_purge_schedule"}]: dispatch 2026-03-10T12:39:02.073 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:01 vm00.local ceph-mon[103263]: Manager daemon vm00.nescmq is now available 2026-03-10T12:39:02.073 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:39:02.073 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:39:02.073 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/mirror_snapshot_schedule"}]: dispatch 2026-03-10T12:39:02.073 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm00.nescmq/trash_purge_schedule"}]: dispatch 2026-03-10T12:39:02.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:02 vm00.local ceph-mon[103263]: mgrmap e34: vm00.nescmq(active, since 1.25468s), standbys: vm07.kfawlb 2026-03-10T12:39:02.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:02 vm00.local ceph-mon[103263]: Standby manager daemon vm07.kfawlb restarted 2026-03-10T12:39:02.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:02 vm00.local ceph-mon[103263]: Standby manager daemon vm07.kfawlb started 2026-03-10T12:39:02.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:02 vm00.local ceph-mon[103263]: from='mgr.? 
192.168.123.107:0/948956560' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.kfawlb/crt"}]: dispatch 2026-03-10T12:39:02.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:02 vm00.local ceph-mon[103263]: from='mgr.? 192.168.123.107:0/948956560' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T12:39:02.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:02 vm00.local ceph-mon[103263]: from='mgr.? 192.168.123.107:0/948956560' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.kfawlb/key"}]: dispatch 2026-03-10T12:39:02.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:02 vm00.local ceph-mon[103263]: from='mgr.? 192.168.123.107:0/948956560' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T12:39:03.056 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:02 vm07.local ceph-mon[58582]: mgrmap e34: vm00.nescmq(active, since 1.25468s), standbys: vm07.kfawlb 2026-03-10T12:39:03.056 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:02 vm07.local ceph-mon[58582]: Standby manager daemon vm07.kfawlb restarted 2026-03-10T12:39:03.056 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:02 vm07.local ceph-mon[58582]: Standby manager daemon vm07.kfawlb started 2026-03-10T12:39:03.056 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:02 vm07.local ceph-mon[58582]: from='mgr.? 192.168.123.107:0/948956560' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.kfawlb/crt"}]: dispatch 2026-03-10T12:39:03.056 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:02 vm07.local ceph-mon[58582]: from='mgr.? 
192.168.123.107:0/948956560' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T12:39:03.056 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:02 vm07.local ceph-mon[58582]: from='mgr.? 192.168.123.107:0/948956560' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm07.kfawlb/key"}]: dispatch 2026-03-10T12:39:03.056 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:02 vm07.local ceph-mon[58582]: from='mgr.? 192.168.123.107:0/948956560' entity='mgr.vm07.kfawlb' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T12:39:04.167 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:03 vm00.local ceph-mon[103263]: pgmap v3: 65 pgs: 65 active+clean; 294 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:39:04.167 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:03 vm00.local ceph-mon[103263]: pgmap v4: 65 pgs: 65 active+clean; 294 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:39:04.167 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:03 vm00.local ceph-mon[103263]: mgrmap e35: vm00.nescmq(active, since 2s), standbys: vm07.kfawlb 2026-03-10T12:39:04.167 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:03 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:04.167 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:03 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:04.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:03 vm07.local ceph-mon[58582]: pgmap v3: 65 pgs: 65 active+clean; 294 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:39:04.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:03 vm07.local ceph-mon[58582]: pgmap v4: 65 pgs: 65 active+clean; 294 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:39:04.316 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:03 vm07.local ceph-mon[58582]: mgrmap e35: vm00.nescmq(active, since 2s), standbys: vm07.kfawlb 2026-03-10T12:39:04.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:03 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:04.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:03 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:05.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:04 vm07.local ceph-mon[58582]: [10/Mar/2026:12:39:03] ENGINE Bus STARTING 2026-03-10T12:39:05.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:04 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:05.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:04 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:05.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:04 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:05.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:04 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:05.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:04 vm00.local ceph-mon[103263]: [10/Mar/2026:12:39:03] ENGINE Bus STARTING 2026-03-10T12:39:05.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:04 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:05.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:04 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:05.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:04 
vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:05.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:04 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:05.871 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:05 vm00.local ceph-mon[103263]: [10/Mar/2026:12:39:03] ENGINE Serving on http://192.168.123.100:8765 2026-03-10T12:39:05.871 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:05 vm00.local ceph-mon[103263]: [10/Mar/2026:12:39:04] ENGINE Serving on https://192.168.123.100:7150 2026-03-10T12:39:05.871 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:05 vm00.local ceph-mon[103263]: [10/Mar/2026:12:39:04] ENGINE Bus STARTED 2026-03-10T12:39:05.871 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:05 vm00.local ceph-mon[103263]: [10/Mar/2026:12:39:04] ENGINE Client ('192.168.123.100', 39022) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T12:39:05.871 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:05 vm00.local ceph-mon[103263]: pgmap v5: 65 pgs: 65 active+clean; 294 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:39:05.871 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:05 vm00.local ceph-mon[103263]: mgrmap e36: vm00.nescmq(active, since 4s), standbys: vm07.kfawlb 2026-03-10T12:39:05.871 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:05 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:05.876 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:05 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:05.876 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:05 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' 
entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:39:05.876 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:05 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:05.876 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:05 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:05.876 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:05 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:39:05.876 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:05 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:05.876 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:05 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:39:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:05 vm07.local ceph-mon[58582]: [10/Mar/2026:12:39:03] ENGINE Serving on http://192.168.123.100:8765 2026-03-10T12:39:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:05 vm07.local ceph-mon[58582]: [10/Mar/2026:12:39:04] ENGINE Serving on https://192.168.123.100:7150 2026-03-10T12:39:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:05 vm07.local ceph-mon[58582]: [10/Mar/2026:12:39:04] ENGINE Bus STARTED 2026-03-10T12:39:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:05 vm07.local ceph-mon[58582]: [10/Mar/2026:12:39:04] ENGINE Client ('192.168.123.100', 39022) lost — peer dropped the TLS connection suddenly, during handshake: 
(6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T12:39:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:05 vm07.local ceph-mon[58582]: pgmap v5: 65 pgs: 65 active+clean; 294 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:39:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:05 vm07.local ceph-mon[58582]: mgrmap e36: vm00.nescmq(active, since 4s), standbys: vm07.kfawlb 2026-03-10T12:39:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:05 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:05 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:05 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:39:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:05 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:05 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:05 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:39:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:05 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:06.066 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:05 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:39:07.146 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:06 vm07.local ceph-mon[58582]: Updating vm00:/etc/ceph/ceph.conf 2026-03-10T12:39:07.146 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:06 vm07.local ceph-mon[58582]: Updating vm07:/etc/ceph/ceph.conf 2026-03-10T12:39:07.146 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:06 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:07.146 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:06 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:07.146 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:06 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:07.146 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:06 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:07.146 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:06 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:07.146 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:06 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:39:07.146 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:06 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:07.146 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:06 vm07.local ceph-mon[58582]: 
from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T12:39:07.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:06 vm00.local ceph-mon[103263]: Updating vm00:/etc/ceph/ceph.conf 2026-03-10T12:39:07.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:06 vm00.local ceph-mon[103263]: Updating vm07:/etc/ceph/ceph.conf 2026-03-10T12:39:07.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:06 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:07.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:06 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:07.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:06 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:07.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:06 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:07.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:06 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:07.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:06 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:39:07.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:06 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:07.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:06 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "quorum_status"}]: 
dispatch 2026-03-10T12:39:07.878 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:07 vm07.local ceph-mon[58582]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:39:08.158 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:07 vm07.local ceph-mon[58582]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:39:08.158 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:07 vm07.local ceph-mon[58582]: Updating vm00:/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:39:08.158 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:07 vm07.local ceph-mon[58582]: Updating vm07:/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:39:08.158 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:07 vm07.local ceph-mon[58582]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.client.admin.keyring 2026-03-10T12:39:08.158 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:07 vm07.local ceph-mon[58582]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.client.admin.keyring 2026-03-10T12:39:08.158 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:07 vm07.local ceph-mon[58582]: pgmap v6: 65 pgs: 65 active+clean; 294 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:39:08.158 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:07 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:08.158 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:07 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T12:39:08.158 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:07 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: 
dispatch 2026-03-10T12:39:08.158 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:07 vm07.local ceph-mon[58582]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:08.158 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local systemd[1]: Stopping Ceph mon.vm07 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 2026-03-10T12:39:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:07 vm00.local ceph-mon[103263]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:39:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:07 vm00.local ceph-mon[103263]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.conf 2026-03-10T12:39:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:07 vm00.local ceph-mon[103263]: Updating vm00:/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:39:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:07 vm00.local ceph-mon[103263]: Updating vm07:/etc/ceph/ceph.client.admin.keyring 2026-03-10T12:39:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:07 vm00.local ceph-mon[103263]: Updating vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.client.admin.keyring 2026-03-10T12:39:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:07 vm00.local ceph-mon[103263]: Updating vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/config/ceph.client.admin.keyring 2026-03-10T12:39:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:07 vm00.local ceph-mon[103263]: pgmap v6: 65 pgs: 65 active+clean; 294 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:39:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 
12:39:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T12:39:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T12:39:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:08.408 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm07[58573]: 2026-03-10T12:39:08.155+0000 7f7a069f8700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm07 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T12:39:08.408 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm07[58573]: 2026-03-10T12:39:08.155+0000 7f7a069f8700 -1 mon.vm07@1(peon) e2 *** Got Signal Terminated *** 2026-03-10T12:39:08.408 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local podman[93497]: 2026-03-10 12:39:08.250773587 +0000 UTC m=+0.125051636 container died 7712955135fc47be4d6534adc5ea5cafbaa63e48b3e8bafaebeb7f180d5167fb (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm07, org.label-schema.build-date=20231212, ceph=True, io.buildah.version=1.29.1, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.license=GPLv2, RELEASE=HEAD, org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux ) 2026-03-10T12:39:08.408 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local podman[93497]: 2026-03-10 12:39:08.275877322 +0000 UTC m=+0.150155361 container remove 7712955135fc47be4d6534adc5ea5cafbaa63e48b3e8bafaebeb7f180d5167fb (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm07, GIT_BRANCH=HEAD, ceph=True, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.schema-version=1.0, GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, org.label-schema.build-date=20231212, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image) 2026-03-10T12:39:08.408 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local bash[93497]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm07 2026-03-10T12:39:08.408 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mon.vm07.service: Deactivated successfully. 2026-03-10T12:39:08.408 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local systemd[1]: Stopped Ceph mon.vm07 for 1a52002a-1c7d-11f1-af82-51cdd81caea8. 2026-03-10T12:39:08.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mon.vm07.service: Consumed 3.630s CPU time. 
2026-03-10T12:39:08.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local systemd[1]: Starting Ceph mon.vm07 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 2026-03-10T12:39:09.123 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local podman[93607]: 2026-03-10 12:39:08.832477525 +0000 UTC m=+0.038316277 container create 032abad282fcf6b3c9787641057c15d976a7820fb683c76606af4256488efb5b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm07, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223) 2026-03-10T12:39:09.123 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local podman[93607]: 2026-03-10 12:39:08.808016071 +0000 UTC m=+0.013854823 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:39:09.123 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local podman[93607]: 2026-03-10 12:39:08.935403839 +0000 UTC m=+0.141242591 container init 032abad282fcf6b3c9787641057c15d976a7820fb683c76606af4256488efb5b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm07, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) 2026-03-10T12:39:09.123 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local podman[93607]: 2026-03-10 12:39:08.94769842 +0000 UTC m=+0.153537172 container start 032abad282fcf6b3c9787641057c15d976a7820fb683c76606af4256488efb5b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm07, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_REF=squid, io.buildah.version=1.41.3) 2026-03-10T12:39:09.123 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local bash[93607]: 032abad282fcf6b3c9787641057c15d976a7820fb683c76606af4256488efb5b 2026-03-10T12:39:09.123 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:08 vm07.local systemd[1]: Started Ceph mon.vm07 for 
1a52002a-1c7d-11f1-af82-51cdd81caea8. 2026-03-10T12:39:09.123 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: set uid:gid to 167:167 (ceph:ceph) 2026-03-10T12:39:09.123 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2 2026-03-10T12:39:09.123 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: pidfile_write: ignore empty --pid-file 2026-03-10T12:39:09.123 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: load: jerasure load: lrc 2026-03-10T12:39:09.123 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: RocksDB version: 7.9.2 2026-03-10T12:39:09.123 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Git sha 0 2026-03-10T12:39:09.123 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Compile date 2026-02-25 18:11:04 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: DB SUMMARY 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: DB Session ID: S5C5CQ5I0B6OC9ACRJOU 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: CURRENT file: CURRENT 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: IDENTITY file: IDENTITY 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: MANIFEST file: MANIFEST-000010 size: 669 Bytes 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: SST files in 
/var/lib/ceph/mon/ceph-vm07/store.db dir, Total Num: 1, files: 000018.sst 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm07/store.db: 000016.log size: 4245486 ; 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.error_if_exists: 0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.create_if_missing: 0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.paranoid_checks: 1 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.env: 0x5586a311ddc0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.fs: PosixFileSystem 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.info_log: 0x5586a549b900 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_file_opening_threads: 16 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.statistics: (nil) 2026-03-10T12:39:09.124 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.use_fsync: 0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_log_file_size: 0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.keep_log_file_num: 1000 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.recycle_log_file_num: 0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.allow_fallocate: 1 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.allow_mmap_reads: 0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.allow_mmap_writes: 0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.use_direct_reads: 0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.create_missing_column_families: 0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.db_log_dir: 2026-03-10T12:39:09.124 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.wal_dir: 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.advise_random_on_open: 1 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.db_write_buffer_size: 0 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.write_buffer_manager: 0x5586a549f900 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.use_adaptive_mutex: 0 
2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.rate_limiter: (nil)
2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
2026-03-10T12:39:09.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.wal_recovery_mode: 2
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.enable_thread_tracking: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.enable_pipelined_write: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.unordered_write: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.allow_concurrent_memtable_write: 1
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.write_thread_max_yield_usec: 100
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.write_thread_slow_yield_usec: 3
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.row_cache: None
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.wal_filter: None
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.avoid_flush_during_recovery: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.allow_ingest_behind: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.two_write_queues: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.manual_wal_flush: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.wal_compression: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.atomic_flush: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.persist_stats_to_disk: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.write_dbid_to_manifest: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.log_readahead_size: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.file_checksum_gen_factory: Unknown
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.best_efforts_recovery: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_bgerror_resume_count: 2147483647
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.allow_data_in_errors: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.db_host_id: __hostname__
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.enforce_single_del_contracts: true
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_background_jobs: 2
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_background_compactions: -1
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_subcompactions: 1
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.avoid_flush_during_shutdown: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.writable_file_max_buffer_size: 1048576
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.delayed_write_rate : 16777216
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_total_wal_size: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.stats_dump_period_sec: 600
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.stats_persist_period_sec: 600
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.stats_history_buffer_size: 1048576
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_open_files: -1
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.bytes_per_sync: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.wal_bytes_per_sync: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.strict_bytes_per_sync: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compaction_readahead_size: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_background_flushes: -1
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Compression algorithms supported:
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: kZSTD supported: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: kXpressCompression supported: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: kBZip2Compression supported: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: kZSTDNotFinalCompression supported: 0
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: kLZ4Compression supported: 1
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: kZlibCompression supported: 1
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: kLZ4HCCompression supported: 1
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: kSnappyCompression supported: 1
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Fast CRC32 supported: Supported on x86
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: DMutex implementation: pthread_mutex_t
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm07/store.db/MANIFEST-000010
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.comparator: leveldb.BytewiseComparator
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.merge_operator:
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compaction_filter: None
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compaction_filter_factory: None
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.sst_partitioner_factory: None
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.memtable_factory: SkipListFactory
2026-03-10T12:39:09.125 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.table_factory: BlockBasedTable
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5586a549b580)
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: cache_index_and_filter_blocks: 1
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: cache_index_and_filter_blocks_with_high_priority: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: pin_l0_filter_and_index_blocks_in_cache: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: pin_top_level_index_and_filter: 1
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: index_type: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: data_block_index_type: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: index_shortening: 1
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: data_block_hash_table_util_ratio: 0.750000
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: checksum: 4
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: no_block_cache: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_cache: 0x5586a54be9b0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_cache_name: BinnedLRUCache
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_cache_options:
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: capacity : 536870912
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: num_shard_bits : 4
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: strict_capacity_limit : 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: high_pri_pool_ratio: 0.000
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_cache_compressed: (nil)
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: persistent_cache: (nil)
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_size: 4096
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_size_deviation: 10
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_restart_interval: 16
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: index_block_restart_interval: 1
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: metadata_block_size: 4096
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: partition_filters: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: use_delta_encoding: 1
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: filter_policy: bloomfilter
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: whole_key_filtering: 1
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: verify_compression: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: read_amp_bytes_per_bit: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: format_version: 5
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: enable_index_compression: 1
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: block_align: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: max_auto_readahead_size: 262144
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: prepopulate_block_cache: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: initial_auto_readahead_size: 8192
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout: num_file_reads_for_auto_readahead: 2
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.write_buffer_size: 33554432
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_write_buffer_number: 2
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compression: NoCompression
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.bottommost_compression: Disabled
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.prefix_extractor: nullptr
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.num_levels: 7
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.min_write_buffer_number_to_merge: 1
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.bottommost_compression_opts.level: 32767
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.bottommost_compression_opts.strategy: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.bottommost_compression_opts.enabled: false
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compression_opts.window_bits: -14
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compression_opts.level: 32767
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compression_opts.strategy: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compression_opts.max_dict_bytes: 0
2026-03-10T12:39:09.126 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compression_opts.parallel_threads: 1
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compression_opts.enabled: false
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.level0_file_num_compaction_trigger: 4
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.level0_slowdown_writes_trigger: 20
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.level0_stop_writes_trigger: 36
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.target_file_size_base: 67108864
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.target_file_size_multiplier: 1
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_bytes_for_level_base: 268435456
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_sequential_skip_in_iterations: 8
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_compaction_bytes: 1677721600
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.arena_block_size: 1048576
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.disable_auto_compactions: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compaction_style: kCompactionStyleLevel
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compaction_options_universal.size_ratio: 1
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.inplace_update_support: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.inplace_update_num_locks: 10000
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.memtable_whole_key_filtering: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.memtable_huge_page_size: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.bloom_locality: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.max_successive_merges: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.optimize_filters_for_hits: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.paranoid_file_checks: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.force_consistency_checks: 1
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.report_bg_io_stats: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.ttl: 2592000
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.periodic_compaction_seconds: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.preclude_last_level_data_seconds: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.preserve_internal_time_seconds: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.enable_blob_files: false
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.min_blob_size: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.blob_file_size: 268435456
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.blob_compression_type: NoCompression
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.enable_blob_garbage_collection: false
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.blob_compaction_readahead_size: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.blob_file_starting_level: 0
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
2026-03-10T12:39:09.127 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm07/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 20, last_sequence is 8500, log_number is 16,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 16
2026-03-10T12:39:09.128 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 16
2026-03-10T12:39:09.128 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: a4d3c7e8-fc6e-462c-b9ee-d2b8335780b3
2026-03-10T12:39:09.128 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773146349044621, "job": 1, "event": "recovery_started", "wal_files": [16]}
2026-03-10T12:39:09.128 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #16 mode 2
2026-03-10T12:39:09.128 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773146349123664, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 21, "file_size": 2580638, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8505, "largest_seqno": 9329, "table_properties": {"data_size": 2575851, "index_size": 2781, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 9275, "raw_average_key_size": 24, "raw_value_size": 2567154, "raw_average_value_size": 6685, "num_data_blocks": 133, "num_entries": 384, "num_filter_entries": 384, "num_deletions": 2, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773146349, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a4d3c7e8-fc6e-462c-b9ee-d2b8335780b3", "db_session_id": "S5C5CQ5I0B6OC9ACRJOU", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
2026-03-10T12:39:09.128 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773146349124056, "job": 1, "event": "recovery_finished"}
2026-03-10T12:39:09.128 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: [db/version_set.cc:5047] Creating manifest 23
2026-03-10T12:39:09.128 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
2026-03-10T12:39:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm07/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
2026-03-10T12:39:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5586a54c0e00
2026-03-10T12:39:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: DB pointer 0x5586a54d0000
2026-03-10T12:39:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
2026-03-10T12:39:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: rocksdb: [db/db_impl/db_impl.cc:1111]
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: ** DB Stats **
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Uptime(secs): 0.1 total, 0.1 interval
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: ** Compaction Stats [default] **
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: L0 1/0 2.46 MB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 50.9 0.05 0.00 1 0.048 0 0 0.0 0.0
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: L6 1/0 7.33 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Sum 2/0 9.79 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 50.9 0.05 0.00 1 0.048 0 0 0.0 0.0
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 50.9 0.05 0.00 1 0.048 0 0 0.0 0.0
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: ** Compaction Stats [default] **
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 50.9 0.05 0.00 1 0.048 0 0 0.0 0.0
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Uptime(secs): 0.1 total, 0.1 interval
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Flush(GB): cumulative 0.002, interval 0.002
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: AddFile(GB): cumulative 0.000, interval 0.000
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: AddFile(Total Files): cumulative 0, interval 0
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: AddFile(L0 Files): cumulative 0, interval 0
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: AddFile(Keys): cumulative 0, interval 0
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Cumulative compaction: 0.00 GB write, 24.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Interval compaction: 0.00 GB write, 24.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Block cache BinnedLRUCache@0x5586a54be9b0#2 capacity: 512.00 MB usage: 38.05 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.1e-05 secs_since: 0
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Block cache entry stats(count,size,portion): DataBlock(1,6.08 KB,0.00115931%) FilterBlock(2,9.06 KB,0.00172853%) IndexBlock(2,22.91 KB,0.00436902%) Misc(1,0.00 KB,0%)
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: ** File Read Latency Histogram By Level [default] **
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: starting mon.vm07 rank 1 at public addrs [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] at bind addrs [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] mon_data /var/lib/ceph/mon/ceph-vm07 fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8
2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: mon.vm07@-1(???)
e2 preinit fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: mon.vm07@-1(???).mds e13 new map 2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: mon.vm07@-1(???).mds e13 print_map 2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: e13 2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: legacy client fscid: 1 2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Filesystem 'cephfs' (1) 2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: fs_name cephfs 2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: epoch 13 2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:39:09.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout: modified 2026-03-10T12:37:36.646083+0000 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: tableserver 0 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: root 0 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: session_timeout 60 2026-03-10T12:39:09.568 
INFO:journalctl@ceph.mon.vm07.vm07.stdout: session_autoclose 300 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: max_file_size 1099511627776 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: max_xattr_size 65536 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: required_client_features {} 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: last_failure 0 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: last_failure_osd_epoch 0 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: max_mds 2 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: in 0,1 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: up {0=24313,1=24307} 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: failed 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: damaged 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: stopped 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: data_pools [3] 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: metadata_pool 2 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: inline_data disabled 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: balancer 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: bal_rank_mask -1 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: standby_count_wanted 1 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: qdb_cluster leader: 0 members: 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 
[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: [mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: Standby daemons: 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: [mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout: [mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: mon.vm07@-1(???).osd e44 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: mon.vm07@-1(???).osd e44 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: mon.vm07@-1(???).osd e44 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: mon.vm07@-1(???).osd e44 crush map has features 
288514051259236352, adjusting msgr requires 2026-03-10T12:39:09.568 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:09 vm07.local ceph-mon[93622]: mon.vm07@-1(???).paxosservice(auth 1..22) refresh upgraded, format 0 -> 3 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: overall HEALTH_OK 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: mon.vm00 calling monitor election 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: mon.vm07 calling monitor election 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: mon.vm00 is new leader, mons vm00,vm07 in quorum (ranks 0,1) 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: monmap epoch 3 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: last_changed 2026-03-10T12:39:09.382652+0000 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: created 2026-03-10T12:32:16.576749+0000 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: 
min_mon_release 19 (squid) 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: election_strategy: 1 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: 0: [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] mon.vm00 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: 1: [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] mon.vm07 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: osdmap e44: 6 total, 6 up, 6 in 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: mgrmap e36: vm00.nescmq(active, since 8s), standbys: vm07.kfawlb 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: overall HEALTH_OK 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:10.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: overall HEALTH_OK 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local 
ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm00"}]: dispatch 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: mon.vm00 calling monitor election 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mon metadata", "id": "vm07"}]: dispatch 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: mon.vm07 calling monitor election 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: mon.vm00 is new leader, mons vm00,vm07 in quorum (ranks 0,1) 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: monmap epoch 3 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: last_changed 2026-03-10T12:39:09.382652+0000 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: created 2026-03-10T12:32:16.576749+0000 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: min_mon_release 19 (squid) 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: election_strategy: 1 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: 0: [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] mon.vm00 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: 1: 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] mon.vm07 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: osdmap e44: 6 total, 6 up, 6 in 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: mgrmap e36: vm00.nescmq(active, since 8s), standbys: vm07.kfawlb 2026-03-10T12:39:10.724 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: overall HEALTH_OK 2026-03-10T12:39:10.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:10.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:10.725 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:39:12.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:11 vm07.local ceph-mon[93622]: pgmap v8: 65 pgs: 65 active+clean; 296 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 619 KiB/s wr, 192 op/s 2026-03-10T12:39:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:11 vm00.local ceph-mon[103263]: pgmap v8: 65 pgs: 65 active+clean; 296 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 619 KiB/s wr, 192 op/s 2026-03-10T12:39:13.025 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:12 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:13.025 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:12 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:13.025 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:12 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:13.025 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:12 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:12 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:12 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:12 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:12 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:13 vm07.local ceph-mon[93622]: pgmap v9: 65 pgs: 65 active+clean; 300 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 45 KiB/s rd, 1.1 MiB/s wr, 299 op/s 2026-03-10T12:39:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:13 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:14.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:13 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:14.173 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:13 vm00.local ceph-mon[103263]: pgmap v9: 65 pgs: 65 active+clean; 300 MiB 
data, 3.2 GiB used, 117 GiB / 120 GiB avail; 45 KiB/s rd, 1.1 MiB/s wr, 299 op/s 2026-03-10T12:39:14.173 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:13 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:14.173 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:13 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:14.776 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:14 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:14.776 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:14 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:39:14.776 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:14 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:14.776 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:14 vm00.local ceph-mon[103263]: Reconfiguring mon.vm00 (monmap changed)... 
2026-03-10T12:39:14.776 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:14 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T12:39:14.776 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:14 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T12:39:14.776 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:14 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:14.777 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:14 vm00.local ceph-mon[103263]: Reconfiguring daemon mon.vm00 on vm00 2026-03-10T12:39:14.777 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:14 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:14.777 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:14 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:14.777 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:14 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm00.nescmq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T12:39:14.777 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:14 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T12:39:14.777 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:14 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 
cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:15.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:14 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:15.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:14 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:39:15.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:14 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:15.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:14 vm07.local ceph-mon[93622]: Reconfiguring mon.vm00 (monmap changed)... 2026-03-10T12:39:15.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:14 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T12:39:15.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:14 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T12:39:15.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:14 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:15.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:14 vm07.local ceph-mon[93622]: Reconfiguring daemon mon.vm00 on vm00 2026-03-10T12:39:15.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:14 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:15.066 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:14 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:15.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:14 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm00.nescmq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T12:39:15.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:14 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T12:39:15.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:14 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:15 vm00.local ceph-mon[103263]: Reconfiguring mgr.vm00.nescmq (monmap changed)... 
2026-03-10T12:39:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:15 vm00.local ceph-mon[103263]: Reconfiguring daemon mgr.vm00.nescmq on vm00 2026-03-10T12:39:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:15 vm00.local ceph-mon[103263]: pgmap v10: 65 pgs: 65 active+clean; 300 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 40 KiB/s rd, 1.0 MiB/s wr, 267 op/s 2026-03-10T12:39:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm00", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T12:39:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm00", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T12:39:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm00", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T12:39:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' 
entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm00"}]: dispatch 2026-03-10T12:39:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm00", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T12:39:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:39:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:15 vm07.local ceph-mon[93622]: Reconfiguring mgr.vm00.nescmq (monmap changed)... 
2026-03-10T12:39:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:15 vm07.local ceph-mon[93622]: Reconfiguring daemon mgr.vm00.nescmq on vm00 2026-03-10T12:39:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:15 vm07.local ceph-mon[93622]: pgmap v10: 65 pgs: 65 active+clean; 300 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 40 KiB/s rd, 1.0 MiB/s wr, 267 op/s 2026-03-10T12:39:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm00", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T12:39:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm00", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T12:39:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm00", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T12:39:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' 
entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm00"}]: dispatch 2026-03-10T12:39:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm00", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T12:39:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:39:17.145 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:16 vm00.local ceph-mon[103263]: Reconfiguring ceph-exporter.vm00 (monmap changed)... 
2026-03-10T12:39:17.146 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:16 vm00.local ceph-mon[103263]: Unable to update caps for client.ceph-exporter.vm00 2026-03-10T12:39:17.146 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:16 vm00.local ceph-mon[103263]: Reconfiguring daemon ceph-exporter.vm00 on vm00 2026-03-10T12:39:17.146 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:16 vm00.local ceph-mon[103263]: Reconfiguring crash.vm00 (monmap changed)... 2026-03-10T12:39:17.146 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:16 vm00.local ceph-mon[103263]: Reconfiguring daemon crash.vm00 on vm00 2026-03-10T12:39:17.146 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:17.146 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:17.146 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T12:39:17.146 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:17.146 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:17.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:16 vm07.local ceph-mon[93622]: Reconfiguring ceph-exporter.vm00 (monmap changed)... 
2026-03-10T12:39:17.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:16 vm07.local ceph-mon[93622]: Unable to update caps for client.ceph-exporter.vm00 2026-03-10T12:39:17.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:16 vm07.local ceph-mon[93622]: Reconfiguring daemon ceph-exporter.vm00 on vm00 2026-03-10T12:39:17.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:16 vm07.local ceph-mon[93622]: Reconfiguring crash.vm00 (monmap changed)... 2026-03-10T12:39:17.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:16 vm07.local ceph-mon[93622]: Reconfiguring daemon crash.vm00 on vm00 2026-03-10T12:39:17.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:17.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:17.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T12:39:17.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:17.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:17.886 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: Reconfiguring osd.0 (monmap changed)... 
2026-03-10T12:39:17.886 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: Reconfiguring daemon osd.0 on vm00 2026-03-10T12:39:17.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: pgmap v11: 65 pgs: 65 active+clean; 300 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 40 KiB/s rd, 1.0 MiB/s wr, 267 op/s 2026-03-10T12:39:17.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:17.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: Reconfiguring osd.1 (monmap changed)... 2026-03-10T12:39:17.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T12:39:17.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:17.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: Reconfiguring daemon osd.1 on vm00 2026-03-10T12:39:17.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:17.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:17.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T12:39:17.887 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:17.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:17.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:17.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm00.lnokoe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:39:17.887 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:18.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.278+0000 7f4265a1f700 1 -- 192.168.123.100:0/468817619 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f42601080e0 msgr2=0x7f42601084f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:18.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.278+0000 7f4265a1f700 1 --2- 192.168.123.100:0/468817619 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f42601080e0 0x7f42601084f0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f4254008790 tx=0x7f4254008aa0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.280+0000 7f4265a1f700 1 -- 192.168.123.100:0/468817619 shutdown_connections 2026-03-10T12:39:18.280 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.280+0000 7f4265a1f700 1 --2- 192.168.123.100:0/468817619 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4260071960 0x7f4260071dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.280+0000 7f4265a1f700 1 --2- 192.168.123.100:0/468817619 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f42601080e0 0x7f42601084f0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.280+0000 7f4265a1f700 1 -- 192.168.123.100:0/468817619 >> 192.168.123.100:0/468817619 conn(0x7f426006d3e0 msgr2=0x7f426006f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.281+0000 7f4265a1f700 1 -- 192.168.123.100:0/468817619 shutdown_connections 2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.282+0000 7f4265a1f700 1 -- 192.168.123.100:0/468817619 wait complete. 
2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.282+0000 7f4265a1f700 1 Processor -- start 2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.282+0000 7f4265a1f700 1 -- start start 2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.282+0000 7f4265a1f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4260071960 0x7f426007e630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.282+0000 7f4265a1f700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f426007eb70 0x7f426007fbc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.282+0000 7f4265a1f700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f426007f070 con 0x7f4260071960 2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.282+0000 7f4265a1f700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f426007f1e0 con 0x7f426007eb70 2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.282+0000 7f425e7fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f426007eb70 0x7f426007fbc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.282+0000 7f425e7fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f426007eb70 0x7f426007fbc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:42926/0 (socket says 192.168.123.100:42926) 2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.282+0000 7f425e7fc700 1 -- 192.168.123.100:0/1071916703 learned_addr learned my addr 192.168.123.100:0/1071916703 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.282+0000 7f425effd700 1 --2- 192.168.123.100:0/1071916703 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4260071960 0x7f426007e630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.283+0000 7f425e7fc700 1 -- 192.168.123.100:0/1071916703 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4260071960 msgr2=0x7f426007e630 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.283+0000 7f425e7fc700 1 --2- 192.168.123.100:0/1071916703 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4260071960 0x7f426007e630 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.283+0000 7f425e7fc700 1 -- 192.168.123.100:0/1071916703 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4254008440 con 0x7f426007eb70 2026-03-10T12:39:18.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.283+0000 7f425e7fc700 1 --2- 192.168.123.100:0/1071916703 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f426007eb70 0x7f426007fbc0 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7f425000d670 tx=0x7f42500086d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:39:18.284 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.283+0000 7f4264a1d700 1 -- 192.168.123.100:0/1071916703 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4250008e70 con 0x7f426007eb70 2026-03-10T12:39:18.284 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.283+0000 7f4264a1d700 1 -- 192.168.123.100:0/1071916703 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4250004030 con 0x7f426007eb70 2026-03-10T12:39:18.284 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.283+0000 7f4264a1d700 1 -- 192.168.123.100:0/1071916703 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4250018620 con 0x7f426007eb70 2026-03-10T12:39:18.284 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.284+0000 7f4265a1f700 1 -- 192.168.123.100:0/1071916703 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4260080160 con 0x7f426007eb70 2026-03-10T12:39:18.284 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.284+0000 7f4265a1f700 1 -- 192.168.123.100:0/1071916703 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f42600805e0 con 0x7f426007eb70 2026-03-10T12:39:18.285 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.285+0000 7f4265a1f700 1 -- 192.168.123.100:0/1071916703 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4260076710 con 0x7f426007eb70 2026-03-10T12:39:18.286 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.285+0000 7f4264a1d700 1 -- 192.168.123.100:0/1071916703 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f4250003a00 con 0x7f426007eb70 2026-03-10T12:39:18.286 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.285+0000 
7f4264a1d700 1 --2- 192.168.123.100:0/1071916703 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4248077820 0x7f4248079cd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:18.286 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.285+0000 7f4264a1d700 1 -- 192.168.123.100:0/1071916703 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(44..44 src has 1..44) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f4250099a80 con 0x7f426007eb70 2026-03-10T12:39:18.288 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.288+0000 7f425effd700 1 --2- 192.168.123.100:0/1071916703 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4248077820 0x7f4248079cd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:18.288 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.288+0000 7f425effd700 1 --2- 192.168.123.100:0/1071916703 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4248077820 0x7f4248079cd0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f4254000c00 tx=0x7f425400b320 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:18.292 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.292+0000 7f4264a1d700 1 -- 192.168.123.100:0/1071916703 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4250062090 con 0x7f426007eb70 2026-03-10T12:39:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: Reconfiguring osd.0 (monmap changed)... 
2026-03-10T12:39:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: Reconfiguring daemon osd.0 on vm00 2026-03-10T12:39:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: pgmap v11: 65 pgs: 65 active+clean; 300 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 40 KiB/s rd, 1.0 MiB/s wr, 267 op/s 2026-03-10T12:39:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: Reconfiguring osd.1 (monmap changed)... 2026-03-10T12:39:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T12:39:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: Reconfiguring daemon osd.1 on vm00 2026-03-10T12:39:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:18.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:18.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T12:39:18.317 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:18.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:18.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:18.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm00.lnokoe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:39:18.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:18.523 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.522+0000 7f4265a1f700 1 -- 192.168.123.100:0/1071916703 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f4260080b80 con 0x7f4248077820 2026-03-10T12:39:18.530 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.529+0000 7f4264a1d700 1 -- 192.168.123.100:0/1071916703 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+383 (secure 0 0 0) 0x7f4260080b80 con 0x7f4248077820 2026-03-10T12:39:18.541 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.540+0000 7f42467fc700 1 -- 192.168.123.100:0/1071916703 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4248077820 msgr2=0x7f4248079cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:18.541 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.541+0000 7f42467fc700 1 --2- 192.168.123.100:0/1071916703 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4248077820 0x7f4248079cd0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f4254000c00 tx=0x7f425400b320 comp rx=0 tx=0).stop 2026-03-10T12:39:18.541 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.541+0000 7f42467fc700 1 -- 192.168.123.100:0/1071916703 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f426007eb70 msgr2=0x7f426007fbc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:18.541 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.541+0000 7f42467fc700 1 --2- 192.168.123.100:0/1071916703 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f426007eb70 0x7f426007fbc0 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7f425000d670 tx=0x7f42500086d0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.543 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.543+0000 7f42467fc700 1 -- 192.168.123.100:0/1071916703 shutdown_connections 2026-03-10T12:39:18.543 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.543+0000 7f42467fc700 1 --2- 192.168.123.100:0/1071916703 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4248077820 0x7f4248079cd0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.543 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.543+0000 7f42467fc700 1 --2- 192.168.123.100:0/1071916703 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4260071960 0x7f426007e630 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:39:18.543 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.543+0000 7f42467fc700 1 --2- 192.168.123.100:0/1071916703 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f426007eb70 0x7f426007fbc0 unknown :-1 s=CLOSED pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.543 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.543+0000 7f42467fc700 1 -- 192.168.123.100:0/1071916703 >> 192.168.123.100:0/1071916703 conn(0x7f426006d3e0 msgr2=0x7f426006f060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:18.544 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.544+0000 7f42467fc700 1 -- 192.168.123.100:0/1071916703 shutdown_connections 2026-03-10T12:39:18.544 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.544+0000 7f42467fc700 1 -- 192.168.123.100:0/1071916703 wait complete. 2026-03-10T12:39:18.557 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:39:18.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.661+0000 7fab0f2ab700 1 -- 192.168.123.100:0/1733918928 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab08071950 msgr2=0x7fab08071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:18.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.661+0000 7fab0f2ab700 1 --2- 192.168.123.100:0/1733918928 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab08071950 0x7fab08071d60 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fab0400bc70 tx=0x7fab0400bf80 comp rx=0 tx=0).stop 2026-03-10T12:39:18.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.661+0000 7fab0f2ab700 1 -- 192.168.123.100:0/1733918928 shutdown_connections 2026-03-10T12:39:18.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.661+0000 7fab0f2ab700 1 --2- 192.168.123.100:0/1733918928 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab08072330 
0x7fab080770b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.661+0000 7fab0f2ab700 1 --2- 192.168.123.100:0/1733918928 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab08071950 0x7fab08071d60 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.661 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.661+0000 7fab0f2ab700 1 -- 192.168.123.100:0/1733918928 >> 192.168.123.100:0/1733918928 conn(0x7fab0806d1a0 msgr2=0x7fab0806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:18.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.661+0000 7fab0f2ab700 1 -- 192.168.123.100:0/1733918928 shutdown_connections 2026-03-10T12:39:18.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.661+0000 7fab0f2ab700 1 -- 192.168.123.100:0/1733918928 wait complete. 2026-03-10T12:39:18.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.661+0000 7fab0f2ab700 1 Processor -- start 2026-03-10T12:39:18.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.661+0000 7fab0f2ab700 1 -- start start 2026-03-10T12:39:18.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.661+0000 7fab0f2ab700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab08072330 0x7fab08082520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:18.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.661+0000 7fab0f2ab700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab08082a60 0x7fab08082ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:18.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.661+0000 7fab0f2ab700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7fab0812dd80 con 0x7fab08082a60 2026-03-10T12:39:18.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.661+0000 7fab0f2ab700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fab0812def0 con 0x7fab08072330 2026-03-10T12:39:18.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.662+0000 7fab0daa8700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab08082a60 0x7fab08082ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:18.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.662+0000 7fab0daa8700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab08082a60 0x7fab08082ed0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33288/0 (socket says 192.168.123.100:33288) 2026-03-10T12:39:18.662 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.662+0000 7fab0daa8700 1 -- 192.168.123.100:0/4262090726 learned_addr learned my addr 192.168.123.100:0/4262090726 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:39:18.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.662+0000 7fab0e2a9700 1 --2- 192.168.123.100:0/4262090726 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab08072330 0x7fab08082520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:18.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.663+0000 7fab0daa8700 1 -- 192.168.123.100:0/4262090726 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab08072330 msgr2=0x7fab08082520 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:18.663 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.663+0000 7fab0daa8700 1 --2- 192.168.123.100:0/4262090726 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab08072330 0x7fab08082520 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.663+0000 7fab0daa8700 1 -- 192.168.123.100:0/4262090726 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fab0400b920 con 0x7fab08082a60 2026-03-10T12:39:18.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.663+0000 7fab0daa8700 1 --2- 192.168.123.100:0/4262090726 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab08082a60 0x7fab08082ed0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fab0000b330 tx=0x7fab0000b640 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:18.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.664+0000 7faaff7fe700 1 -- 192.168.123.100:0/4262090726 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fab00004d60 con 0x7fab08082a60 2026-03-10T12:39:18.665 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.664+0000 7fab0f2ab700 1 -- 192.168.123.100:0/4262090726 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fab0812e120 con 0x7fab08082a60 2026-03-10T12:39:18.665 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.664+0000 7fab0f2ab700 1 -- 192.168.123.100:0/4262090726 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fab0812e670 con 0x7fab08082a60 2026-03-10T12:39:18.665 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.664+0000 7faaff7fe700 1 -- 192.168.123.100:0/4262090726 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 
==== 1139+0+0 (secure 0 0 0) 0x7fab0000f930 con 0x7fab08082a60 2026-03-10T12:39:18.665 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.665+0000 7fab0f2ab700 1 -- 192.168.123.100:0/4262090726 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faaec005320 con 0x7fab08082a60 2026-03-10T12:39:18.665 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.665+0000 7faaff7fe700 1 -- 192.168.123.100:0/4262090726 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fab0000faa0 con 0x7fab08082a60 2026-03-10T12:39:18.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.668+0000 7faaff7fe700 1 -- 192.168.123.100:0/4262090726 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fab0000fc00 con 0x7fab08082a60 2026-03-10T12:39:18.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.669+0000 7faaff7fe700 1 --2- 192.168.123.100:0/4262090726 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faaf4077a30 0x7faaf4079ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:18.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.669+0000 7fab0e2a9700 1 --2- 192.168.123.100:0/4262090726 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faaf4077a30 0x7faaf4079ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:18.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.670+0000 7fab0e2a9700 1 --2- 192.168.123.100:0/4262090726 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faaf4077a30 0x7faaf4079ee0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fab0400bc70 tx=0x7fab0400d3e0 comp rx=0 tx=0).ready entity=mgr.24563 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:18.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.670+0000 7faaff7fe700 1 -- 192.168.123.100:0/4262090726 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(44..44 src has 1..44) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fab00013070 con 0x7fab08082a60 2026-03-10T12:39:18.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.674+0000 7faaff7fe700 1 -- 192.168.123.100:0/4262090726 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fab00062bb0 con 0x7fab08082a60 2026-03-10T12:39:18.845 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.844+0000 7fab0f2ab700 1 -- 192.168.123.100:0/4262090726 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7faaec000bf0 con 0x7faaf4077a30 2026-03-10T12:39:18.848 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.847+0000 7faaff7fe700 1 -- 192.168.123.100:0/4262090726 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+383 (secure 0 0 0) 0x7faaec000bf0 con 0x7faaf4077a30 2026-03-10T12:39:18.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.850+0000 7fab0f2ab700 1 -- 192.168.123.100:0/4262090726 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faaf4077a30 msgr2=0x7faaf4079ee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:18.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.850+0000 7fab0f2ab700 1 --2- 192.168.123.100:0/4262090726 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faaf4077a30 0x7faaf4079ee0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fab0400bc70 tx=0x7fab0400d3e0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.851 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.850+0000 7fab0f2ab700 1 -- 192.168.123.100:0/4262090726 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab08082a60 msgr2=0x7fab08082ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:18.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.850+0000 7fab0f2ab700 1 --2- 192.168.123.100:0/4262090726 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab08082a60 0x7fab08082ed0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fab0000b330 tx=0x7fab0000b640 comp rx=0 tx=0).stop 2026-03-10T12:39:18.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.851+0000 7fab0f2ab700 1 -- 192.168.123.100:0/4262090726 shutdown_connections 2026-03-10T12:39:18.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.851+0000 7fab0f2ab700 1 --2- 192.168.123.100:0/4262090726 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faaf4077a30 0x7faaf4079ee0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.851+0000 7fab0f2ab700 1 --2- 192.168.123.100:0/4262090726 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab08072330 0x7fab08082520 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.851+0000 7fab0f2ab700 1 --2- 192.168.123.100:0/4262090726 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab08082a60 0x7fab08082ed0 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.851+0000 7fab0f2ab700 1 -- 192.168.123.100:0/4262090726 >> 192.168.123.100:0/4262090726 conn(0x7fab0806d1a0 msgr2=0x7fab0806e160 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T12:39:18.852 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.852+0000 7fab0f2ab700 1 -- 192.168.123.100:0/4262090726 shutdown_connections 2026-03-10T12:39:18.852 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.852+0000 7fab0f2ab700 1 -- 192.168.123.100:0/4262090726 wait complete. 2026-03-10T12:39:18.937 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.937+0000 7fd32ea59700 1 -- 192.168.123.100:0/4048973714 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd328071980 msgr2=0x7fd328071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:18.937 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.937+0000 7fd32ea59700 1 --2- 192.168.123.100:0/4048973714 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd328071980 0x7fd328071d90 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fd318007780 tx=0x7fd31800c050 comp rx=0 tx=0).stop 2026-03-10T12:39:18.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.937+0000 7fd32ea59700 1 -- 192.168.123.100:0/4048973714 shutdown_connections 2026-03-10T12:39:18.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.937+0000 7fd32ea59700 1 --2- 192.168.123.100:0/4048973714 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd328072360 0x7fd3280770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.937+0000 7fd32ea59700 1 --2- 192.168.123.100:0/4048973714 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd328071980 0x7fd328071d90 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.937+0000 7fd32ea59700 1 -- 192.168.123.100:0/4048973714 >> 192.168.123.100:0/4048973714 conn(0x7fd32806d1a0 msgr2=0x7fd32806f5f0 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T12:39:18.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.937+0000 7fd32ea59700 1 -- 192.168.123.100:0/4048973714 shutdown_connections 2026-03-10T12:39:18.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.937+0000 7fd32ea59700 1 -- 192.168.123.100:0/4048973714 wait complete. 2026-03-10T12:39:18.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.937+0000 7fd32ea59700 1 Processor -- start 2026-03-10T12:39:18.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.937+0000 7fd32ea59700 1 -- start start 2026-03-10T12:39:18.942 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.938+0000 7fd32ea59700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd328072360 0x7fd3281313a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:18.942 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.938+0000 7fd32ea59700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3281318e0 0x7fd32807f590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:18.942 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.938+0000 7fd32ea59700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd328131de0 con 0x7fd3281318e0 2026-03-10T12:39:18.942 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.938+0000 7fd32ea59700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd328131f20 con 0x7fd328072360 2026-03-10T12:39:18.942 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.938+0000 7fd3277fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3281318e0 0x7fd32807f590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T12:39:18.942 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.938+0000 7fd3277fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3281318e0 0x7fd32807f590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33300/0 (socket says 192.168.123.100:33300) 2026-03-10T12:39:18.942 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.938+0000 7fd3277fe700 1 -- 192.168.123.100:0/3514368083 learned_addr learned my addr 192.168.123.100:0/3514368083 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:39:18.942 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.942+0000 7fd327fff700 1 --2- 192.168.123.100:0/3514368083 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd328072360 0x7fd3281313a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:18.942 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.942+0000 7fd3277fe700 1 -- 192.168.123.100:0/3514368083 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd328072360 msgr2=0x7fd3281313a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:18.942 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.942+0000 7fd3277fe700 1 --2- 192.168.123.100:0/3514368083 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd328072360 0x7fd3281313a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:18.942 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.942+0000 7fd3277fe700 1 -- 192.168.123.100:0/3514368083 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd318007430 con 0x7fd3281318e0 2026-03-10T12:39:18.942 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.942+0000 7fd3277fe700 1 --2- 192.168.123.100:0/3514368083 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3281318e0 0x7fd32807f590 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fd32000bf40 tx=0x7fd32000bf70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:18.942 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.942+0000 7fd3257fa700 1 -- 192.168.123.100:0/3514368083 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd32000cb40 con 0x7fd3281318e0 2026-03-10T12:39:18.943 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.942+0000 7fd32ea59700 1 -- 192.168.123.100:0/3514368083 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd32807fb30 con 0x7fd3281318e0 2026-03-10T12:39:18.943 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.942+0000 7fd32ea59700 1 -- 192.168.123.100:0/3514368083 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd3280800b0 con 0x7fd3281318e0 2026-03-10T12:39:18.943 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.943+0000 7fd3257fa700 1 -- 192.168.123.100:0/3514368083 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd32000cca0 con 0x7fd3281318e0 2026-03-10T12:39:18.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.943+0000 7fd3257fa700 1 -- 192.168.123.100:0/3514368083 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd320012720 con 0x7fd3281318e0 2026-03-10T12:39:18.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.944+0000 7fd32ea59700 1 -- 192.168.123.100:0/3514368083 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7fd314005320 con 0x7fd3281318e0 2026-03-10T12:39:18.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.946+0000 7fd3257fa700 1 -- 192.168.123.100:0/3514368083 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fd320012980 con 0x7fd3281318e0 2026-03-10T12:39:18.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.946+0000 7fd3257fa700 1 --2- 192.168.123.100:0/3514368083 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd310077a40 0x7fd310079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:18.947 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.946+0000 7fd3257fa700 1 -- 192.168.123.100:0/3514368083 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(44..44 src has 1..44) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fd3200997e0 con 0x7fd3281318e0 2026-03-10T12:39:18.947 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.946+0000 7fd327fff700 1 --2- 192.168.123.100:0/3514368083 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd310077a40 0x7fd310079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:18.950 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.950+0000 7fd327fff700 1 --2- 192.168.123.100:0/3514368083 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd310077a40 0x7fd310079ef0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fd318007400 tx=0x7fd31800c490 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:18.950 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:18.950+0000 7fd3257fa700 1 -- 192.168.123.100:0/3514368083 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7fd320061f20 con 0x7fd3281318e0 2026-03-10T12:39:19.111 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.111+0000 7fd32ea59700 1 -- 192.168.123.100:0/3514368083 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fd314000bf0 con 0x7fd310077a40 2026-03-10T12:39:19.113 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:18 vm00.local ceph-mon[103263]: Reconfiguring osd.2 (monmap changed)... 2026-03-10T12:39:19.113 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:18 vm00.local ceph-mon[103263]: Reconfiguring daemon osd.2 on vm00 2026-03-10T12:39:19.113 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:18 vm00.local ceph-mon[103263]: Reconfiguring mds.cephfs.vm00.lnokoe (monmap changed)... 2026-03-10T12:39:19.113 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:18 vm00.local ceph-mon[103263]: Reconfiguring daemon mds.cephfs.vm00.lnokoe on vm00 2026-03-10T12:39:19.113 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:18 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:19.113 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:18 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:19.113 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:18 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm00.wdwvcu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:39:19.113 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:18 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:19.113 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:18 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:19.113 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:18 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:19.113 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:18 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T12:39:19.113 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:18 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T12:39:19.113 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:18 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T12:39:19.113 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:18 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm07"}]: dispatch 2026-03-10T12:39:19.113 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:18 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:19.121 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.121+0000 7fd3257fa700 1 -- 192.168.123.100:0/3514368083 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7fd314000bf0 con 0x7fd310077a40 2026-03-10T12:39:19.121 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:39:19.121 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (5m) 14s ago 6m 25.5M - 0.25.0 c8568f914cd2 1897443e8fdf 2026-03-10T12:39:19.121 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (6m) 14s ago 6m 8468k - 18.2.0 dc2bc1663786 d9c35bbdf4cd 2026-03-10T12:39:19.121 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (5m) 7s ago 5m 11.1M - 18.2.0 dc2bc1663786 2a98961ae9ca 2026-03-10T12:39:19.121 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (6m) 14s ago 6m 7407k - 18.2.0 dc2bc1663786 4726e39e7eb0 2026-03-10T12:39:19.121 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (5m) 7s ago 5m 7402k - 18.2.0 dc2bc1663786 f917dac1f418 2026-03-10T12:39:19.121 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (5m) 14s ago 6m 89.8M - 9.4.7 954c08fa6188 16c12a6ce1fc 2026-03-10T12:39:19.121 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (4m) 14s ago 4m 162M - 18.2.0 dc2bc1663786 13dfd2469732 2026-03-10T12:39:19.122 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (4m) 14s ago 4m 17.3M - 18.2.0 dc2bc1663786 d65368ac6dfa 2026-03-10T12:39:19.122 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (4m) 7s ago 4m 17.2M - 18.2.0 dc2bc1663786 1b9425223bd2 2026-03-10T12:39:19.122 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (4m) 7s ago 4m 177M - 18.2.0 dc2bc1663786 9c2b36fc1ac1 2026-03-10T12:39:19.122 
INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:8443,9283,8765 running (51s) 14s ago 6m 592M - 19.2.3-678-ge911bdeb 654f31e6858e eaf11f86af53 2026-03-10T12:39:19.122 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (32s) 7s ago 5m 501M - 19.2.3-678-ge911bdeb 654f31e6858e 22c93435649c 2026-03-10T12:39:19.122 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (25s) 14s ago 7m 40.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e e8cc5980a849 2026-03-10T12:39:19.122 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (10s) 7s ago 5m 35.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 032abad282fc 2026-03-10T12:39:19.122 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (6m) 14s ago 6m 14.4M - 1.5.0 0da6a335fe13 7a5c14d6ba46 2026-03-10T12:39:19.122 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (5m) 7s ago 5m 15.2M - 1.5.0 0da6a335fe13 2fac2415b763 2026-03-10T12:39:19.122 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (5m) 14s ago 5m 320M 4096M 18.2.0 dc2bc1663786 d5b05007694d 2026-03-10T12:39:19.122 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (5m) 14s ago 5m 366M 4096M 18.2.0 dc2bc1663786 5bc971fe4d49 2026-03-10T12:39:19.122 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (5m) 14s ago 5m 302M 4096M 18.2.0 dc2bc1663786 a5a89ccf847e 2026-03-10T12:39:19.122 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (4m) 7s ago 4m 425M 4096M 18.2.0 dc2bc1663786 0c6249fe3951 2026-03-10T12:39:19.122 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (4m) 7s ago 4m 394M 4096M 18.2.0 dc2bc1663786 5c66bfb63a83 2026-03-10T12:39:19.122 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (4m) 7s ago 4m 365M 4096M 18.2.0 dc2bc1663786 277bdfe4ec55 2026-03-10T12:39:19.122 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (35s) 14s ago 6m 51.9M - 2.43.0 
a07b618ecd1d c80074b6b052 2026-03-10T12:39:19.124 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.124+0000 7fd32ea59700 1 -- 192.168.123.100:0/3514368083 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd310077a40 msgr2=0x7fd310079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:19.124 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.124+0000 7fd32ea59700 1 --2- 192.168.123.100:0/3514368083 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd310077a40 0x7fd310079ef0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fd318007400 tx=0x7fd31800c490 comp rx=0 tx=0).stop 2026-03-10T12:39:19.124 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.124+0000 7fd32ea59700 1 -- 192.168.123.100:0/3514368083 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3281318e0 msgr2=0x7fd32807f590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:19.124 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.124+0000 7fd32ea59700 1 --2- 192.168.123.100:0/3514368083 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3281318e0 0x7fd32807f590 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fd32000bf40 tx=0x7fd32000bf70 comp rx=0 tx=0).stop 2026-03-10T12:39:19.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.124+0000 7fd32ea59700 1 -- 192.168.123.100:0/3514368083 shutdown_connections 2026-03-10T12:39:19.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.124+0000 7fd32ea59700 1 --2- 192.168.123.100:0/3514368083 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd310077a40 0x7fd310079ef0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.124+0000 7fd32ea59700 1 --2- 192.168.123.100:0/3514368083 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd328072360 0x7fd3281313a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.124+0000 7fd32ea59700 1 --2- 192.168.123.100:0/3514368083 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd3281318e0 0x7fd32807f590 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.124+0000 7fd32ea59700 1 -- 192.168.123.100:0/3514368083 >> 192.168.123.100:0/3514368083 conn(0x7fd32806d1a0 msgr2=0x7fd328076480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:19.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.124+0000 7fd32ea59700 1 -- 192.168.123.100:0/3514368083 shutdown_connections 2026-03-10T12:39:19.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.124+0000 7fd32ea59700 1 -- 192.168.123.100:0/3514368083 wait complete. 
2026-03-10T12:39:19.228 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.227+0000 7f172f108700 1 -- 192.168.123.100:0/717010391 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17281014a0 msgr2=0x7f17281018f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:19.229 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.227+0000 7f172f108700 1 --2- 192.168.123.100:0/717010391 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17281014a0 0x7f17281018f0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f1718009a60 tx=0x7f1718009d70 comp rx=0 tx=0).stop 2026-03-10T12:39:19.229 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.227+0000 7f172f108700 1 -- 192.168.123.100:0/717010391 shutdown_connections 2026-03-10T12:39:19.229 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.227+0000 7f172f108700 1 --2- 192.168.123.100:0/717010391 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17281014a0 0x7f17281018f0 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.229 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.227+0000 7f172f108700 1 --2- 192.168.123.100:0/717010391 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17281002a0 0x7f17281006b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.229 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.227+0000 7f172f108700 1 -- 192.168.123.100:0/717010391 >> 192.168.123.100:0/717010391 conn(0x7f17280fb850 msgr2=0x7f17280fdc80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:19.229 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.227+0000 7f172f108700 1 -- 192.168.123.100:0/717010391 shutdown_connections 2026-03-10T12:39:19.229 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.227+0000 7f172f108700 1 -- 192.168.123.100:0/717010391 wait 
complete. 2026-03-10T12:39:19.229 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.228+0000 7f172f108700 1 Processor -- start 2026-03-10T12:39:19.229 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.228+0000 7f172f108700 1 -- start start 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.228+0000 7f172f108700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17281002a0 0x7f1728195990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.228+0000 7f172f108700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17281014a0 0x7f1728195ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.229+0000 7f172e106700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17281002a0 0x7f1728195990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.229+0000 7f172e106700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17281002a0 0x7f1728195990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33314/0 (socket says 192.168.123.100:33314) 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.229+0000 7f172e106700 1 -- 192.168.123.100:0/4193366694 learned_addr learned my addr 192.168.123.100:0/4193366694 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.229+0000 7f172f108700 1 -- 192.168.123.100:0/4193366694 --> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f17281964f0 con 0x7f17281002a0 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.229+0000 7f172f108700 1 -- 192.168.123.100:0/4193366694 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1728196630 con 0x7f17281014a0 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.229+0000 7f172d905700 1 --2- 192.168.123.100:0/4193366694 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17281014a0 0x7f1728195ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.229+0000 7f172e106700 1 -- 192.168.123.100:0/4193366694 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17281014a0 msgr2=0x7f1728195ed0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.229+0000 7f172e106700 1 --2- 192.168.123.100:0/4193366694 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17281014a0 0x7f1728195ed0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.229+0000 7f172e106700 1 -- 192.168.123.100:0/4193366694 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1718009710 con 0x7f17281002a0 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.229+0000 7f172e106700 1 --2- 192.168.123.100:0/4193366694 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17281002a0 0x7f1728195990 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f172400ea00 tx=0x7f172400ed10 comp rx=0 
tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.229+0000 7f171f7fe700 1 -- 192.168.123.100:0/4193366694 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f172400cb80 con 0x7f17281002a0 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.229+0000 7f172f108700 1 -- 192.168.123.100:0/4193366694 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f172819b0e0 con 0x7f17281002a0 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.229+0000 7f172f108700 1 -- 192.168.123.100:0/4193366694 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f172819b630 con 0x7f17281002a0 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.230+0000 7f171f7fe700 1 -- 192.168.123.100:0/4193366694 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1724004500 con 0x7f17281002a0 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.230+0000 7f171f7fe700 1 -- 192.168.123.100:0/4193366694 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1724010430 con 0x7f17281002a0 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.230+0000 7f172f108700 1 -- 192.168.123.100:0/4193366694 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f172804ea50 con 0x7f17281002a0 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.231+0000 7f171f7fe700 1 -- 192.168.123.100:0/4193366694 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f172400cce0 con 0x7f17281002a0 
2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.231+0000 7f171f7fe700 1 --2- 192.168.123.100:0/4193366694 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f17140779f0 0x7f1714079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.231+0000 7f171f7fe700 1 -- 192.168.123.100:0/4193366694 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(44..44 src has 1..44) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f1724014070 con 0x7f17281002a0 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.232+0000 7f172d905700 1 --2- 192.168.123.100:0/4193366694 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f17140779f0 0x7f1714079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:19.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.232+0000 7f172d905700 1 --2- 192.168.123.100:0/4193366694 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f17140779f0 0x7f1714079ea0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f171800b5c0 tx=0x7f17180095f0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:19.235 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.235+0000 7f171f7fe700 1 -- 192.168.123.100:0/4193366694 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f17240ca9f0 con 0x7f17281002a0 2026-03-10T12:39:19.266 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:18 vm07.local ceph-mon[93622]: Reconfiguring osd.2 (monmap changed)... 
2026-03-10T12:39:19.266 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:18 vm07.local ceph-mon[93622]: Reconfiguring daemon osd.2 on vm00 2026-03-10T12:39:19.266 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:18 vm07.local ceph-mon[93622]: Reconfiguring mds.cephfs.vm00.lnokoe (monmap changed)... 2026-03-10T12:39:19.266 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:18 vm07.local ceph-mon[93622]: Reconfiguring daemon mds.cephfs.vm00.lnokoe on vm00 2026-03-10T12:39:19.266 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:18 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:19.266 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:18 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:19.266 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:18 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm00.wdwvcu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:39:19.266 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:18 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:19.266 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:18 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:19.266 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:18 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:19.266 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:18 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth 
get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T12:39:19.266 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:18 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T12:39:19.266 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:18 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T12:39:19.266 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:18 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm07"}]: dispatch 2026-03-10T12:39:19.266 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:18 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:19.426 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.425+0000 7f172f108700 1 -- 192.168.123.100:0/4193366694 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f172819b910 con 0x7f17281002a0 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.429+0000 7f171f7fe700 1 -- 192.168.123.100:0/4193366694 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+694 (secure 0 0 0) 0x7f1724062860 con 0x7f17281002a0 2026-03-10T12:39:19.430 
INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 10, 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:39:19.430 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:39:19.433 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.432+0000 7f171d7fa700 1 -- 192.168.123.100:0/4193366694 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f17140779f0 msgr2=0x7f1714079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:19.433 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.432+0000 7f171d7fa700 1 --2- 192.168.123.100:0/4193366694 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f17140779f0 0x7f1714079ea0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f171800b5c0 tx=0x7f17180095f0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.433 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.433+0000 7f171d7fa700 1 -- 192.168.123.100:0/4193366694 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17281002a0 msgr2=0x7f1728195990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:19.433 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.433+0000 7f171d7fa700 1 --2- 192.168.123.100:0/4193366694 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17281002a0 0x7f1728195990 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f172400ea00 tx=0x7f172400ed10 comp rx=0 tx=0).stop 2026-03-10T12:39:19.433 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.433+0000 7f171d7fa700 1 -- 192.168.123.100:0/4193366694 shutdown_connections 2026-03-10T12:39:19.433 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.433+0000 7f171d7fa700 1 --2- 192.168.123.100:0/4193366694 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f17140779f0 0x7f1714079ea0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.433 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.433+0000 7f171d7fa700 1 --2- 192.168.123.100:0/4193366694 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f17281002a0 0x7f1728195990 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:39:19.433 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.433+0000 7f171d7fa700 1 --2- 192.168.123.100:0/4193366694 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f17281014a0 0x7f1728195ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.434 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.433+0000 7f171d7fa700 1 -- 192.168.123.100:0/4193366694 >> 192.168.123.100:0/4193366694 conn(0x7f17280fb850 msgr2=0x7f17280fdb20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:19.434 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.433+0000 7f171d7fa700 1 -- 192.168.123.100:0/4193366694 shutdown_connections 2026-03-10T12:39:19.434 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.433+0000 7f171d7fa700 1 -- 192.168.123.100:0/4193366694 wait complete. 2026-03-10T12:39:19.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.507+0000 7fd0f77fe700 1 -- 192.168.123.100:0/3121746639 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd0e8005670 con 0x7fd0f81036f0 2026-03-10T12:39:19.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.508+0000 7fd0fd213700 1 -- 192.168.123.100:0/3121746639 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd0f81036f0 msgr2=0x7fd0f8105ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:19.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.508+0000 7fd0fd213700 1 --2- 192.168.123.100:0/3121746639 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd0f81036f0 0x7fd0f8105ad0 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fd0e8009b00 tx=0x7fd0e8009e10 comp rx=0 tx=0).stop 2026-03-10T12:39:19.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.508+0000 7fd0fd213700 1 -- 192.168.123.100:0/3121746639 shutdown_connections 2026-03-10T12:39:19.508 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.508+0000 7fd0fd213700 1 --2- 192.168.123.100:0/3121746639 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd0f81036f0 0x7fd0f8105ad0 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.508+0000 7fd0fd213700 1 --2- 192.168.123.100:0/3121746639 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd0f8100dd0 0x7fd0f81031b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.508+0000 7fd0fd213700 1 -- 192.168.123.100:0/3121746639 >> 192.168.123.100:0/3121746639 conn(0x7fd0f80fa7b0 msgr2=0x7fd0f80fcc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:19.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.508+0000 7fd0fd213700 1 -- 192.168.123.100:0/3121746639 shutdown_connections 2026-03-10T12:39:19.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.508+0000 7fd0fd213700 1 -- 192.168.123.100:0/3121746639 wait complete. 
2026-03-10T12:39:19.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.509+0000 7fd0fd213700 1 Processor -- start 2026-03-10T12:39:19.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.509+0000 7fd0fd213700 1 -- start start 2026-03-10T12:39:19.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.509+0000 7fd0fd213700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd0f8100dd0 0x7fd0f8197ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:19.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.509+0000 7fd0fd213700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd0f81036f0 0x7fd0f8198410 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:19.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.509+0000 7fd0fd213700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd0f8198a30 con 0x7fd0f8100dd0 2026-03-10T12:39:19.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.509+0000 7fd0fd213700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd0f81991c0 con 0x7fd0f81036f0 2026-03-10T12:39:19.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.509+0000 7fd0effff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd0f81036f0 0x7fd0f8198410 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:19.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.509+0000 7fd0effff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd0f81036f0 0x7fd0f8198410 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:42986/0 (socket says 192.168.123.100:42986) 2026-03-10T12:39:19.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.509+0000 7fd0effff700 1 -- 192.168.123.100:0/3567396703 learned_addr learned my addr 192.168.123.100:0/3567396703 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:39:19.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.510+0000 7fd0effff700 1 -- 192.168.123.100:0/3567396703 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd0f8100dd0 msgr2=0x7fd0f8197ed0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:39:19.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.510+0000 7fd0f7fff700 1 --2- 192.168.123.100:0/3567396703 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd0f8100dd0 0x7fd0f8197ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:19.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.510+0000 7fd0effff700 1 --2- 192.168.123.100:0/3567396703 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd0f8100dd0 0x7fd0f8197ed0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.510+0000 7fd0effff700 1 -- 192.168.123.100:0/3567396703 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd0e0009710 con 0x7fd0f81036f0 2026-03-10T12:39:19.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.510+0000 7fd0f7fff700 1 --2- 192.168.123.100:0/3567396703 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd0f8100dd0 0x7fd0f8197ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T12:39:19.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.510+0000 7fd0effff700 1 --2- 192.168.123.100:0/3567396703 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd0f81036f0 0x7fd0f8198410 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fd0e800bb80 tx=0x7fd0e8005f00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:19.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.510+0000 7fd0f5ffb700 1 -- 192.168.123.100:0/3567396703 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd0e801d070 con 0x7fd0f81036f0 2026-03-10T12:39:19.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.511+0000 7fd0f5ffb700 1 -- 192.168.123.100:0/3567396703 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd0e8022470 con 0x7fd0f81036f0 2026-03-10T12:39:19.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.511+0000 7fd0f5ffb700 1 -- 192.168.123.100:0/3567396703 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd0e800f700 con 0x7fd0f81036f0 2026-03-10T12:39:19.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.513+0000 7fd0fd213700 1 -- 192.168.123.100:0/3567396703 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd0e80097e0 con 0x7fd0f81036f0 2026-03-10T12:39:19.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.513+0000 7fd0fd213700 1 -- 192.168.123.100:0/3567396703 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd0f8199770 con 0x7fd0f81036f0 2026-03-10T12:39:19.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.514+0000 7fd0fd213700 1 -- 192.168.123.100:0/3567396703 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} 
v 0) v1 -- 0x7fd0f804ea50 con 0x7fd0f81036f0 2026-03-10T12:39:19.516 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.515+0000 7fd0f5ffb700 1 -- 192.168.123.100:0/3567396703 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fd0e8022a50 con 0x7fd0f81036f0 2026-03-10T12:39:19.516 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.515+0000 7fd0f5ffb700 1 --2- 192.168.123.100:0/3567396703 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd0d80779f0 0x7fd0d8079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:19.516 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.515+0000 7fd0f5ffb700 1 -- 192.168.123.100:0/3567396703 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(44..44 src has 1..44) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fd0e809be60 con 0x7fd0f81036f0 2026-03-10T12:39:19.516 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.515+0000 7fd0f7fff700 1 --2- 192.168.123.100:0/3567396703 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd0d80779f0 0x7fd0d8079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:19.516 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.516+0000 7fd0f7fff700 1 --2- 192.168.123.100:0/3567396703 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd0d80779f0 0x7fd0d8079ea0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fd0f8197c30 tx=0x7fd0e0009450 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:19.518 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.517+0000 7fd0f5ffb700 1 -- 192.168.123.100:0/3567396703 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7fd0e80644f0 con 0x7fd0f81036f0 2026-03-10T12:39:19.708 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.707+0000 7fd0fd213700 1 -- 192.168.123.100:0/3567396703 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fd0f8100bb0 con 0x7fd0f81036f0 2026-03-10T12:39:19.712 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.712+0000 7fd0f5ffb700 1 -- 192.168.123.100:0/3567396703 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1961 (secure 0 0 0) 0x7fd0e80277c0 con 0x7fd0f81036f0 2026-03-10T12:39:19.712 INFO:teuthology.orchestra.run.vm00.stdout:e13 2026-03-10T12:39:19.712 INFO:teuthology.orchestra.run.vm00.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T12:39:19.712 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:39:19.712 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:39:19.712 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 2026-03-10T12:39:19.712 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:39:19.712 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:39:19.712 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 2026-03-10T12:39:19.712 INFO:teuthology.orchestra.run.vm00.stdout:epoch 13 2026-03-10T12:39:19.712 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:modified 
2026-03-10T12:37:36.646083+0000 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:max_xattr_size 65536 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307} 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:failed 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:39:19.713 
INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:qdb_cluster leader: 0 members: 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons: 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:39:19.713 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:39:19.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.715+0000 7fd0eeffd700 1 -- 192.168.123.100:0/3567396703 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd0d80779f0 msgr2=0x7fd0d8079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:19.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.715+0000 7fd0eeffd700 1 --2- 192.168.123.100:0/3567396703 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd0d80779f0 0x7fd0d8079ea0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fd0f8197c30 tx=0x7fd0e0009450 comp rx=0 tx=0).stop 2026-03-10T12:39:19.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.715+0000 7fd0eeffd700 1 -- 192.168.123.100:0/3567396703 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd0f81036f0 msgr2=0x7fd0f8198410 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:19.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.715+0000 7fd0eeffd700 1 --2- 192.168.123.100:0/3567396703 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd0f81036f0 0x7fd0f8198410 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fd0e800bb80 tx=0x7fd0e8005f00 comp rx=0 tx=0).stop 2026-03-10T12:39:19.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.715+0000 7fd0eeffd700 1 -- 192.168.123.100:0/3567396703 shutdown_connections 2026-03-10T12:39:19.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.715+0000 7fd0eeffd700 1 --2- 192.168.123.100:0/3567396703 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd0d80779f0 0x7fd0d8079ea0 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.715+0000 7fd0eeffd700 1 --2- 192.168.123.100:0/3567396703 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd0f8100dd0 0x7fd0f8197ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.715+0000 7fd0eeffd700 1 --2- 192.168.123.100:0/3567396703 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd0f81036f0 0x7fd0f8198410 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:39:19.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.715+0000 7fd0eeffd700 1 -- 192.168.123.100:0/3567396703 >> 192.168.123.100:0/3567396703 conn(0x7fd0f80fa7b0 msgr2=0x7fd0f80fcc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:19.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.715+0000 7fd0eeffd700 1 -- 192.168.123.100:0/3567396703 shutdown_connections 2026-03-10T12:39:19.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.715+0000 7fd0eeffd700 1 -- 192.168.123.100:0/3567396703 wait complete. 2026-03-10T12:39:19.716 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 13 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.787+0000 7fead1343700 1 -- 192.168.123.100:0/1850208656 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feacc076950 msgr2=0x7feacc076dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.787+0000 7fead1343700 1 --2- 192.168.123.100:0/1850208656 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feacc076950 0x7feacc076dc0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7feac401c320 tx=0x7feac401c630 comp rx=0 tx=0).stop 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.787+0000 7fead1343700 1 -- 192.168.123.100:0/1850208656 shutdown_connections 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.787+0000 7fead1343700 1 --2- 192.168.123.100:0/1850208656 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feacc076950 0x7feacc076dc0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.787+0000 7fead1343700 1 --2- 192.168.123.100:0/1850208656 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7feacc075700 0x7feacc075b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.787+0000 7fead1343700 1 -- 192.168.123.100:0/1850208656 >> 192.168.123.100:0/1850208656 conn(0x7feacc0fda80 msgr2=0x7feacc0ffed0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.787+0000 7fead1343700 1 -- 192.168.123.100:0/1850208656 shutdown_connections 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.787+0000 7fead1343700 1 -- 192.168.123.100:0/1850208656 wait complete. 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.788+0000 7fead1343700 1 Processor -- start 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.788+0000 7fead1343700 1 -- start start 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.788+0000 7fead1343700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feacc075700 0x7feacc106600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.788+0000 7fead1343700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feacc106b40 0x7feacc10bbb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.788+0000 7fead1343700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feacc107040 con 0x7feacc075700 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.788+0000 7fead1343700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feacc1071b0 con 0x7feacc106b40 
2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.788+0000 7feacaffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feacc075700 0x7feacc106600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.788+0000 7feacaffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feacc075700 0x7feacc106600 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:33354/0 (socket says 192.168.123.100:33354) 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.788+0000 7feacaffd700 1 -- 192.168.123.100:0/1765339383 learned_addr learned my addr 192.168.123.100:0/1765339383 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.788+0000 7feaca7fc700 1 --2- 192.168.123.100:0/1765339383 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feacc106b40 0x7feacc10bbb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.788+0000 7feacaffd700 1 -- 192.168.123.100:0/1765339383 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feacc106b40 msgr2=0x7feacc10bbb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.788+0000 7feacaffd700 1 --2- 192.168.123.100:0/1765339383 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feacc106b40 0x7feacc10bbb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.788+0000 7feacaffd700 1 -- 192.168.123.100:0/1765339383 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feac401c060 con 0x7feacc075700 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.788+0000 7feacaffd700 1 --2- 192.168.123.100:0/1765339383 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feacc075700 0x7feacc106600 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7feabc00d8d0 tx=0x7feabc00dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.789+0000 7feab3fff700 1 -- 192.168.123.100:0/1765339383 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feabc009940 con 0x7feacc075700 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.789+0000 7fead1343700 1 -- 192.168.123.100:0/1765339383 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feacc10c150 con 0x7feacc075700 2026-03-10T12:39:19.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.789+0000 7fead1343700 1 -- 192.168.123.100:0/1765339383 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feacc10c670 con 0x7feacc075700 2026-03-10T12:39:19.791 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.789+0000 7feab3fff700 1 -- 192.168.123.100:0/1765339383 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7feabc010460 con 0x7feacc075700 2026-03-10T12:39:19.791 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.789+0000 7feab3fff700 1 -- 192.168.123.100:0/1765339383 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 
0x7feabc00f5d0 con 0x7feacc075700 2026-03-10T12:39:19.791 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.790+0000 7fead1343700 1 -- 192.168.123.100:0/1765339383 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feab8005320 con 0x7feacc075700 2026-03-10T12:39:19.796 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.794+0000 7feab3fff700 1 -- 192.168.123.100:0/1765339383 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7feabc0105d0 con 0x7feacc075700 2026-03-10T12:39:19.797 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.794+0000 7feab3fff700 1 --2- 192.168.123.100:0/1765339383 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feab4077a40 0x7feab4079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:19.797 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.794+0000 7feab3fff700 1 -- 192.168.123.100:0/1765339383 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(44..44 src has 1..44) v4 ==== 6136+0+0 (secure 0 0 0) 0x7feabc09a0b0 con 0x7feacc075700 2026-03-10T12:39:19.797 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.796+0000 7feaca7fc700 1 --2- 192.168.123.100:0/1765339383 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feab4077a40 0x7feab4079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:19.798 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.797+0000 7feaca7fc700 1 --2- 192.168.123.100:0/1765339383 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feab4077a40 0x7feab4079ef0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7feac401cab0 tx=0x7feac400b040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:19.806 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.804+0000 7feab3fff700 1 -- 192.168.123.100:0/1765339383 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7feabc062070 con 0x7feacc075700 2026-03-10T12:39:19.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.974+0000 7fead1343700 1 -- 192.168.123.100:0/1765339383 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7feab8000bf0 con 0x7feab4077a40 2026-03-10T12:39:19.974 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: Reconfiguring mds.cephfs.vm00.wdwvcu (monmap changed)... 2026-03-10T12:39:19.974 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: Reconfiguring daemon mds.cephfs.vm00.wdwvcu on vm00 2026-03-10T12:39:19.974 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='client.44101 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:19.974 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: Reconfiguring ceph-exporter.vm07 (monmap changed)... 
2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: Unable to update caps for client.ceph-exporter.vm07 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: Reconfiguring daemon ceph-exporter.vm07 on vm07 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: pgmap v12: 65 pgs: 65 active+clean; 303 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 58 KiB/s rd, 1.6 MiB/s wr, 439 op/s 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='client.34134 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local 
ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.kfawlb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/4193366694' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/3567396703' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T12:39:19.975 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:19 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:19.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.978+0000 7feab3fff700 1 -- 192.168.123.100:0/1765339383 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+383 (secure 0 0 0) 0x7feab8000bf0 con 0x7feab4077a40 2026-03-10T12:39:19.979 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:39:19.979 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T12:39:19.979 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true, 2026-03-10T12:39:19.979 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T12:39:19.979 
INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [ 2026-03-10T12:39:19.979 INFO:teuthology.orchestra.run.vm00.stdout: "mgr", 2026-03-10T12:39:19.979 INFO:teuthology.orchestra.run.vm00.stdout: "mon" 2026-03-10T12:39:19.979 INFO:teuthology.orchestra.run.vm00.stdout: ], 2026-03-10T12:39:19.979 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "4/23 daemons upgraded", 2026-03-10T12:39:19.979 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Currently upgrading mon daemons", 2026-03-10T12:39:19.979 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false 2026-03-10T12:39:19.979 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:39:19.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.981+0000 7feab1ffb700 1 -- 192.168.123.100:0/1765339383 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feab4077a40 msgr2=0x7feab4079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:19.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.981+0000 7feab1ffb700 1 --2- 192.168.123.100:0/1765339383 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feab4077a40 0x7feab4079ef0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7feac401cab0 tx=0x7feac400b040 comp rx=0 tx=0).stop 2026-03-10T12:39:19.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.981+0000 7feab1ffb700 1 -- 192.168.123.100:0/1765339383 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feacc075700 msgr2=0x7feacc106600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:19.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.981+0000 7feab1ffb700 1 --2- 192.168.123.100:0/1765339383 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feacc075700 0x7feacc106600 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7feabc00d8d0 tx=0x7feabc00dc90 comp rx=0 tx=0).stop 2026-03-10T12:39:19.982 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.981+0000 7feab1ffb700 1 -- 192.168.123.100:0/1765339383 shutdown_connections 2026-03-10T12:39:19.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.981+0000 7feab1ffb700 1 --2- 192.168.123.100:0/1765339383 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feab4077a40 0x7feab4079ef0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.981+0000 7feab1ffb700 1 --2- 192.168.123.100:0/1765339383 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feacc075700 0x7feacc106600 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.981+0000 7feab1ffb700 1 --2- 192.168.123.100:0/1765339383 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feacc106b40 0x7feacc10bbb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:19.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.981+0000 7feab1ffb700 1 -- 192.168.123.100:0/1765339383 >> 192.168.123.100:0/1765339383 conn(0x7feacc0fda80 msgr2=0x7feacc06c780 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:19.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.981+0000 7feab1ffb700 1 -- 192.168.123.100:0/1765339383 shutdown_connections 2026-03-10T12:39:19.983 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:19.981+0000 7feab1ffb700 1 -- 192.168.123.100:0/1765339383 wait complete. 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: Reconfiguring mds.cephfs.vm00.wdwvcu (monmap changed)... 
2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: Reconfiguring daemon mds.cephfs.vm00.wdwvcu on vm00 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='client.44101 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: Reconfiguring ceph-exporter.vm07 (monmap changed)... 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: Unable to update caps for client.ceph-exporter.vm07 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: Reconfiguring daemon ceph-exporter.vm07 on vm07 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: pgmap v12: 65 pgs: 65 active+clean; 303 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 58 KiB/s rd, 1.6 MiB/s wr, 439 op/s 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='client.34134 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", 
"profile crash"]}]: dispatch 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm07.kfawlb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/4193366694' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/3567396703' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T12:39:20.068 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:19 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:20.074 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.074+0000 7f0edf435700 1 -- 192.168.123.100:0/1385800100 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ed8072360 msgr2=0x7f0ed80770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:20.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.074+0000 7f0edf435700 1 --2- 192.168.123.100:0/1385800100 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ed8072360 0x7f0ed80770e0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f0ed000d3e0 tx=0x7f0ed000d6f0 comp rx=0 tx=0).stop 2026-03-10T12:39:20.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.074+0000 7f0edf435700 1 -- 192.168.123.100:0/1385800100 shutdown_connections 
2026-03-10T12:39:20.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.074+0000 7f0edf435700 1 --2- 192.168.123.100:0/1385800100 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ed8072360 0x7f0ed80770e0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:20.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.074+0000 7f0edf435700 1 --2- 192.168.123.100:0/1385800100 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ed8071980 0x7f0ed8071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:20.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.074+0000 7f0edf435700 1 -- 192.168.123.100:0/1385800100 >> 192.168.123.100:0/1385800100 conn(0x7f0ed806d1a0 msgr2=0x7f0ed806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:20.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.074+0000 7f0edf435700 1 -- 192.168.123.100:0/1385800100 shutdown_connections 2026-03-10T12:39:20.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.074+0000 7f0edf435700 1 -- 192.168.123.100:0/1385800100 wait complete. 
2026-03-10T12:39:20.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.075+0000 7f0edf435700 1 Processor -- start 2026-03-10T12:39:20.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.075+0000 7f0edf435700 1 -- start start 2026-03-10T12:39:20.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.075+0000 7f0edf435700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ed8071980 0x7f0ed80824f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:20.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.075+0000 7f0edf435700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ed8082a30 0x7f0ed8082ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:20.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.075+0000 7f0edf435700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ed812dd80 con 0x7f0ed8071980 2026-03-10T12:39:20.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.075+0000 7f0edf435700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ed812def0 con 0x7f0ed8082a30 2026-03-10T12:39:20.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.075+0000 7f0edc9d0700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ed8082a30 0x7f0ed8082ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:20.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.075+0000 7f0edc9d0700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ed8082a30 0x7f0ed8082ea0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:43014/0 (socket says 192.168.123.100:43014) 2026-03-10T12:39:20.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.075+0000 7f0edc9d0700 1 -- 192.168.123.100:0/328859615 learned_addr learned my addr 192.168.123.100:0/328859615 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:39:20.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.075+0000 7f0edc9d0700 1 -- 192.168.123.100:0/328859615 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ed8071980 msgr2=0x7f0ed80824f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:20.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.075+0000 7f0edc9d0700 1 --2- 192.168.123.100:0/328859615 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ed8071980 0x7f0ed80824f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:20.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.075+0000 7f0edc9d0700 1 -- 192.168.123.100:0/328859615 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0ed000d090 con 0x7f0ed8082a30 2026-03-10T12:39:20.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.075+0000 7f0edc9d0700 1 --2- 192.168.123.100:0/328859615 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ed8082a30 0x7f0ed8082ea0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f0ed000add0 tx=0x7f0ed000aeb0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:20.076 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.076+0000 7f0ece7fc700 1 -- 192.168.123.100:0/328859615 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ed0010040 con 0x7f0ed8082a30 2026-03-10T12:39:20.077 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.076+0000 7f0edf435700 1 -- 
192.168.123.100:0/328859615 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0ed812e110 con 0x7f0ed8082a30 2026-03-10T12:39:20.077 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.076+0000 7f0edf435700 1 -- 192.168.123.100:0/328859615 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0ed812e660 con 0x7f0ed8082a30 2026-03-10T12:39:20.077 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.077+0000 7f0ece7fc700 1 -- 192.168.123.100:0/328859615 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0ed0009430 con 0x7f0ed8082a30 2026-03-10T12:39:20.077 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.077+0000 7f0ece7fc700 1 -- 192.168.123.100:0/328859615 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ed0004a30 con 0x7f0ed8082a30 2026-03-10T12:39:20.078 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.078+0000 7f0ece7fc700 1 -- 192.168.123.100:0/328859615 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f0ed0008430 con 0x7f0ed8082a30 2026-03-10T12:39:20.079 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.078+0000 7f0ece7fc700 1 --2- 192.168.123.100:0/328859615 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0ec4077a40 0x7f0ec4079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:20.081 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.080+0000 7f0ece7fc700 1 -- 192.168.123.100:0/328859615 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(44..44 src has 1..44) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f0ed009c000 con 0x7f0ed8082a30 2026-03-10T12:39:20.081 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.081+0000 7f0edd1d1700 1 --2- 192.168.123.100:0/328859615 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0ec4077a40 0x7f0ec4079ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:20.081 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.081+0000 7f0edd1d1700 1 --2- 192.168.123.100:0/328859615 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0ec4077a40 0x7f0ec4079ef0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f0ed4005fd0 tx=0x7f0ed400c040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:20.082 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.081+0000 7f0edf435700 1 -- 192.168.123.100:0/328859615 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0ebc005320 con 0x7f0ed8082a30 2026-03-10T12:39:20.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.085+0000 7f0ece7fc700 1 -- 192.168.123.100:0/328859615 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0ed0064610 con 0x7f0ed8082a30 2026-03-10T12:39:20.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.302+0000 7f0edf435700 1 -- 192.168.123.100:0/328859615 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f0ebc005190 con 0x7f0ed8082a30 2026-03-10T12:39:20.303 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.303+0000 7f0ece7fc700 1 -- 192.168.123.100:0/328859615 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f0ed0063d60 con 0x7f0ed8082a30 2026-03-10T12:39:20.303 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_OK 2026-03-10T12:39:20.307 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.306+0000 7f0edf435700 1 -- 192.168.123.100:0/328859615 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0ec4077a40 msgr2=0x7f0ec4079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:20.341 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.306+0000 7f0edf435700 1 --2- 192.168.123.100:0/328859615 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0ec4077a40 0x7f0ec4079ef0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f0ed4005fd0 tx=0x7f0ed400c040 comp rx=0 tx=0).stop 2026-03-10T12:39:20.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.306+0000 7f0edf435700 1 -- 192.168.123.100:0/328859615 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ed8082a30 msgr2=0x7f0ed8082ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:20.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.306+0000 7f0edf435700 1 --2- 192.168.123.100:0/328859615 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ed8082a30 0x7f0ed8082ea0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f0ed000add0 tx=0x7f0ed000aeb0 comp rx=0 tx=0).stop 2026-03-10T12:39:20.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.307+0000 7f0edf435700 1 -- 192.168.123.100:0/328859615 shutdown_connections 2026-03-10T12:39:20.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.307+0000 7f0edf435700 1 --2- 192.168.123.100:0/328859615 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0ec4077a40 0x7f0ec4079ef0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:20.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.307+0000 7f0edf435700 1 --2- 192.168.123.100:0/328859615 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0ed8071980 
0x7f0ed80824f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:20.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.307+0000 7f0edf435700 1 --2- 192.168.123.100:0/328859615 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0ed8082a30 0x7f0ed8082ea0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:20.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.307+0000 7f0edf435700 1 -- 192.168.123.100:0/328859615 >> 192.168.123.100:0/328859615 conn(0x7f0ed806d1a0 msgr2=0x7f0ed8076520 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:20.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.307+0000 7f0edf435700 1 -- 192.168.123.100:0/328859615 shutdown_connections 2026-03-10T12:39:20.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:20.307+0000 7f0edf435700 1 -- 192.168.123.100:0/328859615 wait complete. 2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: Reconfiguring crash.vm07 (monmap changed)... 2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: Reconfiguring daemon crash.vm07 on vm07 2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: from='client.34138 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: Reconfiguring mgr.vm07.kfawlb (monmap changed)... 2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: Reconfiguring daemon mgr.vm07.kfawlb on vm07 2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: Reconfiguring mon.vm07 (monmap changed)... 
2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: Reconfiguring daemon mon.vm07 on vm07 2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/328859615' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T12:39:20.922 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:20 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' 
entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: Reconfiguring crash.vm07 (monmap changed)... 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: Reconfiguring daemon crash.vm07 on vm07 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: from='client.34138 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: Reconfiguring mgr.vm07.kfawlb (monmap changed)... 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: Reconfiguring daemon mgr.vm07.kfawlb on vm07 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: Reconfiguring mon.vm07 (monmap changed)... 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: Reconfiguring daemon mon.vm07 on vm07 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/328859615' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T12:39:21.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:20 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:22.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:21 vm07.local ceph-mon[93622]: from='client.34148 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", 
"target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:22.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:21 vm07.local ceph-mon[93622]: Reconfiguring osd.3 (monmap changed)... 2026-03-10T12:39:22.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:21 vm07.local ceph-mon[93622]: Reconfiguring daemon osd.3 on vm07 2026-03-10T12:39:22.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:21 vm07.local ceph-mon[93622]: pgmap v13: 65 pgs: 65 active+clean; 303 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.1 MiB/s wr, 300 op/s 2026-03-10T12:39:22.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:21 vm07.local ceph-mon[93622]: Reconfiguring osd.4 (monmap changed)... 2026-03-10T12:39:22.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:21 vm07.local ceph-mon[93622]: Reconfiguring daemon osd.4 on vm07 2026-03-10T12:39:22.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:21 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:22.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:21 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:22.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:21 vm07.local ceph-mon[93622]: Reconfiguring osd.5 (monmap changed)... 
2026-03-10T12:39:22.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:21 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T12:39:22.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:21 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:22.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:21 vm07.local ceph-mon[93622]: Reconfiguring daemon osd.5 on vm07 2026-03-10T12:39:22.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:21 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:22.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:21 vm00.local ceph-mon[103263]: from='client.34148 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:22.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:21 vm00.local ceph-mon[103263]: Reconfiguring osd.3 (monmap changed)... 2026-03-10T12:39:22.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:21 vm00.local ceph-mon[103263]: Reconfiguring daemon osd.3 on vm07 2026-03-10T12:39:22.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:21 vm00.local ceph-mon[103263]: pgmap v13: 65 pgs: 65 active+clean; 303 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.1 MiB/s wr, 300 op/s 2026-03-10T12:39:22.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:21 vm00.local ceph-mon[103263]: Reconfiguring osd.4 (monmap changed)... 
2026-03-10T12:39:22.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:21 vm00.local ceph-mon[103263]: Reconfiguring daemon osd.4 on vm07 2026-03-10T12:39:22.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:21 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:22.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:21 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:22.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:21 vm00.local ceph-mon[103263]: Reconfiguring osd.5 (monmap changed)... 2026-03-10T12:39:22.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:21 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T12:39:22.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:21 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:22.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:21 vm00.local ceph-mon[103263]: Reconfiguring daemon osd.5 on vm07 2026-03-10T12:39:22.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:21 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:23.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:23.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:22 vm07.local ceph-mon[93622]: Reconfiguring mds.cephfs.vm07.wznhgu (monmap changed)... 
2026-03-10T12:39:23.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.wznhgu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:39:23.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:23.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:22 vm07.local ceph-mon[93622]: Reconfiguring daemon mds.cephfs.vm07.wznhgu on vm07 2026-03-10T12:39:23.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:22 vm07.local ceph-mon[93622]: pgmap v14: 65 pgs: 65 active+clean; 301 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 43 KiB/s rd, 1.7 MiB/s wr, 425 op/s 2026-03-10T12:39:23.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:23.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:23.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:22 vm07.local ceph-mon[93622]: Reconfiguring mds.cephfs.vm07.rhzwnr (monmap changed)... 
2026-03-10T12:39:23.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.rhzwnr", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:39:23.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:23.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:22 vm07.local ceph-mon[93622]: Reconfiguring daemon mds.cephfs.vm07.rhzwnr on vm07 2026-03-10T12:39:23.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:23.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:22 vm00.local ceph-mon[103263]: Reconfiguring mds.cephfs.vm07.wznhgu (monmap changed)... 
2026-03-10T12:39:23.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.wznhgu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:39:23.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:23.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:22 vm00.local ceph-mon[103263]: Reconfiguring daemon mds.cephfs.vm07.wznhgu on vm07 2026-03-10T12:39:23.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:22 vm00.local ceph-mon[103263]: pgmap v14: 65 pgs: 65 active+clean; 301 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 43 KiB/s rd, 1.7 MiB/s wr, 425 op/s 2026-03-10T12:39:23.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:23.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:23.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:22 vm00.local ceph-mon[103263]: Reconfiguring mds.cephfs.vm07.rhzwnr (monmap changed)... 
2026-03-10T12:39:23.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.rhzwnr", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:39:23.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:23.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:22 vm00.local ceph-mon[103263]: Reconfiguring daemon mds.cephfs.vm07.rhzwnr on vm07 2026-03-10T12:39:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:39:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:24 vm00.local ceph-mon[103263]: Upgrade: Setting container_image for all mon 2026-03-10T12:39:24.484 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm00"}]: dispatch 2026-03-10T12:39:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm00"}]': finished 2026-03-10T12:39:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm07"}]: dispatch 2026-03-10T12:39:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm07"}]': finished 2026-03-10T12:39:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm00", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T12:39:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-10T12:39:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:39:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:24 vm07.local ceph-mon[93622]: Upgrade: Setting container_image for all mon 2026-03-10T12:39:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm00"}]: dispatch 2026-03-10T12:39:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm00"}]': finished 2026-03-10T12:39:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:24 
vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm07"}]: dispatch 2026-03-10T12:39:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm07"}]': finished 2026-03-10T12:39:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm00", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T12:39:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:25.457 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:25 vm00.local ceph-mon[103263]: Upgrade: Updating crash.vm00 (1/2) 2026-03-10T12:39:25.457 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:25 vm00.local ceph-mon[103263]: Deploying daemon crash.vm00 on vm00 2026-03-10T12:39:25.457 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:25 vm00.local ceph-mon[103263]: pgmap v15: 65 pgs: 65 active+clean; 301 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 29 KiB/s rd, 1.1 MiB/s wr, 297 op/s 2026-03-10T12:39:25.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:25 vm07.local ceph-mon[93622]: Upgrade: Updating crash.vm00 (1/2) 2026-03-10T12:39:25.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:25 vm07.local ceph-mon[93622]: Deploying daemon 
crash.vm00 on vm00 2026-03-10T12:39:25.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:25 vm07.local ceph-mon[93622]: pgmap v15: 65 pgs: 65 active+clean; 301 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 29 KiB/s rd, 1.1 MiB/s wr, 297 op/s 2026-03-10T12:39:26.944 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:26.944 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:26.944 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:26.944 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T12:39:26.944 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:26.977 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:26.977 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:26.977 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:26.977 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:26 vm07.local 
ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm07", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T12:39:26.977 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:27.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:27 vm00.local ceph-mon[103263]: Upgrade: Updating crash.vm07 (2/2) 2026-03-10T12:39:27.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:27 vm00.local ceph-mon[103263]: Deploying daemon crash.vm07 on vm07 2026-03-10T12:39:27.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:27 vm00.local ceph-mon[103263]: pgmap v16: 65 pgs: 65 active+clean; 301 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 29 KiB/s rd, 1.1 MiB/s wr, 297 op/s 2026-03-10T12:39:28.234 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:27 vm07.local ceph-mon[93622]: Upgrade: Updating crash.vm07 (2/2) 2026-03-10T12:39:28.234 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:27 vm07.local ceph-mon[93622]: Deploying daemon crash.vm07 on vm07 2026-03-10T12:39:28.234 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:27 vm07.local ceph-mon[93622]: pgmap v16: 65 pgs: 65 active+clean; 301 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 29 KiB/s rd, 1.1 MiB/s wr, 297 op/s 2026-03-10T12:39:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:29 vm07.local ceph-mon[93622]: pgmap v17: 65 pgs: 65 active+clean; 293 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 42 KiB/s rd, 1.7 MiB/s wr, 430 op/s 2026-03-10T12:39:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:29 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:29.484 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:29 vm00.local ceph-mon[103263]: pgmap v17: 65 pgs: 65 active+clean; 293 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 42 KiB/s rd, 1.7 MiB/s wr, 430 op/s
2026-03-10T12:39:29.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:29 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:30.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:30.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:39:30.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:30.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:39:31.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:31 vm07.local ceph-mon[93622]: pgmap v18: 65 pgs: 65 active+clean; 293 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 25 KiB/s rd, 1.2 MiB/s wr, 259 op/s
2026-03-10T12:39:31.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:31 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:39:31.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:31 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:31.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:31 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:31.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:31 vm00.local ceph-mon[103263]: pgmap v18: 65 pgs: 65 active+clean; 293 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 25 KiB/s rd, 1.2 MiB/s wr, 259 op/s
2026-03-10T12:39:31.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:31 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:39:31.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:31 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:31.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:31 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:32.454 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:32 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:32.454 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:32 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:32.719 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:32 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:32.720 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:32 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:33.758 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:33 vm00.local ceph-mon[103263]: pgmap v19: 65 pgs: 65 active+clean; 291 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 47 KiB/s rd, 1.8 MiB/s wr, 366 op/s
2026-03-10T12:39:33.758 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:33.758 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:33.758 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:33.758 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:33.758 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:33.758 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:33 vm07.local ceph-mon[93622]: pgmap v19: 65 pgs: 65 active+clean; 291 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 47 KiB/s rd, 1.8 MiB/s wr, 366 op/s
2026-03-10T12:39:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:35.479 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:35 vm00.local ceph-mon[103263]: pgmap v20: 65 pgs: 65 active+clean; 291 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 1.2 MiB/s wr, 240 op/s
2026-03-10T12:39:35.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:35 vm07.local ceph-mon[93622]: pgmap v20: 65 pgs: 65 active+clean; 291 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 1.2 MiB/s wr, 240 op/s
2026-03-10T12:39:36.736 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:36.736 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: Upgrade: Setting container_image for all crash
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm00"}]: dispatch
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm00"}]': finished
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm07"}]: dispatch
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm07"}]': finished
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: Upgrade: osd.0 is safe to restart
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-10T12:39:36.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: Upgrade: Setting container_image for all crash
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm00"}]: dispatch
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm00"}]': finished
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm07"}]: dispatch
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm07"}]': finished
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: Upgrade: osd.0 is safe to restart
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-10T12:39:36.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:39:37.953 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:37 vm00.local ceph-mon[103263]: Upgrade: Updating osd.0
2026-03-10T12:39:37.953 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:37 vm00.local ceph-mon[103263]: Deploying daemon osd.0 on vm00
2026-03-10T12:39:37.953 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:37 vm00.local ceph-mon[103263]: pgmap v21: 65 pgs: 65 active+clean; 291 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 1.2 MiB/s wr, 240 op/s
2026-03-10T12:39:37.953 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:37 vm00.local systemd[1]: Stopping Ceph osd.0 for 1a52002a-1c7d-11f1-af82-51cdd81caea8...
2026-03-10T12:39:38.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:37 vm07.local ceph-mon[93622]: Upgrade: Updating osd.0
2026-03-10T12:39:38.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:37 vm07.local ceph-mon[93622]: Deploying daemon osd.0 on vm00
2026-03-10T12:39:38.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:37 vm07.local ceph-mon[93622]: pgmap v21: 65 pgs: 65 active+clean; 291 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 1.2 MiB/s wr, 240 op/s
2026-03-10T12:39:38.234 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:37 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0[67549]: 2026-03-10T12:39:37.951+0000 7f69c0674700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0
2026-03-10T12:39:38.234 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:37 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0[67549]: 2026-03-10T12:39:37.951+0000 7f69c0674700 -1 osd.0 44 *** Got signal Terminated ***
2026-03-10T12:39:38.234 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:37 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0[67549]: 2026-03-10T12:39:37.951+0000 7f69c0674700 -1 osd.0 44 *** Immediate shutdown (osd_fast_shutdown=true) ***
2026-03-10T12:39:39.226 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:38 vm00.local ceph-mon[103263]: osd.0 marked itself down and dead
2026-03-10T12:39:39.226 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:38 vm00.local podman[108908]: 2026-03-10 12:39:38.947990107 +0000 UTC m=+1.071611815 container died d5b05007694de793c1276b4036624ec5adf89d1218c26cebefe1be1622b4848d (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.build-date=20231212, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, GIT_BRANCH=HEAD, GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, RELEASE=HEAD, org.label-schema.schema-version=1.0)
2026-03-10T12:39:39.226 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:38 vm00.local podman[108908]: 2026-03-10 12:39:38.990732505 +0000 UTC m=+1.114354202 container remove d5b05007694de793c1276b4036624ec5adf89d1218c26cebefe1be1622b4848d (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0, CEPH_POINT_RELEASE=-18.2.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, GIT_CLEAN=True, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, ceph=True)
2026-03-10T12:39:39.226 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:38 vm00.local bash[108908]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0
2026-03-10T12:39:39.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:38 vm07.local ceph-mon[93622]: osd.0 marked itself down and dead
2026-03-10T12:39:39.484 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:39 vm00.local podman[108978]: 2026-03-10 12:39:39.225947589 +0000 UTC m=+0.025561025 container create 82c4e3c2370202a01c8c7b061ce3bb94d8706433ae08c27b138edb5c1f764309 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
2026-03-10T12:39:39.484 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:39 vm00.local podman[108978]: 2026-03-10 12:39:39.258968955 +0000 UTC m=+0.058582391 container init 82c4e3c2370202a01c8c7b061ce3bb94d8706433ae08c27b138edb5c1f764309 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-deactivate, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid)
2026-03-10T12:39:39.485 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:39 vm00.local podman[108978]: 2026-03-10 12:39:39.269395006 +0000 UTC m=+0.069008432 container start 82c4e3c2370202a01c8c7b061ce3bb94d8706433ae08c27b138edb5c1f764309 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-10T12:39:39.485 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:39 vm00.local podman[108978]: 2026-03-10 12:39:39.272046167 +0000 UTC m=+0.071659612 container attach 82c4e3c2370202a01c8c7b061ce3bb94d8706433ae08c27b138edb5c1f764309 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-deactivate, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-10T12:39:39.485 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:39 vm00.local podman[108978]: 2026-03-10 12:39:39.21664445 +0000 UTC m=+0.016257886 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-10T12:39:39.485 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:39 vm00.local conmon[108990]: conmon 82c4e3c2370202a01c8c : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-82c4e3c2370202a01c8c7b061ce3bb94d8706433ae08c27b138edb5c1f764309.scope/container/memory.events
2026-03-10T12:39:39.485 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:39 vm00.local podman[108978]: 2026-03-10 12:39:39.454332923 +0000 UTC m=+0.253946359 container died 82c4e3c2370202a01c8c7b061ce3bb94d8706433ae08c27b138edb5c1f764309 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-deactivate, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3)
2026-03-10T12:39:39.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:39 vm00.local ceph-mon[103263]: pgmap v22: 65 pgs: 65 active+clean; 287 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 43 KiB/s rd, 1.6 MiB/s wr, 393 op/s
2026-03-10T12:39:39.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:39 vm00.local ceph-mon[103263]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-10T12:39:39.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:39 vm00.local ceph-mon[103263]: osdmap e45: 6 total, 5 up, 6 in
2026-03-10T12:39:39.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:39 vm00.local podman[108978]: 2026-03-10 12:39:39.70638176 +0000 UTC m=+0.505995196 container remove 82c4e3c2370202a01c8c7b061ce3bb94d8706433ae08c27b138edb5c1f764309 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
2026-03-10T12:39:39.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:39 vm00.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.0.service: Deactivated successfully.
2026-03-10T12:39:39.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:39 vm00.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.0.service: Unit process 108990 (conmon) remains running after unit stopped.
2026-03-10T12:39:39.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:39 vm00.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.0.service: Unit process 108999 (podman) remains running after unit stopped.
2026-03-10T12:39:39.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:39 vm00.local systemd[1]: Stopped Ceph osd.0 for 1a52002a-1c7d-11f1-af82-51cdd81caea8.
2026-03-10T12:39:39.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:39 vm00.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.0.service: Consumed 33.934s CPU time, 559.7M memory peak.
2026-03-10T12:39:39.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:39 vm00.local systemd[1]: Starting Ceph osd.0 for 1a52002a-1c7d-11f1-af82-51cdd81caea8...
2026-03-10T12:39:40.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:39 vm07.local ceph-mon[93622]: pgmap v22: 65 pgs: 65 active+clean; 287 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 43 KiB/s rd, 1.6 MiB/s wr, 393 op/s
2026-03-10T12:39:40.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:39 vm07.local ceph-mon[93622]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-10T12:39:40.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:39 vm07.local ceph-mon[93622]: osdmap e45: 6 total, 5 up, 6 in
2026-03-10T12:39:40.600 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:40 vm00.local podman[109079]: 2026-03-10 12:39:40.237148623 +0000 UTC m=+0.068401195 container create 1a7320e2f492051e363192a1160d99afbab386c69c42cde8ae15b21799e1a1cc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image)
2026-03-10T12:39:40.600 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:40 vm00.local podman[109079]: 2026-03-10 12:39:40.199558688 +0000 UTC m=+0.030811270 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-10T12:39:40.600 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:40 vm00.local podman[109079]: 2026-03-10 12:39:40.295469683 +0000 UTC m=+0.126722265 container init 1a7320e2f492051e363192a1160d99afbab386c69c42cde8ae15b21799e1a1cc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
2026-03-10T12:39:40.600 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:40 vm00.local podman[109079]: 2026-03-10 12:39:40.309603132 +0000 UTC m=+0.140855695 container start 1a7320e2f492051e363192a1160d99afbab386c69c42cde8ae15b21799e1a1cc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df)
2026-03-10T12:39:40.600 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:40 vm00.local podman[109079]: 2026-03-10 12:39:40.318741514 +0000 UTC m=+0.149994086 container attach 1a7320e2f492051e363192a1160d99afbab386c69c42cde8ae15b21799e1a1cc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223)
2026-03-10T12:39:40.600 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:40 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate[109089]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:39:40.600 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:40 vm00.local bash[109079]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:39:40.600 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:40 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate[109089]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:39:40.600 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:40 vm00.local bash[109079]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:39:41.165 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:40 vm00.local ceph-mon[103263]: osdmap e46: 6 total, 5 up, 6 in
2026-03-10T12:39:41.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:40 vm07.local ceph-mon[93622]: osdmap e46: 6 total, 5 up, 6 in
2026-03-10T12:39:41.484 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate[109089]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-10T12:39:41.484 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local bash[109079]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-10T12:39:41.484 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local bash[109079]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:39:41.484 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate[109089]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:39:41.484 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate[109089]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:39:41.484 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local bash[109079]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:39:41.484 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate[109089]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
2026-03-10T12:39:41.484 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local bash[109079]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
2026-03-10T12:39:41.484 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate[109089]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-4507c758-80ef-44ab-ada5-b211da5a1c02/osd-block-0f6cb3f2-3337-4851-ba13-f08c9574062c --path /var/lib/ceph/osd/ceph-0 --no-mon-config
2026-03-10T12:39:41.484 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local bash[109079]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-4507c758-80ef-44ab-ada5-b211da5a1c02/osd-block-0f6cb3f2-3337-4851-ba13-f08c9574062c --path /var/lib/ceph/osd/ceph-0 --no-mon-config
2026-03-10T12:39:41.933 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate[109089]: Running command: /usr/bin/ln -snf /dev/ceph-4507c758-80ef-44ab-ada5-b211da5a1c02/osd-block-0f6cb3f2-3337-4851-ba13-f08c9574062c /var/lib/ceph/osd/ceph-0/block
2026-03-10T12:39:41.933 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local bash[109079]: Running command: /usr/bin/ln -snf /dev/ceph-4507c758-80ef-44ab-ada5-b211da5a1c02/osd-block-0f6cb3f2-3337-4851-ba13-f08c9574062c /var/lib/ceph/osd/ceph-0/block
2026-03-10T12:39:41.933 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate[109089]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
2026-03-10T12:39:41.933 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local bash[109079]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
2026-03-10T12:39:41.933 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate[109089]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
2026-03-10T12:39:41.933 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local bash[109079]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
2026-03-10T12:39:41.933 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate[109089]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
2026-03-10T12:39:41.933 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local bash[109079]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
2026-03-10T12:39:41.933 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate[109089]: --> ceph-volume lvm activate successful for osd ID: 0
2026-03-10T12:39:41.933 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local bash[109079]: --> ceph-volume lvm activate successful for osd ID: 0
2026-03-10T12:39:41.933 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local podman[109079]: 2026-03-10 12:39:41.565164134 +0000 UTC m=+1.396416706 container died 1a7320e2f492051e363192a1160d99afbab386c69c42cde8ae15b21799e1a1cc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, ceph=True, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0)
2026-03-10T12:39:41.933
INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local podman[109079]: 2026-03-10 12:39:41.697970675 +0000 UTC m=+1.529223247 container remove 1a7320e2f492051e363192a1160d99afbab386c69c42cde8ae15b21799e1a1cc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-activate, org.label-schema.build-date=20260223, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T12:39:42.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:41 vm00.local ceph-mon[103263]: pgmap v25: 65 pgs: 9 stale+active+clean, 56 active+clean; 287 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 552 KiB/s wr, 229 op/s 2026-03-10T12:39:42.234 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local podman[109335]: 2026-03-10 12:39:41.934706746 +0000 UTC m=+0.103827281 container create 9b151d44f3cf6043c87ac7fcfa5325a6c8ae8e87753e1530528f422236f5312d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T12:39:42.234 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:41 vm00.local podman[109335]: 2026-03-10 12:39:41.844767447 +0000 UTC m=+0.013887991 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:39:42.234 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:42 vm00.local podman[109335]: 2026-03-10 12:39:42.002193268 +0000 UTC m=+0.171313803 container init 9b151d44f3cf6043c87ac7fcfa5325a6c8ae8e87753e1530528f422236f5312d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.build-date=20260223) 2026-03-10T12:39:42.234 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:42 vm00.local podman[109335]: 2026-03-10 12:39:42.008051071 +0000 UTC m=+0.177171606 container start 9b151d44f3cf6043c87ac7fcfa5325a6c8ae8e87753e1530528f422236f5312d 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default) 2026-03-10T12:39:42.234 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:42 vm00.local bash[109335]: 9b151d44f3cf6043c87ac7fcfa5325a6c8ae8e87753e1530528f422236f5312d 2026-03-10T12:39:42.234 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:42 vm00.local systemd[1]: Started Ceph osd.0 for 1a52002a-1c7d-11f1-af82-51cdd81caea8. 
2026-03-10T12:39:42.234 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:42 vm00.local ceph-osd[109350]: -- 192.168.123.100:0/2227252936 <== mon.0 v2:192.168.123.100:3300/0 4 ==== auth_reply(proto 2 0 (0) Success) ==== 194+0+0 (secure 0 0 0) 0x55cf9abbc960 con 0x55cf9b9a6000 2026-03-10T12:39:42.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:41 vm07.local ceph-mon[93622]: pgmap v25: 65 pgs: 9 stale+active+clean, 56 active+clean; 287 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 552 KiB/s wr, 229 op/s 2026-03-10T12:39:43.234 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:42 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0[109346]: 2026-03-10T12:39:42.958+0000 7f4056d6a740 -1 Falling back to public interface 2026-03-10T12:39:43.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:43 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:43.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:43 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:43.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:43 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:39:43.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:43 vm00.local ceph-mon[103263]: pgmap v26: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 283 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 1000 KiB/s wr, 451 op/s; 3715/25227 objects degraded (14.726%) 2026-03-10T12:39:43.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:43 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:43.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:43 vm07.local ceph-mon[93622]: 
from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:43.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:43 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:39:43.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:43 vm07.local ceph-mon[93622]: pgmap v26: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 283 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 1000 KiB/s wr, 451 op/s; 3715/25227 objects degraded (14.726%) 2026-03-10T12:39:44.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:44 vm00.local ceph-mon[103263]: Health check failed: Degraded data redundancy: 3715/25227 objects degraded (14.726%), 33 pgs degraded (PG_DEGRADED) 2026-03-10T12:39:44.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:44 vm07.local ceph-mon[93622]: Health check failed: Degraded data redundancy: 3715/25227 objects degraded (14.726%), 33 pgs degraded (PG_DEGRADED) 2026-03-10T12:39:45.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:45 vm00.local ceph-mon[103263]: pgmap v27: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 283 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 1000 KiB/s wr, 451 op/s; 3715/25227 objects degraded (14.726%) 2026-03-10T12:39:45.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:45.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:45.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:45 vm07.local ceph-mon[93622]: pgmap v27: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 283 MiB data, 3.3 
GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 1000 KiB/s wr, 451 op/s; 3715/25227 objects degraded (14.726%) 2026-03-10T12:39:45.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:45.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:47.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:46 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:47.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:46 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:47.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:46 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:47.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:46 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:39:47.083 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:46 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:47.083 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:46 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:47.083 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:46 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:47.083 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:46 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 
cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:39:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:47 vm00.local ceph-mon[103263]: pgmap v28: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 283 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 5.5 KiB/s rd, 447 KiB/s wr, 222 op/s; 3715/25227 objects degraded (14.726%) 2026-03-10T12:39:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:39:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:39:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:47 vm00.local 
ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:39:48.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:47 vm07.local ceph-mon[93622]: pgmap v28: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 283 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 5.5 KiB/s rd, 447 KiB/s wr, 222 op/s; 3715/25227 objects degraded (14.726%) 2026-03-10T12:39:48.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:48.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:48.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:39:48.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:39:48.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:39:48.316 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:39:48.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:48.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:48.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:48.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:39:48.734 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:48 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0[109346]: 2026-03-10T12:39:48.373+0000 7f4056d6a740 -1 osd.0 0 read_superblock omap replica is missing. 2026-03-10T12:39:49.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:48 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:39:49.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:48 vm00.local ceph-mon[103263]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline) 2026-03-10T12:39:49.234 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:49 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0[109346]: 2026-03-10T12:39:49.093+0000 7f4056d6a740 -1 osd.0 44 log_to_monitors true 2026-03-10T12:39:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:48 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:39:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:48 vm07.local ceph-mon[93622]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline) 2026-03-10T12:39:49.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:49 vm00.local ceph-mon[103263]: pgmap v29: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 285 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 17 KiB/s rd, 823 KiB/s wr, 304 op/s; 3355/22770 objects degraded (14.734%) 2026-03-10T12:39:49.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:49 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 3355/22770 objects degraded (14.734%), 33 pgs degraded (PG_DEGRADED) 2026-03-10T12:39:49.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:49 vm00.local ceph-mon[103263]: from='osd.0 [v2:192.168.123.100:6802/105866120,v1:192.168.123.100:6803/105866120]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T12:39:49.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:39:49 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0[109346]: 2026-03-10T12:39:49.915+0000 7f404eb04640 -1 osd.0 44 set_numa_affinity unable to identify public interface '' numa 
node: (2) No such file or directory 2026-03-10T12:39:50.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:49 vm07.local ceph-mon[93622]: pgmap v29: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 285 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 17 KiB/s rd, 823 KiB/s wr, 304 op/s; 3355/22770 objects degraded (14.734%) 2026-03-10T12:39:50.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:49 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 3355/22770 objects degraded (14.734%), 33 pgs degraded (PG_DEGRADED) 2026-03-10T12:39:50.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:49 vm07.local ceph-mon[93622]: from='osd.0 [v2:192.168.123.100:6802/105866120,v1:192.168.123.100:6803/105866120]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T12:39:50.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.417+0000 7fece0884700 1 -- 192.168.123.100:0/3328097963 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fecdc1024d0 msgr2=0x7fecdc1028e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:50.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.417+0000 7fece0884700 1 --2- 192.168.123.100:0/3328097963 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fecdc1024d0 0x7fecdc1028e0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7feccc009b00 tx=0x7feccc009e10 comp rx=0 tx=0).stop 2026-03-10T12:39:50.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.417+0000 7fece0884700 1 -- 192.168.123.100:0/3328097963 shutdown_connections 2026-03-10T12:39:50.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.417+0000 7fece0884700 1 --2- 192.168.123.100:0/3328097963 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fecdc1036d0 0x7fecdc103b20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T12:39:50.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.417+0000 7fece0884700 1 --2- 192.168.123.100:0/3328097963 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fecdc1024d0 0x7fecdc1028e0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:50.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.417+0000 7fece0884700 1 -- 192.168.123.100:0/3328097963 >> 192.168.123.100:0/3328097963 conn(0x7fecdc0fda80 msgr2=0x7fecdc0ffeb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:50.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.418+0000 7fece0884700 1 -- 192.168.123.100:0/3328097963 shutdown_connections 2026-03-10T12:39:50.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.418+0000 7fece0884700 1 -- 192.168.123.100:0/3328097963 wait complete. 2026-03-10T12:39:50.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.418+0000 7fece0884700 1 Processor -- start 2026-03-10T12:39:50.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.418+0000 7fece0884700 1 -- start start 2026-03-10T12:39:50.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.418+0000 7fece0884700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fecdc1024d0 0x7fecdc078b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:50.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.418+0000 7fece0884700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fecdc1036d0 0x7fecdc079040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:50.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.418+0000 7fece0884700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fecdc0755f0 con 0x7fecdc1036d0 2026-03-10T12:39:50.419 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.418+0000 7fece0884700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fecdc075760 con 0x7fecdc1024d0 2026-03-10T12:39:50.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.419+0000 7fecd3fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fecdc1036d0 0x7fecdc079040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:50.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.419+0000 7fecd3fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fecdc1036d0 0x7fecdc079040 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:42404/0 (socket says 192.168.123.100:42404) 2026-03-10T12:39:50.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.419+0000 7fecd3fff700 1 -- 192.168.123.100:0/2829503474 learned_addr learned my addr 192.168.123.100:0/2829503474 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:39:50.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.419+0000 7fecdad9d700 1 --2- 192.168.123.100:0/2829503474 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fecdc1024d0 0x7fecdc078b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:50.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.419+0000 7fecd3fff700 1 -- 192.168.123.100:0/2829503474 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fecdc1024d0 msgr2=0x7fecdc078b00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:50.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.419+0000 7fecd3fff700 1 --2- 
192.168.123.100:0/2829503474 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fecdc1024d0 0x7fecdc078b00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:50.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.419+0000 7fecd3fff700 1 -- 192.168.123.100:0/2829503474 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feccc0097e0 con 0x7fecdc1036d0 2026-03-10T12:39:50.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.419+0000 7fecd3fff700 1 --2- 192.168.123.100:0/2829503474 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fecdc1036d0 0x7fecdc079040 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fecc400ed70 tx=0x7fecc400c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:50.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.420+0000 7fecd8d99700 1 -- 192.168.123.100:0/2829503474 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fecc4009980 con 0x7fecdc1036d0 2026-03-10T12:39:50.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.420+0000 7fecd8d99700 1 -- 192.168.123.100:0/2829503474 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fecc400cd70 con 0x7fecdc1036d0 2026-03-10T12:39:50.421 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.420+0000 7fece0884700 1 -- 192.168.123.100:0/2829503474 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fecdc075a40 con 0x7fecdc1036d0 2026-03-10T12:39:50.421 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.421+0000 7fece0884700 1 -- 192.168.123.100:0/2829503474 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fecdc075f90 con 0x7fecdc1036d0 2026-03-10T12:39:50.422 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.422+0000 7fecd8d99700 1 -- 192.168.123.100:0/2829503474 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fecc40189c0 con 0x7fecdc1036d0 2026-03-10T12:39:50.424 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.422+0000 7fece0884700 1 -- 192.168.123.100:0/2829503474 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fecdc04ea50 con 0x7fecdc1036d0 2026-03-10T12:39:50.425 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.424+0000 7fecd8d99700 1 -- 192.168.123.100:0/2829503474 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fecc4018be0 con 0x7fecdc1036d0 2026-03-10T12:39:50.425 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.424+0000 7fecd8d99700 1 --2- 192.168.123.100:0/2829503474 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fecc8077850 0x7fecc8079d00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:50.425 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.424+0000 7fecdad9d700 1 --2- 192.168.123.100:0/2829503474 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fecc8077850 0x7fecc8079d00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:50.425 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.425+0000 7fecdad9d700 1 --2- 192.168.123.100:0/2829503474 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fecc8077850 0x7fecc8079d00 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7feccc006010 tx=0x7feccc005c40 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:50.425 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.425+0000 7fecd8d99700 1 -- 192.168.123.100:0/2829503474 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(47..47 src has 1..47) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fecc4014070 con 0x7fecdc1036d0 2026-03-10T12:39:50.425 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.425+0000 7fecd8d99700 1 -- 192.168.123.100:0/2829503474 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fecc40a0050 con 0x7fecdc1036d0 2026-03-10T12:39:50.611 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.610+0000 7fece0884700 1 -- 192.168.123.100:0/2829503474 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fecdc108020 con 0x7fecc8077850 2026-03-10T12:39:50.612 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.612+0000 7fecd8d99700 1 -- 192.168.123.100:0/2829503474 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fecdc108020 con 0x7fecc8077850 2026-03-10T12:39:50.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.615+0000 7fecd1ffb700 1 -- 192.168.123.100:0/2829503474 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fecc8077850 msgr2=0x7fecc8079d00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:50.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.615+0000 7fecd1ffb700 1 --2- 192.168.123.100:0/2829503474 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fecc8077850 0x7fecc8079d00 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7feccc006010 tx=0x7feccc005c40 comp rx=0 tx=0).stop 2026-03-10T12:39:50.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.615+0000 7fecd1ffb700 1 -- 
192.168.123.100:0/2829503474 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fecdc1036d0 msgr2=0x7fecdc079040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:50.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.615+0000 7fecd1ffb700 1 --2- 192.168.123.100:0/2829503474 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fecdc1036d0 0x7fecdc079040 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fecc400ed70 tx=0x7fecc400c5b0 comp rx=0 tx=0).stop 2026-03-10T12:39:50.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.615+0000 7fecd1ffb700 1 -- 192.168.123.100:0/2829503474 shutdown_connections 2026-03-10T12:39:50.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.615+0000 7fecd1ffb700 1 --2- 192.168.123.100:0/2829503474 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fecc8077850 0x7fecc8079d00 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:50.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.615+0000 7fecd1ffb700 1 --2- 192.168.123.100:0/2829503474 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fecdc1024d0 0x7fecdc078b00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:50.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.615+0000 7fecd1ffb700 1 --2- 192.168.123.100:0/2829503474 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fecdc1036d0 0x7fecdc079040 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:50.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.615+0000 7fecd1ffb700 1 -- 192.168.123.100:0/2829503474 >> 192.168.123.100:0/2829503474 conn(0x7fecdc0fda80 msgr2=0x7fecdc106900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:50.617 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.617+0000 7fecd1ffb700 1 -- 192.168.123.100:0/2829503474 shutdown_connections 2026-03-10T12:39:50.617 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.617+0000 7fecd1ffb700 1 -- 192.168.123.100:0/2829503474 wait complete. 2026-03-10T12:39:50.628 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.722+0000 7ff30eb66700 1 -- 192.168.123.100:0/2953878940 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff308072360 msgr2=0x7ff3080770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.722+0000 7ff30eb66700 1 --2- 192.168.123.100:0/2953878940 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff308072360 0x7ff3080770e0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7ff30000d3f0 tx=0x7ff30000d700 comp rx=0 tx=0).stop 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.722+0000 7ff30eb66700 1 -- 192.168.123.100:0/2953878940 shutdown_connections 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.722+0000 7ff30eb66700 1 --2- 192.168.123.100:0/2953878940 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff308072360 0x7ff3080770e0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.722+0000 7ff30eb66700 1 --2- 192.168.123.100:0/2953878940 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff308071980 0x7ff308071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.722+0000 7ff30eb66700 1 -- 192.168.123.100:0/2953878940 >> 192.168.123.100:0/2953878940 conn(0x7ff30806d1a0 
msgr2=0x7ff30806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.722+0000 7ff30eb66700 1 -- 192.168.123.100:0/2953878940 shutdown_connections 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.722+0000 7ff30eb66700 1 -- 192.168.123.100:0/2953878940 wait complete. 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.723+0000 7ff30eb66700 1 Processor -- start 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.723+0000 7ff30eb66700 1 -- start start 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.723+0000 7ff30eb66700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff308071980 0x7ff308082560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.723+0000 7ff30eb66700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff308082aa0 0x7ff308082f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.723+0000 7ff30eb66700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff3081b2a90 con 0x7ff308071980 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.723+0000 7ff30eb66700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff3081b2bd0 con 0x7ff308082aa0 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.723+0000 7ff30c902700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff308071980 0x7ff308082560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.723+0000 7ff30c902700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff308071980 0x7ff308082560 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:42432/0 (socket says 192.168.123.100:42432) 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.723+0000 7ff30c902700 1 -- 192.168.123.100:0/1121583897 learned_addr learned my addr 192.168.123.100:0/1121583897 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:39:50.724 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.723+0000 7ff307fff700 1 --2- 192.168.123.100:0/1121583897 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff308082aa0 0x7ff308082f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:50.727 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.727+0000 7ff307fff700 1 -- 192.168.123.100:0/1121583897 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff308071980 msgr2=0x7ff308082560 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:50.728 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.727+0000 7ff307fff700 1 --2- 192.168.123.100:0/1121583897 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff308071980 0x7ff308082560 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:50.728 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.727+0000 7ff307fff700 1 -- 192.168.123.100:0/1121583897 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff300007ed0 con 0x7ff308082aa0 
2026-03-10T12:39:50.728 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.727+0000 7ff307fff700 1 --2- 192.168.123.100:0/1121583897 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff308082aa0 0x7ff308082f10 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7ff308072ff0 tx=0x7ff30001bb10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:50.728 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.727+0000 7ff305ffb700 1 -- 192.168.123.100:0/1121583897 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff300020070 con 0x7ff308082aa0 2026-03-10T12:39:50.729 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.727+0000 7ff30eb66700 1 -- 192.168.123.100:0/1121583897 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff3081b2d10 con 0x7ff308082aa0 2026-03-10T12:39:50.729 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.728+0000 7ff30eb66700 1 -- 192.168.123.100:0/1121583897 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff3081b3200 con 0x7ff308082aa0 2026-03-10T12:39:50.729 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.728+0000 7ff305ffb700 1 -- 192.168.123.100:0/1121583897 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff30000deb0 con 0x7ff308082aa0 2026-03-10T12:39:50.729 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.728+0000 7ff305ffb700 1 -- 192.168.123.100:0/1121583897 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff30000fb70 con 0x7ff308082aa0 2026-03-10T12:39:50.730 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.730+0000 7ff305ffb700 1 -- 192.168.123.100:0/1121583897 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7ff30000fcd0 con 
0x7ff308082aa0 2026-03-10T12:39:50.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.730+0000 7ff305ffb700 1 --2- 192.168.123.100:0/1121583897 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff2f0077a50 0x7ff2f0079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:50.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.730+0000 7ff30c902700 1 --2- 192.168.123.100:0/1121583897 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff2f0077a50 0x7ff2f0079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:50.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.731+0000 7ff305ffb700 1 -- 192.168.123.100:0/1121583897 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(47..47 src has 1..47) v4 ==== 6136+0+0 (secure 0 0 0) 0x7ff300013070 con 0x7ff308082aa0 2026-03-10T12:39:50.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.731+0000 7ff30c902700 1 --2- 192.168.123.100:0/1121583897 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff2f0077a50 0x7ff2f0079f00 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7ff2f800d440 tx=0x7ff2f800d490 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:50.732 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.731+0000 7ff30eb66700 1 -- 192.168.123.100:0/1121583897 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff2f4005320 con 0x7ff308082aa0 2026-03-10T12:39:50.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.736+0000 7ff305ffb700 1 -- 192.168.123.100:0/1121583897 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7ff300069110 con 0x7ff308082aa0 2026-03-10T12:39:50.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.874+0000 7ff30eb66700 1 -- 192.168.123.100:0/1121583897 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff2f4000bf0 con 0x7ff2f0077a50 2026-03-10T12:39:50.875 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.875+0000 7ff305ffb700 1 -- 192.168.123.100:0/1121583897 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7ff2f4000bf0 con 0x7ff2f0077a50 2026-03-10T12:39:50.878 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.877+0000 7ff30eb66700 1 -- 192.168.123.100:0/1121583897 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff2f0077a50 msgr2=0x7ff2f0079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:50.878 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.877+0000 7ff30eb66700 1 --2- 192.168.123.100:0/1121583897 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff2f0077a50 0x7ff2f0079f00 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7ff2f800d440 tx=0x7ff2f800d490 comp rx=0 tx=0).stop 2026-03-10T12:39:50.878 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.877+0000 7ff30eb66700 1 -- 192.168.123.100:0/1121583897 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff308082aa0 msgr2=0x7ff308082f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:50.878 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.877+0000 7ff30eb66700 1 --2- 192.168.123.100:0/1121583897 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff308082aa0 0x7ff308082f10 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7ff308072ff0 tx=0x7ff30001bb10 comp rx=0 
tx=0).stop 2026-03-10T12:39:50.878 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.877+0000 7ff30eb66700 1 -- 192.168.123.100:0/1121583897 shutdown_connections 2026-03-10T12:39:50.878 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.877+0000 7ff30eb66700 1 --2- 192.168.123.100:0/1121583897 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff2f0077a50 0x7ff2f0079f00 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:50.878 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.877+0000 7ff30eb66700 1 --2- 192.168.123.100:0/1121583897 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff308071980 0x7ff308082560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:50.878 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.877+0000 7ff30eb66700 1 --2- 192.168.123.100:0/1121583897 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff308082aa0 0x7ff308082f10 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:50.878 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.878+0000 7ff30eb66700 1 -- 192.168.123.100:0/1121583897 >> 192.168.123.100:0/1121583897 conn(0x7ff30806d1a0 msgr2=0x7ff3080764c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:50.878 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.878+0000 7ff30eb66700 1 -- 192.168.123.100:0/1121583897 shutdown_connections 2026-03-10T12:39:50.878 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.878+0000 7ff30eb66700 1 -- 192.168.123.100:0/1121583897 wait complete. 
2026-03-10T12:39:50.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.986+0000 7fdb9d930700 1 -- 192.168.123.100:0/2684499324 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb98071a90 msgr2=0x7fdb98071ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:50.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.986+0000 7fdb9d930700 1 --2- 192.168.123.100:0/2684499324 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb98071a90 0x7fdb98071ea0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fdb88009b00 tx=0x7fdb88009e10 comp rx=0 tx=0).stop 2026-03-10T12:39:50.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.986+0000 7fdb9d930700 1 -- 192.168.123.100:0/2684499324 shutdown_connections 2026-03-10T12:39:50.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.986+0000 7fdb9d930700 1 --2- 192.168.123.100:0/2684499324 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb98072470 0x7fdb9810beb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:50.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.986+0000 7fdb9d930700 1 --2- 192.168.123.100:0/2684499324 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb98071a90 0x7fdb98071ea0 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:50.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.986+0000 7fdb9d930700 1 -- 192.168.123.100:0/2684499324 >> 192.168.123.100:0/2684499324 conn(0x7fdb9806d1a0 msgr2=0x7fdb9806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:50.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.986+0000 7fdb9d930700 1 -- 192.168.123.100:0/2684499324 shutdown_connections 2026-03-10T12:39:50.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.986+0000 7fdb9d930700 1 -- 192.168.123.100:0/2684499324 
wait complete. 2026-03-10T12:39:50.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.987+0000 7fdb9d930700 1 Processor -- start 2026-03-10T12:39:50.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.987+0000 7fdb9d930700 1 -- start start 2026-03-10T12:39:50.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.987+0000 7fdb9d930700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb98071a90 0x7fdb98116980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:50.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.987+0000 7fdb9d930700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb98072470 0x7fdb98116ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:50.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.987+0000 7fdb9d930700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb981174c0 con 0x7fdb98071a90 2026-03-10T12:39:50.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.987+0000 7fdb9d930700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb98117630 con 0x7fdb98072470 2026-03-10T12:39:50.991 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.987+0000 7fdb967fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb98072470 0x7fdb98116ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:50.991 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.987+0000 7fdb967fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb98072470 0x7fdb98116ec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 
says I am v2:192.168.123.100:54704/0 (socket says 192.168.123.100:54704) 2026-03-10T12:39:50.991 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.987+0000 7fdb967fc700 1 -- 192.168.123.100:0/2907975554 learned_addr learned my addr 192.168.123.100:0/2907975554 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:39:50.991 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.990+0000 7fdb96ffd700 1 --2- 192.168.123.100:0/2907975554 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb98071a90 0x7fdb98116980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:50.992 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.991+0000 7fdb967fc700 1 -- 192.168.123.100:0/2907975554 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb98071a90 msgr2=0x7fdb98116980 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:50.992 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.991+0000 7fdb967fc700 1 --2- 192.168.123.100:0/2907975554 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb98071a90 0x7fdb98116980 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:50.992 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.991+0000 7fdb967fc700 1 -- 192.168.123.100:0/2907975554 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdb880097e0 con 0x7fdb98072470 2026-03-10T12:39:50.992 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.992+0000 7fdb967fc700 1 --2- 192.168.123.100:0/2907975554 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb98072470 0x7fdb98116ec0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7fdb8c009fd0 tx=0x7fdb8c00eea0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:39:50.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.993+0000 7fdb9c92e700 1 -- 192.168.123.100:0/2907975554 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdb8c009980 con 0x7fdb98072470 2026-03-10T12:39:50.995 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.993+0000 7fdb9d930700 1 -- 192.168.123.100:0/2907975554 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdb981b2860 con 0x7fdb98072470 2026-03-10T12:39:50.995 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.993+0000 7fdb9d930700 1 -- 192.168.123.100:0/2907975554 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdb981b2db0 con 0x7fdb98072470 2026-03-10T12:39:50.995 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.994+0000 7fdb9c92e700 1 -- 192.168.123.100:0/2907975554 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdb8c004d10 con 0x7fdb98072470 2026-03-10T12:39:50.995 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.994+0000 7fdb9c92e700 1 -- 192.168.123.100:0/2907975554 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdb8c010470 con 0x7fdb98072470 2026-03-10T12:39:50.996 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.994+0000 7fdb9d930700 1 -- 192.168.123.100:0/2907975554 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdb98110c20 con 0x7fdb98072470 2026-03-10T12:39:50.997 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.996+0000 7fdb9c92e700 1 -- 192.168.123.100:0/2907975554 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fdb8c00cca0 con 0x7fdb98072470 2026-03-10T12:39:50.997 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.997+0000 
7fdb9c92e700 1 --2- 192.168.123.100:0/2907975554 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fdb80077a00 0x7fdb80079eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:50.997 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.997+0000 7fdb9c92e700 1 -- 192.168.123.100:0/2907975554 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(47..47 src has 1..47) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fdb8c014070 con 0x7fdb98072470 2026-03-10T12:39:50.998 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.997+0000 7fdb96ffd700 1 --2- 192.168.123.100:0/2907975554 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fdb80077a00 0x7fdb80079eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:50.998 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:50.998+0000 7fdb96ffd700 1 --2- 192.168.123.100:0/2907975554 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fdb80077a00 0x7fdb80079eb0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7fdb88000c00 tx=0x7fdb8801a040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:51.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.000+0000 7fdb9c92e700 1 -- 192.168.123.100:0/2907975554 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fdb8c062750 con 0x7fdb98072470 2026-03-10T12:39:51.153 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.152+0000 7fdb9d930700 1 -- 192.168.123.100:0/2907975554 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fdb98061190 con 0x7fdb80077a00 
2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (6m) 6s ago 6m 25.5M - 0.25.0 c8568f914cd2 1897443e8fdf 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (6m) 6s ago 6m 8774k - 18.2.0 dc2bc1663786 d9c35bbdf4cd 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (6m) 20s ago 6m 11.2M - 18.2.0 dc2bc1663786 2a98961ae9ca 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (25s) 6s ago 6m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 88cd1c2e0041 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (23s) 20s ago 6m 7864k - 19.2.3-678-ge911bdeb 654f31e6858e 889e2b356266 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (6m) 6s ago 6m 90.9M - 9.4.7 954c08fa6188 16c12a6ce1fc 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (4m) 6s ago 4m 152M - 18.2.0 dc2bc1663786 13dfd2469732 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (4m) 6s ago 4m 18.0M - 18.2.0 dc2bc1663786 d65368ac6dfa 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (4m) 20s ago 4m 17.4M - 18.2.0 dc2bc1663786 1b9425223bd2 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (4m) 20s ago 4m 169M - 18.2.0 dc2bc1663786 9c2b36fc1ac1 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:8443,9283,8765 running (83s) 6s ago 7m 615M - 19.2.3-678-ge911bdeb 654f31e6858e eaf11f86af53 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb 
vm07 *:8443,9283,8765 running (64s) 20s ago 6m 488M - 19.2.3-678-ge911bdeb 654f31e6858e 22c93435649c 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (57s) 6s ago 7m 54.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e e8cc5980a849 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (42s) 20s ago 6m 50.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 032abad282fc 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (6m) 6s ago 6m 15.2M - 1.5.0 0da6a335fe13 7a5c14d6ba46 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (6m) 20s ago 6m 15.5M - 1.5.0 0da6a335fe13 2fac2415b763 2026-03-10T12:39:51.162 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (9s) 6s ago 5m 29.4M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9b151d44f3cf 2026-03-10T12:39:51.163 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (5m) 6s ago 5m 383M 4096M 18.2.0 dc2bc1663786 5bc971fe4d49 2026-03-10T12:39:51.163 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (5m) 6s ago 5m 322M 4096M 18.2.0 dc2bc1663786 a5a89ccf847e 2026-03-10T12:39:51.163 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (5m) 20s ago 5m 452M 4096M 18.2.0 dc2bc1663786 0c6249fe3951 2026-03-10T12:39:51.163 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (5m) 20s ago 5m 402M 4096M 18.2.0 dc2bc1663786 5c66bfb63a83 2026-03-10T12:39:51.163 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (5m) 20s ago 5m 370M 4096M 18.2.0 dc2bc1663786 277bdfe4ec55 2026-03-10T12:39:51.163 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (67s) 6s ago 6m 55.3M - 2.43.0 a07b618ecd1d c80074b6b052 2026-03-10T12:39:51.163 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.161+0000 7fdb9c92e700 1 -- 192.168.123.100:0/2907975554 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== 
mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7fdb98061190 con 0x7fdb80077a00 2026-03-10T12:39:51.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.165+0000 7fdb7e7fc700 1 -- 192.168.123.100:0/2907975554 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fdb80077a00 msgr2=0x7fdb80079eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:51.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.165+0000 7fdb7e7fc700 1 --2- 192.168.123.100:0/2907975554 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fdb80077a00 0x7fdb80079eb0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7fdb88000c00 tx=0x7fdb8801a040 comp rx=0 tx=0).stop 2026-03-10T12:39:51.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.165+0000 7fdb7e7fc700 1 -- 192.168.123.100:0/2907975554 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb98072470 msgr2=0x7fdb98116ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:51.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.165+0000 7fdb7e7fc700 1 --2- 192.168.123.100:0/2907975554 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb98072470 0x7fdb98116ec0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7fdb8c009fd0 tx=0x7fdb8c00eea0 comp rx=0 tx=0).stop 2026-03-10T12:39:51.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.165+0000 7fdb7e7fc700 1 -- 192.168.123.100:0/2907975554 shutdown_connections 2026-03-10T12:39:51.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.165+0000 7fdb7e7fc700 1 --2- 192.168.123.100:0/2907975554 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fdb80077a00 0x7fdb80079eb0 secure :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7fdb88000c00 tx=0x7fdb8801a040 comp rx=0 tx=0).stop 2026-03-10T12:39:51.166 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.165+0000 7fdb7e7fc700 1 --2- 192.168.123.100:0/2907975554 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb98071a90 0x7fdb98116980 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:51.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.165+0000 7fdb7e7fc700 1 --2- 192.168.123.100:0/2907975554 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb98072470 0x7fdb98116ec0 secure :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7fdb8c009fd0 tx=0x7fdb8c00eea0 comp rx=0 tx=0).stop 2026-03-10T12:39:51.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.166+0000 7fdb7e7fc700 1 -- 192.168.123.100:0/2907975554 >> 192.168.123.100:0/2907975554 conn(0x7fdb9806d1a0 msgr2=0x7fdb9810a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:51.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.166+0000 7fdb7e7fc700 1 -- 192.168.123.100:0/2907975554 shutdown_connections 2026-03-10T12:39:51.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.166+0000 7fdb7e7fc700 1 -- 192.168.123.100:0/2907975554 wait complete. 
2026-03-10T12:39:51.484 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:51 vm07.local ceph-mon[93622]: from='osd.0 [v2:192.168.123.100:6802/105866120,v1:192.168.123.100:6803/105866120]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T12:39:51.484 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:51 vm07.local ceph-mon[93622]: osdmap e47: 6 total, 5 up, 6 in 2026-03-10T12:39:51.484 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:51 vm07.local ceph-mon[93622]: from='osd.0 [v2:192.168.123.100:6802/105866120,v1:192.168.123.100:6803/105866120]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm00", "root=default"]}]: dispatch 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.498+0000 7fc319baa700 1 -- 192.168.123.100:0/4098379465 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc314072440 msgr2=0x7fc31410be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.498+0000 7fc319baa700 1 --2- 192.168.123.100:0/4098379465 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc314072440 0x7fc31410be90 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fc304009b00 tx=0x7fc304009e10 comp rx=0 tx=0).stop 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.498+0000 7fc319baa700 1 -- 192.168.123.100:0/4098379465 shutdown_connections 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.498+0000 7fc319baa700 1 --2- 192.168.123.100:0/4098379465 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc314072440 0x7fc31410be90 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.498+0000 7fc319baa700 1 
--2- 192.168.123.100:0/4098379465 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc314071a60 0x7fc314071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.498+0000 7fc319baa700 1 -- 192.168.123.100:0/4098379465 >> 192.168.123.100:0/4098379465 conn(0x7fc31406d1a0 msgr2=0x7fc31406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.498+0000 7fc319baa700 1 -- 192.168.123.100:0/4098379465 shutdown_connections 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.498+0000 7fc319baa700 1 -- 192.168.123.100:0/4098379465 wait complete. 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.499+0000 7fc319baa700 1 Processor -- start 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.499+0000 7fc319baa700 1 -- start start 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.499+0000 7fc319baa700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc314071a60 0x7fc3141169f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.499+0000 7fc319baa700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc314072440 0x7fc314116f30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.499+0000 7fc319baa700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc314117570 con 0x7fc314071a60 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.499+0000 7fc319baa700 1 -- --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc3141176e0 con 0x7fc314072440 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.499+0000 7fc318ba8700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc314071a60 0x7fc3141169f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.499+0000 7fc318ba8700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc314071a60 0x7fc3141169f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:42466/0 (socket says 192.168.123.100:42466) 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.499+0000 7fc318ba8700 1 -- 192.168.123.100:0/1232661304 learned_addr learned my addr 192.168.123.100:0/1232661304 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:39:51.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.499+0000 7fc313fff700 1 --2- 192.168.123.100:0/1232661304 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc314072440 0x7fc314116f30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:51.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.500+0000 7fc318ba8700 1 -- 192.168.123.100:0/1232661304 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc314072440 msgr2=0x7fc314116f30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:51.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.500+0000 7fc318ba8700 1 --2- 192.168.123.100:0/1232661304 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7fc314072440 0x7fc314116f30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:51.501 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.500+0000 7fc318ba8700 1 -- 192.168.123.100:0/1232661304 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc3040097e0 con 0x7fc314071a60 2026-03-10T12:39:51.502 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.500+0000 7fc318ba8700 1 --2- 192.168.123.100:0/1232661304 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc314071a60 0x7fc3141169f0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7fc30800b700 tx=0x7fc30800ba10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:51.502 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.500+0000 7fc311ffb700 1 -- 192.168.123.100:0/1232661304 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc308011840 con 0x7fc314071a60 2026-03-10T12:39:51.502 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.500+0000 7fc311ffb700 1 -- 192.168.123.100:0/1232661304 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc308011e80 con 0x7fc314071a60 2026-03-10T12:39:51.502 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.500+0000 7fc311ffb700 1 -- 192.168.123.100:0/1232661304 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc30800f550 con 0x7fc314071a60 2026-03-10T12:39:51.502 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.501+0000 7fc319baa700 1 -- 192.168.123.100:0/1232661304 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc3141a1640 con 0x7fc314071a60 2026-03-10T12:39:51.502 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.501+0000 7fc319baa700 1 -- 
192.168.123.100:0/1232661304 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc3141a1b10 con 0x7fc314071a60 2026-03-10T12:39:51.502 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.501+0000 7fc2fb7fe700 1 -- 192.168.123.100:0/1232661304 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc31404ea50 con 0x7fc314071a60 2026-03-10T12:39:51.503 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.502+0000 7fc311ffb700 1 -- 192.168.123.100:0/1232661304 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fc3080119a0 con 0x7fc314071a60 2026-03-10T12:39:51.503 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.503+0000 7fc311ffb700 1 --2- 192.168.123.100:0/1232661304 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc2fc0777d0 0x7fc2fc079c80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:51.504 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.503+0000 7fc311ffb700 1 -- 192.168.123.100:0/1232661304 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(48..48 src has 1..48) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fc308066ac0 con 0x7fc314071a60 2026-03-10T12:39:51.504 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.504+0000 7fc313fff700 1 --2- 192.168.123.100:0/1232661304 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc2fc0777d0 0x7fc2fc079c80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:51.506 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.506+0000 7fc313fff700 1 --2- 192.168.123.100:0/1232661304 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc2fc0777d0 0x7fc2fc079c80 secure :-1 s=READY 
pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fc304006010 tx=0x7fc30400b560 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:51.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.506+0000 7fc311ffb700 1 -- 192.168.123.100:0/1232661304 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc3080624e0 con 0x7fc314071a60 2026-03-10T12:39:51.663 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:51 vm00.local ceph-mon[103263]: from='osd.0 [v2:192.168.123.100:6802/105866120,v1:192.168.123.100:6803/105866120]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T12:39:51.663 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:51 vm00.local ceph-mon[103263]: osdmap e47: 6 total, 5 up, 6 in 2026-03-10T12:39:51.663 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:51 vm00.local ceph-mon[103263]: from='osd.0 [v2:192.168.123.100:6802/105866120,v1:192.168.123.100:6803/105866120]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm00", "root=default"]}]: dispatch 2026-03-10T12:39:51.692 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.691+0000 7fc2fb7fe700 1 -- 192.168.123.100:0/1232661304 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fc314062130 con 0x7fc314071a60 2026-03-10T12:39:51.692 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.691+0000 7fc311ffb700 1 -- 192.168.123.100:0/1232661304 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fc3080212f0 con 0x7fc314071a60 2026-03-10T12:39:51.692 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:39:51.692 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:39:51.692 
INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:39:51.692 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:39:51.692 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:39:51.692 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:39:51.692 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:39:51.692 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 2026-03-10T12:39:51.692 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 5, 2026-03-10T12:39:51.692 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T12:39:51.693 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:39:51.693 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 2026-03-10T12:39:51.693 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T12:39:51.693 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:39:51.693 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:39:51.693 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 9, 2026-03-10T12:39:51.693 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T12:39:51.693 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:39:51.693 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:39:51.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.694+0000 7fc319baa700 1 -- 192.168.123.100:0/1232661304 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc2fc0777d0 msgr2=0x7fc2fc079c80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:51.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.694+0000 7fc319baa700 1 --2- 192.168.123.100:0/1232661304 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc2fc0777d0 0x7fc2fc079c80 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fc304006010 tx=0x7fc30400b560 comp rx=0 tx=0).stop 2026-03-10T12:39:51.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.694+0000 7fc319baa700 1 -- 192.168.123.100:0/1232661304 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc314071a60 msgr2=0x7fc3141169f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:51.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.695+0000 7fc319baa700 1 --2- 192.168.123.100:0/1232661304 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc314071a60 0x7fc3141169f0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7fc30800b700 tx=0x7fc30800ba10 comp rx=0 tx=0).stop 2026-03-10T12:39:51.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.695+0000 7fc319baa700 1 -- 192.168.123.100:0/1232661304 shutdown_connections 2026-03-10T12:39:51.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.695+0000 7fc319baa700 1 --2- 192.168.123.100:0/1232661304 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc2fc0777d0 0x7fc2fc079c80 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:51.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.695+0000 7fc319baa700 1 --2- 192.168.123.100:0/1232661304 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc314071a60 0x7fc3141169f0 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:39:51.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.695+0000 7fc319baa700 1 --2- 192.168.123.100:0/1232661304 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc314072440 0x7fc314116f30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:51.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.695+0000 7fc319baa700 1 -- 192.168.123.100:0/1232661304 >> 192.168.123.100:0/1232661304 conn(0x7fc31406d1a0 msgr2=0x7fc31410a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:51.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.695+0000 7fc319baa700 1 -- 192.168.123.100:0/1232661304 shutdown_connections 2026-03-10T12:39:51.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.695+0000 7fc319baa700 1 -- 192.168.123.100:0/1232661304 wait complete. 2026-03-10T12:39:51.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.773+0000 7f771515d700 1 -- 192.168.123.100:0/2959500270 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7710072470 msgr2=0x7f771010beb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:51.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.773+0000 7f771515d700 1 --2- 192.168.123.100:0/2959500270 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7710072470 0x7f771010beb0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f770800b3a0 tx=0x7f770800b6b0 comp rx=0 tx=0).stop 2026-03-10T12:39:51.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.773+0000 7f771515d700 1 -- 192.168.123.100:0/2959500270 shutdown_connections 2026-03-10T12:39:51.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.773+0000 7f771515d700 1 --2- 192.168.123.100:0/2959500270 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7710072470 0x7f771010beb0 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T12:39:51.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.773+0000 7f771515d700 1 --2- 192.168.123.100:0/2959500270 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7710071a90 0x7f7710071ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:51.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.773+0000 7f771515d700 1 -- 192.168.123.100:0/2959500270 >> 192.168.123.100:0/2959500270 conn(0x7f771006d1a0 msgr2=0x7f771006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:51.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.773+0000 7f771515d700 1 -- 192.168.123.100:0/2959500270 shutdown_connections 2026-03-10T12:39:51.775 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.774+0000 7f771515d700 1 -- 192.168.123.100:0/2959500270 wait complete. 2026-03-10T12:39:51.775 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.774+0000 7f771515d700 1 Processor -- start 2026-03-10T12:39:51.776 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.774+0000 7f771515d700 1 -- start start 2026-03-10T12:39:51.776 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.775+0000 7f771515d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7710071a90 0x7f771019c100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:51.777 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.775+0000 7f771515d700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7710072470 0x7f771019c640 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:51.777 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.775+0000 7f771515d700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f771019cc60 con 0x7f7710071a90 2026-03-10T12:39:51.777 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.775+0000 7f771515d700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f771019cda0 con 0x7f7710072470 2026-03-10T12:39:51.777 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.775+0000 7f770ed9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7710071a90 0x7f771019c100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:51.777 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.775+0000 7f770ed9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7710071a90 0x7f771019c100 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:42484/0 (socket says 192.168.123.100:42484) 2026-03-10T12:39:51.780 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.775+0000 7f770ed9d700 1 -- 192.168.123.100:0/2379474813 learned_addr learned my addr 192.168.123.100:0/2379474813 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:39:51.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.775+0000 7f770e59c700 1 --2- 192.168.123.100:0/2379474813 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7710072470 0x7f771019c640 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:51.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.775+0000 7f770ed9d700 1 -- 192.168.123.100:0/2379474813 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7710072470 msgr2=0x7f771019c640 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:51.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.775+0000 7f770ed9d700 1 --2- 
192.168.123.100:0/2379474813 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7710072470 0x7f771019c640 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:51.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.775+0000 7f770ed9d700 1 -- 192.168.123.100:0/2379474813 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f770001a720 con 0x7f7710071a90 2026-03-10T12:39:51.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.776+0000 7f770ed9d700 1 --2- 192.168.123.100:0/2379474813 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7710071a90 0x7f771019c100 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f770001d380 tx=0x7f770001d690 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:51.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.776+0000 7f76f7fff700 1 -- 192.168.123.100:0/2379474813 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7700004020 con 0x7f7710071a90 2026-03-10T12:39:51.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.776+0000 7f76f7fff700 1 -- 192.168.123.100:0/2379474813 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7700004830 con 0x7f7710071a90 2026-03-10T12:39:51.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.776+0000 7f76f7fff700 1 -- 192.168.123.100:0/2379474813 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7700005530 con 0x7f7710071a90 2026-03-10T12:39:51.783 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.776+0000 7f771515d700 1 -- 192.168.123.100:0/2379474813 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f770800b050 con 0x7f7710071a90 2026-03-10T12:39:51.783 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.776+0000 7f771515d700 1 -- 192.168.123.100:0/2379474813 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f77101a1c10 con 0x7f7710071a90 2026-03-10T12:39:51.784 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.779+0000 7f771515d700 1 -- 192.168.123.100:0/2379474813 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7710196390 con 0x7f7710071a90 2026-03-10T12:39:51.784 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.779+0000 7f76f7fff700 1 -- 192.168.123.100:0/2379474813 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f770002f030 con 0x7f7710071a90 2026-03-10T12:39:51.784 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.779+0000 7f76f7fff700 1 --2- 192.168.123.100:0/2379474813 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f76f8077a40 0x7f76f8079ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:51.784 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.780+0000 7f76f7fff700 1 -- 192.168.123.100:0/2379474813 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(48..48 src has 1..48) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f7700025070 con 0x7f7710071a90 2026-03-10T12:39:51.788 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.783+0000 7f76f7fff700 1 -- 192.168.123.100:0/2379474813 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7700073910 con 0x7f7710071a90 2026-03-10T12:39:51.788 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.787+0000 7f770e59c700 1 --2- 192.168.123.100:0/2379474813 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f76f8077a40 0x7f76f8079ef0 
unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:51.788 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.787+0000 7f770e59c700 1 --2- 192.168.123.100:0/2379474813 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f76f8077a40 0x7f76f8079ef0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f77080060b0 tx=0x7f7708006040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:51.930 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.929+0000 7f771515d700 1 -- 192.168.123.100:0/2379474813 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f771004ea50 con 0x7f7710071a90 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.934+0000 7f76f7fff700 1 -- 192.168.123.100:0/2379474813 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1961 (secure 0 0 0) 0x7f7700073060 con 0x7f7710071a90 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:e13 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:39:51.935 
INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:epoch 13 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:37:36.646083+0000 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:max_xattr_size 65536 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307} 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:failed 2026-03-10T12:39:51.935 
INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:39:51.935 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:39:51.936 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1 2026-03-10T12:39:51.936 INFO:teuthology.orchestra.run.vm00.stdout:qdb_cluster leader: 0 members: 2026-03-10T12:39:51.936 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:39:51.936 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:39:51.936 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:39:51.936 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:39:51.936 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons: 2026-03-10T12:39:51.936 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:39:51.936 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:39:51.936 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat 
{c=[1],r=[1],i=[7ff]}] 2026-03-10T12:39:51.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.938+0000 7f76f5ffb700 1 -- 192.168.123.100:0/2379474813 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f76f8077a40 msgr2=0x7f76f8079ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:51.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.938+0000 7f76f5ffb700 1 --2- 192.168.123.100:0/2379474813 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f76f8077a40 0x7f76f8079ef0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f77080060b0 tx=0x7f7708006040 comp rx=0 tx=0).stop 2026-03-10T12:39:51.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.938+0000 7f76f5ffb700 1 -- 192.168.123.100:0/2379474813 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7710071a90 msgr2=0x7f771019c100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:51.938 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.938+0000 7f76f5ffb700 1 --2- 192.168.123.100:0/2379474813 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7710071a90 0x7f771019c100 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f770001d380 tx=0x7f770001d690 comp rx=0 tx=0).stop 2026-03-10T12:39:51.939 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.938+0000 7f76f5ffb700 1 -- 192.168.123.100:0/2379474813 shutdown_connections 2026-03-10T12:39:51.939 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.938+0000 7f76f5ffb700 1 --2- 192.168.123.100:0/2379474813 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f76f8077a40 0x7f76f8079ef0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:51.939 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.938+0000 7f76f5ffb700 1 --2- 192.168.123.100:0/2379474813 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7710071a90 0x7f771019c100 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:51.939 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.938+0000 7f76f5ffb700 1 --2- 192.168.123.100:0/2379474813 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7710072470 0x7f771019c640 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:51.939 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.938+0000 7f76f5ffb700 1 -- 192.168.123.100:0/2379474813 >> 192.168.123.100:0/2379474813 conn(0x7f771006d1a0 msgr2=0x7f771010b3b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:51.939 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.938+0000 7f76f5ffb700 1 -- 192.168.123.100:0/2379474813 shutdown_connections 2026-03-10T12:39:51.939 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:51.938+0000 7f76f5ffb700 1 -- 192.168.123.100:0/2379474813 wait complete. 
2026-03-10T12:39:51.941 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 13 2026-03-10T12:39:52.015 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.014+0000 7f0e470e0700 1 -- 192.168.123.100:0/3954333562 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0e40071a60 msgr2=0x7f0e40071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:52.015 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.014+0000 7f0e470e0700 1 --2- 192.168.123.100:0/3954333562 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0e40071a60 0x7f0e40071e70 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f0e3c009b00 tx=0x7f0e3c009e10 comp rx=0 tx=0).stop 2026-03-10T12:39:52.015 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.015+0000 7f0e470e0700 1 -- 192.168.123.100:0/3954333562 shutdown_connections 2026-03-10T12:39:52.015 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.015+0000 7f0e470e0700 1 --2- 192.168.123.100:0/3954333562 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e40072440 0x7f0e4010be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:52.015 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.015+0000 7f0e470e0700 1 --2- 192.168.123.100:0/3954333562 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0e40071a60 0x7f0e40071e70 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:52.015 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.015+0000 7f0e470e0700 1 -- 192.168.123.100:0/3954333562 >> 192.168.123.100:0/3954333562 conn(0x7f0e4006d1a0 msgr2=0x7f0e4006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:52.015 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.015+0000 7f0e470e0700 1 -- 192.168.123.100:0/3954333562 shutdown_connections 2026-03-10T12:39:52.016 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.015+0000 7f0e470e0700 1 -- 192.168.123.100:0/3954333562 wait complete. 2026-03-10T12:39:52.016 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.015+0000 7f0e470e0700 1 Processor -- start 2026-03-10T12:39:52.016 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.015+0000 7f0e470e0700 1 -- start start 2026-03-10T12:39:52.016 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.016+0000 7f0e470e0700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0e40071a60 0x7f0e401a49b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:52.016 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.016+0000 7f0e470e0700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e40072440 0x7f0e401a4ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:52.016 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.016+0000 7f0e470e0700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e401a5510 con 0x7f0e40071a60 2026-03-10T12:39:52.016 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.016+0000 7f0e470e0700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e401a5650 con 0x7f0e40072440 2026-03-10T12:39:52.016 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.016+0000 7f0e458dd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e40072440 0x7f0e401a4ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:52.016 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.016+0000 7f0e458dd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e40072440 0x7f0e401a4ef0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:54768/0 (socket says 192.168.123.100:54768) 2026-03-10T12:39:52.016 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.016+0000 7f0e458dd700 1 -- 192.168.123.100:0/197478829 learned_addr learned my addr 192.168.123.100:0/197478829 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:39:52.017 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.016+0000 7f0e460de700 1 --2- 192.168.123.100:0/197478829 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0e40071a60 0x7f0e401a49b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:52.017 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.016+0000 7f0e458dd700 1 -- 192.168.123.100:0/197478829 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0e40071a60 msgr2=0x7f0e401a49b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:52.017 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.016+0000 7f0e458dd700 1 --2- 192.168.123.100:0/197478829 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0e40071a60 0x7f0e401a49b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:52.017 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.016+0000 7f0e458dd700 1 -- 192.168.123.100:0/197478829 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0e3c0097e0 con 0x7f0e40072440 2026-03-10T12:39:52.017 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.017+0000 7f0e458dd700 1 --2- 192.168.123.100:0/197478829 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e40072440 0x7f0e401a4ef0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto 
rx=0x7f0e3800c370 tx=0x7f0e3800c730 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:52.018 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.017+0000 7f0e377fe700 1 -- 192.168.123.100:0/197478829 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0e3800e050 con 0x7f0e40072440 2026-03-10T12:39:52.018 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.017+0000 7f0e377fe700 1 -- 192.168.123.100:0/197478829 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0e3800f040 con 0x7f0e40072440 2026-03-10T12:39:52.018 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.018+0000 7f0e377fe700 1 -- 192.168.123.100:0/197478829 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0e38013610 con 0x7f0e40072440 2026-03-10T12:39:52.019 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.018+0000 7f0e470e0700 1 -- 192.168.123.100:0/197478829 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0e4010f5c0 con 0x7f0e40072440 2026-03-10T12:39:52.019 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.018+0000 7f0e470e0700 1 -- 192.168.123.100:0/197478829 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0e4010fa90 con 0x7f0e40072440 2026-03-10T12:39:52.019 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.019+0000 7f0e470e0700 1 -- 192.168.123.100:0/197478829 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0e4019eba0 con 0x7f0e40072440 2026-03-10T12:39:52.021 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.020+0000 7f0e377fe700 1 -- 192.168.123.100:0/197478829 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 
0x7f0e380090d0 con 0x7f0e40072440 2026-03-10T12:39:52.021 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.020+0000 7f0e377fe700 1 --2- 192.168.123.100:0/197478829 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0e2c077820 0x7f0e2c079cd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:52.021 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.021+0000 7f0e377fe700 1 -- 192.168.123.100:0/197478829 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(48..48 src has 1..48) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f0e3809a310 con 0x7f0e40072440 2026-03-10T12:39:52.021 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.021+0000 7f0e460de700 1 --2- 192.168.123.100:0/197478829 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0e2c077820 0x7f0e2c079cd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:52.024 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.024+0000 7f0e377fe700 1 -- 192.168.123.100:0/197478829 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0e380629a0 con 0x7f0e40072440 2026-03-10T12:39:52.025 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.025+0000 7f0e460de700 1 --2- 192.168.123.100:0/197478829 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0e2c077820 0x7f0e2c079cd0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f0e3c00b5c0 tx=0x7f0e3c01a040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:52.162 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.161+0000 7f0e470e0700 1 -- 192.168.123.100:0/197478829 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- 
mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f0e40061190 con 0x7f0e2c077820
2026-03-10T12:39:52.163 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.163+0000 7f0e377fe700 1 -- 192.168.123.100:0/197478829 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f0e40061190 con 0x7f0e2c077820
2026-03-10T12:39:52.163 INFO:teuthology.orchestra.run.vm00.stdout:{
2026-03-10T12:39:52.163 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-10T12:39:52.163 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true,
2026-03-10T12:39:52.163 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-10T12:39:52.163 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [
2026-03-10T12:39:52.163 INFO:teuthology.orchestra.run.vm00.stdout: "crash",
2026-03-10T12:39:52.163 INFO:teuthology.orchestra.run.vm00.stdout: "mgr",
2026-03-10T12:39:52.163 INFO:teuthology.orchestra.run.vm00.stdout: "mon"
2026-03-10T12:39:52.163 INFO:teuthology.orchestra.run.vm00.stdout: ],
2026-03-10T12:39:52.163 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "7/23 daemons upgraded",
2026-03-10T12:39:52.163 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Currently upgrading osd daemons",
2026-03-10T12:39:52.164 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false
2026-03-10T12:39:52.164 INFO:teuthology.orchestra.run.vm00.stdout:}
2026-03-10T12:39:52.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.166+0000 7f0e357ba700 1 -- 192.168.123.100:0/197478829 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0e2c077820 msgr2=0x7f0e2c079cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:39:52.166
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.166+0000 7f0e357ba700 1 --2- 192.168.123.100:0/197478829 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0e2c077820 0x7f0e2c079cd0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f0e3c00b5c0 tx=0x7f0e3c01a040 comp rx=0 tx=0).stop 2026-03-10T12:39:52.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.166+0000 7f0e357ba700 1 -- 192.168.123.100:0/197478829 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e40072440 msgr2=0x7f0e401a4ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:52.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.166+0000 7f0e357ba700 1 --2- 192.168.123.100:0/197478829 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e40072440 0x7f0e401a4ef0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f0e3800c370 tx=0x7f0e3800c730 comp rx=0 tx=0).stop 2026-03-10T12:39:52.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.166+0000 7f0e357ba700 1 -- 192.168.123.100:0/197478829 shutdown_connections 2026-03-10T12:39:52.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.166+0000 7f0e357ba700 1 --2- 192.168.123.100:0/197478829 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0e2c077820 0x7f0e2c079cd0 secure :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f0e3c00b5c0 tx=0x7f0e3c01a040 comp rx=0 tx=0).stop 2026-03-10T12:39:52.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.166+0000 7f0e357ba700 1 --2- 192.168.123.100:0/197478829 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0e40071a60 0x7f0e401a49b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:52.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.166+0000 7f0e357ba700 1 --2- 192.168.123.100:0/197478829 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0e40072440 0x7f0e401a4ef0 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:52.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.166+0000 7f0e357ba700 1 -- 192.168.123.100:0/197478829 >> 192.168.123.100:0/197478829 conn(0x7f0e4006d1a0 msgr2=0x7f0e4010a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:52.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.167+0000 7f0e357ba700 1 -- 192.168.123.100:0/197478829 shutdown_connections 2026-03-10T12:39:52.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.167+0000 7f0e357ba700 1 -- 192.168.123.100:0/197478829 wait complete. 2026-03-10T12:39:52.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.233+0000 7f8ed22d5700 1 -- 192.168.123.100:0/1792734504 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8ecc103940 msgr2=0x7f8ecc103d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:52.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.233+0000 7f8ed22d5700 1 --2- 192.168.123.100:0/1792734504 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8ecc103940 0x7f8ecc103d90 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f8ec0009a60 tx=0x7f8ec0009d70 comp rx=0 tx=0).stop 2026-03-10T12:39:52.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.238+0000 7f8ed22d5700 1 -- 192.168.123.100:0/1792734504 shutdown_connections 2026-03-10T12:39:52.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.238+0000 7f8ed22d5700 1 --2- 192.168.123.100:0/1792734504 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8ecc103940 0x7f8ecc103d90 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:52.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.238+0000 7f8ed22d5700 1 --2- 
192.168.123.100:0/1792734504 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8ecc102740 0x7f8ecc102b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:52.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.238+0000 7f8ed22d5700 1 -- 192.168.123.100:0/1792734504 >> 192.168.123.100:0/1792734504 conn(0x7f8ecc0fdcf0 msgr2=0x7f8ecc100120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:52.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.238+0000 7f8ed22d5700 1 -- 192.168.123.100:0/1792734504 shutdown_connections 2026-03-10T12:39:52.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.238+0000 7f8ed22d5700 1 -- 192.168.123.100:0/1792734504 wait complete. 2026-03-10T12:39:52.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.239+0000 7f8ed22d5700 1 Processor -- start 2026-03-10T12:39:52.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.239+0000 7f8ed22d5700 1 -- start start 2026-03-10T12:39:52.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.239+0000 7f8ed22d5700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8ecc102740 0x7f8ecc197f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:52.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.239+0000 7f8ed22d5700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8ecc103940 0x7f8ecc1984d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:52.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.239+0000 7f8ed22d5700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ecc198af0 con 0x7f8ecc102740 2026-03-10T12:39:52.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.239+0000 7f8ed22d5700 1 -- --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ecc198c30 con 0x7f8ecc103940 2026-03-10T12:39:52.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.239+0000 7f8ecb7fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8ecc103940 0x7f8ecc1984d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:52.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.240+0000 7f8ecb7fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8ecc103940 0x7f8ecc1984d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:54794/0 (socket says 192.168.123.100:54794) 2026-03-10T12:39:52.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.240+0000 7f8ecb7fe700 1 -- 192.168.123.100:0/4185488292 learned_addr learned my addr 192.168.123.100:0/4185488292 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:39:52.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.240+0000 7f8ecbfff700 1 --2- 192.168.123.100:0/4185488292 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8ecc102740 0x7f8ecc197f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:52.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.240+0000 7f8ecbfff700 1 -- 192.168.123.100:0/4185488292 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8ecc103940 msgr2=0x7f8ecc1984d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:52.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.240+0000 7f8ecbfff700 1 --2- 192.168.123.100:0/4185488292 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f8ecc103940 0x7f8ecc1984d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:52.241 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.240+0000 7f8ecbfff700 1 -- 192.168.123.100:0/4185488292 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8ebc0097e0 con 0x7f8ecc102740 2026-03-10T12:39:52.241 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.240+0000 7f8ecbfff700 1 --2- 192.168.123.100:0/4185488292 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8ecc102740 0x7f8ecc197f90 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f8ebc005fd0 tx=0x7f8ebc00c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:52.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.368+0000 7f8ec97fa700 1 -- 192.168.123.100:0/4185488292 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8ebc00ce30 con 0x7f8ecc102740 2026-03-10T12:39:52.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.368+0000 7f8ec97fa700 1 -- 192.168.123.100:0/4185488292 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8ebc010910 con 0x7f8ecc102740 2026-03-10T12:39:52.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.368+0000 7f8ec97fa700 1 -- 192.168.123.100:0/4185488292 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8ebc018a50 con 0x7f8ecc102740 2026-03-10T12:39:52.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.368+0000 7f8ed22d5700 1 -- 192.168.123.100:0/4185488292 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8ec0009710 con 0x7f8ecc102740 2026-03-10T12:39:52.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.368+0000 7f8ed22d5700 1 -- 
192.168.123.100:0/4185488292 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8ecc19daa0 con 0x7f8ecc102740 2026-03-10T12:39:52.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.372+0000 7f8ec97fa700 1 -- 192.168.123.100:0/4185488292 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f8ebc018c50 con 0x7f8ecc102740 2026-03-10T12:39:52.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.372+0000 7f8ec97fa700 1 --2- 192.168.123.100:0/4185488292 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8eb4077b10 0x7f8eb4079fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:39:52.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.373+0000 7f8ec97fa700 1 -- 192.168.123.100:0/4185488292 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(48..48 src has 1..48) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f8ebc014070 con 0x7f8ecc102740 2026-03-10T12:39:52.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.373+0000 7f8ed22d5700 1 -- 192.168.123.100:0/4185488292 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8eb8005320 con 0x7f8ecc102740 2026-03-10T12:39:52.376 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.376+0000 7f8ec97fa700 1 -- 192.168.123.100:0/4185488292 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8ebc062990 con 0x7f8ecc102740 2026-03-10T12:39:52.376 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.376+0000 7f8ecb7fe700 1 --2- 192.168.123.100:0/4185488292 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8eb4077b10 0x7f8eb4079fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:39:52.377 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.376+0000 7f8ecb7fe700 1 --2- 192.168.123.100:0/4185488292 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8eb4077b10 0x7f8eb4079fc0 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f8ec0003820 tx=0x7f8ec000b540 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:39:52.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.577+0000 7f8ed22d5700 1 -- 192.168.123.100:0/4185488292 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f8eb8005190 con 0x7f8ecc102740 2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.580+0000 7f8ec97fa700 1 -- 192.168.123.100:0/4185488292 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+2025 (secure 0 0 0) 0x7f8ebc0620e0 con 0x7f8ecc102740 2026-03-10T12:39:52.581 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:52 vm00.local ceph-mon[103263]: from='client.34158 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:52.581 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:52 vm00.local ceph-mon[103263]: pgmap v31: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 285 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 16 KiB/s rd, 804 KiB/s wr, 297 op/s; 3355/22770 objects degraded (14.734%) 2026-03-10T12:39:52.581 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:52 vm00.local ceph-mon[103263]: from='client.44129 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:52.581 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:52 vm00.local 
ceph-mon[103263]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T12:39:52.581 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:52 vm00.local ceph-mon[103263]: from='client.44133 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:52.581 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:52 vm00.local ceph-mon[103263]: osd.0 [v2:192.168.123.100:6802/105866120,v1:192.168.123.100:6803/105866120] boot 2026-03-10T12:39:52.581 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:52 vm00.local ceph-mon[103263]: osdmap e48: 6 total, 6 up, 6 in 2026-03-10T12:39:52.581 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:52 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T12:39:52.581 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:52 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/1232661304' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:52.581 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:52 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/2379474813' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_WARN Degraded data redundancy: 3355/22770 objects degraded (14.734%), 33 pgs degraded
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 3355/22770 objects degraded (14.734%), 33 pgs degraded
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 1.0 is active+undersized+degraded, acting [3,1]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.0 is active+undersized+degraded, acting [3,1]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.1 is active+undersized+degraded, acting [2,1]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.2 is active+undersized+degraded, acting [5,1]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.4 is active+undersized+degraded, acting [1,4]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.5 is active+undersized+degraded, acting [3,4]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.8 is active+undersized+degraded, acting [3,5]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.9 is active+undersized+degraded, acting [1,4]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.e is active+undersized+degraded, acting [2,3]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.f is active+undersized+degraded, acting [4,5]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.10 is active+undersized+degraded, acting [2,1]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.12 is active+undersized+degraded, acting [3,1]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.13 is active+undersized+degraded, acting [4,2]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.15 is active+undersized+degraded, acting [1,3]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.19 is active+undersized+degraded, acting [4,2]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.1b is active+undersized+degraded, acting [1,5]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.1d is active+undersized+degraded, acting [3,5]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.1e is active+undersized+degraded, acting [2,5]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.1f is active+undersized+degraded, acting [3,4]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.1 is active+undersized+degraded, acting [4,2]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.3 is active+undersized+degraded, acting [4,3]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.6 is active+undersized+degraded, acting [1,4]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.b is active+undersized+degraded, acting [1,4]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.c is active+undersized+degraded, acting [5,3]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.f is active+undersized+degraded, acting [5,3]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.10 is active+undersized+degraded, acting [5,1]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.11 is active+undersized+degraded, acting [3,4]
2026-03-10T12:39:52.581 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.12 is active+undersized+degraded, acting [3,1]
2026-03-10T12:39:52.582 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.15 is active+undersized+degraded, acting [3,4]
2026-03-10T12:39:52.582 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.17 is active+undersized+degraded, acting [5,2]
2026-03-10T12:39:52.582 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.18 is active+undersized+degraded, acting [2,1] 2026-03-10T12:39:52.582 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.1b is active+undersized+degraded, acting [4,3] 2026-03-10T12:39:52.582 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.1f is active+undersized+degraded, acting [3,2] 2026-03-10T12:39:52.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.584+0000 7f8eb2ffd700 1 -- 192.168.123.100:0/4185488292 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8eb4077b10 msgr2=0x7f8eb4079fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:52.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.584+0000 7f8eb2ffd700 1 --2- 192.168.123.100:0/4185488292 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8eb4077b10 0x7f8eb4079fc0 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f8ec0003820 tx=0x7f8ec000b540 comp rx=0 tx=0).stop 2026-03-10T12:39:52.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.584+0000 7f8eb2ffd700 1 -- 192.168.123.100:0/4185488292 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8ecc102740 msgr2=0x7f8ecc197f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:39:52.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.584+0000 7f8eb2ffd700 1 --2- 192.168.123.100:0/4185488292 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8ecc102740 0x7f8ecc197f90 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f8ebc005fd0 tx=0x7f8ebc00c5b0 comp rx=0 tx=0).stop 2026-03-10T12:39:52.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.584+0000 7f8eb2ffd700 1 -- 192.168.123.100:0/4185488292 shutdown_connections 2026-03-10T12:39:52.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.584+0000 7f8eb2ffd700 1 --2- 192.168.123.100:0/4185488292 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8eb4077b10 0x7f8eb4079fc0 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:52.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.584+0000 7f8eb2ffd700 1 --2- 192.168.123.100:0/4185488292 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8ecc102740 0x7f8ecc197f90 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:52.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.584+0000 7f8eb2ffd700 1 --2- 192.168.123.100:0/4185488292 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8ecc103940 0x7f8ecc1984d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:39:52.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.584+0000 7f8eb2ffd700 1 -- 192.168.123.100:0/4185488292 >> 192.168.123.100:0/4185488292 conn(0x7f8ecc0fdcf0 msgr2=0x7f8ecc106b70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:39:52.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.585+0000 7f8eb2ffd700 1 -- 192.168.123.100:0/4185488292 shutdown_connections 2026-03-10T12:39:52.585 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:39:52.585+0000 7f8eb2ffd700 1 -- 192.168.123.100:0/4185488292 wait complete. 
2026-03-10T12:39:52.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:52 vm07.local ceph-mon[93622]: from='client.34158 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:52.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:52 vm07.local ceph-mon[93622]: pgmap v31: 65 pgs: 1 active+undersized, 33 active+undersized+degraded, 31 active+clean; 285 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 16 KiB/s rd, 804 KiB/s wr, 297 op/s; 3355/22770 objects degraded (14.734%) 2026-03-10T12:39:52.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:52 vm07.local ceph-mon[93622]: from='client.44129 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:52.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:52 vm07.local ceph-mon[93622]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T12:39:52.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:52 vm07.local ceph-mon[93622]: from='client.44133 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:52.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:52 vm07.local ceph-mon[93622]: osd.0 [v2:192.168.123.100:6802/105866120,v1:192.168.123.100:6803/105866120] boot 2026-03-10T12:39:52.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:52 vm07.local ceph-mon[93622]: osdmap e48: 6 total, 6 up, 6 in 2026-03-10T12:39:52.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:52 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T12:39:52.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:52 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/1232661304' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:39:52.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:52 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/2379474813' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:39:53.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:53 vm00.local ceph-mon[103263]: from='client.44143 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:53.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:53 vm00.local ceph-mon[103263]: osdmap e49: 6 total, 6 up, 6 in 2026-03-10T12:39:53.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:53 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/4185488292' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:39:53.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:53 vm00.local ceph-mon[103263]: pgmap v34: 65 pgs: 25 peering, 9 active+undersized+degraded, 31 active+clean; 287 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.6 MiB/s wr, 383 op/s; 1410/20505 objects degraded (6.876%) 2026-03-10T12:39:53.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:53 vm00.local ceph-mon[103263]: osdmap e50: 6 total, 6 up, 6 in 2026-03-10T12:39:53.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:53 vm07.local ceph-mon[93622]: from='client.44143 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:39:53.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:53 vm07.local ceph-mon[93622]: osdmap e49: 6 total, 6 up, 6 in 2026-03-10T12:39:53.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:53 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/4185488292' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:39:53.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:53 vm07.local ceph-mon[93622]: pgmap v34: 65 pgs: 25 peering, 9 active+undersized+degraded, 31 active+clean; 287 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.6 MiB/s wr, 383 op/s; 1410/20505 objects degraded (6.876%) 2026-03-10T12:39:53.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:53 vm07.local ceph-mon[93622]: osdmap e50: 6 total, 6 up, 6 in 2026-03-10T12:39:55.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:54 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 1410/20505 objects degraded (6.876%), 9 pgs degraded (PG_DEGRADED) 2026-03-10T12:39:55.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:54 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 1410/20505 objects degraded (6.876%), 9 pgs degraded (PG_DEGRADED) 2026-03-10T12:39:56.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:55 vm00.local ceph-mon[103263]: pgmap v36: 65 pgs: 25 peering, 9 active+undersized+degraded, 31 active+clean; 287 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 1.1 MiB/s wr, 230 op/s; 1410/20505 objects degraded (6.876%) 2026-03-10T12:39:56.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:55 vm07.local ceph-mon[93622]: pgmap v36: 65 pgs: 25 peering, 9 active+undersized+degraded, 31 active+clean; 287 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 1.1 MiB/s wr, 230 op/s; 1410/20505 objects degraded (6.876%) 2026-03-10T12:39:57.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:57 vm07.local ceph-mon[93622]: osdmap e51: 6 total, 6 up, 6 in 2026-03-10T12:39:57.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:57 vm07.local ceph-mon[93622]: pgmap v38: 65 pgs: 5 active+recovery_wait+degraded, 25 peering, 1 active+recovering, 34 active+clean; 
286 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 1.0 MiB/s wr, 259 op/s; 358/19701 objects degraded (1.817%); 19 KiB/s, 2 objects/s recovering 2026-03-10T12:39:57.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:57 vm00.local ceph-mon[103263]: osdmap e51: 6 total, 6 up, 6 in 2026-03-10T12:39:57.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:57 vm00.local ceph-mon[103263]: pgmap v38: 65 pgs: 5 active+recovery_wait+degraded, 25 peering, 1 active+recovering, 34 active+clean; 286 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 1.0 MiB/s wr, 259 op/s; 358/19701 objects degraded (1.817%); 19 KiB/s, 2 objects/s recovering 2026-03-10T12:39:58.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:58 vm00.local ceph-mon[103263]: osdmap e52: 6 total, 6 up, 6 in 2026-03-10T12:39:58.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:58 vm07.local ceph-mon[93622]: osdmap e52: 6 total, 6 up, 6 in 2026-03-10T12:39:59.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:59 vm00.local ceph-mon[103263]: osdmap e53: 6 total, 6 up, 6 in 2026-03-10T12:39:59.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:59 vm00.local ceph-mon[103263]: pgmap v41: 65 pgs: 1 active+undersized+remapped, 12 active+recovery_wait+degraded, 2 peering, 2 active+recovering, 48 active+clean; 287 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 19 KiB/s rd, 564 KiB/s wr, 214 op/s; 874/17808 objects degraded (4.908%); 2.1 MiB/s, 7 objects/s recovering 2026-03-10T12:39:59.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:39:59 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 358/19701 objects degraded (1.817%), 5 pgs degraded (PG_DEGRADED) 2026-03-10T12:39:59.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:59 vm07.local ceph-mon[93622]: osdmap e53: 6 total, 6 up, 6 in 2026-03-10T12:39:59.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:59 vm07.local ceph-mon[93622]: pgmap v41: 65 
pgs: 1 active+undersized+remapped, 12 active+recovery_wait+degraded, 2 peering, 2 active+recovering, 48 active+clean; 287 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 19 KiB/s rd, 564 KiB/s wr, 214 op/s; 874/17808 objects degraded (4.908%); 2.1 MiB/s, 7 objects/s recovering 2026-03-10T12:39:59.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:39:59 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 358/19701 objects degraded (1.817%), 5 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:00.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:00 vm00.local ceph-mon[103263]: Health detail: HEALTH_WARN Degraded data redundancy: 874/17808 objects degraded (4.908%), 12 pgs degraded 2026-03-10T12:40:00.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:00 vm00.local ceph-mon[103263]: [WRN] PG_DEGRADED: Degraded data redundancy: 874/17808 objects degraded (4.908%), 12 pgs degraded 2026-03-10T12:40:00.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:00 vm00.local ceph-mon[103263]: pg 3.3 is active+recovery_wait+degraded, acting [4,0,3] 2026-03-10T12:40:00.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:00 vm00.local ceph-mon[103263]: pg 3.6 is active+recovery_wait+degraded, acting [0,1,4] 2026-03-10T12:40:00.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:00 vm00.local ceph-mon[103263]: pg 3.b is active+recovery_wait+degraded, acting [1,0,4] 2026-03-10T12:40:00.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:00 vm00.local ceph-mon[103263]: pg 3.c is active+recovery_wait+degraded, acting [5,0,3] 2026-03-10T12:40:00.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:00 vm00.local ceph-mon[103263]: pg 3.10 is active+recovery_wait+degraded, acting [5,0,1] 2026-03-10T12:40:00.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:00 vm00.local ceph-mon[103263]: pg 3.11 is active+recovery_wait+degraded, acting [3,4,0] 2026-03-10T12:40:00.734 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:00 vm00.local ceph-mon[103263]: pg 3.12 is active+recovery_wait+degraded, acting [0,3,1] 2026-03-10T12:40:00.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:00 vm00.local ceph-mon[103263]: pg 3.15 is active+recovery_wait+degraded, acting [3,0,4] 2026-03-10T12:40:00.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:00 vm00.local ceph-mon[103263]: pg 3.17 is active+recovery_wait+degraded, acting [0,5,2] 2026-03-10T12:40:00.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:00 vm00.local ceph-mon[103263]: pg 3.18 is active+recovery_wait+degraded, acting [2,0,1] 2026-03-10T12:40:00.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:00 vm00.local ceph-mon[103263]: pg 3.1b is active+recovery_wait+degraded, acting [0,4,3] 2026-03-10T12:40:00.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:00 vm00.local ceph-mon[103263]: pg 3.1f is active+recovery_wait+degraded, acting [0,3,2] 2026-03-10T12:40:00.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:00 vm07.local ceph-mon[93622]: Health detail: HEALTH_WARN Degraded data redundancy: 874/17808 objects degraded (4.908%), 12 pgs degraded 2026-03-10T12:40:00.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:00 vm07.local ceph-mon[93622]: [WRN] PG_DEGRADED: Degraded data redundancy: 874/17808 objects degraded (4.908%), 12 pgs degraded 2026-03-10T12:40:00.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:00 vm07.local ceph-mon[93622]: pg 3.3 is active+recovery_wait+degraded, acting [4,0,3] 2026-03-10T12:40:00.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:00 vm07.local ceph-mon[93622]: pg 3.6 is active+recovery_wait+degraded, acting [0,1,4] 2026-03-10T12:40:00.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:00 vm07.local ceph-mon[93622]: pg 3.b is active+recovery_wait+degraded, acting [1,0,4] 2026-03-10T12:40:00.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:00 vm07.local 
ceph-mon[93622]: pg 3.c is active+recovery_wait+degraded, acting [5,0,3] 2026-03-10T12:40:00.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:00 vm07.local ceph-mon[93622]: pg 3.10 is active+recovery_wait+degraded, acting [5,0,1] 2026-03-10T12:40:00.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:00 vm07.local ceph-mon[93622]: pg 3.11 is active+recovery_wait+degraded, acting [3,4,0] 2026-03-10T12:40:00.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:00 vm07.local ceph-mon[93622]: pg 3.12 is active+recovery_wait+degraded, acting [0,3,1] 2026-03-10T12:40:00.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:00 vm07.local ceph-mon[93622]: pg 3.15 is active+recovery_wait+degraded, acting [3,0,4] 2026-03-10T12:40:00.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:00 vm07.local ceph-mon[93622]: pg 3.17 is active+recovery_wait+degraded, acting [0,5,2] 2026-03-10T12:40:00.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:00 vm07.local ceph-mon[93622]: pg 3.18 is active+recovery_wait+degraded, acting [2,0,1] 2026-03-10T12:40:00.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:00 vm07.local ceph-mon[93622]: pg 3.1b is active+recovery_wait+degraded, acting [0,4,3] 2026-03-10T12:40:00.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:00 vm07.local ceph-mon[93622]: pg 3.1f is active+recovery_wait+degraded, acting [0,3,2] 2026-03-10T12:40:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:01 vm07.local ceph-mon[93622]: pgmap v42: 65 pgs: 1 active+undersized+remapped, 12 active+recovery_wait+degraded, 2 peering, 2 active+recovering, 48 active+clean; 287 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 23 KiB/s rd, 496 KiB/s wr, 212 op/s; 874/17367 objects degraded (5.033%); 1.8 MiB/s, 10 objects/s recovering 2026-03-10T12:40:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 
2026-03-10T12:40:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:40:02.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:01 vm00.local ceph-mon[103263]: pgmap v42: 65 pgs: 1 active+undersized+remapped, 12 active+recovery_wait+degraded, 2 peering, 2 active+recovering, 48 active+clean; 287 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 23 KiB/s rd, 496 KiB/s wr, 212 op/s; 874/17367 objects degraded (5.033%); 1.8 MiB/s, 10 objects/s recovering 2026-03-10T12:40:02.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:40:02.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:40:03.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:03 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:03.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:03 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:04.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:04 vm00.local ceph-mon[103263]: pgmap v43: 65 pgs: 12 active+recovery_wait+degraded, 2 active+recovering, 51 active+clean; 285 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.1 MiB/s wr, 338 op/s; 874/14526 objects degraded (6.017%); 1.6 MiB/s, 12 objects/s recovering 2026-03-10T12:40:04.484 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:04 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:04.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:04 vm00.local ceph-mon[103263]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline) 2026-03-10T12:40:04.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:04 vm07.local ceph-mon[93622]: pgmap v43: 65 pgs: 12 active+recovery_wait+degraded, 2 active+recovering, 51 active+clean; 285 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.1 MiB/s wr, 338 op/s; 874/14526 objects degraded (6.017%); 1.6 MiB/s, 12 objects/s recovering 2026-03-10T12:40:04.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:04 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:04.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:04 vm07.local ceph-mon[93622]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline) 2026-03-10T12:40:05.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:05 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 874/14526 objects degraded (6.017%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:05.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:05 vm00.local ceph-mon[103263]: pgmap v44: 65 pgs: 12 active+recovery_wait+degraded, 2 active+recovering, 51 active+clean; 285 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 975 KiB/s wr, 285 op/s; 874/14526 objects degraded (6.017%); 1.4 MiB/s, 10 objects/s recovering 2026-03-10T12:40:05.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:05 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 874/14526 objects degraded (6.017%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:05.816 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:05 vm07.local ceph-mon[93622]: pgmap v44: 65 pgs: 12 active+recovery_wait+degraded, 2 active+recovering, 51 active+clean; 285 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 975 KiB/s wr, 285 op/s; 874/14526 objects degraded (6.017%); 1.4 MiB/s, 10 objects/s recovering 2026-03-10T12:40:08.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:08 vm07.local ceph-mon[93622]: pgmap v45: 65 pgs: 12 active+recovery_wait+degraded, 2 active+recovering, 51 active+clean; 284 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 21 KiB/s rd, 519 KiB/s wr, 176 op/s; 874/13842 objects degraded (6.314%); 0 B/s, 10 objects/s recovering 2026-03-10T12:40:08.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:08 vm00.local ceph-mon[103263]: pgmap v45: 65 pgs: 12 active+recovery_wait+degraded, 2 active+recovering, 51 active+clean; 284 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 21 KiB/s rd, 519 KiB/s wr, 176 op/s; 874/13842 objects degraded (6.314%); 0 B/s, 10 objects/s recovering 2026-03-10T12:40:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:09 vm07.local ceph-mon[93622]: pgmap v46: 65 pgs: 11 active+recovery_wait+degraded, 2 active+recovering, 52 active+clean; 284 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.1 MiB/s wr, 307 op/s; 813/10950 objects degraded (7.425%); 0 B/s, 13 objects/s recovering 2026-03-10T12:40:09.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:09 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 874/13842 objects degraded (6.314%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:09.711 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:09 vm00.local ceph-mon[103263]: pgmap v46: 65 pgs: 11 active+recovery_wait+degraded, 2 active+recovering, 52 active+clean; 284 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.1 MiB/s wr, 307 op/s; 813/10950 objects degraded (7.425%); 0 B/s, 13 objects/s 
recovering 2026-03-10T12:40:09.711 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:09 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 874/13842 objects degraded (6.314%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:11 vm00.local ceph-mon[103263]: pgmap v47: 65 pgs: 11 active+recovery_wait+degraded, 1 active+recovering, 53 active+clean; 283 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 30 KiB/s rd, 951 KiB/s wr, 288 op/s; 813/10296 objects degraded (7.896%); 0 B/s, 13 objects/s recovering 2026-03-10T12:40:12.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:11 vm07.local ceph-mon[93622]: pgmap v47: 65 pgs: 11 active+recovery_wait+degraded, 1 active+recovering, 53 active+clean; 283 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 30 KiB/s rd, 951 KiB/s wr, 288 op/s; 813/10296 objects degraded (7.896%); 0 B/s, 13 objects/s recovering 2026-03-10T12:40:13.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:13 vm07.local ceph-mon[93622]: pgmap v48: 65 pgs: 11 active+recovery_wait+degraded, 1 active+recovering, 53 active+clean; 280 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1.4 MiB/s wr, 386 op/s; 813/7953 objects degraded (10.223%); 0 B/s, 14 objects/s recovering 2026-03-10T12:40:13.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:13 vm00.local ceph-mon[103263]: pgmap v48: 65 pgs: 11 active+recovery_wait+degraded, 1 active+recovering, 53 active+clean; 280 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1.4 MiB/s wr, 386 op/s; 813/7953 objects degraded (10.223%); 0 B/s, 14 objects/s recovering 2026-03-10T12:40:14.513 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.1... 
2026-03-10T12:40:14.513 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.1 /home/ubuntu/cephtest/clone.client.1 2026-03-10T12:40:14.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:14 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 813/7953 objects degraded (10.223%), 11 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:14.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:14 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 813/7953 objects degraded (10.223%), 11 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:14.936 DEBUG:teuthology.parallel:result is None 2026-03-10T12:40:15.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:15 vm00.local ceph-mon[103263]: pgmap v49: 65 pgs: 11 active+recovery_wait+degraded, 1 active+recovering, 53 active+clean; 280 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 999 KiB/s wr, 278 op/s; 813/7953 objects degraded (10.223%); 0 B/s, 11 objects/s recovering 2026-03-10T12:40:15.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:15 vm07.local ceph-mon[93622]: pgmap v49: 65 pgs: 11 active+recovery_wait+degraded, 1 active+recovering, 53 active+clean; 280 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 999 KiB/s wr, 278 op/s; 813/7953 objects degraded (10.223%); 0 B/s, 11 objects/s recovering 2026-03-10T12:40:16.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:40:16.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:40:17.747 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:17 vm00.local ceph-mon[103263]: 
pgmap v50: 65 pgs: 10 active+recovery_wait+degraded, 2 active+recovering, 53 active+clean; 279 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1005 KiB/s wr, 296 op/s; 727/7362 objects degraded (9.875%); 0 B/s, 12 objects/s recovering 2026-03-10T12:40:17.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:17 vm07.local ceph-mon[93622]: pgmap v50: 65 pgs: 10 active+recovery_wait+degraded, 2 active+recovering, 53 active+clean; 279 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1005 KiB/s wr, 296 op/s; 727/7362 objects degraded (9.875%); 0 B/s, 12 objects/s recovering 2026-03-10T12:40:18.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:18 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:18.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:18 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:18.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:18 vm07.local ceph-mon[93622]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline) 2026-03-10T12:40:18.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:18 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:18.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:18 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:18.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:18 vm00.local ceph-mon[103263]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline) 2026-03-10T12:40:19.553 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.0... 2026-03-10T12:40:19.553 DEBUG:teuthology.orchestra.run.vm00:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0 2026-03-10T12:40:19.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:19 vm07.local ceph-mon[93622]: pgmap v51: 65 pgs: 9 active+recovery_wait+degraded, 2 active+recovering, 54 active+clean; 277 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1.4 MiB/s wr, 410 op/s; 645/3870 objects degraded (16.667%); 0 B/s, 12 objects/s recovering 2026-03-10T12:40:19.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:19 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 727/7362 objects degraded (9.875%), 10 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:19.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:19 vm00.local ceph-mon[103263]: pgmap v51: 65 pgs: 9 active+recovery_wait+degraded, 2 active+recovering, 54 active+clean; 277 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1.4 MiB/s wr, 410 op/s; 645/3870 objects degraded (16.667%); 0 B/s, 12 objects/s recovering 2026-03-10T12:40:19.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:19 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 727/7362 objects degraded (9.875%), 10 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:19.992 DEBUG:teuthology.parallel:result is None 2026-03-10T12:40:19.992 DEBUG:teuthology.orchestra.run.vm00:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0 2026-03-10T12:40:20.046 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0 2026-03-10T12:40:20.046 
DEBUG:teuthology.orchestra.run.vm07:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1 2026-03-10T12:40:20.080 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.1/client.1 2026-03-10T12:40:20.080 DEBUG:teuthology.parallel:result is None 2026-03-10T12:40:22.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:21 vm07.local ceph-mon[93622]: pgmap v52: 65 pgs: 9 active+recovery_wait+degraded, 2 active+recovering, 54 active+clean; 274 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 948 KiB/s wr, 306 op/s; 645/3144 objects degraded (20.515%); 0 B/s, 12 objects/s recovering 2026-03-10T12:40:22.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:21 vm00.local ceph-mon[103263]: pgmap v52: 65 pgs: 9 active+recovery_wait+degraded, 2 active+recovering, 54 active+clean; 274 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 35 KiB/s rd, 948 KiB/s wr, 306 op/s; 645/3144 objects degraded (20.515%); 0 B/s, 12 objects/s recovering 2026-03-10T12:40:22.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.662+0000 7f64d3c8a700 1 -- 192.168.123.100:0/3010925534 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f64cc103a00 msgr2=0x7f64cc103e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:22.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.662+0000 7f64d3c8a700 1 --2- 192.168.123.100:0/3010925534 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f64cc103a00 0x7f64cc103e70 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f64bc009b00 tx=0x7f64bc009e10 comp rx=0 tx=0).stop 2026-03-10T12:40:22.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.663+0000 7f64d3c8a700 1 -- 192.168.123.100:0/3010925534 shutdown_connections 2026-03-10T12:40:22.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.663+0000 7f64d3c8a700 1 --2- 192.168.123.100:0/3010925534 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f64cc103a00 
0x7f64cc103e70 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:22.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.663+0000 7f64d3c8a700 1 --2- 192.168.123.100:0/3010925534 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64cc102760 0x7f64cc102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:22.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.663+0000 7f64d3c8a700 1 -- 192.168.123.100:0/3010925534 >> 192.168.123.100:0/3010925534 conn(0x7f64cc0fddb0 msgr2=0x7f64cc1001e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:22.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.664+0000 7f64d3c8a700 1 -- 192.168.123.100:0/3010925534 shutdown_connections 2026-03-10T12:40:22.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.664+0000 7f64d3c8a700 1 -- 192.168.123.100:0/3010925534 wait complete. 2026-03-10T12:40:22.665 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.664+0000 7f64d3c8a700 1 Processor -- start 2026-03-10T12:40:22.665 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.665+0000 7f64d3c8a700 1 -- start start 2026-03-10T12:40:22.665 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.665+0000 7f64d3c8a700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64cc102760 0x7f64cc197ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:22.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.665+0000 7f64d3c8a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f64cc103a00 0x7f64cc198530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:22.666 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.665+0000 7f64d3c8a700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7f64cc198ac0 con 0x7f64cc103a00 2026-03-10T12:40:22.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.665+0000 7f64d3c8a700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f64cc198c00 con 0x7f64cc102760 2026-03-10T12:40:22.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.665+0000 7f64d1225700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f64cc103a00 0x7f64cc198530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:22.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.665+0000 7f64d1225700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f64cc103a00 0x7f64cc198530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:40560/0 (socket says 192.168.123.100:40560) 2026-03-10T12:40:22.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.665+0000 7f64d1225700 1 -- 192.168.123.100:0/2005932734 learned_addr learned my addr 192.168.123.100:0/2005932734 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:40:22.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.665+0000 7f64d1225700 1 -- 192.168.123.100:0/2005932734 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64cc102760 msgr2=0x7f64cc197ff0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T12:40:22.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.665+0000 7f64d1225700 1 --2- 192.168.123.100:0/2005932734 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64cc102760 0x7f64cc197ff0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:22.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.665+0000 
7f64d1225700 1 -- 192.168.123.100:0/2005932734 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f64bc0097e0 con 0x7f64cc103a00 2026-03-10T12:40:22.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.665+0000 7f64d1225700 1 --2- 192.168.123.100:0/2005932734 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f64cc103a00 0x7f64cc198530 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f64bc0049c0 tx=0x7f64bc004aa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:22.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.666+0000 7f64c2ffd700 1 -- 192.168.123.100:0/2005932734 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f64bc01d070 con 0x7f64cc103a00 2026-03-10T12:40:22.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.666+0000 7f64c2ffd700 1 -- 192.168.123.100:0/2005932734 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f64bc00bd10 con 0x7f64cc103a00 2026-03-10T12:40:22.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.666+0000 7f64c2ffd700 1 -- 192.168.123.100:0/2005932734 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f64bc00f940 con 0x7f64cc103a00 2026-03-10T12:40:22.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.666+0000 7f64d3c8a700 1 -- 192.168.123.100:0/2005932734 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f64cc06a770 con 0x7f64cc103a00 2026-03-10T12:40:22.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.666+0000 7f64d3c8a700 1 -- 192.168.123.100:0/2005932734 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f64cc06ac30 con 0x7f64cc103a00 2026-03-10T12:40:22.669 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.666+0000 7f64d3c8a700 1 -- 192.168.123.100:0/2005932734 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f64cc066e40 con 0x7f64cc103a00 2026-03-10T12:40:22.673 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.673+0000 7f64c2ffd700 1 -- 192.168.123.100:0/2005932734 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f64bc022cc0 con 0x7f64cc103a00 2026-03-10T12:40:22.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.673+0000 7f64c2ffd700 1 --2- 192.168.123.100:0/2005932734 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f64b8077ac0 0x7f64b8079f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:22.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.673+0000 7f64d1a26700 1 --2- 192.168.123.100:0/2005932734 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f64b8077ac0 0x7f64b8079f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:22.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.673+0000 7f64c2ffd700 1 -- 192.168.123.100:0/2005932734 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f64bc09c060 con 0x7f64cc103a00 2026-03-10T12:40:22.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.674+0000 7f64c2ffd700 1 -- 192.168.123.100:0/2005932734 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f64bc09c4c0 con 0x7f64cc103a00 2026-03-10T12:40:22.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.674+0000 7f64d1a26700 1 --2- 
192.168.123.100:0/2005932734 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f64b8077ac0 0x7f64b8079f70 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f64c800a9b0 tx=0x7f64c8005c90 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:22.817 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.816+0000 7f64d3c8a700 1 -- 192.168.123.100:0/2005932734 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f64cc108350 con 0x7f64b8077ac0 2026-03-10T12:40:22.818 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.817+0000 7f64c2ffd700 1 -- 192.168.123.100:0/2005932734 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f64cc108350 con 0x7f64b8077ac0 2026-03-10T12:40:22.820 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.820+0000 7f64d3c8a700 1 -- 192.168.123.100:0/2005932734 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f64b8077ac0 msgr2=0x7f64b8079f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:22.820 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.820+0000 7f64d3c8a700 1 --2- 192.168.123.100:0/2005932734 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f64b8077ac0 0x7f64b8079f70 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f64c800a9b0 tx=0x7f64c8005c90 comp rx=0 tx=0).stop 2026-03-10T12:40:22.820 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.820+0000 7f64d3c8a700 1 -- 192.168.123.100:0/2005932734 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f64cc103a00 msgr2=0x7f64cc198530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:22.821 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.820+0000 7f64d3c8a700 1 --2- 192.168.123.100:0/2005932734 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f64cc103a00 0x7f64cc198530 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f64bc0049c0 tx=0x7f64bc004aa0 comp rx=0 tx=0).stop 2026-03-10T12:40:22.821 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.820+0000 7f64d3c8a700 1 -- 192.168.123.100:0/2005932734 shutdown_connections 2026-03-10T12:40:22.821 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.820+0000 7f64d3c8a700 1 --2- 192.168.123.100:0/2005932734 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f64b8077ac0 0x7f64b8079f70 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:22.821 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.820+0000 7f64d3c8a700 1 --2- 192.168.123.100:0/2005932734 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f64cc102760 0x7f64cc197ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:22.821 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.820+0000 7f64d3c8a700 1 --2- 192.168.123.100:0/2005932734 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f64cc103a00 0x7f64cc198530 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:22.821 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.820+0000 7f64d3c8a700 1 -- 192.168.123.100:0/2005932734 >> 192.168.123.100:0/2005932734 conn(0x7f64cc0fddb0 msgr2=0x7f64cc106c30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:22.821 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.820+0000 7f64d3c8a700 1 -- 192.168.123.100:0/2005932734 shutdown_connections 2026-03-10T12:40:22.821 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.820+0000 7f64d3c8a700 1 -- 
192.168.123.100:0/2005932734 wait complete. 2026-03-10T12:40:22.830 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:40:22.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.896+0000 7f2a7bfff700 1 -- 192.168.123.100:0/3418018333 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c1024d0 msgr2=0x7f2a7c1028e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:22.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.896+0000 7f2a7bfff700 1 --2- 192.168.123.100:0/3418018333 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c1024d0 0x7f2a7c1028e0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f2a70009b00 tx=0x7f2a70009e10 comp rx=0 tx=0).stop 2026-03-10T12:40:22.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.896+0000 7f2a7bfff700 1 -- 192.168.123.100:0/3418018333 shutdown_connections 2026-03-10T12:40:22.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.896+0000 7f2a7bfff700 1 --2- 192.168.123.100:0/3418018333 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a7c1036d0 0x7f2a7c103b20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:22.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.896+0000 7f2a7bfff700 1 --2- 192.168.123.100:0/3418018333 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c1024d0 0x7f2a7c1028e0 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:22.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.896+0000 7f2a7bfff700 1 -- 192.168.123.100:0/3418018333 >> 192.168.123.100:0/3418018333 conn(0x7f2a7c0fda80 msgr2=0x7f2a7c0ffeb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:22.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.896+0000 7f2a7bfff700 1 -- 192.168.123.100:0/3418018333 shutdown_connections 2026-03-10T12:40:22.897 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.897+0000 7f2a7bfff700 1 -- 192.168.123.100:0/3418018333 wait complete. 2026-03-10T12:40:22.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.897+0000 7f2a7bfff700 1 Processor -- start 2026-03-10T12:40:22.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.897+0000 7f2a7bfff700 1 -- start start 2026-03-10T12:40:22.898 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.897+0000 7f2a7bfff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a7c1024d0 0x7f2a7c197da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:22.898 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.897+0000 7f2a7bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c1036d0 0x7f2a7c1982e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:22.898 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.897+0000 7f2a7bfff700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a7c198900 con 0x7f2a7c1036d0 2026-03-10T12:40:22.898 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.898+0000 7f2a7bfff700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2a7c19d310 con 0x7f2a7c1024d0 2026-03-10T12:40:22.898 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.898+0000 7f2a7a7fc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c1036d0 0x7f2a7c1982e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:22.898 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.898+0000 7f2a7a7fc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c1036d0 0x7f2a7c1982e0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:40582/0 (socket says 192.168.123.100:40582) 2026-03-10T12:40:22.898 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.898+0000 7f2a7a7fc700 1 -- 192.168.123.100:0/946858920 learned_addr learned my addr 192.168.123.100:0/946858920 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:40:22.898 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.898+0000 7f2a7a7fc700 1 -- 192.168.123.100:0/946858920 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a7c1024d0 msgr2=0x7f2a7c197da0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T12:40:22.898 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.898+0000 7f2a7a7fc700 1 --2- 192.168.123.100:0/946858920 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a7c1024d0 0x7f2a7c197da0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:22.899 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.898+0000 7f2a7a7fc700 1 -- 192.168.123.100:0/946858920 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2a700097e0 con 0x7f2a7c1036d0 2026-03-10T12:40:22.899 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.898+0000 7f2a7a7fc700 1 --2- 192.168.123.100:0/946858920 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c1036d0 0x7f2a7c1982e0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f2a6c00eb10 tx=0x7f2a6c00eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:22.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.899+0000 7f2a8097a700 1 -- 192.168.123.100:0/946858920 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2a6c00cca0 con 0x7f2a7c1036d0 
2026-03-10T12:40:22.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.899+0000 7f2a8097a700 1 -- 192.168.123.100:0/946858920 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2a6c00ce00 con 0x7f2a7c1036d0 2026-03-10T12:40:22.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.899+0000 7f2a8097a700 1 -- 192.168.123.100:0/946858920 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2a6c0189c0 con 0x7f2a7c1036d0 2026-03-10T12:40:22.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.899+0000 7f2a7bfff700 1 -- 192.168.123.100:0/946858920 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2a7c19d510 con 0x7f2a7c1036d0 2026-03-10T12:40:22.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.899+0000 7f2a7bfff700 1 -- 192.168.123.100:0/946858920 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2a7c19da30 con 0x7f2a7c1036d0 2026-03-10T12:40:22.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.900+0000 7f2a8097a700 1 -- 192.168.123.100:0/946858920 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f2a6c018b20 con 0x7f2a7c1036d0 2026-03-10T12:40:22.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.900+0000 7f2a7bfff700 1 -- 192.168.123.100:0/946858920 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2a7c04ea50 con 0x7f2a7c1036d0 2026-03-10T12:40:22.903 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.903+0000 7f2a8097a700 1 --2- 192.168.123.100:0/946858920 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f2a64077b00 0x7f2a64079fb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:22.903 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.903+0000 7f2a8097a700 1 -- 192.168.123.100:0/946858920 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f2a6c014070 con 0x7f2a7c1036d0 2026-03-10T12:40:22.904 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.903+0000 7f2a7affd700 1 --2- 192.168.123.100:0/946858920 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f2a64077b00 0x7f2a64079fb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:22.904 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.904+0000 7f2a8097a700 1 -- 192.168.123.100:0/946858920 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2a6c063dc0 con 0x7f2a7c1036d0 2026-03-10T12:40:22.904 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:22.904+0000 7f2a7affd700 1 --2- 192.168.123.100:0/946858920 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f2a64077b00 0x7f2a64079fb0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f2a70009ad0 tx=0x7f2a70000bc0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:23.042 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.041+0000 7f2a7bfff700 1 -- 192.168.123.100:0/946858920 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f2a7c108020 con 0x7f2a64077b00 2026-03-10T12:40:23.043 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.042+0000 7f2a8097a700 1 -- 192.168.123.100:0/946858920 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f2a7c108020 
con 0x7f2a64077b00 2026-03-10T12:40:23.045 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.045+0000 7f2a7bfff700 1 -- 192.168.123.100:0/946858920 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f2a64077b00 msgr2=0x7f2a64079fb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:23.045 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.045+0000 7f2a7bfff700 1 --2- 192.168.123.100:0/946858920 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f2a64077b00 0x7f2a64079fb0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f2a70009ad0 tx=0x7f2a70000bc0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.045 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.045+0000 7f2a7bfff700 1 -- 192.168.123.100:0/946858920 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c1036d0 msgr2=0x7f2a7c1982e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:23.046 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.045+0000 7f2a7bfff700 1 --2- 192.168.123.100:0/946858920 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c1036d0 0x7f2a7c1982e0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f2a6c00eb10 tx=0x7f2a6c00eed0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.046 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.045+0000 7f2a7bfff700 1 -- 192.168.123.100:0/946858920 shutdown_connections 2026-03-10T12:40:23.046 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.045+0000 7f2a7bfff700 1 --2- 192.168.123.100:0/946858920 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f2a64077b00 0x7f2a64079fb0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.046 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.045+0000 7f2a7bfff700 1 --2- 192.168.123.100:0/946858920 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2a7c1024d0 0x7f2a7c197da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.046 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.045+0000 7f2a7bfff700 1 --2- 192.168.123.100:0/946858920 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2a7c1036d0 0x7f2a7c1982e0 secure :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f2a6c00eb10 tx=0x7f2a6c00eed0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.046 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.045+0000 7f2a7bfff700 1 -- 192.168.123.100:0/946858920 >> 192.168.123.100:0/946858920 conn(0x7f2a7c0fda80 msgr2=0x7f2a7c106900 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:23.046 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.045+0000 7f2a7bfff700 1 -- 192.168.123.100:0/946858920 shutdown_connections 2026-03-10T12:40:23.046 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.045+0000 7f2a7bfff700 1 -- 192.168.123.100:0/946858920 wait complete. 
2026-03-10T12:40:23.117 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.117+0000 7fd77955c700 1 -- 192.168.123.100:0/31172914 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd774103960 msgr2=0x7fd774103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:23.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.117+0000 7fd77955c700 1 --2- 192.168.123.100:0/31172914 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd774103960 0x7fd774103db0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fd764009b00 tx=0x7fd764009e10 comp rx=0 tx=0).stop 2026-03-10T12:40:23.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.117+0000 7fd77955c700 1 -- 192.168.123.100:0/31172914 shutdown_connections 2026-03-10T12:40:23.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.117+0000 7fd77955c700 1 --2- 192.168.123.100:0/31172914 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd774103960 0x7fd774103db0 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.117+0000 7fd77955c700 1 --2- 192.168.123.100:0/31172914 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd774102760 0x7fd774102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.117+0000 7fd77955c700 1 -- 192.168.123.100:0/31172914 >> 192.168.123.100:0/31172914 conn(0x7fd7740fdcf0 msgr2=0x7fd774100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:23.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.117+0000 7fd77955c700 1 -- 192.168.123.100:0/31172914 shutdown_connections 2026-03-10T12:40:23.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.117+0000 7fd77955c700 1 -- 192.168.123.100:0/31172914 wait complete. 
2026-03-10T12:40:23.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.118+0000 7fd77955c700 1 Processor -- start 2026-03-10T12:40:23.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.118+0000 7fd77955c700 1 -- start start 2026-03-10T12:40:23.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.118+0000 7fd77955c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd774102760 0x7fd774198050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:23.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.118+0000 7fd77955c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd774103960 0x7fd774198590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:23.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.118+0000 7fd77955c700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd774198bb0 con 0x7fd774102760 2026-03-10T12:40:23.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.118+0000 7fd77955c700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd774198cf0 con 0x7fd774103960 2026-03-10T12:40:23.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.118+0000 7fd772ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd774102760 0x7fd774198050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:23.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.118+0000 7fd7727fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd774103960 0x7fd774198590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T12:40:23.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.118+0000 7fd772ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd774102760 0x7fd774198050 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:40614/0 (socket says 192.168.123.100:40614) 2026-03-10T12:40:23.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.118+0000 7fd772ffd700 1 -- 192.168.123.100:0/4136655163 learned_addr learned my addr 192.168.123.100:0/4136655163 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:40:23.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.119+0000 7fd772ffd700 1 -- 192.168.123.100:0/4136655163 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd774103960 msgr2=0x7fd774198590 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:23.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.119+0000 7fd772ffd700 1 --2- 192.168.123.100:0/4136655163 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd774103960 0x7fd774198590 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.119+0000 7fd772ffd700 1 -- 192.168.123.100:0/4136655163 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd7640097e0 con 0x7fd774102760 2026-03-10T12:40:23.119 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.119+0000 7fd772ffd700 1 --2- 192.168.123.100:0/4136655163 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd774102760 0x7fd774198050 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fd75c00da40 tx=0x7fd75c00de00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:23.120 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.119+0000 7fd76bfff700 1 -- 192.168.123.100:0/4136655163 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd75c0041d0 con 0x7fd774102760 2026-03-10T12:40:23.120 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.119+0000 7fd76bfff700 1 -- 192.168.123.100:0/4136655163 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd75c009c70 con 0x7fd774102760 2026-03-10T12:40:23.120 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.120+0000 7fd77955c700 1 -- 192.168.123.100:0/4136655163 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd77419d7a0 con 0x7fd774102760 2026-03-10T12:40:23.120 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.120+0000 7fd76bfff700 1 -- 192.168.123.100:0/4136655163 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd75c003e80 con 0x7fd774102760 2026-03-10T12:40:23.121 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.121+0000 7fd77955c700 1 -- 192.168.123.100:0/4136655163 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd77419dcc0 con 0x7fd774102760 2026-03-10T12:40:23.122 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.121+0000 7fd76bfff700 1 -- 192.168.123.100:0/4136655163 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fd75c004330 con 0x7fd774102760 2026-03-10T12:40:23.122 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.121+0000 7fd77955c700 1 -- 192.168.123.100:0/4136655163 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd774066e40 con 0x7fd774102760 2026-03-10T12:40:23.122 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.122+0000 7fd76bfff700 1 --2- 
192.168.123.100:0/4136655163 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd760077780 0x7fd760079c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:23.122 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.122+0000 7fd7727fc700 1 --2- 192.168.123.100:0/4136655163 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd760077780 0x7fd760079c30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:23.123 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.122+0000 7fd76bfff700 1 -- 192.168.123.100:0/4136655163 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fd75c098f40 con 0x7fd774102760 2026-03-10T12:40:23.123 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.123+0000 7fd7727fc700 1 --2- 192.168.123.100:0/4136655163 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd760077780 0x7fd760079c30 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fd76400b5c0 tx=0x7fd764005fb0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:23.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.125+0000 7fd76bfff700 1 -- 192.168.123.100:0/4136655163 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd75c0615d0 con 0x7fd774102760 2026-03-10T12:40:23.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.248+0000 7fd77955c700 1 -- 192.168.123.100:0/4136655163 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fd77419dfa0 con 0x7fd760077780 2026-03-10T12:40:23.253 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.253+0000 7fd76bfff700 1 -- 192.168.123.100:0/4136655163 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fd77419dfa0 con 0x7fd760077780 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (6m) 38s ago 7m 25.5M - 0.25.0 c8568f914cd2 1897443e8fdf 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (7m) 38s ago 7m 8774k - 18.2.0 dc2bc1663786 d9c35bbdf4cd 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (6m) 52s ago 6m 11.2M - 18.2.0 dc2bc1663786 2a98961ae9ca 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (57s) 38s ago 7m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 88cd1c2e0041 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (55s) 52s ago 6m 7864k - 19.2.3-678-ge911bdeb 654f31e6858e 889e2b356266 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (6m) 38s ago 7m 90.9M - 9.4.7 954c08fa6188 16c12a6ce1fc 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (5m) 38s ago 5m 152M - 18.2.0 dc2bc1663786 13dfd2469732 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (5m) 38s ago 5m 18.0M - 18.2.0 dc2bc1663786 d65368ac6dfa 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (5m) 52s ago 5m 17.4M - 18.2.0 dc2bc1663786 1b9425223bd2 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (5m) 52s ago 5m 169M - 18.2.0 dc2bc1663786 
9c2b36fc1ac1 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:8443,9283,8765 running (115s) 38s ago 8m 615M - 19.2.3-678-ge911bdeb 654f31e6858e eaf11f86af53 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (96s) 52s ago 6m 488M - 19.2.3-678-ge911bdeb 654f31e6858e 22c93435649c 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (89s) 38s ago 8m 54.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e e8cc5980a849 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (74s) 52s ago 6m 50.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 032abad282fc 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (7m) 38s ago 7m 15.2M - 1.5.0 0da6a335fe13 7a5c14d6ba46 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (6m) 52s ago 6m 15.5M - 1.5.0 0da6a335fe13 2fac2415b763 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (41s) 38s ago 6m 29.4M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9b151d44f3cf 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (6m) 38s ago 6m 383M 4096M 18.2.0 dc2bc1663786 5bc971fe4d49 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (6m) 38s ago 6m 322M 4096M 18.2.0 dc2bc1663786 a5a89ccf847e 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (5m) 52s ago 5m 452M 4096M 18.2.0 dc2bc1663786 0c6249fe3951 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (5m) 52s ago 5m 402M 4096M 18.2.0 dc2bc1663786 5c66bfb63a83 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (5m) 52s ago 5m 370M 4096M 18.2.0 dc2bc1663786 277bdfe4ec55 2026-03-10T12:40:23.254 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 
vm00 *:9095 running (99s) 38s ago 7m 55.3M - 2.43.0 a07b618ecd1d c80074b6b052 2026-03-10T12:40:23.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.256+0000 7fd77955c700 1 -- 192.168.123.100:0/4136655163 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd760077780 msgr2=0x7fd760079c30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:23.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.256+0000 7fd77955c700 1 --2- 192.168.123.100:0/4136655163 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd760077780 0x7fd760079c30 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fd76400b5c0 tx=0x7fd764005fb0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.256+0000 7fd77955c700 1 -- 192.168.123.100:0/4136655163 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd774102760 msgr2=0x7fd774198050 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:23.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.256+0000 7fd77955c700 1 --2- 192.168.123.100:0/4136655163 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd774102760 0x7fd774198050 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fd75c00da40 tx=0x7fd75c00de00 comp rx=0 tx=0).stop 2026-03-10T12:40:23.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.256+0000 7fd77955c700 1 -- 192.168.123.100:0/4136655163 shutdown_connections 2026-03-10T12:40:23.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.256+0000 7fd77955c700 1 --2- 192.168.123.100:0/4136655163 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd760077780 0x7fd760079c30 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.256+0000 7fd77955c700 1 --2- 
192.168.123.100:0/4136655163 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd774102760 0x7fd774198050 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.256+0000 7fd77955c700 1 --2- 192.168.123.100:0/4136655163 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd774103960 0x7fd774198590 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.256+0000 7fd77955c700 1 -- 192.168.123.100:0/4136655163 >> 192.168.123.100:0/4136655163 conn(0x7fd7740fdcf0 msgr2=0x7fd774106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:23.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.256+0000 7fd77955c700 1 -- 192.168.123.100:0/4136655163 shutdown_connections 2026-03-10T12:40:23.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.256+0000 7fd77955c700 1 -- 192.168.123.100:0/4136655163 wait complete. 
2026-03-10T12:40:23.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.326+0000 7fc8173ad700 1 -- 192.168.123.100:0/727849382 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc808098810 msgr2=0x7fc808098c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:23.329 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.326+0000 7fc8173ad700 1 --2- 192.168.123.100:0/727849382 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc808098810 0x7fc808098c60 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7fc800009a60 tx=0x7fc800009d70 comp rx=0 tx=0).stop 2026-03-10T12:40:23.331 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.330+0000 7fc8173ad700 1 -- 192.168.123.100:0/727849382 shutdown_connections 2026-03-10T12:40:23.331 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.330+0000 7fc8173ad700 1 --2- 192.168.123.100:0/727849382 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc808098810 0x7fc808098c60 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.331 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.330+0000 7fc8173ad700 1 --2- 192.168.123.100:0/727849382 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc808097610 0x7fc808097a20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.331 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.330+0000 7fc8173ad700 1 -- 192.168.123.100:0/727849382 >> 192.168.123.100:0/727849382 conn(0x7fc808092ba0 msgr2=0x7fc808094ff0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:23.331 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.330+0000 7fc8173ad700 1 -- 192.168.123.100:0/727849382 shutdown_connections 2026-03-10T12:40:23.331 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.330+0000 7fc8173ad700 1 -- 192.168.123.100:0/727849382 wait 
complete. 2026-03-10T12:40:23.331 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.331+0000 7fc8173ad700 1 Processor -- start 2026-03-10T12:40:23.331 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.331+0000 7fc8173ad700 1 -- start start 2026-03-10T12:40:23.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.331+0000 7fc8173ad700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc808097610 0x7fc80812ced0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:23.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.331+0000 7fc8173ad700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc808098810 0x7fc80812d410 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:23.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.331+0000 7fc8173ad700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc80812da30 con 0x7fc808098810 2026-03-10T12:40:23.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.331+0000 7fc8173ad700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc80812db70 con 0x7fc808097610 2026-03-10T12:40:23.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.331+0000 7fc815baa700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc808098810 0x7fc80812d410 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:23.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.331+0000 7fc815baa700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc808098810 0x7fc80812d410 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I 
am v2:192.168.123.100:40626/0 (socket says 192.168.123.100:40626) 2026-03-10T12:40:23.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.331+0000 7fc815baa700 1 -- 192.168.123.100:0/562498735 learned_addr learned my addr 192.168.123.100:0/562498735 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:40:23.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.332+0000 7fc8163ab700 1 --2- 192.168.123.100:0/562498735 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc808097610 0x7fc80812ced0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:23.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.332+0000 7fc815baa700 1 -- 192.168.123.100:0/562498735 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc808097610 msgr2=0x7fc80812ced0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:23.333 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.332+0000 7fc815baa700 1 --2- 192.168.123.100:0/562498735 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc808097610 0x7fc80812ced0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.333 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.332+0000 7fc815baa700 1 -- 192.168.123.100:0/562498735 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc800009710 con 0x7fc808098810 2026-03-10T12:40:23.333 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.332+0000 7fc815baa700 1 --2- 192.168.123.100:0/562498735 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc808098810 0x7fc80812d410 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fc800009fd0 tx=0x7fc800005330 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:40:23.333 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.332+0000 7fc8163ab700 1 --2- 192.168.123.100:0/562498735 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc808097610 0x7fc80812ced0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T12:40:23.333 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.333+0000 7fc8077fe700 1 -- 192.168.123.100:0/562498735 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc80001d070 con 0x7fc808098810 2026-03-10T12:40:23.333 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.333+0000 7fc8077fe700 1 -- 192.168.123.100:0/562498735 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc80000faf0 con 0x7fc808098810 2026-03-10T12:40:23.333 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.333+0000 7fc8173ad700 1 -- 192.168.123.100:0/562498735 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc8081325c0 con 0x7fc808098810 2026-03-10T12:40:23.334 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.333+0000 7fc8173ad700 1 -- 192.168.123.100:0/562498735 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc808132a50 con 0x7fc808098810 2026-03-10T12:40:23.334 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.334+0000 7fc8077fe700 1 -- 192.168.123.100:0/562498735 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc800021b70 con 0x7fc808098810 2026-03-10T12:40:23.338 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.337+0000 7fc8057fa700 1 -- 192.168.123.100:0/562498735 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc8080a02f0 con 0x7fc808098810 
2026-03-10T12:40:23.341 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.339+0000 7fc8077fe700 1 -- 192.168.123.100:0/562498735 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fc80000f610 con 0x7fc808098810 2026-03-10T12:40:23.341 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.339+0000 7fc8077fe700 1 --2- 192.168.123.100:0/562498735 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc7fc077ac0 0x7fc7fc079f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:23.341 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.339+0000 7fc8077fe700 1 -- 192.168.123.100:0/562498735 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fc80009b780 con 0x7fc808098810 2026-03-10T12:40:23.344 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.343+0000 7fc8163ab700 1 --2- 192.168.123.100:0/562498735 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc7fc077ac0 0x7fc7fc079f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:23.344 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.344+0000 7fc8163ab700 1 --2- 192.168.123.100:0/562498735 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc7fc077ac0 0x7fc7fc079f70 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fc80c005950 tx=0x7fc80c0058e0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:23.344 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.344+0000 7fc8077fe700 1 -- 192.168.123.100:0/562498735 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc800063e10 con 
0x7fc808098810 2026-03-10T12:40:23.522 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.521+0000 7fc8057fa700 1 -- 192.168.123.100:0/562498735 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fc808003aa0 con 0x7fc808098810 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.522+0000 7fc8077fe700 1 -- 192.168.123.100:0/562498735 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fc800063560 con 0x7fc808098810 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 5, 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T12:40:23.523 
INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 9, 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:40:23.523 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:40:23.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.525+0000 7fc8057fa700 1 -- 192.168.123.100:0/562498735 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc7fc077ac0 msgr2=0x7fc7fc079f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:23.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.525+0000 7fc8057fa700 1 --2- 192.168.123.100:0/562498735 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc7fc077ac0 0x7fc7fc079f70 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fc80c005950 tx=0x7fc80c0058e0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.526+0000 7fc8057fa700 1 -- 192.168.123.100:0/562498735 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc808098810 msgr2=0x7fc80812d410 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:23.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.526+0000 7fc8057fa700 1 --2- 192.168.123.100:0/562498735 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc808098810 0x7fc80812d410 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fc800009fd0 tx=0x7fc800005330 comp rx=0 tx=0).stop 2026-03-10T12:40:23.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.526+0000 7fc8057fa700 1 -- 
192.168.123.100:0/562498735 shutdown_connections 2026-03-10T12:40:23.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.526+0000 7fc8057fa700 1 --2- 192.168.123.100:0/562498735 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc7fc077ac0 0x7fc7fc079f70 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.526+0000 7fc8057fa700 1 --2- 192.168.123.100:0/562498735 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc808097610 0x7fc80812ced0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.526+0000 7fc8057fa700 1 --2- 192.168.123.100:0/562498735 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc808098810 0x7fc80812d410 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.526+0000 7fc8057fa700 1 -- 192.168.123.100:0/562498735 >> 192.168.123.100:0/562498735 conn(0x7fc808092ba0 msgr2=0x7fc80809ba40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:23.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.526+0000 7fc8057fa700 1 -- 192.168.123.100:0/562498735 shutdown_connections 2026-03-10T12:40:23.527 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.526+0000 7fc8057fa700 1 -- 192.168.123.100:0/562498735 wait complete. 
2026-03-10T12:40:23.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.597+0000 7fc788900700 1 -- 192.168.123.100:0/1639857028 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc780100560 msgr2=0x7fc780100970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:23.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.597+0000 7fc788900700 1 --2- 192.168.123.100:0/1639857028 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc780100560 0x7fc780100970 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fc778009b00 tx=0x7fc778009e10 comp rx=0 tx=0).stop 2026-03-10T12:40:23.601 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.601+0000 7fc788900700 1 -- 192.168.123.100:0/1639857028 shutdown_connections 2026-03-10T12:40:23.601 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.601+0000 7fc788900700 1 --2- 192.168.123.100:0/1639857028 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc780101760 0x7fc780101bb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.601 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.601+0000 7fc788900700 1 --2- 192.168.123.100:0/1639857028 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc780100560 0x7fc780100970 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.601 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.601+0000 7fc788900700 1 -- 192.168.123.100:0/1639857028 >> 192.168.123.100:0/1639857028 conn(0x7fc7800fbb10 msgr2=0x7fc7800fdf40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:23.602 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.602+0000 7fc788900700 1 -- 192.168.123.100:0/1639857028 shutdown_connections 2026-03-10T12:40:23.602 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.602+0000 7fc788900700 1 -- 192.168.123.100:0/1639857028 
wait complete. 2026-03-10T12:40:23.603 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.602+0000 7fc788900700 1 Processor -- start 2026-03-10T12:40:23.603 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.603+0000 7fc788900700 1 -- start start 2026-03-10T12:40:23.603 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.603+0000 7fc788900700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc780100560 0x7fc780193ca0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:23.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.603+0000 7fc788900700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc780101760 0x7fc7801941e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:23.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.603+0000 7fc78669c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc780100560 0x7fc780193ca0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:23.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.603+0000 7fc78669c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc780100560 0x7fc780193ca0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:47490/0 (socket says 192.168.123.100:47490) 2026-03-10T12:40:23.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.603+0000 7fc78669c700 1 -- 192.168.123.100:0/777146008 learned_addr learned my addr 192.168.123.100:0/777146008 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:40:23.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.604+0000 7fc785e9b700 1 --2- 192.168.123.100:0/777146008 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc780101760 0x7fc7801941e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:23.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.604+0000 7fc788900700 1 -- 192.168.123.100:0/777146008 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc780194800 con 0x7fc780101760 2026-03-10T12:40:23.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.604+0000 7fc788900700 1 -- 192.168.123.100:0/777146008 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc780194940 con 0x7fc780100560 2026-03-10T12:40:23.605 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.604+0000 7fc78669c700 1 -- 192.168.123.100:0/777146008 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc780101760 msgr2=0x7fc7801941e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:23.605 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.604+0000 7fc78669c700 1 --2- 192.168.123.100:0/777146008 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc780101760 0x7fc7801941e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:23.605 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.604+0000 7fc78669c700 1 -- 192.168.123.100:0/777146008 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc7780097e0 con 0x7fc780100560 2026-03-10T12:40:23.605 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.605+0000 7fc785e9b700 1 --2- 192.168.123.100:0/777146008 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc780101760 0x7fc7801941e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T12:40:23.605 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.605+0000 7fc78669c700 1 --2- 192.168.123.100:0/777146008 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc780100560 0x7fc780193ca0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fc778009fd0 tx=0x7fc778004a00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:40:23.605 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.605+0000 7fc7737fe700 1 -- 192.168.123.100:0/777146008 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc77801d070 con 0x7fc780100560
2026-03-10T12:40:23.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.605+0000 7fc7737fe700 1 -- 192.168.123.100:0/777146008 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc77800bc50 con 0x7fc780100560
2026-03-10T12:40:23.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.605+0000 7fc788900700 1 -- 192.168.123.100:0/777146008 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc780199390 con 0x7fc780100560
2026-03-10T12:40:23.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.605+0000 7fc788900700 1 -- 192.168.123.100:0/777146008 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc780199880 con 0x7fc780100560
2026-03-10T12:40:23.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.606+0000 7fc7737fe700 1 -- 192.168.123.100:0/777146008 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc778022620 con 0x7fc780100560
2026-03-10T12:40:23.607 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.606+0000 7fc788900700 1 -- 192.168.123.100:0/777146008 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc780105840 con 0x7fc780100560
2026-03-10T12:40:23.608 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.607+0000 7fc7737fe700 1 -- 192.168.123.100:0/777146008 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fc77800f600 con 0x7fc780100560
2026-03-10T12:40:23.608 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.607+0000 7fc7737fe700 1 --2- 192.168.123.100:0/777146008 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc76c0779f0 0x7fc76c079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:40:23.608 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.607+0000 7fc7737fe700 1 -- 192.168.123.100:0/777146008 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fc77809c090 con 0x7fc780100560
2026-03-10T12:40:23.608 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.607+0000 7fc785e9b700 1 --2- 192.168.123.100:0/777146008 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc76c0779f0 0x7fc76c079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:40:23.608 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.608+0000 7fc785e9b700 1 --2- 192.168.123.100:0/777146008 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc76c0779f0 0x7fc76c079ea0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fc774009fd0 tx=0x7fc774009380 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:40:23.610 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.610+0000 7fc7737fe700 1 -- 192.168.123.100:0/777146008 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc7780646a0 con 0x7fc780100560
2026-03-10T12:40:23.751 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.751+0000 7fc788900700 1 -- 192.168.123.100:0/777146008 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fc780066e40 con 0x7fc780100560
2026-03-10T12:40:23.752 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.751+0000 7fc7737fe700 1 -- 192.168.123.100:0/777146008 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1961 (secure 0 0 0) 0x7fc7780277c0 con 0x7fc780100560
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:e13
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:btime 1970-01-01T00:00:00:000000+0000
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1)
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:epoch 13
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:37:36.646083+0000
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:root 0
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:max_xattr_size 65536
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {}
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T12:40:23.753 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307}
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:failed
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:damaged
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:stopped
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3]
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:balancer
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:qdb_cluster leader: 0 members:
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons:
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T12:40:23.754 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T12:40:23.755 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.755+0000 7fc788900700 1 -- 192.168.123.100:0/777146008 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc76c0779f0 msgr2=0x7fc76c079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:40:23.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.755+0000 7fc788900700 1 --2- 192.168.123.100:0/777146008 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc76c0779f0 0x7fc76c079ea0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fc774009fd0 tx=0x7fc774009380 comp rx=0 tx=0).stop
2026-03-10T12:40:23.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.755+0000 7fc788900700 1 -- 192.168.123.100:0/777146008 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc780100560 msgr2=0x7fc780193ca0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:40:23.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.755+0000 7fc788900700 1 --2- 192.168.123.100:0/777146008 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc780100560 0x7fc780193ca0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fc778009fd0 tx=0x7fc778004a00 comp rx=0 tx=0).stop
2026-03-10T12:40:23.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.756+0000 7fc788900700 1 -- 192.168.123.100:0/777146008 shutdown_connections
2026-03-10T12:40:23.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.756+0000 7fc788900700 1 --2- 192.168.123.100:0/777146008 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc76c0779f0 0x7fc76c079ea0 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:40:23.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.756+0000 7fc788900700 1 --2- 192.168.123.100:0/777146008 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc780100560 0x7fc780193ca0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:40:23.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.756+0000 7fc788900700 1 --2- 192.168.123.100:0/777146008 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc780101760 0x7fc7801941e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:40:23.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.756+0000 7fc788900700 1 -- 192.168.123.100:0/777146008 >> 192.168.123.100:0/777146008 conn(0x7fc7800fbb10 msgr2=0x7fc780102980 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:40:23.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.756+0000 7fc788900700 1 -- 192.168.123.100:0/777146008 shutdown_connections
2026-03-10T12:40:23.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.756+0000 7fc788900700 1 -- 192.168.123.100:0/777146008 wait complete.
2026-03-10T12:40:23.757 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 13
2026-03-10T12:40:23.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.825+0000 7f8754cbb700 1 -- 192.168.123.100:0/4194305272 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f87500ff770 msgr2=0x7f87500ffbe0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:40:23.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.825+0000 7f8754cbb700 1 --2- 192.168.123.100:0/4194305272 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f87500ff770 0x7f87500ffbe0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f8740009b00 tx=0x7f8740009e10 comp rx=0 tx=0).stop
2026-03-10T12:40:23.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.825+0000 7f8754cbb700 1 -- 192.168.123.100:0/4194305272 shutdown_connections
2026-03-10T12:40:23.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.825+0000 7f8754cbb700 1 --2- 192.168.123.100:0/4194305272 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f87500ff770 0x7f87500ffbe0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:40:23.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.825+0000 7f8754cbb700 1 --2- 192.168.123.100:0/4194305272 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f87500fee20 0x7f87500ff230 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:40:23.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.825+0000 7f8754cbb700 1 -- 192.168.123.100:0/4194305272 >> 192.168.123.100:0/4194305272 conn(0x7f87500fa9b0 msgr2=0x7f87500fce00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:40:23.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.826+0000 7f8754cbb700 1 -- 192.168.123.100:0/4194305272 shutdown_connections
2026-03-10T12:40:23.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.826+0000 7f8754cbb700 1 -- 192.168.123.100:0/4194305272 wait complete.
2026-03-10T12:40:23.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.826+0000 7f8754cbb700 1 Processor -- start
2026-03-10T12:40:23.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.826+0000 7f8754cbb700 1 -- start start
2026-03-10T12:40:23.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.827+0000 7f8754cbb700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f87500fee20 0x7f8750198040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:40:23.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.827+0000 7f8754cbb700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f87500ff770 0x7f8750198580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:40:23.828 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.827+0000 7f8754cbb700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8750198ba0 con 0x7f87500fee20
2026-03-10T12:40:23.828 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.827+0000 7f8754cbb700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8750198ce0 con 0x7f87500ff770
2026-03-10T12:40:23.828 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.827+0000 7f874e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f87500fee20 0x7f8750198040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:40:23.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.827+0000 7f874e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f87500fee20 0x7f8750198040 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:40660/0 (socket says 192.168.123.100:40660)
2026-03-10T12:40:23.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.827+0000 7f874e59c700 1 -- 192.168.123.100:0/1364142685 learned_addr learned my addr 192.168.123.100:0/1364142685 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:40:23.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.827+0000 7f874dd9b700 1 --2- 192.168.123.100:0/1364142685 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f87500ff770 0x7f8750198580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:40:23.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.827+0000 7f874e59c700 1 -- 192.168.123.100:0/1364142685 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f87500ff770 msgr2=0x7f8750198580 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:40:23.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.827+0000 7f874e59c700 1 --2- 192.168.123.100:0/1364142685 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f87500ff770 0x7f8750198580 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:40:23.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.827+0000 7f874e59c700 1 -- 192.168.123.100:0/1364142685 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f87400097e0 con 0x7f87500fee20
2026-03-10T12:40:23.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.827+0000 7f874e59c700 1 --2- 192.168.123.100:0/1364142685 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f87500fee20 0x7f8750198040 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f874400b700 tx=0x7f874400ba10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:40:23.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.828+0000 7f873f7fe700 1 -- 192.168.123.100:0/1364142685 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f87440107c0 con 0x7f87500fee20
2026-03-10T12:40:23.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.828+0000 7f8754cbb700 1 -- 192.168.123.100:0/1364142685 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8750101090 con 0x7f87500fee20
2026-03-10T12:40:23.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.828+0000 7f8754cbb700 1 -- 192.168.123.100:0/1364142685 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f87501015e0 con 0x7f87500fee20
2026-03-10T12:40:23.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.828+0000 7f873f7fe700 1 -- 192.168.123.100:0/1364142685 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8744010e00 con 0x7f87500fee20
2026-03-10T12:40:23.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.828+0000 7f873f7fe700 1 -- 192.168.123.100:0/1364142685 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f874400f360 con 0x7f87500fee20
2026-03-10T12:40:23.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.829+0000 7f873f7fe700 1 -- 192.168.123.100:0/1364142685 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f874400f4c0 con 0x7f87500fee20
2026-03-10T12:40:23.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.829+0000 7f873f7fe700 1 --2- 192.168.123.100:0/1364142685 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8738088b20 0x7f873808afd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:40:23.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.829+0000 7f874dd9b700 1 --2- 192.168.123.100:0/1364142685 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8738088b20 0x7f873808afd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:40:23.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.829+0000 7f873f7fe700 1 -- 192.168.123.100:0/1364142685 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f874409a400 con 0x7f87500fee20
2026-03-10T12:40:23.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.831+0000 7f874dd9b700 1 --2- 192.168.123.100:0/1364142685 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8738088b20 0x7f873808afd0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f874000b5c0 tx=0x7f8740005fb0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:40:23.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.831+0000 7f8754cbb700 1 -- 192.168.123.100:0/1364142685 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8730005320 con 0x7f87500fee20
2026-03-10T12:40:23.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.834+0000 7f873f7fe700 1 -- 192.168.123.100:0/1364142685 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8744062a90 con 0x7f87500fee20
2026-03-10T12:40:23.963 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.962+0000 7f8754cbb700 1 -- 192.168.123.100:0/1364142685 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8730000bf0 con 0x7f8738088b20
2026-03-10T12:40:23.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.968+0000 7f873f7fe700 1 -- 192.168.123.100:0/1364142685 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f8730000bf0 con 0x7f8738088b20
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stdout:{
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true,
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stdout: "crash",
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stdout: "mgr",
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stdout: "mon"
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stdout: ],
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "7/23 daemons upgraded",
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Currently upgrading osd daemons",
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stdout:}
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.972+0000 7f8754cbb700 1 -- 192.168.123.100:0/1364142685 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8738088b20 msgr2=0x7f873808afd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.972+0000 7f8754cbb700 1 --2- 192.168.123.100:0/1364142685 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8738088b20 0x7f873808afd0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f874000b5c0 tx=0x7f8740005fb0 comp rx=0 tx=0).stop
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.972+0000 7f8754cbb700 1 -- 192.168.123.100:0/1364142685 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f87500fee20 msgr2=0x7f8750198040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.972+0000 7f8754cbb700 1 --2- 192.168.123.100:0/1364142685 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f87500fee20 0x7f8750198040 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f874400b700 tx=0x7f874400ba10 comp rx=0 tx=0).stop
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.972+0000 7f8754cbb700 1 -- 192.168.123.100:0/1364142685 shutdown_connections
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.972+0000 7f8754cbb700 1 --2- 192.168.123.100:0/1364142685 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8738088b20 0x7f873808afd0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.972+0000 7f8754cbb700 1 --2- 192.168.123.100:0/1364142685 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f87500fee20 0x7f8750198040 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.972+0000 7f8754cbb700 1 --2- 192.168.123.100:0/1364142685 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f87500ff770 0x7f8750198580 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.972+0000 7f8754cbb700 1 -- 192.168.123.100:0/1364142685 >> 192.168.123.100:0/1364142685 conn(0x7f87500fa9b0 msgr2=0x7f8750107450 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:40:23.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.972+0000 7f8754cbb700 1 -- 192.168.123.100:0/1364142685 shutdown_connections
2026-03-10T12:40:23.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:23.972+0000 7f8754cbb700 1 -- 192.168.123.100:0/1364142685 wait complete.
2026-03-10T12:40:24.235 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.235+0000 7fab16ecb700 1 -- 192.168.123.100:0/4259282376 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab10102740 msgr2=0x7fab10102b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:40:24.235 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.235+0000 7fab16ecb700 1 --2- 192.168.123.100:0/4259282376 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab10102740 0x7fab10102b50 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fab00009b00 tx=0x7fab00009e10 comp rx=0 tx=0).stop
2026-03-10T12:40:24.235 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.235+0000 7fab16ecb700 1 -- 192.168.123.100:0/4259282376 shutdown_connections
2026-03-10T12:40:24.235 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.235+0000 7fab16ecb700 1 --2- 192.168.123.100:0/4259282376 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab10103940 0x7fab10103d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:40:24.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.235+0000 7fab16ecb700 1 --2- 192.168.123.100:0/4259282376 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab10102740 0x7fab10102b50 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:40:24.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.235+0000 7fab16ecb700 1 -- 192.168.123.100:0/4259282376 >> 192.168.123.100:0/4259282376 conn(0x7fab100fdcf0 msgr2=0x7fab10100120 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:40:24.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.235+0000 7fab16ecb700 1 -- 192.168.123.100:0/4259282376 shutdown_connections
2026-03-10T12:40:24.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.235+0000 7fab16ecb700 1 -- 192.168.123.100:0/4259282376 wait complete.
2026-03-10T12:40:24.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.236+0000 7fab16ecb700 1 Processor -- start
2026-03-10T12:40:24.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.236+0000 7fab16ecb700 1 -- start start
2026-03-10T12:40:24.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.236+0000 7fab16ecb700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab10102740 0x7fab10198000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:40:24.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.237+0000 7fab14c67700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab10102740 0x7fab10198000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:40:24.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.237+0000 7fab14c67700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab10102740 0x7fab10198000 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:40672/0 (socket says 192.168.123.100:40672)
2026-03-10T12:40:24.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.237+0000 7fab16ecb700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab10103940 0x7fab10198540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:40:24.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.237+0000 7fab16ecb700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fab10198ad0 con 0x7fab10102740
2026-03-10T12:40:24.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.237+0000 7fab16ecb700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fab10198c10 con 0x7fab10103940
2026-03-10T12:40:24.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.237+0000 7fab14c67700 1 -- 192.168.123.100:0/1778871632 learned_addr learned my addr 192.168.123.100:0/1778871632 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:40:24.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.237+0000 7fab0ffff700 1 --2- 192.168.123.100:0/1778871632 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab10103940 0x7fab10198540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:40:24.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.237+0000 7fab14c67700 1 -- 192.168.123.100:0/1778871632 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab10103940 msgr2=0x7fab10198540 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:40:24.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.237+0000 7fab14c67700 1 --2- 192.168.123.100:0/1778871632 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab10103940 0x7fab10198540 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:40:24.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.237+0000 7fab14c67700 1 -- 192.168.123.100:0/1778871632 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fab000097e0 con 0x7fab10102740
2026-03-10T12:40:24.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.237+0000 7fab14c67700 1 --2- 192.168.123.100:0/1778871632 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab10102740 0x7fab10198000 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fab00004c80 tx=0x7fab0000b940 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:40:24.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.238+0000 7fab0dffb700 1 -- 192.168.123.100:0/1778871632 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fab0001d070 con 0x7fab10102740
2026-03-10T12:40:24.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.238+0000 7fab16ecb700 1 -- 192.168.123.100:0/1778871632 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fab1019d670 con 0x7fab10102740
2026-03-10T12:40:24.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.238+0000 7fab16ecb700 1 -- 192.168.123.100:0/1778871632 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fab1019db30 con 0x7fab10102740
2026-03-10T12:40:24.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.240+0000 7fab0dffb700 1 -- 192.168.123.100:0/1778871632 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fab0000bd40 con 0x7fab10102740
2026-03-10T12:40:24.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.240+0000 7fab0dffb700 1 -- 192.168.123.100:0/1778871632 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fab00021960 con 0x7fab10102740
2026-03-10T12:40:24.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.240+0000 7fab0dffb700 1 -- 192.168.123.100:0/1778871632 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 36) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fab00021b80 con 0x7fab10102740
2026-03-10T12:40:24.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.240+0000 7fab0dffb700 1 --2- 192.168.123.100:0/1778871632 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faaf8077b90 0x7faaf807a040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:40:24.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.241+0000 7fab0dffb700 1 -- 192.168.123.100:0/1778871632 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fab0009c330 con 0x7fab10102740
2026-03-10T12:40:24.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.241+0000 7fab16ecb700 1 -- 192.168.123.100:0/1778871632 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fab10066e40 con 0x7fab10102740
2026-03-10T12:40:24.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.242+0000 7fab0ffff700 1 --2- 192.168.123.100:0/1778871632 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faaf8077b90 0x7faaf807a040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:40:24.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.243+0000 7fab0ffff700 1 --2- 192.168.123.100:0/1778871632 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faaf8077b90 0x7faaf807a040 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fab0400ba60 tx=0x7fab04005d50 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:40:24.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.245+0000 7fab0dffb700 1 -- 192.168.123.100:0/1778871632 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fab00064a70 con 0x7fab10102740
2026-03-10T12:40:24.413 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.413+0000 7fab16ecb700 1 -- 192.168.123.100:0/1778871632 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fab1019de10 con 
0x7fab10102740 2026-03-10T12:40:24.414 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.414+0000 7fab0dffb700 1 -- 192.168.123.100:0/1778871632 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+733 (secure 0 0 0) 0x7fab00026070 con 0x7fab10102740 2026-03-10T12:40:24.414 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_WARN Degraded data redundancy: 645/630 objects degraded (102.381%), 9 pgs degraded 2026-03-10T12:40:24.414 INFO:teuthology.orchestra.run.vm00.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 645/630 objects degraded (102.381%), 9 pgs degraded 2026-03-10T12:40:24.414 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.6 is active+recovery_wait+degraded, acting [0,1,4] 2026-03-10T12:40:24.414 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.b is active+recovery_wait+degraded, acting [1,0,4] 2026-03-10T12:40:24.414 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.c is active+recovery_wait+degraded, acting [5,0,3] 2026-03-10T12:40:24.414 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.10 is active+recovery_wait+degraded, acting [5,0,1] 2026-03-10T12:40:24.414 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.11 is active+recovery_wait+degraded, acting [3,4,0] 2026-03-10T12:40:24.414 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.12 is active+recovery_wait+degraded, acting [0,3,1] 2026-03-10T12:40:24.414 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.15 is active+recovery_wait+degraded, acting [3,0,4] 2026-03-10T12:40:24.414 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.17 is active+recovery_wait+degraded, acting [0,5,2] 2026-03-10T12:40:24.414 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.1f is active+recovery_wait+degraded, acting [0,3,2] 2026-03-10T12:40:24.417 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.416+0000 7fab16ecb700 1 -- 192.168.123.100:0/1778871632 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faaf8077b90 
msgr2=0x7faaf807a040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:24.417 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.416+0000 7fab16ecb700 1 --2- 192.168.123.100:0/1778871632 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faaf8077b90 0x7faaf807a040 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fab0400ba60 tx=0x7fab04005d50 comp rx=0 tx=0).stop 2026-03-10T12:40:24.417 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.416+0000 7fab16ecb700 1 -- 192.168.123.100:0/1778871632 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab10102740 msgr2=0x7fab10198000 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:24.417 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.416+0000 7fab16ecb700 1 --2- 192.168.123.100:0/1778871632 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab10102740 0x7fab10198000 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fab00004c80 tx=0x7fab0000b940 comp rx=0 tx=0).stop 2026-03-10T12:40:24.417 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.417+0000 7fab16ecb700 1 -- 192.168.123.100:0/1778871632 shutdown_connections 2026-03-10T12:40:24.417 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.417+0000 7fab16ecb700 1 --2- 192.168.123.100:0/1778871632 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faaf8077b90 0x7faaf807a040 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:24.417 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.417+0000 7fab16ecb700 1 --2- 192.168.123.100:0/1778871632 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fab10102740 0x7fab10198000 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:24.417 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.417+0000 7fab16ecb700 1 
--2- 192.168.123.100:0/1778871632 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fab10103940 0x7fab10198540 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:24.417 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.417+0000 7fab16ecb700 1 -- 192.168.123.100:0/1778871632 >> 192.168.123.100:0/1778871632 conn(0x7fab100fdcf0 msgr2=0x7fab10106b70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:24.417 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.417+0000 7fab16ecb700 1 -- 192.168.123.100:0/1778871632 shutdown_connections 2026-03-10T12:40:24.417 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:24.417+0000 7fab16ecb700 1 -- 192.168.123.100:0/1778871632 wait complete. 2026-03-10T12:40:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:24 vm07.local ceph-mon[93622]: pgmap v53: 65 pgs: 9 active+recovery_wait+degraded, 2 active+recovering, 54 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s rd, 1.2 MiB/s wr, 372 op/s; 645/630 objects degraded (102.381%); 0 B/s, 15 objects/s recovering 2026-03-10T12:40:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:24 vm07.local ceph-mon[93622]: from='client.34186 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:40:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:24 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/562498735' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:40:24.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:24 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/777146008' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:40:24.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:24 vm00.local ceph-mon[103263]: pgmap v53: 65 pgs: 9 active+recovery_wait+degraded, 2 active+recovering, 54 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s rd, 1.2 MiB/s wr, 372 op/s; 645/630 objects degraded (102.381%); 0 B/s, 15 objects/s recovering 2026-03-10T12:40:24.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:24 vm00.local ceph-mon[103263]: from='client.34186 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:40:24.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:24 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/562498735' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:40:24.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:24 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/777146008' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:40:25.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:25 vm07.local ceph-mon[93622]: from='client.34190 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:40:25.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:25 vm07.local ceph-mon[93622]: from='client.34194 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:40:25.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:25 vm07.local ceph-mon[93622]: from='client.34206 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:40:25.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:25 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 645/630 objects degraded (102.381%), 9 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:25.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:25 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/1778871632' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:40:25.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:25 vm07.local ceph-mon[93622]: pgmap v54: 65 pgs: 9 active+recovery_wait+degraded, 2 active+recovering, 54 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 22 KiB/s rd, 741 KiB/s wr, 261 op/s; 645/630 objects degraded (102.381%); 0 B/s, 11 objects/s recovering 2026-03-10T12:40:25.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:25 vm00.local ceph-mon[103263]: from='client.34190 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:40:25.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:25 vm00.local ceph-mon[103263]: from='client.34194 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:40:25.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:25 vm00.local ceph-mon[103263]: from='client.34206 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:40:25.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:25 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 645/630 objects degraded (102.381%), 9 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:25.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:25 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/1778871632' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:40:25.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:25 vm00.local ceph-mon[103263]: pgmap v54: 65 pgs: 9 active+recovery_wait+degraded, 2 active+recovering, 54 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 22 KiB/s rd, 741 KiB/s wr, 261 op/s; 645/630 objects degraded (102.381%); 0 B/s, 11 objects/s recovering 2026-03-10T12:40:28.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:27 vm00.local ceph-mon[103263]: pgmap v55: 65 pgs: 9 active+recovery_wait+degraded, 1 active+recovering, 55 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 746 KiB/s wr, 270 op/s; 645/333 objects degraded (193.694%); 0 B/s, 14 objects/s recovering 2026-03-10T12:40:28.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:27 vm07.local ceph-mon[93622]: pgmap v55: 65 pgs: 9 active+recovery_wait+degraded, 1 active+recovering, 55 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 746 KiB/s wr, 270 op/s; 645/333 objects degraded (193.694%); 0 B/s, 14 objects/s recovering 2026-03-10T12:40:29.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:29 vm00.local ceph-mon[103263]: pgmap v56: 65 pgs: 8 active+recovery_wait+degraded, 1 active+recovering, 56 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 21 KiB/s rd, 757 KiB/s wr, 253 op/s; 577/333 objects degraded (173.273%); 0 B/s, 17 objects/s recovering 2026-03-10T12:40:29.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:29 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 577/333 objects degraded (173.273%), 8 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:30.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:29 vm07.local ceph-mon[93622]: pgmap v56: 65 pgs: 8 active+recovery_wait+degraded, 1 active+recovering, 56 active+clean; 269 MiB data, 3.1 GiB used, 117 
GiB / 120 GiB avail; 21 KiB/s rd, 757 KiB/s wr, 253 op/s; 577/333 objects degraded (173.273%); 0 B/s, 17 objects/s recovering 2026-03-10T12:40:30.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:29 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 577/333 objects degraded (173.273%), 8 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:31.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:40:31.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:40:32.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:31 vm00.local ceph-mon[103263]: pgmap v57: 65 pgs: 8 active+recovery_wait+degraded, 1 active+recovering, 56 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 13 KiB/s rd, 290 KiB/s wr, 119 op/s; 577/333 objects degraded (173.273%); 0 B/s, 13 objects/s recovering 2026-03-10T12:40:32.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:31 vm07.local ceph-mon[93622]: pgmap v57: 65 pgs: 8 active+recovery_wait+degraded, 1 active+recovering, 56 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 13 KiB/s rd, 290 KiB/s wr, 119 op/s; 577/333 objects degraded (173.273%); 0 B/s, 13 objects/s recovering 2026-03-10T12:40:33.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:32 vm00.local ceph-mon[103263]: pgmap v58: 65 pgs: 8 active+recovery_wait+degraded, 57 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 5.3 KiB/s rd, 282 KiB/s wr, 96 op/s; 577/333 objects degraded (173.273%); 0 B/s, 12 objects/s recovering 2026-03-10T12:40:33.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:32 vm00.local 
ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:33.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:32 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:33.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:32 vm00.local ceph-mon[103263]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-10T12:40:33.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:32 vm07.local ceph-mon[93622]: pgmap v58: 65 pgs: 8 active+recovery_wait+degraded, 57 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 5.3 KiB/s rd, 282 KiB/s wr, 96 op/s; 577/333 objects degraded (173.273%); 0 B/s, 12 objects/s recovering 2026-03-10T12:40:33.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:32 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:33.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:32 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:33.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:32 vm07.local ceph-mon[93622]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-10T12:40:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:33 vm00.local ceph-mon[103263]: mgrmap e37: vm00.nescmq(active, since 92s), standbys: vm07.kfawlb 2026-03-10T12:40:34.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:33 vm07.local ceph-mon[93622]: mgrmap e37: vm00.nescmq(active, since 92s), standbys: vm07.kfawlb 2026-03-10T12:40:35.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:34 vm07.local ceph-mon[93622]: pgmap v59: 65 pgs: 8 active+recovery_wait+degraded, 57 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 5.2 KiB/s rd, 22 KiB/s wr, 10 op/s; 577/333 objects degraded (173.273%); 0 B/s, 8 objects/s recovering 2026-03-10T12:40:35.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:34 vm00.local ceph-mon[103263]: pgmap v59: 65 pgs: 8 active+recovery_wait+degraded, 57 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 5.2 KiB/s rd, 22 KiB/s wr, 10 op/s; 577/333 objects degraded (173.273%); 0 B/s, 8 objects/s recovering 2026-03-10T12:40:37.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:36 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 507/333 objects degraded (152.252%), 7 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:37.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:36 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 507/333 objects degraded (152.252%), 7 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:38.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:37 vm07.local ceph-mon[93622]: pgmap v60: 65 pgs: 7 active+recovery_wait+degraded, 1 active+recovering, 57 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 5.2 KiB/s rd, 
22 KiB/s wr, 10 op/s; 507/333 objects degraded (152.252%); 0 B/s, 12 objects/s recovering 2026-03-10T12:40:38.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:37 vm00.local ceph-mon[103263]: pgmap v60: 65 pgs: 7 active+recovery_wait+degraded, 1 active+recovering, 57 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 5.2 KiB/s rd, 22 KiB/s wr, 10 op/s; 507/333 objects degraded (152.252%); 0 B/s, 12 objects/s recovering 2026-03-10T12:40:39.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:39 vm00.local ceph-mon[103263]: pgmap v61: 65 pgs: 7 active+recovery_wait+degraded, 1 active+recovering, 57 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 170 B/s rd, 17 KiB/s wr, 1 op/s; 507/333 objects degraded (152.252%); 0 B/s, 9 objects/s recovering 2026-03-10T12:40:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:39 vm07.local ceph-mon[93622]: pgmap v61: 65 pgs: 7 active+recovery_wait+degraded, 1 active+recovering, 57 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 170 B/s rd, 17 KiB/s wr, 1 op/s; 507/333 objects degraded (152.252%); 0 B/s, 9 objects/s recovering 2026-03-10T12:40:42.049 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:41 vm07.local ceph-mon[93622]: pgmap v62: 65 pgs: 6 active+recovery_wait+degraded, 1 active+recovering, 58 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 341 B/s wr, 0 op/s; 429/333 objects degraded (128.829%); 0 B/s, 10 objects/s recovering 2026-03-10T12:40:42.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:41 vm00.local ceph-mon[103263]: pgmap v62: 65 pgs: 6 active+recovery_wait+degraded, 1 active+recovering, 58 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 341 B/s wr, 0 op/s; 429/333 objects degraded (128.829%); 0 B/s, 10 objects/s recovering 2026-03-10T12:40:43.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:42 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 
351/333 objects degraded (105.405%), 5 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:43.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:42 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 351/333 objects degraded (105.405%), 5 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:44.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:43 vm00.local ceph-mon[103263]: pgmap v63: 65 pgs: 5 active+recovery_wait+degraded, 2 active+recovering, 58 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 341 B/s wr, 0 op/s; 351/333 objects degraded (105.405%); 0 B/s, 13 objects/s recovering 2026-03-10T12:40:44.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:43 vm07.local ceph-mon[93622]: pgmap v63: 65 pgs: 5 active+recovery_wait+degraded, 2 active+recovering, 58 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 341 B/s wr, 0 op/s; 351/333 objects degraded (105.405%); 0 B/s, 13 objects/s recovering 2026-03-10T12:40:46.162 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:45 vm00.local ceph-mon[103263]: pgmap v64: 65 pgs: 5 active+recovery_wait+degraded, 2 active+recovering, 58 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 351/333 objects degraded (105.405%); 0 B/s, 10 objects/s recovering 2026-03-10T12:40:46.162 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:40:46.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:45 vm07.local ceph-mon[93622]: pgmap v64: 65 pgs: 5 active+recovery_wait+degraded, 2 active+recovering, 58 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 351/333 objects degraded (105.405%); 0 B/s, 10 objects/s recovering 2026-03-10T12:40:46.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:45 vm07.local ceph-mon[93622]: from='mgr.24563 
192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:40:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:47 vm00.local ceph-mon[103263]: pgmap v65: 65 pgs: 5 active+recovery_wait+degraded, 2 active+recovering, 58 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 351/333 objects degraded (105.405%); 0 B/s, 14 objects/s recovering 2026-03-10T12:40:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:40:48.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:47 vm07.local ceph-mon[93622]: pgmap v65: 65 pgs: 5 active+recovery_wait+degraded, 2 active+recovering, 58 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 351/333 objects degraded (105.405%); 0 B/s, 14 objects/s recovering 2026-03-10T12:40:48.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:40:49.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:48 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:40:49.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:48 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:40:49.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:48 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:40:49.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:48 
vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:40:49.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:48 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:40:49.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:48 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:40:49.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:48 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:40:49.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:48 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:48 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:40:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:48 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:40:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:48 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:40:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:48 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 
2026-03-10T12:40:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:48 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:40:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:48 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:40:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:48 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:40:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:48 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:49.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:49 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:49.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:49 vm00.local ceph-mon[103263]: Upgrade: unsafe to stop osd(s) at this time (2 PGs are or would become offline) 2026-03-10T12:40:49.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:49 vm00.local ceph-mon[103263]: pgmap v66: 65 pgs: 4 active+recovery_wait+degraded, 2 active+recovering, 59 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 268/333 objects degraded (80.480%); 0 B/s, 15 objects/s recovering 2026-03-10T12:40:49.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:49 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 268/333 objects degraded (80.480%), 4 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:50.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:49 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:40:50.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:49 vm07.local ceph-mon[93622]: Upgrade: unsafe to stop osd(s) at this time (2 PGs are or would become offline) 2026-03-10T12:40:50.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:49 vm07.local ceph-mon[93622]: pgmap v66: 65 pgs: 4 active+recovery_wait+degraded, 2 active+recovering, 59 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 268/333 objects degraded (80.480%); 0 B/s, 15 objects/s recovering 2026-03-10T12:40:50.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:49 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 268/333 objects degraded (80.480%), 4 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:52.124 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:51 vm07.local ceph-mon[93622]: pgmap v67: 65 pgs: 3 active+recovery_wait+degraded, 2 active+recovering, 60 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 204/333 objects degraded (61.261%); 0 B/s, 19 objects/s recovering 2026-03-10T12:40:52.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:51 vm00.local ceph-mon[103263]: pgmap v67: 65 pgs: 3 active+recovery_wait+degraded, 2 active+recovering, 60 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 204/333 objects degraded (61.261%); 0 B/s, 19 objects/s recovering 2026-03-10T12:40:54.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:53 vm00.local ceph-mon[103263]: pgmap v68: 65 pgs: 3 active+recovery_wait+degraded, 2 active+recovering, 60 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 204/333 objects degraded (61.261%); 0 B/s, 19 objects/s recovering 2026-03-10T12:40:54.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:53 vm07.local ceph-mon[93622]: pgmap v68: 65 pgs: 3 active+recovery_wait+degraded, 2 active+recovering, 60 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB 
/ 120 GiB avail; 204/333 objects degraded (61.261%); 0 B/s, 19 objects/s recovering 2026-03-10T12:40:54.487 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.486+0000 7f9d3f985700 1 -- 192.168.123.100:0/612196203 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d381036a0 msgr2=0x7f9d38105ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:54.487 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.486+0000 7f9d3f985700 1 --2- 192.168.123.100:0/612196203 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d381036a0 0x7f9d38105ac0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f9d34009b00 tx=0x7f9d34009e10 comp rx=0 tx=0).stop 2026-03-10T12:40:54.487 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.487+0000 7f9d3f985700 1 -- 192.168.123.100:0/612196203 shutdown_connections 2026-03-10T12:40:54.487 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.487+0000 7f9d3f985700 1 --2- 192.168.123.100:0/612196203 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d381036a0 0x7f9d38105ac0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.487 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.487+0000 7f9d3f985700 1 --2- 192.168.123.100:0/612196203 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9d38069160 0x7f9d38103160 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.487 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.487+0000 7f9d3f985700 1 -- 192.168.123.100:0/612196203 >> 192.168.123.100:0/612196203 conn(0x7f9d380faa70 msgr2=0x7f9d380fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:54.487 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.487+0000 7f9d3f985700 1 -- 192.168.123.100:0/612196203 shutdown_connections 2026-03-10T12:40:54.487 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.487+0000 7f9d3f985700 1 -- 192.168.123.100:0/612196203 wait complete. 2026-03-10T12:40:54.487 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.487+0000 7f9d3f985700 1 Processor -- start 2026-03-10T12:40:54.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.487+0000 7f9d3f985700 1 -- start start 2026-03-10T12:40:54.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.487+0000 7f9d3f985700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9d38069160 0x7f9d38193b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:54.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.487+0000 7f9d3f985700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d381036a0 0x7f9d381940d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:54.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.487+0000 7f9d3f985700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d381946f0 con 0x7f9d381036a0 2026-03-10T12:40:54.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.487+0000 7f9d3f985700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d38194830 con 0x7f9d38069160 2026-03-10T12:40:54.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.488+0000 7f9d3cf20700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d381036a0 0x7f9d381940d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:54.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.488+0000 7f9d3cf20700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d381036a0 0x7f9d381940d0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:35118/0 (socket says 192.168.123.100:35118) 2026-03-10T12:40:54.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.488+0000 7f9d3cf20700 1 -- 192.168.123.100:0/1413769557 learned_addr learned my addr 192.168.123.100:0/1413769557 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:40:54.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.488+0000 7f9d3cf20700 1 -- 192.168.123.100:0/1413769557 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9d38069160 msgr2=0x7f9d38193b90 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T12:40:54.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.488+0000 7f9d3d721700 1 --2- 192.168.123.100:0/1413769557 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9d38069160 0x7f9d38193b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:54.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.488+0000 7f9d3cf20700 1 --2- 192.168.123.100:0/1413769557 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9d38069160 0x7f9d38193b90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.488+0000 7f9d3cf20700 1 -- 192.168.123.100:0/1413769557 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9d340097e0 con 0x7f9d381036a0 2026-03-10T12:40:54.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.488+0000 7f9d3d721700 1 --2- 192.168.123.100:0/1413769557 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9d38069160 0x7f9d38193b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:40:54.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.488+0000 7f9d3cf20700 1 --2- 192.168.123.100:0/1413769557 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d381036a0 0x7f9d381940d0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f9d34009ad0 tx=0x7f9d34005070 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:54.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.488+0000 7f9d2e7fc700 1 -- 192.168.123.100:0/1413769557 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d3401d070 con 0x7f9d381036a0 2026-03-10T12:40:54.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.488+0000 7f9d2e7fc700 1 -- 192.168.123.100:0/1413769557 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9d3400bc50 con 0x7f9d381036a0 2026-03-10T12:40:54.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.488+0000 7f9d2e7fc700 1 -- 192.168.123.100:0/1413769557 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9d3400f7f0 con 0x7f9d381036a0 2026-03-10T12:40:54.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.488+0000 7f9d3f985700 1 -- 192.168.123.100:0/1413769557 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9d38199280 con 0x7f9d381036a0 2026-03-10T12:40:54.489 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.488+0000 7f9d3f985700 1 -- 192.168.123.100:0/1413769557 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9d381a1f50 con 0x7f9d381036a0 2026-03-10T12:40:54.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.489+0000 7f9d3f985700 1 -- 192.168.123.100:0/1413769557 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9d3818dd90 con 0x7f9d381036a0 2026-03-10T12:40:54.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.492+0000 7f9d2e7fc700 1 -- 192.168.123.100:0/1413769557 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9d3400f950 con 0x7f9d381036a0 2026-03-10T12:40:54.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.493+0000 7f9d2e7fc700 1 --2- 192.168.123.100:0/1413769557 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9d240779e0 0x7f9d24079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:54.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.493+0000 7f9d2e7fc700 1 -- 192.168.123.100:0/1413769557 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f9d3409be10 con 0x7f9d381036a0 2026-03-10T12:40:54.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.493+0000 7f9d3d721700 1 --2- 192.168.123.100:0/1413769557 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9d240779e0 0x7f9d24079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:54.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.494+0000 7f9d3d721700 1 --2- 192.168.123.100:0/1413769557 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9d240779e0 0x7f9d24079e90 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f9d28005fd0 tx=0x7f9d28005dc0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:54.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.494+0000 7f9d2e7fc700 1 -- 192.168.123.100:0/1413769557 <== mon.0 v2:192.168.123.100:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9d34060cf0 con 0x7f9d381036a0 2026-03-10T12:40:54.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.624+0000 7f9d3f985700 1 -- 192.168.123.100:0/1413769557 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9d38061190 con 0x7f9d240779e0 2026-03-10T12:40:54.625 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.625+0000 7f9d2e7fc700 1 -- 192.168.123.100:0/1413769557 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f9d38061190 con 0x7f9d240779e0 2026-03-10T12:40:54.627 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.627+0000 7f9d3f985700 1 -- 192.168.123.100:0/1413769557 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9d240779e0 msgr2=0x7f9d24079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:54.627 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.627+0000 7f9d3f985700 1 --2- 192.168.123.100:0/1413769557 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9d240779e0 0x7f9d24079e90 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f9d28005fd0 tx=0x7f9d28005dc0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.628 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.627+0000 7f9d3f985700 1 -- 192.168.123.100:0/1413769557 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d381036a0 msgr2=0x7f9d381940d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:54.628 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.627+0000 7f9d3f985700 1 --2- 192.168.123.100:0/1413769557 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d381036a0 0x7f9d381940d0 secure :-1 s=READY pgs=68 cs=0 
l=1 rev1=1 crypto rx=0x7f9d34009ad0 tx=0x7f9d34005070 comp rx=0 tx=0).stop 2026-03-10T12:40:54.628 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.628+0000 7f9d3f985700 1 -- 192.168.123.100:0/1413769557 shutdown_connections 2026-03-10T12:40:54.628 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.628+0000 7f9d3f985700 1 --2- 192.168.123.100:0/1413769557 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9d240779e0 0x7f9d24079e90 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.628 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.628+0000 7f9d3f985700 1 --2- 192.168.123.100:0/1413769557 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9d38069160 0x7f9d38193b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.628 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.628+0000 7f9d3f985700 1 --2- 192.168.123.100:0/1413769557 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9d381036a0 0x7f9d381940d0 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.628 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.628+0000 7f9d3f985700 1 -- 192.168.123.100:0/1413769557 >> 192.168.123.100:0/1413769557 conn(0x7f9d380faa70 msgr2=0x7f9d380fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:54.628 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.628+0000 7f9d3f985700 1 -- 192.168.123.100:0/1413769557 shutdown_connections 2026-03-10T12:40:54.628 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.628+0000 7f9d3f985700 1 -- 192.168.123.100:0/1413769557 wait complete. 
2026-03-10T12:40:54.637 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:40:54.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.694+0000 7f898b5e0700 1 -- 192.168.123.100:0/3321239037 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89841036a0 msgr2=0x7f8984105ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:54.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.694+0000 7f898b5e0700 1 --2- 192.168.123.100:0/3321239037 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89841036a0 0x7f8984105ac0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f8980009b50 tx=0x7f8980009e60 comp rx=0 tx=0).stop 2026-03-10T12:40:54.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.694+0000 7f898b5e0700 1 -- 192.168.123.100:0/3321239037 shutdown_connections 2026-03-10T12:40:54.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.694+0000 7f898b5e0700 1 --2- 192.168.123.100:0/3321239037 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89841036a0 0x7f8984105ac0 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.694+0000 7f898b5e0700 1 --2- 192.168.123.100:0/3321239037 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8984069160 0x7f8984103160 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.694+0000 7f898b5e0700 1 -- 192.168.123.100:0/3321239037 >> 192.168.123.100:0/3321239037 conn(0x7f89840faa70 msgr2=0x7f89840fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:54.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.695+0000 7f898b5e0700 1 -- 192.168.123.100:0/3321239037 shutdown_connections 2026-03-10T12:40:54.695 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.695+0000 7f898b5e0700 1 -- 192.168.123.100:0/3321239037 wait complete. 2026-03-10T12:40:54.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.695+0000 7f898b5e0700 1 Processor -- start 2026-03-10T12:40:54.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.695+0000 7f898b5e0700 1 -- start start 2026-03-10T12:40:54.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.695+0000 7f898b5e0700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8984069160 0x7f8984197fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:54.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.695+0000 7f898b5e0700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89841036a0 0x7f89841984e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:54.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.695+0000 7f898b5e0700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8984198b00 con 0x7f89841036a0 2026-03-10T12:40:54.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.695+0000 7f898b5e0700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8984198c40 con 0x7f8984069160 2026-03-10T12:40:54.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.696+0000 7f898937c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8984069160 0x7f8984197fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:54.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.696+0000 7f898937c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8984069160 0x7f8984197fa0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:60056/0 (socket says 192.168.123.100:60056) 2026-03-10T12:40:54.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.696+0000 7f898937c700 1 -- 192.168.123.100:0/3471805876 learned_addr learned my addr 192.168.123.100:0/3471805876 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:40:54.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.696+0000 7f8988b7b700 1 --2- 192.168.123.100:0/3471805876 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89841036a0 0x7f89841984e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:54.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.696+0000 7f898937c700 1 -- 192.168.123.100:0/3471805876 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89841036a0 msgr2=0x7f89841984e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:54.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.696+0000 7f898937c700 1 --2- 192.168.123.100:0/3471805876 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89841036a0 0x7f89841984e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.696+0000 7f898937c700 1 -- 192.168.123.100:0/3471805876 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f89800097e0 con 0x7f8984069160 2026-03-10T12:40:54.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.696+0000 7f8988b7b700 1 --2- 192.168.123.100:0/3471805876 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89841036a0 0x7f89841984e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T12:40:54.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.696+0000 7f898937c700 1 --2- 192.168.123.100:0/3471805876 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8984069160 0x7f8984197fa0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f8974009fd0 tx=0x7f897400eea0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:54.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.697+0000 7f897a7fc700 1 -- 192.168.123.100:0/3471805876 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8974009980 con 0x7f8984069160 2026-03-10T12:40:54.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.697+0000 7f898b5e0700 1 -- 192.168.123.100:0/3471805876 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f898419d6f0 con 0x7f8984069160 2026-03-10T12:40:54.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.697+0000 7f898b5e0700 1 -- 192.168.123.100:0/3471805876 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f898419dc40 con 0x7f8984069160 2026-03-10T12:40:54.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.697+0000 7f897a7fc700 1 -- 192.168.123.100:0/3471805876 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8974004500 con 0x7f8984069160 2026-03-10T12:40:54.697 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.697+0000 7f897a7fc700 1 -- 192.168.123.100:0/3471805876 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8974010450 con 0x7f8984069160 2026-03-10T12:40:54.699 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.698+0000 7f897a7fc700 1 -- 192.168.123.100:0/3471805876 <== mon.1 v2:192.168.123.107:3300/0 4 ==== 
mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f897400cca0 con 0x7f8984069160 2026-03-10T12:40:54.699 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.698+0000 7f898b5e0700 1 -- 192.168.123.100:0/3471805876 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8968005320 con 0x7f8984069160 2026-03-10T12:40:54.699 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.699+0000 7f897a7fc700 1 --2- 192.168.123.100:0/3471805876 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f89700778e0 0x7f8970079d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:54.699 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.699+0000 7f897a7fc700 1 -- 192.168.123.100:0/3471805876 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f8974014070 con 0x7f8984069160 2026-03-10T12:40:54.699 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.699+0000 7f8988b7b700 1 --2- 192.168.123.100:0/3471805876 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f89700778e0 0x7f8970079d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:54.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.699+0000 7f8988b7b700 1 --2- 192.168.123.100:0/3471805876 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f89700778e0 0x7f8970079d90 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f8980005950 tx=0x7f89800058e0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:54.702 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.702+0000 7f897a7fc700 1 -- 192.168.123.100:0/3471805876 <== mon.1 v2:192.168.123.107:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8974062760 con 0x7f8984069160 2026-03-10T12:40:54.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.824+0000 7f898b5e0700 1 -- 192.168.123.100:0/3471805876 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8968000bf0 con 0x7f89700778e0 2026-03-10T12:40:54.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.826+0000 7f897a7fc700 1 -- 192.168.123.100:0/3471805876 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f8968000bf0 con 0x7f89700778e0 2026-03-10T12:40:54.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.829+0000 7f898b5e0700 1 -- 192.168.123.100:0/3471805876 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f89700778e0 msgr2=0x7f8970079d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:54.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.829+0000 7f898b5e0700 1 --2- 192.168.123.100:0/3471805876 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f89700778e0 0x7f8970079d90 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f8980005950 tx=0x7f89800058e0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.829+0000 7f898b5e0700 1 -- 192.168.123.100:0/3471805876 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8984069160 msgr2=0x7f8984197fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:54.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.829+0000 7f898b5e0700 1 --2- 192.168.123.100:0/3471805876 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8984069160 0x7f8984197fa0 secure :-1 s=READY pgs=17 cs=0 
l=1 rev1=1 crypto rx=0x7f8974009fd0 tx=0x7f897400eea0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.829+0000 7f898b5e0700 1 -- 192.168.123.100:0/3471805876 shutdown_connections 2026-03-10T12:40:54.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.829+0000 7f898b5e0700 1 --2- 192.168.123.100:0/3471805876 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f89700778e0 0x7f8970079d90 secure :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f8980005950 tx=0x7f89800058e0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.829+0000 7f898b5e0700 1 --2- 192.168.123.100:0/3471805876 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8984069160 0x7f8984197fa0 secure :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f8974009fd0 tx=0x7f897400eea0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.829+0000 7f898b5e0700 1 --2- 192.168.123.100:0/3471805876 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89841036a0 0x7f89841984e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.829+0000 7f898b5e0700 1 -- 192.168.123.100:0/3471805876 >> 192.168.123.100:0/3471805876 conn(0x7f89840faa70 msgr2=0x7f8984104300 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:54.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.829+0000 7f898b5e0700 1 -- 192.168.123.100:0/3471805876 shutdown_connections 2026-03-10T12:40:54.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.829+0000 7f898b5e0700 1 -- 192.168.123.100:0/3471805876 wait complete. 
2026-03-10T12:40:54.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.899+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/105080540 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68075700 msgr2=0x7fcd68075b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:54.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.899+0000 7fcd6ebf8700 1 --2- 192.168.123.100:0/105080540 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68075700 0x7fcd68075b10 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fcd58009a60 tx=0x7fcd58009d70 comp rx=0 tx=0).stop 2026-03-10T12:40:54.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.899+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/105080540 shutdown_connections 2026-03-10T12:40:54.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.899+0000 7fcd6ebf8700 1 --2- 192.168.123.100:0/105080540 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcd68076950 0x7fcd68076dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.899+0000 7fcd6ebf8700 1 --2- 192.168.123.100:0/105080540 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68075700 0x7fcd68075b10 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.899+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/105080540 >> 192.168.123.100:0/105080540 conn(0x7fcd680fda60 msgr2=0x7fcd680ffeb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:54.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.900+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/105080540 shutdown_connections 2026-03-10T12:40:54.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.900+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/105080540 wait 
complete. 2026-03-10T12:40:54.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.900+0000 7fcd6ebf8700 1 Processor -- start 2026-03-10T12:40:54.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.900+0000 7fcd6ebf8700 1 -- start start 2026-03-10T12:40:54.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.901+0000 7fcd6ebf8700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcd68075700 0x7fcd6819c160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:54.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.901+0000 7fcd6ebf8700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68076950 0x7fcd6819c6a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:54.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.901+0000 7fcd6ebf8700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd6819ccc0 con 0x7fcd68075700 2026-03-10T12:40:54.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.901+0000 7fcd6d3f5700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68076950 0x7fcd6819c6a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:54.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.901+0000 7fcd6d3f5700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68076950 0x7fcd6819c6a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:60064/0 (socket says 192.168.123.100:60064) 2026-03-10T12:40:54.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.901+0000 7fcd6d3f5700 1 -- 192.168.123.100:0/4281847945 learned_addr learned my 
addr 192.168.123.100:0/4281847945 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:40:54.902 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.901+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/4281847945 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcd6819ce00 con 0x7fcd68076950 2026-03-10T12:40:54.902 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.901+0000 7fcd6dbf6700 1 --2- 192.168.123.100:0/4281847945 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcd68075700 0x7fcd6819c160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:54.902 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.901+0000 7fcd6dbf6700 1 -- 192.168.123.100:0/4281847945 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68076950 msgr2=0x7fcd6819c6a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:54.902 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.901+0000 7fcd6dbf6700 1 --2- 192.168.123.100:0/4281847945 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68076950 0x7fcd6819c6a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:54.902 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.902+0000 7fcd6dbf6700 1 -- 192.168.123.100:0/4281847945 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcd640097e0 con 0x7fcd68075700 2026-03-10T12:40:54.902 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.902+0000 7fcd6d3f5700 1 --2- 192.168.123.100:0/4281847945 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68076950 0x7fcd6819c6a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T12:40:54.902 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.902+0000 7fcd6dbf6700 1 --2- 192.168.123.100:0/4281847945 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcd68075700 0x7fcd6819c160 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fcd58009a60 tx=0x7fcd5800f740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:54.902 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.902+0000 7fcd5effd700 1 -- 192.168.123.100:0/4281847945 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd5801d070 con 0x7fcd68075700 2026-03-10T12:40:54.903 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.902+0000 7fcd5effd700 1 -- 192.168.123.100:0/4281847945 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcd5800fca0 con 0x7fcd68075700 2026-03-10T12:40:54.903 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.902+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/4281847945 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcd58009710 con 0x7fcd68075700 2026-03-10T12:40:54.904 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.902+0000 7fcd5effd700 1 -- 192.168.123.100:0/4281847945 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcd58017720 con 0x7fcd68075700 2026-03-10T12:40:54.904 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.902+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/4281847945 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcd681a1bb0 con 0x7fcd68075700 2026-03-10T12:40:54.904 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.904+0000 7fcd5effd700 1 -- 192.168.123.100:0/4281847945 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcd580178a0 con 
0x7fcd68075700 2026-03-10T12:40:54.905 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.904+0000 7fcd5effd700 1 --2- 192.168.123.100:0/4281847945 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcd540778c0 0x7fcd54079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:54.905 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.904+0000 7fcd5effd700 1 -- 192.168.123.100:0/4281847945 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fcd5809bce0 con 0x7fcd68075700 2026-03-10T12:40:54.905 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.904+0000 7fcd6d3f5700 1 --2- 192.168.123.100:0/4281847945 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcd540778c0 0x7fcd54079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:54.905 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.904+0000 7fcd6d3f5700 1 --2- 192.168.123.100:0/4281847945 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcd540778c0 0x7fcd54079d70 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fcd640103c0 tx=0x7fcd64009500 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:54.905 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.905+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/4281847945 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcd6804ea50 con 0x7fcd68075700 2026-03-10T12:40:54.908 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:54.908+0000 7fcd5effd700 1 -- 192.168.123.100:0/4281847945 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7fcd580645c0 con 0x7fcd68075700 2026-03-10T12:40:55.023 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.023+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/4281847945 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fcd6810de50 con 0x7fcd540778c0 2026-03-10T12:40:55.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.028+0000 7fcd5effd700 1 -- 192.168.123.100:0/4281847945 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fcd6810de50 con 0x7fcd540778c0 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (7m) 69s ago 7m 25.5M - 0.25.0 c8568f914cd2 1897443e8fdf 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (8m) 69s ago 8m 8774k - 18.2.0 dc2bc1663786 d9c35bbdf4cd 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (7m) 84s ago 7m 11.2M - 18.2.0 dc2bc1663786 2a98961ae9ca 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (89s) 69s ago 8m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 88cd1c2e0041 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (87s) 84s ago 7m 7864k - 19.2.3-678-ge911bdeb 654f31e6858e 889e2b356266 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (7m) 69s ago 7m 90.9M - 9.4.7 954c08fa6188 16c12a6ce1fc 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (5m) 69s ago 5m 152M - 18.2.0 dc2bc1663786 13dfd2469732 2026-03-10T12:40:55.029 
INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (5m) 69s ago 5m 18.0M - 18.2.0 dc2bc1663786 d65368ac6dfa 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (5m) 84s ago 5m 17.4M - 18.2.0 dc2bc1663786 1b9425223bd2 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (5m) 84s ago 5m 169M - 18.2.0 dc2bc1663786 9c2b36fc1ac1 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:8443,9283,8765 running (2m) 69s ago 8m 615M - 19.2.3-678-ge911bdeb 654f31e6858e eaf11f86af53 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (2m) 84s ago 7m 488M - 19.2.3-678-ge911bdeb 654f31e6858e 22c93435649c 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (2m) 69s ago 8m 54.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e e8cc5980a849 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (106s) 84s ago 7m 50.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 032abad282fc 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (7m) 69s ago 7m 15.2M - 1.5.0 0da6a335fe13 7a5c14d6ba46 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (7m) 84s ago 7m 15.5M - 1.5.0 0da6a335fe13 2fac2415b763 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (73s) 69s ago 7m 29.4M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9b151d44f3cf 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (6m) 69s ago 6m 383M 4096M 18.2.0 dc2bc1663786 5bc971fe4d49 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (6m) 69s ago 6m 322M 4096M 18.2.0 dc2bc1663786 a5a89ccf847e 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 
running (6m) 84s ago 6m 452M 4096M 18.2.0 dc2bc1663786 0c6249fe3951 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (6m) 84s ago 6m 402M 4096M 18.2.0 dc2bc1663786 5c66bfb63a83 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (6m) 84s ago 6m 370M 4096M 18.2.0 dc2bc1663786 277bdfe4ec55 2026-03-10T12:40:55.029 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (2m) 69s ago 7m 55.3M - 2.43.0 a07b618ecd1d c80074b6b052 2026-03-10T12:40:55.031 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.030+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/4281847945 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcd540778c0 msgr2=0x7fcd54079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.031 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.030+0000 7fcd6ebf8700 1 --2- 192.168.123.100:0/4281847945 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcd540778c0 0x7fcd54079d70 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fcd640103c0 tx=0x7fcd64009500 comp rx=0 tx=0).stop 2026-03-10T12:40:55.031 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.030+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/4281847945 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcd68075700 msgr2=0x7fcd6819c160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.031 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.030+0000 7fcd6ebf8700 1 --2- 192.168.123.100:0/4281847945 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcd68075700 0x7fcd6819c160 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fcd58009a60 tx=0x7fcd5800f740 comp rx=0 tx=0).stop 2026-03-10T12:40:55.031 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.031+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/4281847945 shutdown_connections 
2026-03-10T12:40:55.031 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.031+0000 7fcd6ebf8700 1 --2- 192.168.123.100:0/4281847945 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcd540778c0 0x7fcd54079d70 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.031 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.031+0000 7fcd6ebf8700 1 --2- 192.168.123.100:0/4281847945 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcd68075700 0x7fcd6819c160 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.031 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.031+0000 7fcd6ebf8700 1 --2- 192.168.123.100:0/4281847945 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcd68076950 0x7fcd6819c6a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.031 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.031+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/4281847945 >> 192.168.123.100:0/4281847945 conn(0x7fcd680fda60 msgr2=0x7fcd6810c730 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:55.031 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.031+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/4281847945 shutdown_connections 2026-03-10T12:40:55.032 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.031+0000 7fcd6ebf8700 1 -- 192.168.123.100:0/4281847945 wait complete. 
2026-03-10T12:40:55.101 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:54 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 204/333 objects degraded (61.261%), 3 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:55.101 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.100+0000 7f0c88c6f700 1 -- 192.168.123.100:0/2259601455 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0c841038d0 msgr2=0x7f0c84105cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.101 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.100+0000 7f0c88c6f700 1 --2- 192.168.123.100:0/2259601455 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0c841038d0 0x7f0c84105cb0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f0c74009b00 tx=0x7f0c74009e10 comp rx=0 tx=0).stop 2026-03-10T12:40:55.104 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.104+0000 7f0c88c6f700 1 -- 192.168.123.100:0/2259601455 shutdown_connections 2026-03-10T12:40:55.104 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.104+0000 7f0c88c6f700 1 --2- 192.168.123.100:0/2259601455 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0c841038d0 0x7f0c84105cb0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.104 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.104+0000 7f0c88c6f700 1 --2- 192.168.123.100:0/2259601455 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0c84100fb0 0x7f0c84103390 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.104 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.104+0000 7f0c88c6f700 1 -- 192.168.123.100:0/2259601455 >> 192.168.123.100:0/2259601455 conn(0x7f0c840fa990 msgr2=0x7f0c840fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:55.104 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.104+0000 7f0c88c6f700 1 -- 192.168.123.100:0/2259601455 shutdown_connections 2026-03-10T12:40:55.105 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.104+0000 7f0c88c6f700 1 -- 192.168.123.100:0/2259601455 wait complete. 2026-03-10T12:40:55.105 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.105+0000 7f0c88c6f700 1 Processor -- start 2026-03-10T12:40:55.106 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.105+0000 7f0c88c6f700 1 -- start start 2026-03-10T12:40:55.106 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.105+0000 7f0c88c6f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0c84100fb0 0x7f0c84193be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:55.106 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.105+0000 7f0c88c6f700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0c841038d0 0x7f0c84194120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:55.106 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.105+0000 7f0c88c6f700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0c841946b0 con 0x7f0c84100fb0 2026-03-10T12:40:55.106 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.105+0000 7f0c88c6f700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0c841947f0 con 0x7f0c841038d0 2026-03-10T12:40:55.107 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.105+0000 7f0c8259c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0c84100fb0 0x7f0c84193be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:55.107 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.105+0000 7f0c8259c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0c84100fb0 0x7f0c84193be0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:35172/0 (socket says 192.168.123.100:35172) 2026-03-10T12:40:55.107 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.105+0000 7f0c8259c700 1 -- 192.168.123.100:0/4040678729 learned_addr learned my addr 192.168.123.100:0/4040678729 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:40:55.107 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.105+0000 7f0c8259c700 1 -- 192.168.123.100:0/4040678729 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0c841038d0 msgr2=0x7f0c84194120 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:40:55.107 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.105+0000 7f0c8259c700 1 --2- 192.168.123.100:0/4040678729 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0c841038d0 0x7f0c84194120 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.107 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.105+0000 7f0c8259c700 1 -- 192.168.123.100:0/4040678729 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0c740097e0 con 0x7f0c84100fb0 2026-03-10T12:40:55.107 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.106+0000 7f0c8259c700 1 --2- 192.168.123.100:0/4040678729 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0c84100fb0 0x7f0c84193be0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f0c6c00c930 tx=0x7f0c6c00ccf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:55.107 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.106+0000 7f0c7b7fe700 1 -- 192.168.123.100:0/4040678729 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0c6c007ab0 con 0x7f0c84100fb0 2026-03-10T12:40:55.107 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.106+0000 7f0c7b7fe700 1 -- 192.168.123.100:0/4040678729 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0c6c007c10 con 0x7f0c84100fb0 2026-03-10T12:40:55.108 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.106+0000 7f0c7b7fe700 1 -- 192.168.123.100:0/4040678729 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0c6c0186e0 con 0x7f0c84100fb0 2026-03-10T12:40:55.108 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.107+0000 7f0c88c6f700 1 -- 192.168.123.100:0/4040678729 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0c841992b0 con 0x7f0c84100fb0 2026-03-10T12:40:55.109 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.108+0000 7f0c88c6f700 1 -- 192.168.123.100:0/4040678729 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0c841997d0 con 0x7f0c84100fb0 2026-03-10T12:40:55.109 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.109+0000 7f0c7b7fe700 1 -- 192.168.123.100:0/4040678729 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0c6c018840 con 0x7f0c84100fb0 2026-03-10T12:40:55.109 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.109+0000 7f0c7b7fe700 1 --2- 192.168.123.100:0/4040678729 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0c700778c0 0x7f0c70079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:55.110 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.109+0000 7f0c7b7fe700 1 -- 192.168.123.100:0/4040678729 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f0c6c099b40 con 0x7f0c84100fb0 2026-03-10T12:40:55.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.109+0000 7f0c88c6f700 1 -- 192.168.123.100:0/4040678729 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0c840fc590 con 0x7f0c84100fb0 2026-03-10T12:40:55.110 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.110+0000 7f0c81d9b700 1 --2- 192.168.123.100:0/4040678729 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0c700778c0 0x7f0c70079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:55.111 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.110+0000 7f0c81d9b700 1 --2- 192.168.123.100:0/4040678729 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0c700778c0 0x7f0c70079d70 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f0c74005f50 tx=0x7f0c74005dc0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:55.112 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.112+0000 7f0c7b7fe700 1 -- 192.168.123.100:0/4040678729 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0c6c0623f0 con 0x7f0c84100fb0 2026-03-10T12:40:55.279 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.278+0000 7f0c88c6f700 1 -- 192.168.123.100:0/4040678729 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f0c8402cf90 con 0x7f0c84100fb0 2026-03-10T12:40:55.280 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.279+0000 7f0c7b7fe700 1 -- 192.168.123.100:0/4040678729 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f0c6c061b40 con 0x7f0c84100fb0 2026-03-10T12:40:55.280 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:40:55.280 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:40:55.280 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:40:55.280 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:40:55.280 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:40:55.280 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:40:55.280 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:40:55.280 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 2026-03-10T12:40:55.280 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 5, 2026-03-10T12:40:55.280 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T12:40:55.280 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:40:55.280 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 2026-03-10T12:40:55.280 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T12:40:55.281 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:40:55.281 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:40:55.281 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 9, 2026-03-10T12:40:55.281 
INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T12:40:55.281 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:40:55.281 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:40:55.282 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.282+0000 7f0c88c6f700 1 -- 192.168.123.100:0/4040678729 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0c700778c0 msgr2=0x7f0c70079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.282+0000 7f0c88c6f700 1 --2- 192.168.123.100:0/4040678729 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0c700778c0 0x7f0c70079d70 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f0c74005f50 tx=0x7f0c74005dc0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.282+0000 7f0c88c6f700 1 -- 192.168.123.100:0/4040678729 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0c84100fb0 msgr2=0x7f0c84193be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.282+0000 7f0c88c6f700 1 --2- 192.168.123.100:0/4040678729 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0c84100fb0 0x7f0c84193be0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f0c6c00c930 tx=0x7f0c6c00ccf0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.282+0000 7f0c88c6f700 1 -- 192.168.123.100:0/4040678729 shutdown_connections 2026-03-10T12:40:55.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.282+0000 7f0c88c6f700 1 --2- 192.168.123.100:0/4040678729 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0c700778c0 
0x7f0c70079d70 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.282+0000 7f0c88c6f700 1 --2- 192.168.123.100:0/4040678729 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0c84100fb0 0x7f0c84193be0 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.282+0000 7f0c88c6f700 1 --2- 192.168.123.100:0/4040678729 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0c841038d0 0x7f0c84194120 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.282+0000 7f0c88c6f700 1 -- 192.168.123.100:0/4040678729 >> 192.168.123.100:0/4040678729 conn(0x7f0c840fa990 msgr2=0x7f0c840ff630 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:55.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.283+0000 7f0c88c6f700 1 -- 192.168.123.100:0/4040678729 shutdown_connections 2026-03-10T12:40:55.284 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.283+0000 7f0c88c6f700 1 -- 192.168.123.100:0/4040678729 wait complete. 
2026-03-10T12:40:55.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:54 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 204/333 objects degraded (61.261%), 3 pgs degraded (PG_DEGRADED) 2026-03-10T12:40:55.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.349+0000 7fb3bca40700 1 -- 192.168.123.100:0/3339358033 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb3b8100700 msgr2=0x7fb3b8100b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.349+0000 7fb3bca40700 1 --2- 192.168.123.100:0/3339358033 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb3b8100700 0x7fb3b8100b70 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7fb3a8009b00 tx=0x7fb3a8009e10 comp rx=0 tx=0).stop 2026-03-10T12:40:55.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.350+0000 7fb3bca40700 1 -- 192.168.123.100:0/3339358033 shutdown_connections 2026-03-10T12:40:55.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.350+0000 7fb3bca40700 1 --2- 192.168.123.100:0/3339358033 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb3b8100700 0x7fb3b8100b70 secure :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7fb3a8009b00 tx=0x7fb3a8009e10 comp rx=0 tx=0).stop 2026-03-10T12:40:55.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.350+0000 7fb3bca40700 1 --2- 192.168.123.100:0/3339358033 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb3b80ff460 0x7fb3b80ff870 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.351 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.350+0000 7fb3bca40700 1 -- 192.168.123.100:0/3339358033 >> 192.168.123.100:0/3339358033 conn(0x7fb3b80faa70 msgr2=0x7fb3b80fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:55.353 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.353+0000 7fb3bca40700 1 -- 192.168.123.100:0/3339358033 shutdown_connections 2026-03-10T12:40:55.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.353+0000 7fb3bca40700 1 -- 192.168.123.100:0/3339358033 wait complete. 2026-03-10T12:40:55.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.353+0000 7fb3bca40700 1 Processor -- start 2026-03-10T12:40:55.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.354+0000 7fb3bca40700 1 -- start start 2026-03-10T12:40:55.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.354+0000 7fb3bca40700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb3b80ff460 0x7fb3b8071f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:55.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.354+0000 7fb3bca40700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb3b8072480 0x7fb3b819fb60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:55.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.354+0000 7fb3bca40700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb3b80728f0 con 0x7fb3b8072480 2026-03-10T12:40:55.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.354+0000 7fb3bca40700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb3b8072a60 con 0x7fb3b80ff460 2026-03-10T12:40:55.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.354+0000 7fb3b659c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb3b80ff460 0x7fb3b8071f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:55.355 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.354+0000 7fb3b659c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb3b80ff460 0x7fb3b8071f40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:60094/0 (socket says 192.168.123.100:60094) 2026-03-10T12:40:55.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.354+0000 7fb3b659c700 1 -- 192.168.123.100:0/1918742668 learned_addr learned my addr 192.168.123.100:0/1918742668 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:40:55.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.354+0000 7fb3b659c700 1 -- 192.168.123.100:0/1918742668 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb3b8072480 msgr2=0x7fb3b819fb60 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:40:55.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.354+0000 7fb3b659c700 1 --2- 192.168.123.100:0/1918742668 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb3b8072480 0x7fb3b819fb60 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.354+0000 7fb3b659c700 1 -- 192.168.123.100:0/1918742668 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb3a80097e0 con 0x7fb3b80ff460 2026-03-10T12:40:55.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.354+0000 7fb3b659c700 1 --2- 192.168.123.100:0/1918742668 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb3b80ff460 0x7fb3b8071f40 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fb3a000c930 tx=0x7fb3a000ccf0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:55.355 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.354+0000 7fb3af7fe700 1 -- 192.168.123.100:0/1918742668 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb3a0007ab0 con 0x7fb3b80ff460 2026-03-10T12:40:55.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.355+0000 7fb3af7fe700 1 -- 192.168.123.100:0/1918742668 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb3a0007c10 con 0x7fb3b80ff460 2026-03-10T12:40:55.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.355+0000 7fb3af7fe700 1 -- 192.168.123.100:0/1918742668 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb3a00186e0 con 0x7fb3b80ff460 2026-03-10T12:40:55.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.355+0000 7fb3bca40700 1 -- 192.168.123.100:0/1918742668 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb3b81a00a0 con 0x7fb3b80ff460 2026-03-10T12:40:55.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.355+0000 7fb3bca40700 1 -- 192.168.123.100:0/1918742668 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb3b81a05c0 con 0x7fb3b80ff460 2026-03-10T12:40:55.357 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.356+0000 7fb3af7fe700 1 -- 192.168.123.100:0/1918742668 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb3a0018840 con 0x7fb3b80ff460 2026-03-10T12:40:55.357 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.357+0000 7fb3af7fe700 1 --2- 192.168.123.100:0/1918742668 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb3a40778c0 0x7fb3a4079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:55.357 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.357+0000 7fb3af7fe700 1 -- 192.168.123.100:0/1918742668 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fb3a0099b40 con 0x7fb3b80ff460 2026-03-10T12:40:55.357 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.357+0000 7fb3ad7fa700 1 -- 192.168.123.100:0/1918742668 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb3b8072ba0 con 0x7fb3b80ff460 2026-03-10T12:40:55.358 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.357+0000 7fb3b5d9b700 1 --2- 192.168.123.100:0/1918742668 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb3a40778c0 0x7fb3a4079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:55.358 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.358+0000 7fb3b5d9b700 1 --2- 192.168.123.100:0/1918742668 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb3a40778c0 0x7fb3a4079d70 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fb3a8009fd0 tx=0x7fb3a8005fb0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:55.360 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.360+0000 7fb3af7fe700 1 -- 192.168.123.100:0/1918742668 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb3a00623f0 con 0x7fb3b80ff460 2026-03-10T12:40:55.498 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.497+0000 7fb3ad7fa700 1 -- 192.168.123.100:0/1918742668 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fb3b8061e60 con 0x7fb3b80ff460 2026-03-10T12:40:55.499 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.498+0000 7fb3af7fe700 1 -- 192.168.123.100:0/1918742668 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1961 (secure 0 0 0) 0x7fb3a0061b40 con 0x7fb3b80ff460 2026-03-10T12:40:55.500 INFO:teuthology.orchestra.run.vm00.stdout:e13 2026-03-10T12:40:55.500 INFO:teuthology.orchestra.run.vm00.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T12:40:55.500 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:40:55.500 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:40:55.500 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 2026-03-10T12:40:55.500 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:40:55.500 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:40:55.500 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:epoch 13 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:37:36.646083+0000 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:40:55.501 
INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:max_xattr_size 65536 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307} 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:failed 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:qdb_cluster leader: 0 members: 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 export targets 1 join_fscid=1 addr 
[v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons: 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:40:55.501 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:40:55.502 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.502+0000 7fb3ad7fa700 1 -- 192.168.123.100:0/1918742668 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb3a40778c0 msgr2=0x7fb3a4079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.503 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.502+0000 7fb3ad7fa700 1 --2- 192.168.123.100:0/1918742668 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb3a40778c0 0x7fb3a4079d70 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fb3a8009fd0 tx=0x7fb3a8005fb0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.503 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.502+0000 7fb3ad7fa700 1 -- 192.168.123.100:0/1918742668 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7fb3b80ff460 msgr2=0x7fb3b8071f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.503 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.502+0000 7fb3ad7fa700 1 --2- 192.168.123.100:0/1918742668 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb3b80ff460 0x7fb3b8071f40 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fb3a000c930 tx=0x7fb3a000ccf0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.503 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.502+0000 7fb3ad7fa700 1 -- 192.168.123.100:0/1918742668 shutdown_connections 2026-03-10T12:40:55.503 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.502+0000 7fb3ad7fa700 1 --2- 192.168.123.100:0/1918742668 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb3a40778c0 0x7fb3a4079d70 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.503 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.502+0000 7fb3ad7fa700 1 --2- 192.168.123.100:0/1918742668 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb3b80ff460 0x7fb3b8071f40 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.503 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.502+0000 7fb3ad7fa700 1 --2- 192.168.123.100:0/1918742668 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb3b8072480 0x7fb3b819fb60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.503 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.502+0000 7fb3ad7fa700 1 -- 192.168.123.100:0/1918742668 >> 192.168.123.100:0/1918742668 conn(0x7fb3b80faa70 msgr2=0x7fb3b8103930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:55.503 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.502+0000 7fb3ad7fa700 1 -- 192.168.123.100:0/1918742668 shutdown_connections 
2026-03-10T12:40:55.504 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.502+0000 7fb3ad7fa700 1 -- 192.168.123.100:0/1918742668 wait complete. 2026-03-10T12:40:55.504 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 13 2026-03-10T12:40:55.566 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.566+0000 7fdb34e0e700 1 -- 192.168.123.100:0/661187236 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb30103950 msgr2=0x7fdb30105d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.566 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.566+0000 7fdb34e0e700 1 --2- 192.168.123.100:0/661187236 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb30103950 0x7fdb30105d30 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fdb20009b00 tx=0x7fdb20009e10 comp rx=0 tx=0).stop 2026-03-10T12:40:55.566 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.566+0000 7fdb34e0e700 1 -- 192.168.123.100:0/661187236 shutdown_connections 2026-03-10T12:40:55.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.566+0000 7fdb34e0e700 1 --2- 192.168.123.100:0/661187236 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb30103950 0x7fdb30105d30 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.566+0000 7fdb34e0e700 1 --2- 192.168.123.100:0/661187236 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb30101030 0x7fdb30103410 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.566+0000 7fdb34e0e700 1 -- 192.168.123.100:0/661187236 >> 192.168.123.100:0/661187236 conn(0x7fdb300fa9b0 msgr2=0x7fdb300fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:55.567 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.567+0000 7fdb34e0e700 1 -- 192.168.123.100:0/661187236 shutdown_connections 2026-03-10T12:40:55.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.567+0000 7fdb34e0e700 1 -- 192.168.123.100:0/661187236 wait complete. 2026-03-10T12:40:55.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.567+0000 7fdb34e0e700 1 Processor -- start 2026-03-10T12:40:55.568 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.567+0000 7fdb34e0e700 1 -- start start 2026-03-10T12:40:55.568 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.567+0000 7fdb34e0e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb30101030 0x7fdb3019c3f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:55.568 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.567+0000 7fdb34e0e700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb30103950 0x7fdb3019c930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:55.568 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.567+0000 7fdb34e0e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb3019cf50 con 0x7fdb30101030 2026-03-10T12:40:55.569 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.567+0000 7fdb34e0e700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb3019d090 con 0x7fdb30103950 2026-03-10T12:40:55.569 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.567+0000 7fdb2e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb30101030 0x7fdb3019c3f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:55.569 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.567+0000 7fdb2e59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb30101030 0x7fdb3019c3f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:35192/0 (socket says 192.168.123.100:35192) 2026-03-10T12:40:55.569 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.567+0000 7fdb2e59c700 1 -- 192.168.123.100:0/3674337790 learned_addr learned my addr 192.168.123.100:0/3674337790 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:40:55.569 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.568+0000 7fdb2e59c700 1 -- 192.168.123.100:0/3674337790 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb30103950 msgr2=0x7fdb3019c930 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.569 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.568+0000 7fdb2e59c700 1 --2- 192.168.123.100:0/3674337790 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb30103950 0x7fdb3019c930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.569 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.568+0000 7fdb2e59c700 1 -- 192.168.123.100:0/3674337790 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdb200097e0 con 0x7fdb30101030 2026-03-10T12:40:55.569 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.568+0000 7fdb2e59c700 1 --2- 192.168.123.100:0/3674337790 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb30101030 0x7fdb3019c3f0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fdb1800eb10 tx=0x7fdb1800eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:55.570 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.569+0000 7fdb277fe700 1 -- 192.168.123.100:0/3674337790 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdb1800cca0 con 0x7fdb30101030 2026-03-10T12:40:55.570 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.569+0000 7fdb34e0e700 1 -- 192.168.123.100:0/3674337790 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdb301a1b40 con 0x7fdb30101030 2026-03-10T12:40:55.570 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.569+0000 7fdb34e0e700 1 -- 192.168.123.100:0/3674337790 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdb301a2090 con 0x7fdb30101030 2026-03-10T12:40:55.571 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.569+0000 7fdb277fe700 1 -- 192.168.123.100:0/3674337790 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdb1800ce00 con 0x7fdb30101030 2026-03-10T12:40:55.574 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.569+0000 7fdb277fe700 1 -- 192.168.123.100:0/3674337790 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdb180189c0 con 0x7fdb30101030 2026-03-10T12:40:55.574 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.570+0000 7fdb277fe700 1 -- 192.168.123.100:0/3674337790 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fdb18018b20 con 0x7fdb30101030 2026-03-10T12:40:55.574 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.570+0000 7fdb277fe700 1 --2- 192.168.123.100:0/3674337790 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fdb1c0779e0 0x7fdb1c079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:55.574 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.570+0000 7fdb277fe700 1 -- 192.168.123.100:0/3674337790 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fdb18014070 con 0x7fdb30101030 2026-03-10T12:40:55.574 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.570+0000 7fdb34e0e700 1 -- 192.168.123.100:0/3674337790 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdb10005320 con 0x7fdb30101030 2026-03-10T12:40:55.574 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.573+0000 7fdb2dd9b700 1 --2- 192.168.123.100:0/3674337790 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fdb1c0779e0 0x7fdb1c079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:55.574 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.573+0000 7fdb2dd9b700 1 --2- 192.168.123.100:0/3674337790 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fdb1c0779e0 0x7fdb1c079e90 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fdb20009ad0 tx=0x7fdb20005c00 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:55.574 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.573+0000 7fdb277fe700 1 -- 192.168.123.100:0/3674337790 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fdb18063d60 con 0x7fdb30101030 2026-03-10T12:40:55.696 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.694+0000 7fdb34e0e700 1 -- 192.168.123.100:0/3674337790 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 
0x7fdb10000bf0 con 0x7fdb1c0779e0 2026-03-10T12:40:55.700 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.699+0000 7fdb277fe700 1 -- 192.168.123.100:0/3674337790 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fdb10000bf0 con 0x7fdb1c0779e0 2026-03-10T12:40:55.700 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:40:55.700 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T12:40:55.700 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true, 2026-03-10T12:40:55.700 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T12:40:55.700 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [ 2026-03-10T12:40:55.700 INFO:teuthology.orchestra.run.vm00.stdout: "crash", 2026-03-10T12:40:55.700 INFO:teuthology.orchestra.run.vm00.stdout: "mgr", 2026-03-10T12:40:55.700 INFO:teuthology.orchestra.run.vm00.stdout: "mon" 2026-03-10T12:40:55.700 INFO:teuthology.orchestra.run.vm00.stdout: ], 2026-03-10T12:40:55.700 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T12:40:55.700 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T12:40:55.700 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false 2026-03-10T12:40:55.700 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:40:55.702 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.702+0000 7fdb34e0e700 1 -- 192.168.123.100:0/3674337790 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fdb1c0779e0 msgr2=0x7fdb1c079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.702 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.702+0000 7fdb34e0e700 1 --2- 192.168.123.100:0/3674337790 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fdb1c0779e0 0x7fdb1c079e90 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fdb20009ad0 tx=0x7fdb20005c00 comp rx=0 tx=0).stop 2026-03-10T12:40:55.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.702+0000 7fdb34e0e700 1 -- 192.168.123.100:0/3674337790 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb30101030 msgr2=0x7fdb3019c3f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.702+0000 7fdb34e0e700 1 --2- 192.168.123.100:0/3674337790 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb30101030 0x7fdb3019c3f0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fdb1800eb10 tx=0x7fdb1800eed0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.702+0000 7fdb34e0e700 1 -- 192.168.123.100:0/3674337790 shutdown_connections 2026-03-10T12:40:55.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.702+0000 7fdb34e0e700 1 --2- 192.168.123.100:0/3674337790 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fdb1c0779e0 0x7fdb1c079e90 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.702+0000 7fdb34e0e700 1 --2- 192.168.123.100:0/3674337790 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fdb30101030 0x7fdb3019c3f0 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.702+0000 7fdb34e0e700 1 --2- 192.168.123.100:0/3674337790 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fdb30103950 0x7fdb3019c930 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:40:55.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.702+0000 7fdb34e0e700 1 -- 192.168.123.100:0/3674337790 >> 192.168.123.100:0/3674337790 conn(0x7fdb300fa9b0 msgr2=0x7fdb300fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:55.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.702+0000 7fdb34e0e700 1 -- 192.168.123.100:0/3674337790 shutdown_connections 2026-03-10T12:40:55.703 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.702+0000 7fdb34e0e700 1 -- 192.168.123.100:0/3674337790 wait complete. 2026-03-10T12:40:55.769 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.768+0000 7f4e47ada700 1 -- 192.168.123.100:0/3285666647 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e400fee20 msgr2=0x7f4e400ff230 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.769 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.768+0000 7f4e47ada700 1 --2- 192.168.123.100:0/3285666647 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e400fee20 0x7f4e400ff230 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f4e34009b00 tx=0x7f4e34009e10 comp rx=0 tx=0).stop 2026-03-10T12:40:55.769 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.769+0000 7f4e47ada700 1 -- 192.168.123.100:0/3285666647 shutdown_connections 2026-03-10T12:40:55.769 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.769+0000 7f4e47ada700 1 --2- 192.168.123.100:0/3285666647 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4e400ff770 0x7f4e400ffbe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.769 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.769+0000 7f4e47ada700 1 --2- 192.168.123.100:0/3285666647 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e400fee20 0x7f4e400ff230 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T12:40:55.769 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.769+0000 7f4e47ada700 1 -- 192.168.123.100:0/3285666647 >> 192.168.123.100:0/3285666647 conn(0x7f4e400fa990 msgr2=0x7f4e400fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:55.769 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.769+0000 7f4e47ada700 1 -- 192.168.123.100:0/3285666647 shutdown_connections 2026-03-10T12:40:55.769 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.769+0000 7f4e47ada700 1 -- 192.168.123.100:0/3285666647 wait complete. 2026-03-10T12:40:55.770 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.770+0000 7f4e47ada700 1 Processor -- start 2026-03-10T12:40:55.770 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.770+0000 7f4e47ada700 1 -- start start 2026-03-10T12:40:55.770 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.770+0000 7f4e47ada700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4e400fee20 0x7f4e40198010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:55.770 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.770+0000 7f4e47ada700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e400ff770 0x7f4e40198550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:55.770 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.770+0000 7f4e47ada700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4e40198b70 con 0x7f4e400ff770 2026-03-10T12:40:55.770 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.770+0000 7f4e47ada700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4e40198cb0 con 0x7f4e400fee20 2026-03-10T12:40:55.771 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.770+0000 7f4e45876700 
1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4e400fee20 0x7f4e40198010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:55.771 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.770+0000 7f4e45876700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4e400fee20 0x7f4e40198010 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:60132/0 (socket says 192.168.123.100:60132) 2026-03-10T12:40:55.771 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.770+0000 7f4e45876700 1 -- 192.168.123.100:0/2502019325 learned_addr learned my addr 192.168.123.100:0/2502019325 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:40:55.771 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.770+0000 7f4e45075700 1 --2- 192.168.123.100:0/2502019325 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e400ff770 0x7f4e40198550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:55.771 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.771+0000 7f4e45075700 1 -- 192.168.123.100:0/2502019325 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4e400fee20 msgr2=0x7f4e40198010 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.771 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.771+0000 7f4e45075700 1 --2- 192.168.123.100:0/2502019325 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4e400fee20 0x7f4e40198010 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.771 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.771+0000 7f4e45075700 1 -- 
192.168.123.100:0/2502019325 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4e340097e0 con 0x7f4e400ff770 2026-03-10T12:40:55.771 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.771+0000 7f4e45876700 1 --2- 192.168.123.100:0/2502019325 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4e400fee20 0x7f4e40198010 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T12:40:55.771 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.771+0000 7f4e45075700 1 --2- 192.168.123.100:0/2502019325 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e400ff770 0x7f4e40198550 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f4e3c00eab0 tx=0x7f4e3c00ee70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:55.772 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.771+0000 7f4e32ffd700 1 -- 192.168.123.100:0/2502019325 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4e3c00cbc0 con 0x7f4e400ff770 2026-03-10T12:40:55.772 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.771+0000 7f4e32ffd700 1 -- 192.168.123.100:0/2502019325 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4e3c004510 con 0x7f4e400ff770 2026-03-10T12:40:55.772 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.771+0000 7f4e32ffd700 1 -- 192.168.123.100:0/2502019325 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4e3c005230 con 0x7f4e400ff770 2026-03-10T12:40:55.772 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.771+0000 7f4e47ada700 1 -- 192.168.123.100:0/2502019325 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4e4019d760 con 0x7f4e400ff770 
2026-03-10T12:40:55.772 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.771+0000 7f4e47ada700 1 -- 192.168.123.100:0/2502019325 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4e4019dcb0 con 0x7f4e400ff770 2026-03-10T12:40:55.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.772+0000 7f4e47ada700 1 -- 192.168.123.100:0/2502019325 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4e40066e40 con 0x7f4e400ff770 2026-03-10T12:40:55.776 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.775+0000 7f4e32ffd700 1 -- 192.168.123.100:0/2502019325 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4e3c01f030 con 0x7f4e400ff770 2026-03-10T12:40:55.776 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.776+0000 7f4e32ffd700 1 --2- 192.168.123.100:0/2502019325 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4e2c0779e0 0x7f4e2c079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:40:55.776 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.776+0000 7f4e32ffd700 1 -- 192.168.123.100:0/2502019325 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f4e3c014070 con 0x7f4e400ff770 2026-03-10T12:40:55.776 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.776+0000 7f4e45876700 1 --2- 192.168.123.100:0/2502019325 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4e2c0779e0 0x7f4e2c079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:40:55.776 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.776+0000 7f4e45876700 1 --2- 192.168.123.100:0/2502019325 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4e2c0779e0 0x7f4e2c079e90 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f4e3400b5c0 tx=0x7f4e3401a040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:40:55.777 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.776+0000 7f4e32ffd700 1 -- 192.168.123.100:0/2502019325 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4e3c09a300 con 0x7f4e400ff770 2026-03-10T12:40:55.943 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.943+0000 7f4e47ada700 1 -- 192.168.123.100:0/2502019325 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f4e4019df90 con 0x7f4e400ff770 2026-03-10T12:40:55.944 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.943+0000 7f4e32ffd700 1 -- 192.168.123.100:0/2502019325 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+367 (secure 0 0 0) 0x7f4e3c0628f0 con 0x7f4e400ff770 2026-03-10T12:40:55.944 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_WARN Degraded data redundancy: 204/333 objects degraded (61.261%), 3 pgs degraded 2026-03-10T12:40:55.944 INFO:teuthology.orchestra.run.vm00.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 204/333 objects degraded (61.261%), 3 pgs degraded 2026-03-10T12:40:55.944 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.c is active+recovery_wait+degraded, acting [5,0,3] 2026-03-10T12:40:55.944 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.12 is active+recovery_wait+degraded, acting [0,3,1] 2026-03-10T12:40:55.944 INFO:teuthology.orchestra.run.vm00.stdout: pg 3.15 is active+recovery_wait+degraded, acting [3,0,4] 2026-03-10T12:40:55.947 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.946+0000 
7f4e47ada700 1 -- 192.168.123.100:0/2502019325 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4e2c0779e0 msgr2=0x7f4e2c079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.947 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.946+0000 7f4e47ada700 1 --2- 192.168.123.100:0/2502019325 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4e2c0779e0 0x7f4e2c079e90 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f4e3400b5c0 tx=0x7f4e3401a040 comp rx=0 tx=0).stop 2026-03-10T12:40:55.947 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.946+0000 7f4e47ada700 1 -- 192.168.123.100:0/2502019325 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e400ff770 msgr2=0x7f4e40198550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:40:55.947 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.946+0000 7f4e47ada700 1 --2- 192.168.123.100:0/2502019325 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e400ff770 0x7f4e40198550 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f4e3c00eab0 tx=0x7f4e3c00ee70 comp rx=0 tx=0).stop 2026-03-10T12:40:55.947 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:55 vm00.local ceph-mon[103263]: from='client.34214 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:40:55.947 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:55 vm00.local ceph-mon[103263]: pgmap v69: 65 pgs: 3 active+recovery_wait+degraded, 2 active+recovering, 60 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 204/333 objects degraded (61.261%); 0 B/s, 16 objects/s recovering 2026-03-10T12:40:55.947 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:55 vm00.local ceph-mon[103263]: from='client.44165 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 
2026-03-10T12:40:55.947 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:55 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/4040678729' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:40:55.947 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:55 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/1918742668' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:40:55.949 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.946+0000 7f4e47ada700 1 -- 192.168.123.100:0/2502019325 shutdown_connections 2026-03-10T12:40:55.949 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.946+0000 7f4e47ada700 1 --2- 192.168.123.100:0/2502019325 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4e2c0779e0 0x7f4e2c079e90 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.949 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.946+0000 7f4e47ada700 1 --2- 192.168.123.100:0/2502019325 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4e400fee20 0x7f4e40198010 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.949 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.946+0000 7f4e47ada700 1 --2- 192.168.123.100:0/2502019325 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4e400ff770 0x7f4e40198550 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:40:55.949 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.946+0000 7f4e47ada700 1 -- 192.168.123.100:0/2502019325 >> 192.168.123.100:0/2502019325 conn(0x7f4e400fa990 msgr2=0x7f4e40107450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:40:55.950 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.950+0000 7f4e47ada700 1 -- 192.168.123.100:0/2502019325 shutdown_connections 
2026-03-10T12:40:55.950 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:40:55.950+0000 7f4e47ada700 1 -- 192.168.123.100:0/2502019325 wait complete. 2026-03-10T12:40:56.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:55 vm07.local ceph-mon[93622]: from='client.34214 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:40:56.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:55 vm07.local ceph-mon[93622]: pgmap v69: 65 pgs: 3 active+recovery_wait+degraded, 2 active+recovering, 60 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 204/333 objects degraded (61.261%); 0 B/s, 16 objects/s recovering 2026-03-10T12:40:56.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:55 vm07.local ceph-mon[93622]: from='client.44165 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:40:56.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:55 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/4040678729' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:40:56.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:55 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/1918742668' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:40:57.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:56 vm00.local ceph-mon[103263]: from='client.34222 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:40:57.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:56 vm00.local ceph-mon[103263]: from='client.34232 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:40:57.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:56 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/2502019325' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:40:57.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:56 vm07.local ceph-mon[93622]: from='client.34222 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:40:57.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:56 vm07.local ceph-mon[93622]: from='client.34232 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:40:57.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:56 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/2502019325' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:40:58.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:57 vm00.local ceph-mon[103263]: pgmap v70: 65 pgs: 2 active+recovery_wait+degraded, 2 active+recovering, 61 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 144/333 objects degraded (43.243%); 0 B/s, 18 objects/s recovering 2026-03-10T12:40:58.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:57 vm07.local ceph-mon[93622]: pgmap v70: 65 pgs: 2 active+recovery_wait+degraded, 2 active+recovering, 61 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 144/333 objects degraded (43.243%); 0 B/s, 18 objects/s recovering 2026-03-10T12:40:59.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:59 vm00.local ceph-mon[103263]: pgmap v71: 65 pgs: 1 active+recovery_wait+degraded, 2 active+recovering, 62 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 74/333 objects degraded (22.222%); 0 B/s, 18 objects/s recovering 2026-03-10T12:40:59.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:40:59 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 74/333 objects degraded (22.222%), 1 pg degraded (PG_DEGRADED) 2026-03-10T12:41:00.316 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:59 vm07.local ceph-mon[93622]: pgmap v71: 65 pgs: 1 active+recovery_wait+degraded, 2 active+recovering, 62 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 74/333 objects degraded (22.222%); 0 B/s, 18 objects/s recovering 2026-03-10T12:41:00.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:40:59 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 74/333 objects degraded (22.222%), 1 pg degraded (PG_DEGRADED) 2026-03-10T12:41:01.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:00 vm00.local ceph-mon[103263]: pgmap v72: 65 pgs: 1 active+recovery_wait+degraded, 2 active+recovering, 62 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 74/333 objects degraded (22.222%); 0 B/s, 18 objects/s recovering 2026-03-10T12:41:01.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:41:01.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:00 vm07.local ceph-mon[93622]: pgmap v72: 65 pgs: 1 active+recovery_wait+degraded, 2 active+recovering, 62 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 74/333 objects degraded (22.222%); 0 B/s, 18 objects/s recovering 2026-03-10T12:41:01.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:00 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:41:03.875 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:03 vm00.local ceph-mon[103263]: pgmap v73: 65 pgs: 1 active+recovery_wait+degraded, 1 active+recovering, 63 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 74/333 objects degraded (22.222%); 0 B/s, 18 objects/s recovering 2026-03-10T12:41:03.875 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:03 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:41:03.875 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:03 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:03.875 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:03 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T12:41:03.875 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:03 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:41:04.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:03 vm07.local ceph-mon[93622]: pgmap v73: 65 pgs: 1 active+recovery_wait+degraded, 1 active+recovering, 63 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 74/333 objects degraded (22.222%); 0 B/s, 18 objects/s recovering 2026-03-10T12:41:04.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:03 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:41:04.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:03 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:04.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:03 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T12:41:04.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:03 vm07.local ceph-mon[93622]: 
from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:41:04.484 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:04 vm00.local systemd[1]: Stopping Ceph osd.1 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 2026-03-10T12:41:04.484 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:04 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1[73253]: 2026-03-10T12:41:04.311+0000 7f9d1fadf700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T12:41:04.484 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:04 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1[73253]: 2026-03-10T12:41:04.311+0000 7f9d1fadf700 -1 osd.1 53 *** Got signal Terminated *** 2026-03-10T12:41:04.484 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:04 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1[73253]: 2026-03-10T12:41:04.311+0000 7f9d1fadf700 -1 osd.1 53 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T12:41:05.019 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:04 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:41:05.019 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:04 vm00.local ceph-mon[103263]: Upgrade: osd.1 is safe to restart 2026-03-10T12:41:05.019 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:04 vm00.local ceph-mon[103263]: Upgrade: Updating osd.1 2026-03-10T12:41:05.019 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:04 vm00.local ceph-mon[103263]: Deploying daemon osd.1 on vm00 2026-03-10T12:41:05.019 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:04 vm00.local ceph-mon[103263]: osd.1 marked itself down and dead 2026-03-10T12:41:05.019 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:04 vm00.local podman[113447]: 2026-03-10 12:41:04.825621272 +0000 UTC m=+0.526825220 container died 5bc971fe4d495f963e16c5383d98033ddf7e5d09b4fdc06020fd1ab9dca66517 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1, ceph=True, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.name=CentOS Stream 8 Base Image, GIT_CLEAN=True, RELEASE=HEAD, org.label-schema.build-date=20231212, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD) 2026-03-10T12:41:05.019 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:04 vm00.local podman[113447]: 2026-03-10 12:41:04.851883155 +0000 UTC m=+0.553087103 container remove 5bc971fe4d495f963e16c5383d98033ddf7e5d09b4fdc06020fd1ab9dca66517 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1, org.label-schema.vendor=CentOS, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, ceph=True, org.label-schema.name=CentOS Stream 8 
Base Image, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, org.label-schema.license=GPLv2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , GIT_CLEAN=True, RELEASE=HEAD, org.label-schema.build-date=20231212, org.label-schema.schema-version=1.0) 2026-03-10T12:41:05.019 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:04 vm00.local bash[113447]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1 2026-03-10T12:41:05.019 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:04 vm00.local podman[113513]: 2026-03-10 12:41:04.996723186 +0000 UTC m=+0.016655996 container create 1c03492f8f7bc676b59add85dda06f82ce0a5eba2476660416ddcf491559bb66 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-deactivate, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, OSD_FLAVOR=default) 2026-03-10T12:41:05.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:04 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T12:41:05.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:04 vm07.local ceph-mon[93622]: Upgrade: osd.1 is safe to restart 2026-03-10T12:41:05.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:04 vm07.local ceph-mon[93622]: Upgrade: Updating osd.1 2026-03-10T12:41:05.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:04 vm07.local ceph-mon[93622]: Deploying daemon osd.1 on vm00 2026-03-10T12:41:05.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:04 vm07.local ceph-mon[93622]: osd.1 marked itself down and dead 2026-03-10T12:41:05.323 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local podman[113513]: 2026-03-10 12:41:05.043956501 +0000 UTC m=+0.063889311 container init 1c03492f8f7bc676b59add85dda06f82ce0a5eba2476660416ddcf491559bb66 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-deactivate, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20260223, CEPH_REF=squid) 2026-03-10T12:41:05.323 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local podman[113513]: 2026-03-10 12:41:05.046973548 +0000 UTC m=+0.066906358 container start 1c03492f8f7bc676b59add85dda06f82ce0a5eba2476660416ddcf491559bb66 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-deactivate, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T12:41:05.323 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local podman[113513]: 2026-03-10 12:41:05.052220619 +0000 UTC m=+0.072153429 container attach 1c03492f8f7bc676b59add85dda06f82ce0a5eba2476660416ddcf491559bb66 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T12:41:05.323 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local podman[113513]: 2026-03-10 12:41:04.990072489 +0000 UTC 
m=+0.010005308 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:41:05.323 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local podman[113513]: 2026-03-10 12:41:05.18976234 +0000 UTC m=+0.209695150 container died 1c03492f8f7bc676b59add85dda06f82ce0a5eba2476660416ddcf491559bb66 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-deactivate, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2) 2026-03-10T12:41:05.323 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local podman[113513]: 2026-03-10 12:41:05.223330889 +0000 UTC m=+0.243263699 container remove 1c03492f8f7bc676b59add85dda06f82ce0a5eba2476660416ddcf491559bb66 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, 
org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS) 2026-03-10T12:41:05.323 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.1.service: Deactivated successfully. 2026-03-10T12:41:05.323 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local systemd[1]: Stopped Ceph osd.1 for 1a52002a-1c7d-11f1-af82-51cdd81caea8. 2026-03-10T12:41:05.323 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.1.service: Consumed 47.273s CPU time. 2026-03-10T12:41:05.674 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local systemd[1]: Starting Ceph osd.1 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 2026-03-10T12:41:05.674 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local podman[113624]: 2026-03-10 12:41:05.54740176 +0000 UTC m=+0.020378222 container create d5f61f4fb97e3176edd0694b590dc52bfeff1792b81dde72145661ae93090cea (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , ceph=True, 
org.label-schema.vendor=CentOS) 2026-03-10T12:41:05.674 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local podman[113624]: 2026-03-10 12:41:05.588695257 +0000 UTC m=+0.061671730 container init d5f61f4fb97e3176edd0694b590dc52bfeff1792b81dde72145661ae93090cea (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.41.3) 2026-03-10T12:41:05.674 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local podman[113624]: 2026-03-10 12:41:05.591586859 +0000 UTC m=+0.064563332 container start d5f61f4fb97e3176edd0694b590dc52bfeff1792b81dde72145661ae93090cea (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , 
org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T12:41:05.674 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local podman[113624]: 2026-03-10 12:41:05.597938176 +0000 UTC m=+0.070914649 container attach d5f61f4fb97e3176edd0694b590dc52bfeff1792b81dde72145661ae93090cea (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid) 2026-03-10T12:41:05.674 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local podman[113624]: 2026-03-10 12:41:05.53771521 +0000 UTC m=+0.010691692 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:41:05.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:05 vm00.local ceph-mon[103263]: pgmap v74: 65 pgs: 1 active+recovery_wait+degraded, 1 active+recovering, 63 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 74/333 objects degraded (22.222%); 0 B/s, 14 objects/s recovering 2026-03-10T12:41:05.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:05 vm00.local ceph-mon[103263]: Health check failed: 1 
osds down (OSD_DOWN) 2026-03-10T12:41:05.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:05 vm00.local ceph-mon[103263]: osdmap e54: 6 total, 5 up, 6 in 2026-03-10T12:41:05.984 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate[113636]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:05.985 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local bash[113624]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:05.985 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate[113636]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:05.985 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:05 vm00.local bash[113624]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:05 vm07.local ceph-mon[93622]: pgmap v74: 65 pgs: 1 active+recovery_wait+degraded, 1 active+recovering, 63 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 74/333 objects degraded (22.222%); 0 B/s, 14 objects/s recovering 2026-03-10T12:41:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:05 vm07.local ceph-mon[93622]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T12:41:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:05 vm07.local ceph-mon[93622]: osdmap e54: 6 total, 5 up, 6 in 2026-03-10T12:41:06.485 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate[113636]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T12:41:06.485 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate[113636]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:06.485 
INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local bash[113624]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T12:41:06.485 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local bash[113624]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:06.485 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate[113636]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:06.485 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local bash[113624]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:06.485 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate[113636]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T12:41:06.485 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local bash[113624]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T12:41:06.485 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate[113636]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-d399ffd5-d4f1-4c3b-b5c3-9f7108a9e89c/osd-block-bbe7c7dc-9287-4bf7-b4aa-5e00c56a2a7b --path /var/lib/ceph/osd/ceph-1 --no-mon-config 2026-03-10T12:41:06.485 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local bash[113624]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-d399ffd5-d4f1-4c3b-b5c3-9f7108a9e89c/osd-block-bbe7c7dc-9287-4bf7-b4aa-5e00c56a2a7b --path /var/lib/ceph/osd/ceph-1 --no-mon-config 2026-03-10T12:41:06.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:06 vm00.local ceph-mon[103263]: osdmap e55: 6 total, 5 up, 6 in 2026-03-10T12:41:06.986 
INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate[113636]: Running command: /usr/bin/ln -snf /dev/ceph-d399ffd5-d4f1-4c3b-b5c3-9f7108a9e89c/osd-block-bbe7c7dc-9287-4bf7-b4aa-5e00c56a2a7b /var/lib/ceph/osd/ceph-1/block 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local bash[113624]: Running command: /usr/bin/ln -snf /dev/ceph-d399ffd5-d4f1-4c3b-b5c3-9f7108a9e89c/osd-block-bbe7c7dc-9287-4bf7-b4aa-5e00c56a2a7b /var/lib/ceph/osd/ceph-1/block 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate[113636]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local bash[113624]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate[113636]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local bash[113624]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate[113636]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local bash[113624]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate[113636]: --> ceph-volume lvm activate successful for osd ID: 1 2026-03-10T12:41:06.986 
INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local bash[113624]: --> ceph-volume lvm activate successful for osd ID: 1 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local conmon[113636]: conmon d5f61f4fb97e3176edd0 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d5f61f4fb97e3176edd0694b590dc52bfeff1792b81dde72145661ae93090cea.scope/container/memory.events 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local podman[113624]: 2026-03-10 12:41:06.663023485 +0000 UTC m=+1.135999958 container died d5f61f4fb97e3176edd0694b590dc52bfeff1792b81dde72145661ae93090cea (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS) 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local podman[113624]: 2026-03-10 12:41:06.684510802 +0000 UTC m=+1.157487275 container remove d5f61f4fb97e3176edd0694b590dc52bfeff1792b81dde72145661ae93090cea (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-activate, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , 
org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local podman[113884]: 2026-03-10 12:41:06.785344318 +0000 UTC m=+0.022958652 container create 252ea98c56650e3214a7e4635ecdcce97d5f8c7ae0e18f5b3c56bb10fdebca62 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local podman[113884]: 2026-03-10 12:41:06.823429291 +0000 UTC m=+0.061043635 container init 252ea98c56650e3214a7e4635ecdcce97d5f8c7ae0e18f5b3c56bb10fdebca62 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1, 
org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local podman[113884]: 2026-03-10 12:41:06.826829024 +0000 UTC m=+0.064443358 container start 252ea98c56650e3214a7e4635ecdcce97d5f8c7ae0e18f5b3c56bb10fdebca62 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223) 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local bash[113884]: 252ea98c56650e3214a7e4635ecdcce97d5f8c7ae0e18f5b3c56bb10fdebca62 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local podman[113884]: 2026-03-10 12:41:06.778566312 +0000 UTC m=+0.016180646 
image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local systemd[1]: Started Ceph osd.1 for 1a52002a-1c7d-11f1-af82-51cdd81caea8. 2026-03-10T12:41:06.986 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:06 vm00.local ceph-osd[113898]: -- 192.168.123.100:0/3862118695 <== mon.1 v2:192.168.123.107:3300/0 4 ==== auth_reply(proto 2 0 (0) Success) ==== 194+0+0 (secure 0 0 0) 0x56036be70960 con 0x56036cc62000 2026-03-10T12:41:07.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:06 vm07.local ceph-mon[93622]: osdmap e55: 6 total, 5 up, 6 in 2026-03-10T12:41:07.419 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:07 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1[113894]: 2026-03-10T12:41:07.417+0000 7f3f56d71740 -1 Falling back to public interface 2026-03-10T12:41:07.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:07 vm00.local ceph-mon[103263]: pgmap v77: 65 pgs: 5 peering, 9 stale+active+clean, 1 active+recovery_wait+degraded, 50 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 74/333 objects degraded (22.222%); 0 B/s, 4 objects/s recovering 2026-03-10T12:41:07.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:07 vm00.local ceph-mon[103263]: Health check failed: Reduced data availability: 2 pgs inactive, 2 pgs peering (PG_AVAILABILITY) 2026-03-10T12:41:07.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:07.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:07.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:07 vm00.local ceph-mon[103263]: 
from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:41:08.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:07 vm07.local ceph-mon[93622]: pgmap v77: 65 pgs: 5 peering, 9 stale+active+clean, 1 active+recovery_wait+degraded, 50 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 74/333 objects degraded (22.222%); 0 B/s, 4 objects/s recovering 2026-03-10T12:41:08.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:07 vm07.local ceph-mon[93622]: Health check failed: Reduced data availability: 2 pgs inactive, 2 pgs peering (PG_AVAILABILITY) 2026-03-10T12:41:08.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:08.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:08.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:41:09.712 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:09 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:09.713 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:09 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:09.713 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:09 vm00.local ceph-mon[103263]: pgmap v78: 65 pgs: 5 peering, 15 active+undersized, 1 active+recovering, 14 active+undersized+degraded, 30 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 46/333 objects degraded (13.814%); 0 B/s, 4 objects/s recovering 
2026-03-10T12:41:09.713 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:09 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:09.713 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:09 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:09.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:09 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:09.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:09 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:09.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:09 vm07.local ceph-mon[93622]: pgmap v78: 65 pgs: 5 peering, 15 active+undersized, 1 active+recovering, 14 active+undersized+degraded, 30 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 46/333 objects degraded (13.814%); 0 B/s, 4 objects/s recovering 2026-03-10T12:41:09.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:09 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:09.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:09 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:10.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:10 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 46/333 objects degraded (13.814%), 14 pgs degraded (PG_DEGRADED) 2026-03-10T12:41:10.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:10.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' 
entity='mgr.vm00.nescmq' 2026-03-10T12:41:10.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:41:10.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:41:10.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:10.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:41:10.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:41:10.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:41:10.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:41:10.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T12:41:10.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:10 vm07.local ceph-mon[93622]: Health check update: Degraded data 
redundancy: 46/333 objects degraded (13.814%), 14 pgs degraded (PG_DEGRADED) 2026-03-10T12:41:10.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:10.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:10.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:41:10.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:41:10.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:10.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:41:10.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:41:10.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:41:10.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: 
dispatch 2026-03-10T12:41:10.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T12:41:11.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:11 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T12:41:11.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:11 vm00.local ceph-mon[103263]: Upgrade: unsafe to stop osd(s) at this time (12 PGs are or would become offline) 2026-03-10T12:41:11.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:11 vm00.local ceph-mon[103263]: pgmap v79: 65 pgs: 18 active+undersized, 1 active+recovering, 16 active+undersized+degraded, 30 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 52/333 objects degraded (15.616%) 2026-03-10T12:41:11.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:11 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T12:41:11.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:11 vm07.local ceph-mon[93622]: Upgrade: unsafe to stop osd(s) at this time (12 PGs are or would become offline) 2026-03-10T12:41:11.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:11 vm07.local ceph-mon[93622]: pgmap v79: 65 pgs: 18 active+undersized, 1 active+recovering, 16 active+undersized+degraded, 30 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 52/333 objects degraded (15.616%) 2026-03-10T12:41:12.377 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:12 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1[113894]: 2026-03-10T12:41:12.085+0000 7f3f56d71740 -1 osd.1 0 read_superblock omap replica is missing. 
2026-03-10T12:41:12.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:12 vm07.local ceph-mon[93622]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 2 pgs inactive, 2 pgs peering) 2026-03-10T12:41:12.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:12 vm00.local ceph-mon[103263]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 2 pgs inactive, 2 pgs peering) 2026-03-10T12:41:12.734 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:12 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1[113894]: 2026-03-10T12:41:12.545+0000 7f3f56d71740 -1 osd.1 53 log_to_monitors true 2026-03-10T12:41:13.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:13 vm00.local ceph-mon[103263]: from='osd.1 [v2:192.168.123.100:6810/4012605354,v1:192.168.123.100:6811/4012605354]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T12:41:13.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:13 vm00.local ceph-mon[103263]: pgmap v80: 65 pgs: 18 active+undersized, 16 active+undersized+degraded, 31 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 52/333 objects degraded (15.616%); 0 B/s, 1 objects/s recovering 2026-03-10T12:41:13.734 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:41:13 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1[113894]: 2026-03-10T12:41:13.398+0000 7f3f4eb0b640 -1 osd.1 53 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T12:41:13.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:13 vm07.local ceph-mon[93622]: from='osd.1 [v2:192.168.123.100:6810/4012605354,v1:192.168.123.100:6811/4012605354]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T12:41:13.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:13 vm07.local ceph-mon[93622]: pgmap v80: 65 pgs: 18 
active+undersized, 16 active+undersized+degraded, 31 active+clean; 269 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 52/333 objects degraded (15.616%); 0 B/s, 1 objects/s recovering 2026-03-10T12:41:14.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:14 vm00.local ceph-mon[103263]: from='osd.1 [v2:192.168.123.100:6810/4012605354,v1:192.168.123.100:6811/4012605354]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T12:41:14.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:14 vm00.local ceph-mon[103263]: osdmap e56: 6 total, 5 up, 6 in 2026-03-10T12:41:14.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:14 vm00.local ceph-mon[103263]: from='osd.1 [v2:192.168.123.100:6810/4012605354,v1:192.168.123.100:6811/4012605354]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm00", "root=default"]}]: dispatch 2026-03-10T12:41:14.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:14 vm07.local ceph-mon[93622]: from='osd.1 [v2:192.168.123.100:6810/4012605354,v1:192.168.123.100:6811/4012605354]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T12:41:14.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:14 vm07.local ceph-mon[93622]: osdmap e56: 6 total, 5 up, 6 in 2026-03-10T12:41:14.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:14 vm07.local ceph-mon[93622]: from='osd.1 [v2:192.168.123.100:6810/4012605354,v1:192.168.123.100:6811/4012605354]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm00", "root=default"]}]: dispatch 2026-03-10T12:41:15.678 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:15 vm00.local ceph-mon[103263]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T12:41:15.678 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:15 vm00.local 
ceph-mon[103263]: osd.1 [v2:192.168.123.100:6810/4012605354,v1:192.168.123.100:6811/4012605354] boot 2026-03-10T12:41:15.678 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:15 vm00.local ceph-mon[103263]: osdmap e57: 6 total, 6 up, 6 in 2026-03-10T12:41:15.678 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:41:15.678 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:15 vm00.local ceph-mon[103263]: pgmap v83: 65 pgs: 18 active+undersized, 16 active+undersized+degraded, 31 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 52/333 objects degraded (15.616%); 0 B/s, 9 objects/s recovering 2026-03-10T12:41:15.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:15 vm07.local ceph-mon[93622]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T12:41:15.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:15 vm07.local ceph-mon[93622]: osd.1 [v2:192.168.123.100:6810/4012605354,v1:192.168.123.100:6811/4012605354] boot 2026-03-10T12:41:15.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:15 vm07.local ceph-mon[93622]: osdmap e57: 6 total, 6 up, 6 in 2026-03-10T12:41:15.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T12:41:15.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:15 vm07.local ceph-mon[93622]: pgmap v83: 65 pgs: 18 active+undersized, 16 active+undersized+degraded, 31 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 52/333 objects degraded (15.616%); 0 B/s, 9 objects/s recovering 2026-03-10T12:41:16.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:16 vm00.local ceph-mon[103263]: osdmap e58: 6 total, 6 up, 6 in 
2026-03-10T12:41:16.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:41:16.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:16 vm07.local ceph-mon[93622]: osdmap e58: 6 total, 6 up, 6 in 2026-03-10T12:41:16.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:41:17.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:17 vm07.local ceph-mon[93622]: pgmap v85: 65 pgs: 2 peering, 16 active+undersized, 16 active+undersized+degraded, 31 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 52/333 objects degraded (15.616%); 0 B/s, 4 objects/s recovering 2026-03-10T12:41:17.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:17 vm00.local ceph-mon[103263]: pgmap v85: 65 pgs: 2 peering, 16 active+undersized, 16 active+undersized+degraded, 31 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 52/333 objects degraded (15.616%); 0 B/s, 4 objects/s recovering 2026-03-10T12:41:19.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:18 vm00.local ceph-mon[103263]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 52/333 objects degraded (15.616%), 16 pgs degraded) 2026-03-10T12:41:19.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:18 vm00.local ceph-mon[103263]: Cluster is now healthy 2026-03-10T12:41:19.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:18 vm07.local ceph-mon[93622]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 52/333 objects degraded (15.616%), 16 pgs degraded) 2026-03-10T12:41:19.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:18 vm07.local ceph-mon[93622]: 
Cluster is now healthy 2026-03-10T12:41:20.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:20 vm07.local ceph-mon[93622]: pgmap v86: 65 pgs: 2 peering, 63 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:41:20.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:20 vm00.local ceph-mon[103263]: pgmap v86: 65 pgs: 2 peering, 63 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:41:21.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:21 vm07.local ceph-mon[93622]: pgmap v87: 65 pgs: 65 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:41:21.377 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:21 vm00.local ceph-mon[103263]: pgmap v87: 65 pgs: 65 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:41:23.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:23 vm00.local ceph-mon[103263]: pgmap v88: 65 pgs: 65 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:41:24.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:23 vm07.local ceph-mon[93622]: pgmap v88: 65 pgs: 65 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:41:25.853 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:25 vm00.local ceph-mon[103263]: pgmap v89: 65 pgs: 65 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:41:25.853 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:25 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T12:41:25.853 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:25 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:25.853 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:25 vm00.local ceph-mon[103263]: from='mgr.24563 
192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T12:41:25.853 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:25 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:41:26.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.038+0000 7f4d796a0700 1 -- 192.168.123.100:0/2993089433 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d6c093d00 msgr2=0x7f4d6c094110 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.038+0000 7f4d796a0700 1 --2- 192.168.123.100:0/2993089433 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d6c093d00 0x7f4d6c094110 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f4d64009b00 tx=0x7f4d64009e10 comp rx=0 tx=0).stop 2026-03-10T12:41:26.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.039+0000 7f4d796a0700 1 -- 192.168.123.100:0/2993089433 shutdown_connections 2026-03-10T12:41:26.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.039+0000 7f4d796a0700 1 --2- 192.168.123.100:0/2993089433 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d6c094650 0x7f4d6c094ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.039+0000 7f4d796a0700 1 --2- 192.168.123.100:0/2993089433 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d6c093d00 0x7f4d6c094110 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.039+0000 7f4d796a0700 1 -- 192.168.123.100:0/2993089433 >> 192.168.123.100:0/2993089433 conn(0x7f4d6c08f8b0 
msgr2=0x7f4d6c091d00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:26.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.039+0000 7f4d796a0700 1 -- 192.168.123.100:0/2993089433 shutdown_connections 2026-03-10T12:41:26.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.039+0000 7f4d796a0700 1 -- 192.168.123.100:0/2993089433 wait complete. 2026-03-10T12:41:26.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.040+0000 7f4d796a0700 1 Processor -- start 2026-03-10T12:41:26.041 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.040+0000 7f4d796a0700 1 -- start start 2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.041+0000 7f4d796a0700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d6c093d00 0x7f4d6c12ce30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.041+0000 7f4d796a0700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d6c094650 0x7f4d6c12d370 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.041+0000 7f4d796a0700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d6c12d990 con 0x7f4d6c093d00 2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.041+0000 7f4d796a0700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d6c12dad0 con 0x7f4d6c094650 2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.041+0000 7f4d72ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d6c093d00 0x7f4d6c12ce30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.041+0000 7f4d72ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d6c093d00 0x7f4d6c12ce30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:55482/0 (socket says 192.168.123.100:55482) 2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.041+0000 7f4d72ffd700 1 -- 192.168.123.100:0/244165363 learned_addr learned my addr 192.168.123.100:0/244165363 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.041+0000 7f4d727fc700 1 --2- 192.168.123.100:0/244165363 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d6c094650 0x7f4d6c12d370 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.042+0000 7f4d727fc700 1 -- 192.168.123.100:0/244165363 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d6c093d00 msgr2=0x7f4d6c12ce30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.042+0000 7f4d727fc700 1 --2- 192.168.123.100:0/244165363 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d6c093d00 0x7f4d6c12ce30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.042+0000 7f4d727fc700 1 -- 192.168.123.100:0/244165363 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4d640097e0 con 0x7f4d6c094650 
2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.042+0000 7f4d727fc700 1 --2- 192.168.123.100:0/244165363 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d6c094650 0x7f4d6c12d370 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f4d6800eab0 tx=0x7f4d6800edc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.042+0000 7f4d5bfff700 1 -- 192.168.123.100:0/244165363 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4d6800cb20 con 0x7f4d6c094650 2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.042+0000 7f4d5bfff700 1 -- 192.168.123.100:0/244165363 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4d6800cc80 con 0x7f4d6c094650 2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.043+0000 7f4d5bfff700 1 -- 192.168.123.100:0/244165363 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4d68018860 con 0x7f4d6c094650 2026-03-10T12:41:26.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.043+0000 7f4d796a0700 1 -- 192.168.123.100:0/244165363 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4d6c132580 con 0x7f4d6c094650 2026-03-10T12:41:26.045 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.044+0000 7f4d796a0700 1 -- 192.168.123.100:0/244165363 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4d6c132a50 con 0x7f4d6c094650 2026-03-10T12:41:26.046 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.046+0000 7f4d796a0700 1 -- 192.168.123.100:0/244165363 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7f4d6c006120 con 0x7f4d6c094650 2026-03-10T12:41:26.047 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.046+0000 7f4d5bfff700 1 -- 192.168.123.100:0/244165363 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4d680189c0 con 0x7f4d6c094650 2026-03-10T12:41:26.047 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.046+0000 7f4d5bfff700 1 --2- 192.168.123.100:0/244165363 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4d5c0776b0 0x7f4d5c079b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:26.047 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.047+0000 7f4d5bfff700 1 -- 192.168.123.100:0/244165363 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(58..58 src has 1..58) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f4d68014070 con 0x7f4d6c094650 2026-03-10T12:41:26.047 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.047+0000 7f4d72ffd700 1 --2- 192.168.123.100:0/244165363 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4d5c0776b0 0x7f4d5c079b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:26.048 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.047+0000 7f4d72ffd700 1 --2- 192.168.123.100:0/244165363 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4d5c0776b0 0x7f4d5c079b60 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f4d640051d0 tx=0x7f4d6401a040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:26.050 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.049+0000 7f4d5bfff700 1 -- 192.168.123.100:0/244165363 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7f4d68061f80 con 0x7f4d6c094650 2026-03-10T12:41:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:25 vm07.local ceph-mon[93622]: pgmap v89: 65 pgs: 65 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:41:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:25 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T12:41:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:25 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:25 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T12:41:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:25 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:41:26.200 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.199+0000 7f4d796a0700 1 -- 192.168.123.100:0/244165363 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f4d6c09da50 con 0x7f4d5c0776b0 2026-03-10T12:41:26.201 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.201+0000 7f4d5bfff700 1 -- 192.168.123.100:0/244165363 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f4d6c09da50 con 0x7f4d5c0776b0 2026-03-10T12:41:26.205 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.205+0000 7f4d796a0700 1 -- 192.168.123.100:0/244165363 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4d5c0776b0 msgr2=0x7f4d5c079b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.205 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.205+0000 7f4d796a0700 1 --2- 192.168.123.100:0/244165363 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4d5c0776b0 0x7f4d5c079b60 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f4d640051d0 tx=0x7f4d6401a040 comp rx=0 tx=0).stop 2026-03-10T12:41:26.205 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.205+0000 7f4d796a0700 1 -- 192.168.123.100:0/244165363 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d6c094650 msgr2=0x7f4d6c12d370 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.205 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.205+0000 7f4d796a0700 1 --2- 192.168.123.100:0/244165363 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d6c094650 0x7f4d6c12d370 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f4d6800eab0 tx=0x7f4d6800edc0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.206 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.206+0000 7f4d796a0700 1 -- 192.168.123.100:0/244165363 shutdown_connections 2026-03-10T12:41:26.206 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.206+0000 7f4d796a0700 1 --2- 192.168.123.100:0/244165363 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4d5c0776b0 0x7f4d5c079b60 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.206 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.206+0000 7f4d796a0700 1 --2- 192.168.123.100:0/244165363 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4d6c093d00 0x7f4d6c12ce30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.206 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.206+0000 7f4d796a0700 1 --2- 192.168.123.100:0/244165363 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4d6c094650 0x7f4d6c12d370 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.206 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.206+0000 7f4d796a0700 1 -- 192.168.123.100:0/244165363 >> 192.168.123.100:0/244165363 conn(0x7f4d6c08f8b0 msgr2=0x7f4d6c09c330 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:26.206 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.206+0000 7f4d796a0700 1 -- 192.168.123.100:0/244165363 shutdown_connections 2026-03-10T12:41:26.206 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.206+0000 7f4d796a0700 1 -- 192.168.123.100:0/244165363 wait complete. 2026-03-10T12:41:26.218 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.299+0000 7f60fadfa700 1 -- 192.168.123.100:0/2819492675 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60f4072360 msgr2=0x7f60f40770e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.299+0000 7f60fadfa700 1 --2- 192.168.123.100:0/2819492675 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60f4072360 0x7f60f40770e0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f60ec00d3e0 tx=0x7f60ec00d6f0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.300+0000 7f60fadfa700 1 -- 192.168.123.100:0/2819492675 shutdown_connections 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.300+0000 7f60fadfa700 1 --2- 192.168.123.100:0/2819492675 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60f4072360 0x7f60f40770e0 unknown :-1 s=CLOSED 
pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.300+0000 7f60fadfa700 1 --2- 192.168.123.100:0/2819492675 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f60f4071980 0x7f60f4071d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.300+0000 7f60fadfa700 1 -- 192.168.123.100:0/2819492675 >> 192.168.123.100:0/2819492675 conn(0x7f60f406d1a0 msgr2=0x7f60f406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.300+0000 7f60fadfa700 1 -- 192.168.123.100:0/2819492675 shutdown_connections 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.300+0000 7f60fadfa700 1 -- 192.168.123.100:0/2819492675 wait complete. 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.300+0000 7f60fadfa700 1 Processor -- start 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.300+0000 7f60fadfa700 1 -- start start 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.300+0000 7f60fadfa700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60f4071980 0x7f60f4082440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.300+0000 7f60fadfa700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f60f4082980 0x7f60f4082df0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.300+0000 7f60fadfa700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f60f4083df0 
con 0x7f60f4082980 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.300+0000 7f60fadfa700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f60f412dd80 con 0x7f60f4071980 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.301+0000 7f60f8b96700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60f4071980 0x7f60f4082440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.301+0000 7f60f8b96700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60f4071980 0x7f60f4082440 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:55456/0 (socket says 192.168.123.100:55456) 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.301+0000 7f60f8b96700 1 -- 192.168.123.100:0/718068797 learned_addr learned my addr 192.168.123.100:0/718068797 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.301+0000 7f60f8b96700 1 -- 192.168.123.100:0/718068797 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f60f4082980 msgr2=0x7f60f4082df0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.301+0000 7f60f8b96700 1 --2- 192.168.123.100:0/718068797 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f60f4082980 0x7f60f4082df0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.302 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.301+0000 7f60f8b96700 1 -- 
192.168.123.100:0/718068797 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f60ec00d090 con 0x7f60f4071980 2026-03-10T12:41:26.303 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.302+0000 7f60f8b96700 1 --2- 192.168.123.100:0/718068797 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60f4071980 0x7f60f4082440 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f60e4008ca0 tx=0x7f60e400e410 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:26.303 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.302+0000 7f60f1ffb700 1 -- 192.168.123.100:0/718068797 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f60e4019070 con 0x7f60f4071980 2026-03-10T12:41:26.303 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.302+0000 7f60fadfa700 1 -- 192.168.123.100:0/718068797 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f60f412e030 con 0x7f60f4071980 2026-03-10T12:41:26.303 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.302+0000 7f60fadfa700 1 -- 192.168.123.100:0/718068797 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f60f412e580 con 0x7f60f4071980 2026-03-10T12:41:26.303 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.302+0000 7f60f1ffb700 1 -- 192.168.123.100:0/718068797 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f60e400ede0 con 0x7f60f4071980 2026-03-10T12:41:26.303 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.302+0000 7f60f1ffb700 1 -- 192.168.123.100:0/718068797 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f60e400f040 con 0x7f60f4071980 2026-03-10T12:41:26.304 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.303+0000 7f60fadfa700 1 -- 192.168.123.100:0/718068797 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f60f404ea50 con 0x7f60f4071980 2026-03-10T12:41:26.305 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.304+0000 7f60f1ffb700 1 -- 192.168.123.100:0/718068797 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f60e4004750 con 0x7f60f4071980 2026-03-10T12:41:26.305 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.304+0000 7f60f1ffb700 1 --2- 192.168.123.100:0/718068797 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f60dc0778c0 0x7f60dc079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:26.305 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.305+0000 7f60f3fff700 1 --2- 192.168.123.100:0/718068797 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f60dc0778c0 0x7f60dc079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:26.306 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.305+0000 7f60f1ffb700 1 -- 192.168.123.100:0/718068797 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(58..58 src has 1..58) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f60e4099c60 con 0x7f60f4071980 2026-03-10T12:41:26.307 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.306+0000 7f60f3fff700 1 --2- 192.168.123.100:0/718068797 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f60dc0778c0 0x7f60dc079d70 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f60f4072ff0 tx=0x7f60ec00da40 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:26.311 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.311+0000 7f60f1ffb700 1 -- 192.168.123.100:0/718068797 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f60e4062490 con 0x7f60f4071980 2026-03-10T12:41:26.452 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:26 vm00.local systemd[1]: Stopping Ceph osd.2 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 2026-03-10T12:41:26.453 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:26 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2[79470]: 2026-03-10T12:41:26.334+0000 7fc190d8d700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T12:41:26.453 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:26 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2[79470]: 2026-03-10T12:41:26.334+0000 7fc190d8d700 -1 osd.2 58 *** Got signal Terminated *** 2026-03-10T12:41:26.453 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:26 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2[79470]: 2026-03-10T12:41:26.334+0000 7fc190d8d700 -1 osd.2 58 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T12:41:26.453 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.451+0000 7f60fadfa700 1 -- 192.168.123.100:0/718068797 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f60f407c700 con 0x7f60dc0778c0 2026-03-10T12:41:26.454 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.453+0000 7f60f1ffb700 1 -- 192.168.123.100:0/718068797 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f60f407c700 con 0x7f60dc0778c0 
2026-03-10T12:41:26.456 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.456+0000 7f60fadfa700 1 -- 192.168.123.100:0/718068797 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f60dc0778c0 msgr2=0x7f60dc079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.456 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.456+0000 7f60fadfa700 1 --2- 192.168.123.100:0/718068797 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f60dc0778c0 0x7f60dc079d70 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f60f4072ff0 tx=0x7f60ec00da40 comp rx=0 tx=0).stop 2026-03-10T12:41:26.456 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.456+0000 7f60fadfa700 1 -- 192.168.123.100:0/718068797 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60f4071980 msgr2=0x7f60f4082440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.456 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.456+0000 7f60fadfa700 1 --2- 192.168.123.100:0/718068797 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60f4071980 0x7f60f4082440 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f60e4008ca0 tx=0x7f60e400e410 comp rx=0 tx=0).stop 2026-03-10T12:41:26.457 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.456+0000 7f60fadfa700 1 -- 192.168.123.100:0/718068797 shutdown_connections 2026-03-10T12:41:26.457 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.456+0000 7f60fadfa700 1 --2- 192.168.123.100:0/718068797 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f60dc0778c0 0x7f60dc079d70 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.457 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.456+0000 7f60fadfa700 1 --2- 192.168.123.100:0/718068797 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60f4071980 0x7f60f4082440 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.457 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.456+0000 7f60fadfa700 1 --2- 192.168.123.100:0/718068797 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f60f4082980 0x7f60f4082df0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.457 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.456+0000 7f60fadfa700 1 -- 192.168.123.100:0/718068797 >> 192.168.123.100:0/718068797 conn(0x7f60f406d1a0 msgr2=0x7f60f4076470 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:26.457 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.456+0000 7f60fadfa700 1 -- 192.168.123.100:0/718068797 shutdown_connections 2026-03-10T12:41:26.457 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.456+0000 7f60fadfa700 1 -- 192.168.123.100:0/718068797 wait complete. 
2026-03-10T12:41:26.530 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.529+0000 7f372ae7d700 1 -- 192.168.123.100:0/2494206974 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3724101030 msgr2=0x7f3724103410 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.530 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.529+0000 7f372ae7d700 1 --2- 192.168.123.100:0/2494206974 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3724101030 0x7f3724103410 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f3714009b50 tx=0x7f3714009e60 comp rx=0 tx=0).stop 2026-03-10T12:41:26.530 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.529+0000 7f372ae7d700 1 -- 192.168.123.100:0/2494206974 shutdown_connections 2026-03-10T12:41:26.530 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.529+0000 7f372ae7d700 1 --2- 192.168.123.100:0/2494206974 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3724103950 0x7f3724105d30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.530 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.529+0000 7f372ae7d700 1 --2- 192.168.123.100:0/2494206974 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3724101030 0x7f3724103410 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.530 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.529+0000 7f372ae7d700 1 -- 192.168.123.100:0/2494206974 >> 192.168.123.100:0/2494206974 conn(0x7f37240fa9b0 msgr2=0x7f37240fce00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:26.530 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.529+0000 7f372ae7d700 1 -- 192.168.123.100:0/2494206974 shutdown_connections 2026-03-10T12:41:26.530 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.529+0000 7f372ae7d700 1 -- 192.168.123.100:0/2494206974 
wait complete. 2026-03-10T12:41:26.530 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.530+0000 7f372ae7d700 1 Processor -- start 2026-03-10T12:41:26.530 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.530+0000 7f372ae7d700 1 -- start start 2026-03-10T12:41:26.531 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.530+0000 7f372ae7d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3724103950 0x7f37241982a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:26.531 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.530+0000 7f372ae7d700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37241987e0 0x7f372419d850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:26.531 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.530+0000 7f372ae7d700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3724198ce0 con 0x7f3724103950 2026-03-10T12:41:26.531 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.530+0000 7f372ae7d700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3724198e50 con 0x7f37241987e0 2026-03-10T12:41:26.531 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.530+0000 7f3723fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37241987e0 0x7f372419d850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:26.531 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.530+0000 7f3723fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37241987e0 0x7f372419d850 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 
says I am v2:192.168.123.100:55462/0 (socket says 192.168.123.100:55462) 2026-03-10T12:41:26.531 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.530+0000 7f3723fff700 1 -- 192.168.123.100:0/1929965720 learned_addr learned my addr 192.168.123.100:0/1929965720 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:41:26.531 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.531+0000 7f3728c19700 1 --2- 192.168.123.100:0/1929965720 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3724103950 0x7f37241982a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:26.531 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.531+0000 7f3723fff700 1 -- 192.168.123.100:0/1929965720 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3724103950 msgr2=0x7f37241982a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.531 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.531+0000 7f3723fff700 1 --2- 192.168.123.100:0/1929965720 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3724103950 0x7f37241982a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.531 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.531+0000 7f3723fff700 1 -- 192.168.123.100:0/1929965720 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f37140097e0 con 0x7f37241987e0 2026-03-10T12:41:26.531 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.531+0000 7f3728c19700 1 --2- 192.168.123.100:0/1929965720 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3724103950 0x7f37241982a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T12:41:26.532 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.531+0000 7f3723fff700 1 --2- 192.168.123.100:0/1929965720 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37241987e0 0x7f372419d850 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f371800d8d0 tx=0x7f371800dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:26.532 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.531+0000 7f3721ffb700 1 -- 192.168.123.100:0/1929965720 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3718009940 con 0x7f37241987e0 2026-03-10T12:41:26.532 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.531+0000 7f372ae7d700 1 -- 192.168.123.100:0/1929965720 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f372419ddf0 con 0x7f37241987e0 2026-03-10T12:41:26.533 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.532+0000 7f372ae7d700 1 -- 192.168.123.100:0/1929965720 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f372419e310 con 0x7f37241987e0 2026-03-10T12:41:26.533 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.532+0000 7f3721ffb700 1 -- 192.168.123.100:0/1929965720 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3718010460 con 0x7f37241987e0 2026-03-10T12:41:26.533 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.532+0000 7f3721ffb700 1 -- 192.168.123.100:0/1929965720 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3718010ab0 con 0x7f37241987e0 2026-03-10T12:41:26.533 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.532+0000 7f372ae7d700 1 -- 192.168.123.100:0/1929965720 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f3710005320 con 0x7f37241987e0 2026-03-10T12:41:26.535 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.534+0000 7f3721ffb700 1 -- 192.168.123.100:0/1929965720 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3718009aa0 con 0x7f37241987e0 2026-03-10T12:41:26.535 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.534+0000 7f3721ffb700 1 --2- 192.168.123.100:0/1929965720 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f370c0778c0 0x7f370c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:26.535 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.534+0000 7f3721ffb700 1 -- 192.168.123.100:0/1929965720 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(58..58 src has 1..58) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f3718099e50 con 0x7f37241987e0 2026-03-10T12:41:26.535 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.534+0000 7f3728c19700 1 --2- 192.168.123.100:0/1929965720 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f370c0778c0 0x7f370c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:26.535 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.535+0000 7f3728c19700 1 --2- 192.168.123.100:0/1929965720 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f370c0778c0 0x7f370c079d70 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f3714009b20 tx=0x7f371400b540 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:26.537 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.536+0000 7f3721ffb700 1 -- 192.168.123.100:0/1929965720 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3718062680 con 0x7f37241987e0 2026-03-10T12:41:26.673 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.672+0000 7f372ae7d700 1 -- 192.168.123.100:0/1929965720 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f3710000bf0 con 0x7f370c0778c0 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.677+0000 7f3721ffb700 1 -- 192.168.123.100:0/1929965720 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f3710000bf0 con 0x7f370c0778c0 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (7m) 18s ago 8m 25.5M - 0.25.0 c8568f914cd2 1897443e8fdf 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (8m) 18s ago 8m 9261k - 18.2.0 dc2bc1663786 d9c35bbdf4cd 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (7m) 115s ago 7m 11.2M - 18.2.0 dc2bc1663786 2a98961ae9ca 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (2m) 18s ago 8m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 88cd1c2e0041 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (118s) 115s ago 7m 7864k - 19.2.3-678-ge911bdeb 654f31e6858e 889e2b356266 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (7m) 18s ago 8m 91.0M - 9.4.7 954c08fa6188 16c12a6ce1fc 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (6m) 18s ago 6m 136M - 18.2.0 dc2bc1663786 13dfd2469732 
2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (6m) 18s ago 6m 18.6M - 18.2.0 dc2bc1663786 d65368ac6dfa 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (6m) 115s ago 6m 17.4M - 18.2.0 dc2bc1663786 1b9425223bd2 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (6m) 115s ago 6m 169M - 18.2.0 dc2bc1663786 9c2b36fc1ac1 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:8443,9283,8765 running (2m) 18s ago 9m 621M - 19.2.3-678-ge911bdeb 654f31e6858e eaf11f86af53 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (2m) 115s ago 7m 488M - 19.2.3-678-ge911bdeb 654f31e6858e 22c93435649c 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (2m) 18s ago 9m 59.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e e8cc5980a849 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (2m) 115s ago 7m 50.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 032abad282fc 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (8m) 18s ago 8m 14.8M - 1.5.0 0da6a335fe13 7a5c14d6ba46 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (7m) 115s ago 7m 15.5M - 1.5.0 0da6a335fe13 2fac2415b763 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (104s) 18s ago 7m 174M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9b151d44f3cf 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (19s) 18s ago 7m 12.8M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 252ea98c5665 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (7m) 18s ago 7m 366M 4096M 18.2.0 dc2bc1663786 a5a89ccf847e 2026-03-10T12:41:26.678 
INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (7m) 115s ago 7m 452M 4096M 18.2.0 dc2bc1663786 0c6249fe3951 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (6m) 115s ago 6m 402M 4096M 18.2.0 dc2bc1663786 5c66bfb63a83 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (6m) 115s ago 6m 370M 4096M 18.2.0 dc2bc1663786 277bdfe4ec55 2026-03-10T12:41:26.678 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (2m) 18s ago 8m 65.5M - 2.43.0 a07b618ecd1d c80074b6b052 2026-03-10T12:41:26.680 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.679+0000 7f372ae7d700 1 -- 192.168.123.100:0/1929965720 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f370c0778c0 msgr2=0x7f370c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.680 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.679+0000 7f372ae7d700 1 --2- 192.168.123.100:0/1929965720 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f370c0778c0 0x7f370c079d70 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f3714009b20 tx=0x7f371400b540 comp rx=0 tx=0).stop 2026-03-10T12:41:26.680 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.679+0000 7f372ae7d700 1 -- 192.168.123.100:0/1929965720 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37241987e0 msgr2=0x7f372419d850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.680 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.679+0000 7f372ae7d700 1 --2- 192.168.123.100:0/1929965720 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37241987e0 0x7f372419d850 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f371800d8d0 tx=0x7f371800dc90 comp rx=0 tx=0).stop 2026-03-10T12:41:26.680 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.680+0000 7f372ae7d700 1 -- 
192.168.123.100:0/1929965720 shutdown_connections 2026-03-10T12:41:26.680 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.680+0000 7f372ae7d700 1 --2- 192.168.123.100:0/1929965720 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f370c0778c0 0x7f370c079d70 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.680 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.680+0000 7f372ae7d700 1 --2- 192.168.123.100:0/1929965720 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3724103950 0x7f37241982a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.680 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.680+0000 7f372ae7d700 1 --2- 192.168.123.100:0/1929965720 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f37241987e0 0x7f372419d850 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.680 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.680+0000 7f372ae7d700 1 -- 192.168.123.100:0/1929965720 >> 192.168.123.100:0/1929965720 conn(0x7f37240fa9b0 msgr2=0x7f37241045b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:26.680 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.680+0000 7f372ae7d700 1 -- 192.168.123.100:0/1929965720 shutdown_connections 2026-03-10T12:41:26.680 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.680+0000 7f372ae7d700 1 -- 192.168.123.100:0/1929965720 wait complete. 
2026-03-10T12:41:26.752 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.752+0000 7f1da116d700 1 -- 192.168.123.100:0/3385053503 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1d9c105780 msgr2=0x7f1d9c107b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.752 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.752+0000 7f1da116d700 1 --2- 192.168.123.100:0/3385053503 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1d9c105780 0x7f1d9c107b60 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f1d8c009b50 tx=0x7f1d8c009e60 comp rx=0 tx=0).stop 2026-03-10T12:41:26.753 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.752+0000 7f1da116d700 1 -- 192.168.123.100:0/3385053503 shutdown_connections 2026-03-10T12:41:26.753 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.752+0000 7f1da116d700 1 --2- 192.168.123.100:0/3385053503 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1d9c105780 0x7f1d9c107b60 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.753 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.752+0000 7f1da116d700 1 --2- 192.168.123.100:0/3385053503 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1d9c0691c0 0x7f1d9c105240 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.753 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.752+0000 7f1da116d700 1 -- 192.168.123.100:0/3385053503 >> 192.168.123.100:0/3385053503 conn(0x7f1d9c0fa7b0 msgr2=0x7f1d9c0fcc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:26.753 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.752+0000 7f1da116d700 1 -- 192.168.123.100:0/3385053503 shutdown_connections 2026-03-10T12:41:26.753 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.752+0000 7f1da116d700 1 -- 192.168.123.100:0/3385053503 
wait complete. 2026-03-10T12:41:26.753 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.752+0000 7f1da116d700 1 Processor -- start 2026-03-10T12:41:26.753 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.752+0000 7f1da116d700 1 -- start start 2026-03-10T12:41:26.753 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.753+0000 7f1da116d700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1d9c198070 0x7f1d9c198480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:26.753 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.753+0000 7f1da116d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1d9c1989c0 0x7f1d9c19d660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.753+0000 7f1da116d700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1d9c19dba0 con 0x7f1d9c1989c0 2026-03-10T12:41:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.753+0000 7f1da116d700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1d9c19dce0 con 0x7f1d9c198070 2026-03-10T12:41:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.753+0000 7f1d9bfff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1d9c198070 0x7f1d9c198480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.753+0000 7f1d9bfff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1d9c198070 0x7f1d9c198480 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 
says I am v2:192.168.123.100:55480/0 (socket says 192.168.123.100:55480) 2026-03-10T12:41:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.753+0000 7f1d9bfff700 1 -- 192.168.123.100:0/3205127119 learned_addr learned my addr 192.168.123.100:0/3205127119 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:41:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.753+0000 7f1d9b7fe700 1 --2- 192.168.123.100:0/3205127119 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1d9c1989c0 0x7f1d9c19d660 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.753+0000 7f1d9bfff700 1 -- 192.168.123.100:0/3205127119 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1d9c1989c0 msgr2=0x7f1d9c19d660 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.753+0000 7f1d9bfff700 1 --2- 192.168.123.100:0/3205127119 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1d9c1989c0 0x7f1d9c19d660 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.753+0000 7f1d9bfff700 1 -- 192.168.123.100:0/3205127119 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1d8c0097e0 con 0x7f1d9c198070 2026-03-10T12:41:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.754+0000 7f1d9b7fe700 1 --2- 192.168.123.100:0/3205127119 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1d9c1989c0 0x7f1d9c19d660 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T12:41:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.754+0000 7f1d9bfff700 1 --2- 192.168.123.100:0/3205127119 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1d9c198070 0x7f1d9c198480 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f1d8400b700 tx=0x7f1d8400bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.754+0000 7f1d997fa700 1 -- 192.168.123.100:0/3205127119 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1d84010820 con 0x7f1d9c198070 2026-03-10T12:41:26.754 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.754+0000 7f1da116d700 1 -- 192.168.123.100:0/3205127119 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1d9c19df90 con 0x7f1d9c198070 2026-03-10T12:41:26.755 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.754+0000 7f1da116d700 1 -- 192.168.123.100:0/3205127119 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1d9c19e450 con 0x7f1d9c198070 2026-03-10T12:41:26.755 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.754+0000 7f1d997fa700 1 -- 192.168.123.100:0/3205127119 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1d84010e60 con 0x7f1d9c198070 2026-03-10T12:41:26.755 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.754+0000 7f1d997fa700 1 -- 192.168.123.100:0/3205127119 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1d84017570 con 0x7f1d9c198070 2026-03-10T12:41:26.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.755+0000 7f1d997fa700 1 -- 192.168.123.100:0/3205127119 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1d84010980 con 
0x7f1d9c198070 2026-03-10T12:41:26.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.756+0000 7f1d997fa700 1 --2- 192.168.123.100:0/3205127119 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1d88077920 0x7f1d88079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:26.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.756+0000 7f1d997fa700 1 -- 192.168.123.100:0/3205127119 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(58..58 src has 1..58) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f1d84099430 con 0x7f1d9c198070 2026-03-10T12:41:26.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.756+0000 7f1d9b7fe700 1 --2- 192.168.123.100:0/3205127119 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1d88077920 0x7f1d88079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:26.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.756+0000 7f1da116d700 1 -- 192.168.123.100:0/3205127119 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1d9c191f80 con 0x7f1d9c198070 2026-03-10T12:41:26.757 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.756+0000 7f1d9b7fe700 1 --2- 192.168.123.100:0/3205127119 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1d88077920 0x7f1d88079dd0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f1d8c000c00 tx=0x7f1d8c004e80 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:26.759 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.759+0000 7f1d997fa700 1 -- 192.168.123.100:0/3205127119 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f1d84061be0 con 0x7f1d9c198070 2026-03-10T12:41:26.970 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.969+0000 7f1da116d700 1 -- 192.168.123.100:0/3205127119 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f1d9c02cc60 con 0x7f1d9c198070 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.970+0000 7f1d997fa700 1 -- 192.168.123.100:0/3205127119 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f1d84061330 con 0x7f1d9c198070 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 3, 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 
(5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 7, 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:41:26.971 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:41:26.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.973+0000 7f1d92ffd700 1 -- 192.168.123.100:0/3205127119 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1d88077920 msgr2=0x7f1d88079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.973+0000 7f1d92ffd700 1 --2- 192.168.123.100:0/3205127119 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1d88077920 0x7f1d88079dd0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f1d8c000c00 tx=0x7f1d8c004e80 comp rx=0 tx=0).stop 2026-03-10T12:41:26.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.973+0000 7f1d92ffd700 1 -- 192.168.123.100:0/3205127119 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1d9c198070 msgr2=0x7f1d9c198480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:26.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.973+0000 7f1d92ffd700 1 --2- 192.168.123.100:0/3205127119 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1d9c198070 0x7f1d9c198480 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f1d8400b700 tx=0x7f1d8400bac0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.974 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.973+0000 7f1d92ffd700 1 -- 192.168.123.100:0/3205127119 shutdown_connections 2026-03-10T12:41:26.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.973+0000 7f1d92ffd700 1 --2- 192.168.123.100:0/3205127119 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1d88077920 0x7f1d88079dd0 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.974+0000 7f1d92ffd700 1 --2- 192.168.123.100:0/3205127119 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1d9c198070 0x7f1d9c198480 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.974+0000 7f1d92ffd700 1 --2- 192.168.123.100:0/3205127119 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1d9c1989c0 0x7f1d9c19d660 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:26.974 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.974+0000 7f1d92ffd700 1 -- 192.168.123.100:0/3205127119 >> 192.168.123.100:0/3205127119 conn(0x7f1d9c0fa7b0 msgr2=0x7f1d9c0fcc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:26.976 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.976+0000 7f1d92ffd700 1 -- 192.168.123.100:0/3205127119 shutdown_connections 2026-03-10T12:41:26.976 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:26.976+0000 7f1d92ffd700 1 -- 192.168.123.100:0/3205127119 wait complete. 
2026-03-10T12:41:27.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.063+0000 7f8bedff1700 1 -- 192.168.123.100:0/1187165105 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8be8072440 msgr2=0x7f8be810be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:27.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.063+0000 7f8bedff1700 1 --2- 192.168.123.100:0/1187165105 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8be8072440 0x7f8be810be90 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f8bdc009b00 tx=0x7f8bdc009e10 comp rx=0 tx=0).stop 2026-03-10T12:41:27.064 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:26 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T12:41:27.064 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:26 vm00.local ceph-mon[103263]: Upgrade: osd.2 is safe to restart 2026-03-10T12:41:27.064 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:26 vm00.local ceph-mon[103263]: Upgrade: Updating osd.2 2026-03-10T12:41:27.064 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:26 vm00.local ceph-mon[103263]: Deploying daemon osd.2 on vm00 2026-03-10T12:41:27.064 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:27.064 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:26 vm00.local ceph-mon[103263]: osd.2 marked itself down and dead 2026-03-10T12:41:27.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.063+0000 7f8bedff1700 1 -- 192.168.123.100:0/1187165105 shutdown_connections 2026-03-10T12:41:27.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.063+0000 7f8bedff1700 1 --2- 192.168.123.100:0/1187165105 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8be8072440 0x7f8be810be90 
unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.063+0000 7f8bedff1700 1 --2- 192.168.123.100:0/1187165105 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8be8071a60 0x7f8be8071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.063+0000 7f8bedff1700 1 -- 192.168.123.100:0/1187165105 >> 192.168.123.100:0/1187165105 conn(0x7f8be806d1a0 msgr2=0x7f8be806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:27.064 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:26 vm00.local podman[117476]: 2026-03-10 12:41:26.91909475 +0000 UTC m=+0.602107282 container died a5a89ccf847e88ea2ffa55465afd100201f0111dadd9a6fe1c92f31dc2fc6fd1 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2, CEPH_POINT_RELEASE=-18.2.0, ceph=True, io.buildah.version=1.29.1, org.label-schema.name=CentOS Stream 8 Base Image, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, org.label-schema.license=GPLv2, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, RELEASE=HEAD, org.label-schema.vendor=CentOS) 2026-03-10T12:41:27.064 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:26 vm00.local podman[117476]: 2026-03-10 12:41:26.959859086 +0000 UTC m=+0.642871609 container remove a5a89ccf847e88ea2ffa55465afd100201f0111dadd9a6fe1c92f31dc2fc6fd1 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, RELEASE=HEAD, org.label-schema.vendor=CentOS, GIT_CLEAN=True, 
io.buildah.version=1.29.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, CEPH_POINT_RELEASE=-18.2.0, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, org.label-schema.name=CentOS Stream 8 Base Image) 2026-03-10T12:41:27.064 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:26 vm00.local bash[117476]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2 2026-03-10T12:41:27.064 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.064+0000 7f8bedff1700 1 -- 192.168.123.100:0/1187165105 shutdown_connections 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.064+0000 7f8bedff1700 1 -- 192.168.123.100:0/1187165105 wait complete. 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.067+0000 7f8bedff1700 1 Processor -- start 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.067+0000 7f8bedff1700 1 -- start start 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.067+0000 7f8bedff1700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8be8071a60 0x7f8be81169f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.067+0000 7f8bedff1700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8be8116f30 0x7f8be81a1480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.067+0000 7f8bedff1700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8be8117430 con 0x7f8be8071a60 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.067+0000 7f8bedff1700 1 -- --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8be81175a0 con 0x7f8be8116f30 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.067+0000 7f8becfef700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8be8071a60 0x7f8be81169f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.068+0000 7f8becfef700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8be8071a60 0x7f8be81169f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:55542/0 (socket says 192.168.123.100:55542) 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.068+0000 7f8becfef700 1 -- 192.168.123.100:0/179853777 learned_addr learned my addr 192.168.123.100:0/179853777 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.068+0000 7f8be7fff700 1 --2- 192.168.123.100:0/179853777 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8be8116f30 0x7f8be81a1480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.068+0000 7f8becfef700 1 -- 192.168.123.100:0/179853777 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8be8116f30 msgr2=0x7f8be81a1480 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.068+0000 7f8becfef700 1 --2- 192.168.123.100:0/179853777 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f8be8116f30 0x7f8be81a1480 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.068+0000 7f8becfef700 1 -- 192.168.123.100:0/179853777 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8bdc0097e0 con 0x7f8be8071a60 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.068+0000 7f8becfef700 1 --2- 192.168.123.100:0/179853777 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8be8071a60 0x7f8be81169f0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f8bd800d900 tx=0x7f8bd800dc10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.068+0000 7f8be5ffb700 1 -- 192.168.123.100:0/179853777 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8bd80041d0 con 0x7f8be8071a60 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.068+0000 7f8bedff1700 1 -- 192.168.123.100:0/179853777 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8be81a1a20 con 0x7f8be8071a60 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.068+0000 7f8bedff1700 1 -- 192.168.123.100:0/179853777 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8be81a1f70 con 0x7f8be8071a60 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.068+0000 7f8be5ffb700 1 -- 192.168.123.100:0/179853777 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8bd8004330 con 0x7f8be8071a60 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.068+0000 7f8be5ffb700 1 -- 
192.168.123.100:0/179853777 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8bd8003d70 con 0x7f8be8071a60 2026-03-10T12:41:27.071 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.070+0000 7f8bedff1700 1 -- 192.168.123.100:0/179853777 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8bd4005320 con 0x7f8be8071a60 2026-03-10T12:41:27.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.071+0000 7f8be5ffb700 1 -- 192.168.123.100:0/179853777 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8bd803ca90 con 0x7f8be8071a60 2026-03-10T12:41:27.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.071+0000 7f8be5ffb700 1 --2- 192.168.123.100:0/179853777 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8bd00779e0 0x7f8bd0079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:27.073 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.071+0000 7f8be5ffb700 1 -- 192.168.123.100:0/179853777 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(59..59 src has 1..59) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f8bd8021030 con 0x7f8be8071a60 2026-03-10T12:41:27.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.074+0000 7f8be7fff700 1 --2- 192.168.123.100:0/179853777 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8bd00779e0 0x7f8bd0079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:27.075 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.074+0000 7f8be7fff700 1 --2- 192.168.123.100:0/179853777 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8bd00779e0 0x7f8bd0079e90 secure :-1 s=READY pgs=54 cs=0 
l=1 rev1=1 crypto rx=0x7f8bdc005fd0 tx=0x7f8bdc019040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:27.078 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.075+0000 7f8be5ffb700 1 -- 192.168.123.100:0/179853777 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8bd8061fe0 con 0x7f8be8071a60 2026-03-10T12:41:27.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.239+0000 7f8bedff1700 1 -- 192.168.123.100:0/179853777 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f8bd4005cc0 con 0x7f8be8071a60 2026-03-10T12:41:27.241 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.240+0000 7f8be5ffb700 1 -- 192.168.123.100:0/179853777 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1961 (secure 0 0 0) 0x7f8bd8061730 con 0x7f8be8071a60 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:e13 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 
2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:epoch 13 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:37:36.646083+0000 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:max_xattr_size 65536 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307} 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:failed 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:41:27.242 
INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:qdb_cluster leader: 0 members: 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons: 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:41:27.242 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:41:27.244 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.244+0000 7f8bcf7fe700 1 -- 192.168.123.100:0/179853777 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8bd00779e0 msgr2=0x7f8bd0079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:27.244 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.244+0000 7f8bcf7fe700 1 --2- 192.168.123.100:0/179853777 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8bd00779e0 0x7f8bd0079e90 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f8bdc005fd0 tx=0x7f8bdc019040 comp rx=0 tx=0).stop 2026-03-10T12:41:27.244 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.244+0000 7f8bcf7fe700 1 -- 192.168.123.100:0/179853777 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8be8071a60 msgr2=0x7f8be81169f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:27.244 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.244+0000 7f8bcf7fe700 1 --2- 192.168.123.100:0/179853777 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8be8071a60 0x7f8be81169f0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f8bd800d900 tx=0x7f8bd800dc10 comp rx=0 tx=0).stop 2026-03-10T12:41:27.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.244+0000 7f8bcf7fe700 1 -- 192.168.123.100:0/179853777 shutdown_connections 2026-03-10T12:41:27.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.244+0000 7f8bcf7fe700 1 --2- 192.168.123.100:0/179853777 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8bd00779e0 0x7f8bd0079e90 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.245+0000 7f8bcf7fe700 1 --2- 192.168.123.100:0/179853777 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8be8071a60 0x7f8be81169f0 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.245 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.245+0000 7f8bcf7fe700 1 --2- 192.168.123.100:0/179853777 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8be8116f30 0x7f8be81a1480 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.245+0000 7f8bcf7fe700 1 -- 192.168.123.100:0/179853777 >> 192.168.123.100:0/179853777 conn(0x7f8be806d1a0 msgr2=0x7f8be80705f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:27.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.245+0000 7f8bcf7fe700 1 -- 192.168.123.100:0/179853777 shutdown_connections 2026-03-10T12:41:27.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.245+0000 7f8bcf7fe700 1 -- 192.168.123.100:0/179853777 wait complete. 2026-03-10T12:41:27.248 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 13 2026-03-10T12:41:27.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:26 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T12:41:27.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:26 vm07.local ceph-mon[93622]: Upgrade: osd.2 is safe to restart 2026-03-10T12:41:27.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:26 vm07.local ceph-mon[93622]: Upgrade: Updating osd.2 2026-03-10T12:41:27.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:26 vm07.local ceph-mon[93622]: Deploying daemon osd.2 on vm00 2026-03-10T12:41:27.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:27.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:26 vm07.local ceph-mon[93622]: osd.2 marked itself down and dead 2026-03-10T12:41:27.322 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local podman[117614]: 2026-03-10 12:41:27.11942511 +0000 UTC m=+0.019602140 container create a9724b701a6e1a79d5feb5a10ad2f6a88a7ac83f273b972e8f16fd7771eb2ebc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-deactivate, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T12:41:27.322 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local podman[117614]: 2026-03-10 12:41:27.167039448 
+0000 UTC m=+0.067216488 container init a9724b701a6e1a79d5feb5a10ad2f6a88a7ac83f273b972e8f16fd7771eb2ebc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.41.3) 2026-03-10T12:41:27.322 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local podman[117614]: 2026-03-10 12:41:27.172115589 +0000 UTC m=+0.072292619 container start a9724b701a6e1a79d5feb5a10ad2f6a88a7ac83f273b972e8f16fd7771eb2ebc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-deactivate, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T12:41:27.323 
INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local podman[117614]: 2026-03-10 12:41:27.175368397 +0000 UTC m=+0.075545427 container attach a9724b701a6e1a79d5feb5a10ad2f6a88a7ac83f273b972e8f16fd7771eb2ebc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-deactivate, ceph=True, io.buildah.version=1.41.3, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T12:41:27.323 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local podman[117614]: 2026-03-10 12:41:27.109500865 +0000 UTC m=+0.009677905 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:41:27.323 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local conmon[117626]: conmon a9724b701a6e1a79d5fe : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a9724b701a6e1a79d5feb5a10ad2f6a88a7ac83f273b972e8f16fd7771eb2ebc.scope/container/memory.events 2026-03-10T12:41:27.323 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local podman[117614]: 2026-03-10 12:41:27.305518864 +0000 UTC m=+0.205695894 container died a9724b701a6e1a79d5feb5a10ad2f6a88a7ac83f273b972e8f16fd7771eb2ebc 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-deactivate, org.label-schema.build-date=20260223, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T12:41:27.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.326+0000 7f9e60a5c700 1 -- 192.168.123.100:0/1809938898 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e5c071950 msgr2=0x7f9e5c071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:27.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.326+0000 7f9e60a5c700 1 --2- 192.168.123.100:0/1809938898 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e5c071950 0x7f9e5c071d60 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f9e4c007780 tx=0x7f9e4c00c050 comp rx=0 tx=0).stop 2026-03-10T12:41:27.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.326+0000 7f9e60a5c700 1 -- 192.168.123.100:0/1809938898 shutdown_connections 2026-03-10T12:41:27.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.326+0000 7f9e60a5c700 1 --2- 192.168.123.100:0/1809938898 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9e5c072330 0x7f9e5c0770b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.327 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.326+0000 7f9e60a5c700 1 --2- 192.168.123.100:0/1809938898 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e5c071950 0x7f9e5c071d60 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.327 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.326+0000 7f9e60a5c700 1 -- 192.168.123.100:0/1809938898 >> 192.168.123.100:0/1809938898 conn(0x7f9e5c06d1a0 msgr2=0x7f9e5c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:27.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.326+0000 7f9e60a5c700 1 -- 192.168.123.100:0/1809938898 shutdown_connections 2026-03-10T12:41:27.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.326+0000 7f9e60a5c700 1 -- 192.168.123.100:0/1809938898 wait complete. 2026-03-10T12:41:27.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.327+0000 7f9e60a5c700 1 Processor -- start 2026-03-10T12:41:27.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.327+0000 7f9e60a5c700 1 -- start start 2026-03-10T12:41:27.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.327+0000 7f9e60a5c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9e5c072330 0x7f9e5c131350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:27.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.327+0000 7f9e60a5c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e5c131890 0x7f9e5c07f510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:27.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.327+0000 7f9e60a5c700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9e5c131d90 con 0x7f9e5c072330 2026-03-10T12:41:27.328 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.327+0000 7f9e60a5c700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9e5c131ed0 con 0x7f9e5c131890 2026-03-10T12:41:27.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.328+0000 7f9e5ad9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9e5c072330 0x7f9e5c131350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:27.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.328+0000 7f9e5ad9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9e5c072330 0x7f9e5c131350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:55556/0 (socket says 192.168.123.100:55556) 2026-03-10T12:41:27.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.328+0000 7f9e5ad9d700 1 -- 192.168.123.100:0/1905625727 learned_addr learned my addr 192.168.123.100:0/1905625727 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:41:27.328 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.328+0000 7f9e5a59c700 1 --2- 192.168.123.100:0/1905625727 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e5c131890 0x7f9e5c07f510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:27.329 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.328+0000 7f9e5a59c700 1 -- 192.168.123.100:0/1905625727 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9e5c072330 msgr2=0x7f9e5c131350 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:27.329 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.328+0000 7f9e5a59c700 1 --2- 
192.168.123.100:0/1905625727 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9e5c072330 0x7f9e5c131350 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.329 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.328+0000 7f9e5a59c700 1 -- 192.168.123.100:0/1905625727 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9e4c007430 con 0x7f9e5c131890 2026-03-10T12:41:27.329 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.328+0000 7f9e5a59c700 1 --2- 192.168.123.100:0/1905625727 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e5c131890 0x7f9e5c07f510 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f9e5400beb0 tx=0x7f9e5400bee0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:27.329 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.328+0000 7f9e43fff700 1 -- 192.168.123.100:0/1905625727 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9e5400cd70 con 0x7f9e5c131890 2026-03-10T12:41:27.330 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.329+0000 7f9e60a5c700 1 -- 192.168.123.100:0/1905625727 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9e5c07fab0 con 0x7f9e5c131890 2026-03-10T12:41:27.330 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.329+0000 7f9e60a5c700 1 -- 192.168.123.100:0/1905625727 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9e5c07ffb0 con 0x7f9e5c131890 2026-03-10T12:41:27.330 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.329+0000 7f9e43fff700 1 -- 192.168.123.100:0/1905625727 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9e54014920 con 0x7f9e5c131890 2026-03-10T12:41:27.330 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.329+0000 7f9e43fff700 1 -- 192.168.123.100:0/1905625727 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9e540129f0 con 0x7f9e5c131890 2026-03-10T12:41:27.330 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.330+0000 7f9e60a5c700 1 -- 192.168.123.100:0/1905625727 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9e48005320 con 0x7f9e5c131890 2026-03-10T12:41:27.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.331+0000 7f9e43fff700 1 -- 192.168.123.100:0/1905625727 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9e54012b50 con 0x7f9e5c131890 2026-03-10T12:41:27.332 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.332+0000 7f9e43fff700 1 --2- 192.168.123.100:0/1905625727 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9e44077910 0x7f9e44079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:27.333 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.332+0000 7f9e43fff700 1 -- 192.168.123.100:0/1905625727 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(59..59 src has 1..59) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f9e54099470 con 0x7f9e5c131890 2026-03-10T12:41:27.335 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.335+0000 7f9e5ad9d700 1 --2- 192.168.123.100:0/1905625727 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9e44077910 0x7f9e44079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:27.336 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.335+0000 7f9e5ad9d700 1 --2- 192.168.123.100:0/1905625727 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9e44077910 0x7f9e44079dc0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f9e4c00c4d0 tx=0x7f9e4c015040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:27.338 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.336+0000 7f9e43fff700 1 -- 192.168.123.100:0/1905625727 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9e54061dd0 con 0x7f9e5c131890 2026-03-10T12:41:27.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.507+0000 7f9e60a5c700 1 -- 192.168.123.100:0/1905625727 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9e48000bf0 con 0x7f9e44077910 2026-03-10T12:41:27.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.510+0000 7f9e43fff700 1 -- 192.168.123.100:0/1905625727 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f9e48000bf0 con 0x7f9e44077910 2026-03-10T12:41:27.510 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:41:27.511 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T12:41:27.511 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true, 2026-03-10T12:41:27.511 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T12:41:27.511 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [ 2026-03-10T12:41:27.511 INFO:teuthology.orchestra.run.vm00.stdout: "crash", 2026-03-10T12:41:27.511 INFO:teuthology.orchestra.run.vm00.stdout: "mgr", 2026-03-10T12:41:27.511 INFO:teuthology.orchestra.run.vm00.stdout: "mon" 
2026-03-10T12:41:27.511 INFO:teuthology.orchestra.run.vm00.stdout: ], 2026-03-10T12:41:27.511 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "8/23 daemons upgraded", 2026-03-10T12:41:27.511 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T12:41:27.511 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false 2026-03-10T12:41:27.511 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:41:27.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.512+0000 7f9e60a5c700 1 -- 192.168.123.100:0/1905625727 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9e44077910 msgr2=0x7f9e44079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:27.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.513+0000 7f9e60a5c700 1 --2- 192.168.123.100:0/1905625727 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9e44077910 0x7f9e44079dc0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f9e4c00c4d0 tx=0x7f9e4c015040 comp rx=0 tx=0).stop 2026-03-10T12:41:27.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.513+0000 7f9e60a5c700 1 -- 192.168.123.100:0/1905625727 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e5c131890 msgr2=0x7f9e5c07f510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:27.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.513+0000 7f9e60a5c700 1 --2- 192.168.123.100:0/1905625727 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e5c131890 0x7f9e5c07f510 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f9e5400beb0 tx=0x7f9e5400bee0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.513+0000 7f9e60a5c700 1 -- 192.168.123.100:0/1905625727 shutdown_connections 2026-03-10T12:41:27.513 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.513+0000 7f9e60a5c700 1 --2- 192.168.123.100:0/1905625727 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9e44077910 0x7f9e44079dc0 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.513+0000 7f9e60a5c700 1 --2- 192.168.123.100:0/1905625727 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9e5c072330 0x7f9e5c131350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.513+0000 7f9e60a5c700 1 --2- 192.168.123.100:0/1905625727 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9e5c131890 0x7f9e5c07f510 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.513+0000 7f9e60a5c700 1 -- 192.168.123.100:0/1905625727 >> 192.168.123.100:0/1905625727 conn(0x7f9e5c06d1a0 msgr2=0x7f9e5c0764f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:27.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.513+0000 7f9e60a5c700 1 -- 192.168.123.100:0/1905625727 shutdown_connections 2026-03-10T12:41:27.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.513+0000 7f9e60a5c700 1 -- 192.168.123.100:0/1905625727 wait complete. 
2026-03-10T12:41:27.591 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local podman[117614]: 2026-03-10 12:41:27.33207616 +0000 UTC m=+0.232253190 container remove a9724b701a6e1a79d5feb5a10ad2f6a88a7ac83f273b972e8f16fd7771eb2ebc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-deactivate, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) 2026-03-10T12:41:27.591 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.2.service: Deactivated successfully. 2026-03-10T12:41:27.591 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local systemd[1]: Stopped Ceph osd.2 for 1a52002a-1c7d-11f1-af82-51cdd81caea8. 2026-03-10T12:41:27.591 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.2.service: Consumed 38.805s CPU time. 2026-03-10T12:41:27.591 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local systemd[1]: Starting Ceph osd.2 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 
2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.595+0000 7fc51e445700 1 -- 192.168.123.100:0/1556498445 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc518107d50 msgr2=0x7fc5181081c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.595+0000 7fc51e445700 1 --2- 192.168.123.100:0/1556498445 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc518107d50 0x7fc5181081c0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7fc510009b00 tx=0x7fc510009e10 comp rx=0 tx=0).stop 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc51e445700 1 -- 192.168.123.100:0/1556498445 shutdown_connections 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc51e445700 1 --2- 192.168.123.100:0/1556498445 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc518107d50 0x7fc5181081c0 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc51e445700 1 --2- 192.168.123.100:0/1556498445 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc518071db0 0x7fc5180721c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc51e445700 1 -- 192.168.123.100:0/1556498445 >> 192.168.123.100:0/1556498445 conn(0x7fc51806d3e0 msgr2=0x7fc51806f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc51e445700 1 -- 192.168.123.100:0/1556498445 shutdown_connections 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc51e445700 1 -- 192.168.123.100:0/1556498445 
wait complete. 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc51e445700 1 Processor -- start 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc51e445700 1 -- start start 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc51e445700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc518071db0 0x7fc518116a90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc51e445700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc518116fd0 0x7fc5181b29e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc51e445700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc518117500 con 0x7fc518116fd0 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc51e445700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc518117670 con 0x7fc518071db0 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc5177fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc518116fd0 0x7fc5181b29e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc5177fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc518116fd0 0x7fc5181b29e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 
says I am v2:192.168.123.100:55572/0 (socket says 192.168.123.100:55572) 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc5177fe700 1 -- 192.168.123.100:0/3148165210 learned_addr learned my addr 192.168.123.100:0/3148165210 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.596+0000 7fc517fff700 1 --2- 192.168.123.100:0/3148165210 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc518071db0 0x7fc518116a90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.597+0000 7fc5177fe700 1 -- 192.168.123.100:0/3148165210 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc518071db0 msgr2=0x7fc518116a90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.597+0000 7fc5177fe700 1 --2- 192.168.123.100:0/3148165210 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc518071db0 0x7fc518116a90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.597+0000 7fc5177fe700 1 -- 192.168.123.100:0/3148165210 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc5100097e0 con 0x7fc518116fd0 2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.598+0000 7fc5177fe700 1 --2- 192.168.123.100:0/3148165210 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc518116fd0 0x7fc5181b29e0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fc510009ad0 tx=0x7fc51000f710 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:41:27.598 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.598+0000 7fc5157fa700 1 -- 192.168.123.100:0/3148165210 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc51001c070 con 0x7fc518116fd0 2026-03-10T12:41:27.599 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.598+0000 7fc51e445700 1 -- 192.168.123.100:0/3148165210 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc5181b2f20 con 0x7fc518116fd0 2026-03-10T12:41:27.599 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.598+0000 7fc51e445700 1 -- 192.168.123.100:0/3148165210 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc5181b33e0 con 0x7fc518116fd0 2026-03-10T12:41:27.599 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.598+0000 7fc5157fa700 1 -- 192.168.123.100:0/3148165210 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc51000fe90 con 0x7fc518116fd0 2026-03-10T12:41:27.599 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.598+0000 7fc5157fa700 1 -- 192.168.123.100:0/3148165210 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc510017630 con 0x7fc518116fd0 2026-03-10T12:41:27.601 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.600+0000 7fc5157fa700 1 -- 192.168.123.100:0/3148165210 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc510017790 con 0x7fc518116fd0 2026-03-10T12:41:27.601 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.600+0000 7fc5157fa700 1 --2- 192.168.123.100:0/3148165210 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc5000779e0 0x7fc500079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:27.602 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.600+0000 7fc5157fa700 1 -- 192.168.123.100:0/3148165210 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(59..59 src has 1..59) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fc51009bc70 con 0x7fc518116fd0 2026-03-10T12:41:27.602 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.600+0000 7fc517fff700 1 --2- 192.168.123.100:0/3148165210 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc5000779e0 0x7fc500079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:27.602 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.601+0000 7fc51e445700 1 -- 192.168.123.100:0/3148165210 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc518066e40 con 0x7fc518116fd0 2026-03-10T12:41:27.602 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.601+0000 7fc517fff700 1 --2- 192.168.123.100:0/3148165210 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc5000779e0 0x7fc500079e90 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fc50c005950 tx=0x7fc50c00b410 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:27.614 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.604+0000 7fc5157fa700 1 -- 192.168.123.100:0/3148165210 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc510064520 con 0x7fc518116fd0 2026-03-10T12:41:27.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.792+0000 7fc51e445700 1 -- 192.168.123.100:0/3148165210 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fc5181b3780 con 0x7fc518116fd0 
2026-03-10T12:41:27.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.807+0000 7fc5157fa700 1 -- 192.168.123.100:0/3148165210 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+95 (secure 0 0 0) 0x7fc510063c70 con 0x7fc518116fd0 2026-03-10T12:41:27.809 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_WARN 1 osds down 2026-03-10T12:41:27.809 INFO:teuthology.orchestra.run.vm00.stdout:[WRN] OSD_DOWN: 1 osds down 2026-03-10T12:41:27.809 INFO:teuthology.orchestra.run.vm00.stdout: osd.2 (root=default,host=vm00) is down 2026-03-10T12:41:27.811 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.811+0000 7fc4feffd700 1 -- 192.168.123.100:0/3148165210 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc5000779e0 msgr2=0x7fc500079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:27.811 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.811+0000 7fc4feffd700 1 --2- 192.168.123.100:0/3148165210 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc5000779e0 0x7fc500079e90 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fc50c005950 tx=0x7fc50c00b410 comp rx=0 tx=0).stop 2026-03-10T12:41:27.811 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.811+0000 7fc4feffd700 1 -- 192.168.123.100:0/3148165210 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc518116fd0 msgr2=0x7fc5181b29e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:27.811 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.811+0000 7fc4feffd700 1 --2- 192.168.123.100:0/3148165210 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc518116fd0 0x7fc5181b29e0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fc510009ad0 tx=0x7fc51000f710 comp rx=0 tx=0).stop 2026-03-10T12:41:27.812 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.811+0000 7fc4feffd700 1 -- 192.168.123.100:0/3148165210 shutdown_connections 2026-03-10T12:41:27.812 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.811+0000 7fc4feffd700 1 --2- 192.168.123.100:0/3148165210 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc5000779e0 0x7fc500079e90 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.812 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.811+0000 7fc4feffd700 1 --2- 192.168.123.100:0/3148165210 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc518071db0 0x7fc518116a90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.812 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.811+0000 7fc4feffd700 1 --2- 192.168.123.100:0/3148165210 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc518116fd0 0x7fc5181b29e0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:27.812 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.811+0000 7fc4feffd700 1 -- 192.168.123.100:0/3148165210 >> 192.168.123.100:0/3148165210 conn(0x7fc51806d3e0 msgr2=0x7fc518070650 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:27.813 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.812+0000 7fc4feffd700 1 -- 192.168.123.100:0/3148165210 shutdown_connections 2026-03-10T12:41:27.813 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:27.812+0000 7fc4feffd700 1 -- 192.168.123.100:0/3148165210 wait complete. 
2026-03-10T12:41:27.857 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local podman[117762]: 2026-03-10 12:41:27.757655801 +0000 UTC m=+0.079866693 container create 5330c3f6ff62460a9774e2ad44a0eeff22fe59db6e4391af7a2f5ede48bfb473 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T12:41:27.857 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local podman[117762]: 2026-03-10 12:41:27.689470753 +0000 UTC m=+0.011681645 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:41:27.857 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local podman[117762]: 2026-03-10 12:41:27.81138322 +0000 UTC m=+0.133594113 container init 5330c3f6ff62460a9774e2ad44a0eeff22fe59db6e4391af7a2f5ede48bfb473 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, 
CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, io.buildah.version=1.41.3) 2026-03-10T12:41:27.857 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local podman[117762]: 2026-03-10 12:41:27.815496991 +0000 UTC m=+0.137707883 container start 5330c3f6ff62460a9774e2ad44a0eeff22fe59db6e4391af7a2f5ede48bfb473 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223) 2026-03-10T12:41:27.858 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local podman[117762]: 2026-03-10 12:41:27.819501466 +0000 UTC m=+0.141712358 container attach 5330c3f6ff62460a9774e2ad44a0eeff22fe59db6e4391af7a2f5ede48bfb473 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True) 2026-03-10T12:41:28.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:27 vm00.local ceph-mon[103263]: from='client.44183 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:41:28.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:27 vm00.local ceph-mon[103263]: from='client.44187 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:41:28.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:27 vm00.local ceph-mon[103263]: from='client.44191 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:41:28.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:27 vm00.local ceph-mon[103263]: pgmap v90: 65 pgs: 65 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:41:28.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:27 vm00.local ceph-mon[103263]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T12:41:28.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:27 vm00.local ceph-mon[103263]: osdmap e59: 6 total, 5 up, 6 in 2026-03-10T12:41:28.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:27 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/3205127119' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:41:28.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:27 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/179853777' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:41:28.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:27 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/3148165210' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:41:28.235 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate[117774]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:28.235 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local bash[117762]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:28.235 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate[117774]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:28.235 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:27 vm00.local bash[117762]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:28.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:27 vm07.local ceph-mon[93622]: from='client.44183 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:41:28.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:27 vm07.local ceph-mon[93622]: from='client.44187 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:41:28.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:27 vm07.local ceph-mon[93622]: from='client.44191 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:41:28.316 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:27 vm07.local ceph-mon[93622]: pgmap v90: 65 pgs: 65 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail 2026-03-10T12:41:28.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:27 vm07.local ceph-mon[93622]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T12:41:28.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:27 vm07.local ceph-mon[93622]: osdmap e59: 6 total, 5 up, 6 in 2026-03-10T12:41:28.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:27 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/3205127119' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:41:28.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:27 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/179853777' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:41:28.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:27 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/3148165210' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:41:28.734 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate[117774]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T12:41:28.734 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local bash[117762]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T12:41:28.734 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate[117774]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:28.734 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local bash[117762]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:28.734 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate[117774]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:28.734 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local bash[117762]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:28.735 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate[117774]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T12:41:28.735 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local bash[117762]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T12:41:28.735 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate[117774]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-3aaef666-3219-49bd-9f0e-e04905fb26fc/osd-block-3de6b811-dbac-419f-abf8-afd0bec7a47f --path 
/var/lib/ceph/osd/ceph-2 --no-mon-config 2026-03-10T12:41:28.735 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local bash[117762]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-3aaef666-3219-49bd-9f0e-e04905fb26fc/osd-block-3de6b811-dbac-419f-abf8-afd0bec7a47f --path /var/lib/ceph/osd/ceph-2 --no-mon-config 2026-03-10T12:41:29.057 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:28 vm00.local ceph-mon[103263]: from='client.44201 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:41:29.057 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:28 vm00.local ceph-mon[103263]: osdmap e60: 6 total, 5 up, 6 in 2026-03-10T12:41:29.057 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate[117774]: Running command: /usr/bin/ln -snf /dev/ceph-3aaef666-3219-49bd-9f0e-e04905fb26fc/osd-block-3de6b811-dbac-419f-abf8-afd0bec7a47f /var/lib/ceph/osd/ceph-2/block 2026-03-10T12:41:29.057 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local bash[117762]: Running command: /usr/bin/ln -snf /dev/ceph-3aaef666-3219-49bd-9f0e-e04905fb26fc/osd-block-3de6b811-dbac-419f-abf8-afd0bec7a47f /var/lib/ceph/osd/ceph-2/block 2026-03-10T12:41:29.057 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate[117774]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block 2026-03-10T12:41:29.058 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local bash[117762]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block 2026-03-10T12:41:29.058 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate[117774]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T12:41:29.058 
INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local bash[117762]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T12:41:29.058 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate[117774]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T12:41:29.058 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local bash[117762]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T12:41:29.058 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate[117774]: --> ceph-volume lvm activate successful for osd ID: 2 2026-03-10T12:41:29.058 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local bash[117762]: --> ceph-volume lvm activate successful for osd ID: 2 2026-03-10T12:41:29.058 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local podman[117984]: 2026-03-10 12:41:28.838186319 +0000 UTC m=+0.011274603 container died 5330c3f6ff62460a9774e2ad44a0eeff22fe59db6e4391af7a2f5ede48bfb473 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223) 2026-03-10T12:41:29.058 
INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:28 vm00.local podman[117984]: 2026-03-10 12:41:28.947972451 +0000 UTC m=+0.121060735 container remove 5330c3f6ff62460a9774e2ad44a0eeff22fe59db6e4391af7a2f5ede48bfb473 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-activate, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T12:41:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:28 vm07.local ceph-mon[93622]: from='client.44201 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:41:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:28 vm07.local ceph-mon[93622]: osdmap e60: 6 total, 5 up, 6 in 2026-03-10T12:41:29.485 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:29 vm00.local podman[118023]: 2026-03-10 12:41:29.057564901 +0000 UTC m=+0.023175769 container create 249137e44eb73320cc3b7d5fb2611352f188ae04d8ea34073c8950f66c9054fc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) 2026-03-10T12:41:29.485 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:29 vm00.local podman[118023]: 2026-03-10 12:41:29.04826207 +0000 UTC m=+0.013872946 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:41:29.485 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:29 vm00.local podman[118023]: 2026-03-10 12:41:29.159227649 +0000 UTC m=+0.124838526 container init 249137e44eb73320cc3b7d5fb2611352f188ae04d8ea34073c8950f66c9054fc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, OSD_FLAVOR=default) 2026-03-10T12:41:29.485 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:29 vm00.local podman[118023]: 2026-03-10 12:41:29.165454133 +0000 UTC 
m=+0.131065000 container start 249137e44eb73320cc3b7d5fb2611352f188ae04d8ea34073c8950f66c9054fc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.build-date=20260223) 2026-03-10T12:41:29.485 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:29 vm00.local bash[118023]: 249137e44eb73320cc3b7d5fb2611352f188ae04d8ea34073c8950f66c9054fc 2026-03-10T12:41:29.485 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:29 vm00.local systemd[1]: Started Ceph osd.2 for 1a52002a-1c7d-11f1-af82-51cdd81caea8. 
2026-03-10T12:41:29.768 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:29 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2[118033]: 2026-03-10T12:41:29.501+0000 7fa8c64bc740 -1 Falling back to public interface 2026-03-10T12:41:30.022 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:29 vm00.local ceph-mon[103263]: pgmap v93: 65 pgs: 4 active+undersized, 12 peering, 3 stale+active+clean, 3 active+undersized+degraded, 43 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 13/333 objects degraded (3.904%) 2026-03-10T12:41:30.022 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:29 vm00.local ceph-mon[103263]: Health check failed: Reduced data availability: 1 pg inactive, 2 pgs peering (PG_AVAILABILITY) 2026-03-10T12:41:30.022 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:29 vm00.local ceph-mon[103263]: Health check failed: Degraded data redundancy: 13/333 objects degraded (3.904%), 3 pgs degraded (PG_DEGRADED) 2026-03-10T12:41:30.022 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:29 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:30.022 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:29 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:30.022 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:29 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:41:30.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:29 vm07.local ceph-mon[93622]: pgmap v93: 65 pgs: 4 active+undersized, 12 peering, 3 stale+active+clean, 3 active+undersized+degraded, 43 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 13/333 objects degraded (3.904%) 2026-03-10T12:41:30.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:29 vm07.local 
ceph-mon[93622]: Health check failed: Reduced data availability: 1 pg inactive, 2 pgs peering (PG_AVAILABILITY) 2026-03-10T12:41:30.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:29 vm07.local ceph-mon[93622]: Health check failed: Degraded data redundancy: 13/333 objects degraded (3.904%), 3 pgs degraded (PG_DEGRADED) 2026-03-10T12:41:30.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:29 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:30.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:29 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:30.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:29 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:41:31.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:31 vm00.local ceph-mon[103263]: pgmap v94: 65 pgs: 8 active+undersized, 12 peering, 7 active+undersized+degraded, 38 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 26/333 objects degraded (7.808%) 2026-03-10T12:41:31.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:31 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:31.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:31 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:41:31.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:31 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:31.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:31 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' 
entity='mgr.vm00.nescmq' 2026-03-10T12:41:31.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:31 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:31.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:31 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:32.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:31 vm07.local ceph-mon[93622]: pgmap v94: 65 pgs: 8 active+undersized, 12 peering, 7 active+undersized+degraded, 38 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 26/333 objects degraded (7.808%) 2026-03-10T12:41:32.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:31 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:32.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:31 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:41:32.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:31 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:32.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:31 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:32.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:31 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:32.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:31 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:32.734 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:32 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2[118033]: 
2026-03-10T12:41:32.598+0000 7fa8c64bc740 -1 osd.2 0 read_superblock omap replica is missing. 2026-03-10T12:41:33.234 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:32 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2[118033]: 2026-03-10T12:41:32.921+0000 7fa8c64bc740 -1 osd.2 58 log_to_monitors true 2026-03-10T12:41:33.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:33.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:33.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:41:33.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:41:33.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:33.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:41:33.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:41:33.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 
cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:41:33.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:41:33.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T12:41:33.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:33 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T12:41:33.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:33 vm00.local ceph-mon[103263]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-10T12:41:33.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:33 vm00.local ceph-mon[103263]: pgmap v95: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 41/333 objects degraded (12.312%) 2026-03-10T12:41:33.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:33 vm00.local ceph-mon[103263]: from='osd.2 [v2:192.168.123.100:6818/429283805,v1:192.168.123.100:6819/429283805]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T12:41:33.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:33.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:33.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:33 vm07.local ceph-mon[93622]: from='mgr.24563 
192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:41:33.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:41:33.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:33.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:41:33.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:41:33.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:41:33.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:41:33.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T12:41:33.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:33 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T12:41:33.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:33 vm07.local ceph-mon[93622]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-10T12:41:33.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:33 vm07.local ceph-mon[93622]: pgmap v95: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 269 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 41/333 objects degraded (12.312%) 2026-03-10T12:41:33.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:33 vm07.local ceph-mon[93622]: from='osd.2 [v2:192.168.123.100:6818/429283805,v1:192.168.123.100:6819/429283805]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T12:41:34.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:34 vm00.local ceph-mon[103263]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg inactive, 2 pgs peering) 2026-03-10T12:41:34.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:34 vm00.local ceph-mon[103263]: from='osd.2 [v2:192.168.123.100:6818/429283805,v1:192.168.123.100:6819/429283805]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T12:41:34.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:34 vm00.local ceph-mon[103263]: osdmap e61: 6 total, 5 up, 6 in 2026-03-10T12:41:34.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:34 vm00.local ceph-mon[103263]: from='osd.2 [v2:192.168.123.100:6818/429283805,v1:192.168.123.100:6819/429283805]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm00", "root=default"]}]: dispatch 2026-03-10T12:41:34.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:34 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 41/333 objects 
degraded (12.312%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T12:41:34.734 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:41:34 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2[118033]: 2026-03-10T12:41:34.590+0000 7fa8bda55640 -1 osd.2 58 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T12:41:34.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:34 vm07.local ceph-mon[93622]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg inactive, 2 pgs peering) 2026-03-10T12:41:34.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:34 vm07.local ceph-mon[93622]: from='osd.2 [v2:192.168.123.100:6818/429283805,v1:192.168.123.100:6819/429283805]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T12:41:34.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:34 vm07.local ceph-mon[93622]: osdmap e61: 6 total, 5 up, 6 in 2026-03-10T12:41:34.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:34 vm07.local ceph-mon[93622]: from='osd.2 [v2:192.168.123.100:6818/429283805,v1:192.168.123.100:6819/429283805]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm00", "root=default"]}]: dispatch 2026-03-10T12:41:34.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:34 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 41/333 objects degraded (12.312%), 12 pgs degraded (PG_DEGRADED) 2026-03-10T12:41:35.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:35 vm00.local ceph-mon[103263]: from='osd.2 [v2:192.168.123.100:6818/429283805,v1:192.168.123.100:6819/429283805]' entity='osd.2' 2026-03-10T12:41:35.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:35 vm00.local ceph-mon[103263]: pgmap v97: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 
GiB avail; 41/333 objects degraded (12.312%) 2026-03-10T12:41:35.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:35 vm00.local ceph-mon[103263]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T12:41:35.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:35 vm00.local ceph-mon[103263]: osd.2 [v2:192.168.123.100:6818/429283805,v1:192.168.123.100:6819/429283805] boot 2026-03-10T12:41:35.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:35 vm00.local ceph-mon[103263]: osdmap e62: 6 total, 6 up, 6 in 2026-03-10T12:41:35.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:35 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T12:41:36.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:35 vm07.local ceph-mon[93622]: from='osd.2 [v2:192.168.123.100:6818/429283805,v1:192.168.123.100:6819/429283805]' entity='osd.2' 2026-03-10T12:41:36.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:35 vm07.local ceph-mon[93622]: pgmap v97: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 41/333 objects degraded (12.312%) 2026-03-10T12:41:36.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:35 vm07.local ceph-mon[93622]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T12:41:36.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:35 vm07.local ceph-mon[93622]: osd.2 [v2:192.168.123.100:6818/429283805,v1:192.168.123.100:6819/429283805] boot 2026-03-10T12:41:36.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:35 vm07.local ceph-mon[93622]: osdmap e62: 6 total, 6 up, 6 in 2026-03-10T12:41:36.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:35 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 
2026-03-10T12:41:37.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:37 vm07.local ceph-mon[93622]: osdmap e63: 6 total, 6 up, 6 in 2026-03-10T12:41:37.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:37 vm07.local ceph-mon[93622]: pgmap v100: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 41/333 objects degraded (12.312%) 2026-03-10T12:41:37.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:37 vm00.local ceph-mon[103263]: osdmap e63: 6 total, 6 up, 6 in 2026-03-10T12:41:37.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:37 vm00.local ceph-mon[103263]: pgmap v100: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 41/333 objects degraded (12.312%) 2026-03-10T12:41:39.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:39 vm00.local ceph-mon[103263]: pgmap v101: 65 pgs: 3 active+undersized, 2 active+undersized+degraded, 60 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 8/333 objects degraded (2.402%) 2026-03-10T12:41:39.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:39 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 8/333 objects degraded (2.402%), 2 pgs degraded (PG_DEGRADED) 2026-03-10T12:41:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:39 vm07.local ceph-mon[93622]: pgmap v101: 65 pgs: 3 active+undersized, 2 active+undersized+degraded, 60 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 8/333 objects degraded (2.402%) 2026-03-10T12:41:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:39 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 8/333 objects degraded (2.402%), 2 pgs degraded (PG_DEGRADED) 2026-03-10T12:41:41.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:40 vm00.local ceph-mon[103263]: Health check cleared: 
PG_DEGRADED (was: Degraded data redundancy: 8/333 objects degraded (2.402%), 2 pgs degraded) 2026-03-10T12:41:41.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:40 vm00.local ceph-mon[103263]: Cluster is now healthy 2026-03-10T12:41:41.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:40 vm07.local ceph-mon[93622]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 8/333 objects degraded (2.402%), 2 pgs degraded) 2026-03-10T12:41:41.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:40 vm07.local ceph-mon[93622]: Cluster is now healthy 2026-03-10T12:41:42.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:41 vm00.local ceph-mon[103263]: pgmap v102: 65 pgs: 65 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:41:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:41 vm07.local ceph-mon[93622]: pgmap v102: 65 pgs: 65 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:41:44.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:43 vm00.local ceph-mon[103263]: pgmap v103: 65 pgs: 65 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:41:44.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:43 vm07.local ceph-mon[93622]: pgmap v103: 65 pgs: 65 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:41:46.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:45 vm00.local ceph-mon[103263]: pgmap v104: 65 pgs: 65 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:41:46.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:46.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": 
"json"}]: dispatch 2026-03-10T12:41:46.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:46.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:45 vm07.local ceph-mon[93622]: pgmap v104: 65 pgs: 65 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:41:46.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:46.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:41:46.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:48.107 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:48 vm07.local ceph-mon[93622]: pgmap v105: 65 pgs: 65 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:41:48.107 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:48 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T12:41:48.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:48 vm00.local ceph-mon[103263]: pgmap v105: 65 pgs: 65 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:41:48.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:48 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T12:41:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:49 
vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T12:41:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:49 vm07.local ceph-mon[93622]: Upgrade: osd.3 is safe to restart 2026-03-10T12:41:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:49 vm07.local ceph-mon[93622]: Upgrade: Updating osd.3 2026-03-10T12:41:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:49 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:49 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T12:41:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:49 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:41:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:49 vm07.local ceph-mon[93622]: Deploying daemon osd.3 on vm07 2026-03-10T12:41:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:49 vm07.local ceph-mon[93622]: pgmap v106: 65 pgs: 65 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:41:49.316 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:49 vm07.local systemd[1]: Stopping Ceph osd.3 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 
2026-03-10T12:41:49.317 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:49 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3[64470]: 2026-03-10T12:41:49.127+0000 7f54add53700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T12:41:49.317 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:49 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3[64470]: 2026-03-10T12:41:49.127+0000 7f54add53700 -1 osd.3 63 *** Got signal Terminated *** 2026-03-10T12:41:49.317 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:49 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3[64470]: 2026-03-10T12:41:49.127+0000 7f54add53700 -1 osd.3 63 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T12:41:49.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:49 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T12:41:49.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:49 vm00.local ceph-mon[103263]: Upgrade: osd.3 is safe to restart 2026-03-10T12:41:49.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:49 vm00.local ceph-mon[103263]: Upgrade: Updating osd.3 2026-03-10T12:41:49.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:49 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:49.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:49 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T12:41:49.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:49 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:41:49.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:49 vm00.local ceph-mon[103263]: Deploying daemon osd.3 on vm07 2026-03-10T12:41:49.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:49 vm00.local ceph-mon[103263]: pgmap v106: 65 pgs: 65 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:41:50.088 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:50 vm07.local ceph-mon[93622]: osd.3 marked itself down and dead 2026-03-10T12:41:50.344 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local podman[99428]: 2026-03-10 12:41:50.089123215 +0000 UTC m=+0.980254067 container died 0c6249fe39510c26c4ac29e2a8fcb3815b7f917cc974d0ade0ebd5fe7e3d5f45 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3, GIT_BRANCH=HEAD, GIT_CLEAN=True, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 8 Base Image, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, org.label-schema.schema-version=1.0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.29.1, org.label-schema.build-date=20231212, CEPH_POINT_RELEASE=-18.2.0, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.vendor=CentOS) 2026-03-10T12:41:50.344 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local podman[99428]: 2026-03-10 12:41:50.10881183 +0000 UTC m=+0.999942683 container remove 0c6249fe39510c26c4ac29e2a8fcb3815b7f917cc974d0ade0ebd5fe7e3d5f45 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3, GIT_CLEAN=True, org.label-schema.vendor=CentOS, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , RELEASE=HEAD, ceph=True, org.label-schema.build-date=20231212, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, CEPH_POINT_RELEASE=-18.2.0) 2026-03-10T12:41:50.344 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local bash[99428]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3 2026-03-10T12:41:50.344 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local podman[99496]: 2026-03-10 12:41:50.306373439 +0000 UTC m=+0.030427138 container create 299c2ebd8e4e25868683c88a359b309ac6bcf88d031b42ddf161ac88657ce109 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-deactivate, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T12:41:50.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:50 vm00.local ceph-mon[103263]: osd.3 marked itself down and dead 2026-03-10T12:41:50.599 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local podman[99496]: 2026-03-10 12:41:50.291210776 +0000 UTC m=+0.015264485 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:41:50.599 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local podman[99496]: 2026-03-10 12:41:50.58512588 +0000 UTC m=+0.309179600 container init 299c2ebd8e4e25868683c88a359b309ac6bcf88d031b42ddf161ac88657ce109 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-deactivate, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T12:41:50.874 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local 
podman[99496]: 2026-03-10 12:41:50.597402432 +0000 UTC m=+0.321456141 container start 299c2ebd8e4e25868683c88a359b309ac6bcf88d031b42ddf161ac88657ce109 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-deactivate, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T12:41:50.874 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local podman[99496]: 2026-03-10 12:41:50.613415827 +0000 UTC m=+0.337469536 container attach 299c2ebd8e4e25868683c88a359b309ac6bcf88d031b42ddf161ac88657ce109 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, 
CEPH_REF=squid) 2026-03-10T12:41:50.874 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local conmon[99507]: conmon 299c2ebd8e4e25868683 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-299c2ebd8e4e25868683c88a359b309ac6bcf88d031b42ddf161ac88657ce109.scope/container/memory.events 2026-03-10T12:41:50.874 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local podman[99496]: 2026-03-10 12:41:50.752962687 +0000 UTC m=+0.477016396 container died 299c2ebd8e4e25868683c88a359b309ac6bcf88d031b42ddf161ac88657ce109 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) 2026-03-10T12:41:50.874 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local podman[99496]: 2026-03-10 12:41:50.78144265 +0000 UTC m=+0.505496359 container remove 299c2ebd8e4e25868683c88a359b309ac6bcf88d031b42ddf161ac88657ce109 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS) 2026-03-10T12:41:50.874 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.3.service: Deactivated successfully. 2026-03-10T12:41:50.874 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.3.service: Unit process 99507 (conmon) remains running after unit stopped. 2026-03-10T12:41:50.874 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.3.service: Unit process 99526 (podman) remains running after unit stopped. 2026-03-10T12:41:50.874 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local systemd[1]: Stopped Ceph osd.3 for 1a52002a-1c7d-11f1-af82-51cdd81caea8. 2026-03-10T12:41:50.874 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:50 vm07.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.3.service: Consumed 50.643s CPU time, 831.1M memory peak. 2026-03-10T12:41:51.138 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local systemd[1]: Starting Ceph osd.3 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 
2026-03-10T12:41:51.138 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:51 vm07.local ceph-mon[93622]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T12:41:51.138 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:51 vm07.local ceph-mon[93622]: osdmap e64: 6 total, 5 up, 6 in 2026-03-10T12:41:51.138 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:51 vm07.local ceph-mon[93622]: pgmap v108: 65 pgs: 16 stale+active+clean, 49 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:41:51.386 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:51 vm00.local ceph-mon[103263]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T12:41:51.386 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:51 vm00.local ceph-mon[103263]: osdmap e64: 6 total, 5 up, 6 in 2026-03-10T12:41:51.386 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:51 vm00.local ceph-mon[103263]: pgmap v108: 65 pgs: 16 stale+active+clean, 49 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:41:51.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local podman[99608]: 2026-03-10 12:41:51.138250164 +0000 UTC m=+0.019910122 container create a9eed787d4e1ab65732d653836cf89034652d305d9fa917fa7c6492fc3056f00 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223) 2026-03-10T12:41:51.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local podman[99608]: 2026-03-10 12:41:51.180372794 +0000 UTC m=+0.062032772 container init a9eed787d4e1ab65732d653836cf89034652d305d9fa917fa7c6492fc3056f00 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223) 2026-03-10T12:41:51.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local podman[99608]: 2026-03-10 12:41:51.192023774 +0000 UTC m=+0.073683742 container start a9eed787d4e1ab65732d653836cf89034652d305d9fa917fa7c6492fc3056f00 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0) 2026-03-10T12:41:51.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local podman[99608]: 2026-03-10 12:41:51.193471603 +0000 UTC m=+0.075131581 container attach a9eed787d4e1ab65732d653836cf89034652d305d9fa917fa7c6492fc3056f00 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T12:41:51.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local podman[99608]: 2026-03-10 12:41:51.131013136 +0000 UTC m=+0.012673104 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:41:51.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate[99621]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:51.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local bash[99608]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:51.566 
INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate[99621]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:51.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local bash[99608]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:52.031 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate[99621]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T12:41:52.031 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate[99621]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:52.031 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local bash[99608]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T12:41:52.031 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local bash[99608]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:52.031 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate[99621]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:52.031 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local bash[99608]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:41:52.031 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate[99621]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T12:41:52.031 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local bash[99608]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T12:41:52.031 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local 
ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate[99621]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-6afb6cdb-52b5-409c-8e74-f049a2cbfadb/osd-block-cd850d3a-e99e-4292-9600-f18ed81a7d18 --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-10T12:41:52.032 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:51 vm07.local bash[99608]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-6afb6cdb-52b5-409c-8e74-f049a2cbfadb/osd-block-cd850d3a-e99e-4292-9600-f18ed81a7d18 --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-10T12:41:52.282 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:52 vm07.local ceph-mon[93622]: osdmap e65: 6 total, 5 up, 6 in 2026-03-10T12:41:52.282 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate[99621]: Running command: /usr/bin/ln -snf /dev/ceph-6afb6cdb-52b5-409c-8e74-f049a2cbfadb/osd-block-cd850d3a-e99e-4292-9600-f18ed81a7d18 /var/lib/ceph/osd/ceph-3/block 2026-03-10T12:41:52.282 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local bash[99608]: Running command: /usr/bin/ln -snf /dev/ceph-6afb6cdb-52b5-409c-8e74-f049a2cbfadb/osd-block-cd850d3a-e99e-4292-9600-f18ed81a7d18 /var/lib/ceph/osd/ceph-3/block 2026-03-10T12:41:52.282 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate[99621]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-10T12:41:52.282 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local bash[99608]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-10T12:41:52.282 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate[99621]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T12:41:52.282 
INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local bash[99608]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T12:41:52.282 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate[99621]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T12:41:52.282 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local bash[99608]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T12:41:52.282 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate[99621]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-10T12:41:52.282 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local bash[99608]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-10T12:41:52.282 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local conmon[99621]: conmon a9eed787d4e1ab65732d : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a9eed787d4e1ab65732d653836cf89034652d305d9fa917fa7c6492fc3056f00.scope/container/memory.events 2026-03-10T12:41:52.282 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local podman[99608]: 2026-03-10 12:41:52.148324029 +0000 UTC m=+1.029983997 container died a9eed787d4e1ab65732d653836cf89034652d305d9fa917fa7c6492fc3056f00 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, 
OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T12:41:52.282 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local podman[99608]: 2026-03-10 12:41:52.178463597 +0000 UTC m=+1.060123565 container remove a9eed787d4e1ab65732d653836cf89034652d305d9fa917fa7c6492fc3056f00 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-activate, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T12:41:52.282 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local podman[99883]: 2026-03-10 12:41:52.282034013 +0000 UTC m=+0.021537527 container create 72a045e3b78b360940476b4ac5c0a1e208ea1de379c5afe8114a7bc3afa315f3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, 
org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_REF=squid) 2026-03-10T12:41:52.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:52 vm00.local ceph-mon[103263]: osdmap e65: 6 total, 5 up, 6 in 2026-03-10T12:41:52.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local podman[99883]: 2026-03-10 12:41:52.326880061 +0000 UTC m=+0.066383575 container init 72a045e3b78b360940476b4ac5c0a1e208ea1de379c5afe8114a7bc3afa315f3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T12:41:52.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local podman[99883]: 2026-03-10 12:41:52.329992868 +0000 UTC m=+0.069496382 container start 72a045e3b78b360940476b4ac5c0a1e208ea1de379c5afe8114a7bc3afa315f3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, 
org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid) 2026-03-10T12:41:52.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local bash[99883]: 72a045e3b78b360940476b4ac5c0a1e208ea1de379c5afe8114a7bc3afa315f3 2026-03-10T12:41:52.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local podman[99883]: 2026-03-10 12:41:52.275215549 +0000 UTC m=+0.014719074 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:41:52.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:52 vm07.local systemd[1]: Started Ceph osd.3 for 1a52002a-1c7d-11f1-af82-51cdd81caea8. 
2026-03-10T12:41:53.427 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:53 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3[99893]: 2026-03-10T12:41:53.423+0000 7f8073c74740 -1 Falling back to public interface 2026-03-10T12:41:53.427 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:53 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:53.427 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:53 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:53.427 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:53 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:41:53.427 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:53 vm07.local ceph-mon[93622]: pgmap v110: 65 pgs: 9 active+undersized, 6 stale+active+clean, 9 active+undersized+degraded, 41 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 39/333 objects degraded (11.712%) 2026-03-10T12:41:53.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:53 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:53.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:53 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:41:53.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:53 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:41:53.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:53 vm00.local ceph-mon[103263]: pgmap v110: 65 pgs: 9 active+undersized, 6 stale+active+clean, 9 active+undersized+degraded, 41 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 
120 GiB avail; 39/333 objects degraded (11.712%)
2026-03-10T12:41:54.696 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:54 vm07.local ceph-mon[93622]: Health check failed: Degraded data redundancy: 39/333 objects degraded (11.712%), 9 pgs degraded (PG_DEGRADED)
2026-03-10T12:41:54.696 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:54 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:41:54.696 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:54 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:41:54.696 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:54 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:41:54.696 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:54 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:41:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:54 vm00.local ceph-mon[103263]: Health check failed: Degraded data redundancy: 39/333 objects degraded (11.712%), 9 pgs degraded (PG_DEGRADED)
2026-03-10T12:41:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:54 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:41:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:54 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:41:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:54 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:41:54.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:54 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:41:55.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:55 vm00.local ceph-mon[103263]: pgmap v111: 65 pgs: 15 active+undersized, 3 stale+active+clean, 16 active+undersized+degraded, 31 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 62/333 objects degraded (18.619%)
2026-03-10T12:41:55.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:55 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:41:55.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:55 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:41:55.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:55 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:41:55.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:55 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:41:55.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:55 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:41:55.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:55 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:41:55.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:55 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:41:55.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:55 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:41:55.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:55 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:41:55.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:55 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch
2026-03-10T12:41:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:55 vm07.local ceph-mon[93622]: pgmap v111: 65 pgs: 15 active+undersized, 3 stale+active+clean, 16 active+undersized+degraded, 31 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 62/333 objects degraded (18.619%)
2026-03-10T12:41:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:55 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:41:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:55 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:41:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:55 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:41:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:55 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:41:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:55 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:41:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:55 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:41:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:55 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:41:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:55 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:41:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:55 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:41:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:55 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch
2026-03-10T12:41:56.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:56 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch
2026-03-10T12:41:56.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:56 vm00.local ceph-mon[103263]: Upgrade: unsafe to stop osd(s) at this time (14 PGs are or would become offline)
2026-03-10T12:41:56.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:56 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch
2026-03-10T12:41:56.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:56 vm07.local ceph-mon[93622]: Upgrade: unsafe to stop osd(s) at this time (14 PGs are or would become offline)
2026-03-10T12:41:57.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:57 vm00.local ceph-mon[103263]: pgmap v112: 65 pgs: 19 active+undersized, 19 active+undersized+degraded, 27 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 69/333 objects degraded (20.721%)
2026-03-10T12:41:57.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:57 vm07.local ceph-mon[93622]: pgmap v112: 65 pgs: 19 active+undersized, 19 active+undersized+degraded, 27 active+clean; 269 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 69/333 objects degraded (20.721%)
2026-03-10T12:41:57.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.893+0000 7f7f452d1700 1 -- 192.168.123.100:0/2752405596 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7f40102760 msgr2=0x7f7f40102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:41:57.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.893+0000 7f7f452d1700 1 --2- 192.168.123.100:0/2752405596 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7f40102760 0x7f7f40102b70 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f7f30009b00 tx=0x7f7f30009e10 comp rx=0 tx=0).stop
2026-03-10T12:41:57.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.894+0000 7f7f452d1700 1 -- 192.168.123.100:0/2752405596 shutdown_connections
2026-03-10T12:41:57.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.894+0000 7f7f452d1700 1 --2- 192.168.123.100:0/2752405596 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7f40103960 0x7f7f40103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:41:57.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.894+0000 7f7f452d1700 1 --2- 192.168.123.100:0/2752405596 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7f40102760 0x7f7f40102b70 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:41:57.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.894+0000 7f7f452d1700 1 -- 192.168.123.100:0/2752405596 >> 192.168.123.100:0/2752405596 conn(0x7f7f400fdcf0 msgr2=0x7f7f40100140 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:41:57.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.894+0000 7f7f452d1700 1 -- 192.168.123.100:0/2752405596 shutdown_connections
2026-03-10T12:41:57.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.894+0000 7f7f452d1700 1 -- 192.168.123.100:0/2752405596 wait complete.
2026-03-10T12:41:57.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.895+0000 7f7f452d1700 1 Processor -- start
2026-03-10T12:41:57.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.895+0000 7f7f452d1700 1 -- start start
2026-03-10T12:41:57.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.895+0000 7f7f452d1700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7f40102760 0x7f7f40198110 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:41:57.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.895+0000 7f7f452d1700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7f40103960 0x7f7f40198650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:41:57.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.895+0000 7f7f452d1700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f40198c70 con 0x7f7f40103960
2026-03-10T12:41:57.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.895+0000 7f7f452d1700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f40198db0 con 0x7f7f40102760
2026-03-10T12:41:57.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.896+0000 7f7f37fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7f40103960 0x7f7f40198650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:41:57.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.896+0000 7f7f3effd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7f40102760 0x7f7f40198110 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:41:57.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.896+0000 7f7f3effd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7f40102760 0x7f7f40198110 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:53078/0 (socket says 192.168.123.100:53078)
2026-03-10T12:41:57.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.896+0000 7f7f3effd700 1 -- 192.168.123.100:0/1429755563 learned_addr learned my addr 192.168.123.100:0/1429755563 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:41:57.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.896+0000 7f7f3effd700 1 -- 192.168.123.100:0/1429755563 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7f40103960 msgr2=0x7f7f40198650 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:41:57.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.896+0000 7f7f3effd700 1 --2- 192.168.123.100:0/1429755563 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7f40103960 0x7f7f40198650 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:41:57.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.896+0000 7f7f3effd700 1 -- 192.168.123.100:0/1429755563 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7f300097e0 con 0x7f7f40102760
2026-03-10T12:41:57.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.896+0000 7f7f37fff700 1 --2- 192.168.123.100:0/1429755563 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7f40103960 0x7f7f40198650 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-10T12:41:57.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.897+0000 7f7f3effd700 1 --2- 192.168.123.100:0/1429755563 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7f40102760 0x7f7f40198110 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f7f3000b5c0 tx=0x7f7f300049d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:41:57.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.897+0000 7f7f3cff9700 1 -- 192.168.123.100:0/1429755563 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7f3001d070 con 0x7f7f40102760
2026-03-10T12:41:57.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.897+0000 7f7f452d1700 1 -- 192.168.123.100:0/1429755563 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7f4019d800 con 0x7f7f40102760
2026-03-10T12:41:58.036 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.897+0000 7f7f452d1700 1 -- 192.168.123.100:0/1429755563 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7f4019dc90 con 0x7f7f40102760
2026-03-10T12:41:57.899 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.898+0000 7f7f3cff9700 1 -- 192.168.123.100:0/1429755563 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7f30004500 con 0x7f7f40102760
2026-03-10T12:41:57.899 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.898+0000 7f7f3cff9700 1 -- 192.168.123.100:0/1429755563 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7f3000f460 con 0x7f7f40102760
2026-03-10T12:41:57.899 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.899+0000 7f7f3cff9700 1 -- 192.168.123.100:0/1429755563 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7f3000f6d0 con 0x7f7f40102760
2026-03-10T12:41:57.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.899+0000 7f7f3cff9700 1 --2- 192.168.123.100:0/1429755563 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f7f2c0778c0 0x7f7f2c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:41:57.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.899+0000 7f7f3cff9700 1 -- 192.168.123.100:0/1429755563 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f7f3009c130 con 0x7f7f40102760
2026-03-10T12:41:57.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.900+0000 7f7f37fff700 1 --2- 192.168.123.100:0/1429755563 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f7f2c0778c0 0x7f7f2c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:41:57.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.900+0000 7f7f37fff700 1 --2- 192.168.123.100:0/1429755563 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f7f2c0778c0 0x7f7f2c079d70 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f7f28006fd0 tx=0x7f7f28009380 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:41:57.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.900+0000 7f7f452d1700 1 -- 192.168.123.100:0/1429755563 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7f20005320 con 0x7f7f40102760
2026-03-10T12:41:57.904 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:57.903+0000 7f7f3cff9700 1 -- 192.168.123.100:0/1429755563 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7f30064960 con 0x7f7f40102760
2026-03-10T12:41:58.036 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.035+0000 7f7f452d1700 1 -- 192.168.123.100:0/1429755563 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7f20000bf0 con 0x7f7f2c0778c0
2026-03-10T12:41:58.037 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.036+0000 7f7f3cff9700 1 -- 192.168.123.100:0/1429755563 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f7f20000bf0 con 0x7f7f2c0778c0
2026-03-10T12:41:58.039 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.039+0000 7f7f452d1700 1 -- 192.168.123.100:0/1429755563 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f7f2c0778c0 msgr2=0x7f7f2c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:41:58.039 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.039+0000 7f7f452d1700 1 --2- 192.168.123.100:0/1429755563 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f7f2c0778c0 0x7f7f2c079d70 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f7f28006fd0 tx=0x7f7f28009380 comp rx=0 tx=0).stop
2026-03-10T12:41:58.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.039+0000 7f7f452d1700 1 -- 192.168.123.100:0/1429755563 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7f40102760 msgr2=0x7f7f40198110 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:41:58.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.039+0000 7f7f452d1700 1 --2- 192.168.123.100:0/1429755563 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7f40102760 0x7f7f40198110 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f7f3000b5c0 tx=0x7f7f300049d0 comp rx=0 tx=0).stop
2026-03-10T12:41:58.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.039+0000 7f7f452d1700 1 -- 192.168.123.100:0/1429755563 shutdown_connections
2026-03-10T12:41:58.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.039+0000 7f7f452d1700 1 --2- 192.168.123.100:0/1429755563 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f7f2c0778c0 0x7f7f2c079d70 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:41:58.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.039+0000 7f7f452d1700 1 --2- 192.168.123.100:0/1429755563 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f7f40102760 0x7f7f40198110 secure :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f7f3000b5c0 tx=0x7f7f300049d0 comp rx=0 tx=0).stop
2026-03-10T12:41:58.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.039+0000 7f7f452d1700 1 --2- 192.168.123.100:0/1429755563 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f7f40103960 0x7f7f40198650 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:41:58.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.039+0000 7f7f452d1700 1 -- 192.168.123.100:0/1429755563 >> 192.168.123.100:0/1429755563 conn(0x7f7f400fdcf0 msgr2=0x7f7f40106b90 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:41:58.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.040+0000 7f7f452d1700 1 -- 192.168.123.100:0/1429755563 shutdown_connections
2026-03-10T12:41:58.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.040+0000 7f7f452d1700 1 -- 192.168.123.100:0/1429755563 wait complete.
2026-03-10T12:41:58.052 INFO:teuthology.orchestra.run.vm00.stdout:true
2026-03-10T12:41:58.129 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.129+0000 7f31361b7700 1 -- 192.168.123.100:0/223530388 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3130103960 msgr2=0x7f3130103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:41:58.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.129+0000 7f31361b7700 1 --2- 192.168.123.100:0/223530388 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3130103960 0x7f3130103db0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f3118009b00 tx=0x7f3118009e10 comp rx=0 tx=0).stop
2026-03-10T12:41:58.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.129+0000 7f31361b7700 1 -- 192.168.123.100:0/223530388 shutdown_connections
2026-03-10T12:41:58.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.129+0000 7f31361b7700 1 --2- 192.168.123.100:0/223530388 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3130103960 0x7f3130103db0 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:41:58.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.129+0000 7f31361b7700 1 --2- 192.168.123.100:0/223530388 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3130102760 0x7f3130102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:41:58.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.129+0000 7f31361b7700 1 -- 192.168.123.100:0/223530388 >> 192.168.123.100:0/223530388 conn(0x7f31300fdcf0 msgr2=0x7f3130100140 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:41:58.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.130+0000 7f31361b7700 1 -- 192.168.123.100:0/223530388 shutdown_connections
2026-03-10T12:41:58.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.130+0000 7f31361b7700 1 -- 192.168.123.100:0/223530388 wait complete.
2026-03-10T12:41:58.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.130+0000 7f31361b7700 1 Processor -- start
2026-03-10T12:41:58.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.130+0000 7f31361b7700 1 -- start start
2026-03-10T12:41:58.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.130+0000 7f31361b7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3130102760 0x7f3130198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:41:58.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.131+0000 7f31361b7700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3130103960 0x7f3130198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:41:58.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.131+0000 7f31361b7700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3130198b80 con 0x7f3130102760
2026-03-10T12:41:58.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.131+0000 7f31361b7700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3130198cc0 con 0x7f3130103960
2026-03-10T12:41:58.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.131+0000 7f312f7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3130102760 0x7f3130198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:41:58.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.131+0000 7f312f7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3130102760 0x7f3130198020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:40908/0 (socket says 192.168.123.100:40908)
2026-03-10T12:41:58.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.131+0000 7f312f7fe700 1 -- 192.168.123.100:0/620767883 learned_addr learned my addr 192.168.123.100:0/620767883 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:41:58.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.131+0000 7f312f7fe700 1 -- 192.168.123.100:0/620767883 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3130103960 msgr2=0x7f3130198560 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down
2026-03-10T12:41:58.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.131+0000 7f312f7fe700 1 --2- 192.168.123.100:0/620767883 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3130103960 0x7f3130198560 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:41:58.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.131+0000 7f312f7fe700 1 -- 192.168.123.100:0/620767883 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f31180097e0 con 0x7f3130102760
2026-03-10T12:41:58.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.132+0000 7f312f7fe700 1 --2- 192.168.123.100:0/620767883 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3130102760 0x7f3130198020 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f312000d900 tx=0x7f312000dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:41:58.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.132+0000 7f312d7fa700 1 -- 192.168.123.100:0/620767883 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f31200098e0 con 0x7f3130102760
2026-03-10T12:41:58.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.132+0000 7f312d7fa700 1 -- 192.168.123.100:0/620767883 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3120010460 con 0x7f3130102760
2026-03-10T12:41:58.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.132+0000 7f312d7fa700 1 -- 192.168.123.100:0/620767883 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f312000f5d0 con 0x7f3130102760
2026-03-10T12:41:58.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.132+0000 7f31361b7700 1 -- 192.168.123.100:0/620767883 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f313019d770 con 0x7f3130102760
2026-03-10T12:41:58.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.132+0000 7f31361b7700 1 -- 192.168.123.100:0/620767883 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f313019dcc0 con 0x7f3130102760
2026-03-10T12:41:58.134 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.133+0000 7f312d7fa700 1 -- 192.168.123.100:0/620767883 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f31200105d0 con 0x7f3130102760
2026-03-10T12:41:58.136 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.134+0000 7f31361b7700 1 -- 192.168.123.100:0/620767883 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f313010b460 con 0x7f3130102760
2026-03-10T12:41:58.136 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.134+0000 7f312d7fa700 1 --2- 192.168.123.100:0/620767883 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f311c077870 0x7f311c079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:41:58.136 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.134+0000 7f312d7fa700 1 -- 192.168.123.100:0/620767883 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f3120099520 con 0x7f3130102760
2026-03-10T12:41:58.138 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.137+0000 7f3127fff700 1 --2- 192.168.123.100:0/620767883 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f311c077870 0x7f311c079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:41:58.138 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.138+0000 7f312d7fa700 1 -- 192.168.123.100:0/620767883 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3120061d50 con 0x7f3130102760
2026-03-10T12:41:58.138 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.138+0000 7f3127fff700 1 --2- 192.168.123.100:0/620767883 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f311c077870 0x7f311c079d20 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f3118005f50 tx=0x7f3118005dc0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:41:58.284 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.283+0000 7f31361b7700 1 -- 192.168.123.100:0/620767883 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f31301082b0 con 0x7f311c077870
2026-03-10T12:41:58.285 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.285+0000 7f312d7fa700 1 -- 192.168.123.100:0/620767883 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f31301082b0 con 0x7f311c077870
2026-03-10T12:41:58.288 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.288+0000 7f31361b7700 1 -- 192.168.123.100:0/620767883 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f311c077870 msgr2=0x7f311c079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:41:58.288 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.288+0000 7f31361b7700 1 --2- 192.168.123.100:0/620767883 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f311c077870 0x7f311c079d20 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f3118005f50 tx=0x7f3118005dc0 comp rx=0 tx=0).stop
2026-03-10T12:41:58.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.288+0000 7f31361b7700 1 -- 192.168.123.100:0/620767883 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3130102760 msgr2=0x7f3130198020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:41:58.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.288+0000 7f31361b7700 1 --2- 192.168.123.100:0/620767883 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3130102760 0x7f3130198020 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f312000d900 tx=0x7f312000dcc0 comp rx=0 tx=0).stop
2026-03-10T12:41:58.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.288+0000 7f31361b7700 1 -- 192.168.123.100:0/620767883 shutdown_connections
2026-03-10T12:41:58.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.288+0000 7f31361b7700 1 --2- 192.168.123.100:0/620767883 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f311c077870 0x7f311c079d20 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:41:58.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.288+0000 7f31361b7700 1 --2- 192.168.123.100:0/620767883 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3130102760 0x7f3130198020 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:41:58.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.288+0000 7f31361b7700 1 --2- 192.168.123.100:0/620767883 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3130103960 0x7f3130198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:41:58.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.288+0000 7f31361b7700 1 -- 192.168.123.100:0/620767883 >> 192.168.123.100:0/620767883 conn(0x7f31300fdcf0 msgr2=0x7f3130106b90 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:41:58.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.288+0000 7f31361b7700 1 -- 192.168.123.100:0/620767883 shutdown_connections
2026-03-10T12:41:58.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.288+0000 7f31361b7700 1 -- 192.168.123.100:0/620767883 wait complete.
2026-03-10T12:41:58.369 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.369+0000 7ff8bdfe1700 1 -- 192.168.123.100:0/1357478479 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8b81017e0 msgr2=0x7ff8b8103c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:41:58.369 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.369+0000 7ff8bdfe1700 1 --2- 192.168.123.100:0/1357478479 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8b81017e0 0x7ff8b8103c60 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7ff8a8009b00 tx=0x7ff8a8009e10 comp rx=0 tx=0).stop
2026-03-10T12:41:58.370 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.369+0000 7ff8bdfe1700 1 -- 192.168.123.100:0/1357478479 shutdown_connections
2026-03-10T12:41:58.370 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.369+0000 7ff8bdfe1700 1 --2- 192.168.123.100:0/1357478479 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8b81017e0 0x7ff8b8103c60 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:41:58.370 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.369+0000 7ff8bdfe1700 1 --2- 192.168.123.100:0/1357478479 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8b80fee80 0x7ff8b81012a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:41:58.370 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.369+0000 7ff8bdfe1700 1 -- 192.168.123.100:0/1357478479 >> 192.168.123.100:0/1357478479 conn(0x7ff8b80faa70 msgr2=0x7ff8b80fcee0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:41:58.370 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.370+0000 7ff8bdfe1700 1 -- 192.168.123.100:0/1357478479 shutdown_connections
2026-03-10T12:41:58.371 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.370+0000 7ff8bdfe1700 1 -- 192.168.123.100:0/1357478479 wait complete.
2026-03-10T12:41:58.371 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.370+0000 7ff8bdfe1700 1 Processor -- start
2026-03-10T12:41:58.371 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.370+0000 7ff8bdfe1700 1 -- start start
2026-03-10T12:41:58.371 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.371+0000 7ff8bdfe1700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8b80fee80 0x7ff8b819c3d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:41:58.371 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.371+0000 7ff8bdfe1700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8b81017e0 0x7ff8b819c910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:41:58.371 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.371+0000 7ff8bdfe1700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff8b819cf30 con 0x7ff8b80fee80
2026-03-10T12:41:58.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.371+0000 7ff8bdfe1700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff8b819d070 con 0x7ff8b81017e0
2026-03-10T12:41:58.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.371+0000 7ff8b77fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8b80fee80 0x7ff8b819c3d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:41:58.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.371+0000 7ff8b77fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8b80fee80 0x7ff8b819c3d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:40928/0 (socket says 192.168.123.100:40928)
2026-03-10T12:41:58.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.371+0000 7ff8b77fe700 1 -- 192.168.123.100:0/4169480010 learned_addr learned my addr 192.168.123.100:0/4169480010 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:41:58.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.371+0000 7ff8b6ffd700 1 --2- 192.168.123.100:0/4169480010 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8b81017e0 0x7ff8b819c910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:41:58.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.371+0000 7ff8b77fe700 1 -- 192.168.123.100:0/4169480010 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8b81017e0 msgr2=0x7ff8b819c910 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:41:58.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.371+0000 7ff8b77fe700 1 --2- 192.168.123.100:0/4169480010 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8b81017e0 0x7ff8b819c910 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:41:58.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.371+0000 7ff8b77fe700 1 -- 192.168.123.100:0/4169480010 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff8a80097e0 con 0x7ff8b80fee80
2026-03-10T12:41:58.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.372+0000 7ff8b77fe700 1 --2- 192.168.123.100:0/4169480010 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8b80fee80 0x7ff8b819c3d0 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7ff8a000d900 tx=0x7ff8a000dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0
out_seq=0 2026-03-10T12:41:58.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.372+0000 7ff8b4ff9700 1 -- 192.168.123.100:0/4169480010 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff8a00041d0 con 0x7ff8b80fee80 2026-03-10T12:41:58.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.372+0000 7ff8b4ff9700 1 -- 192.168.123.100:0/4169480010 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff8a0004330 con 0x7ff8b80fee80 2026-03-10T12:41:58.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.372+0000 7ff8b4ff9700 1 -- 192.168.123.100:0/4169480010 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff8a0003de0 con 0x7ff8b80fee80 2026-03-10T12:41:58.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.372+0000 7ff8bdfe1700 1 -- 192.168.123.100:0/4169480010 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff8b81a1b20 con 0x7ff8b80fee80 2026-03-10T12:41:58.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.372+0000 7ff8bdfe1700 1 -- 192.168.123.100:0/4169480010 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff8b81a2070 con 0x7ff8b80fee80 2026-03-10T12:41:58.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.374+0000 7ff8b4ff9700 1 -- 192.168.123.100:0/4169480010 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff8a0010460 con 0x7ff8b80fee80 2026-03-10T12:41:58.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.374+0000 7ff8bdfe1700 1 -- 192.168.123.100:0/4169480010 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff8b8196590 con 0x7ff8b80fee80 2026-03-10T12:41:58.378 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.375+0000 7ff8b4ff9700 1 --2- 192.168.123.100:0/4169480010 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff8a40778c0 0x7ff8a4079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:58.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.375+0000 7ff8b4ff9700 1 -- 192.168.123.100:0/4169480010 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6136+0+0 (secure 0 0 0) 0x7ff8a0021030 con 0x7ff8b80fee80 2026-03-10T12:41:58.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.375+0000 7ff8b6ffd700 1 --2- 192.168.123.100:0/4169480010 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff8a40778c0 0x7ff8a4079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:58.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.378+0000 7ff8b4ff9700 1 -- 192.168.123.100:0/4169480010 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff8a0062780 con 0x7ff8b80fee80 2026-03-10T12:41:58.379 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.378+0000 7ff8b6ffd700 1 --2- 192.168.123.100:0/4169480010 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff8a40778c0 0x7ff8a4079d70 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7ff8a8000c00 tx=0x7ff8a8005fb0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:58.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.509+0000 7ff8bdfe1700 1 -- 192.168.123.100:0/4169480010 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch ps", "target": 
["mon-mgr", ""]}) v1 -- 0x7ff8b8061190 con 0x7ff8a40778c0 2026-03-10T12:41:58.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.514+0000 7ff8b4ff9700 1 -- 192.168.123.100:0/4169480010 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7ff8b8061190 con 0x7ff8a40778c0 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (8m) 27s ago 8m 25.5M - 0.25.0 c8568f914cd2 1897443e8fdf 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (9m) 27s ago 9m 9361k - 18.2.0 dc2bc1663786 d9c35bbdf4cd 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (8m) 4s ago 8m 11.4M - 18.2.0 dc2bc1663786 2a98961ae9ca 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (2m) 27s ago 9m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 88cd1c2e0041 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (2m) 4s ago 8m 7864k - 19.2.3-678-ge911bdeb 654f31e6858e 889e2b356266 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (8m) 27s ago 8m 91.0M - 9.4.7 954c08fa6188 16c12a6ce1fc 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (6m) 27s ago 6m 136M - 18.2.0 dc2bc1663786 13dfd2469732 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (6m) 27s ago 6m 18.7M - 18.2.0 dc2bc1663786 d65368ac6dfa 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (6m) 4s ago 6m 18.4M - 18.2.0 dc2bc1663786 1b9425223bd2 2026-03-10T12:41:58.515 
INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (6m) 4s ago 6m 143M - 18.2.0 dc2bc1663786 9c2b36fc1ac1 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:8443,9283,8765 running (3m) 27s ago 9m 622M - 19.2.3-678-ge911bdeb 654f31e6858e eaf11f86af53 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (3m) 4s ago 8m 490M - 19.2.3-678-ge911bdeb 654f31e6858e 22c93435649c 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (3m) 27s ago 9m 61.0M 2048M 19.2.3-678-ge911bdeb 654f31e6858e e8cc5980a849 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (2m) 4s ago 8m 50.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 032abad282fc 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (9m) 27s ago 9m 15.0M - 1.5.0 0da6a335fe13 7a5c14d6ba46 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (8m) 4s ago 8m 16.2M - 1.5.0 0da6a335fe13 2fac2415b763 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (2m) 27s ago 8m 174M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9b151d44f3cf 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (51s) 27s ago 7m 104M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 252ea98c5665 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (29s) 27s ago 7m 12.4M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 249137e44eb7 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (6s) 4s ago 7m 31.7M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 72a045e3b78b 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (7m) 4s ago 7m 459M 4096M 18.2.0 dc2bc1663786 5c66bfb63a83 2026-03-10T12:41:58.515 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 
vm07 running (7m) 4s ago 7m 409M 4096M 18.2.0 dc2bc1663786 277bdfe4ec55 2026-03-10T12:41:58.516 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (3m) 27s ago 8m 67.8M - 2.43.0 a07b618ecd1d c80074b6b052 2026-03-10T12:41:58.518 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.517+0000 7ff8bdfe1700 1 -- 192.168.123.100:0/4169480010 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff8a40778c0 msgr2=0x7ff8a4079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:58.518 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.517+0000 7ff8bdfe1700 1 --2- 192.168.123.100:0/4169480010 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff8a40778c0 0x7ff8a4079d70 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7ff8a8000c00 tx=0x7ff8a8005fb0 comp rx=0 tx=0).stop 2026-03-10T12:41:58.518 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.517+0000 7ff8bdfe1700 1 -- 192.168.123.100:0/4169480010 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8b80fee80 msgr2=0x7ff8b819c3d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:58.518 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.517+0000 7ff8bdfe1700 1 --2- 192.168.123.100:0/4169480010 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8b80fee80 0x7ff8b819c3d0 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7ff8a000d900 tx=0x7ff8a000dcc0 comp rx=0 tx=0).stop 2026-03-10T12:41:58.519 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.518+0000 7ff8bdfe1700 1 -- 192.168.123.100:0/4169480010 shutdown_connections 2026-03-10T12:41:58.519 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.518+0000 7ff8bdfe1700 1 --2- 192.168.123.100:0/4169480010 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff8a40778c0 0x7ff8a4079d70 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:58.519 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.518+0000 7ff8bdfe1700 1 --2- 192.168.123.100:0/4169480010 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8b80fee80 0x7ff8b819c3d0 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:58.519 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.518+0000 7ff8bdfe1700 1 --2- 192.168.123.100:0/4169480010 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8b81017e0 0x7ff8b819c910 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:58.519 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.518+0000 7ff8bdfe1700 1 -- 192.168.123.100:0/4169480010 >> 192.168.123.100:0/4169480010 conn(0x7ff8b80faa70 msgr2=0x7ff8b80fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:58.519 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.518+0000 7ff8bdfe1700 1 -- 192.168.123.100:0/4169480010 shutdown_connections 2026-03-10T12:41:58.519 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.518+0000 7ff8bdfe1700 1 -- 192.168.123.100:0/4169480010 wait complete. 2026-03-10T12:41:58.542 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:58 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3[99893]: 2026-03-10T12:41:58.198+0000 7f8073c74740 -1 osd.3 0 read_superblock omap replica is missing. 
2026-03-10T12:41:58.542 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:58 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3[99893]: 2026-03-10T12:41:58.542+0000 7f8073c74740 -1 osd.3 63 log_to_monitors true 2026-03-10T12:41:58.590 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.588+0000 7f95b50c6700 1 -- 192.168.123.100:0/2224455880 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f95b0074d00 msgr2=0x7f95b0073160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:58.590 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.588+0000 7f95b50c6700 1 --2- 192.168.123.100:0/2224455880 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f95b0074d00 0x7f95b0073160 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f9598009b00 tx=0x7f9598009e10 comp rx=0 tx=0).stop 2026-03-10T12:41:58.592 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.591+0000 7f95b50c6700 1 -- 192.168.123.100:0/2224455880 shutdown_connections 2026-03-10T12:41:58.592 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.591+0000 7f95b50c6700 1 --2- 192.168.123.100:0/2224455880 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f95b0073730 0x7f95b0073ba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:58.592 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.591+0000 7f95b50c6700 1 --2- 192.168.123.100:0/2224455880 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f95b0074d00 0x7f95b0073160 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:58.592 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.591+0000 7f95b50c6700 1 -- 192.168.123.100:0/2224455880 >> 192.168.123.100:0/2224455880 conn(0x7f95b00fbad0 msgr2=0x7f95b00fdf20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:58.592 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.592+0000 7f95b50c6700 1 -- 192.168.123.100:0/2224455880 shutdown_connections 2026-03-10T12:41:58.592 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.592+0000 7f95b50c6700 1 -- 192.168.123.100:0/2224455880 wait complete. 2026-03-10T12:41:58.592 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.592+0000 7f95b50c6700 1 Processor -- start 2026-03-10T12:41:58.593 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.592+0000 7f95b50c6700 1 -- start start 2026-03-10T12:41:58.593 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.592+0000 7f95b50c6700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f95b0073730 0x7f95b0197fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:58.593 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.592+0000 7f95b50c6700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f95b0074d00 0x7f95b0198520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:58.593 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.592+0000 7f95b50c6700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f95b0198b40 con 0x7f95b0074d00 2026-03-10T12:41:58.593 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.592+0000 7f95b50c6700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f95b0198c80 con 0x7f95b0073730 2026-03-10T12:41:58.593 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.593+0000 7f95ae59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f95b0074d00 0x7f95b0198520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:58.593 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.593+0000 7f95aed9d700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f95b0073730 0x7f95b0197fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:58.593 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.593+0000 7f95ae59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f95b0074d00 0x7f95b0198520 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:40936/0 (socket says 192.168.123.100:40936) 2026-03-10T12:41:58.594 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.593+0000 7f95ae59c700 1 -- 192.168.123.100:0/2920541149 learned_addr learned my addr 192.168.123.100:0/2920541149 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:41:58.594 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.593+0000 7f95ae59c700 1 -- 192.168.123.100:0/2920541149 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f95b0073730 msgr2=0x7f95b0197fe0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:58.594 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.593+0000 7f95ae59c700 1 --2- 192.168.123.100:0/2920541149 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f95b0073730 0x7f95b0197fe0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:58.594 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.593+0000 7f95ae59c700 1 -- 192.168.123.100:0/2920541149 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f95980097e0 con 0x7f95b0074d00 2026-03-10T12:41:58.594 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.594+0000 7f95ae59c700 1 --2- 
192.168.123.100:0/2920541149 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f95b0074d00 0x7f95b0198520 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f95a000ed70 tx=0x7f95a000c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:58.595 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.594+0000 7f95a7fff700 1 -- 192.168.123.100:0/2920541149 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f95a000cd70 con 0x7f95b0074d00 2026-03-10T12:41:58.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.594+0000 7f95a7fff700 1 -- 192.168.123.100:0/2920541149 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f95a000eec0 con 0x7f95b0074d00 2026-03-10T12:41:58.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.594+0000 7f95a7fff700 1 -- 192.168.123.100:0/2920541149 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f95a00188b0 con 0x7f95b0074d00 2026-03-10T12:41:58.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.594+0000 7f95b50c6700 1 -- 192.168.123.100:0/2920541149 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f95b019d730 con 0x7f95b0074d00 2026-03-10T12:41:58.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.594+0000 7f95b50c6700 1 -- 192.168.123.100:0/2920541149 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f95b01002e0 con 0x7f95b0074d00 2026-03-10T12:41:58.596 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.596+0000 7f95a7fff700 1 -- 192.168.123.100:0/2920541149 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f95a0018a10 con 0x7f95b0074d00 2026-03-10T12:41:58.597 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.596+0000 
7f95b50c6700 1 -- 192.168.123.100:0/2920541149 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f95b0066e40 con 0x7f95b0074d00 2026-03-10T12:41:58.597 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.597+0000 7f95a7fff700 1 --2- 192.168.123.100:0/2920541149 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f959c0778c0 0x7f959c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:58.597 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.597+0000 7f95a7fff700 1 -- 192.168.123.100:0/2920541149 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(65..65 src has 1..65) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f95a0014070 con 0x7f95b0074d00 2026-03-10T12:41:58.600 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.599+0000 7f95a7fff700 1 -- 192.168.123.100:0/2920541149 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f95a0062b90 con 0x7f95b0074d00 2026-03-10T12:41:58.600 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.600+0000 7f95aed9d700 1 --2- 192.168.123.100:0/2920541149 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f959c0778c0 0x7f959c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:58.605 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.600+0000 7f95aed9d700 1 --2- 192.168.123.100:0/2920541149 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f959c0778c0 0x7f959c079d70 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f9598005200 tx=0x7f959801a040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:58.774 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.773+0000 7f95b50c6700 1 -- 192.168.123.100:0/2920541149 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f95b01005c0 con 0x7f95b0074d00 2026-03-10T12:41:58.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.774+0000 7f95a7fff700 1 -- 192.168.123.100:0/2920541149 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f95a00622e0 con 0x7f95b0074d00 2026-03-10T12:41:58.775 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:41:58.775 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:41:58.775 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:41:58.775 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:41:58.775 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:41:58.775 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:41:58.775 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:41:58.775 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 2026-03-10T12:41:58.775 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2, 2026-03-10T12:41:58.775 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 3 2026-03-10T12:41:58.775 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:41:58.776 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 2026-03-10T12:41:58.776 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4 2026-03-10T12:41:58.776 INFO:teuthology.orchestra.run.vm00.stdout: }, 
2026-03-10T12:41:58.776 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:41:58.776 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 6, 2026-03-10T12:41:58.776 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 7 2026-03-10T12:41:58.776 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:41:58.776 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:41:58.779 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.778+0000 7f95b50c6700 1 -- 192.168.123.100:0/2920541149 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f959c0778c0 msgr2=0x7f959c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:58.779 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.778+0000 7f95b50c6700 1 --2- 192.168.123.100:0/2920541149 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f959c0778c0 0x7f959c079d70 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f9598005200 tx=0x7f959801a040 comp rx=0 tx=0).stop 2026-03-10T12:41:58.779 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.778+0000 7f95b50c6700 1 -- 192.168.123.100:0/2920541149 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f95b0074d00 msgr2=0x7f95b0198520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:58.779 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.778+0000 7f95b50c6700 1 --2- 192.168.123.100:0/2920541149 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f95b0074d00 0x7f95b0198520 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f95a000ed70 tx=0x7f95a000c5b0 comp rx=0 tx=0).stop 2026-03-10T12:41:58.779 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.779+0000 7f95b50c6700 1 -- 192.168.123.100:0/2920541149 
shutdown_connections 2026-03-10T12:41:58.779 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.779+0000 7f95b50c6700 1 --2- 192.168.123.100:0/2920541149 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f959c0778c0 0x7f959c079d70 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:58.780 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.779+0000 7f95b50c6700 1 --2- 192.168.123.100:0/2920541149 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f95b0073730 0x7f95b0197fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:58.780 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.779+0000 7f95b50c6700 1 --2- 192.168.123.100:0/2920541149 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f95b0074d00 0x7f95b0198520 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:58.780 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.779+0000 7f95b50c6700 1 -- 192.168.123.100:0/2920541149 >> 192.168.123.100:0/2920541149 conn(0x7f95b00fbad0 msgr2=0x7f95b01063a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:58.780 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.779+0000 7f95b50c6700 1 -- 192.168.123.100:0/2920541149 shutdown_connections 2026-03-10T12:41:58.780 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.779+0000 7f95b50c6700 1 -- 192.168.123.100:0/2920541149 wait complete. 
2026-03-10T12:41:58.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:58 vm07.local ceph-mon[93622]: from='osd.3 [v2:192.168.123.107:6800/4276377159,v1:192.168.123.107:6801/4276377159]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T12:41:58.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:58 vm07.local ceph-mon[93622]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T12:41:58.816 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:41:58 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3[99893]: 2026-03-10T12:41:58.630+0000 7f806ba0e640 -1 osd.3 63 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T12:41:58.851 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:58 vm00.local ceph-mon[103263]: from='osd.3 [v2:192.168.123.107:6800/4276377159,v1:192.168.123.107:6801/4276377159]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T12:41:58.851 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:58 vm00.local ceph-mon[103263]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T12:41:58.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.850+0000 7f3a77aa2700 1 -- 192.168.123.100:0/693100351 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3a70102760 msgr2=0x7f3a70102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:58.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.850+0000 7f3a77aa2700 1 --2- 192.168.123.100:0/693100351 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3a70102760 0x7f3a70102b70 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f3a64009b00 tx=0x7f3a64009e10 comp rx=0 tx=0).stop 
2026-03-10T12:41:58.853 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.853+0000 7f3a77aa2700 1 -- 192.168.123.100:0/693100351 shutdown_connections 2026-03-10T12:41:58.853 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.853+0000 7f3a77aa2700 1 --2- 192.168.123.100:0/693100351 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3a70103960 0x7f3a70103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:58.853 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.853+0000 7f3a77aa2700 1 --2- 192.168.123.100:0/693100351 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3a70102760 0x7f3a70102b70 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:58.854 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.853+0000 7f3a77aa2700 1 -- 192.168.123.100:0/693100351 >> 192.168.123.100:0/693100351 conn(0x7f3a700fdcf0 msgr2=0x7f3a70100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:58.854 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.853+0000 7f3a77aa2700 1 -- 192.168.123.100:0/693100351 shutdown_connections 2026-03-10T12:41:58.854 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.853+0000 7f3a77aa2700 1 -- 192.168.123.100:0/693100351 wait complete. 
2026-03-10T12:41:58.854 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.854+0000 7f3a77aa2700 1 Processor -- start 2026-03-10T12:41:58.855 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.854+0000 7f3a77aa2700 1 -- start start 2026-03-10T12:41:58.855 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.854+0000 7f3a77aa2700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3a70102760 0x7f3a70198020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:58.855 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.854+0000 7f3a77aa2700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3a70103960 0x7f3a70198560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:58.855 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.854+0000 7f3a77aa2700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3a70198b80 con 0x7f3a70102760 2026-03-10T12:41:58.855 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.854+0000 7f3a77aa2700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3a70198cc0 con 0x7f3a70103960 2026-03-10T12:41:58.856 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.854+0000 7f3a7583e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3a70102760 0x7f3a70198020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:58.856 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.855+0000 7f3a7583e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3a70102760 0x7f3a70198020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:40954/0 (socket says 192.168.123.100:40954) 2026-03-10T12:41:58.856 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.855+0000 7f3a7583e700 1 -- 192.168.123.100:0/3118943218 learned_addr learned my addr 192.168.123.100:0/3118943218 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:41:58.856 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.855+0000 7f3a7583e700 1 -- 192.168.123.100:0/3118943218 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3a70103960 msgr2=0x7f3a70198560 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T12:41:58.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.855+0000 7f3a7583e700 1 --2- 192.168.123.100:0/3118943218 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3a70103960 0x7f3a70198560 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:58.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.855+0000 7f3a7583e700 1 -- 192.168.123.100:0/3118943218 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3a640097e0 con 0x7f3a70102760 2026-03-10T12:41:58.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.855+0000 7f3a7583e700 1 --2- 192.168.123.100:0/3118943218 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3a70102760 0x7f3a70198020 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f3a6400bb40 tx=0x7f3a6400bc20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:58.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.856+0000 7f3a62ffd700 1 -- 192.168.123.100:0/3118943218 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3a6401d070 con 0x7f3a70102760 2026-03-10T12:41:58.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.856+0000 7f3a77aa2700 1 -- 
192.168.123.100:0/3118943218 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3a700752d0 con 0x7f3a70102760 2026-03-10T12:41:58.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.856+0000 7f3a77aa2700 1 -- 192.168.123.100:0/3118943218 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3a700757c0 con 0x7f3a70102760 2026-03-10T12:41:58.858 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.857+0000 7f3a62ffd700 1 -- 192.168.123.100:0/3118943218 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3a64004cf0 con 0x7f3a70102760 2026-03-10T12:41:58.858 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.857+0000 7f3a62ffd700 1 -- 192.168.123.100:0/3118943218 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3a6400f650 con 0x7f3a70102760 2026-03-10T12:41:58.858 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.857+0000 7f3a62ffd700 1 -- 192.168.123.100:0/3118943218 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3a6400f870 con 0x7f3a70102760 2026-03-10T12:41:58.858 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.857+0000 7f3a62ffd700 1 --2- 192.168.123.100:0/3118943218 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3a5c077a60 0x7f3a5c079f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:58.858 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.857+0000 7f3a62ffd700 1 -- 192.168.123.100:0/3118943218 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(66..66 src has 1..66) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f3a6409c150 con 0x7f3a70102760 2026-03-10T12:41:58.859 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.858+0000 7f3a77aa2700 1 -- 192.168.123.100:0/3118943218 --> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3a54005320 con 0x7f3a70102760 2026-03-10T12:41:58.859 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.858+0000 7f3a7503d700 1 --2- 192.168.123.100:0/3118943218 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3a5c077a60 0x7f3a5c079f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:58.861 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.860+0000 7f3a62ffd700 1 -- 192.168.123.100:0/3118943218 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3a6406b080 con 0x7f3a70102760 2026-03-10T12:41:58.861 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:58.861+0000 7f3a7503d700 1 --2- 192.168.123.100:0/3118943218 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3a5c077a60 0x7f3a5c079f10 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f3a6c009ce0 tx=0x7f3a6c009450 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:59.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.002+0000 7f3a77aa2700 1 -- 192.168.123.100:0/3118943218 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f3a54006200 con 0x7f3a70102760 2026-03-10T12:41:59.004 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.003+0000 7f3a62ffd700 1 -- 192.168.123.100:0/3118943218 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 13 v13) v1 ==== 76+0+1961 (secure 0 0 0) 0x7f3a64027020 con 0x7f3a70102760 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:e13 2026-03-10T12:41:59.005 
INFO:teuthology.orchestra.run.vm00.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:epoch 13 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:37:36.646083+0000 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:max_xattr_size 65536 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:41:59.005 
INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307} 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:failed 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:qdb_cluster leader: 0 members: 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 export targets 1 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:41:59.005 
INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons: 2026-03-10T12:41:59.005 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:41:59.006 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:41:59.006 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:41:59.007 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.007+0000 7f3a77aa2700 1 -- 192.168.123.100:0/3118943218 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3a5c077a60 msgr2=0x7f3a5c079f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:59.007 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.007+0000 7f3a77aa2700 1 --2- 192.168.123.100:0/3118943218 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3a5c077a60 0x7f3a5c079f10 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f3a6c009ce0 tx=0x7f3a6c009450 comp rx=0 tx=0).stop 2026-03-10T12:41:59.007 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.007+0000 7f3a77aa2700 1 -- 192.168.123.100:0/3118943218 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3a70102760 msgr2=0x7f3a70198020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:59.007 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.007+0000 7f3a77aa2700 1 --2- 192.168.123.100:0/3118943218 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3a70102760 0x7f3a70198020 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto 
rx=0x7f3a6400bb40 tx=0x7f3a6400bc20 comp rx=0 tx=0).stop 2026-03-10T12:41:59.008 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.007+0000 7f3a77aa2700 1 -- 192.168.123.100:0/3118943218 shutdown_connections 2026-03-10T12:41:59.008 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.007+0000 7f3a77aa2700 1 --2- 192.168.123.100:0/3118943218 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3a5c077a60 0x7f3a5c079f10 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.008 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.007+0000 7f3a77aa2700 1 --2- 192.168.123.100:0/3118943218 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3a70102760 0x7f3a70198020 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.008 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.007+0000 7f3a77aa2700 1 --2- 192.168.123.100:0/3118943218 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3a70103960 0x7f3a70198560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.008 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.007+0000 7f3a77aa2700 1 -- 192.168.123.100:0/3118943218 >> 192.168.123.100:0/3118943218 conn(0x7f3a700fdcf0 msgr2=0x7f3a70106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:59.008 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.007+0000 7f3a77aa2700 1 -- 192.168.123.100:0/3118943218 shutdown_connections 2026-03-10T12:41:59.008 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.007+0000 7f3a77aa2700 1 -- 192.168.123.100:0/3118943218 wait complete. 
2026-03-10T12:41:59.008 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 13 2026-03-10T12:41:59.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.083+0000 7f99f7e78700 1 -- 192.168.123.100:0/168973471 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99f00fee80 msgr2=0x7f99f01012a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:59.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.083+0000 7f99f7e78700 1 --2- 192.168.123.100:0/168973471 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99f00fee80 0x7f99f01012a0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f99e4009b00 tx=0x7f99e4009e10 comp rx=0 tx=0).stop 2026-03-10T12:41:59.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.083+0000 7f99f7e78700 1 -- 192.168.123.100:0/168973471 shutdown_connections 2026-03-10T12:41:59.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.083+0000 7f99f7e78700 1 --2- 192.168.123.100:0/168973471 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f99f01017e0 0x7f99f0103c60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.083+0000 7f99f7e78700 1 --2- 192.168.123.100:0/168973471 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99f00fee80 0x7f99f01012a0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.083+0000 7f99f7e78700 1 -- 192.168.123.100:0/168973471 >> 192.168.123.100:0/168973471 conn(0x7f99f00faa70 msgr2=0x7f99f00fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:59.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.084+0000 7f99f7e78700 1 -- 192.168.123.100:0/168973471 shutdown_connections 2026-03-10T12:41:59.084 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.084+0000 7f99f7e78700 1 -- 192.168.123.100:0/168973471 wait complete. 2026-03-10T12:41:59.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.084+0000 7f99f7e78700 1 Processor -- start 2026-03-10T12:41:59.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.085+0000 7f99f7e78700 1 -- start start 2026-03-10T12:41:59.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.085+0000 7f99f7e78700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f99f01017e0 0x7f99f019c640 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:59.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.085+0000 7f99f7e78700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99f019cb80 0x7f99f01a1bf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:59.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.085+0000 7f99f7e78700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f99f019d080 con 0x7f99f01017e0 2026-03-10T12:41:59.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.085+0000 7f99f7e78700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f99f019d1f0 con 0x7f99f019cb80 2026-03-10T12:41:59.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.085+0000 7f99f5413700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99f019cb80 0x7f99f01a1bf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:59.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.085+0000 7f99f5413700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99f019cb80 0x7f99f01a1bf0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:53164/0 (socket says 192.168.123.100:53164) 2026-03-10T12:41:59.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.085+0000 7f99f5413700 1 -- 192.168.123.100:0/3191498843 learned_addr learned my addr 192.168.123.100:0/3191498843 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:41:59.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.085+0000 7f99f5413700 1 -- 192.168.123.100:0/3191498843 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f99f01017e0 msgr2=0x7f99f019c640 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:41:59.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.086+0000 7f99f5c14700 1 --2- 192.168.123.100:0/3191498843 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f99f01017e0 0x7f99f019c640 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:59.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.086+0000 7f99f5413700 1 --2- 192.168.123.100:0/3191498843 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f99f01017e0 0x7f99f019c640 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.086+0000 7f99f5413700 1 -- 192.168.123.100:0/3191498843 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f99ec009710 con 0x7f99f019cb80 2026-03-10T12:41:59.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.086+0000 7f99f5413700 1 --2- 192.168.123.100:0/3191498843 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99f019cb80 0x7f99f01a1bf0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto 
rx=0x7f99ec00ec80 tx=0x7f99ec00c5b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:59.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.086+0000 7f99e2ffd700 1 -- 192.168.123.100:0/3191498843 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f99ec00cd50 con 0x7f99f019cb80 2026-03-10T12:41:59.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.086+0000 7f99e2ffd700 1 -- 192.168.123.100:0/3191498843 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f99ec00ceb0 con 0x7f99f019cb80 2026-03-10T12:41:59.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.086+0000 7f99f7e78700 1 -- 192.168.123.100:0/3191498843 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f99e40097e0 con 0x7f99f019cb80 2026-03-10T12:41:59.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.087+0000 7f99e2ffd700 1 -- 192.168.123.100:0/3191498843 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f99ec005380 con 0x7f99f019cb80 2026-03-10T12:41:59.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.087+0000 7f99f7e78700 1 -- 192.168.123.100:0/3191498843 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f99f01a2520 con 0x7f99f019cb80 2026-03-10T12:41:59.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.087+0000 7f99f5c14700 1 --2- 192.168.123.100:0/3191498843 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f99f01017e0 0x7f99f019c640 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T12:41:59.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.088+0000 7f99f7e78700 1 -- 192.168.123.100:0/3191498843 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f99f0066e40 con 0x7f99f019cb80 2026-03-10T12:41:59.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.089+0000 7f99e2ffd700 1 -- 192.168.123.100:0/3191498843 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f99ec005630 con 0x7f99f019cb80 2026-03-10T12:41:59.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.089+0000 7f99e2ffd700 1 --2- 192.168.123.100:0/3191498843 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f99dc077870 0x7f99dc079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:59.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.089+0000 7f99e2ffd700 1 -- 192.168.123.100:0/3191498843 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(66..66 src has 1..66) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f99ec014070 con 0x7f99f019cb80 2026-03-10T12:41:59.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.090+0000 7f99f5c14700 1 --2- 192.168.123.100:0/3191498843 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f99dc077870 0x7f99dc079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:59.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.090+0000 7f99f5c14700 1 --2- 192.168.123.100:0/3191498843 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f99dc077870 0x7f99dc079d20 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f99e4005230 tx=0x7f99e4005fd0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:41:59.092 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.092+0000 7f99e2ffd700 1 -- 192.168.123.100:0/3191498843 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f99ec09e050 con 0x7f99f019cb80 2026-03-10T12:41:59.219 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.218+0000 7f99f7e78700 1 -- 192.168.123.100:0/3191498843 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f99f01024c0 con 0x7f99dc077870 2026-03-10T12:41:59.221 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.220+0000 7f99e2ffd700 1 -- 192.168.123.100:0/3191498843 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f99f01024c0 con 0x7f99dc077870 2026-03-10T12:41:59.222 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:41:59.222 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T12:41:59.222 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true, 2026-03-10T12:41:59.222 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T12:41:59.222 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [ 2026-03-10T12:41:59.222 INFO:teuthology.orchestra.run.vm00.stdout: "crash", 2026-03-10T12:41:59.222 INFO:teuthology.orchestra.run.vm00.stdout: "mgr", 2026-03-10T12:41:59.222 INFO:teuthology.orchestra.run.vm00.stdout: "mon" 2026-03-10T12:41:59.222 INFO:teuthology.orchestra.run.vm00.stdout: ], 2026-03-10T12:41:59.222 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "10/23 daemons upgraded", 2026-03-10T12:41:59.222 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Currently upgrading osd 
daemons", 2026-03-10T12:41:59.222 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false 2026-03-10T12:41:59.222 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:41:59.225 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.225+0000 7f99f7e78700 1 -- 192.168.123.100:0/3191498843 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f99dc077870 msgr2=0x7f99dc079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:59.225 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.225+0000 7f99f7e78700 1 --2- 192.168.123.100:0/3191498843 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f99dc077870 0x7f99dc079d20 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f99e4005230 tx=0x7f99e4005fd0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.225 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.225+0000 7f99f7e78700 1 -- 192.168.123.100:0/3191498843 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99f019cb80 msgr2=0x7f99f01a1bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:59.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.225+0000 7f99f7e78700 1 --2- 192.168.123.100:0/3191498843 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99f019cb80 0x7f99f01a1bf0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f99ec00ec80 tx=0x7f99ec00c5b0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.225+0000 7f99f7e78700 1 -- 192.168.123.100:0/3191498843 shutdown_connections 2026-03-10T12:41:59.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.225+0000 7f99f7e78700 1 --2- 192.168.123.100:0/3191498843 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f99dc077870 0x7f99dc079d20 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.226 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.226+0000 7f99f7e78700 1 --2- 192.168.123.100:0/3191498843 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f99f01017e0 0x7f99f019c640 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.226+0000 7f99f7e78700 1 --2- 192.168.123.100:0/3191498843 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f99f019cb80 0x7f99f01a1bf0 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.226+0000 7f99f7e78700 1 -- 192.168.123.100:0/3191498843 >> 192.168.123.100:0/3191498843 conn(0x7f99f00faa70 msgr2=0x7f99f00fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:59.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.226+0000 7f99f7e78700 1 -- 192.168.123.100:0/3191498843 shutdown_connections 2026-03-10T12:41:59.226 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.226+0000 7f99f7e78700 1 -- 192.168.123.100:0/3191498843 wait complete. 
2026-03-10T12:41:59.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.295+0000 7fe388926700 1 -- 192.168.123.100:0/3844130547 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe3841036f0 msgr2=0x7fe384105ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:59.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.295+0000 7fe388926700 1 --2- 192.168.123.100:0/3844130547 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe3841036f0 0x7fe384105ad0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fe374009b00 tx=0x7fe374009e10 comp rx=0 tx=0).stop 2026-03-10T12:41:59.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.295+0000 7fe388926700 1 -- 192.168.123.100:0/3844130547 shutdown_connections 2026-03-10T12:41:59.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.295+0000 7fe388926700 1 --2- 192.168.123.100:0/3844130547 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe3841036f0 0x7fe384105ad0 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.295+0000 7fe388926700 1 --2- 192.168.123.100:0/3844130547 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe384100dd0 0x7fe3841031b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.295+0000 7fe388926700 1 -- 192.168.123.100:0/3844130547 >> 192.168.123.100:0/3844130547 conn(0x7fe3840fa7b0 msgr2=0x7fe3840fcc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:59.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.295+0000 7fe388926700 1 -- 192.168.123.100:0/3844130547 shutdown_connections 2026-03-10T12:41:59.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.295+0000 7fe388926700 1 -- 192.168.123.100:0/3844130547 
wait complete. 2026-03-10T12:41:59.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.296+0000 7fe388926700 1 Processor -- start 2026-03-10T12:41:59.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.296+0000 7fe388926700 1 -- start start 2026-03-10T12:41:59.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.297+0000 7fe388926700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe384100dd0 0x7fe3841939a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:59.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.297+0000 7fe388926700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe3841036f0 0x7fe384193ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:59.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.297+0000 7fe388926700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe384194500 con 0x7fe384100dd0 2026-03-10T12:41:59.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.297+0000 7fe388926700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe384194640 con 0x7fe3841036f0 2026-03-10T12:41:59.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.297+0000 7fe382d9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe384100dd0 0x7fe3841939a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:59.297 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.297+0000 7fe382d9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe384100dd0 0x7fe3841939a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 
says I am v2:192.168.123.100:40992/0 (socket says 192.168.123.100:40992) 2026-03-10T12:41:59.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.297+0000 7fe382d9d700 1 -- 192.168.123.100:0/3083827689 learned_addr learned my addr 192.168.123.100:0/3083827689 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:41:59.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.297+0000 7fe382d9d700 1 -- 192.168.123.100:0/3083827689 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe3841036f0 msgr2=0x7fe384193ee0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:41:59.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.297+0000 7fe382d9d700 1 --2- 192.168.123.100:0/3083827689 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe3841036f0 0x7fe384193ee0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.297+0000 7fe382d9d700 1 -- 192.168.123.100:0/3083827689 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe3740097e0 con 0x7fe384100dd0 2026-03-10T12:41:59.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.297+0000 7fe382d9d700 1 --2- 192.168.123.100:0/3083827689 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe384100dd0 0x7fe3841939a0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fe37c007f00 tx=0x7fe37c00d3b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:59.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.298+0000 7fe36bfff700 1 -- 192.168.123.100:0/3083827689 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe37c00dcf0 con 0x7fe384100dd0 2026-03-10T12:41:59.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.298+0000 7fe388926700 1 -- 
192.168.123.100:0/3083827689 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe3841990a0 con 0x7fe384100dd0 2026-03-10T12:41:59.299 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.299+0000 7fe388926700 1 -- 192.168.123.100:0/3083827689 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe3841995f0 con 0x7fe384100dd0 2026-03-10T12:41:59.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.299+0000 7fe36bfff700 1 -- 192.168.123.100:0/3083827689 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe37c00f040 con 0x7fe384100dd0 2026-03-10T12:41:59.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.299+0000 7fe36bfff700 1 -- 192.168.123.100:0/3083827689 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe37c0127c0 con 0x7fe384100dd0 2026-03-10T12:41:59.300 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.299+0000 7fe36bfff700 1 -- 192.168.123.100:0/3083827689 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe37c0129a0 con 0x7fe384100dd0 2026-03-10T12:41:59.301 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.300+0000 7fe36bfff700 1 --2- 192.168.123.100:0/3083827689 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fe36c07be70 0x7fe36c07e320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:41:59.301 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.300+0000 7fe38259c700 1 --2- 192.168.123.100:0/3083827689 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fe36c07be70 0x7fe36c07e320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:41:59.301 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.300+0000 7fe36bfff700 1 -- 192.168.123.100:0/3083827689 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(66..66 src has 1..66) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fe37c09e170 con 0x7fe384100dd0 2026-03-10T12:41:59.301 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.301+0000 7fe38259c700 1 --2- 192.168.123.100:0/3083827689 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fe36c07be70 0x7fe36c07e320 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fe374005f50 tx=0x7fe374005dc0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:41:59.301 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.301+0000 7fe388926700 1 -- 192.168.123.100:0/3083827689 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe370005320 con 0x7fe384100dd0 2026-03-10T12:41:59.304 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.303+0000 7fe36bfff700 1 -- 192.168.123.100:0/3083827689 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe37c09f050 con 0x7fe384100dd0 2026-03-10T12:41:59.467 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.466+0000 7fe388926700 1 -- 192.168.123.100:0/3083827689 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fe370005190 con 0x7fe384100dd0 2026-03-10T12:41:59.467 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.467+0000 7fe36bfff700 1 -- 192.168.123.100:0/3083827689 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1323 (secure 0 0 0) 0x7fe37c062620 con 0x7fe384100dd0 2026-03-10T12:41:59.468 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_WARN 1 
osds down; Degraded data redundancy: 69/333 objects degraded (20.721%), 19 pgs degraded 2026-03-10T12:41:59.468 INFO:teuthology.orchestra.run.vm00.stdout:[WRN] OSD_DOWN: 1 osds down 2026-03-10T12:41:59.468 INFO:teuthology.orchestra.run.vm00.stdout: osd.3 (root=default,host=vm07) is down 2026-03-10T12:41:59.468 INFO:teuthology.orchestra.run.vm00.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 69/333 objects degraded (20.721%), 19 pgs degraded 2026-03-10T12:41:59.468 INFO:teuthology.orchestra.run.vm00.stdout: pg 1.0 is active+undersized+degraded, acting [0,1] 2026-03-10T12:41:59.468 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.0 is active+undersized+degraded, acting [1,0] 2026-03-10T12:41:59.468 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.5 is active+undersized+degraded, acting [0,4] 2026-03-10T12:41:59.468 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.6 is active+undersized+degraded, acting [1,4] 2026-03-10T12:41:59.468 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.7 is active+undersized+degraded, acting [4,2] 2026-03-10T12:41:59.468 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.8 is active+undersized+degraded, acting [5,0] 2026-03-10T12:41:59.468 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.a is active+undersized+degraded, acting [1,4] 2026-03-10T12:41:59.469 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.b is active+undersized+degraded, acting [4,5] 2026-03-10T12:41:59.469 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.d is active+undersized+degraded, acting [1,2] 2026-03-10T12:41:59.469 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.e is active+undersized+degraded, acting [2,0] 2026-03-10T12:41:59.469 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.11 is active+undersized+degraded, acting [4,1] 2026-03-10T12:41:59.469 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.12 is active+undersized+degraded, acting [1,0] 2026-03-10T12:41:59.469 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.14 is active+undersized+degraded, acting [4,5] 
2026-03-10T12:41:59.469 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.15 is active+undersized+degraded, acting [1,0] 2026-03-10T12:41:59.469 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.16 is active+undersized+degraded, acting [5,2] 2026-03-10T12:41:59.469 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.18 is active+undersized+degraded, acting [5,4] 2026-03-10T12:41:59.469 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.1a is active+undersized+degraded, acting [4,5] 2026-03-10T12:41:59.469 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.1d is active+undersized+degraded, acting [5,0] 2026-03-10T12:41:59.469 INFO:teuthology.orchestra.run.vm00.stdout: pg 2.1f is active+undersized+degraded, acting [0,4] 2026-03-10T12:41:59.471 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.470+0000 7fe388926700 1 -- 192.168.123.100:0/3083827689 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fe36c07be70 msgr2=0x7fe36c07e320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:59.471 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.471+0000 7fe388926700 1 --2- 192.168.123.100:0/3083827689 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fe36c07be70 0x7fe36c07e320 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fe374005f50 tx=0x7fe374005dc0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.471 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.471+0000 7fe388926700 1 -- 192.168.123.100:0/3083827689 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe384100dd0 msgr2=0x7fe3841939a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:41:59.471 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.471+0000 7fe388926700 1 --2- 192.168.123.100:0/3083827689 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe384100dd0 0x7fe3841939a0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fe37c007f00 
tx=0x7fe37c00d3b0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.474 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.474+0000 7fe388926700 1 -- 192.168.123.100:0/3083827689 shutdown_connections 2026-03-10T12:41:59.474 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.474+0000 7fe388926700 1 --2- 192.168.123.100:0/3083827689 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fe36c07be70 0x7fe36c07e320 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.475 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.474+0000 7fe388926700 1 --2- 192.168.123.100:0/3083827689 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe384100dd0 0x7fe3841939a0 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.475 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.474+0000 7fe388926700 1 --2- 192.168.123.100:0/3083827689 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe3841036f0 0x7fe384193ee0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:41:59.475 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.474+0000 7fe388926700 1 -- 192.168.123.100:0/3083827689 >> 192.168.123.100:0/3083827689 conn(0x7fe3840fa7b0 msgr2=0x7fe3840fcc20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:41:59.475 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.474+0000 7fe388926700 1 -- 192.168.123.100:0/3083827689 shutdown_connections 2026-03-10T12:41:59.475 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:41:59.475+0000 7fe388926700 1 -- 192.168.123.100:0/3083827689 wait complete. 
2026-03-10T12:41:59.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:59 vm00.local ceph-mon[103263]: from='client.44211 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:41:59.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:59 vm00.local ceph-mon[103263]: from='client.34278 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:41:59.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:59 vm00.local ceph-mon[103263]: from='client.34282 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:41:59.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:59 vm00.local ceph-mon[103263]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T12:41:59.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:59 vm00.local ceph-mon[103263]: osdmap e66: 6 total, 5 up, 6 in 2026-03-10T12:41:59.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:59 vm00.local ceph-mon[103263]: from='osd.3 [v2:192.168.123.107:6800/4276377159,v1:192.168.123.107:6801/4276377159]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:41:59.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:59 vm00.local ceph-mon[103263]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:41:59.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:59 vm00.local ceph-mon[103263]: pgmap v114: 65 pgs: 19 active+undersized, 19 active+undersized+degraded, 27 active+clean; 269 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 69/333 objects degraded (20.721%) 2026-03-10T12:41:59.734 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:59 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/2920541149' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:41:59.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:59 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/3118943218' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:41:59.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:59 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 69/333 objects degraded (20.721%), 19 pgs degraded (PG_DEGRADED) 2026-03-10T12:41:59.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:41:59 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/3083827689' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:42:00.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:59 vm07.local ceph-mon[93622]: from='client.44211 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:42:00.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:59 vm07.local ceph-mon[93622]: from='client.34278 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:42:00.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:59 vm07.local ceph-mon[93622]: from='client.34282 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:42:00.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:59 vm07.local ceph-mon[93622]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T12:42:00.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:59 vm07.local ceph-mon[93622]: osdmap e66: 6 total, 5 up, 6 in 2026-03-10T12:42:00.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:59 
vm07.local ceph-mon[93622]: from='osd.3 [v2:192.168.123.107:6800/4276377159,v1:192.168.123.107:6801/4276377159]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:42:00.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:59 vm07.local ceph-mon[93622]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:42:00.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:59 vm07.local ceph-mon[93622]: pgmap v114: 65 pgs: 19 active+undersized, 19 active+undersized+degraded, 27 active+clean; 269 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 69/333 objects degraded (20.721%) 2026-03-10T12:42:00.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:59 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/2920541149' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:42:00.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:59 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/3118943218' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:42:00.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:59 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 69/333 objects degraded (20.721%), 19 pgs degraded (PG_DEGRADED) 2026-03-10T12:42:00.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:41:59 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/3083827689' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:42:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:00 vm00.local ceph-mon[103263]: from='client.44227 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:42:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:00 vm00.local ceph-mon[103263]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T12:42:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:00 vm00.local ceph-mon[103263]: osd.3 [v2:192.168.123.107:6800/4276377159,v1:192.168.123.107:6801/4276377159] boot 2026-03-10T12:42:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:00 vm00.local ceph-mon[103263]: osdmap e67: 6 total, 6 up, 6 in 2026-03-10T12:42:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T12:42:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:00 vm07.local ceph-mon[93622]: from='client.44227 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:42:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:00 vm07.local ceph-mon[93622]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T12:42:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:00 vm07.local ceph-mon[93622]: osd.3 [v2:192.168.123.107:6800/4276377159,v1:192.168.123.107:6801/4276377159] boot 2026-03-10T12:42:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:00 vm07.local ceph-mon[93622]: osdmap e67: 6 total, 6 up, 6 in 2026-03-10T12:42:01.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:00 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": 
"osd metadata", "id": 3}]: dispatch 2026-03-10T12:42:01.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:01 vm00.local ceph-mon[103263]: osdmap e68: 6 total, 6 up, 6 in 2026-03-10T12:42:01.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:01 vm00.local ceph-mon[103263]: pgmap v117: 65 pgs: 4 peering, 16 active+undersized, 18 active+undersized+degraded, 27 active+clean; 269 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 66/333 objects degraded (19.820%) 2026-03-10T12:42:01.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:01.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:42:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:01 vm07.local ceph-mon[93622]: osdmap e68: 6 total, 6 up, 6 in 2026-03-10T12:42:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:01 vm07.local ceph-mon[93622]: pgmap v117: 65 pgs: 4 peering, 16 active+undersized, 18 active+undersized+degraded, 27 active+clean; 269 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 66/333 objects degraded (19.820%) 2026-03-10T12:42:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:42:04.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:03 vm07.local ceph-mon[93622]: pgmap v118: 65 pgs: 4 peering, 11 active+undersized, 15 active+undersized+degraded, 35 active+clean; 269 MiB data, 
1.9 GiB used, 118 GiB / 120 GiB avail; 57/333 objects degraded (17.117%) 2026-03-10T12:42:04.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:03 vm00.local ceph-mon[103263]: pgmap v118: 65 pgs: 4 peering, 11 active+undersized, 15 active+undersized+degraded, 35 active+clean; 269 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 57/333 objects degraded (17.117%) 2026-03-10T12:42:05.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:04 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 57/333 objects degraded (17.117%), 15 pgs degraded (PG_DEGRADED) 2026-03-10T12:42:05.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:04 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 57/333 objects degraded (17.117%), 15 pgs degraded (PG_DEGRADED) 2026-03-10T12:42:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:05 vm07.local ceph-mon[93622]: pgmap v119: 65 pgs: 4 peering, 61 active+clean; 269 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:42:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:05 vm07.local ceph-mon[93622]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 57/333 objects degraded (17.117%), 15 pgs degraded) 2026-03-10T12:42:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:05 vm07.local ceph-mon[93622]: Cluster is now healthy 2026-03-10T12:42:06.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:05 vm00.local ceph-mon[103263]: pgmap v119: 65 pgs: 4 peering, 61 active+clean; 269 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:42:06.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:05 vm00.local ceph-mon[103263]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 57/333 objects degraded (17.117%), 15 pgs degraded) 2026-03-10T12:42:06.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:05 vm00.local ceph-mon[103263]: Cluster is now healthy 2026-03-10T12:42:08.066 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:07 vm07.local ceph-mon[93622]: pgmap v120: 65 pgs: 65 active+clean; 269 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:42:08.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:07 vm07.local ceph-mon[93622]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:42:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:07 vm00.local ceph-mon[103263]: pgmap v120: 65 pgs: 65 active+clean; 269 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:42:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:07 vm00.local ceph-mon[103263]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:42:09.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:09 vm00.local ceph-mon[103263]: pgmap v121: 65 pgs: 65 active+clean; 269 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:42:10.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:09 vm07.local ceph-mon[93622]: pgmap v121: 65 pgs: 65 active+clean; 269 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail 2026-03-10T12:42:11.080 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T12:42:11.080 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:11.080 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T12:42:11.081 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:10 vm07.local ceph-mon[93622]: from='mgr.24563 
192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:42:11.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T12:42:11.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:11.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T12:42:11.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:42:11.816 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:11 vm07.local systemd[1]: Stopping Ceph osd.4 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 
2026-03-10T12:42:11.816 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:11 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4[69961]: 2026-03-10T12:42:11.529+0000 7f18d443a700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0
2026-03-10T12:42:11.816 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:11 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4[69961]: 2026-03-10T12:42:11.529+0000 7f18d443a700 -1 osd.4 68 *** Got signal Terminated ***
2026-03-10T12:42:11.816 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:11 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4[69961]: 2026-03-10T12:42:11.529+0000 7f18d443a700 -1 osd.4 68 *** Immediate shutdown (osd_fast_shutdown=true) ***
2026-03-10T12:42:12.098 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:11 vm07.local podman[103138]: 2026-03-10 12:42:11.915651817 +0000 UTC m=+0.402705684 container died 5c66bfb63a835e3700ff0fb2d5564419c731345103409fe87acda6b641fbcdcd (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4, RELEASE=HEAD, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.build-date=20231212, ceph=True, org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.label-schema.schema-version=1.0)
2026-03-10T12:42:12.098 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:11 vm07.local podman[103138]: 2026-03-10 12:42:11.935464966 +0000 UTC m=+0.422518833 container remove 5c66bfb63a835e3700ff0fb2d5564419c731345103409fe87acda6b641fbcdcd (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.build-date=20231212, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, RELEASE=HEAD, org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, GIT_CLEAN=True, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.0)
2026-03-10T12:42:12.098 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:11 vm07.local bash[103138]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4
2026-03-10T12:42:12.098 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local podman[103205]: 2026-03-10 12:42:12.066651977 +0000 UTC m=+0.016899906 container create 83a59cdb2cbf0db211a17f5c681845ec70cb80a93ee9d6c915cd500d48abbdad (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-deactivate, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
2026-03-10T12:42:12.098 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:11 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch
2026-03-10T12:42:12.098 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:11 vm07.local ceph-mon[93622]: Upgrade: osd.4 is safe to restart
2026-03-10T12:42:12.098 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:11 vm07.local ceph-mon[93622]: Upgrade: Updating osd.4
2026-03-10T12:42:12.098 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:11 vm07.local ceph-mon[93622]: Deploying daemon osd.4 on vm07
2026-03-10T12:42:12.098 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:11 vm07.local ceph-mon[93622]: pgmap v122: 65 pgs: 65 active+clean; 269 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail
2026-03-10T12:42:12.098 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:11 vm07.local ceph-mon[93622]: osd.4 marked itself down and dead
2026-03-10T12:42:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:11 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch
2026-03-10T12:42:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:11 vm00.local ceph-mon[103263]: Upgrade: osd.4 is safe to restart
2026-03-10T12:42:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:11 vm00.local ceph-mon[103263]: Upgrade: Updating osd.4
2026-03-10T12:42:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:11 vm00.local ceph-mon[103263]: Deploying daemon osd.4 on vm07
2026-03-10T12:42:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:11 vm00.local ceph-mon[103263]: pgmap v122: 65 pgs: 65 active+clean; 269 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail
2026-03-10T12:42:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:11 vm00.local ceph-mon[103263]: osd.4 marked itself down and dead
2026-03-10T12:42:12.416 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local podman[103205]: 2026-03-10 12:42:12.116976982 +0000 UTC m=+0.067224911 container init 83a59cdb2cbf0db211a17f5c681845ec70cb80a93ee9d6c915cd500d48abbdad (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-deactivate, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
2026-03-10T12:42:12.416 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local podman[103205]: 2026-03-10 12:42:12.119659552 +0000 UTC m=+0.069907492 container start 83a59cdb2cbf0db211a17f5c681845ec70cb80a93ee9d6c915cd500d48abbdad (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-deactivate, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/)
2026-03-10T12:42:12.416 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local podman[103205]: 2026-03-10 12:42:12.120520123 +0000 UTC m=+0.070768052 container attach 83a59cdb2cbf0db211a17f5c681845ec70cb80a93ee9d6c915cd500d48abbdad (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-deactivate, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True)
2026-03-10T12:42:12.417 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local podman[103205]: 2026-03-10 12:42:12.058071906 +0000 UTC m=+0.008319845 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-10T12:42:12.417 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local podman[103224]: 2026-03-10 12:42:12.274548732 +0000 UTC m=+0.010260379 container died 83a59cdb2cbf0db211a17f5c681845ec70cb80a93ee9d6c915cd500d48abbdad (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-deactivate, CEPH_REF=squid, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git)
2026-03-10T12:42:12.417 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local podman[103224]: 2026-03-10 12:42:12.295385537 +0000 UTC m=+0.031097173 container remove 83a59cdb2cbf0db211a17f5c681845ec70cb80a93ee9d6c915cd500d48abbdad (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223)
2026-03-10T12:42:12.417 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.4.service: Deactivated successfully.
2026-03-10T12:42:12.417 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local systemd[1]: Stopped Ceph osd.4 for 1a52002a-1c7d-11f1-af82-51cdd81caea8.
2026-03-10T12:42:12.417 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.4.service: Consumed 44.658s CPU time.
2026-03-10T12:42:12.817 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local systemd[1]: Starting Ceph osd.4 for 1a52002a-1c7d-11f1-af82-51cdd81caea8...
2026-03-10T12:42:12.817 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local podman[103310]: 2026-03-10 12:42:12.620334574 +0000 UTC m=+0.023527391 container create b7b16864a6c1dd6de6541a63cc055bf4e905dbacd847cd9906dd3b4f60c4ae68 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.license=GPLv2)
2026-03-10T12:42:12.817 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local podman[103310]: 2026-03-10 12:42:12.666241539 +0000 UTC m=+0.069434356 container init b7b16864a6c1dd6de6541a63cc055bf4e905dbacd847cd9906dd3b4f60c4ae68 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-10T12:42:12.817 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local podman[103310]: 2026-03-10 12:42:12.669490261 +0000 UTC m=+0.072683067 container start b7b16864a6c1dd6de6541a63cc055bf4e905dbacd847cd9906dd3b4f60c4ae68 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
2026-03-10T12:42:12.817 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local podman[103310]: 2026-03-10 12:42:12.67114081 +0000 UTC m=+0.074333627 container attach b7b16864a6c1dd6de6541a63cc055bf4e905dbacd847cd9906dd3b4f60c4ae68 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate, ceph=True, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df)
2026-03-10T12:42:12.817 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local podman[103310]: 2026-03-10 12:42:12.607314562 +0000 UTC m=+0.010507379 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-10T12:42:12.817 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate[103322]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:42:12.817 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local bash[103310]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:42:12.817 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate[103322]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:42:12.817 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:12 vm07.local bash[103310]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:42:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:12 vm00.local ceph-mon[103263]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-10T12:42:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:12 vm00.local ceph-mon[103263]: osdmap e69: 6 total, 5 up, 6 in
2026-03-10T12:42:13.283 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:12 vm07.local ceph-mon[93622]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-10T12:42:13.283 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:12 vm07.local ceph-mon[93622]: osdmap e69: 6 total, 5 up, 6 in
2026-03-10T12:42:13.566 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate[103322]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-10T12:42:13.566 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local bash[103310]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-10T12:42:13.567 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate[103322]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:42:13.567 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local bash[103310]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:42:13.567 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate[103322]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:42:13.567 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local bash[103310]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-10T12:42:13.567 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate[103322]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
2026-03-10T12:42:13.567 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local bash[103310]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
2026-03-10T12:42:13.567 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate[103322]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-be6a8b69-0cef-4c47-ba22-2c4feb5bedef/osd-block-ff03ddab-6945-46b4-b19b-30775ca85618 --path /var/lib/ceph/osd/ceph-4 --no-mon-config
2026-03-10T12:42:13.567 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local bash[103310]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-be6a8b69-0cef-4c47-ba22-2c4feb5bedef/osd-block-ff03ddab-6945-46b4-b19b-30775ca85618 --path /var/lib/ceph/osd/ceph-4 --no-mon-config
2026-03-10T12:42:14.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:13 vm07.local ceph-mon[93622]: pgmap v124: 65 pgs: 5 peering, 5 stale+active+clean, 55 active+clean; 269 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail
2026-03-10T12:42:14.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:13 vm07.local ceph-mon[93622]: Health check failed: Reduced data availability: 1 pg peering (PG_AVAILABILITY)
2026-03-10T12:42:14.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:13 vm07.local ceph-mon[93622]: osdmap e70: 6 total, 5 up, 6 in
2026-03-10T12:42:14.067 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate[103322]: Running command: /usr/bin/ln -snf /dev/ceph-be6a8b69-0cef-4c47-ba22-2c4feb5bedef/osd-block-ff03ddab-6945-46b4-b19b-30775ca85618 /var/lib/ceph/osd/ceph-4/block
2026-03-10T12:42:14.067 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local bash[103310]: Running command: /usr/bin/ln -snf /dev/ceph-be6a8b69-0cef-4c47-ba22-2c4feb5bedef/osd-block-ff03ddab-6945-46b4-b19b-30775ca85618 /var/lib/ceph/osd/ceph-4/block
2026-03-10T12:42:14.067 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate[103322]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block
2026-03-10T12:42:14.067 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local bash[103310]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block
2026-03-10T12:42:14.067 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate[103322]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
2026-03-10T12:42:14.067 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local bash[103310]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
2026-03-10T12:42:14.067 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate[103322]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
2026-03-10T12:42:14.067 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local bash[103310]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
2026-03-10T12:42:14.067 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate[103322]: --> ceph-volume lvm activate successful for osd ID: 4
2026-03-10T12:42:14.067 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local bash[103310]: --> ceph-volume lvm activate successful for osd ID: 4
2026-03-10T12:42:14.067 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local podman[103525]: 2026-03-10 12:42:13.666370498 +0000 UTC m=+0.010163236 container died b7b16864a6c1dd6de6541a63cc055bf4e905dbacd847cd9906dd3b4f60c4ae68 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
2026-03-10T12:42:14.068 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local podman[103525]: 2026-03-10 12:42:13.6844069 +0000 UTC m=+0.028199638 container remove b7b16864a6c1dd6de6541a63cc055bf4e905dbacd847cd9906dd3b4f60c4ae68 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
2026-03-10T12:42:14.068 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local podman[103564]: 2026-03-10 12:42:13.780061949 +0000 UTC m=+0.020174407 container create 7ac87e1c2a4169504cccb779482cb077f05d40d198bb42a29fecf6e153b05972 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git)
2026-03-10T12:42:14.068 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local podman[103564]: 2026-03-10 12:42:13.820403564 +0000 UTC m=+0.060516023 container init 7ac87e1c2a4169504cccb779482cb077f05d40d198bb42a29fecf6e153b05972 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
2026-03-10T12:42:14.068 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local podman[103564]: 2026-03-10 12:42:13.823590601 +0000 UTC m=+0.063703059 container start 7ac87e1c2a4169504cccb779482cb077f05d40d198bb42a29fecf6e153b05972 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
2026-03-10T12:42:14.068 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local bash[103564]: 7ac87e1c2a4169504cccb779482cb077f05d40d198bb42a29fecf6e153b05972
2026-03-10T12:42:14.068 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local podman[103564]: 2026-03-10 12:42:13.772693696 +0000 UTC m=+0.012806165 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-10T12:42:14.068 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:13 vm07.local systemd[1]: Started Ceph osd.4 for 1a52002a-1c7d-11f1-af82-51cdd81caea8.
2026-03-10T12:42:14.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:13 vm00.local ceph-mon[103263]: pgmap v124: 65 pgs: 5 peering, 5 stale+active+clean, 55 active+clean; 269 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail
2026-03-10T12:42:14.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:13 vm00.local ceph-mon[103263]: Health check failed: Reduced data availability: 1 pg peering (PG_AVAILABILITY)
2026-03-10T12:42:14.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:13 vm00.local ceph-mon[103263]: osdmap e70: 6 total, 5 up, 6 in
2026-03-10T12:42:14.539 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:14 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4[103575]: 2026-03-10T12:42:14.404+0000 7f4df20c8740 -1 Falling back to public interface
2026-03-10T12:42:15.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:14 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:15.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:14 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:15.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:14 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:42:15.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:14 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:15.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:14 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:15.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:14 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:42:15.964 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:15 vm07.local ceph-mon[93622]: pgmap v126: 65 pgs: 11 active+undersized, 5 peering, 2 stale+active+clean, 10 active+undersized+degraded, 37 active+clean; 269 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 36/333 objects degraded (10.811%)
2026-03-10T12:42:15.964 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:15 vm07.local ceph-mon[93622]: Health check failed: Degraded data redundancy: 36/333 objects degraded (10.811%), 10 pgs degraded (PG_DEGRADED)
2026-03-10T12:42:15.964 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:15.964 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:15.964 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:15.964 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:15.964 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:15.964 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:42:15.964 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:15 vm00.local ceph-mon[103263]: pgmap v126: 65 pgs: 11 active+undersized, 5 peering, 2 stale+active+clean, 10 active+undersized+degraded, 37 active+clean; 269 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 36/333 objects degraded (10.811%)
2026-03-10T12:42:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:15 vm00.local ceph-mon[103263]: Health check failed: Degraded data redundancy: 36/333 objects degraded (10.811%), 10 pgs degraded (PG_DEGRADED)
2026-03-10T12:42:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:16.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:16.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:16.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:16.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:16.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:42:16.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:17.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:17.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:17.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:42:17.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:42:17.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:17.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:42:17.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:17.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:17.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:17.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch
2026-03-10T12:42:17.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:17 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch
2026-03-10T12:42:17.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:17 vm07.local ceph-mon[93622]: Upgrade: unsafe to stop osd(s) at this time (10 PGs are or would become offline)
2026-03-10T12:42:17.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:17 vm07.local ceph-mon[93622]: pgmap v127: 65 pgs: 15 active+undersized, 5 peering, 14 active+undersized+degraded, 31 active+clean; 269 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 49/333 objects degraded (14.715%)
2026-03-10T12:42:17.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:17.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:17.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:42:17.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:42:17.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:17.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:42:17.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:17.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:17.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:17.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch
2026-03-10T12:42:17.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:17 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch
2026-03-10T12:42:17.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:17 vm00.local ceph-mon[103263]: Upgrade: unsafe to stop osd(s) at this time (10 PGs are or would become offline)
2026-03-10T12:42:17.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:17 vm00.local ceph-mon[103263]: pgmap v127: 65 pgs: 15 active+undersized, 5 peering, 14 active+undersized+degraded, 31 active+clean; 269 MiB data, 1.8 GiB used, 118 GiB / 120 GiB avail; 49/333 objects degraded (14.715%)
2026-03-10T12:42:18.566 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:18 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4[103575]: 2026-03-10T12:42:18.257+0000 7f4df20c8740 -1 osd.4 0 read_superblock omap replica is missing.
2026-03-10T12:42:18.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:18 vm00.local ceph-mon[103263]: from='osd.4 [v2:192.168.123.107:6808/2460757629,v1:192.168.123.107:6809/2460757629]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T12:42:18.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:18 vm00.local ceph-mon[103263]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T12:42:19.066 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:18 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4[103575]: 2026-03-10T12:42:18.581+0000 7f4df20c8740 -1 osd.4 68 log_to_monitors true 2026-03-10T12:42:19.066 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:42:18 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4[103575]: 2026-03-10T12:42:18.671+0000 7f4de9e62640 -1 osd.4 68 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T12:42:19.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:18 vm07.local ceph-mon[93622]: from='osd.4 [v2:192.168.123.107:6808/2460757629,v1:192.168.123.107:6809/2460757629]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T12:42:19.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:18 vm07.local ceph-mon[93622]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T12:42:19.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:19 vm00.local ceph-mon[103263]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T12:42:19.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:19 vm00.local ceph-mon[103263]: osdmap e71: 6 total, 5 up, 6 in 2026-03-10T12:42:19.984 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:19 vm00.local ceph-mon[103263]: from='osd.4 [v2:192.168.123.107:6808/2460757629,v1:192.168.123.107:6809/2460757629]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:42:19.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:19 vm00.local ceph-mon[103263]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:42:19.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:19 vm00.local ceph-mon[103263]: pgmap v129: 65 pgs: 18 active+undersized, 16 active+undersized+degraded, 31 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 54/333 objects degraded (16.216%) 2026-03-10T12:42:20.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:19 vm07.local ceph-mon[93622]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T12:42:20.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:19 vm07.local ceph-mon[93622]: osdmap e71: 6 total, 5 up, 6 in 2026-03-10T12:42:20.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:19 vm07.local ceph-mon[93622]: from='osd.4 [v2:192.168.123.107:6808/2460757629,v1:192.168.123.107:6809/2460757629]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:42:20.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:19 vm07.local ceph-mon[93622]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch 2026-03-10T12:42:20.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:19 vm07.local ceph-mon[93622]: pgmap v129: 65 pgs: 18 active+undersized, 16 active+undersized+degraded, 31 
active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 54/333 objects degraded (16.216%) 2026-03-10T12:42:20.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:20 vm00.local ceph-mon[103263]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T12:42:20.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:20 vm00.local ceph-mon[103263]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg peering) 2026-03-10T12:42:20.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:20 vm00.local ceph-mon[103263]: osd.4 [v2:192.168.123.107:6808/2460757629,v1:192.168.123.107:6809/2460757629] boot 2026-03-10T12:42:20.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:20 vm00.local ceph-mon[103263]: osdmap e72: 6 total, 6 up, 6 in 2026-03-10T12:42:20.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:20 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:42:20.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:20 vm00.local ceph-mon[103263]: osdmap e73: 6 total, 6 up, 6 in 2026-03-10T12:42:21.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:20 vm07.local ceph-mon[93622]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T12:42:21.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:20 vm07.local ceph-mon[93622]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg peering) 2026-03-10T12:42:21.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:20 vm07.local ceph-mon[93622]: osd.4 [v2:192.168.123.107:6808/2460757629,v1:192.168.123.107:6809/2460757629] boot 2026-03-10T12:42:21.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:20 vm07.local ceph-mon[93622]: osdmap e72: 6 total, 6 up, 6 in 2026-03-10T12:42:21.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:20 vm07.local ceph-mon[93622]: from='mgr.24563 
192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T12:42:21.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:20 vm07.local ceph-mon[93622]: osdmap e73: 6 total, 6 up, 6 in 2026-03-10T12:42:21.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:21 vm00.local ceph-mon[103263]: pgmap v132: 65 pgs: 6 peering, 15 active+undersized, 13 active+undersized+degraded, 31 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 43/333 objects degraded (12.913%) 2026-03-10T12:42:22.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:21 vm07.local ceph-mon[93622]: pgmap v132: 65 pgs: 6 peering, 15 active+undersized, 13 active+undersized+degraded, 31 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 43/333 objects degraded (12.913%) 2026-03-10T12:42:23.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:23 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 43/333 objects degraded (12.913%), 13 pgs degraded (PG_DEGRADED) 2026-03-10T12:42:23.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:23 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 43/333 objects degraded (12.913%), 13 pgs degraded (PG_DEGRADED) 2026-03-10T12:42:24.263 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:24 vm07.local ceph-mon[93622]: pgmap v133: 65 pgs: 6 peering, 12 active+undersized, 12 active+undersized+degraded, 35 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 40/333 objects degraded (12.012%) 2026-03-10T12:42:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:24 vm00.local ceph-mon[103263]: pgmap v133: 65 pgs: 6 peering, 12 active+undersized, 12 active+undersized+degraded, 35 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 40/333 objects degraded (12.012%) 2026-03-10T12:42:25.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:25 vm07.local ceph-mon[93622]: 
pgmap v134: 65 pgs: 6 peering, 59 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail 2026-03-10T12:42:25.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:25 vm00.local ceph-mon[103263]: pgmap v134: 65 pgs: 6 peering, 59 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail 2026-03-10T12:42:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:26 vm07.local ceph-mon[93622]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 40/333 objects degraded (12.012%), 12 pgs degraded) 2026-03-10T12:42:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:26 vm07.local ceph-mon[93622]: Cluster is now healthy 2026-03-10T12:42:26.878 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:26 vm00.local ceph-mon[103263]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 40/333 objects degraded (12.012%), 12 pgs degraded) 2026-03-10T12:42:26.878 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:26 vm00.local ceph-mon[103263]: Cluster is now healthy 2026-03-10T12:42:27.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:27 vm07.local ceph-mon[93622]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:42:27.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:27 vm07.local ceph-mon[93622]: pgmap v135: 65 pgs: 65 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail 2026-03-10T12:42:27.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:27 vm00.local ceph-mon[103263]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:42:27.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:27 vm00.local ceph-mon[103263]: pgmap v135: 65 pgs: 65 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail 2026-03-10T12:42:29.553 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.551+0000 7f9edd761700 1 -- 192.168.123.100:0/3332016378 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9ed8102760 msgr2=0x7f9ed8102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:29.553 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.551+0000 7f9edd761700 1 --2- 192.168.123.100:0/3332016378 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9ed8102760 0x7f9ed8102b70 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f9ec8009b00 tx=0x7f9ec8009e10 comp rx=0 tx=0).stop 2026-03-10T12:42:29.553 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.552+0000 7f9edd761700 1 -- 192.168.123.100:0/3332016378 shutdown_connections 2026-03-10T12:42:29.553 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.552+0000 7f9edd761700 1 --2- 192.168.123.100:0/3332016378 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ed8103960 0x7f9ed8103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:29.553 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.552+0000 7f9edd761700 1 --2- 192.168.123.100:0/3332016378 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9ed8102760 0x7f9ed8102b70 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:29.553 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.552+0000 7f9edd761700 1 -- 192.168.123.100:0/3332016378 >> 192.168.123.100:0/3332016378 conn(0x7f9ed80fdd10 msgr2=0x7f9ed8100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:42:29.553 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.552+0000 7f9edd761700 1 -- 192.168.123.100:0/3332016378 shutdown_connections 2026-03-10T12:42:29.553 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.553+0000 7f9edd761700 1 -- 192.168.123.100:0/3332016378 wait complete. 
2026-03-10T12:42:29.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.553+0000 7f9edd761700 1 Processor -- start 2026-03-10T12:42:29.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.553+0000 7f9edd761700 1 -- start start 2026-03-10T12:42:29.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.554+0000 7f9edd761700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9ed8102760 0x7f9ed81980b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:42:29.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.554+0000 7f9edd761700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ed8103960 0x7f9ed81985f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:42:29.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.554+0000 7f9edd761700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9ed8198c10 con 0x7f9ed8102760 2026-03-10T12:42:29.555 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.554+0000 7f9edd761700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9ed8198d50 con 0x7f9ed8103960 2026-03-10T12:42:29.555 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.554+0000 7f9ed6ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9ed8102760 0x7f9ed81980b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:42:29.555 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.554+0000 7f9ed6ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9ed8102760 0x7f9ed81980b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:59302/0 (socket says 192.168.123.100:59302) 2026-03-10T12:42:29.555 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.554+0000 7f9ed6ffd700 1 -- 192.168.123.100:0/3320351340 learned_addr learned my addr 192.168.123.100:0/3320351340 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:42:29.557 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.554+0000 7f9ecffff700 1 --2- 192.168.123.100:0/3320351340 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ed8103960 0x7f9ed81985f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:42:29.557 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.554+0000 7f9ed6ffd700 1 -- 192.168.123.100:0/3320351340 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ed8103960 msgr2=0x7f9ed81985f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:29.557 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.554+0000 7f9ed6ffd700 1 --2- 192.168.123.100:0/3320351340 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ed8103960 0x7f9ed81985f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:29.557 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.554+0000 7f9ed6ffd700 1 -- 192.168.123.100:0/3320351340 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9ec80097e0 con 0x7f9ed8102760 2026-03-10T12:42:29.557 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.554+0000 7f9ed6ffd700 1 --2- 192.168.123.100:0/3320351340 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9ed8102760 0x7f9ed81980b0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f9ec800b5c0 tx=0x7f9ec80049d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:42:29.557 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.555+0000 7f9ed4ff9700 1 -- 192.168.123.100:0/3320351340 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ec801d070 con 0x7f9ed8102760 2026-03-10T12:42:29.557 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.555+0000 7f9ed4ff9700 1 -- 192.168.123.100:0/3320351340 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9ec8004500 con 0x7f9ed8102760 2026-03-10T12:42:29.557 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.555+0000 7f9edd761700 1 -- 192.168.123.100:0/3320351340 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9ed819d7a0 con 0x7f9ed8102760 2026-03-10T12:42:29.557 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.555+0000 7f9edd761700 1 -- 192.168.123.100:0/3320351340 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9ed819dc30 con 0x7f9ed8102760 2026-03-10T12:42:29.557 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.555+0000 7f9ed4ff9700 1 -- 192.168.123.100:0/3320351340 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9ec800f460 con 0x7f9ed8102760 2026-03-10T12:42:29.557 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.556+0000 7f9ed4ff9700 1 -- 192.168.123.100:0/3320351340 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9ec800f5c0 con 0x7f9ed8102760 2026-03-10T12:42:29.561 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.557+0000 7f9ed4ff9700 1 --2- 192.168.123.100:0/3320351340 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9ec4077990 0x7f9ec4079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:42:29.561 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.557+0000 7f9edd761700 1 -- 192.168.123.100:0/3320351340 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9ed8066e40 con 0x7f9ed8102760 2026-03-10T12:42:29.561 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.557+0000 7f9ecffff700 1 --2- 192.168.123.100:0/3320351340 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9ec4077990 0x7f9ec4079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:42:29.561 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.561+0000 7f9ed4ff9700 1 -- 192.168.123.100:0/3320351340 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(73..73 src has 1..73) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f9ec809c080 con 0x7f9ed8102760 2026-03-10T12:42:29.561 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.561+0000 7f9ecffff700 1 --2- 192.168.123.100:0/3320351340 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9ec4077990 0x7f9ec4079e40 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f9ec0006fd0 tx=0x7f9ec0009380 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:42:29.561 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.561+0000 7f9ed4ff9700 1 -- 192.168.123.100:0/3320351340 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9ec80600a0 con 0x7f9ed8102760 2026-03-10T12:42:29.691 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.690+0000 7f9edd761700 1 -- 192.168.123.100:0/3320351340 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 
0x7f9ed81082b0 con 0x7f9ec4077990 2026-03-10T12:42:29.692 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.691+0000 7f9ed4ff9700 1 -- 192.168.123.100:0/3320351340 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f9ed81082b0 con 0x7f9ec4077990 2026-03-10T12:42:29.694 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.694+0000 7f9edd761700 1 -- 192.168.123.100:0/3320351340 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9ec4077990 msgr2=0x7f9ec4079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:29.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.694+0000 7f9edd761700 1 --2- 192.168.123.100:0/3320351340 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9ec4077990 0x7f9ec4079e40 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f9ec0006fd0 tx=0x7f9ec0009380 comp rx=0 tx=0).stop 2026-03-10T12:42:29.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.694+0000 7f9edd761700 1 -- 192.168.123.100:0/3320351340 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9ed8102760 msgr2=0x7f9ed81980b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:29.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.694+0000 7f9edd761700 1 --2- 192.168.123.100:0/3320351340 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9ed8102760 0x7f9ed81980b0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f9ec800b5c0 tx=0x7f9ec80049d0 comp rx=0 tx=0).stop 2026-03-10T12:42:29.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.694+0000 7f9edd761700 1 -- 192.168.123.100:0/3320351340 shutdown_connections 2026-03-10T12:42:29.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.695+0000 7f9edd761700 1 --2- 192.168.123.100:0/3320351340 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9ec4077990 0x7f9ec4079e40 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:29.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.695+0000 7f9edd761700 1 --2- 192.168.123.100:0/3320351340 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9ed8102760 0x7f9ed81980b0 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:29.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.695+0000 7f9edd761700 1 --2- 192.168.123.100:0/3320351340 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9ed8103960 0x7f9ed81985f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:29.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.695+0000 7f9edd761700 1 -- 192.168.123.100:0/3320351340 >> 192.168.123.100:0/3320351340 conn(0x7f9ed80fdd10 msgr2=0x7f9ed8106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:42:29.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.695+0000 7f9edd761700 1 -- 192.168.123.100:0/3320351340 shutdown_connections 2026-03-10T12:42:29.695 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.695+0000 7f9edd761700 1 -- 192.168.123.100:0/3320351340 wait complete. 
2026-03-10T12:42:29.705 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:42:29.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.772+0000 7f1ae6ed3700 1 -- 192.168.123.100:0/1853704508 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1ae0102740 msgr2=0x7f1ae0102b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:29.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.772+0000 7f1ae6ed3700 1 --2- 192.168.123.100:0/1853704508 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1ae0102740 0x7f1ae0102b50 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f1ad0009b50 tx=0x7f1ad0009e60 comp rx=0 tx=0).stop 2026-03-10T12:42:29.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.772+0000 7f1ae6ed3700 1 -- 192.168.123.100:0/1853704508 shutdown_connections 2026-03-10T12:42:29.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.772+0000 7f1ae6ed3700 1 --2- 192.168.123.100:0/1853704508 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1ae0103940 0x7f1ae0103d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:29.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.772+0000 7f1ae6ed3700 1 --2- 192.168.123.100:0/1853704508 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1ae0102740 0x7f1ae0102b50 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:29.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.772+0000 7f1ae6ed3700 1 -- 192.168.123.100:0/1853704508 >> 192.168.123.100:0/1853704508 conn(0x7f1ae00fdcf0 msgr2=0x7f1ae0100120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:42:29.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.773+0000 7f1ae6ed3700 1 -- 192.168.123.100:0/1853704508 shutdown_connections 2026-03-10T12:42:29.773 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.773+0000 7f1ae6ed3700 1 -- 192.168.123.100:0/1853704508 wait complete.
2026-03-10T12:42:29.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.773+0000 7f1ae6ed3700 1 Processor -- start
2026-03-10T12:42:29.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.773+0000 7f1ae6ed3700 1 -- start start
2026-03-10T12:42:29.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.774+0000 7f1ae6ed3700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1ae0102740 0x7f1ae0198070 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:42:29.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.774+0000 7f1ae6ed3700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1ae0103940 0x7f1ae01985b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:42:29.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.774+0000 7f1ae6ed3700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1ae0198bd0 con 0x7f1ae0102740
2026-03-10T12:42:29.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.774+0000 7f1ae4c6f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1ae0102740 0x7f1ae0198070 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:42:29.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.774+0000 7f1ae4c6f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1ae0102740 0x7f1ae0198070 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:59322/0 (socket says 192.168.123.100:59322)
2026-03-10T12:42:29.775 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.774+0000 7f1ae4c6f700 1 -- 192.168.123.100:0/1096322423 learned_addr learned my addr 192.168.123.100:0/1096322423 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:42:29.775 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.774+0000 7f1ae6ed3700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1ae0198d10 con 0x7f1ae0103940
2026-03-10T12:42:29.775 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.774+0000 7f1adffff700 1 --2- 192.168.123.100:0/1096322423 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1ae0103940 0x7f1ae01985b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:42:29.775 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.774+0000 7f1ae4c6f700 1 -- 192.168.123.100:0/1096322423 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1ae0103940 msgr2=0x7f1ae01985b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:42:29.775 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.774+0000 7f1ae4c6f700 1 --2- 192.168.123.100:0/1096322423 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1ae0103940 0x7f1ae01985b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:42:29.775 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.775+0000 7f1ae4c6f700 1 -- 192.168.123.100:0/1096322423 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1ad00097e0 con 0x7f1ae0102740
2026-03-10T12:42:29.775 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.775+0000 7f1ae4c6f700 1 --2- 192.168.123.100:0/1096322423 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1ae0102740 0x7f1ae0198070 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f1ad0004ce0 tx=0x7f1ad0005790 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:42:29.776 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.775+0000 7f1addffb700 1 -- 192.168.123.100:0/1096322423 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1ad001d070 con 0x7f1ae0102740
2026-03-10T12:42:29.776 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.775+0000 7f1addffb700 1 -- 192.168.123.100:0/1096322423 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1ad000bb00 con 0x7f1ae0102740
2026-03-10T12:42:29.776 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.775+0000 7f1addffb700 1 -- 192.168.123.100:0/1096322423 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1ad000f670 con 0x7f1ae0102740
2026-03-10T12:42:29.776 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.775+0000 7f1ae6ed3700 1 -- 192.168.123.100:0/1096322423 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1ae019d760 con 0x7f1ae0102740
2026-03-10T12:42:29.776 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.775+0000 7f1ae6ed3700 1 -- 192.168.123.100:0/1096322423 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1ae019dc50 con 0x7f1ae0102740
2026-03-10T12:42:29.779 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.777+0000 7f1addffb700 1 -- 192.168.123.100:0/1096322423 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1ad000bc70 con 0x7f1ae0102740
2026-03-10T12:42:29.779 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.777+0000 7f1ae6ed3700 1 -- 192.168.123.100:0/1096322423 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1ae0066e40 con 0x7f1ae0102740
2026-03-10T12:42:29.779 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.777+0000 7f1addffb700 1 --2- 192.168.123.100:0/1096322423 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1ac8077870 0x7f1ac8079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:42:29.779 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.777+0000 7f1addffb700 1 -- 192.168.123.100:0/1096322423 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(73..73 src has 1..73) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f1ad009afe0 con 0x7f1ae0102740
2026-03-10T12:42:29.779 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.777+0000 7f1adffff700 1 --2- 192.168.123.100:0/1096322423 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1ac8077870 0x7f1ac8079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:42:29.779 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.779+0000 7f1adffff700 1 --2- 192.168.123.100:0/1096322423 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1ac8077870 0x7f1ac8079d20 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f1ad4007c60 tx=0x7f1ad40073d0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:42:29.780 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.780+0000 7f1addffb700 1 -- 192.168.123.100:0/1096322423 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1ad00d19f0 con 0x7f1ae0102740
2026-03-10T12:42:29.918 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.917+0000 7f1ae6ed3700 1 -- 192.168.123.100:0/1096322423 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1ae0108290 con 0x7f1ac8077870
2026-03-10T12:42:29.918 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:29 vm00.local ceph-mon[103263]: pgmap v136: 65 pgs: 65 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail
2026-03-10T12:42:29.919 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.919+0000 7f1addffb700 1 -- 192.168.123.100:0/1096322423 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f1ae0108290 con 0x7f1ac8077870
2026-03-10T12:42:29.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.921+0000 7f1ae6ed3700 1 -- 192.168.123.100:0/1096322423 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1ac8077870 msgr2=0x7f1ac8079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:42:29.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.921+0000 7f1ae6ed3700 1 --2- 192.168.123.100:0/1096322423 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1ac8077870 0x7f1ac8079d20 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f1ad4007c60 tx=0x7f1ad40073d0 comp rx=0 tx=0).stop
2026-03-10T12:42:29.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.921+0000 7f1ae6ed3700 1 -- 192.168.123.100:0/1096322423 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1ae0102740 msgr2=0x7f1ae0198070 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:42:29.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.921+0000 7f1ae6ed3700 1 --2- 192.168.123.100:0/1096322423 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1ae0102740 0x7f1ae0198070 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f1ad0004ce0 tx=0x7f1ad0005790 comp rx=0 tx=0).stop
2026-03-10T12:42:29.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.922+0000 7f1ae6ed3700 1 -- 192.168.123.100:0/1096322423 shutdown_connections
2026-03-10T12:42:29.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.922+0000 7f1ae6ed3700 1 --2- 192.168.123.100:0/1096322423 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1ac8077870 0x7f1ac8079d20 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:42:29.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.922+0000 7f1ae6ed3700 1 --2- 192.168.123.100:0/1096322423 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1ae0102740 0x7f1ae0198070 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:42:29.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.922+0000 7f1ae6ed3700 1 --2- 192.168.123.100:0/1096322423 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1ae0103940 0x7f1ae01985b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:42:29.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.922+0000 7f1ae6ed3700 1 -- 192.168.123.100:0/1096322423 >> 192.168.123.100:0/1096322423 conn(0x7f1ae00fdcf0 msgr2=0x7f1ae0106b70 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:42:29.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.922+0000 7f1ae6ed3700 1 -- 192.168.123.100:0/1096322423 shutdown_connections
2026-03-10T12:42:29.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.922+0000 7f1ae6ed3700 1 -- 192.168.123.100:0/1096322423 wait complete.
2026-03-10T12:42:29.998 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.997+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/3054881442 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9b48102760 msgr2=0x7f9b48102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:42:29.998 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.997+0000 7f9b4dbcb700 1 --2- 192.168.123.100:0/3054881442 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9b48102760 0x7f9b48102b70 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f9b30009b00 tx=0x7f9b30009e10 comp rx=0 tx=0).stop
2026-03-10T12:42:29.998 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.997+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/3054881442 shutdown_connections
2026-03-10T12:42:29.998 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.997+0000 7f9b4dbcb700 1 --2- 192.168.123.100:0/3054881442 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9b48103a00 0x7f9b48103e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:42:29.998 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.997+0000 7f9b4dbcb700 1 --2- 192.168.123.100:0/3054881442 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9b48102760 0x7f9b48102b70 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:42:29.998 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.997+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/3054881442 >> 192.168.123.100:0/3054881442 conn(0x7f9b480fddb0 msgr2=0x7f9b481001e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:42:29.998 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.997+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/3054881442 shutdown_connections
2026-03-10T12:42:29.998 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.997+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/3054881442 wait complete.
2026-03-10T12:42:29.998 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.998+0000 7f9b4dbcb700 1 Processor -- start
2026-03-10T12:42:29.998 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.998+0000 7f9b4dbcb700 1 -- start start
2026-03-10T12:42:29.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.998+0000 7f9b4dbcb700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9b48102760 0x7f9b48198060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:42:29.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.998+0000 7f9b477fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9b48102760 0x7f9b48198060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:42:29.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.999+0000 7f9b477fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9b48102760 0x7f9b48198060 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:59332/0 (socket says 192.168.123.100:59332)
2026-03-10T12:42:29.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.999+0000 7f9b4dbcb700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9b48103a00 0x7f9b481985a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:42:29.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.999+0000 7f9b477fe700 1 -- 192.168.123.100:0/1538469698 learned_addr learned my addr 192.168.123.100:0/1538469698 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:42:29.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.999+0000 7f9b4dbcb700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9b48198bc0 con 0x7f9b48102760
2026-03-10T12:42:29.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.999+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/1538469698 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9b48198d00 con 0x7f9b48103a00
2026-03-10T12:42:29.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.999+0000 7f9b477fe700 1 -- 192.168.123.100:0/1538469698 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9b48103a00 msgr2=0x7f9b481985a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:42:29.999 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.999+0000 7f9b46ffd700 1 --2- 192.168.123.100:0/1538469698 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9b48103a00 0x7f9b481985a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:42:30.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.999+0000 7f9b477fe700 1 --2- 192.168.123.100:0/1538469698 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9b48103a00 0x7f9b481985a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:42:30.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.999+0000 7f9b477fe700 1 -- 192.168.123.100:0/1538469698 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9b300097e0 con 0x7f9b48102760
2026-03-10T12:42:30.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.999+0000 7f9b46ffd700 1 --2- 192.168.123.100:0/1538469698 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9b48103a00 0x7f9b481985a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-10T12:42:30.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:29.999+0000 7f9b477fe700 1 --2- 192.168.123.100:0/1538469698 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9b48102760 0x7f9b48198060 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f9b300052d0 tx=0x7f9b30004af0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:42:30.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.000+0000 7f9b44ff9700 1 -- 192.168.123.100:0/1538469698 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9b3001d070 con 0x7f9b48102760
2026-03-10T12:42:30.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.000+0000 7f9b44ff9700 1 -- 192.168.123.100:0/1538469698 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9b30004500 con 0x7f9b48102760
2026-03-10T12:42:30.002 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.000+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/1538469698 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9b4819d750 con 0x7f9b48102760
2026-03-10T12:42:30.002 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.000+0000 7f9b44ff9700 1 -- 192.168.123.100:0/1538469698 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9b30022680 con 0x7f9b48102760
2026-03-10T12:42:30.002 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.000+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/1538469698 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9b4819dbe0 con 0x7f9b48102760
2026-03-10T12:42:30.002 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.001+0000 7f9b44ff9700 1 -- 192.168.123.100:0/1538469698 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9b300227e0 con 0x7f9b48102760
2026-03-10T12:42:30.002 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.002+0000 7f9b44ff9700 1 --2- 192.168.123.100:0/1538469698 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9b340778c0 0x7f9b34079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:42:30.002 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.002+0000 7f9b46ffd700 1 --2- 192.168.123.100:0/1538469698 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9b340778c0 0x7f9b34079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:42:30.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.002+0000 7f9b44ff9700 1 -- 192.168.123.100:0/1538469698 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(73..73 src has 1..73) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f9b3009c760 con 0x7f9b48102760
2026-03-10T12:42:30.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.002+0000 7f9b46ffd700 1 --2- 192.168.123.100:0/1538469698 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9b340778c0 0x7f9b34079d70 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f9b38007900 tx=0x7f9b38008040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:42:30.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.003+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/1538469698 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9b4804ea50 con 0x7f9b48102760
2026-03-10T12:42:30.006 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.006+0000 7f9b44ff9700 1 -- 192.168.123.100:0/1538469698 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9b30065010 con 0x7f9b48102760
2026-03-10T12:42:30.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:29 vm07.local ceph-mon[93622]: pgmap v136: 65 pgs: 65 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail
2026-03-10T12:42:30.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.130+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/1538469698 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f9b48108120 con 0x7f9b340778c0
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.136+0000 7f9b44ff9700 1 -- 192.168.123.100:0/1538469698 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f9b48108120 con 0x7f9b340778c0
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (8m) 59s ago 9m 25.5M - 0.25.0 c8568f914cd2 1897443e8fdf
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (9m) 59s ago 9m 9361k - 18.2.0 dc2bc1663786 d9c35bbdf4cd
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (9m) 15s ago 9m 11.6M - 18.2.0 dc2bc1663786 2a98961ae9ca
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (3m) 59s ago 9m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 88cd1c2e0041
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (3m) 15s ago 9m 7864k - 19.2.3-678-ge911bdeb 654f31e6858e 889e2b356266
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (8m) 59s ago 9m 91.0M - 9.4.7 954c08fa6188 16c12a6ce1fc
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (7m) 59s ago 7m 136M - 18.2.0 dc2bc1663786 13dfd2469732
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (7m) 59s ago 7m 18.7M - 18.2.0 dc2bc1663786 d65368ac6dfa
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (7m) 15s ago 7m 18.5M - 18.2.0 dc2bc1663786 1b9425223bd2
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (7m) 15s ago 7m 143M - 18.2.0 dc2bc1663786 9c2b36fc1ac1
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:8443,9283,8765 running (4m) 59s ago 10m 622M - 19.2.3-678-ge911bdeb 654f31e6858e eaf11f86af53
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (3m) 15s ago 8m 490M - 19.2.3-678-ge911bdeb 654f31e6858e 22c93435649c
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (3m) 59s ago 10m 61.0M 2048M 19.2.3-678-ge911bdeb 654f31e6858e e8cc5980a849
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (3m) 15s ago 8m 52.3M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 032abad282fc
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (9m) 59s ago 9m 15.0M - 1.5.0 0da6a335fe13 7a5c14d6ba46
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (8m) 15s ago 8m 16.2M - 1.5.0 0da6a335fe13 2fac2415b763
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (2m) 59s ago 8m 174M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9b151d44f3cf
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (83s) 59s ago 8m 104M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 252ea98c5665
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (61s) 59s ago 8m 12.4M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 249137e44eb7
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (37s) 15s ago 8m 143M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 72a045e3b78b
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (16s) 15s ago 7m 12.2M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7ac87e1c2a41
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (7m) 15s ago 7m 410M 4096M 18.2.0 dc2bc1663786 277bdfe4ec55
2026-03-10T12:42:30.137 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (3m) 59s ago 9m 67.8M - 2.43.0 a07b618ecd1d c80074b6b052
2026-03-10T12:42:30.140 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.140+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/1538469698 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9b340778c0 msgr2=0x7f9b34079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:42:30.141 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.140+0000 7f9b4dbcb700 1 --2- 192.168.123.100:0/1538469698 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9b340778c0 0x7f9b34079d70 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f9b38007900 tx=0x7f9b38008040 comp rx=0 tx=0).stop
2026-03-10T12:42:30.141 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.140+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/1538469698 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9b48102760 msgr2=0x7f9b48198060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:42:30.141 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.141+0000 7f9b4dbcb700 1 --2- 192.168.123.100:0/1538469698 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9b48102760 0x7f9b48198060 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f9b300052d0 tx=0x7f9b30004af0 comp rx=0 tx=0).stop
2026-03-10T12:42:30.141 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.141+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/1538469698 shutdown_connections
2026-03-10T12:42:30.141 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.141+0000 7f9b4dbcb700 1 --2- 192.168.123.100:0/1538469698 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f9b340778c0 0x7f9b34079d70 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:42:30.141 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.141+0000 7f9b4dbcb700 1 --2- 192.168.123.100:0/1538469698 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9b48102760 0x7f9b48198060 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:42:30.141 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.141+0000 7f9b4dbcb700 1 --2- 192.168.123.100:0/1538469698 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f9b48103a00 0x7f9b481985a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:42:30.141 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.141+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/1538469698 >> 192.168.123.100:0/1538469698 conn(0x7f9b480fddb0 msgr2=0x7f9b48100170 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:42:30.141 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.141+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/1538469698 shutdown_connections
2026-03-10T12:42:30.142 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.141+0000 7f9b4dbcb700 1 -- 192.168.123.100:0/1538469698 wait complete.
2026-03-10T12:42:30.213 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.213+0000 7eff8f753700 1 -- 192.168.123.100:0/3136267880 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7eff88103980 msgr2=0x7eff88103dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:42:30.213 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.213+0000 7eff8f753700 1 --2- 192.168.123.100:0/3136267880 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7eff88103980 0x7eff88103dd0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7eff84009b50 tx=0x7eff84009e60 comp rx=0 tx=0).stop
2026-03-10T12:42:30.217 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.216+0000 7eff8f753700 1 -- 192.168.123.100:0/3136267880 shutdown_connections
2026-03-10T12:42:30.217 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.216+0000 7eff8f753700 1 --2- 192.168.123.100:0/3136267880 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7eff88103980 0x7eff88103dd0 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:42:30.217 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.216+0000 7eff8f753700 1 --2- 192.168.123.100:0/3136267880 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff88102780 0x7eff88102b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:42:30.217 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.216+0000 7eff8f753700 1 -- 192.168.123.100:0/3136267880 >> 192.168.123.100:0/3136267880 conn(0x7eff880fdd10 msgr2=0x7eff88100160 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:42:30.217 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.217+0000 7eff8f753700 1 -- 192.168.123.100:0/3136267880 shutdown_connections
2026-03-10T12:42:30.217 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.217+0000 7eff8f753700 1 -- 192.168.123.100:0/3136267880 wait complete.
2026-03-10T12:42:30.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.217+0000 7eff8f753700 1 Processor -- start
2026-03-10T12:42:30.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.217+0000 7eff8f753700 1 -- start start
2026-03-10T12:42:30.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.218+0000 7eff8f753700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff88102780 0x7eff88198100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:42:30.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.218+0000 7eff8f753700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7eff88103980 0x7eff88198640 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:42:30.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.218+0000 7eff8f753700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7eff88198c60 con 0x7eff88103980
2026-03-10T12:42:30.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.218+0000 7eff8ccee700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7eff88103980 0x7eff88198640 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:42:30.219 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.218+0000 7eff8ccee700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7eff88103980 0x7eff88198640 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:59352/0 (socket says 192.168.123.100:59352)
2026-03-10T12:42:30.219 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.218+0000 7eff8ccee700 1 -- 192.168.123.100:0/1959560005 learned_addr learned my addr 192.168.123.100:0/1959560005 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:42:30.219 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.218+0000 7eff8f753700 1 -- 192.168.123.100:0/1959560005 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7eff88198da0 con 0x7eff88102780
2026-03-10T12:42:30.219 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.218+0000 7eff8d4ef700 1 --2- 192.168.123.100:0/1959560005 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff88102780 0x7eff88198100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:42:30.219 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.219+0000 7eff8ccee700 1 -- 192.168.123.100:0/1959560005 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff88102780 msgr2=0x7eff88198100 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:42:30.219 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.219+0000 7eff8ccee700 1 --2- 192.168.123.100:0/1959560005 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff88102780 0x7eff88198100 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:42:30.219 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.219+0000 7eff8ccee700 1 -- 192.168.123.100:0/1959560005 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7eff840097e0 con 0x7eff88103980
2026-03-10T12:42:30.219 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.219+0000 7eff8ccee700 1 --2- 192.168.123.100:0/1959560005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7eff88103980 0x7eff88198640 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7eff84005850 tx=0x7eff8400b870 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:42:30.220 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.219+0000 7eff7a7fc700 1 -- 192.168.123.100:0/1959560005 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7eff8401d070 con 0x7eff88103980
2026-03-10T12:42:30.221 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.219+0000 7eff7a7fc700 1 -- 192.168.123.100:0/1959560005 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7eff8400bc70 con 0x7eff88103980
2026-03-10T12:42:30.221 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.219+0000 7eff8f753700 1 -- 192.168.123.100:0/1959560005 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7eff8819d7f0 con 0x7eff88103980
2026-03-10T12:42:30.221 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.220+0000 7eff7a7fc700 1 -- 192.168.123.100:0/1959560005 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7eff840218d0 con 0x7eff88103980
2026-03-10T12:42:30.221 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.221+0000 7eff8f753700 1 -- 192.168.123.100:0/1959560005 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7eff8819dce0 con 0x7eff88103980
2026-03-10T12:42:30.223 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.222+0000 7eff7a7fc700 1 -- 192.168.123.100:0/1959560005 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7eff8400f460 con 0x7eff88103980
2026-03-10T12:42:30.223 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.222+0000 7eff8f753700 1 -- 192.168.123.100:0/1959560005 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7eff88066e40 con 0x7eff88103980
2026-03-10T12:42:30.223 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.223+0000 7eff7a7fc700 1 --2- 192.168.123.100:0/1959560005 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7eff74077870 0x7eff74079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:42:30.223 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.223+0000 7eff7a7fc700 1 -- 192.168.123.100:0/1959560005 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(73..73 src has 1..73) v4 ==== 6136+0+0 (secure 0 0 0) 0x7eff8409af70 con 0x7eff88103980
2026-03-10T12:42:30.227 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.227+0000 7eff8d4ef700 1 --2- 192.168.123.100:0/1959560005 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7eff74077870 0x7eff74079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:42:30.228 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.227+0000 7eff7a7fc700 1 -- 192.168.123.100:0/1959560005 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7eff840637a0 con 0x7eff88103980
2026-03-10T12:42:30.229 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.229+0000 7eff8d4ef700 1 --2- 192.168.123.100:0/1959560005 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7eff74077870 0x7eff74079d20 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7eff7c006fd0 tx=0x7eff7c009380 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:42:30.407 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.406+0000 7eff8f753700 1 -- 192.168.123.100:0/1959560005 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7eff8819df90 con 0x7eff88103980
2026-03-10T12:42:30.407 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.407+0000 7eff7a7fc700 1 -- 192.168.123.100:0/1959560005 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7eff8400bde0 con 0x7eff88103980
2026-03-10T12:42:30.407 INFO:teuthology.orchestra.run.vm00.stdout:{
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: "mon": {
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": {
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: "osd": {
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 1,
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: "mds": {
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: "overall": {
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 5,
2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 9 2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:42:30.408 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:42:30.410 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.410+0000 7eff8f753700 1 -- 192.168.123.100:0/1959560005 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7eff74077870 msgr2=0x7eff74079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:30.410 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.410+0000 7eff8f753700 1 --2- 192.168.123.100:0/1959560005 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7eff74077870 0x7eff74079d20 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7eff7c006fd0 tx=0x7eff7c009380 comp rx=0 tx=0).stop 2026-03-10T12:42:30.410 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.410+0000 7eff8f753700 1 -- 192.168.123.100:0/1959560005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7eff88103980 msgr2=0x7eff88198640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:30.411 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.410+0000 7eff8f753700 1 --2- 192.168.123.100:0/1959560005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7eff88103980 0x7eff88198640 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7eff84005850 tx=0x7eff8400b870 comp rx=0 tx=0).stop 2026-03-10T12:42:30.411 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.410+0000 7eff8f753700 1 -- 192.168.123.100:0/1959560005 shutdown_connections 2026-03-10T12:42:30.411 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.411+0000 7eff8f753700 1 --2- 192.168.123.100:0/1959560005 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] 
conn(0x7eff74077870 0x7eff74079d20 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.411 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.411+0000 7eff8f753700 1 --2- 192.168.123.100:0/1959560005 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7eff88102780 0x7eff88198100 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.411 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.411+0000 7eff8f753700 1 --2- 192.168.123.100:0/1959560005 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7eff88103980 0x7eff88198640 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.411 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.411+0000 7eff8f753700 1 -- 192.168.123.100:0/1959560005 >> 192.168.123.100:0/1959560005 conn(0x7eff880fdd10 msgr2=0x7eff88106bb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:42:30.411 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.411+0000 7eff8f753700 1 -- 192.168.123.100:0/1959560005 shutdown_connections 2026-03-10T12:42:30.412 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.411+0000 7eff8f753700 1 -- 192.168.123.100:0/1959560005 wait complete. 
2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.487+0000 7f89b7248700 1 -- 192.168.123.100:0/3935877047 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89b00fee80 msgr2=0x7f89b01012a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.487+0000 7f89b7248700 1 --2- 192.168.123.100:0/3935877047 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89b00fee80 0x7f89b01012a0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f89a800b3a0 tx=0x7f89a800b6b0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.488+0000 7f89b7248700 1 -- 192.168.123.100:0/3935877047 shutdown_connections 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.488+0000 7f89b7248700 1 --2- 192.168.123.100:0/3935877047 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b01017e0 0x7f89b0103c60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.488+0000 7f89b7248700 1 --2- 192.168.123.100:0/3935877047 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89b00fee80 0x7f89b01012a0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.488+0000 7f89b7248700 1 -- 192.168.123.100:0/3935877047 >> 192.168.123.100:0/3935877047 conn(0x7f89b00faa70 msgr2=0x7f89b00fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.488+0000 7f89b7248700 1 -- 192.168.123.100:0/3935877047 shutdown_connections 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.488+0000 7f89b7248700 1 -- 192.168.123.100:0/3935877047 
wait complete. 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.488+0000 7f89b7248700 1 Processor -- start 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.488+0000 7f89b7248700 1 -- start start 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.488+0000 7f89b7248700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89b01017e0 0x7f89b010cb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.488+0000 7f89b7248700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b010d070 0x7f89b0114510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.488+0000 7f89b7248700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f89b010d570 con 0x7f89b010d070 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.488+0000 7f89b7248700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f89b010d6b0 con 0x7f89b01017e0 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.489+0000 7f89affff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b010d070 0x7f89b0114510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.489+0000 7f89affff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b010d070 0x7f89b0114510 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 
says I am v2:192.168.123.100:59372/0 (socket says 192.168.123.100:59372) 2026-03-10T12:42:30.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.489+0000 7f89affff700 1 -- 192.168.123.100:0/421013425 learned_addr learned my addr 192.168.123.100:0/421013425 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:42:30.491 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.490+0000 7f89affff700 1 -- 192.168.123.100:0/421013425 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89b01017e0 msgr2=0x7f89b010cb30 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:42:30.491 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.490+0000 7f89affff700 1 --2- 192.168.123.100:0/421013425 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89b01017e0 0x7f89b010cb30 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.491 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.490+0000 7f89affff700 1 -- 192.168.123.100:0/421013425 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f89a800b050 con 0x7f89b010d070 2026-03-10T12:42:30.491 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.491+0000 7f89affff700 1 --2- 192.168.123.100:0/421013425 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b010d070 0x7f89b0114510 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f89a400da40 tx=0x7f89a400de00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:42:30.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.491+0000 7f89adffb700 1 -- 192.168.123.100:0/421013425 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f89a40041d0 con 0x7f89b010d070 2026-03-10T12:42:30.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.491+0000 7f89adffb700 1 -- 
192.168.123.100:0/421013425 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f89a4004d10 con 0x7f89b010d070 2026-03-10T12:42:30.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.491+0000 7f89adffb700 1 -- 192.168.123.100:0/421013425 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f89a4005020 con 0x7f89b010d070 2026-03-10T12:42:30.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.491+0000 7f89b7248700 1 -- 192.168.123.100:0/421013425 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f89b0114ab0 con 0x7f89b010d070 2026-03-10T12:42:30.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.491+0000 7f89b7248700 1 -- 192.168.123.100:0/421013425 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f89b0114fb0 con 0x7f89b010d070 2026-03-10T12:42:30.499 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.494+0000 7f89adffb700 1 -- 192.168.123.100:0/421013425 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f89a4004750 con 0x7f89b010d070 2026-03-10T12:42:30.499 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.494+0000 7f89b7248700 1 -- 192.168.123.100:0/421013425 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f89b0106bf0 con 0x7f89b010d070 2026-03-10T12:42:30.499 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.494+0000 7f89adffb700 1 --2- 192.168.123.100:0/421013425 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8998077910 0x7f8998079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:42:30.499 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.494+0000 7f89adffb700 1 -- 192.168.123.100:0/421013425 <== 
mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(73..73 src has 1..73) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f89a4099800 con 0x7f89b010d070 2026-03-10T12:42:30.499 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.497+0000 7f89b4fe4700 1 --2- 192.168.123.100:0/421013425 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8998077910 0x7f8998079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:42:30.499 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.497+0000 7f89b4fe4700 1 --2- 192.168.123.100:0/421013425 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8998077910 0x7f8998079dc0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f89a800bb30 tx=0x7f89a800bf90 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:42:30.500 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.499+0000 7f89adffb700 1 -- 192.168.123.100:0/421013425 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f89a40620b0 con 0x7f89b010d070 2026-03-10T12:42:30.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.653+0000 7f89b7248700 1 -- 192.168.123.100:0/421013425 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f89b0066e40 con 0x7f89b010d070 2026-03-10T12:42:30.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.654+0000 7f89adffb700 1 -- 192.168.123.100:0/421013425 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 15 v15) v1 ==== 76+0+1942 (secure 0 0 0) 0x7f89a40167b0 con 0x7f89b010d070 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:e15 2026-03-10T12:42:30.655 
INFO:teuthology.orchestra.run.vm00.stdout:btime 2026-03-10T12:42:26:260745+0000 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:epoch 15 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:42:25.757418+0000 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:max_xattr_size 65536 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:42:30.655 
INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307} 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:failed 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:qdb_cluster leader: 24313 members: 24313,24307 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:42:30.655 
INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons: 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:42:30.655 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:42:30.656 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T12:42:30.658 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.658+0000 7f89b7248700 1 -- 192.168.123.100:0/421013425 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8998077910 msgr2=0x7f8998079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:30.659 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.658+0000 7f89b7248700 1 --2- 192.168.123.100:0/421013425 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8998077910 0x7f8998079dc0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f89a800bb30 tx=0x7f89a800bf90 comp rx=0 tx=0).stop 2026-03-10T12:42:30.659 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.658+0000 7f89b7248700 1 -- 192.168.123.100:0/421013425 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b010d070 msgr2=0x7f89b0114510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:30.659 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.658+0000 7f89b7248700 1 --2- 192.168.123.100:0/421013425 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b010d070 0x7f89b0114510 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto 
rx=0x7f89a400da40 tx=0x7f89a400de00 comp rx=0 tx=0).stop 2026-03-10T12:42:30.660 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.659+0000 7f89b7248700 1 -- 192.168.123.100:0/421013425 shutdown_connections 2026-03-10T12:42:30.660 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.660+0000 7f89b7248700 1 --2- 192.168.123.100:0/421013425 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f8998077910 0x7f8998079dc0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.660 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.660+0000 7f89b7248700 1 --2- 192.168.123.100:0/421013425 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f89b01017e0 0x7f89b010cb30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.660 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.660+0000 7f89b7248700 1 --2- 192.168.123.100:0/421013425 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f89b010d070 0x7f89b0114510 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.660 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.660+0000 7f89b7248700 1 -- 192.168.123.100:0/421013425 >> 192.168.123.100:0/421013425 conn(0x7f89b00faa70 msgr2=0x7f89b01010a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:42:30.660 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.660+0000 7f89b7248700 1 -- 192.168.123.100:0/421013425 shutdown_connections 2026-03-10T12:42:30.660 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.660+0000 7f89b7248700 1 -- 192.168.123.100:0/421013425 wait complete. 
2026-03-10T12:42:30.661 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 15 2026-03-10T12:42:30.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.739+0000 7f010b315700 1 -- 192.168.123.100:0/536391368 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0104069180 msgr2=0x7f0104103140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:30.741 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.739+0000 7f010b315700 1 --2- 192.168.123.100:0/536391368 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0104069180 0x7f0104103140 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f00f8009b00 tx=0x7f00f8009e10 comp rx=0 tx=0).stop 2026-03-10T12:42:30.744 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.743+0000 7f010b315700 1 -- 192.168.123.100:0/536391368 shutdown_connections 2026-03-10T12:42:30.744 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.743+0000 7f010b315700 1 --2- 192.168.123.100:0/536391368 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0104103680 0x7f0104105ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.744 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.743+0000 7f010b315700 1 --2- 192.168.123.100:0/536391368 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0104069180 0x7f0104103140 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.744 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.743+0000 7f010b315700 1 -- 192.168.123.100:0/536391368 >> 192.168.123.100:0/536391368 conn(0x7f01040faa70 msgr2=0x7f01040fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:42:30.745 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.744+0000 7f010b315700 1 -- 192.168.123.100:0/536391368 shutdown_connections 2026-03-10T12:42:30.745 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.744+0000 7f010b315700 1 -- 192.168.123.100:0/536391368 wait complete. 2026-03-10T12:42:30.746 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.745+0000 7f010b315700 1 Processor -- start 2026-03-10T12:42:30.746 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.746+0000 7f010b315700 1 -- start start 2026-03-10T12:42:30.746 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.746+0000 7f010b315700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0104069180 0x7f0104193c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:42:30.746 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.746+0000 7f010b315700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0104103680 0x7f0104194170 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:42:30.746 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.746+0000 7f010b315700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0104194700 con 0x7f0104103680 2026-03-10T12:42:30.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.746+0000 7f010b315700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0104194840 con 0x7f0104069180 2026-03-10T12:42:30.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.746+0000 7f01088b0700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0104103680 0x7f0104194170 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:42:30.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.746+0000 7f01088b0700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0104103680 0x7f0104194170 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:59404/0 (socket says 192.168.123.100:59404) 2026-03-10T12:42:30.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.746+0000 7f01088b0700 1 -- 192.168.123.100:0/1808939094 learned_addr learned my addr 192.168.123.100:0/1808939094 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:42:30.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.746+0000 7f01090b1700 1 --2- 192.168.123.100:0/1808939094 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0104069180 0x7f0104193c30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:42:30.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.747+0000 7f01088b0700 1 -- 192.168.123.100:0/1808939094 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0104069180 msgr2=0x7f0104193c30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:30.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.747+0000 7f01088b0700 1 --2- 192.168.123.100:0/1808939094 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0104069180 0x7f0104193c30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.747+0000 7f01088b0700 1 -- 192.168.123.100:0/1808939094 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f00f80097e0 con 0x7f0104103680 2026-03-10T12:42:30.748 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.747+0000 7f01088b0700 1 --2- 192.168.123.100:0/1808939094 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0104103680 0x7f0104194170 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto 
rx=0x7f010000c930 tx=0x7f010000ccf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:42:30.748 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.747+0000 7f00f67fc700 1 -- 192.168.123.100:0/1808939094 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0100007ab0 con 0x7f0104103680 2026-03-10T12:42:30.749 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.748+0000 7f010b315700 1 -- 192.168.123.100:0/1808939094 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0104199300 con 0x7f0104103680 2026-03-10T12:42:30.749 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.748+0000 7f010b315700 1 -- 192.168.123.100:0/1808939094 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0104199820 con 0x7f0104103680 2026-03-10T12:42:30.749 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.748+0000 7f00f67fc700 1 -- 192.168.123.100:0/1808939094 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0100007c10 con 0x7f0104103680 2026-03-10T12:42:30.749 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.749+0000 7f00f67fc700 1 -- 192.168.123.100:0/1808939094 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0100018730 con 0x7f0104103680 2026-03-10T12:42:30.751 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.750+0000 7f00f67fc700 1 -- 192.168.123.100:0/1808939094 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0100018890 con 0x7f0104103680 2026-03-10T12:42:30.751 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.751+0000 7f010b315700 1 -- 192.168.123.100:0/1808939094 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7f00e8005320 con 0x7f0104103680 2026-03-10T12:42:30.752 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.751+0000 7f00f67fc700 1 --2- 192.168.123.100:0/1808939094 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f00f00778c0 0x7f00f0079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:42:30.752 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.751+0000 7f01090b1700 1 --2- 192.168.123.100:0/1808939094 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f00f00778c0 0x7f00f0079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:42:30.752 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.752+0000 7f00f67fc700 1 -- 192.168.123.100:0/1808939094 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(73..73 src has 1..73) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f0100099d80 con 0x7f0104103680 2026-03-10T12:42:30.752 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.752+0000 7f01090b1700 1 --2- 192.168.123.100:0/1808939094 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f00f00778c0 0x7f00f0079d70 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f00f80052d0 tx=0x7f00f8005fd0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:42:30.758 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.757+0000 7f00f67fc700 1 -- 192.168.123.100:0/1808939094 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f01000625b0 con 0x7f0104103680 2026-03-10T12:42:30.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.896+0000 7f010b315700 1 -- 192.168.123.100:0/1808939094 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] 
-- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f00e8000bf0 con 0x7f00f00778c0 2026-03-10T12:42:30.898 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.898+0000 7f00f67fc700 1 -- 192.168.123.100:0/1808939094 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f00e8000bf0 con 0x7f00f00778c0 2026-03-10T12:42:30.898 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:42:30.898 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T12:42:30.898 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true, 2026-03-10T12:42:30.898 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T12:42:30.898 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [ 2026-03-10T12:42:30.898 INFO:teuthology.orchestra.run.vm00.stdout: "crash", 2026-03-10T12:42:30.899 INFO:teuthology.orchestra.run.vm00.stdout: "mgr", 2026-03-10T12:42:30.899 INFO:teuthology.orchestra.run.vm00.stdout: "mon" 2026-03-10T12:42:30.899 INFO:teuthology.orchestra.run.vm00.stdout: ], 2026-03-10T12:42:30.899 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "11/23 daemons upgraded", 2026-03-10T12:42:30.899 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T12:42:30.899 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false 2026-03-10T12:42:30.899 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:42:30.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.901+0000 7f010b315700 1 -- 192.168.123.100:0/1808939094 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f00f00778c0 msgr2=0x7f00f0079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:30.901 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.901+0000 7f010b315700 1 --2- 192.168.123.100:0/1808939094 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f00f00778c0 0x7f00f0079d70 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f00f80052d0 tx=0x7f00f8005fd0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.901+0000 7f010b315700 1 -- 192.168.123.100:0/1808939094 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0104103680 msgr2=0x7f0104194170 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:30.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.901+0000 7f010b315700 1 --2- 192.168.123.100:0/1808939094 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0104103680 0x7f0104194170 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f010000c930 tx=0x7f010000ccf0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.901+0000 7f010b315700 1 -- 192.168.123.100:0/1808939094 shutdown_connections 2026-03-10T12:42:30.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.901+0000 7f010b315700 1 --2- 192.168.123.100:0/1808939094 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f00f00778c0 0x7f00f0079d70 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.901 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.901+0000 7f010b315700 1 --2- 192.168.123.100:0/1808939094 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0104069180 0x7f0104193c30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.902 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.901+0000 7f010b315700 1 --2- 192.168.123.100:0/1808939094 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7f0104103680 0x7f0104194170 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.902 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.901+0000 7f010b315700 1 -- 192.168.123.100:0/1808939094 >> 192.168.123.100:0/1808939094 conn(0x7f01040faa70 msgr2=0x7f01040ff7b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:42:30.902 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.901+0000 7f010b315700 1 -- 192.168.123.100:0/1808939094 shutdown_connections 2026-03-10T12:42:30.902 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.901+0000 7f010b315700 1 -- 192.168.123.100:0/1808939094 wait complete. 2026-03-10T12:42:30.980 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.979+0000 7f6056783700 1 -- 192.168.123.100:0/2233200595 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f60500fee80 msgr2=0x7f60501012a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:30.980 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.979+0000 7f6056783700 1 --2- 192.168.123.100:0/2233200595 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f60500fee80 0x7f60501012a0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f6038009b00 tx=0x7f6038009e10 comp rx=0 tx=0).stop 2026-03-10T12:42:30.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.980+0000 7f6056783700 1 -- 192.168.123.100:0/2233200595 shutdown_connections 2026-03-10T12:42:30.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.980+0000 7f6056783700 1 --2- 192.168.123.100:0/2233200595 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60501017e0 0x7f6050103c60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.980+0000 7f6056783700 1 --2- 192.168.123.100:0/2233200595 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f60500fee80 0x7f60501012a0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.980+0000 7f6056783700 1 -- 192.168.123.100:0/2233200595 >> 192.168.123.100:0/2233200595 conn(0x7f60500faa70 msgr2=0x7f60500fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:42:30.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.980+0000 7f6056783700 1 -- 192.168.123.100:0/2233200595 shutdown_connections 2026-03-10T12:42:30.981 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.980+0000 7f6056783700 1 -- 192.168.123.100:0/2233200595 wait complete. 2026-03-10T12:42:30.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.981+0000 7f6056783700 1 Processor -- start 2026-03-10T12:42:30.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.981+0000 7f6056783700 1 -- start start 2026-03-10T12:42:30.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.981+0000 7f6056783700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f60500fee80 0x7f605019c490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:42:30.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.982+0000 7f604ffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f60500fee80 0x7f605019c490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:42:30.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.982+0000 7f604ffff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f60500fee80 0x7f605019c490 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:59428/0 (socket says 192.168.123.100:59428) 2026-03-10T12:42:30.982 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.982+0000 7f604ffff700 1 -- 192.168.123.100:0/3422775410 learned_addr learned my addr 192.168.123.100:0/3422775410 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:42:30.983 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.982+0000 7f6056783700 1 --2- 192.168.123.100:0/3422775410 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60501017e0 0x7f605019c9d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:42:30.983 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.982+0000 7f6056783700 1 -- 192.168.123.100:0/3422775410 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f605019cff0 con 0x7f60500fee80 2026-03-10T12:42:30.983 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.982+0000 7f6056783700 1 -- 192.168.123.100:0/3422775410 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f605019d130 con 0x7f60501017e0 2026-03-10T12:42:30.983 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.982+0000 7f604f7fe700 1 --2- 192.168.123.100:0/3422775410 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60501017e0 0x7f605019c9d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:42:30.983 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.983+0000 7f604ffff700 1 -- 192.168.123.100:0/3422775410 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60501017e0 msgr2=0x7f605019c9d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:30.983 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.983+0000 7f604ffff700 1 --2- 192.168.123.100:0/3422775410 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60501017e0 0x7f605019c9d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:30.983 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.983+0000 7f604ffff700 1 -- 192.168.123.100:0/3422775410 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f60380097e0 con 0x7f60500fee80 2026-03-10T12:42:30.983 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.983+0000 7f604f7fe700 1 --2- 192.168.123.100:0/3422775410 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60501017e0 0x7f605019c9d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:42:30.984 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.983+0000 7f604ffff700 1 --2- 192.168.123.100:0/3422775410 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f60500fee80 0x7f605019c490 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f6038004990 tx=0x7f6038004a70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:42:30.984 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.984+0000 7f604d7fa700 1 -- 192.168.123.100:0/3422775410 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f603801d070 con 0x7f60500fee80 2026-03-10T12:42:30.984 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.984+0000 7f6056783700 1 -- 192.168.123.100:0/3422775410 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f60501a1b80 con 0x7f60500fee80 2026-03-10T12:42:30.984 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.984+0000 7f6056783700 1 -- 192.168.123.100:0/3422775410 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f60501a2070 
con 0x7f60500fee80 2026-03-10T12:42:30.985 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.984+0000 7f604d7fa700 1 -- 192.168.123.100:0/3422775410 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f603800bc50 con 0x7f60500fee80 2026-03-10T12:42:30.985 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.984+0000 7f604d7fa700 1 -- 192.168.123.100:0/3422775410 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f603800f850 con 0x7f60500fee80 2026-03-10T12:42:30.986 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.986+0000 7f604d7fa700 1 -- 192.168.123.100:0/3422775410 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f603800f9b0 con 0x7f60500fee80 2026-03-10T12:42:30.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.986+0000 7f604d7fa700 1 --2- 192.168.123.100:0/3422775410 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f603c077990 0x7f603c079e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:42:30.987 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.987+0000 7f604f7fe700 1 --2- 192.168.123.100:0/3422775410 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f603c077990 0x7f603c079e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:42:30.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.987+0000 7f604d7fa700 1 -- 192.168.123.100:0/3422775410 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(73..73 src has 1..73) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f603809c290 con 0x7f60500fee80 2026-03-10T12:42:30.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.987+0000 7f604f7fe700 1 --2- 192.168.123.100:0/3422775410 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f603c077990 0x7f603c079e40 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f6040007900 tx=0x7f6040008040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:42:30.988 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.988+0000 7f6056783700 1 -- 192.168.123.100:0/3422775410 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6030005320 con 0x7f60500fee80 2026-03-10T12:42:30.992 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:30 vm00.local ceph-mon[103263]: from='client.34302 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:42:30.992 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:30 vm00.local ceph-mon[103263]: from='client.34306 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:42:30.992 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:30 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/1959560005' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:42:30.992 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:30 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/421013425' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:42:30.994 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:30.991+0000 7f604d7fa700 1 -- 192.168.123.100:0/3422775410 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6038064bf0 con 0x7f60500fee80 2026-03-10T12:42:31.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:30 vm07.local ceph-mon[93622]: from='client.34302 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:42:31.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:30 vm07.local ceph-mon[93622]: from='client.34306 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:42:31.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:30 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/1959560005' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:42:31.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:30 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/421013425' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:42:31.170 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:31.169+0000 7f6056783700 1 -- 192.168.123.100:0/3422775410 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f6030005190 con 0x7f60500fee80 2026-03-10T12:42:31.171 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:31.171+0000 7f604d7fa700 1 -- 192.168.123.100:0/3422775410 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f6038027090 con 0x7f60500fee80 2026-03-10T12:42:31.171 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_OK 2026-03-10T12:42:31.174 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:31.174+0000 7f6056783700 1 -- 192.168.123.100:0/3422775410 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f603c077990 msgr2=0x7f603c079e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:31.174 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:31.174+0000 7f6056783700 1 --2- 192.168.123.100:0/3422775410 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f603c077990 0x7f603c079e40 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f6040007900 tx=0x7f6040008040 comp rx=0 tx=0).stop 2026-03-10T12:42:31.174 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:31.174+0000 7f6056783700 1 -- 192.168.123.100:0/3422775410 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f60500fee80 msgr2=0x7f605019c490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:42:31.175 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:31.174+0000 7f6056783700 1 --2- 192.168.123.100:0/3422775410 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f60500fee80 0x7f605019c490 secure :-1 
s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f6038004990 tx=0x7f6038004a70 comp rx=0 tx=0).stop 2026-03-10T12:42:31.175 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:31.175+0000 7f6056783700 1 -- 192.168.123.100:0/3422775410 shutdown_connections 2026-03-10T12:42:31.175 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:31.175+0000 7f6056783700 1 --2- 192.168.123.100:0/3422775410 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f603c077990 0x7f603c079e40 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:31.175 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:31.175+0000 7f6056783700 1 --2- 192.168.123.100:0/3422775410 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f60500fee80 0x7f605019c490 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:31.175 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:31.175+0000 7f6056783700 1 --2- 192.168.123.100:0/3422775410 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f60501017e0 0x7f605019c9d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:42:31.175 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:31.175+0000 7f6056783700 1 -- 192.168.123.100:0/3422775410 >> 192.168.123.100:0/3422775410 conn(0x7f60500faa70 msgr2=0x7f60500fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:42:31.176 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:31.175+0000 7f6056783700 1 -- 192.168.123.100:0/3422775410 shutdown_connections 2026-03-10T12:42:31.176 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:42:31.175+0000 7f6056783700 1 -- 192.168.123.100:0/3422775410 wait complete. 
2026-03-10T12:42:32.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:31 vm07.local ceph-mon[93622]: from='client.34310 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:42:32.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:31 vm07.local ceph-mon[93622]: pgmap v137: 65 pgs: 65 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail 2026-03-10T12:42:32.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:31 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:32.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:31 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:42:32.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:31 vm07.local ceph-mon[93622]: from='client.34322 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:42:32.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:31 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/3422775410' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:42:32.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:31 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T12:42:32.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:31 vm00.local ceph-mon[103263]: from='client.34310 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:42:32.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:31 vm00.local ceph-mon[103263]: pgmap v137: 65 pgs: 65 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail 2026-03-10T12:42:32.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:31 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:32.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:31 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:42:32.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:31 vm00.local ceph-mon[103263]: from='client.34322 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:42:32.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:31 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/3422775410' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:42:32.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:31 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T12:42:33.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:32 vm07.local systemd[1]: Stopping Ceph osd.5 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 2026-03-10T12:42:33.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:32 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[75277]: 2026-03-10T12:42:32.880+0000 7f525d957700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T12:42:33.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:32 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[75277]: 2026-03-10T12:42:32.880+0000 7f525d957700 -1 osd.5 73 *** Got signal Terminated *** 2026-03-10T12:42:33.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:32 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[75277]: 2026-03-10T12:42:32.880+0000 7f525d957700 -1 osd.5 73 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T12:42:33.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:32 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T12:42:33.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:32 vm07.local ceph-mon[93622]: Upgrade: osd.5 is safe to restart 2026-03-10T12:42:33.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:32 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:33.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:32 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T12:42:33.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:32 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:42:33.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:32 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T12:42:33.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:32 vm00.local ceph-mon[103263]: Upgrade: osd.5 is safe to restart 2026-03-10T12:42:33.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:32 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:33.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:32 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T12:42:33.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:32 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:42:34.050 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:33 vm07.local ceph-mon[93622]: Upgrade: Updating osd.5 2026-03-10T12:42:34.050 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:33 vm07.local ceph-mon[93622]: Deploying daemon osd.5 on vm07 2026-03-10T12:42:34.050 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:33 vm07.local ceph-mon[93622]: pgmap v138: 65 pgs: 65 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail 2026-03-10T12:42:34.050 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:33 vm07.local ceph-mon[93622]: osd.5 marked itself down and dead 2026-03-10T12:42:34.050 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:33 vm07.local podman[106904]: 2026-03-10 12:42:33.865550536 +0000 UTC m=+0.996524443 container died 277bdfe4ec551d9296d9cc2c5eb1d7550ac7885dc91f64202afcb80f77927d17 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.build-date=20231212, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308, maintainer=Guillaume Abrioux , GIT_CLEAN=True, ceph=True, io.buildah.version=1.29.1, org.label-schema.name=CentOS Stream 8 Base Image, RELEASE=HEAD) 2026-03-10T12:42:34.050 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:33 vm07.local podman[106904]: 2026-03-10 12:42:33.886930258 +0000 UTC m=+1.017904164 container remove 277bdfe4ec551d9296d9cc2c5eb1d7550ac7885dc91f64202afcb80f77927d17 (image=quay.io/ceph/ceph@sha256:1793ff3af6ae74527c86e1a0b22401e9c42dc08d0ebb8379653be07db17d0007, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5, org.label-schema.license=GPLv2, RELEASE=HEAD, ceph=True, io.buildah.version=1.29.1, org.label-schema.name=CentOS Stream 8 Base Image, maintainer=Guillaume Abrioux , org.label-schema.build-date=20231212, org.label-schema.vendor=CentOS, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.0, GIT_BRANCH=HEAD, GIT_COMMIT=0396eef90bef641b676c164ec7a3876f45010308) 2026-03-10T12:42:34.050 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:33 vm07.local bash[106904]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5 2026-03-10T12:42:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:33 vm00.local ceph-mon[103263]: Upgrade: Updating osd.5 2026-03-10T12:42:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:33 vm00.local ceph-mon[103263]: Deploying daemon osd.5 on vm07 2026-03-10T12:42:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:33 vm00.local ceph-mon[103263]: pgmap v138: 65 pgs: 65 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail 2026-03-10T12:42:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:33 vm00.local ceph-mon[103263]: osd.5 marked itself down and dead 2026-03-10T12:42:34.318 
INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local podman[106972]: 2026-03-10 12:42:34.049866069 +0000 UTC m=+0.017641724 container create 63cf7dd24f0adb38aa910d50095d2a9022e560dde786127c81672a6387415a09 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-deactivate, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T12:42:34.318 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local podman[106972]: 2026-03-10 12:42:34.099885021 +0000 UTC m=+0.067660687 container init 63cf7dd24f0adb38aa910d50095d2a9022e560dde786127c81672a6387415a09 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-deactivate, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T12:42:34.318 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local podman[106972]: 2026-03-10 12:42:34.103174189 +0000 UTC m=+0.070949844 container start 63cf7dd24f0adb38aa910d50095d2a9022e560dde786127c81672a6387415a09 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, OSD_FLAVOR=default) 2026-03-10T12:42:34.318 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local podman[106972]: 2026-03-10 12:42:34.108175611 +0000 UTC m=+0.075951266 container attach 63cf7dd24f0adb38aa910d50095d2a9022e560dde786127c81672a6387415a09 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, 
FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T12:42:34.318 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local podman[106972]: 2026-03-10 12:42:34.042855607 +0000 UTC m=+0.010631262 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:42:34.318 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local podman[106992]: 2026-03-10 12:42:34.268657319 +0000 UTC m=+0.014364821 container died 63cf7dd24f0adb38aa910d50095d2a9022e560dde786127c81672a6387415a09 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-deactivate, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) 2026-03-10T12:42:34.318 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local podman[106992]: 2026-03-10 12:42:34.286048093 +0000 UTC m=+0.031755595 container remove 63cf7dd24f0adb38aa910d50095d2a9022e560dde786127c81672a6387415a09 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-deactivate, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.build-date=20260223) 2026-03-10T12:42:34.318 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.5.service: Deactivated successfully. 2026-03-10T12:42:34.318 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local systemd[1]: Stopped Ceph osd.5 for 1a52002a-1c7d-11f1-af82-51cdd81caea8. 2026-03-10T12:42:34.318 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local systemd[1]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.5.service: Consumed 43.649s CPU time. 2026-03-10T12:42:34.609 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local systemd[1]: Starting Ceph osd.5 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 
2026-03-10T12:42:35.066 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local podman[107078]: 2026-03-10 12:42:34.60872098 +0000 UTC m=+0.017551255 container create fb688bd2b192c4eb9b8ad6a1754096a7fe3ca5386dcfe26f844218199f3e5764 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T12:42:35.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local podman[107078]: 2026-03-10 12:42:34.656418366 +0000 UTC m=+0.065248651 container init fb688bd2b192c4eb9b8ad6a1754096a7fe3ca5386dcfe26f844218199f3e5764 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, 
OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T12:42:35.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local podman[107078]: 2026-03-10 12:42:34.660111249 +0000 UTC m=+0.068941524 container start fb688bd2b192c4eb9b8ad6a1754096a7fe3ca5386dcfe26f844218199f3e5764 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0) 2026-03-10T12:42:35.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local podman[107078]: 2026-03-10 12:42:34.663133837 +0000 UTC m=+0.071964102 container attach fb688bd2b192c4eb9b8ad6a1754096a7fe3ca5386dcfe26f844218199f3e5764 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, 
org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T12:42:35.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local podman[107078]: 2026-03-10 12:42:34.601586245 +0000 UTC m=+0.010416520 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:42:35.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate[107088]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:42:35.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local bash[107078]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:42:35.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate[107088]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:42:35.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:34 vm07.local bash[107078]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:42:35.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:34 vm07.local ceph-mon[93622]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T12:42:35.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:34 vm07.local ceph-mon[93622]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-10T12:42:35.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:34 vm07.local ceph-mon[93622]: osdmap e74: 6 total, 5 up, 6 in 2026-03-10T12:42:35.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:34 vm00.local ceph-mon[103263]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T12:42:35.234 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:34 vm00.local ceph-mon[103263]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-10T12:42:35.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:34 vm00.local ceph-mon[103263]: osdmap e74: 6 total, 5 up, 6 in 2026-03-10T12:42:35.566 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate[107088]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T12:42:35.566 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate[107088]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:42:35.566 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local bash[107078]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T12:42:35.566 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local bash[107078]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:42:35.566 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate[107088]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:42:35.566 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local bash[107078]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T12:42:35.566 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate[107088]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T12:42:35.566 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local bash[107078]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T12:42:35.566 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local 
ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate[107088]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-3fcea972-e6a0-47cb-a68d-57080d86096a/osd-block-d91585d7-d879-48f1-8fdf-f6c88a82428a --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-10T12:42:35.566 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local bash[107078]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-3fcea972-e6a0-47cb-a68d-57080d86096a/osd-block-d91585d7-d879-48f1-8fdf-f6c88a82428a --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-10T12:42:36.066 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate[107088]: Running command: /usr/bin/ln -snf /dev/ceph-3fcea972-e6a0-47cb-a68d-57080d86096a/osd-block-d91585d7-d879-48f1-8fdf-f6c88a82428a /var/lib/ceph/osd/ceph-5/block 2026-03-10T12:42:36.066 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local bash[107078]: Running command: /usr/bin/ln -snf /dev/ceph-3fcea972-e6a0-47cb-a68d-57080d86096a/osd-block-d91585d7-d879-48f1-8fdf-f6c88a82428a /var/lib/ceph/osd/ceph-5/block 2026-03-10T12:42:36.066 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate[107088]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-10T12:42:36.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local bash[107078]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-10T12:42:36.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate[107088]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T12:42:36.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local bash[107078]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T12:42:36.067 
INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate[107088]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T12:42:36.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local bash[107078]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T12:42:36.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate[107088]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-10T12:42:36.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local bash[107078]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-10T12:42:36.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local podman[107078]: 2026-03-10 12:42:35.628801016 +0000 UTC m=+1.037631291 container died fb688bd2b192c4eb9b8ad6a1754096a7fe3ca5386dcfe26f844218199f3e5764 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T12:42:36.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local podman[107078]: 2026-03-10 12:42:35.646711422 +0000 UTC m=+1.055541697 container remove 
fb688bd2b192c4eb9b8ad6a1754096a7fe3ca5386dcfe26f844218199f3e5764 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T12:42:36.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local podman[107346]: 2026-03-10 12:42:35.757810891 +0000 UTC m=+0.017823275 container create bd169bf008349682e5af0c95f71fd72efadde2c22c5f54dcc2cc6420647631c4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , ceph=True) 2026-03-10T12:42:36.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local 
podman[107346]: 2026-03-10 12:42:35.794021064 +0000 UTC m=+0.054033447 container init bd169bf008349682e5af0c95f71fd72efadde2c22c5f54dcc2cc6420647631c4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260223) 2026-03-10T12:42:36.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local podman[107346]: 2026-03-10 12:42:35.80313274 +0000 UTC m=+0.063145123 container start bd169bf008349682e5af0c95f71fd72efadde2c22c5f54dcc2cc6420647631c4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223) 
2026-03-10T12:42:36.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local bash[107346]: bd169bf008349682e5af0c95f71fd72efadde2c22c5f54dcc2cc6420647631c4 2026-03-10T12:42:36.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local podman[107346]: 2026-03-10 12:42:35.750659014 +0000 UTC m=+0.010671407 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:42:36.067 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:35 vm07.local systemd[1]: Started Ceph osd.5 for 1a52002a-1c7d-11f1-af82-51cdd81caea8. 2026-03-10T12:42:36.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:35 vm07.local ceph-mon[93622]: pgmap v140: 65 pgs: 13 stale+active+clean, 52 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail 2026-03-10T12:42:36.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:35 vm07.local ceph-mon[93622]: osdmap e75: 6 total, 5 up, 6 in 2026-03-10T12:42:36.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:35 vm00.local ceph-mon[103263]: pgmap v140: 65 pgs: 13 stale+active+clean, 52 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail 2026-03-10T12:42:36.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:35 vm00.local ceph-mon[103263]: osdmap e75: 6 total, 5 up, 6 in 2026-03-10T12:42:36.816 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:36 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[107356]: 2026-03-10T12:42:36.646+0000 7f765540b740 -1 Falling back to public interface 2026-03-10T12:42:37.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:37.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 
2026-03-10T12:42:37.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:42:37.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:36 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:37.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:37.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:37.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:42:37.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:36 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:38.035 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:37 vm07.local ceph-mon[93622]: pgmap v142: 65 pgs: 3 active+undersized, 11 stale+active+clean, 51 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail 2026-03-10T12:42:38.036 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:38.036 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:38.036 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 
2026-03-10T12:42:38.036 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:38.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:37 vm00.local ceph-mon[103263]: pgmap v142: 65 pgs: 3 active+undersized, 11 stale+active+clean, 51 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail 2026-03-10T12:42:38.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:38.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:38.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:38.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:39.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: pgmap v143: 65 pgs: 14 active+undersized, 14 active+undersized+degraded, 37 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 51/333 objects degraded (15.315%) 2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 
cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: Upgrade: Setting container_image for all osd 2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:42:39.985 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch
2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]': finished
2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch
2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished
2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch
2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished
2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch
2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished
2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch
2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished
2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch
2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished
2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: Upgrade: Setting require_osd_release to 19 squid
2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch
2026-03-10T12:42:39.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:39 vm00.local ceph-mon[103263]: Health check failed: Degraded data redundancy: 51/333 objects degraded (15.315%), 14 pgs degraded (PG_DEGRADED)
2026-03-10T12:42:40.066 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[107356]: 2026-03-10T12:42:39.868+0000 7f765540b740 -1 osd.5 0 read_superblock omap replica is missing.
2026-03-10T12:42:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: pgmap v143: 65 pgs: 14 active+undersized, 14 active+undersized+degraded, 37 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 51/333 objects degraded (15.315%)
2026-03-10T12:42:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:42:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:42:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:42:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: Upgrade: Setting container_image for all osd
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]': finished
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: Upgrade: Setting require_osd_release to 19 squid
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch
2026-03-10T12:42:40.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:39 vm07.local ceph-mon[93622]: Health check failed: Degraded data redundancy: 51/333 objects degraded (15.315%), 14 pgs degraded (PG_DEGRADED)
2026-03-10T12:42:40.566 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:40 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[107356]: 2026-03-10T12:42:40.105+0000 7f765540b740 -1 osd.5 73 log_to_monitors true
2026-03-10T12:42:41.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:40 vm07.local ceph-mon[93622]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid)
2026-03-10T12:42:41.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:40 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished
2026-03-10T12:42:41.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:40 vm07.local ceph-mon[93622]: osdmap e76: 6 total, 5 up, 6 in
2026-03-10T12:42:41.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:40 vm07.local ceph-mon[93622]: Upgrade: Scaling down filesystem cephfs
2026-03-10T12:42:41.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:40 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:41.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:40 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch
2026-03-10T12:42:41.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:40 vm07.local ceph-mon[93622]: from='osd.5 [v2:192.168.123.107:6816/1623232774,v1:192.168.123.107:6817/1623232774]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch
2026-03-10T12:42:41.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:40 vm07.local ceph-mon[93622]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch
2026-03-10T12:42:41.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:40 vm00.local ceph-mon[103263]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid)
2026-03-10T12:42:41.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:40 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished
2026-03-10T12:42:41.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:40 vm00.local ceph-mon[103263]: osdmap e76: 6 total, 5 up, 6 in
2026-03-10T12:42:41.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:40 vm00.local ceph-mon[103263]: Upgrade: Scaling down filesystem cephfs
2026-03-10T12:42:41.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:40 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:41.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:40 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch
2026-03-10T12:42:41.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:40 vm00.local ceph-mon[103263]: from='osd.5 [v2:192.168.123.107:6816/1623232774,v1:192.168.123.107:6817/1623232774]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch
2026-03-10T12:42:41.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:40 vm00.local ceph-mon[103263]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: pgmap v145: 65 pgs: 14 active+undersized, 14 active+undersized+degraded, 37 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 51/333 objects degraded (15.315%)
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: stopping daemon mds.cephfs.vm00.lnokoe
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: from='osd.5 [v2:192.168.123.107:6816/1623232774,v1:192.168.123.107:6817/1623232774]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: osdmap e77: 6 total, 5 up, 6 in
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:stopping} 2 up:standby
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:42.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: pgmap v145: 65 pgs: 14 active+undersized, 14 active+undersized+degraded, 37 active+clean; 269 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 51/333 objects degraded (15.315%)
2026-03-10T12:42:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished
2026-03-10T12:42:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished
2026-03-10T12:42:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: stopping daemon mds.cephfs.vm00.lnokoe
2026-03-10T12:42:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: from='osd.5 [v2:192.168.123.107:6816/1623232774,v1:192.168.123.107:6817/1623232774]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch
2026-03-10T12:42:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby
2026-03-10T12:42:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: osdmap e77: 6 total, 5 up, 6 in
2026-03-10T12:42:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm07", "root=default"]}]: dispatch
2026-03-10T12:42:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:42:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:stopping} 2 up:standby
2026-03-10T12:42:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:42.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:42.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:42:42.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:42:42.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:42.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:42:42.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:42.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:42.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:42.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:42.988 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:42:42 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[107356]: 2026-03-10T12:42:42.651+0000 7f764c9a4640 -1 osd.5 73 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-10T12:42:43.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:42 vm07.local ceph-mon[93622]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS
2026-03-10T12:42:43.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:42 vm07.local ceph-mon[93622]: from='osd.5 ' entity='osd.5'
2026-03-10T12:42:43.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:42 vm07.local ceph-mon[93622]: pgmap v147: 65 pgs: 14 active+undersized, 14 active+undersized+degraded, 37 active+clean; 269 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 51/333 objects degraded (15.315%)
2026-03-10T12:42:43.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:42 vm00.local ceph-mon[103263]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS
2026-03-10T12:42:43.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:42 vm00.local ceph-mon[103263]: from='osd.5 ' entity='osd.5'
2026-03-10T12:42:43.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:42 vm00.local ceph-mon[103263]: pgmap v147: 65 pgs: 14 active+undersized, 14 active+undersized+degraded, 37 active+clean; 269 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 51/333 objects degraded (15.315%)
2026-03-10T12:42:44.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:43 vm07.local ceph-mon[93622]: Health check cleared: OSD_DOWN (was: 1 osds down)
2026-03-10T12:42:44.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:43 vm07.local ceph-mon[93622]: osd.5 [v2:192.168.123.107:6816/1623232774,v1:192.168.123.107:6817/1623232774] boot
2026-03-10T12:42:44.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:43 vm07.local ceph-mon[93622]: osdmap e78: 6 total, 6 up, 6 in
2026-03-10T12:42:44.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:43 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-10T12:42:44.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:43 vm00.local ceph-mon[103263]: Health check cleared: OSD_DOWN (was: 1 osds down)
2026-03-10T12:42:44.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:43 vm00.local ceph-mon[103263]: osd.5 [v2:192.168.123.107:6816/1623232774,v1:192.168.123.107:6817/1623232774] boot
2026-03-10T12:42:44.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:43 vm00.local ceph-mon[103263]: osdmap e78: 6 total, 6 up, 6 in
2026-03-10T12:42:44.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:43 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-10T12:42:45.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:45 vm00.local ceph-mon[103263]: osdmap e79: 6 total, 6 up, 6 in
2026-03-10T12:42:45.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:45 vm00.local ceph-mon[103263]: pgmap v150: 65 pgs: 1 peering, 14 active+undersized, 13 active+undersized+degraded, 37 active+clean; 269 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 49/333 objects degraded (14.715%)
2026-03-10T12:42:46.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:45 vm07.local ceph-mon[93622]: osdmap e79: 6 total, 6 up, 6 in
2026-03-10T12:42:46.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:45 vm07.local ceph-mon[93622]: pgmap v150: 65 pgs: 1 peering, 14 active+undersized, 13 active+undersized+degraded, 37 active+clean; 269 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 49/333 objects degraded (14.715%)
2026-03-10T12:42:46.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:46 vm00.local ceph-mon[103263]: Health check update: Degraded data redundancy: 49/333 objects degraded (14.715%), 13 pgs degraded (PG_DEGRADED)
2026-03-10T12:42:46.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:46 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:46.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:46 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:42:47.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:46 vm07.local ceph-mon[93622]: Health check update: Degraded data redundancy: 49/333 objects degraded (14.715%), 13 pgs degraded (PG_DEGRADED)
2026-03-10T12:42:47.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:46 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:47.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:46 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:42:47.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:47 vm00.local ceph-mon[103263]: pgmap v151: 65 pgs: 1 peering, 5 active+undersized, 8 active+undersized+degraded, 51 active+clean; 269 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 170 B/s wr, 0 op/s; 34/333 objects degraded (10.210%)
2026-03-10T12:42:48.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:47 vm07.local ceph-mon[93622]: pgmap v151: 65 pgs: 1 peering, 5 active+undersized, 8 active+undersized+degraded, 51 active+clean; 269 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 170 B/s wr, 0 op/s; 34/333 objects degraded (10.210%)
2026-03-10T12:42:49.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:48 vm07.local ceph-mon[93622]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 34/333 objects degraded (10.210%), 8 pgs degraded)
2026-03-10T12:42:49.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:48 vm07.local ceph-mon[93622]: Cluster is now healthy
2026-03-10T12:42:49.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:48 vm00.local ceph-mon[103263]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 34/333 objects degraded (10.210%), 8 pgs degraded)
2026-03-10T12:42:49.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:48 vm00.local ceph-mon[103263]: Cluster is now healthy
2026-03-10T12:42:49.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:49 vm00.local ceph-mon[103263]: pgmap v152: 65 pgs: 65 active+clean; 270 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 88 KiB/s wr, 35 op/s
2026-03-10T12:42:50.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:49 vm07.local ceph-mon[93622]: pgmap v152: 65 pgs: 65 active+clean; 270 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 88 KiB/s wr, 35 op/s
2026-03-10T12:42:52.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:51 vm07.local ceph-mon[93622]: pgmap v153: 65 pgs: 65 active+clean; 270 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 86 KiB/s wr, 34 op/s
2026-03-10T12:42:52.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:51 vm07.local ceph-mon[93622]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:stopping} 2 up:standby
2026-03-10T12:42:52.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:51 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:42:52.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:51 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:42:52.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:51 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:42:52.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:51 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:52.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:51 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:42:52.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:51 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:52.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:51 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:52.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:51 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:52.119 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:51 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:52.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:51 vm00.local ceph-mon[103263]: pgmap v153: 65 pgs: 65 active+clean; 270 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 86 KiB/s wr, 34 op/s
2026-03-10T12:42:52.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:51 vm00.local ceph-mon[103263]: fsmap cephfs:2 {0=cephfs.vm07.wznhgu=up:active,1=cephfs.vm00.lnokoe=up:stopping} 2 up:standby
2026-03-10T12:42:52.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:51 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:42:52.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:51 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:42:52.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:51 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:42:52.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:51 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:52.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:51 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:42:52.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:51 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:52.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:51 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:52.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:51 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:52.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:51 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:42:53.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:52 vm00.local ceph-mon[103263]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS
2026-03-10T12:42:53.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:52 vm07.local ceph-mon[93622]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS
2026-03-10T12:42:54.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:53 vm00.local ceph-mon[103263]: pgmap v154: 65 pgs: 65 active+clean; 270 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 76 KiB/s wr, 30 op/s
2026-03-10T12:42:54.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:53 vm07.local ceph-mon[93622]: pgmap v154: 65 pgs: 65 active+clean; 270 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 76 KiB/s wr, 30 op/s
2026-03-10T12:42:56.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:55 vm00.local ceph-mon[103263]: pgmap v155: 65 pgs: 65 active+clean; 270 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 3.0 KiB/s rd, 116 KiB/s wr, 54 op/s
2026-03-10T12:42:56.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:55 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:56.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:55 vm07.local ceph-mon[93622]: pgmap v155: 65 pgs: 65 active+clean; 270 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 3.0 KiB/s rd, 116 KiB/s wr, 54 op/s
2026-03-10T12:42:56.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:55 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:42:58.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:57 vm00.local ceph-mon[103263]: pgmap v156: 65 pgs: 65 active+clean; 270 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 2.5 KiB/s rd, 97 KiB/s wr, 45 op/s
2026-03-10T12:42:58.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:57 vm07.local ceph-mon[93622]: pgmap v156: 65 pgs: 65 active+clean; 270 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 2.5 KiB/s rd, 97 KiB/s wr, 45 op/s
2026-03-10T12:42:59.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:42:59 vm00.local ceph-mon[103263]: pgmap v157: 65 pgs: 65 active+clean; 270 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 2.5 KiB/s rd, 98 KiB/s wr, 45 op/s
2026-03-10T12:43:00.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:42:59 vm07.local ceph-mon[93622]: pgmap v157: 65 pgs: 65 active+clean; 270 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 2.5 KiB/s rd, 98 KiB/s wr, 45 op/s
2026-03-10T12:43:01.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.248+0000 7f451c4a6700 1 -- 192.168.123.100:0/969954865 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4514102760 msgr2=0x7f4514102b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:01.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.248+0000 7f451c4a6700 1 --2- 192.168.123.100:0/969954865 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4514102760 0x7f4514102b70 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f4508009b00 tx=0x7f4508009e10 comp rx=0 tx=0).stop
2026-03-10T12:43:01.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.249+0000 7f451c4a6700 1 -- 192.168.123.100:0/969954865 shutdown_connections
2026-03-10T12:43:01.250 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.249+0000 7f451c4a6700 1 --2- 192.168.123.100:0/969954865 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4514103960 0x7f4514103db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:01.250 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.249+0000 7f451c4a6700 1 --2- 192.168.123.100:0/969954865 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4514102760 0x7f4514102b70 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:01.250 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.249+0000 7f451c4a6700 1 -- 192.168.123.100:0/969954865 >> 192.168.123.100:0/969954865 conn(0x7f45140fdcf0 msgr2=0x7f4514100140 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:43:01.250 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.249+0000 7f451c4a6700 1 -- 192.168.123.100:0/969954865 shutdown_connections
2026-03-10T12:43:01.250 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.249+0000 7f451c4a6700 1 -- 192.168.123.100:0/969954865 wait complete.
2026-03-10T12:43:01.250 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.249+0000 7f451c4a6700 1 Processor -- start
2026-03-10T12:43:01.250 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.250+0000 7f451c4a6700 1 -- start start
2026-03-10T12:43:01.250 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.250+0000 7f451c4a6700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4514103960 0x7f4514198290 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:01.251 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.250+0000 7f451c4a6700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f45141987d0 0x7f451419d840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:01.251 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.251+0000 7f4519a41700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f45141987d0 0x7f451419d840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:01.251 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.251+0000 7f451a242700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4514103960 0x7f4514198290 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:01.251 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.251+0000 7f451a242700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4514103960 0x7f4514198290 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:45590/0 (socket says 192.168.123.100:45590)
2026-03-10T12:43:01.251
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.251+0000 7f451a242700 1 -- 192.168.123.100:0/4281497644 learned_addr learned my addr 192.168.123.100:0/4281497644 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:43:01.251 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.250+0000 7f451c4a6700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4514198cd0 con 0x7f45141987d0 2026-03-10T12:43:01.251 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.251+0000 7f451c4a6700 1 -- 192.168.123.100:0/4281497644 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4514198e40 con 0x7f4514103960 2026-03-10T12:43:01.252 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.251+0000 7f4519a41700 1 -- 192.168.123.100:0/4281497644 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4514103960 msgr2=0x7f4514198290 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:01.252 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.251+0000 7f4519a41700 1 --2- 192.168.123.100:0/4281497644 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4514103960 0x7f4514198290 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:01.252 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.251+0000 7f4519a41700 1 -- 192.168.123.100:0/4281497644 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f45080097e0 con 0x7f45141987d0 2026-03-10T12:43:01.252 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.252+0000 7f4519a41700 1 --2- 192.168.123.100:0/4281497644 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f45141987d0 0x7f451419d840 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f451000eab0 tx=0x7f451000ee70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:43:01.252 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.252+0000 7f45077fe700 1 -- 192.168.123.100:0/4281497644 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f451000cbe0 con 0x7f45141987d0 2026-03-10T12:43:01.252 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.252+0000 7f45077fe700 1 -- 192.168.123.100:0/4281497644 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f451000cd40 con 0x7f45141987d0 2026-03-10T12:43:01.252 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.252+0000 7f451c4a6700 1 -- 192.168.123.100:0/4281497644 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f451419dde0 con 0x7f45141987d0 2026-03-10T12:43:01.253 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.253+0000 7f45077fe700 1 -- 192.168.123.100:0/4281497644 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4510018860 con 0x7f45141987d0 2026-03-10T12:43:01.253 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.253+0000 7f451c4a6700 1 -- 192.168.123.100:0/4281497644 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f451419e300 con 0x7f45141987d0 2026-03-10T12:43:01.254 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.254+0000 7f45077fe700 1 -- 192.168.123.100:0/4281497644 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4510018af0 con 0x7f45141987d0 2026-03-10T12:43:01.255 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.254+0000 7f451c4a6700 1 -- 192.168.123.100:0/4281497644 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4514066e40 con 0x7f45141987d0 2026-03-10T12:43:01.258 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.256+0000 7f45077fe700 1 --2- 192.168.123.100:0/4281497644 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4500077660 0x7f4500079b10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:01.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.256+0000 7f451a242700 1 --2- 192.168.123.100:0/4281497644 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4500077660 0x7f4500079b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:01.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.256+0000 7f451a242700 1 --2- 192.168.123.100:0/4281497644 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4500077660 0x7f4500079b10 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f45080051d0 tx=0x7f450801a040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:43:01.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.256+0000 7f45077fe700 1 -- 192.168.123.100:0/4281497644 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f4510014070 con 0x7f45141987d0 2026-03-10T12:43:01.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.258+0000 7f45077fe700 1 -- 192.168.123.100:0/4281497644 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f451009f050 con 0x7f45141987d0 2026-03-10T12:43:01.398 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.397+0000 7f451c4a6700 1 -- 192.168.123.100:0/4281497644 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", 
"target": ["mon-mgr", ""]}) v1 -- 0x7f45141082b0 con 0x7f4500077660 2026-03-10T12:43:01.399 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.399+0000 7f45077fe700 1 -- 192.168.123.100:0/4281497644 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f45141082b0 con 0x7f4500077660 2026-03-10T12:43:01.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.401+0000 7f451c4a6700 1 -- 192.168.123.100:0/4281497644 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4500077660 msgr2=0x7f4500079b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:01.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.401+0000 7f451c4a6700 1 --2- 192.168.123.100:0/4281497644 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4500077660 0x7f4500079b10 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f45080051d0 tx=0x7f450801a040 comp rx=0 tx=0).stop 2026-03-10T12:43:01.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.402+0000 7f451c4a6700 1 -- 192.168.123.100:0/4281497644 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f45141987d0 msgr2=0x7f451419d840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:01.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.402+0000 7f451c4a6700 1 --2- 192.168.123.100:0/4281497644 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f45141987d0 0x7f451419d840 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f451000eab0 tx=0x7f451000ee70 comp rx=0 tx=0).stop 2026-03-10T12:43:01.403 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.402+0000 7f451c4a6700 1 -- 192.168.123.100:0/4281497644 shutdown_connections 2026-03-10T12:43:01.403 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.402+0000 7f451c4a6700 1 --2- 192.168.123.100:0/4281497644 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4500077660 0x7f4500079b10 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:01.403 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.402+0000 7f451c4a6700 1 --2- 192.168.123.100:0/4281497644 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4514103960 0x7f4514198290 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:01.403 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.402+0000 7f451c4a6700 1 --2- 192.168.123.100:0/4281497644 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f45141987d0 0x7f451419d840 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:01.403 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.402+0000 7f451c4a6700 1 -- 192.168.123.100:0/4281497644 >> 192.168.123.100:0/4281497644 conn(0x7f45140fdcf0 msgr2=0x7f4514106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:43:01.403 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.402+0000 7f451c4a6700 1 -- 192.168.123.100:0/4281497644 shutdown_connections 2026-03-10T12:43:01.403 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.403+0000 7f451c4a6700 1 -- 192.168.123.100:0/4281497644 wait complete. 
2026-03-10T12:43:01.413 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:43:01.478 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.478+0000 7f387b208700 1 -- 192.168.123.100:0/1830124350 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3874105a60 msgr2=0x7f3874107e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:01.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.478+0000 7f387b208700 1 --2- 192.168.123.100:0/1830124350 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3874105a60 0x7f3874107e40 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f3868009a60 tx=0x7f3868009d70 comp rx=0 tx=0).stop 2026-03-10T12:43:01.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.478+0000 7f387b208700 1 -- 192.168.123.100:0/1830124350 shutdown_connections 2026-03-10T12:43:01.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.478+0000 7f387b208700 1 --2- 192.168.123.100:0/1830124350 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3874105a60 0x7f3874107e40 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:01.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.478+0000 7f387b208700 1 --2- 192.168.123.100:0/1830124350 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38740691a0 0x7f3874105520 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:01.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.478+0000 7f387b208700 1 -- 192.168.123.100:0/1830124350 >> 192.168.123.100:0/1830124350 conn(0x7f38740faa70 msgr2=0x7f38740fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:43:01.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.478+0000 7f387b208700 1 -- 192.168.123.100:0/1830124350 shutdown_connections 2026-03-10T12:43:01.479 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.478+0000 7f387b208700 1 -- 192.168.123.100:0/1830124350 wait complete. 2026-03-10T12:43:01.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.479+0000 7f387b208700 1 Processor -- start 2026-03-10T12:43:01.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.479+0000 7f387b208700 1 -- start start 2026-03-10T12:43:01.480 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.479+0000 7f387b208700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38740691a0 0x7f38741980a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:01.480 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.479+0000 7f387b208700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3874105a60 0x7f38741985e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:01.480 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.479+0000 7f387b208700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3874198c00 con 0x7f38740691a0 2026-03-10T12:43:01.480 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.479+0000 7f387b208700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3874198d40 con 0x7f3874105a60 2026-03-10T12:43:01.480 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.480+0000 7f3873fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3874105a60 0x7f38741985e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:01.480 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.480+0000 7f3878fa4700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38740691a0 0x7f38741980a0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:01.480 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.480+0000 7f3878fa4700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38740691a0 0x7f38741980a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:36998/0 (socket says 192.168.123.100:36998) 2026-03-10T12:43:01.480 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.480+0000 7f3878fa4700 1 -- 192.168.123.100:0/3971746318 learned_addr learned my addr 192.168.123.100:0/3971746318 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:43:01.481 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.480+0000 7f3878fa4700 1 -- 192.168.123.100:0/3971746318 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3874105a60 msgr2=0x7f38741985e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:01.481 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.480+0000 7f3878fa4700 1 --2- 192.168.123.100:0/3971746318 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3874105a60 0x7f38741985e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:01.481 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.480+0000 7f3878fa4700 1 -- 192.168.123.100:0/3971746318 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f38640097e0 con 0x7f38740691a0 2026-03-10T12:43:01.481 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.481+0000 7f3878fa4700 1 --2- 192.168.123.100:0/3971746318 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38740691a0 0x7f38741980a0 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f386400efd0 
tx=0x7f386400c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:43:01.481 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.481+0000 7f3871ffb700 1 -- 192.168.123.100:0/3971746318 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3864009e70 con 0x7f38740691a0 2026-03-10T12:43:01.481 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.481+0000 7f3871ffb700 1 -- 192.168.123.100:0/3971746318 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3864004500 con 0x7f38740691a0 2026-03-10T12:43:01.482 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.481+0000 7f387b208700 1 -- 192.168.123.100:0/3971746318 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3868009710 con 0x7f38740691a0 2026-03-10T12:43:01.482 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.481+0000 7f3871ffb700 1 -- 192.168.123.100:0/3971746318 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f38640106d0 con 0x7f38740691a0 2026-03-10T12:43:01.482 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.481+0000 7f387b208700 1 -- 192.168.123.100:0/3971746318 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f387419db60 con 0x7f38740691a0 2026-03-10T12:43:01.483 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.483+0000 7f3871ffb700 1 -- 192.168.123.100:0/3971746318 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3864010830 con 0x7f38740691a0 2026-03-10T12:43:01.484 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.483+0000 7f3871ffb700 1 --2- 192.168.123.100:0/3971746318 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f385c077870 0x7f385c079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:01.484 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.484+0000 7f3873fff700 1 --2- 192.168.123.100:0/3971746318 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f385c077870 0x7f385c079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:01.484 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.484+0000 7f3871ffb700 1 -- 192.168.123.100:0/3971746318 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f3864014070 con 0x7f38740691a0 2026-03-10T12:43:01.485 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.484+0000 7f3873fff700 1 --2- 192.168.123.100:0/3971746318 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f385c077870 0x7f385c079d20 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f38680038c0 tx=0x7f386800b540 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:43:01.485 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.484+0000 7f387b208700 1 -- 192.168.123.100:0/3971746318 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f38741921e0 con 0x7f38740691a0 2026-03-10T12:43:01.488 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.488+0000 7f3871ffb700 1 -- 192.168.123.100:0/3971746318 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f38640639e0 con 0x7f38740691a0 2026-03-10T12:43:01.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.636+0000 7f387b208700 1 -- 192.168.123.100:0/3971746318 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: 
{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f387402cf50 con 0x7f385c077870 2026-03-10T12:43:01.644 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.644+0000 7f3871ffb700 1 -- 192.168.123.100:0/3971746318 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f387402cf50 con 0x7f385c077870 2026-03-10T12:43:01.648 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.647+0000 7f387b208700 1 -- 192.168.123.100:0/3971746318 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f385c077870 msgr2=0x7f385c079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:01.649 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.647+0000 7f387b208700 1 --2- 192.168.123.100:0/3971746318 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f385c077870 0x7f385c079d20 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f38680038c0 tx=0x7f386800b540 comp rx=0 tx=0).stop 2026-03-10T12:43:01.649 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.647+0000 7f387b208700 1 -- 192.168.123.100:0/3971746318 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38740691a0 msgr2=0x7f38741980a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:01.649 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.647+0000 7f387b208700 1 --2- 192.168.123.100:0/3971746318 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38740691a0 0x7f38741980a0 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f386400efd0 tx=0x7f386400c5b0 comp rx=0 tx=0).stop 2026-03-10T12:43:01.649 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.647+0000 7f387b208700 1 -- 192.168.123.100:0/3971746318 shutdown_connections 2026-03-10T12:43:01.649 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.647+0000 7f387b208700 1 --2- 
192.168.123.100:0/3971746318 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f385c077870 0x7f385c079d20 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:01.649 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.648+0000 7f387b208700 1 --2- 192.168.123.100:0/3971746318 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f38740691a0 0x7f38741980a0 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:01.649 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.648+0000 7f387b208700 1 --2- 192.168.123.100:0/3971746318 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3874105a60 0x7f38741985e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:01.649 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.648+0000 7f387b208700 1 -- 192.168.123.100:0/3971746318 >> 192.168.123.100:0/3971746318 conn(0x7f38740faa70 msgr2=0x7f38740fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:43:01.649 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.648+0000 7f387b208700 1 -- 192.168.123.100:0/3971746318 shutdown_connections 2026-03-10T12:43:01.649 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.648+0000 7f387b208700 1 -- 192.168.123.100:0/3971746318 wait complete. 
2026-03-10T12:43:01.730 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.729+0000 7f20a0e20700 1 -- 192.168.123.100:0/549342148 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f209c0ff460 msgr2=0x7f209c0ff870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:01.730 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.729+0000 7f20a0e20700 1 --2- 192.168.123.100:0/549342148 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f209c0ff460 0x7f209c0ff870 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f2084009b00 tx=0x7f2084009e10 comp rx=0 tx=0).stop 2026-03-10T12:43:01.730 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.730+0000 7f20a0e20700 1 -- 192.168.123.100:0/549342148 shutdown_connections 2026-03-10T12:43:01.730 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.730+0000 7f20a0e20700 1 --2- 192.168.123.100:0/549342148 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f209c100700 0x7f209c100b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:01.730 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.730+0000 7f20a0e20700 1 --2- 192.168.123.100:0/549342148 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f209c0ff460 0x7f209c0ff870 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:01.730 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.730+0000 7f20a0e20700 1 -- 192.168.123.100:0/549342148 >> 192.168.123.100:0/549342148 conn(0x7f209c0faa70 msgr2=0x7f209c0fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:43:01.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.730+0000 7f20a0e20700 1 -- 192.168.123.100:0/549342148 shutdown_connections 2026-03-10T12:43:01.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.730+0000 7f20a0e20700 1 -- 192.168.123.100:0/549342148 wait 
complete. 2026-03-10T12:43:01.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.730+0000 7f20a0e20700 1 Processor -- start 2026-03-10T12:43:01.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.731+0000 7f20a0e20700 1 -- start start 2026-03-10T12:43:01.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.731+0000 7f20a0e20700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f209c0ff460 0x7f209c193b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:01.731 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.731+0000 7f20a0e20700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f209c100700 0x7f209c1940b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:01.732 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.731+0000 7f20a0e20700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f209c1946d0 con 0x7f209c100700 2026-03-10T12:43:01.732 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.731+0000 7f20a0e20700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f209c194810 con 0x7f209c0ff460 2026-03-10T12:43:01.732 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.731+0000 7f209a59c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f209c0ff460 0x7f209c193b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:01.732 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.731+0000 7f209a59c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f209c0ff460 0x7f209c193b70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I 
am v2:192.168.123.100:45608/0 (socket says 192.168.123.100:45608) 2026-03-10T12:43:01.732 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.731+0000 7f209a59c700 1 -- 192.168.123.100:0/2579599666 learned_addr learned my addr 192.168.123.100:0/2579599666 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:43:01.732 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.732+0000 7f209a59c700 1 -- 192.168.123.100:0/2579599666 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f209c100700 msgr2=0x7f209c1940b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:01.732 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.732+0000 7f2099d9b700 1 --2- 192.168.123.100:0/2579599666 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f209c100700 0x7f209c1940b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:01.732 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.732+0000 7f209a59c700 1 --2- 192.168.123.100:0/2579599666 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f209c100700 0x7f209c1940b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:01.732 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.732+0000 7f209a59c700 1 -- 192.168.123.100:0/2579599666 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f20840097e0 con 0x7f209c0ff460 2026-03-10T12:43:01.733 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.732+0000 7f2099d9b700 1 --2- 192.168.123.100:0/2579599666 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f209c100700 0x7f209c1940b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T12:43:01.733 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.732+0000 7f209a59c700 1 --2- 192.168.123.100:0/2579599666 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f209c0ff460 0x7f209c193b70 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f20840049c0 tx=0x7f2084004aa0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:43:01.733 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.732+0000 7f20937fe700 1 -- 192.168.123.100:0/2579599666 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f208401d070 con 0x7f209c0ff460
2026-03-10T12:43:01.733 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.732+0000 7f20a0e20700 1 -- 192.168.123.100:0/2579599666 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f209c199260 con 0x7f209c0ff460
2026-03-10T12:43:01.733 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.733+0000 7f20a0e20700 1 -- 192.168.123.100:0/2579599666 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f209c199750 con 0x7f209c0ff460
2026-03-10T12:43:01.734 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.733+0000 7f20937fe700 1 -- 192.168.123.100:0/2579599666 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f208400bd10 con 0x7f209c0ff460
2026-03-10T12:43:01.734 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.733+0000 7f20937fe700 1 -- 192.168.123.100:0/2579599666 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f208400f8d0 con 0x7f209c0ff460
2026-03-10T12:43:01.734 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.734+0000 7f20a0e20700 1 -- 192.168.123.100:0/2579599666 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f207c005320 con 0x7f209c0ff460
2026-03-10T12:43:01.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.734+0000 7f20937fe700 1 -- 192.168.123.100:0/2579599666 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f208400fa30 con 0x7f209c0ff460
2026-03-10T12:43:01.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.735+0000 7f20937fe700 1 --2- 192.168.123.100:0/2579599666 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f208807bcd0 0x7f208807e180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:01.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.735+0000 7f20937fe700 1 -- 192.168.123.100:0/2579599666 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f208409b1b0 con 0x7f209c0ff460
2026-03-10T12:43:01.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.735+0000 7f2099d9b700 1 --2- 192.168.123.100:0/2579599666 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f208807bcd0 0x7f208807e180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:01.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.736+0000 7f2099d9b700 1 --2- 192.168.123.100:0/2579599666 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f208807bcd0 0x7f208807e180 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f208c005fd0 tx=0x7f208c005dc0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:43:01.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.738+0000 7f20937fe700 1 -- 192.168.123.100:0/2579599666 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2084063a90 con 0x7f209c0ff460
2026-03-10T12:43:01.866 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.865+0000 7f20a0e20700 1 -- 192.168.123.100:0/2579599666 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f207c000bf0 con 0x7f208807bcd0
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.871+0000 7f20937fe700 1 -- 192.168.123.100:0/2579599666 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f207c000bf0 con 0x7f208807bcd0
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (9m) 90s ago 10m 25.5M - 0.25.0 c8568f914cd2 1897443e8fdf
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (10m) 90s ago 10m 9361k - 18.2.0 dc2bc1663786 d9c35bbdf4cd
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (9m) 24s ago 9m 11.8M - 18.2.0 dc2bc1663786 2a98961ae9ca
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (3m) 90s ago 10m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 88cd1c2e0041
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (3m) 24s ago 9m 7864k - 19.2.3-678-ge911bdeb 654f31e6858e 889e2b356266
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (9m) 90s ago 9m 91.0M - 9.4.7 954c08fa6188 16c12a6ce1fc
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (7m) 90s ago 7m 136M - 18.2.0 dc2bc1663786 13dfd2469732
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (7m) 90s ago 7m 18.7M - 18.2.0 dc2bc1663786 d65368ac6dfa
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (7m) 24s ago 7m 18.7M - 18.2.0 dc2bc1663786 1b9425223bd2
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (7m) 24s ago 7m 143M - 18.2.0 dc2bc1663786 9c2b36fc1ac1
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:8443,9283,8765 running (4m) 90s ago 10m 622M - 19.2.3-678-ge911bdeb 654f31e6858e eaf11f86af53
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (4m) 24s ago 9m 490M - 19.2.3-678-ge911bdeb 654f31e6858e 22c93435649c
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (4m) 90s ago 10m 61.0M 2048M 19.2.3-678-ge911bdeb 654f31e6858e e8cc5980a849
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (3m) 24s ago 9m 52.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 032abad282fc
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (10m) 90s ago 10m 15.0M - 1.5.0 0da6a335fe13 7a5c14d6ba46
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (9m) 24s ago 9m 16.2M - 1.5.0 0da6a335fe13 2fac2415b763
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (3m) 90s ago 9m 174M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9b151d44f3cf
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (115s) 90s ago 8m 104M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 252ea98c5665
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (92s) 90s ago 8m 12.4M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 249137e44eb7
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (69s) 24s ago 8m 144M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 72a045e3b78b
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (48s) 24s ago 8m 115M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7ac87e1c2a41
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (26s) 24s ago 8m 13.1M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bd169bf00834
2026-03-10T12:43:01.872 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (4m) 90s ago 9m 67.8M - 2.43.0 a07b618ecd1d c80074b6b052
2026-03-10T12:43:01.875 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.875+0000 7f20a0e20700 1 -- 192.168.123.100:0/2579599666 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f208807bcd0 msgr2=0x7f208807e180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:01.875 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.875+0000 7f20a0e20700 1 --2- 192.168.123.100:0/2579599666 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f208807bcd0 0x7f208807e180 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f208c005fd0 tx=0x7f208c005dc0 comp rx=0 tx=0).stop
2026-03-10T12:43:01.875 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.875+0000 7f20a0e20700 1 -- 192.168.123.100:0/2579599666 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f209c0ff460 msgr2=0x7f209c193b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:01.876 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.875+0000 7f20a0e20700 1 --2- 192.168.123.100:0/2579599666 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f209c0ff460 0x7f209c193b70 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f20840049c0 tx=0x7f2084004aa0 comp rx=0 tx=0).stop
2026-03-10T12:43:01.876 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.875+0000 7f20a0e20700 1 -- 192.168.123.100:0/2579599666 shutdown_connections
2026-03-10T12:43:01.876 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.875+0000 7f20a0e20700 1 --2- 192.168.123.100:0/2579599666 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f208807bcd0 0x7f208807e180 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:01.876 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.875+0000 7f20a0e20700 1 --2- 192.168.123.100:0/2579599666 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f209c0ff460 0x7f209c193b70 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:01.876 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.875+0000 7f20a0e20700 1 --2- 192.168.123.100:0/2579599666 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f209c100700 0x7f209c1940b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:01.876 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.875+0000 7f20a0e20700 1 -- 192.168.123.100:0/2579599666 >> 192.168.123.100:0/2579599666 conn(0x7f209c0faa70 msgr2=0x7f209c103930 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:43:01.876 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.875+0000 7f20a0e20700 1 -- 192.168.123.100:0/2579599666 shutdown_connections
2026-03-10T12:43:01.876 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.875+0000 7f20a0e20700 1 -- 192.168.123.100:0/2579599666 wait complete.
2026-03-10T12:43:01.956 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:01 vm00.local ceph-mon[103263]: pgmap v158: 65 pgs: 65 active+clean; 270 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 2.5 KiB/s rd, 40 KiB/s wr, 22 op/s
2026-03-10T12:43:01.956 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:43:01.956 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:43:01.956 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:43:01.956 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:43:01.956 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:43:01.956 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:43:01.956 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:43:01.956 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:43:01.956 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:43:01.956 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:43:01.956 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:43:01.956 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.954+0000 7faa4083a700 1 -- 192.168.123.100:0/3350479559 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa38103980 msgr2=0x7faa38103dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:01.956 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.954+0000 7faa4083a700 1 --2- 192.168.123.100:0/3350479559 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa38103980 0x7faa38103dd0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7faa30009b00 tx=0x7faa30009e10 comp rx=0 tx=0).stop
2026-03-10T12:43:01.958 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.956+0000 7faa4083a700 1 -- 192.168.123.100:0/3350479559 shutdown_connections
2026-03-10T12:43:01.958 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.956+0000 7faa4083a700 1 --2- 192.168.123.100:0/3350479559 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa38103980 0x7faa38103dd0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:01.958 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.956+0000 7faa4083a700 1 --2- 192.168.123.100:0/3350479559 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa38102780 0x7faa38102b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:01.958 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.956+0000 7faa4083a700 1 -- 192.168.123.100:0/3350479559 >> 192.168.123.100:0/3350479559 conn(0x7faa380fdd50 msgr2=0x7faa38100160 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:43:01.960 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.959+0000 7faa4083a700 1 -- 192.168.123.100:0/3350479559 shutdown_connections
2026-03-10T12:43:01.960 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.959+0000 7faa4083a700 1 -- 192.168.123.100:0/3350479559 wait complete.
2026-03-10T12:43:01.961 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.960+0000 7faa4083a700 1 Processor -- start
2026-03-10T12:43:01.961 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.961+0000 7faa4083a700 1 -- start start
2026-03-10T12:43:01.961 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.961+0000 7faa4083a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa38102780 0x7faa3819c400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:01.961 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.961+0000 7faa4083a700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa38103980 0x7faa3819c940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:01.961 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.961+0000 7faa4083a700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faa3819cf60 con 0x7faa38102780
2026-03-10T12:43:01.961 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.961+0000 7faa4083a700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faa3819d0a0 con 0x7faa38103980
2026-03-10T12:43:01.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.961+0000 7faa3ddd5700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa38103980 0x7faa3819c940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:01.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.962+0000 7faa3ddd5700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa38103980 0x7faa3819c940 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:45620/0 (socket says 192.168.123.100:45620)
2026-03-10T12:43:01.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.962+0000 7faa3ddd5700 1 -- 192.168.123.100:0/3561222421 learned_addr learned my addr 192.168.123.100:0/3561222421 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:43:01.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.962+0000 7faa3e5d6700 1 --2- 192.168.123.100:0/3561222421 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa38102780 0x7faa3819c400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:01.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.962+0000 7faa3ddd5700 1 -- 192.168.123.100:0/3561222421 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa38102780 msgr2=0x7faa3819c400 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:01.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.962+0000 7faa3ddd5700 1 --2- 192.168.123.100:0/3561222421 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa38102780 0x7faa3819c400 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:01.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.962+0000 7faa3ddd5700 1 -- 192.168.123.100:0/3561222421 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faa300097e0 con 0x7faa38103980
2026-03-10T12:43:01.963 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.962+0000 7faa3e5d6700 1 --2- 192.168.123.100:0/3561222421 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa38102780 0x7faa3819c400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-10T12:43:01.963 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.963+0000 7faa3ddd5700 1 --2- 192.168.123.100:0/3561222421 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa38103980 0x7faa3819c940 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7faa3000b5c0 tx=0x7faa30004a40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:43:01.963 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.963+0000 7faa2b7fe700 1 -- 192.168.123.100:0/3561222421 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faa3001d070 con 0x7faa38103980
2026-03-10T12:43:01.963 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.963+0000 7faa4083a700 1 -- 192.168.123.100:0/3561222421 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faa381a1af0 con 0x7faa38103980
2026-03-10T12:43:01.963 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.963+0000 7faa4083a700 1 -- 192.168.123.100:0/3561222421 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faa381a1fe0 con 0x7faa38103980
2026-03-10T12:43:01.964 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.964+0000 7faa2b7fe700 1 -- 192.168.123.100:0/3561222421 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7faa3000bc50 con 0x7faa38103980
2026-03-10T12:43:01.965 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.965+0000 7faa2b7fe700 1 -- 192.168.123.100:0/3561222421 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faa3000f670 con 0x7faa38103980
2026-03-10T12:43:01.965 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.965+0000 7faa4083a700 1 -- 192.168.123.100:0/3561222421 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faa38066e40 con 0x7faa38103980
2026-03-10T12:43:01.966 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.965+0000 7faa2b7fe700 1 -- 192.168.123.100:0/3561222421 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7faa3000f8f0 con 0x7faa38103980
2026-03-10T12:43:01.966 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.966+0000 7faa2b7fe700 1 --2- 192.168.123.100:0/3561222421 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faa24077700 0x7faa24079bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:01.967 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.966+0000 7faa3e5d6700 1 --2- 192.168.123.100:0/3561222421 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faa24077700 0x7faa24079bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:01.967 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.966+0000 7faa2b7fe700 1 -- 192.168.123.100:0/3561222421 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6136+0+0 (secure 0 0 0) 0x7faa3009c220 con 0x7faa38103980
2026-03-10T12:43:01.967 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.967+0000 7faa3e5d6700 1 --2- 192.168.123.100:0/3561222421 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faa24077700 0x7faa24079bb0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7faa2c005fd0 tx=0x7faa2c005f00 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:43:01.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:01.968+0000 7faa2b7fe700 1 -- 192.168.123.100:0/3561222421 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7faa30064b80 con 0x7faa38103980
2026-03-10T12:43:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:01 vm07.local ceph-mon[93622]: pgmap v158: 65 pgs: 65 active+clean; 270 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 2.5 KiB/s rd, 40 KiB/s wr, 22 op/s
2026-03-10T12:43:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:43:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:43:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:43:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:43:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:43:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:43:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:43:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:43:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:43:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:43:02.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:43:02.155 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.154+0000 7faa4083a700 1 -- 192.168.123.100:0/3561222421 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7faa381a22c0 con 0x7faa38103980
2026-03-10T12:43:02.156 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.156+0000 7faa2b7fe700 1 -- 192.168.123.100:0/3561222421 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+709 (secure 0 0 0) 0x7faa300642d0 con 0x7faa38103980
2026-03-10T12:43:02.156 INFO:teuthology.orchestra.run.vm00.stdout:{
2026-03-10T12:43:02.156 INFO:teuthology.orchestra.run.vm00.stdout: "mon": {
2026-03-10T12:43:02.156 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T12:43:02.157 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:43:02.157 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": {
2026-03-10T12:43:02.157 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T12:43:02.157 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:43:02.157 INFO:teuthology.orchestra.run.vm00.stdout: "osd": {
2026-03-10T12:43:02.157 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6
2026-03-10T12:43:02.157 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:43:02.157 INFO:teuthology.orchestra.run.vm00.stdout: "mds": {
2026-03-10T12:43:02.157 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4
2026-03-10T12:43:02.157 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:43:02.157 INFO:teuthology.orchestra.run.vm00.stdout: "overall": {
2026-03-10T12:43:02.157 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 4,
2026-03-10T12:43:02.157 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 10
2026-03-10T12:43:02.157 INFO:teuthology.orchestra.run.vm00.stdout: }
2026-03-10T12:43:02.157 INFO:teuthology.orchestra.run.vm00.stdout:}
2026-03-10T12:43:02.159 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.159+0000 7faa297fa700 1 -- 192.168.123.100:0/3561222421 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faa24077700 msgr2=0x7faa24079bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:02.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.159+0000 7faa297fa700 1 --2- 192.168.123.100:0/3561222421 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faa24077700 0x7faa24079bb0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7faa2c005fd0 tx=0x7faa2c005f00 comp rx=0 tx=0).stop
2026-03-10T12:43:02.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.159+0000 7faa297fa700 1 -- 192.168.123.100:0/3561222421 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa38103980 msgr2=0x7faa3819c940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:02.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.159+0000 7faa297fa700 1 --2- 192.168.123.100:0/3561222421 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa38103980 0x7faa3819c940 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7faa3000b5c0 tx=0x7faa30004a40 comp rx=0 tx=0).stop
2026-03-10T12:43:02.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.160+0000 7faa297fa700 1 -- 192.168.123.100:0/3561222421 shutdown_connections
2026-03-10T12:43:02.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.160+0000 7faa297fa700 1 --2- 192.168.123.100:0/3561222421 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faa24077700 0x7faa24079bb0 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:02.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.160+0000 7faa297fa700 1 --2- 192.168.123.100:0/3561222421 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa38102780 0x7faa3819c400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:02.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.160+0000 7faa297fa700 1 --2- 192.168.123.100:0/3561222421 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa38103980 0x7faa3819c940 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:02.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.160+0000 7faa297fa700 1 -- 192.168.123.100:0/3561222421 >> 192.168.123.100:0/3561222421 conn(0x7faa380fdd50 msgr2=0x7faa38106bb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:43:02.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.160+0000 7faa297fa700 1 -- 192.168.123.100:0/3561222421 shutdown_connections
2026-03-10T12:43:02.160 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.160+0000 7faa297fa700 1 -- 192.168.123.100:0/3561222421 wait complete.
2026-03-10T12:43:02.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.236+0000 7fcc9024e700 1 -- 192.168.123.100:0/2481449788 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcc88103960 msgr2=0x7fcc88103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:02.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.236+0000 7fcc8cfe8700 1 -- 192.168.123.100:0/2481449788 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcc8400ba20 con 0x7fcc88103960
2026-03-10T12:43:02.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.236+0000 7fcc9024e700 1 --2- 192.168.123.100:0/2481449788 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcc88103960 0x7fcc88103db0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7fcc84009b50 tx=0x7fcc84009e60 comp rx=0 tx=0).stop
2026-03-10T12:43:02.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.237+0000 7fcc9024e700 1 -- 192.168.123.100:0/2481449788 shutdown_connections
2026-03-10T12:43:02.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.237+0000 7fcc9024e700 1 --2- 192.168.123.100:0/2481449788 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcc88103960 0x7fcc88103db0 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:02.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.237+0000 7fcc9024e700 1 --2- 192.168.123.100:0/2481449788 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcc88102760 0x7fcc88102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:02.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.237+0000 7fcc9024e700 1 -- 192.168.123.100:0/2481449788 >> 192.168.123.100:0/2481449788 conn(0x7fcc880fdcf0 msgr2=0x7fcc88100140 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:43:02.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.237+0000 7fcc9024e700 1 -- 192.168.123.100:0/2481449788 shutdown_connections
2026-03-10T12:43:02.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.237+0000 7fcc9024e700 1 -- 192.168.123.100:0/2481449788 wait complete.
2026-03-10T12:43:02.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.237+0000 7fcc9024e700 1 Processor -- start 2026-03-10T12:43:02.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.237+0000 7fcc9024e700 1 -- start start 2026-03-10T12:43:02.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.238+0000 7fcc9024e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcc88102760 0x7fcc88193c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:02.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.238+0000 7fcc9024e700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcc88103960 0x7fcc88194180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:02.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.238+0000 7fcc9024e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcc881947a0 con 0x7fcc88102760 2026-03-10T12:43:02.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.238+0000 7fcc9024e700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcc881991b0 con 0x7fcc88103960 2026-03-10T12:43:02.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.238+0000 7fcc8dfea700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcc88102760 0x7fcc88193c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:02.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.238+0000 7fcc8d7e9700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcc88103960 0x7fcc88194180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T12:43:02.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.238+0000 7fcc8d7e9700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcc88103960 0x7fcc88194180 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:45632/0 (socket says 192.168.123.100:45632) 2026-03-10T12:43:02.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.238+0000 7fcc8d7e9700 1 -- 192.168.123.100:0/2182882075 learned_addr learned my addr 192.168.123.100:0/2182882075 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:43:02.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.239+0000 7fcc8d7e9700 1 -- 192.168.123.100:0/2182882075 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcc88102760 msgr2=0x7fcc88193c40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:02.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.239+0000 7fcc8d7e9700 1 --2- 192.168.123.100:0/2182882075 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcc88102760 0x7fcc88193c40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:02.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.239+0000 7fcc8d7e9700 1 -- 192.168.123.100:0/2182882075 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcc840097e0 con 0x7fcc88103960 2026-03-10T12:43:02.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.239+0000 7fcc8dfea700 1 --2- 192.168.123.100:0/2182882075 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcc88102760 0x7fcc88193c40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T12:43:02.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.239+0000 7fcc8d7e9700 1 --2- 192.168.123.100:0/2182882075 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcc88103960 0x7fcc88194180 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7fcc84005310 tx=0x7fcc84005710 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:43:02.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.239+0000 7fcc7effd700 1 -- 192.168.123.100:0/2182882075 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcc8401d070 con 0x7fcc88103960 2026-03-10T12:43:02.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.239+0000 7fcc7effd700 1 -- 192.168.123.100:0/2182882075 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcc8400bd70 con 0x7fcc88103960 2026-03-10T12:43:02.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.239+0000 7fcc9024e700 1 -- 192.168.123.100:0/2182882075 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcc88199350 con 0x7fcc88103960 2026-03-10T12:43:02.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.239+0000 7fcc7effd700 1 -- 192.168.123.100:0/2182882075 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcc8400f8b0 con 0x7fcc88103960 2026-03-10T12:43:02.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.240+0000 7fcc9024e700 1 -- 192.168.123.100:0/2182882075 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcc88199890 con 0x7fcc88103960 2026-03-10T12:43:02.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.241+0000 7fcc7effd700 1 -- 192.168.123.100:0/2182882075 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcc84022b70 con 
0x7fcc88103960 2026-03-10T12:43:02.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.241+0000 7fcc7effd700 1 --2- 192.168.123.100:0/2182882075 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcc740800d0 0x7fcc74082580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:02.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.241+0000 7fcc7effd700 1 -- 192.168.123.100:0/2182882075 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fcc8409b250 con 0x7fcc88103960 2026-03-10T12:43:02.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.241+0000 7fcc9024e700 1 -- 192.168.123.100:0/2182882075 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcc88066e40 con 0x7fcc88103960 2026-03-10T12:43:02.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.242+0000 7fcc8dfea700 1 --2- 192.168.123.100:0/2182882075 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcc740800d0 0x7fcc74082580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:02.242 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.242+0000 7fcc8dfea700 1 --2- 192.168.123.100:0/2182882075 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcc740800d0 0x7fcc74082580 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fcc88195120 tx=0x7fcc78009380 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:43:02.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.244+0000 7fcc7effd700 1 -- 192.168.123.100:0/2182882075 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7fcc84063a80 con 0x7fcc88103960
2026-03-10T12:43:02.387 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.386+0000 7fcc9024e700 1 -- 192.168.123.100:0/2182882075 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fcc88199b30 con 0x7fcc88103960
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.387+0000 7fcc7effd700 1 -- 192.168.123.100:0/2182882075 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 18 v18) v1 ==== 76+0+1955 (secure 0 0 0) 0x7fcc84027070 con 0x7fcc88103960
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:e18
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:btime 2026-03-10T12:42:50:841904+0000
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1)
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:epoch 18
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:42:50.758107+0000
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:root 0
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:max_xattr_size 65536
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {}
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 1
2026-03-10T12:43:02.388 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313,1=24307}
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:failed
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:damaged
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:stopped
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3]
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:balancer
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:qdb_cluster leader: 24313 members: 24313
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:24307} state up:stopping seq 3 export targets 0 join_fscid=1 addr [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons:
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:14490} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.100:6828/2948081127,v1:192.168.123.100:6829/2948081127] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T12:43:02.389 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T12:43:02.392 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.391+0000 7fcc9024e700 1 -- 192.168.123.100:0/2182882075 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcc740800d0 msgr2=0x7fcc74082580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:02.392 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.391+0000 7fcc9024e700 1 --2- 192.168.123.100:0/2182882075 >>
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcc740800d0 0x7fcc74082580 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fcc88195120 tx=0x7fcc78009380 comp rx=0 tx=0).stop 2026-03-10T12:43:02.392 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.392+0000 7fcc9024e700 1 -- 192.168.123.100:0/2182882075 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcc88103960 msgr2=0x7fcc88194180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:02.392 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.392+0000 7fcc9024e700 1 --2- 192.168.123.100:0/2182882075 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcc88103960 0x7fcc88194180 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7fcc84005310 tx=0x7fcc84005710 comp rx=0 tx=0).stop 2026-03-10T12:43:02.392 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.392+0000 7fcc9024e700 1 -- 192.168.123.100:0/2182882075 shutdown_connections 2026-03-10T12:43:02.392 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.392+0000 7fcc9024e700 1 --2- 192.168.123.100:0/2182882075 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcc740800d0 0x7fcc74082580 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:02.392 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.392+0000 7fcc9024e700 1 --2- 192.168.123.100:0/2182882075 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcc88102760 0x7fcc88193c40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:02.392 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.392+0000 7fcc9024e700 1 --2- 192.168.123.100:0/2182882075 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcc88103960 0x7fcc88194180 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:43:02.392 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.392+0000 7fcc9024e700 1 -- 192.168.123.100:0/2182882075 >> 192.168.123.100:0/2182882075 conn(0x7fcc880fdcf0 msgr2=0x7fcc880fe780 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:43:02.392 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.392+0000 7fcc9024e700 1 -- 192.168.123.100:0/2182882075 shutdown_connections 2026-03-10T12:43:02.392 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.392+0000 7fcc9024e700 1 -- 192.168.123.100:0/2182882075 wait complete. 2026-03-10T12:43:02.393 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 18 2026-03-10T12:43:02.476 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.475+0000 7f3b5abc3700 1 -- 192.168.123.100:0/1435588369 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b54103680 msgr2=0x7f3b54105ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:02.476 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.475+0000 7f3b5abc3700 1 --2- 192.168.123.100:0/1435588369 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b54103680 0x7f3b54105ac0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f3b48009a60 tx=0x7f3b48009d70 comp rx=0 tx=0).stop 2026-03-10T12:43:02.476 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.475+0000 7f3b5abc3700 1 -- 192.168.123.100:0/1435588369 shutdown_connections 2026-03-10T12:43:02.476 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.475+0000 7f3b5abc3700 1 --2- 192.168.123.100:0/1435588369 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b54103680 0x7f3b54105ac0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:02.476 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.475+0000 7f3b5abc3700 1 --2- 192.168.123.100:0/1435588369 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7f3b54069180 0x7f3b54103140 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:02.476 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.475+0000 7f3b5abc3700 1 -- 192.168.123.100:0/1435588369 >> 192.168.123.100:0/1435588369 conn(0x7f3b540faa70 msgr2=0x7f3b540fcee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:43:02.476 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.476+0000 7f3b5abc3700 1 -- 192.168.123.100:0/1435588369 shutdown_connections 2026-03-10T12:43:02.476 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.476+0000 7f3b5abc3700 1 -- 192.168.123.100:0/1435588369 wait complete. 2026-03-10T12:43:02.477 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.476+0000 7f3b5abc3700 1 Processor -- start 2026-03-10T12:43:02.477 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.476+0000 7f3b5abc3700 1 -- start start 2026-03-10T12:43:02.477 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.477+0000 7f3b5abc3700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b54069180 0x7f3b54197ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:02.477 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.477+0000 7f3b5abc3700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3b54103680 0x7f3b54198530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:02.477 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.477+0000 7f3b5abc3700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3b54198b50 con 0x7f3b54103680 2026-03-10T12:43:02.477 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.477+0000 7f3b5abc3700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3b54198c90 con 0x7f3b54069180 
2026-03-10T12:43:02.477 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.477+0000 7f3b53fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3b54103680 0x7f3b54198530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:02.477 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.477+0000 7f3b53fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3b54103680 0x7f3b54198530 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:37072/0 (socket says 192.168.123.100:37072) 2026-03-10T12:43:02.478 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.477+0000 7f3b53fff700 1 -- 192.168.123.100:0/1639834457 learned_addr learned my addr 192.168.123.100:0/1639834457 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:43:02.478 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.477+0000 7f3b53fff700 1 -- 192.168.123.100:0/1639834457 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b54069180 msgr2=0x7f3b54197ff0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T12:43:02.478 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.477+0000 7f3b53fff700 1 --2- 192.168.123.100:0/1639834457 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b54069180 0x7f3b54197ff0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:02.478 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.477+0000 7f3b53fff700 1 -- 192.168.123.100:0/1639834457 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3b440097e0 con 0x7f3b54103680 2026-03-10T12:43:02.478 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.477+0000 7f3b53fff700 1 
--2- 192.168.123.100:0/1639834457 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3b54103680 0x7f3b54198530 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f3b48009420 tx=0x7f3b4800f7b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:43:02.478 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.478+0000 7f3b51ffb700 1 -- 192.168.123.100:0/1639834457 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3b4801d070 con 0x7f3b54103680 2026-03-10T12:43:02.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.478+0000 7f3b5abc3700 1 -- 192.168.123.100:0/1639834457 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3b48009710 con 0x7f3b54103680 2026-03-10T12:43:02.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.478+0000 7f3b51ffb700 1 -- 192.168.123.100:0/1639834457 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3b4800fd40 con 0x7f3b54103680 2026-03-10T12:43:02.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.478+0000 7f3b51ffb700 1 -- 192.168.123.100:0/1639834457 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3b48017940 con 0x7f3b54103680 2026-03-10T12:43:02.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.478+0000 7f3b5abc3700 1 -- 192.168.123.100:0/1639834457 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3b5419da40 con 0x7f3b54103680 2026-03-10T12:43:02.484 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.479+0000 7f3b5abc3700 1 -- 192.168.123.100:0/1639834457 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3b540fc670 con 0x7f3b54103680 2026-03-10T12:43:02.486 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.485+0000 7f3b51ffb700 1 -- 192.168.123.100:0/1639834457 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3b48021c80 con 0x7f3b54103680 2026-03-10T12:43:02.486 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.485+0000 7f3b51ffb700 1 --2- 192.168.123.100:0/1639834457 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3b3c0778c0 0x7f3b3c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:02.486 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.485+0000 7f3b51ffb700 1 -- 192.168.123.100:0/1639834457 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f3b4809b620 con 0x7f3b54103680 2026-03-10T12:43:02.486 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.485+0000 7f3b51ffb700 1 -- 192.168.123.100:0/1639834457 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3b4809baa0 con 0x7f3b54103680 2026-03-10T12:43:02.486 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.485+0000 7f3b5895f700 1 --2- 192.168.123.100:0/1639834457 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3b3c0778c0 0x7f3b3c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:02.486 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.486+0000 7f3b5895f700 1 --2- 192.168.123.100:0/1639834457 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3b3c0778c0 0x7f3b3c079d70 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f3b54068eb0 tx=0x7f3b44009500 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:43:02.620 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.619+0000 7f3b5abc3700 1 -- 192.168.123.100:0/1639834457 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3b5419d620 con 0x7f3b3c0778c0
2026-03-10T12:43:02.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.626+0000 7f3b51ffb700 1 -- 192.168.123.100:0/1639834457 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f3b5419d620 con 0x7f3b3c0778c0
2026-03-10T12:43:02.627 INFO:teuthology.orchestra.run.vm00.stdout:{
2026-03-10T12:43:02.627 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-10T12:43:02.627 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true,
2026-03-10T12:43:02.627 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-10T12:43:02.627 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [
2026-03-10T12:43:02.627 INFO:teuthology.orchestra.run.vm00.stdout: "crash",
2026-03-10T12:43:02.627 INFO:teuthology.orchestra.run.vm00.stdout: "osd",
2026-03-10T12:43:02.627 INFO:teuthology.orchestra.run.vm00.stdout: "mgr",
2026-03-10T12:43:02.627 INFO:teuthology.orchestra.run.vm00.stdout: "mon"
2026-03-10T12:43:02.627 INFO:teuthology.orchestra.run.vm00.stdout: ],
2026-03-10T12:43:02.627 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "12/23 daemons upgraded",
2026-03-10T12:43:02.627 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Currently upgrading osd daemons",
2026-03-10T12:43:02.627 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false
2026-03-10T12:43:02.627 INFO:teuthology.orchestra.run.vm00.stdout:}
2026-03-10T12:43:02.629
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.629+0000 7f3b5abc3700 1 -- 192.168.123.100:0/1639834457 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3b3c0778c0 msgr2=0x7f3b3c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:02.629 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.629+0000 7f3b5abc3700 1 --2- 192.168.123.100:0/1639834457 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3b3c0778c0 0x7f3b3c079d70 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f3b54068eb0 tx=0x7f3b44009500 comp rx=0 tx=0).stop 2026-03-10T12:43:02.629 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.629+0000 7f3b5abc3700 1 -- 192.168.123.100:0/1639834457 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3b54103680 msgr2=0x7f3b54198530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:02.629 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.629+0000 7f3b5abc3700 1 --2- 192.168.123.100:0/1639834457 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3b54103680 0x7f3b54198530 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f3b48009420 tx=0x7f3b4800f7b0 comp rx=0 tx=0).stop 2026-03-10T12:43:02.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.629+0000 7f3b5abc3700 1 -- 192.168.123.100:0/1639834457 shutdown_connections 2026-03-10T12:43:02.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.629+0000 7f3b5abc3700 1 --2- 192.168.123.100:0/1639834457 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3b3c0778c0 0x7f3b3c079d70 secure :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f3b54068eb0 tx=0x7f3b44009500 comp rx=0 tx=0).stop 2026-03-10T12:43:02.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.629+0000 7f3b5abc3700 1 --2- 192.168.123.100:0/1639834457 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3b54069180 0x7f3b54197ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:02.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.629+0000 7f3b5abc3700 1 --2- 192.168.123.100:0/1639834457 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3b54103680 0x7f3b54198530 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:02.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.629+0000 7f3b5abc3700 1 -- 192.168.123.100:0/1639834457 >> 192.168.123.100:0/1639834457 conn(0x7f3b540faa70 msgr2=0x7f3b540ff7b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:43:02.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.629+0000 7f3b5abc3700 1 -- 192.168.123.100:0/1639834457 shutdown_connections 2026-03-10T12:43:02.630 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.630+0000 7f3b5abc3700 1 -- 192.168.123.100:0/1639834457 wait complete. 
2026-03-10T12:43:02.713 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.712+0000 7ff4b02ef700 1 -- 192.168.123.100:0/3425788228 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff4a8107d50 msgr2=0x7ff4a81081c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:02.713 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.712+0000 7ff4b02ef700 1 --2- 192.168.123.100:0/3425788228 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff4a8107d50 0x7ff4a81081c0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7ff4a000b600 tx=0x7ff4a000b910 comp rx=0 tx=0).stop 2026-03-10T12:43:02.713 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.713+0000 7ff4b02ef700 1 -- 192.168.123.100:0/3425788228 shutdown_connections 2026-03-10T12:43:02.713 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.713+0000 7ff4b02ef700 1 --2- 192.168.123.100:0/3425788228 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff4a8107d50 0x7ff4a81081c0 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:02.713 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.713+0000 7ff4b02ef700 1 --2- 192.168.123.100:0/3425788228 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4a8071db0 0x7ff4a80721c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:02.713 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.713+0000 7ff4b02ef700 1 -- 192.168.123.100:0/3425788228 >> 192.168.123.100:0/3425788228 conn(0x7ff4a806d3e0 msgr2=0x7ff4a806f830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:43:02.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.713+0000 7ff4b02ef700 1 -- 192.168.123.100:0/3425788228 shutdown_connections 2026-03-10T12:43:02.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.713+0000 7ff4b02ef700 1 -- 192.168.123.100:0/3425788228 
wait complete. 2026-03-10T12:43:02.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.713+0000 7ff4b02ef700 1 Processor -- start 2026-03-10T12:43:02.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.713+0000 7ff4b02ef700 1 -- start start 2026-03-10T12:43:02.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.713+0000 7ff4b02ef700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4a8071db0 0x7ff4a819c460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:02.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.713+0000 7ff4b02ef700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff4a819c9a0 0x7ff4a81c3a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:02.715 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.713+0000 7ff4b02ef700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff4a819cea0 con 0x7ff4a819c9a0 2026-03-10T12:43:02.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.713+0000 7ff4b02ef700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff4a819d010 con 0x7ff4a8071db0 2026-03-10T12:43:02.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.714+0000 7ff4ad88a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff4a819c9a0 0x7ff4a81c3a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:02.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.714+0000 7ff4ae08b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4a8071db0 0x7ff4a819c460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T12:43:02.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.714+0000 7ff4ad88a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff4a819c9a0 0x7ff4a81c3a20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:37086/0 (socket says 192.168.123.100:37086) 2026-03-10T12:43:02.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.714+0000 7ff4ad88a700 1 -- 192.168.123.100:0/1551832003 learned_addr learned my addr 192.168.123.100:0/1551832003 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:43:02.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.714+0000 7ff4ad88a700 1 -- 192.168.123.100:0/1551832003 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4a8071db0 msgr2=0x7ff4a819c460 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:02.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.714+0000 7ff4ad88a700 1 --2- 192.168.123.100:0/1551832003 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4a8071db0 0x7ff4a819c460 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:02.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.714+0000 7ff4ad88a700 1 -- 192.168.123.100:0/1551832003 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff4a000b050 con 0x7ff4a819c9a0 2026-03-10T12:43:02.716 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.714+0000 7ff4ad88a700 1 --2- 192.168.123.100:0/1551832003 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff4a819c9a0 0x7ff4a81c3a20 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7ff4a00077f0 tx=0x7ff4a0007820 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:43:02.716 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.715+0000 7ff49f7fe700 1 -- 192.168.123.100:0/1551832003 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff4a000e030 con 0x7ff4a819c9a0 2026-03-10T12:43:02.721 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.716+0000 7ff4b02ef700 1 -- 192.168.123.100:0/1551832003 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff4a81c3f60 con 0x7ff4a819c9a0 2026-03-10T12:43:02.721 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.716+0000 7ff4b02ef700 1 -- 192.168.123.100:0/1551832003 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff4a81c4480 con 0x7ff4a819c9a0 2026-03-10T12:43:02.721 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.716+0000 7ff4b02ef700 1 -- 192.168.123.100:0/1551832003 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff4a8066e40 con 0x7ff4a819c9a0 2026-03-10T12:43:02.721 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.721+0000 7ff49f7fe700 1 -- 192.168.123.100:0/1551832003 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff4a0007a70 con 0x7ff4a819c9a0 2026-03-10T12:43:02.722 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.721+0000 7ff49f7fe700 1 -- 192.168.123.100:0/1551832003 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff4a001cac0 con 0x7ff4a819c9a0 2026-03-10T12:43:02.722 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.721+0000 7ff49f7fe700 1 -- 192.168.123.100:0/1551832003 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff4a001cc60 con 0x7ff4a819c9a0 2026-03-10T12:43:02.722 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.722+0000 7ff49f7fe700 1 --2- 
192.168.123.100:0/1551832003 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff494077ab0 0x7ff494079f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:02.723 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.722+0000 7ff4ae08b700 1 --2- 192.168.123.100:0/1551832003 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff494077ab0 0x7ff494079f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:02.723 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.722+0000 7ff4ae08b700 1 --2- 192.168.123.100:0/1551832003 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff494077ab0 0x7ff494079f60 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7ff4a401aeb0 tx=0x7ff4a401a460 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:43:02.723 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.723+0000 7ff49f7fe700 1 -- 192.168.123.100:0/1551832003 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6136+0+0 (secure 0 0 0) 0x7ff4a009c630 con 0x7ff4a819c9a0 2026-03-10T12:43:02.723 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.723+0000 7ff49f7fe700 1 -- 192.168.123.100:0/1551832003 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff4a009ca60 con 0x7ff4a819c9a0 2026-03-10T12:43:02.913 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:02 vm00.local ceph-mon[103263]: from='client.34334 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:02.913 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:02 vm00.local ceph-mon[103263]: from='client.34338 -' 
entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:02.913 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:02 vm00.local ceph-mon[103263]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-10T12:43:02.913 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:02 vm00.local ceph-mon[103263]: from='client.44259 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:02.913 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:02 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/3561222421' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:02.913 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:02 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/2182882075' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:43:02.914 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.912+0000 7ff4b02ef700 1 -- 192.168.123.100:0/1551832003 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7ff4a81c4730 con 0x7ff4a819c9a0 2026-03-10T12:43:02.916 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.916+0000 7ff49f7fe700 1 -- 192.168.123.100:0/1551832003 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7ff4a0064fa0 con 0x7ff4a819c9a0 2026-03-10T12:43:02.916 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_OK 2026-03-10T12:43:02.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.919+0000 7ff4b02ef700 1 -- 192.168.123.100:0/1551832003 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff494077ab0 msgr2=0x7ff494079f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:02.920 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.919+0000 7ff4b02ef700 1 --2- 192.168.123.100:0/1551832003 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff494077ab0 0x7ff494079f60 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7ff4a401aeb0 tx=0x7ff4a401a460 comp rx=0 tx=0).stop 2026-03-10T12:43:02.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.919+0000 7ff4b02ef700 1 -- 192.168.123.100:0/1551832003 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff4a819c9a0 msgr2=0x7ff4a81c3a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:02.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.919+0000 7ff4b02ef700 1 --2- 192.168.123.100:0/1551832003 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff4a819c9a0 0x7ff4a81c3a20 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7ff4a00077f0 tx=0x7ff4a0007820 comp rx=0 tx=0).stop 2026-03-10T12:43:02.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.919+0000 7ff4b02ef700 1 -- 192.168.123.100:0/1551832003 shutdown_connections 2026-03-10T12:43:02.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.919+0000 7ff4b02ef700 1 --2- 192.168.123.100:0/1551832003 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff494077ab0 0x7ff494079f60 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:02.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.919+0000 7ff4b02ef700 1 --2- 192.168.123.100:0/1551832003 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff4a8071db0 0x7ff4a819c460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:02.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.919+0000 7ff4b02ef700 1 --2- 192.168.123.100:0/1551832003 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7ff4a819c9a0 0x7ff4a81c3a20 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:02.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.919+0000 7ff4b02ef700 1 -- 192.168.123.100:0/1551832003 >> 192.168.123.100:0/1551832003 conn(0x7ff4a806d3e0 msgr2=0x7ff4a80706c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:43:02.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.919+0000 7ff4b02ef700 1 -- 192.168.123.100:0/1551832003 shutdown_connections 2026-03-10T12:43:02.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:02.919+0000 7ff4b02ef700 1 -- 192.168.123.100:0/1551832003 wait complete. 2026-03-10T12:43:02.940 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:02 vm07.local ceph-mon[93622]: from='client.34334 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:02.940 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:02 vm07.local ceph-mon[93622]: from='client.34338 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:02.940 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:02 vm07.local ceph-mon[93622]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-10T12:43:02.940 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:02 vm07.local ceph-mon[93622]: from='client.44259 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:02.940 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:02 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/3561222421' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:02.940 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:02 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/2182882075' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:43:04.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:03 vm07.local ceph-mon[93622]: from='client.34350 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:04.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:03 vm07.local ceph-mon[93622]: pgmap v159: 65 pgs: 65 active+clean; 258 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 2.5 KiB/s rd, 41 KiB/s wr, 24 op/s 2026-03-10T12:43:04.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:03 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/1551832003' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:43:04.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:03 vm00.local ceph-mon[103263]: from='client.34350 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:04.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:03 vm00.local ceph-mon[103263]: pgmap v159: 65 pgs: 65 active+clean; 258 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 2.5 KiB/s rd, 41 KiB/s wr, 24 op/s 2026-03-10T12:43:04.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:03 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/1551832003' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:43:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:05 vm07.local ceph-mon[93622]: pgmap v160: 65 pgs: 65 active+clean; 218 MiB data, 985 MiB used, 119 GiB / 120 GiB avail; 2.5 KiB/s rd, 41 KiB/s wr, 28 op/s 2026-03-10T12:43:06.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:05 vm00.local ceph-mon[103263]: pgmap v160: 65 pgs: 65 active+clean; 218 MiB data, 985 MiB used, 119 GiB / 120 GiB avail; 2.5 KiB/s rd, 41 KiB/s wr, 28 op/s 2026-03-10T12:43:08.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:07 vm07.local ceph-mon[93622]: pgmap v161: 65 pgs: 65 active+clean; 210 MiB data, 945 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 1.2 KiB/s wr, 6 op/s 2026-03-10T12:43:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:07 vm00.local ceph-mon[103263]: pgmap v161: 65 pgs: 65 active+clean; 210 MiB data, 945 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 1.2 KiB/s wr, 6 op/s 2026-03-10T12:43:09.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:09 vm00.local ceph-mon[103263]: pgmap v162: 65 pgs: 65 active+clean; 154 MiB data, 801 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 1.3 KiB/s wr, 7 op/s 2026-03-10T12:43:10.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:09 vm07.local ceph-mon[93622]: pgmap v162: 65 pgs: 65 active+clean; 154 MiB data, 801 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 1.3 KiB/s wr, 7 op/s 2026-03-10T12:43:12.218 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:11 vm00.local ceph-mon[103263]: pgmap v163: 65 pgs: 65 active+clean; 150 MiB data, 753 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 938 B/s wr, 7 op/s 2026-03-10T12:43:12.218 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:11 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 
2026-03-10T12:43:12.218 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:11 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:12.218 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:11 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:43:12.218 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:11 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:12.218 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:11 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:12.218 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:11 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:12.218 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:11 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:12.218 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:11 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:12.218 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:11 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:12.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:11 vm07.local ceph-mon[93622]: pgmap v163: 65 pgs: 65 active+clean; 150 MiB data, 753 MiB used, 119 GiB / 120 GiB avail; 0 
B/s rd, 938 B/s wr, 7 op/s 2026-03-10T12:43:12.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:11 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:12.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:11 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:12.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:11 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:43:12.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:11 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:12.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:11 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:12.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:11 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:12.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:11 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:12.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:11 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:12.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:11 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' 
entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:13.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:12 vm00.local ceph-mon[103263]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-10T12:43:13.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:12 vm07.local ceph-mon[93622]: Upgrade: Waiting for fs cephfs to scale down to reach 1 MDS 2026-03-10T12:43:14.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:13 vm00.local ceph-mon[103263]: pgmap v164: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 1.2 KiB/s wr, 7 op/s 2026-03-10T12:43:14.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:13 vm07.local ceph-mon[93622]: pgmap v164: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 1.2 KiB/s wr, 7 op/s 2026-03-10T12:43:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:15 vm00.local ceph-mon[103263]: pgmap v165: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 597 B/s wr, 6 op/s 2026-03-10T12:43:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:15 vm00.local ceph-mon[103263]: daemon mds.cephfs.vm00.lnokoe finished stopping rank 1 in filesystem cephfs (now has 1 ranks) 2026-03-10T12:43:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:43:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:15 vm07.local ceph-mon[93622]: pgmap v165: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 597 B/s wr, 6 op/s 2026-03-10T12:43:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:15 vm07.local ceph-mon[93622]: daemon mds.cephfs.vm00.lnokoe finished stopping rank 1 in filesystem cephfs (now has 1 ranks) 
2026-03-10T12:43:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:43:17.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:16 vm00.local ceph-mon[103263]: mds.1 [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] down:stopped 2026-03-10T12:43:17.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:16 vm00.local ceph-mon[103263]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:active} 2 up:standby 2026-03-10T12:43:17.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:16 vm07.local ceph-mon[93622]: mds.1 [v2:192.168.123.100:6826/2640363946,v1:192.168.123.100:6827/2640363946] down:stopped 2026-03-10T12:43:17.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:16 vm07.local ceph-mon[93622]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:active} 2 up:standby 2026-03-10T12:43:18.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:17 vm00.local ceph-mon[103263]: pgmap v166: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 426 B/s wr, 2 op/s 2026-03-10T12:43:18.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:17 vm00.local ceph-mon[103263]: mds.? 
[v2:192.168.123.100:6826/1069803323,v1:192.168.123.100:6827/1069803323] up:boot 2026-03-10T12:43:18.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:17 vm00.local ceph-mon[103263]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:active} 3 up:standby 2026-03-10T12:43:18.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.lnokoe"}]: dispatch 2026-03-10T12:43:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:17 vm07.local ceph-mon[93622]: pgmap v166: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 426 B/s wr, 2 op/s 2026-03-10T12:43:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:17 vm07.local ceph-mon[93622]: mds.? [v2:192.168.123.100:6826/1069803323,v1:192.168.123.100:6827/1069803323] up:boot 2026-03-10T12:43:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:17 vm07.local ceph-mon[93622]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:active} 3 up:standby 2026-03-10T12:43:18.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.lnokoe"}]: dispatch 2026-03-10T12:43:19.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:19 vm00.local ceph-mon[103263]: pgmap v167: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 341 B/s wr, 1 op/s 2026-03-10T12:43:20.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:19 vm07.local ceph-mon[93622]: pgmap v167: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 341 B/s wr, 1 op/s 2026-03-10T12:43:22.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:21 vm00.local ceph-mon[103263]: pgmap v168: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 255 
B/s wr, 0 op/s 2026-03-10T12:43:22.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:21 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:22.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:21 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:22.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:21 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:43:22.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:21 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:22.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:21 vm07.local ceph-mon[93622]: pgmap v168: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 255 B/s wr, 0 op/s 2026-03-10T12:43:22.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:21 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:22.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:21 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:22.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:21 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:43:22.281 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:21 vm07.local ceph-mon[93622]: from='mgr.24563 
192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:23.105 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:23.105 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:23.105 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:23.105 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:23.105 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:23.105 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm00.lnokoe"]}]: dispatch 2026-03-10T12:43:23.105 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:22 vm07.local ceph-mon[93622]: Upgrade: It appears safe to stop mds.cephfs.vm00.lnokoe 2026-03-10T12:43:23.105 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:23.106 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", 
"entity": "mds.cephfs.vm00.lnokoe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:43:23.106 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:22 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:23.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:23.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:23.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:23.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:23.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:23.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm00.lnokoe"]}]: dispatch 2026-03-10T12:43:23.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:22 vm00.local ceph-mon[103263]: Upgrade: It appears safe to stop mds.cephfs.vm00.lnokoe 2026-03-10T12:43:23.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 
12:43:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:23.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm00.lnokoe", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:43:23.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:22 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:24.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:23 vm00.local ceph-mon[103263]: Upgrade: Updating mds.cephfs.vm00.lnokoe 2026-03-10T12:43:24.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:23 vm00.local ceph-mon[103263]: Deploying daemon mds.cephfs.vm00.lnokoe on vm00 2026-03-10T12:43:24.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:23 vm00.local ceph-mon[103263]: pgmap v169: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 255 B/s wr, 0 op/s 2026-03-10T12:43:24.155 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:23 vm00.local ceph-mon[103263]: osdmap e80: 6 total, 6 up, 6 in 2026-03-10T12:43:24.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:23 vm07.local ceph-mon[93622]: Upgrade: Updating mds.cephfs.vm00.lnokoe 2026-03-10T12:43:24.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:23 vm07.local ceph-mon[93622]: Deploying daemon mds.cephfs.vm00.lnokoe on vm00 2026-03-10T12:43:24.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:23 vm07.local ceph-mon[93622]: pgmap v169: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 255 B/s wr, 0 op/s 2026-03-10T12:43:24.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:23 vm07.local 
ceph-mon[93622]: osdmap e80: 6 total, 6 up, 6 in 2026-03-10T12:43:25.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:24 vm00.local ceph-mon[103263]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:active} 2 up:standby 2026-03-10T12:43:25.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:25.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:25.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:25.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:25.122 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:25.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:24 vm07.local ceph-mon[93622]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:active} 2 up:standby 2026-03-10T12:43:25.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:25.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:25.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:25.316 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:25.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:26.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:26 vm00.local ceph-mon[103263]: pgmap v171: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:43:26.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:26 vm00.local ceph-mon[103263]: mds.? [v2:192.168.123.100:6826/2887557827,v1:192.168.123.100:6827/2887557827] up:boot 2026-03-10T12:43:26.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:26 vm00.local ceph-mon[103263]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:active} 3 up:standby 2026-03-10T12:43:26.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.lnokoe"}]: dispatch 2026-03-10T12:43:26.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:26 vm07.local ceph-mon[93622]: pgmap v171: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:43:26.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:26 vm07.local ceph-mon[93622]: mds.? 
[v2:192.168.123.100:6826/2887557827,v1:192.168.123.100:6827/2887557827] up:boot 2026-03-10T12:43:26.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:26 vm07.local ceph-mon[93622]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:active} 3 up:standby 2026-03-10T12:43:26.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.lnokoe"}]: dispatch 2026-03-10T12:43:27.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:27 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:27.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:27 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:27.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:27 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:27.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:27 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:27.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:27 vm00.local ceph-mon[103263]: pgmap v172: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:43:27.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:27 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:27.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:27 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:27.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:27 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:27.566 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:27 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:27.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:27 vm07.local ceph-mon[93622]: pgmap v172: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:43:29.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: Detected new or changed devices on vm00 2026-03-10T12:43:29.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:29.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:29.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:29.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:43:29.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:29.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:29.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: 
dispatch 2026-03-10T12:43:29.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:29.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:29.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:29.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm00.wdwvcu"]}]: dispatch 2026-03-10T12:43:29.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: Upgrade: It appears safe to stop mds.cephfs.vm00.wdwvcu 2026-03-10T12:43:29.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:29.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm00.wdwvcu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:43:29.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:28 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: Detected new or 
changed devices on vm00 2026-03-10T12:43:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:43:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:29.316 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm00.wdwvcu"]}]: dispatch 2026-03-10T12:43:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: Upgrade: It appears safe to stop mds.cephfs.vm00.wdwvcu 2026-03-10T12:43:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm00.wdwvcu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:43:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:28 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:29.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:29 vm00.local ceph-mon[103263]: Upgrade: Updating mds.cephfs.vm00.wdwvcu 2026-03-10T12:43:29.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:29 vm00.local ceph-mon[103263]: Deploying daemon mds.cephfs.vm00.wdwvcu on vm00 2026-03-10T12:43:29.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:29 vm00.local ceph-mon[103263]: pgmap v173: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:43:29.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:29 vm00.local 
ceph-mon[103263]: osdmap e81: 6 total, 6 up, 6 in 2026-03-10T12:43:29.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:29 vm00.local ceph-mon[103263]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:active} 2 up:standby 2026-03-10T12:43:30.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:29 vm07.local ceph-mon[93622]: Upgrade: Updating mds.cephfs.vm00.wdwvcu 2026-03-10T12:43:30.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:29 vm07.local ceph-mon[93622]: Deploying daemon mds.cephfs.vm00.wdwvcu on vm00 2026-03-10T12:43:30.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:29 vm07.local ceph-mon[93622]: pgmap v173: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:43:30.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:29 vm07.local ceph-mon[93622]: osdmap e81: 6 total, 6 up, 6 in 2026-03-10T12:43:30.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:29 vm07.local ceph-mon[93622]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:active} 2 up:standby 2026-03-10T12:43:31.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:43:31.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:43:32.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:31 vm00.local ceph-mon[103263]: pgmap v175: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:43:32.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:31 vm07.local ceph-mon[93622]: pgmap v175: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:43:32.973 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:32 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.wdwvcu"}]: dispatch 2026-03-10T12:43:33.007 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.005+0000 7fd7ae8a4700 1 -- 192.168.123.100:0/2176674814 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd7a8071a60 msgr2=0x7fd7a8071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:33.007 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.005+0000 7fd7ae8a4700 1 --2- 192.168.123.100:0/2176674814 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd7a8071a60 0x7fd7a8071e70 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7fd7a4009b00 tx=0x7fd7a4009e10 comp rx=0 tx=0).stop 2026-03-10T12:43:33.007 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.007+0000 7fd7ae8a4700 1 -- 192.168.123.100:0/2176674814 shutdown_connections 2026-03-10T12:43:33.007 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.007+0000 7fd7ae8a4700 1 --2- 192.168.123.100:0/2176674814 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7a8072440 0x7fd7a810be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:33.007 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.007+0000 7fd7ae8a4700 1 --2- 192.168.123.100:0/2176674814 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd7a8071a60 0x7fd7a8071e70 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:33.008 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.007+0000 7fd7ae8a4700 1 -- 192.168.123.100:0/2176674814 >> 192.168.123.100:0/2176674814 conn(0x7fd7a806d1a0 msgr2=0x7fd7a806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:43:33.008 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.007+0000 7fd7ae8a4700 1 -- 192.168.123.100:0/2176674814 shutdown_connections 2026-03-10T12:43:33.008 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.007+0000 7fd7ae8a4700 1 -- 192.168.123.100:0/2176674814 wait complete. 2026-03-10T12:43:33.008 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.007+0000 7fd7ae8a4700 1 Processor -- start 2026-03-10T12:43:33.008 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.008+0000 7fd7ae8a4700 1 -- start start 2026-03-10T12:43:33.008 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.008+0000 7fd7ae8a4700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd7a8072440 0x7fd7a8116a70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:33.009 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.008+0000 7fd7ae8a4700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7a8116fb0 0x7fd7a81b27a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:33.009 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.008+0000 7fd7ae8a4700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd7a81174b0 con 0x7fd7a8072440 2026-03-10T12:43:33.009 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.008+0000 7fd7ae8a4700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd7a81175f0 con 0x7fd7a8116fb0 2026-03-10T12:43:33.009 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.008+0000 7fd7ad0a1700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7a8116fb0 0x7fd7a81b27a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:33.009 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.008+0000 7fd7ad0a1700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7a8116fb0 0x7fd7a81b27a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:32830/0 (socket says 192.168.123.100:32830) 2026-03-10T12:43:33.009 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.008+0000 7fd7ad0a1700 1 -- 192.168.123.100:0/1438935639 learned_addr learned my addr 192.168.123.100:0/1438935639 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:43:33.009 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.009+0000 7fd7ad0a1700 1 -- 192.168.123.100:0/1438935639 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd7a8072440 msgr2=0x7fd7a8116a70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:33.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.009+0000 7fd7ad0a1700 1 --2- 192.168.123.100:0/1438935639 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd7a8072440 0x7fd7a8116a70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:33.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.009+0000 7fd7ad0a1700 1 -- 192.168.123.100:0/1438935639 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd7a40097e0 con 0x7fd7a8116fb0 2026-03-10T12:43:33.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.009+0000 7fd7ad0a1700 1 --2- 192.168.123.100:0/1438935639 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7a8116fb0 0x7fd7a81b27a0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fd7a0007f00 tx=0x7fd7a000d3b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:43:33.010 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.010+0000 7fd79effd700 1 -- 192.168.123.100:0/1438935639 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd7a000dcf0 con 0x7fd7a8116fb0 2026-03-10T12:43:33.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.010+0000 7fd7ae8a4700 1 -- 192.168.123.100:0/1438935639 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd7a81b2d40 con 0x7fd7a8116fb0 2026-03-10T12:43:33.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.010+0000 7fd79effd700 1 -- 192.168.123.100:0/1438935639 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd7a000f040 con 0x7fd7a8116fb0 2026-03-10T12:43:33.011 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.010+0000 7fd79effd700 1 -- 192.168.123.100:0/1438935639 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd7a00127c0 con 0x7fd7a8116fb0 2026-03-10T12:43:33.011 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.010+0000 7fd7ae8a4700 1 -- 192.168.123.100:0/1438935639 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd7a81b3240 con 0x7fd7a8116fb0 2026-03-10T12:43:33.012 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.011+0000 7fd79effd700 1 -- 192.168.123.100:0/1438935639 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd7a0004ad0 con 0x7fd7a8116fb0 2026-03-10T12:43:33.012 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.012+0000 7fd7ae8a4700 1 -- 192.168.123.100:0/1438935639 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd7a804ea50 con 0x7fd7a8116fb0 2026-03-10T12:43:33.012 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.012+0000 7fd79effd700 1 --2- 
192.168.123.100:0/1438935639 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd7940776c0 0x7fd794079b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:33.013 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.012+0000 7fd79effd700 1 -- 192.168.123.100:0/1438935639 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fd7a0098c70 con 0x7fd7a8116fb0 2026-03-10T12:43:33.013 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.012+0000 7fd7ad8a2700 1 --2- 192.168.123.100:0/1438935639 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd7940776c0 0x7fd794079b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:33.013 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.013+0000 7fd7ad8a2700 1 --2- 192.168.123.100:0/1438935639 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd7940776c0 0x7fd794079b70 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fd7a4009ad0 tx=0x7fd7a4009f90 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:43:33.015 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.015+0000 7fd79effd700 1 -- 192.168.123.100:0/1438935639 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd7a009f080 con 0x7fd7a8116fb0 2026-03-10T12:43:33.175 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:32 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.wdwvcu"}]: dispatch 2026-03-10T12:43:33.179 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.178+0000 7fd7ae8a4700 1 -- 
192.168.123.100:0/1438935639 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd7a810b6e0 con 0x7fd7940776c0 2026-03-10T12:43:33.183 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.181+0000 7fd79effd700 1 -- 192.168.123.100:0/1438935639 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7fd7a810b6e0 con 0x7fd7940776c0 2026-03-10T12:43:33.185 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.184+0000 7fd79cf79700 1 -- 192.168.123.100:0/1438935639 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd7940776c0 msgr2=0x7fd794079b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:33.185 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.184+0000 7fd79cf79700 1 --2- 192.168.123.100:0/1438935639 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd7940776c0 0x7fd794079b70 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fd7a4009ad0 tx=0x7fd7a4009f90 comp rx=0 tx=0).stop 2026-03-10T12:43:33.185 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.184+0000 7fd79cf79700 1 -- 192.168.123.100:0/1438935639 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7a8116fb0 msgr2=0x7fd7a81b27a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:33.185 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.184+0000 7fd79cf79700 1 --2- 192.168.123.100:0/1438935639 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7a8116fb0 0x7fd7a81b27a0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fd7a0007f00 tx=0x7fd7a000d3b0 comp rx=0 tx=0).stop 2026-03-10T12:43:33.186 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.185+0000 7fd79cf79700 1 -- 192.168.123.100:0/1438935639 shutdown_connections 
2026-03-10T12:43:33.186 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.185+0000 7fd79cf79700 1 --2- 192.168.123.100:0/1438935639 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd7940776c0 0x7fd794079b70 secure :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fd7a4009ad0 tx=0x7fd7a4009f90 comp rx=0 tx=0).stop 2026-03-10T12:43:33.186 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.185+0000 7fd79cf79700 1 --2- 192.168.123.100:0/1438935639 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd7a8072440 0x7fd7a8116a70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:33.186 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.185+0000 7fd79cf79700 1 --2- 192.168.123.100:0/1438935639 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd7a8116fb0 0x7fd7a81b27a0 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:33.186 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.185+0000 7fd79cf79700 1 -- 192.168.123.100:0/1438935639 >> 192.168.123.100:0/1438935639 conn(0x7fd7a806d1a0 msgr2=0x7fd7a80705f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:43:33.186 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.185+0000 7fd79cf79700 1 -- 192.168.123.100:0/1438935639 shutdown_connections 2026-03-10T12:43:33.186 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.185+0000 7fd79cf79700 1 -- 192.168.123.100:0/1438935639 wait complete. 
2026-03-10T12:43:33.198 INFO:teuthology.orchestra.run.vm00.stdout:true
2026-03-10T12:43:33.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.275+0000 7fc0c1664700 1 -- 192.168.123.100:0/4081564151 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0bc072440 msgr2=0x7fc0bc10be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:33.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.275+0000 7fc0c1664700 1 --2- 192.168.123.100:0/4081564151 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0bc072440 0x7fc0bc10be90 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7fc0b400b600 tx=0x7fc0b400b910 comp rx=0 tx=0).stop
2026-03-10T12:43:33.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.276+0000 7fc0c1664700 1 -- 192.168.123.100:0/4081564151 shutdown_connections
2026-03-10T12:43:33.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.276+0000 7fc0c1664700 1 --2- 192.168.123.100:0/4081564151 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0bc072440 0x7fc0bc10be90 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:33.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.276+0000 7fc0c1664700 1 --2- 192.168.123.100:0/4081564151 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0bc071a60 0x7fc0bc071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:33.277 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.276+0000 7fc0c1664700 1 -- 192.168.123.100:0/4081564151 >> 192.168.123.100:0/4081564151 conn(0x7fc0bc06d1a0 msgr2=0x7fc0bc06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:43:33.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.277+0000 7fc0c1664700 1 -- 192.168.123.100:0/4081564151 shutdown_connections
2026-03-10T12:43:33.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.277+0000 7fc0c1664700 1 -- 192.168.123.100:0/4081564151 wait complete.
2026-03-10T12:43:33.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.277+0000 7fc0c1664700 1 Processor -- start
2026-03-10T12:43:33.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.278+0000 7fc0c1664700 1 -- start start
2026-03-10T12:43:33.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.278+0000 7fc0c1664700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0bc071a60 0x7fc0bc1a4b50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:33.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.278+0000 7fc0c1664700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0bc1a5090 0x7fc0bc076fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:33.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.278+0000 7fc0c1664700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0bc1a5590 con 0x7fc0bc071a60
2026-03-10T12:43:33.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.278+0000 7fc0c1664700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0bc1a5700 con 0x7fc0bc1a5090
2026-03-10T12:43:33.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.278+0000 7fc0bbfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0bc071a60 0x7fc0bc1a4b50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:33.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.278+0000 7fc0bbfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0bc071a60 0x7fc0bc1a4b50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:44850/0 (socket says 192.168.123.100:44850)
2026-03-10T12:43:33.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.278+0000 7fc0bbfff700 1 -- 192.168.123.100:0/90135082 learned_addr learned my addr 192.168.123.100:0/90135082 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:43:33.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.278+0000 7fc0bb7fe700 1 --2- 192.168.123.100:0/90135082 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0bc1a5090 0x7fc0bc076fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:33.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.278+0000 7fc0bbfff700 1 -- 192.168.123.100:0/90135082 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0bc1a5090 msgr2=0x7fc0bc076fa0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:33.280 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.278+0000 7fc0bbfff700 1 --2- 192.168.123.100:0/90135082 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0bc1a5090 0x7fc0bc076fa0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:33.281 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.278+0000 7fc0bbfff700 1 -- 192.168.123.100:0/90135082 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc0b400b050 con 0x7fc0bc071a60
2026-03-10T12:43:33.281 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.279+0000 7fc0bbfff700 1 --2- 192.168.123.100:0/90135082 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0bc071a60 0x7fc0bc1a4b50 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7fc0ac00ba70 tx=0x7fc0ac00be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:43:33.281 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.280+0000 7fc0b97fa700 1 -- 192.168.123.100:0/90135082 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc0ac00c760 con 0x7fc0bc071a60
2026-03-10T12:43:33.282 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.280+0000 7fc0c1664700 1 -- 192.168.123.100:0/90135082 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc0bc077540 con 0x7fc0bc071a60
2026-03-10T12:43:33.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.282+0000 7fc0c1664700 1 -- 192.168.123.100:0/90135082 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc0bc077a60 con 0x7fc0bc071a60
2026-03-10T12:43:33.283 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.283+0000 7fc0c1664700 1 -- 192.168.123.100:0/90135082 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc0bc19ec20 con 0x7fc0bc071a60
2026-03-10T12:43:33.286 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.286+0000 7fc0b97fa700 1 -- 192.168.123.100:0/90135082 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc0ac00cda0 con 0x7fc0bc071a60
2026-03-10T12:43:33.287 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.286+0000 7fc0b97fa700 1 -- 192.168.123.100:0/90135082 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc0ac012570 con 0x7fc0bc071a60
2026-03-10T12:43:33.287 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.286+0000 7fc0b97fa700 1 -- 192.168.123.100:0/90135082 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc0ac012790 con 0x7fc0bc071a60
2026-03-10T12:43:33.288 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.287+0000 7fc0b97fa700 1 --2- 192.168.123.100:0/90135082 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc0a4077ab0 0x7fc0a4079f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:33.288 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.288+0000 7fc0bb7fe700 1 --2- 192.168.123.100:0/90135082 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc0a4077ab0 0x7fc0a4079f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:33.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.288+0000 7fc0bb7fe700 1 --2- 192.168.123.100:0/90135082 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc0a4077ab0 0x7fc0a4079f60 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fc0b400bd90 tx=0x7fc0b4007c00 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:43:33.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.289+0000 7fc0b97fa700 1 -- 192.168.123.100:0/90135082 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fc0ac0998a0 con 0x7fc0bc071a60
2026-03-10T12:43:33.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.289+0000 7fc0b97fa700 1 -- 192.168.123.100:0/90135082 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc0ac09d070 con 0x7fc0bc071a60
2026-03-10T12:43:33.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.450+0000 7fc0c1664700 1 -- 192.168.123.100:0/90135082 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc0bc061190 con 0x7fc0a4077ab0
2026-03-10T12:43:33.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.451+0000 7fc0b97fa700 1 -- 192.168.123.100:0/90135082 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7fc0bc061190 con 0x7fc0a4077ab0
2026-03-10T12:43:33.455 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.454+0000 7fc0a2ffd700 1 -- 192.168.123.100:0/90135082 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc0a4077ab0 msgr2=0x7fc0a4079f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:33.455 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.454+0000 7fc0a2ffd700 1 --2- 192.168.123.100:0/90135082 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc0a4077ab0 0x7fc0a4079f60 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fc0b400bd90 tx=0x7fc0b4007c00 comp rx=0 tx=0).stop
2026-03-10T12:43:33.455 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.454+0000 7fc0a2ffd700 1 -- 192.168.123.100:0/90135082 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0bc071a60 msgr2=0x7fc0bc1a4b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:33.456 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.454+0000 7fc0a2ffd700 1 --2- 192.168.123.100:0/90135082 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0bc071a60 0x7fc0bc1a4b50 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7fc0ac00ba70 tx=0x7fc0ac00be30 comp rx=0 tx=0).stop
2026-03-10T12:43:33.456 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.455+0000 7fc0a2ffd700 1 -- 192.168.123.100:0/90135082 shutdown_connections
2026-03-10T12:43:33.456 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.455+0000 7fc0a2ffd700 1 --2- 192.168.123.100:0/90135082 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc0a4077ab0 0x7fc0a4079f60 secure :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fc0b400bd90 tx=0x7fc0b4007c00 comp rx=0 tx=0).stop
2026-03-10T12:43:33.456 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.455+0000 7fc0a2ffd700 1 --2- 192.168.123.100:0/90135082 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0bc071a60 0x7fc0bc1a4b50 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:33.456 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.455+0000 7fc0a2ffd700 1 --2- 192.168.123.100:0/90135082 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0bc1a5090 0x7fc0bc076fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:33.456 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.455+0000 7fc0a2ffd700 1 -- 192.168.123.100:0/90135082 >> 192.168.123.100:0/90135082 conn(0x7fc0bc06d1a0 msgr2=0x7fc0bc10a6d0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:43:33.458 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.458+0000 7fc0a2ffd700 1 -- 192.168.123.100:0/90135082 shutdown_connections
2026-03-10T12:43:33.458 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.458+0000 7fc0a2ffd700 1 -- 192.168.123.100:0/90135082 wait complete.
2026-03-10T12:43:33.564 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.560+0000 7fc5139db700 1 -- 192.168.123.100:0/2006780873 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc50c1036f0 msgr2=0x7fc50c103b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:33.564 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.560+0000 7fc5139db700 1 --2- 192.168.123.100:0/2006780873 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc50c1036f0 0x7fc50c103b40 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fc50800b600 tx=0x7fc50800b910 comp rx=0 tx=0).stop
2026-03-10T12:43:33.564 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.561+0000 7fc5139db700 1 -- 192.168.123.100:0/2006780873 shutdown_connections
2026-03-10T12:43:33.564 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.561+0000 7fc5139db700 1 --2- 192.168.123.100:0/2006780873 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc50c1036f0 0x7fc50c103b40 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:33.564 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.561+0000 7fc5139db700 1 --2- 192.168.123.100:0/2006780873 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc50c1024f0 0x7fc50c102900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:33.564 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.561+0000 7fc5139db700 1 -- 192.168.123.100:0/2006780873 >> 192.168.123.100:0/2006780873 conn(0x7fc50c0fdac0 msgr2=0x7fc50c0ffed0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:43:33.564 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.562+0000 7fc5139db700 1 -- 192.168.123.100:0/2006780873 shutdown_connections
2026-03-10T12:43:33.564 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.563+0000 7fc5139db700 1 -- 192.168.123.100:0/2006780873 wait complete.
2026-03-10T12:43:33.565 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.564+0000 7fc5139db700 1 Processor -- start
2026-03-10T12:43:33.565 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.565+0000 7fc5139db700 1 -- start start
2026-03-10T12:43:33.565 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.565+0000 7fc5139db700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc50c1024f0 0x7fc50c071d60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:33.566 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.565+0000 7fc5139db700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc50c1036f0 0x7fc50c0722a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:33.566 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.565+0000 7fc5139db700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc50c0728c0 con 0x7fc50c1036f0
2026-03-10T12:43:33.566 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.565+0000 7fc5139db700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc50c1a5bc0 con 0x7fc50c1024f0
2026-03-10T12:43:33.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.565+0000 7fc510f76700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc50c1036f0 0x7fc50c0722a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:33.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.566+0000 7fc510f76700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc50c1036f0 0x7fc50c0722a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:44876/0 (socket says 192.168.123.100:44876)
2026-03-10T12:43:33.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.566+0000 7fc510f76700 1 -- 192.168.123.100:0/1303059607 learned_addr learned my addr 192.168.123.100:0/1303059607 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:43:33.570 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.566+0000 7fc511777700 1 --2- 192.168.123.100:0/1303059607 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc50c1024f0 0x7fc50c071d60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:33.570 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.566+0000 7fc511777700 1 -- 192.168.123.100:0/1303059607 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc50c1036f0 msgr2=0x7fc50c0722a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:33.570 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.568+0000 7fc511777700 1 --2- 192.168.123.100:0/1303059607 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc50c1036f0 0x7fc50c0722a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:33.570 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.568+0000 7fc511777700 1 -- 192.168.123.100:0/1303059607 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc50800b050 con 0x7fc50c1024f0
2026-03-10T12:43:33.570 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.569+0000 7fc510f76700 1 --2- 192.168.123.100:0/1303059607 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc50c1036f0 0x7fc50c0722a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-10T12:43:33.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.571+0000 7fc511777700 1 --2- 192.168.123.100:0/1303059607 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc50c1024f0 0x7fc50c071d60 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fc50400ea80 tx=0x7fc50400ed90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:43:33.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.572+0000 7fc5027fc700 1 -- 192.168.123.100:0/1303059607 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc50400cb20 con 0x7fc50c1024f0
2026-03-10T12:43:33.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.572+0000 7fc5027fc700 1 -- 192.168.123.100:0/1303059607 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc504004500 con 0x7fc50c1024f0
2026-03-10T12:43:33.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.572+0000 7fc5027fc700 1 -- 192.168.123.100:0/1303059607 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc504010430 con 0x7fc50c1024f0
2026-03-10T12:43:33.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.572+0000 7fc5139db700 1 -- 192.168.123.100:0/1303059607 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc50c1a5dc0 con 0x7fc50c1024f0
2026-03-10T12:43:33.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.572+0000 7fc5139db700 1 -- 192.168.123.100:0/1303059607 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc50c1a62c0 con 0x7fc50c1024f0
2026-03-10T12:43:33.574 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.574+0000 7fc5139db700 1 -- 192.168.123.100:0/1303059607 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc50c04ea50 con 0x7fc50c1024f0
2026-03-10T12:43:33.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.577+0000 7fc5027fc700 1 -- 192.168.123.100:0/1303059607 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc504039b70 con 0x7fc50c1024f0
2026-03-10T12:43:33.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.578+0000 7fc5027fc700 1 --2- 192.168.123.100:0/1303059607 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc4f80778c0 0x7fc4f8079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:33.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.578+0000 7fc5027fc700 1 -- 192.168.123.100:0/1303059607 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fc504014070 con 0x7fc50c1024f0
2026-03-10T12:43:33.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.578+0000 7fc510f76700 1 --2- 192.168.123.100:0/1303059607 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc4f80778c0 0x7fc4f8079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:33.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.579+0000 7fc510f76700 1 --2- 192.168.123.100:0/1303059607 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc4f80778c0 0x7fc4f8079d70 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fc50c072cc0 tx=0x7fc508006210 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:43:33.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.579+0000 7fc5027fc700 1 -- 192.168.123.100:0/1303059607 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc504062860 con 0x7fc50c1024f0
2026-03-10T12:43:33.729 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.725+0000 7fc5139db700 1 -- 192.168.123.100:0/1303059607 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fc50c1a65a0 con 0x7fc4f80778c0
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (9m) 7s ago 10m 25.6M - 0.25.0 c8568f914cd2 1897443e8fdf
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (10m) 7s ago 10m 9781k - 18.2.0 dc2bc1663786 d9c35bbdf4cd
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (10m) 56s ago 10m 11.8M - 18.2.0 dc2bc1663786 2a98961ae9ca
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (4m) 7s ago 10m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 88cd1c2e0041
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (4m) 56s ago 10m 7864k - 19.2.3-678-ge911bdeb 654f31e6858e 889e2b356266
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (9m) 7s ago 10m 91.2M - 9.4.7 954c08fa6188 16c12a6ce1fc
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (9s) 7s ago 8m 16.7M - 19.2.3-678-ge911bdeb 654f31e6858e 6ba265e19d66
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 starting - - - -
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (8m) 56s ago 8m 18.7M - 18.2.0 dc2bc1663786 1b9425223bd2
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (8m) 56s ago 8m 143M - 18.2.0 dc2bc1663786 9c2b36fc1ac1
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:8443,9283,8765 running (5m) 7s ago 11m 627M - 19.2.3-678-ge911bdeb 654f31e6858e eaf11f86af53
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (4m) 56s ago 9m 490M - 19.2.3-678-ge911bdeb 654f31e6858e 22c93435649c
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (4m) 7s ago 11m 63.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e e8cc5980a849
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (4m) 56s ago 9m 52.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 032abad282fc
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (10m) 7s ago 10m 15.0M - 1.5.0 0da6a335fe13 7a5c14d6ba46
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (10m) 56s ago 10m 16.2M - 1.5.0 0da6a335fe13 2fac2415b763
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (3m) 7s ago 9m 183M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9b151d44f3cf
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (2m) 7s ago 9m 114M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 252ea98c5665
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (2m) 7s ago 9m 124M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 249137e44eb7
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (101s) 56s ago 9m 144M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 72a045e3b78b
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (79s) 56s ago 8m 115M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7ac87e1c2a41
2026-03-10T12:43:33.733 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (57s) 56s ago 8m 13.1M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bd169bf00834
2026-03-10T12:43:33.734 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (4m) 7s ago 10m 72.3M - 2.43.0 a07b618ecd1d c80074b6b052
2026-03-10T12:43:33.734 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.731+0000 7fc5027fc700 1 -- 192.168.123.100:0/1303059607 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fc50c1a65a0 con 0x7fc4f80778c0
2026-03-10T12:43:33.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.736+0000 7fc4f7fff700 1 -- 192.168.123.100:0/1303059607 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc4f80778c0 msgr2=0x7fc4f8079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:33.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.736+0000 7fc4f7fff700 1 --2- 192.168.123.100:0/1303059607 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc4f80778c0 0x7fc4f8079d70 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fc50c072cc0 tx=0x7fc508006210 comp rx=0 tx=0).stop
2026-03-10T12:43:33.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.736+0000 7fc4f7fff700 1 -- 192.168.123.100:0/1303059607 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc50c1024f0 msgr2=0x7fc50c071d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:33.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.736+0000 7fc4f7fff700 1 --2- 192.168.123.100:0/1303059607 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc50c1024f0 0x7fc50c071d60 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fc50400ea80 tx=0x7fc50400ed90 comp rx=0 tx=0).stop
2026-03-10T12:43:33.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.736+0000 7fc4f7fff700 1 -- 192.168.123.100:0/1303059607 shutdown_connections
2026-03-10T12:43:33.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.736+0000 7fc4f7fff700 1 --2- 192.168.123.100:0/1303059607 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc4f80778c0 0x7fc4f8079d70 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:33.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.736+0000 7fc4f7fff700 1 --2- 192.168.123.100:0/1303059607 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc50c1024f0 0x7fc50c071d60 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:33.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.736+0000 7fc4f7fff700 1 --2- 192.168.123.100:0/1303059607 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc50c1036f0 0x7fc50c0722a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:33.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.736+0000 7fc4f7fff700 1 -- 192.168.123.100:0/1303059607 >> 192.168.123.100:0/1303059607 conn(0x7fc50c0fdac0 msgr2=0x7fc50c106920 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:43:33.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.736+0000 7fc4f7fff700 1 -- 192.168.123.100:0/1303059607 shutdown_connections
2026-03-10T12:43:33.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.736+0000 7fc4f7fff700 1 -- 192.168.123.100:0/1303059607 wait complete.
2026-03-10T12:43:33.825 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.824+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/2987870123 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8a0100d60 msgr2=0x7ff8a0103140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:33.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.825+0000 7ff8a7ceb700 1 --2- 192.168.123.100:0/2987870123 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8a0100d60 0x7ff8a0103140 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7ff89c009b00 tx=0x7ff89c009e10 comp rx=0 tx=0).stop
2026-03-10T12:43:33.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.825+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/2987870123 shutdown_connections
2026-03-10T12:43:33.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.825+0000 7ff8a7ceb700 1 --2- 192.168.123.100:0/2987870123 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8a0103680 0x7ff8a0105a60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:33.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.825+0000 7ff8a7ceb700 1 --2- 192.168.123.100:0/2987870123 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8a0100d60 0x7ff8a0103140 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:33.826 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.826+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/2987870123 >> 192.168.123.100:0/2987870123 conn(0x7ff8a00fa760 msgr2=0x7ff8a00fcbb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:43:33.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.826+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/2987870123 shutdown_connections
2026-03-10T12:43:33.827 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.826+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/2987870123 wait complete.
2026-03-10T12:43:33.828 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.827+0000 7ff8a7ceb700 1 Processor -- start
2026-03-10T12:43:33.828 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.827+0000 7ff8a7ceb700 1 -- start start
2026-03-10T12:43:33.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.828+0000 7ff8a7ceb700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8a0100d60 0x7ff8a01937f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:33.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.828+0000 7ff8a7ceb700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8a0103680 0x7ff8a0193d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:33.829 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.829+0000 7ff8a5a87700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8a0100d60 0x7ff8a01937f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:33.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.829+0000 7ff8a5a87700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8a0100d60 0x7ff8a01937f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:32886/0 (socket says 192.168.123.100:32886)
2026-03-10T12:43:33.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.829+0000 7ff8a5286700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8a0103680 0x7ff8a0193d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:33.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.829+0000 7ff8a5286700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8a0103680 0x7ff8a0193d30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:44886/0 (socket says 192.168.123.100:44886)
2026-03-10T12:43:33.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.829+0000 7ff8a5a87700 1 -- 192.168.123.100:0/3505227462 learned_addr learned my addr 192.168.123.100:0/3505227462 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:43:33.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.829+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/3505227462 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff8a0194350 con 0x7ff8a0103680
2026-03-10T12:43:33.830 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.830+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/3505227462 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff8a0194490 con 0x7ff8a0100d60
2026-03-10T12:43:33.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.830+0000 7ff8a5a87700 1 -- 192.168.123.100:0/3505227462 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8a0103680 msgr2=0x7ff8a0193d30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:33.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.830+0000 7ff8a5a87700 1 --2- 192.168.123.100:0/3505227462 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8a0103680 0x7ff8a0193d30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:33.831 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.830+0000 7ff8a5a87700 1 -- 192.168.123.100:0/3505227462 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff89c0097e0 con 0x7ff8a0100d60
2026-03-10T12:43:33.832 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.831+0000 7ff8a5a87700 1 --2- 192.168.123.100:0/3505227462 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8a0100d60 0x7ff8a01937f0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7ff89c006010 tx=0x7ff89c004c30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:43:33.832 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.831+0000 7ff896ffd700 1 -- 192.168.123.100:0/3505227462 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff89c01d070 con 0x7ff8a0100d60
2026-03-10T12:43:33.832 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.831+0000 7ff896ffd700 1 -- 192.168.123.100:0/3505227462 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff89c00bd80 con 0x7ff8a0100d60
2026-03-10T12:43:33.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.832+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/3505227462 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff8a0198ee0 con 0x7ff8a0100d60
2026-03-10T12:43:33.833 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.832+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/3505227462 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff8a0199350 con 0x7ff8a0100d60
2026-03-10T12:43:33.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.833+0000 7ff896ffd700 1 -- 192.168.123.100:0/3505227462 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff89c00fa40 con 0x7ff8a0100d60
2026-03-10T12:43:33.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.834+0000 7ff896ffd700 1 -- 192.168.123.100:0/3505227462 <== mon.1
v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff89c00fc60 con 0x7ff8a0100d60 2026-03-10T12:43:33.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.834+0000 7ff896ffd700 1 --2- 192.168.123.100:0/3505227462 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff88c07bdf0 0x7ff88c07e2a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:33.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.835+0000 7ff8a5286700 1 --2- 192.168.123.100:0/3505227462 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff88c07bdf0 0x7ff88c07e2a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:33.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.836+0000 7ff896ffd700 1 -- 192.168.123.100:0/3505227462 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6136+0+0 (secure 0 0 0) 0x7ff89c09c400 con 0x7ff8a0100d60 2026-03-10T12:43:33.837 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.836+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/3505227462 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff884005320 con 0x7ff8a0100d60 2026-03-10T12:43:33.837 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.837+0000 7ff8a5286700 1 --2- 192.168.123.100:0/3505227462 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff88c07bdf0 0x7ff88c07e2a0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7ff890009c00 tx=0x7ff890009380 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:43:33.845 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:33.842+0000 7ff896ffd700 1 -- 192.168.123.100:0/3505227462 <== mon.1 
v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff89c064d60 con 0x7ff8a0100d60 2026-03-10T12:43:34.032 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.031+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/3505227462 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7ff884006200 con 0x7ff8a0100d60 2026-03-10T12:43:34.034 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.032+0000 7ff896ffd700 1 -- 192.168.123.100:0/3505227462 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+815 (secure 0 0 0) 0x7ff89c0644b0 con 0x7ff8a0100d60 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2, 2026-03-10T12:43:34.035 
INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 18.2.0 (5dd24139a1eada541a3bc16b6941c5dde975e26d) reef (stable)": 2, 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 12 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:43:34.035 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:43:34.038 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.037+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/3505227462 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff88c07bdf0 msgr2=0x7ff88c07e2a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:34.038 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.037+0000 7ff8a7ceb700 1 --2- 192.168.123.100:0/3505227462 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff88c07bdf0 0x7ff88c07e2a0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7ff890009c00 tx=0x7ff890009380 comp rx=0 tx=0).stop 2026-03-10T12:43:34.039 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.037+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/3505227462 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8a0100d60 msgr2=0x7ff8a01937f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:34.039 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.038+0000 7ff8a7ceb700 1 --2- 192.168.123.100:0/3505227462 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8a0100d60 0x7ff8a01937f0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto 
rx=0x7ff89c006010 tx=0x7ff89c004c30 comp rx=0 tx=0).stop 2026-03-10T12:43:34.039 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.038+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/3505227462 shutdown_connections 2026-03-10T12:43:34.039 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.038+0000 7ff8a7ceb700 1 --2- 192.168.123.100:0/3505227462 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff88c07bdf0 0x7ff88c07e2a0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:34.039 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.038+0000 7ff8a7ceb700 1 --2- 192.168.123.100:0/3505227462 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff8a0100d60 0x7ff8a01937f0 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:34.039 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.038+0000 7ff8a7ceb700 1 --2- 192.168.123.100:0/3505227462 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff8a0103680 0x7ff8a0193d30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:34.039 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.038+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/3505227462 >> 192.168.123.100:0/3505227462 conn(0x7ff8a00fa760 msgr2=0x7ff8a00fcbb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:43:34.039 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.038+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/3505227462 shutdown_connections 2026-03-10T12:43:34.039 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.038+0000 7ff8a7ceb700 1 -- 192.168.123.100:0/3505227462 wait complete. 
2026-03-10T12:43:34.077 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:33 vm00.local ceph-mon[103263]: pgmap v176: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:43:34.078 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:43:34.078 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:43:34.078 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:33 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:43:34.126 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.125+0000 7fad4106a700 1 -- 192.168.123.100:0/4270563031 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad3c072440 msgr2=0x7fad3c10be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:34.126 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.125+0000 7fad4106a700 1 --2- 192.168.123.100:0/4270563031 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad3c072440 0x7fad3c10be90 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7fad2c009b00 tx=0x7fad2c009e10 comp rx=0 tx=0).stop
2026-03-10T12:43:34.126 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.125+0000 7fad4106a700 1 -- 192.168.123.100:0/4270563031 shutdown_connections
2026-03-10T12:43:34.126 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.125+0000 7fad4106a700 1 --2- 192.168.123.100:0/4270563031 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad3c072440 0x7fad3c10be90 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:34.126 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.125+0000 7fad4106a700 1 --2- 192.168.123.100:0/4270563031 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fad3c071a60 0x7fad3c071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:34.126 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.125+0000 7fad4106a700 1 -- 192.168.123.100:0/4270563031 >> 192.168.123.100:0/4270563031 conn(0x7fad3c06d1a0 msgr2=0x7fad3c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:43:34.127 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.126+0000 7fad4106a700 1 -- 192.168.123.100:0/4270563031 shutdown_connections
2026-03-10T12:43:34.127 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.126+0000 7fad4106a700 1 -- 192.168.123.100:0/4270563031 wait complete.
2026-03-10T12:43:34.127 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.126+0000 7fad4106a700 1 Processor -- start
2026-03-10T12:43:34.128 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.126+0000 7fad4106a700 1 -- start start
2026-03-10T12:43:34.128 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.126+0000 7fad4106a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad3c071a60 0x7fad3c116a80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:34.128 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.126+0000 7fad4106a700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fad3c116fc0 0x7fad3c1b2830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:34.128 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.126+0000 7fad4106a700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fad3c1174c0 con 0x7fad3c071a60
2026-03-10T12:43:34.128 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.126+0000 7fad4106a700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fad3c117630 con 0x7fad3c116fc0
2026-03-10T12:43:34.128 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.126+0000 7fad3bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad3c071a60 0x7fad3c116a80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:34.128 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.127+0000 7fad3b7fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fad3c116fc0 0x7fad3c1b2830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:34.128 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.127+0000 7fad3b7fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fad3c116fc0 0x7fad3c1b2830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:32896/0 (socket says 192.168.123.100:32896)
2026-03-10T12:43:34.128 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.127+0000 7fad3b7fe700 1 -- 192.168.123.100:0/3658701007 learned_addr learned my addr 192.168.123.100:0/3658701007 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:43:34.128 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.127+0000 7fad3b7fe700 1 -- 192.168.123.100:0/3658701007 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad3c071a60 msgr2=0x7fad3c116a80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:34.128 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.127+0000 7fad3b7fe700 1 --2- 192.168.123.100:0/3658701007 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad3c071a60 0x7fad3c116a80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:34.128 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.127+0000 7fad3b7fe700 1 -- 192.168.123.100:0/3658701007 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fad2c0097e0 con 0x7fad3c116fc0
2026-03-10T12:43:34.128 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.127+0000 7fad3b7fe700 1 --2- 192.168.123.100:0/3658701007 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fad3c116fc0 0x7fad3c1b2830 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fad2c009fd0 tx=0x7fad2c00faf0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:43:34.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.127+0000 7fad397fa700 1 -- 192.168.123.100:0/3658701007 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fad2c01c070 con 0x7fad3c116fc0
2026-03-10T12:43:34.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.128+0000 7fad4106a700 1 -- 192.168.123.100:0/3658701007 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fad3c1b2d70 con 0x7fad3c116fc0
2026-03-10T12:43:34.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.128+0000 7fad4106a700 1 -- 192.168.123.100:0/3658701007 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fad3c1b3230 con 0x7fad3c116fc0
2026-03-10T12:43:34.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.128+0000 7fad397fa700 1 -- 192.168.123.100:0/3658701007 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fad2c00bea0 con 0x7fad3c116fc0
2026-03-10T12:43:34.130 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.128+0000 7fad397fa700 1 -- 192.168.123.100:0/3658701007 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fad2c017870 con 0x7fad3c116fc0
2026-03-10T12:43:34.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.129+0000 7fad4106a700 1 -- 192.168.123.100:0/3658701007 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fad28005320 con 0x7fad3c116fc0
2026-03-10T12:43:34.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.130+0000 7fad397fa700 1 -- 192.168.123.100:0/3658701007 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fad2c0179d0 con 0x7fad3c116fc0
2026-03-10T12:43:34.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.130+0000 7fad397fa700 1 --2- 192.168.123.100:0/3658701007 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fad24077910 0x7fad24079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:34.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.130+0000 7fad397fa700 1 -- 192.168.123.100:0/3658701007 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fad2c09abf0 con 0x7fad3c116fc0
2026-03-10T12:43:34.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.131+0000 7fad3bfff700 1 --2- 192.168.123.100:0/3658701007 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fad24077910 0x7fad24079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:34.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.131+0000 7fad3bfff700 1 --2- 192.168.123.100:0/3658701007 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fad24077910 0x7fad24079dc0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fad3c072f50 tx=0x7fad30008040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:43:34.138 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.136+0000 7fad397fa700 1 -- 192.168.123.100:0/3658701007 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fad2c0634a0 con 0x7fad3c116fc0
2026-03-10T12:43:34.306 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.305+0000 7fad4106a700 1 -- 192.168.123.100:0/3658701007 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fad28005cc0 con 0x7fad3c116fc0
2026-03-10T12:43:34.306 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.306+0000 7fad397fa700 1 -- 192.168.123.100:0/3658701007 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 24 v24) v1 ==== 76+0+1927 (secure 0 0 0) 0x7fad2c062bf0 con 0x7fad3c116fc0
2026-03-10T12:43:34.307 INFO:teuthology.orchestra.run.vm00.stdout:e24
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:btime 2026-03-10T12:43:34:155542+0000
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1)
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:epoch 19
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:43:15.758277+0000
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:root 0
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:max_xattr_size 65536
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {}
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 0
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 1
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:in 0
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:up {0=24313}
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:failed
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:damaged
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:stopped 1
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3]
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:balancer
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:qdb_cluster leader: 24313 members: 24313
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{0:24313} state up:active seq 3 join_fscid=1 addr [v2:192.168.123.107:6824/1465224692,v1:192.168.123.107:6825/1465224692] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons:
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:24325} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] compat {c=[1],r=[1],i=[7ff]}]
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{-1:34368} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.100:6828/23281310,v1:192.168.123.100:6829/23281310] compat {c=[1],r=[1],i=[1fff]}]
2026-03-10T12:43:34.308 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{-1:44277} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.100:6826/2887557827,v1:192.168.123.100:6827/2887557827] compat {c=[1],r=[1],i=[1fff]}]
2026-03-10T12:43:34.310 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.310+0000 7fad4106a700 1 -- 192.168.123.100:0/3658701007 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fad24077910 msgr2=0x7fad24079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:34.310 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.310+0000 7fad4106a700 1 --2- 192.168.123.100:0/3658701007 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fad24077910 0x7fad24079dc0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7fad3c072f50 tx=0x7fad30008040 comp rx=0 tx=0).stop
2026-03-10T12:43:34.310 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.310+0000 7fad4106a700 1 -- 192.168.123.100:0/3658701007 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fad3c116fc0 msgr2=0x7fad3c1b2830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:34.310 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.310+0000 7fad4106a700 1 --2- 192.168.123.100:0/3658701007 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fad3c116fc0 0x7fad3c1b2830 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fad2c009fd0 tx=0x7fad2c00faf0 comp rx=0 tx=0).stop
2026-03-10T12:43:34.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.310+0000 7fad4106a700 1 -- 192.168.123.100:0/3658701007 shutdown_connections
2026-03-10T12:43:34.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.310+0000 7fad4106a700 1 --2- 192.168.123.100:0/3658701007 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fad24077910 0x7fad24079dc0 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:34.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.310+0000 7fad4106a700 1 --2- 192.168.123.100:0/3658701007 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad3c071a60 0x7fad3c116a80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:34.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.310+0000 7fad4106a700 1 --2- 192.168.123.100:0/3658701007 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fad3c116fc0 0x7fad3c1b2830 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:34.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.310+0000 7fad4106a700 1 -- 192.168.123.100:0/3658701007 >> 192.168.123.100:0/3658701007 conn(0x7fad3c06d1a0 msgr2=0x7fad3c070630 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:43:34.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.310+0000 7fad4106a700 1 -- 192.168.123.100:0/3658701007 shutdown_connections
2026-03-10T12:43:34.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.310+0000 7fad4106a700 1 -- 192.168.123.100:0/3658701007 wait complete.
2026-03-10T12:43:34.312 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 24
2026-03-10T12:43:34.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:33 vm07.local ceph-mon[93622]: pgmap v176: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:43:34.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:43:34.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:43:34.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:33 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:43:34.412 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.411+0000 7f40f643b700 1 -- 192.168.123.100:0/444318118 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40f0072440 msgr2=0x7f40f010be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:34.412 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.411+0000 7f40f643b700 1 --2- 192.168.123.100:0/444318118 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40f0072440 0x7f40f010be90 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f40e801c580 tx=0x7f40e801c890 comp rx=0 tx=0).stop
2026-03-10T12:43:34.412 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.411+0000 7f40f643b700 1 -- 192.168.123.100:0/444318118 shutdown_connections
2026-03-10T12:43:34.412 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.411+0000 7f40f643b700 1 --2- 192.168.123.100:0/444318118 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40f0072440 0x7f40f010be90 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:34.412 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.411+0000 7f40f643b700 1 --2- 192.168.123.100:0/444318118 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40f0071a60 0x7f40f0071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:43:34.412 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.411+0000 7f40f643b700 1 -- 192.168.123.100:0/444318118 >> 192.168.123.100:0/444318118 conn(0x7f40f006d1a0 msgr2=0x7f40f006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:43:34.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.417+0000 7f40f643b700 1 -- 192.168.123.100:0/444318118 shutdown_connections
2026-03-10T12:43:34.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.418+0000 7f40f643b700 1 -- 192.168.123.100:0/444318118 wait complete.
2026-03-10T12:43:34.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.418+0000 7f40f643b700 1 Processor -- start
2026-03-10T12:43:34.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.419+0000 7f40f643b700 1 -- start start
2026-03-10T12:43:34.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.419+0000 7f40f643b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40f0071a60 0x7f40f019c250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:34.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.419+0000 7f40f643b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40f0072440 0x7f40f019c790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:43:34.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.419+0000 7f40f643b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f40f019cdb0 con 0x7f40f0071a60
2026-03-10T12:43:34.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.419+0000 7f40f643b700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f40f019cef0 con 0x7f40f0072440
2026-03-10T12:43:34.421 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.419+0000 7f40f5439700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40f0071a60 0x7f40f019c250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:34.421 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.419+0000 7f40f5439700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40f0071a60 0x7f40f019c250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:44922/0 (socket says 192.168.123.100:44922)
2026-03-10T12:43:34.421 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.419+0000 7f40f5439700 1 -- 192.168.123.100:0/2522968719 learned_addr learned my addr 192.168.123.100:0/2522968719 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:43:34.421 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.419+0000 7f40f4c38700 1 --2- 192.168.123.100:0/2522968719 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40f0072440 0x7f40f019c790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:43:34.421 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.419+0000 7f40f5439700 1 -- 192.168.123.100:0/2522968719 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40f0072440 msgr2=0x7f40f019c790 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:43:34.421 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.419+0000 7f40f5439700 1 --2-
192.168.123.100:0/2522968719 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40f0072440 0x7f40f019c790 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:34.421 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.419+0000 7f40f5439700 1 -- 192.168.123.100:0/2522968719 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f40e801c060 con 0x7f40f0071a60 2026-03-10T12:43:34.421 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.420+0000 7f40f5439700 1 --2- 192.168.123.100:0/2522968719 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40f0071a60 0x7f40f019c250 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f40ec00ba70 tx=0x7f40ec00be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:43:34.421 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.420+0000 7f40e67fc700 1 -- 192.168.123.100:0/2522968719 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f40ec00c760 con 0x7f40f0071a60 2026-03-10T12:43:34.421 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.420+0000 7f40e67fc700 1 -- 192.168.123.100:0/2522968719 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f40ec00cda0 con 0x7f40f0071a60 2026-03-10T12:43:34.425 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.421+0000 7f40f643b700 1 -- 192.168.123.100:0/2522968719 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f40f01a19a0 con 0x7f40f0071a60 2026-03-10T12:43:34.425 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.421+0000 7f40f643b700 1 -- 192.168.123.100:0/2522968719 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f40f01a1ec0 con 0x7f40f0071a60 2026-03-10T12:43:34.425 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.421+0000 7f40e67fc700 1 -- 192.168.123.100:0/2522968719 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f40ec012570 con 0x7f40f0071a60 2026-03-10T12:43:34.425 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.421+0000 7f40f643b700 1 -- 192.168.123.100:0/2522968719 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f40f0196410 con 0x7f40f0071a60 2026-03-10T12:43:34.425 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.425+0000 7f40e67fc700 1 -- 192.168.123.100:0/2522968719 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f40ec00c8c0 con 0x7f40f0071a60 2026-03-10T12:43:34.426 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.425+0000 7f40e67fc700 1 --2- 192.168.123.100:0/2522968719 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f40dc077710 0x7f40dc079bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:34.426 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.425+0000 7f40e67fc700 1 -- 192.168.123.100:0/2522968719 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f40ec098df0 con 0x7f40f0071a60 2026-03-10T12:43:34.426 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.425+0000 7f40e67fc700 1 -- 192.168.123.100:0/2522968719 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f40ec0991d0 con 0x7f40f0071a60 2026-03-10T12:43:34.427 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.427+0000 7f40f4c38700 1 --2- 192.168.123.100:0/2522968719 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f40dc077710 0x7f40dc079bc0 unknown 
:-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:34.429 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.429+0000 7f40f4c38700 1 --2- 192.168.123.100:0/2522968719 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f40dc077710 0x7f40dc079bc0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f40e8009fd0 tx=0x7f40e8007b90 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:43:34.568 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.566+0000 7f40f643b700 1 -- 192.168.123.100:0/2522968719 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f40f0061190 con 0x7f40dc077710 2026-03-10T12:43:34.569 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.568+0000 7f40e67fc700 1 -- 192.168.123.100:0/2522968719 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f40f0061190 con 0x7f40dc077710 2026-03-10T12:43:34.569 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:43:34.569 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T12:43:34.569 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true, 2026-03-10T12:43:34.569 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T12:43:34.569 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [ 2026-03-10T12:43:34.569 INFO:teuthology.orchestra.run.vm00.stdout: "crash", 2026-03-10T12:43:34.569 INFO:teuthology.orchestra.run.vm00.stdout: "osd", 2026-03-10T12:43:34.569 INFO:teuthology.orchestra.run.vm00.stdout: "mgr", 2026-03-10T12:43:34.570 
INFO:teuthology.orchestra.run.vm00.stdout: "mon" 2026-03-10T12:43:34.570 INFO:teuthology.orchestra.run.vm00.stdout: ], 2026-03-10T12:43:34.570 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "13/23 daemons upgraded", 2026-03-10T12:43:34.570 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Currently upgrading mds daemons", 2026-03-10T12:43:34.570 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false 2026-03-10T12:43:34.570 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:43:34.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.573+0000 7f40dbfff700 1 -- 192.168.123.100:0/2522968719 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f40dc077710 msgr2=0x7f40dc079bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:34.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.573+0000 7f40dbfff700 1 --2- 192.168.123.100:0/2522968719 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f40dc077710 0x7f40dc079bc0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f40e8009fd0 tx=0x7f40e8007b90 comp rx=0 tx=0).stop 2026-03-10T12:43:34.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.573+0000 7f40dbfff700 1 -- 192.168.123.100:0/2522968719 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40f0071a60 msgr2=0x7f40f019c250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:34.573 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.573+0000 7f40dbfff700 1 --2- 192.168.123.100:0/2522968719 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40f0071a60 0x7f40f019c250 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f40ec00ba70 tx=0x7f40ec00be30 comp rx=0 tx=0).stop 2026-03-10T12:43:34.574 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.573+0000 7f40dbfff700 1 -- 192.168.123.100:0/2522968719 shutdown_connections 2026-03-10T12:43:34.574 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.573+0000 7f40dbfff700 1 --2- 192.168.123.100:0/2522968719 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f40dc077710 0x7f40dc079bc0 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:34.574 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.573+0000 7f40dbfff700 1 --2- 192.168.123.100:0/2522968719 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40f0071a60 0x7f40f019c250 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:34.574 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.573+0000 7f40dbfff700 1 --2- 192.168.123.100:0/2522968719 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40f0072440 0x7f40f019c790 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:34.574 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.573+0000 7f40dbfff700 1 -- 192.168.123.100:0/2522968719 >> 192.168.123.100:0/2522968719 conn(0x7f40f006d1a0 msgr2=0x7f40f010a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:43:34.574 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.573+0000 7f40dbfff700 1 -- 192.168.123.100:0/2522968719 shutdown_connections 2026-03-10T12:43:34.574 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.573+0000 7f40dbfff700 1 -- 192.168.123.100:0/2522968719 wait complete. 
2026-03-10T12:43:34.678 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.676+0000 7fb690fa2700 1 -- 192.168.123.100:0/3548344246 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb68c072440 msgr2=0x7fb68c10be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:34.678 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.676+0000 7fb690fa2700 1 --2- 192.168.123.100:0/3548344246 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb68c072440 0x7fb68c10be90 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7fb68401c580 tx=0x7fb68401c890 comp rx=0 tx=0).stop 2026-03-10T12:43:34.678 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.676+0000 7fb690fa2700 1 -- 192.168.123.100:0/3548344246 shutdown_connections 2026-03-10T12:43:34.678 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.676+0000 7fb690fa2700 1 --2- 192.168.123.100:0/3548344246 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb68c072440 0x7fb68c10be90 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:34.678 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.676+0000 7fb690fa2700 1 --2- 192.168.123.100:0/3548344246 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb68c071a60 0x7fb68c071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:34.678 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.676+0000 7fb690fa2700 1 -- 192.168.123.100:0/3548344246 >> 192.168.123.100:0/3548344246 conn(0x7fb68c06d1a0 msgr2=0x7fb68c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.676+0000 7fb690fa2700 1 -- 192.168.123.100:0/3548344246 shutdown_connections 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.676+0000 7fb690fa2700 1 -- 192.168.123.100:0/3548344246 
wait complete. 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.680+0000 7fb690fa2700 1 Processor -- start 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.680+0000 7fb690fa2700 1 -- start start 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.680+0000 7fb690fa2700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb68c071a60 0x7fb68c19c230 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.680+0000 7fb690fa2700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb68c072440 0x7fb68c19c770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.680+0000 7fb690fa2700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb68c19cd90 con 0x7fb68c071a60 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.680+0000 7fb690fa2700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb68c19ced0 con 0x7fb68c072440 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.681+0000 7fb68b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb68c071a60 0x7fb68c19c230 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.681+0000 7fb68b7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb68c071a60 0x7fb68c19c230 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 
says I am v2:192.168.123.100:44944/0 (socket says 192.168.123.100:44944) 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.681+0000 7fb68b7fe700 1 -- 192.168.123.100:0/1554972097 learned_addr learned my addr 192.168.123.100:0/1554972097 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.681+0000 7fb68affd700 1 --2- 192.168.123.100:0/1554972097 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb68c072440 0x7fb68c19c770 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.681+0000 7fb68b7fe700 1 -- 192.168.123.100:0/1554972097 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb68c072440 msgr2=0x7fb68c19c770 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.681+0000 7fb68b7fe700 1 --2- 192.168.123.100:0/1554972097 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb68c072440 0x7fb68c19c770 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.681+0000 7fb68b7fe700 1 -- 192.168.123.100:0/1554972097 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb68401c060 con 0x7fb68c071a60 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.682+0000 7fb68b7fe700 1 --2- 192.168.123.100:0/1554972097 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb68c071a60 0x7fb68c19c230 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fb68000b700 tx=0x7fb68000ba10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T12:43:34.682 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.682+0000 7fb688ff9700 1 -- 192.168.123.100:0/1554972097 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb680011840 con 0x7fb68c071a60 2026-03-10T12:43:34.685 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.682+0000 7fb690fa2700 1 -- 192.168.123.100:0/1554972097 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb68c1c3990 con 0x7fb68c071a60 2026-03-10T12:43:34.685 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.682+0000 7fb690fa2700 1 -- 192.168.123.100:0/1554972097 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb68c1c3ee0 con 0x7fb68c071a60 2026-03-10T12:43:34.685 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.683+0000 7fb688ff9700 1 -- 192.168.123.100:0/1554972097 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb680011e80 con 0x7fb68c071a60 2026-03-10T12:43:34.685 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.683+0000 7fb688ff9700 1 -- 192.168.123.100:0/1554972097 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb68000f550 con 0x7fb68c071a60 2026-03-10T12:43:34.685 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.683+0000 7fb688ff9700 1 -- 192.168.123.100:0/1554972097 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb68000f770 con 0x7fb68c071a60 2026-03-10T12:43:34.685 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.684+0000 7fb688ff9700 1 --2- 192.168.123.100:0/1554972097 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb6740778c0 0x7fb674079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:43:34.685 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.684+0000 7fb68affd700 1 --2- 192.168.123.100:0/1554972097 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb6740778c0 0x7fb674079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:43:34.685 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.684+0000 7fb688ff9700 1 -- 192.168.123.100:0/1554972097 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(81..81 src has 1..81) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fb680067610 con 0x7fb68c071a60 2026-03-10T12:43:34.685 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.684+0000 7fb690fa2700 1 -- 192.168.123.100:0/1554972097 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb68c196410 con 0x7fb68c071a60 2026-03-10T12:43:34.688 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.685+0000 7fb68affd700 1 --2- 192.168.123.100:0/1554972097 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb6740778c0 0x7fb674079d70 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fb684009fd0 tx=0x7fb68400b040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:43:34.691 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.691+0000 7fb688ff9700 1 -- 192.168.123.100:0/1554972097 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb680063030 con 0x7fb68c071a60 2026-03-10T12:43:34.885 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.884+0000 7fb690fa2700 1 -- 192.168.123.100:0/1554972097 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fb68c04f2a0 con 0x7fb68c071a60 
2026-03-10T12:43:34.886 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.886+0000 7fb688ff9700 1 -- 192.168.123.100:0/1554972097 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7fb680015030 con 0x7fb68c071a60 2026-03-10T12:43:34.886 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_OK 2026-03-10T12:43:34.891 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.890+0000 7fb6727fc700 1 -- 192.168.123.100:0/1554972097 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb6740778c0 msgr2=0x7fb674079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:34.891 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.890+0000 7fb6727fc700 1 --2- 192.168.123.100:0/1554972097 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb6740778c0 0x7fb674079d70 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fb684009fd0 tx=0x7fb68400b040 comp rx=0 tx=0).stop 2026-03-10T12:43:34.891 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.890+0000 7fb6727fc700 1 -- 192.168.123.100:0/1554972097 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb68c071a60 msgr2=0x7fb68c19c230 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:43:34.891 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.890+0000 7fb6727fc700 1 --2- 192.168.123.100:0/1554972097 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb68c071a60 0x7fb68c19c230 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fb68000b700 tx=0x7fb68000ba10 comp rx=0 tx=0).stop 2026-03-10T12:43:34.891 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.890+0000 7fb6727fc700 1 -- 192.168.123.100:0/1554972097 shutdown_connections 2026-03-10T12:43:34.891 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.890+0000 7fb6727fc700 1 --2- 
192.168.123.100:0/1554972097 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb6740778c0 0x7fb674079d70 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:34.891 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.890+0000 7fb6727fc700 1 --2- 192.168.123.100:0/1554972097 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb68c071a60 0x7fb68c19c230 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:34.891 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.890+0000 7fb6727fc700 1 --2- 192.168.123.100:0/1554972097 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb68c072440 0x7fb68c19c770 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:43:34.891 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.890+0000 7fb6727fc700 1 -- 192.168.123.100:0/1554972097 >> 192.168.123.100:0/1554972097 conn(0x7fb68c06d1a0 msgr2=0x7fb68c10a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:43:34.898 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.894+0000 7fb6727fc700 1 -- 192.168.123.100:0/1554972097 shutdown_connections 2026-03-10T12:43:34.898 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:43:34.896+0000 7fb6727fc700 1 -- 192.168.123.100:0/1554972097 wait complete. 
2026-03-10T12:43:34.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:34 vm00.local ceph-mon[103263]: from='client.44281 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:34.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:34 vm00.local ceph-mon[103263]: from='client.34372 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:34.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:34 vm00.local ceph-mon[103263]: from='client.44289 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:34.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:34 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/3505227462' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:34.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:34 vm00.local ceph-mon[103263]: mds.? [v2:192.168.123.100:6828/23281310,v1:192.168.123.100:6829/23281310] up:boot 2026-03-10T12:43:34.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:34 vm00.local ceph-mon[103263]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:active} 3 up:standby 2026-03-10T12:43:34.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:34 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.wdwvcu"}]: dispatch 2026-03-10T12:43:34.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:34 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/3658701007' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:43:34.986 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:34 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/1554972097' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:43:35.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:34 vm07.local ceph-mon[93622]: from='client.44281 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:35.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:34 vm07.local ceph-mon[93622]: from='client.34372 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:35.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:34 vm07.local ceph-mon[93622]: from='client.44289 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:35.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:34 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/3505227462' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:35.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:34 vm07.local ceph-mon[93622]: mds.? [v2:192.168.123.100:6828/23281310,v1:192.168.123.100:6829/23281310] up:boot 2026-03-10T12:43:35.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:34 vm07.local ceph-mon[93622]: fsmap cephfs:1 {0=cephfs.vm07.wznhgu=up:active} 3 up:standby 2026-03-10T12:43:35.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:34 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm00.wdwvcu"}]: dispatch 2026-03-10T12:43:35.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:34 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/3658701007' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:43:35.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:34 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/1554972097' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:43:35.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:35 vm00.local ceph-mon[103263]: from='client.34388 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:35.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:35 vm00.local ceph-mon[103263]: pgmap v177: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:43:35.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:35 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:35.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:35 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:35.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:35 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:35.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:35 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:36.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:35 vm07.local ceph-mon[93622]: from='client.34388 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:43:36.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:35 vm07.local ceph-mon[93622]: pgmap v177: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:43:36.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:35 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:36.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:35 vm07.local ceph-mon[93622]: 
from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:36.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:35 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:36.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:35 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 
10 12:43:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm07.wznhgu"]}]: dispatch 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-mon[103263]: Upgrade: It appears safe to stop mds.cephfs.vm07.wznhgu 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-mon[103263]: pgmap v178: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-mon[103263]: Upgrade: Updating mds.cephfs.vm07.wznhgu 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.wznhgu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local 
ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:37.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:37 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00[103259]: 2026-03-10T12:43:37.511+0000 7f1d1ade0640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T12:43:37.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:37.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:37.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:37.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:43:37.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:37.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:37.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:37.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 
10 12:43:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:37.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:37.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:37.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm07.wznhgu"]}]: dispatch 2026-03-10T12:43:37.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local ceph-mon[93622]: Upgrade: It appears safe to stop mds.cephfs.vm07.wznhgu 2026-03-10T12:43:37.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local ceph-mon[93622]: pgmap v178: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:43:37.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local ceph-mon[93622]: Upgrade: Updating mds.cephfs.vm07.wznhgu 2026-03-10T12:43:37.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:37.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.wznhgu", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:43:37.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:37 vm07.local 
ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:38.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:38 vm07.local ceph-mon[93622]: Deploying daemon mds.cephfs.vm07.wznhgu on vm07 2026-03-10T12:43:38.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:38 vm07.local ceph-mon[93622]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T12:43:38.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:38 vm07.local ceph-mon[93622]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T12:43:38.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:38 vm07.local ceph-mon[93622]: osdmap e82: 6 total, 6 up, 6 in 2026-03-10T12:43:38.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:38 vm07.local ceph-mon[93622]: Standby daemon mds.cephfs.vm07.rhzwnr assigned to filesystem cephfs as rank 0 2026-03-10T12:43:38.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:38 vm07.local ceph-mon[93622]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T12:43:38.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:38 vm07.local ceph-mon[93622]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T12:43:38.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:38 vm07.local ceph-mon[93622]: fsmap cephfs:1/1 {0=cephfs.vm07.rhzwnr=up:replay} 2 up:standby 2026-03-10T12:43:38.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:38 vm00.local ceph-mon[103263]: Deploying daemon mds.cephfs.vm07.wznhgu on vm07 2026-03-10T12:43:38.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:38 vm00.local ceph-mon[103263]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T12:43:38.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:38 vm00.local ceph-mon[103263]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T12:43:38.984 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:38 vm00.local ceph-mon[103263]: osdmap e82: 6 total, 6 up, 6 in 2026-03-10T12:43:38.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:38 vm00.local ceph-mon[103263]: Standby daemon mds.cephfs.vm07.rhzwnr assigned to filesystem cephfs as rank 0 2026-03-10T12:43:38.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:38 vm00.local ceph-mon[103263]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T12:43:38.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:38 vm00.local ceph-mon[103263]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T12:43:38.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:38 vm00.local ceph-mon[103263]: fsmap cephfs:1/1 {0=cephfs.vm07.rhzwnr=up:replay} 2 up:standby 2026-03-10T12:43:39.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:39 vm07.local ceph-mon[93622]: pgmap v180: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 5.7 MiB/s rd, 2 op/s 2026-03-10T12:43:39.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:39 vm00.local ceph-mon[103263]: pgmap v180: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 5.7 MiB/s rd, 2 op/s 2026-03-10T12:43:42.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:41 vm00.local ceph-mon[103263]: pgmap v181: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 7.2 MiB/s rd, 3 op/s 2026-03-10T12:43:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:41 vm07.local ceph-mon[93622]: pgmap v181: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 7.2 MiB/s rd, 3 op/s 2026-03-10T12:43:43.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:42 vm00.local ceph-mon[103263]: mds.? 
[v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] up:reconnect 2026-03-10T12:43:43.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:42 vm00.local ceph-mon[103263]: reconnect by client.14524 192.168.144.1:0/4070542078 after 0.004 2026-03-10T12:43:43.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:42 vm00.local ceph-mon[103263]: fsmap cephfs:1/1 {0=cephfs.vm07.rhzwnr=up:reconnect} 2 up:standby 2026-03-10T12:43:43.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:42 vm00.local ceph-mon[103263]: reconnect by client.14518 192.168.123.100:0/1707042861 after 0.00500001 2026-03-10T12:43:43.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:42 vm00.local ceph-mon[103263]: mds.? [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] up:rejoin 2026-03-10T12:43:43.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:42 vm00.local ceph-mon[103263]: fsmap cephfs:1/1 {0=cephfs.vm07.rhzwnr=up:rejoin} 2 up:standby 2026-03-10T12:43:43.253 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:42 vm07.local ceph-mon[93622]: mds.? [v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] up:reconnect 2026-03-10T12:43:43.253 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:42 vm07.local ceph-mon[93622]: reconnect by client.14524 192.168.144.1:0/4070542078 after 0.004 2026-03-10T12:43:43.253 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:42 vm07.local ceph-mon[93622]: fsmap cephfs:1/1 {0=cephfs.vm07.rhzwnr=up:reconnect} 2 up:standby 2026-03-10T12:43:43.253 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:42 vm07.local ceph-mon[93622]: reconnect by client.14518 192.168.123.100:0/1707042861 after 0.00500001 2026-03-10T12:43:43.253 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:42 vm07.local ceph-mon[93622]: mds.? 
[v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] up:rejoin 2026-03-10T12:43:43.253 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:42 vm07.local ceph-mon[93622]: fsmap cephfs:1/1 {0=cephfs.vm07.rhzwnr=up:rejoin} 2 up:standby 2026-03-10T12:43:44.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:43 vm07.local ceph-mon[93622]: pgmap v182: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 9.2 MiB/s rd, 3 op/s 2026-03-10T12:43:44.172 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:43 vm07.local ceph-mon[93622]: daemon mds.cephfs.vm07.rhzwnr is now active in filesystem cephfs as rank 0 2026-03-10T12:43:44.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:43 vm00.local ceph-mon[103263]: pgmap v182: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 9.2 MiB/s rd, 3 op/s 2026-03-10T12:43:44.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:43 vm00.local ceph-mon[103263]: daemon mds.cephfs.vm07.rhzwnr is now active in filesystem cephfs as rank 0 2026-03-10T12:43:45.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:44 vm00.local ceph-mon[103263]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T12:43:45.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:44 vm00.local ceph-mon[103263]: Cluster is now healthy 2026-03-10T12:43:45.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:44 vm00.local ceph-mon[103263]: mds.? 
[v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] up:active 2026-03-10T12:43:45.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:44 vm00.local ceph-mon[103263]: fsmap cephfs:1 {0=cephfs.vm07.rhzwnr=up:active} 2 up:standby 2026-03-10T12:43:45.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:44 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:45.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:44 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:45.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:44 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:45.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:44 vm07.local ceph-mon[93622]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T12:43:45.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:44 vm07.local ceph-mon[93622]: Cluster is now healthy 2026-03-10T12:43:45.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:44 vm07.local ceph-mon[93622]: mds.? 
[v2:192.168.123.107:6826/3705110268,v1:192.168.123.107:6827/3705110268] up:active 2026-03-10T12:43:45.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:44 vm07.local ceph-mon[93622]: fsmap cephfs:1 {0=cephfs.vm07.rhzwnr=up:active} 2 up:standby 2026-03-10T12:43:45.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:44 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:45.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:44 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:45.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:44 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:46.160 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:45 vm07.local ceph-mon[93622]: pgmap v183: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 204 B/s wr, 5 op/s 2026-03-10T12:43:46.161 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:46.161 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:46.161 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:45 vm07.local ceph-mon[93622]: mds.? 
[v2:192.168.123.107:6824/48365433,v1:192.168.123.107:6825/48365433] up:boot 2026-03-10T12:43:46.161 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:45 vm07.local ceph-mon[93622]: fsmap cephfs:1 {0=cephfs.vm07.rhzwnr=up:active} 3 up:standby 2026-03-10T12:43:46.161 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.wznhgu"}]: dispatch 2026-03-10T12:43:46.161 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:43:46.161 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:46.161 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:46.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:45 vm00.local ceph-mon[103263]: pgmap v183: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 204 B/s wr, 5 op/s 2026-03-10T12:43:46.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:46.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:46.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:45 vm00.local ceph-mon[103263]: mds.? 
[v2:192.168.123.107:6824/48365433,v1:192.168.123.107:6825/48365433] up:boot 2026-03-10T12:43:46.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:45 vm00.local ceph-mon[103263]: fsmap cephfs:1 {0=cephfs.vm07.rhzwnr=up:active} 3 up:standby 2026-03-10T12:43:46.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.wznhgu"}]: dispatch 2026-03-10T12:43:46.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:43:46.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:46.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:48.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: pgmap v184: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 204 B/s wr, 5 op/s 2026-03-10T12:43:48.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: Detected new or changed devices on vm07 2026-03-10T12:43:48.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:48.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:48.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: from='mgr.24563 
192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:48.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:43:48.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:48.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:48.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:48.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:48.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:48.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:48.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm07.rhzwnr"]}]: dispatch 2026-03-10T12:43:48.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 
12:43:47 vm07.local ceph-mon[93622]: Upgrade: It appears safe to stop mds.cephfs.vm07.rhzwnr 2026-03-10T12:43:48.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:48.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.rhzwnr", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:43:48.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:48.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T12:43:48.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T12:43:48.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: osdmap e83: 6 total, 6 up, 6 in 2026-03-10T12:43:48.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: Standby daemon mds.cephfs.vm00.wdwvcu assigned to filesystem cephfs as rank 0 2026-03-10T12:43:48.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T12:43:48.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T12:43:48.067 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:47 vm07.local ceph-mon[93622]: fsmap cephfs:1/1 
{0=cephfs.vm00.wdwvcu=up:replay} 2 up:standby 2026-03-10T12:43:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00[103259]: 2026-03-10T12:43:47.763+0000 7f1d1ade0640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T12:43:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: pgmap v184: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 204 B/s wr, 5 op/s 2026-03-10T12:43:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: Detected new or changed devices on vm07 2026-03-10T12:43:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:43:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 
2026-03-10T12:43:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds ok-to-stop", "ids": ["cephfs.vm07.rhzwnr"]}]: dispatch 2026-03-10T12:43:48.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: Upgrade: It appears safe to stop mds.cephfs.vm07.rhzwnr 2026-03-10T12:43:48.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:48.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm07.rhzwnr", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T12:43:48.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' 
entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:48.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T12:43:48.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T12:43:48.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: osdmap e83: 6 total, 6 up, 6 in 2026-03-10T12:43:48.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: Standby daemon mds.cephfs.vm00.wdwvcu assigned to filesystem cephfs as rank 0 2026-03-10T12:43:48.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T12:43:48.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T12:43:48.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:47 vm00.local ceph-mon[103263]: fsmap cephfs:1/1 {0=cephfs.vm00.wdwvcu=up:replay} 2 up:standby 2026-03-10T12:43:49.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:48 vm00.local ceph-mon[103263]: Upgrade: Updating mds.cephfs.vm07.rhzwnr 2026-03-10T12:43:49.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:48 vm00.local ceph-mon[103263]: Deploying daemon mds.cephfs.vm07.rhzwnr on vm07 2026-03-10T12:43:49.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:48 vm07.local ceph-mon[93622]: Upgrade: Updating mds.cephfs.vm07.rhzwnr 2026-03-10T12:43:49.317 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:48 vm07.local ceph-mon[93622]: Deploying daemon mds.cephfs.vm07.rhzwnr on vm07 2026-03-10T12:43:49.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:49 vm00.local ceph-mon[103263]: pgmap 
v186: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 921 B/s wr, 9 op/s 2026-03-10T12:43:50.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:49 vm07.local ceph-mon[93622]: pgmap v186: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 921 B/s wr, 9 op/s 2026-03-10T12:43:51.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:51 vm00.local ceph-mon[103263]: pgmap v187: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 921 B/s wr, 9 op/s 2026-03-10T12:43:51.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:51 vm00.local ceph-mon[103263]: mds.? [v2:192.168.123.100:6828/23281310,v1:192.168.123.100:6829/23281310] up:reconnect 2026-03-10T12:43:51.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:51 vm00.local ceph-mon[103263]: fsmap cephfs:1/1 {0=cephfs.vm00.wdwvcu=up:reconnect} 2 up:standby 2026-03-10T12:43:51.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:51 vm07.local ceph-mon[93622]: pgmap v187: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 921 B/s wr, 9 op/s 2026-03-10T12:43:51.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:51 vm07.local ceph-mon[93622]: mds.? 
[v2:192.168.123.100:6828/23281310,v1:192.168.123.100:6829/23281310] up:reconnect 2026-03-10T12:43:51.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:51 vm07.local ceph-mon[93622]: fsmap cephfs:1/1 {0=cephfs.vm00.wdwvcu=up:reconnect} 2 up:standby 2026-03-10T12:43:52.481 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:52 vm07.local ceph-mon[93622]: reconnect by client.14518 192.168.123.100:0/1707042861 after 0 2026-03-10T12:43:52.481 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:52 vm07.local ceph-mon[93622]: reconnect by client.14524 192.168.144.1:0/4070542078 after 0 2026-03-10T12:43:52.481 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:52 vm07.local ceph-mon[93622]: mds.? [v2:192.168.123.100:6828/23281310,v1:192.168.123.100:6829/23281310] up:rejoin 2026-03-10T12:43:52.481 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:52 vm07.local ceph-mon[93622]: fsmap cephfs:1/1 {0=cephfs.vm00.wdwvcu=up:rejoin} 2 up:standby 2026-03-10T12:43:52.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:52 vm00.local ceph-mon[103263]: reconnect by client.14518 192.168.123.100:0/1707042861 after 0 2026-03-10T12:43:52.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:52 vm00.local ceph-mon[103263]: reconnect by client.14524 192.168.144.1:0/4070542078 after 0 2026-03-10T12:43:52.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:52 vm00.local ceph-mon[103263]: mds.? 
[v2:192.168.123.100:6828/23281310,v1:192.168.123.100:6829/23281310] up:rejoin 2026-03-10T12:43:52.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:52 vm00.local ceph-mon[103263]: fsmap cephfs:1/1 {0=cephfs.vm00.wdwvcu=up:rejoin} 2 up:standby 2026-03-10T12:43:53.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:53 vm00.local ceph-mon[103263]: daemon mds.cephfs.vm00.wdwvcu is now active in filesystem cephfs as rank 0 2026-03-10T12:43:53.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:53 vm00.local ceph-mon[103263]: pgmap v188: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 921 B/s wr, 9 op/s 2026-03-10T12:43:53.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:53 vm07.local ceph-mon[93622]: daemon mds.cephfs.vm00.wdwvcu is now active in filesystem cephfs as rank 0 2026-03-10T12:43:53.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:53 vm07.local ceph-mon[93622]: pgmap v188: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 921 B/s wr, 9 op/s 2026-03-10T12:43:54.287 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:54 vm07.local ceph-mon[93622]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T12:43:54.287 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:54 vm07.local ceph-mon[93622]: Cluster is now healthy 2026-03-10T12:43:54.287 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:54 vm07.local ceph-mon[93622]: mds.? 
[v2:192.168.123.100:6828/23281310,v1:192.168.123.100:6829/23281310] up:active 2026-03-10T12:43:54.287 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:54 vm07.local ceph-mon[93622]: fsmap cephfs:1 {0=cephfs.vm00.wdwvcu=up:active} 2 up:standby 2026-03-10T12:43:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:54 vm00.local ceph-mon[103263]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T12:43:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:54 vm00.local ceph-mon[103263]: Cluster is now healthy 2026-03-10T12:43:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:54 vm00.local ceph-mon[103263]: mds.? [v2:192.168.123.100:6828/23281310,v1:192.168.123.100:6829/23281310] up:active 2026-03-10T12:43:54.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:54 vm00.local ceph-mon[103263]: fsmap cephfs:1 {0=cephfs.vm00.wdwvcu=up:active} 2 up:standby 2026-03-10T12:43:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:55 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:55 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:55 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:55 vm07.local ceph-mon[93622]: pgmap v189: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 1.6 KiB/s wr, 13 op/s 2026-03-10T12:43:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:55 vm07.local ceph-mon[93622]: mds.? 
[v2:192.168.123.107:6826/3408808533,v1:192.168.123.107:6827/3408808533] up:boot 2026-03-10T12:43:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:55 vm07.local ceph-mon[93622]: fsmap cephfs:1 {0=cephfs.vm00.wdwvcu=up:active} 3 up:standby 2026-03-10T12:43:55.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:55 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.rhzwnr"}]: dispatch 2026-03-10T12:43:55.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:55 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:55.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:55 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:55.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:55 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:55.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:55 vm00.local ceph-mon[103263]: pgmap v189: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 1.6 KiB/s wr, 13 op/s 2026-03-10T12:43:55.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:55 vm00.local ceph-mon[103263]: mds.? 
[v2:192.168.123.107:6826/3408808533,v1:192.168.123.107:6827/3408808533] up:boot 2026-03-10T12:43:55.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:55 vm00.local ceph-mon[103263]: fsmap cephfs:1 {0=cephfs.vm00.wdwvcu=up:active} 3 up:standby 2026-03-10T12:43:55.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:55 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm07.rhzwnr"}]: dispatch 2026-03-10T12:43:57.541 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:57 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:57.541 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:57 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:57.544 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:57 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:57.544 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:57 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:58.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:58 vm07.local ceph-mon[93622]: pgmap v190: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 1.6 KiB/s wr, 13 op/s 2026-03-10T12:43:58.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:58 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:58.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:58 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:58.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:58 vm00.local ceph-mon[103263]: pgmap v190: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB 
used, 119 GiB / 120 GiB avail; 14 MiB/s rd, 1.6 KiB/s wr, 13 op/s 2026-03-10T12:43:58.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:58 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:58.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:58 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: pgmap v191: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 13 MiB/s rd, 8.3 KiB/s wr, 14 op/s 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:59.737 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm00.lnokoe"}]: dispatch 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm00.lnokoe"}]': finished 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' 
entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm00.wdwvcu"}]: dispatch 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm00.wdwvcu"}]': finished 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm07.rhzwnr"}]: dispatch 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm07.rhzwnr"}]': finished 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm07.wznhgu"}]: dispatch 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm07.wznhgu"}]': finished 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]: dispatch 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: daemon mds.cephfs.vm00.lnokoe assigned to filesystem cephfs as rank 1 (now 
has 2 ranks) 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]': finished 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: fsmap cephfs:2 {0=cephfs.vm00.wdwvcu=up:active,1=cephfs.vm00.lnokoe=up:starting} 2 up:standby 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: daemon mds.cephfs.vm00.lnokoe is now active in filesystem cephfs as rank 1 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:59.737 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:43:59 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:59.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: 
pgmap v191: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 13 MiB/s rd, 8.3 KiB/s wr, 14 op/s 2026-03-10T12:43:59.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:59.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:59.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:43:59.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:43:59.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:59.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:43:59.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' 
entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm00.lnokoe"}]: dispatch 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm00.lnokoe"}]': finished 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm00.wdwvcu"}]: dispatch 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm00.wdwvcu"}]': finished 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 
cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm07.rhzwnr"}]: dispatch 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm07.rhzwnr"}]': finished 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm07.wznhgu"}]: dispatch 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm07.wznhgu"}]': finished 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]: dispatch 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: daemon mds.cephfs.vm00.lnokoe assigned to filesystem cephfs as rank 1 (now has 2 ranks) 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "2"}]': finished 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: fsmap cephfs:2 {0=cephfs.vm00.wdwvcu=up:active,1=cephfs.vm00.lnokoe=up:starting} 2 up:standby 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: 
daemon mds.cephfs.vm00.lnokoe is now active in filesystem cephfs as rank 1 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:43:59.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:43:59 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:00.963 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:00 vm07.local ceph-mon[93622]: Upgrade: Setting container_image for all mds 2026-03-10T12:44:00.963 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:00 vm07.local ceph-mon[93622]: Upgrade: Scaling up filesystem cephfs max_mds to 2 2026-03-10T12:44:00.963 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:00 vm07.local ceph-mon[93622]: Upgrade: Setting container_image for all rgw 2026-03-10T12:44:00.963 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:00 vm07.local ceph-mon[93622]: Upgrade: Setting container_image for all rbd-mirror 2026-03-10T12:44:00.963 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:00 
vm07.local ceph-mon[93622]: Upgrade: Updating ceph-exporter.vm00 (1/2) 2026-03-10T12:44:00.963 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:00 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:00.963 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:00 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm00", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T12:44:00.963 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:00 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:44:00.963 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:00 vm07.local ceph-mon[93622]: Deploying daemon ceph-exporter.vm00 on vm00 2026-03-10T12:44:00.963 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:00 vm07.local ceph-mon[93622]: mds.? 
[v2:192.168.123.100:6826/2887557827,v1:192.168.123.100:6827/2887557827] up:active 2026-03-10T12:44:00.963 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:00 vm07.local ceph-mon[93622]: fsmap cephfs:2 {0=cephfs.vm00.wdwvcu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:44:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:00 vm00.local ceph-mon[103263]: Upgrade: Setting container_image for all mds 2026-03-10T12:44:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:00 vm00.local ceph-mon[103263]: Upgrade: Scaling up filesystem cephfs max_mds to 2 2026-03-10T12:44:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:00 vm00.local ceph-mon[103263]: Upgrade: Setting container_image for all rgw 2026-03-10T12:44:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:00 vm00.local ceph-mon[103263]: Upgrade: Setting container_image for all rbd-mirror 2026-03-10T12:44:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:00 vm00.local ceph-mon[103263]: Upgrade: Updating ceph-exporter.vm00 (1/2) 2026-03-10T12:44:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm00", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T12:44:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:44:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:00 vm00.local ceph-mon[103263]: Deploying daemon 
ceph-exporter.vm00 on vm00 2026-03-10T12:44:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:00 vm00.local ceph-mon[103263]: mds.? [v2:192.168.123.100:6826/2887557827,v1:192.168.123.100:6827/2887557827] up:active 2026-03-10T12:44:00.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:00 vm00.local ceph-mon[103263]: fsmap cephfs:2 {0=cephfs.vm00.wdwvcu=up:active,1=cephfs.vm00.lnokoe=up:active} 2 up:standby 2026-03-10T12:44:01.993 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:01 vm07.local ceph-mon[93622]: pgmap v192: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 7.1 MiB/s rd, 7.0 KiB/s wr, 7 op/s 2026-03-10T12:44:01.993 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:01.993 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:01.993 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:44:01.993 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:01.993 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T12:44:01.993 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:01 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 
cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:44:02.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:01 vm00.local ceph-mon[103263]: pgmap v192: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 7.1 MiB/s rd, 7.0 KiB/s wr, 7 op/s 2026-03-10T12:44:02.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:02.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:02.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:44:02.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:02.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm07", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T12:44:02.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:01 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:44:02.992 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:02 vm07.local ceph-mon[93622]: Upgrade: Updating ceph-exporter.vm07 (2/2) 2026-03-10T12:44:02.992 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:02 vm07.local ceph-mon[93622]: Deploying daemon ceph-exporter.vm07 on 
vm07 2026-03-10T12:44:02.992 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:02 vm07.local ceph-mon[93622]: pgmap v193: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 5.7 MiB/s rd, 7.1 KiB/s wr, 8 op/s 2026-03-10T12:44:02.992 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:02 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:02.992 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:02 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:02.992 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:02 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:44:03.180 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:02 vm00.local ceph-mon[103263]: Upgrade: Updating ceph-exporter.vm07 (2/2) 2026-03-10T12:44:03.180 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:02 vm00.local ceph-mon[103263]: Deploying daemon ceph-exporter.vm07 on vm07 2026-03-10T12:44:03.180 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:02 vm00.local ceph-mon[103263]: pgmap v193: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 5.7 MiB/s rd, 7.1 KiB/s wr, 8 op/s 2026-03-10T12:44:03.180 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:02 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:03.180 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:02 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:03.180 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:02 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 
2026-03-10T12:44:04.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:04 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:04.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:04 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:04.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:04 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:04.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:04 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:04.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:04 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:04.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:04 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:05.007 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.005+0000 7f6314fca700 1 -- 192.168.123.100:0/148136480 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f63100737b0 msgr2=0x7f6310073c20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:05.007 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.005+0000 7f6314fca700 1 --2- 192.168.123.100:0/148136480 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f63100737b0 0x7f6310073c20 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f630000b3a0 tx=0x7f630000b6b0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.007 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.005+0000 7f6314fca700 1 -- 192.168.123.100:0/148136480 shutdown_connections 2026-03-10T12:44:05.007 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.005+0000 7f6314fca700 
1 --2- 192.168.123.100:0/148136480 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f63100737b0 0x7f6310073c20 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.007 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.005+0000 7f6314fca700 1 --2- 192.168.123.100:0/148136480 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6310074d80 0x7f63100731e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.007 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.005+0000 7f6314fca700 1 -- 192.168.123.100:0/148136480 >> 192.168.123.100:0/148136480 conn(0x7f63100fba90 msgr2=0x7f63100fdf00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:05.008 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.007+0000 7f6314fca700 1 -- 192.168.123.100:0/148136480 shutdown_connections 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.007+0000 7f6314fca700 1 -- 192.168.123.100:0/148136480 wait complete. 
2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.007+0000 7f6314fca700 1 Processor -- start 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.007+0000 7f6314fca700 1 -- start start 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.007+0000 7f6314fca700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6310074d80 0x7f6310071da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.007+0000 7f6314fca700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f63100722e0 0x7f6310198ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.007+0000 7f6314fca700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f63100727e0 con 0x7f63100722e0 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.007+0000 7f6314fca700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6310072950 con 0x7f6310074d80 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.007+0000 7f6305bff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f63100722e0 0x7f6310198ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.007+0000 7f6305bff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f63100722e0 0x7f6310198ef0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:54108/0 (socket says 192.168.123.100:54108) 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.007+0000 7f6305bff700 1 -- 192.168.123.100:0/4077620899 learned_addr learned my addr 192.168.123.100:0/4077620899 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.008+0000 7f6305bff700 1 -- 192.168.123.100:0/4077620899 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6310074d80 msgr2=0x7f6310071da0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.008+0000 7f6305bff700 1 --2- 192.168.123.100:0/4077620899 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6310074d80 0x7f6310071da0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.008+0000 7f6305bff700 1 -- 192.168.123.100:0/4077620899 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f630000b050 con 0x7f63100722e0 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.008+0000 7f6305bff700 1 --2- 192.168.123.100:0/4077620899 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f63100722e0 0x7f6310198ef0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f6300007b90 tx=0x7f63000095a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.008+0000 7f6307fff700 1 -- 192.168.123.100:0/4077620899 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f630000e050 con 0x7f63100722e0 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.008+0000 7f6314fca700 1 -- 
192.168.123.100:0/4077620899 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6310199430 con 0x7f63100722e0 2026-03-10T12:44:05.010 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.008+0000 7f6314fca700 1 -- 192.168.123.100:0/4077620899 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6310199980 con 0x7f63100722e0 2026-03-10T12:44:05.011 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.009+0000 7f6307fff700 1 -- 192.168.123.100:0/4077620899 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6300004730 con 0x7f63100722e0 2026-03-10T12:44:05.011 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.009+0000 7f6307fff700 1 -- 192.168.123.100:0/4077620899 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f630001bd20 con 0x7f63100722e0 2026-03-10T12:44:05.011 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.010+0000 7f6307fff700 1 -- 192.168.123.100:0/4077620899 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6300019040 con 0x7f63100722e0 2026-03-10T12:44:05.011 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.010+0000 7f6307fff700 1 --2- 192.168.123.100:0/4077620899 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f62fc0779e0 0x7f62fc079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:05.011 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.010+0000 7f6307fff700 1 -- 192.168.123.100:0/4077620899 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f630009ad00 con 0x7f63100722e0 2026-03-10T12:44:05.011 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.010+0000 7f630e59c700 1 --2- 192.168.123.100:0/4077620899 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f62fc0779e0 0x7f62fc079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:05.011 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.009+0000 7f6314fca700 1 -- 192.168.123.100:0/4077620899 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f62f0005320 con 0x7f63100722e0 2026-03-10T12:44:05.014 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.011+0000 7f630e59c700 1 --2- 192.168.123.100:0/4077620899 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f62fc0779e0 0x7f62fc079e90 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f63101a6260 tx=0x7f62f8006c60 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:05.019 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.014+0000 7f6307fff700 1 -- 192.168.123.100:0/4077620899 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6300063500 con 0x7f63100722e0 2026-03-10T12:44:05.047 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:04 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:05.047 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:04 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:05.047 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:04 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:05.047 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:04 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' 
entity='mgr.vm00.nescmq' 2026-03-10T12:44:05.047 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:04 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:05.047 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:04 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:05.185 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.183+0000 7f6314fca700 1 -- 192.168.123.100:0/4077620899 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f62f0000bf0 con 0x7f62fc0779e0 2026-03-10T12:44:05.188 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.185+0000 7f6307fff700 1 -- 192.168.123.100:0/4077620899 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+466 (secure 0 0 0) 0x7f62f0000bf0 con 0x7f62fc0779e0 2026-03-10T12:44:05.190 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.189+0000 7f63053fe700 1 -- 192.168.123.100:0/4077620899 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f62fc0779e0 msgr2=0x7f62fc079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:05.190 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.189+0000 7f63053fe700 1 --2- 192.168.123.100:0/4077620899 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f62fc0779e0 0x7f62fc079e90 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f63101a6260 tx=0x7f62f8006c60 comp rx=0 tx=0).stop 2026-03-10T12:44:05.190 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.189+0000 7f63053fe700 1 -- 192.168.123.100:0/4077620899 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f63100722e0 msgr2=0x7f6310198ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T12:44:05.190 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.189+0000 7f63053fe700 1 --2- 192.168.123.100:0/4077620899 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f63100722e0 0x7f6310198ef0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f6300007b90 tx=0x7f63000095a0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.191 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.190+0000 7f63053fe700 1 -- 192.168.123.100:0/4077620899 shutdown_connections 2026-03-10T12:44:05.191 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.191+0000 7f63053fe700 1 --2- 192.168.123.100:0/4077620899 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f62fc0779e0 0x7f62fc079e90 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.191 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.191+0000 7f63053fe700 1 --2- 192.168.123.100:0/4077620899 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6310074d80 0x7f6310071da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.191 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.191+0000 7f63053fe700 1 --2- 192.168.123.100:0/4077620899 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f63100722e0 0x7f6310198ef0 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.191 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.191+0000 7f63053fe700 1 -- 192.168.123.100:0/4077620899 >> 192.168.123.100:0/4077620899 conn(0x7f63100fba90 msgr2=0x7f63100fde10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:05.191 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.191+0000 7f63053fe700 1 -- 192.168.123.100:0/4077620899 shutdown_connections 2026-03-10T12:44:05.191 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.191+0000 
7f63053fe700 1 -- 192.168.123.100:0/4077620899 wait complete. 2026-03-10T12:44:05.206 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:44:05.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.289+0000 7f4a9ab0e700 1 -- 192.168.123.100:0/3619740175 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4a94071a60 msgr2=0x7f4a94071e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:05.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.289+0000 7f4a9ab0e700 1 --2- 192.168.123.100:0/3619740175 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4a94071a60 0x7f4a94071e70 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f4a90009b00 tx=0x7f4a90009e10 comp rx=0 tx=0).stop 2026-03-10T12:44:05.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.289+0000 7f4a9ab0e700 1 -- 192.168.123.100:0/3619740175 shutdown_connections 2026-03-10T12:44:05.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.289+0000 7f4a9ab0e700 1 --2- 192.168.123.100:0/3619740175 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a94072440 0x7f4a9410be90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.289+0000 7f4a9ab0e700 1 --2- 192.168.123.100:0/3619740175 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4a94071a60 0x7f4a94071e70 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.289+0000 7f4a9ab0e700 1 -- 192.168.123.100:0/3619740175 >> 192.168.123.100:0/3619740175 conn(0x7f4a9406d1a0 msgr2=0x7f4a9406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:05.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.289+0000 7f4a9ab0e700 1 -- 192.168.123.100:0/3619740175 shutdown_connections 
2026-03-10T12:44:05.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.289+0000 7f4a9ab0e700 1 -- 192.168.123.100:0/3619740175 wait complete. 2026-03-10T12:44:05.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.289+0000 7f4a9ab0e700 1 Processor -- start 2026-03-10T12:44:05.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.290+0000 7f4a9ab0e700 1 -- start start 2026-03-10T12:44:05.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.290+0000 7f4a9ab0e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4a94071a60 0x7f4a94116970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:05.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.290+0000 7f4a9ab0e700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a94072440 0x7f4a94116eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:05.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.290+0000 7f4a9ab0e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a941174f0 con 0x7f4a94071a60 2026-03-10T12:44:05.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.290+0000 7f4a9ab0e700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a94117660 con 0x7f4a94072440 2026-03-10T12:44:05.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.290+0000 7f4a99b0c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4a94071a60 0x7f4a94116970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:05.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.290+0000 7f4a99b0c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4a94071a60 
0x7f4a94116970 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:54120/0 (socket says 192.168.123.100:54120) 2026-03-10T12:44:05.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.290+0000 7f4a99b0c700 1 -- 192.168.123.100:0/2514187258 learned_addr learned my addr 192.168.123.100:0/2514187258 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:44:05.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.290+0000 7f4a9930b700 1 --2- 192.168.123.100:0/2514187258 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a94072440 0x7f4a94116eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:05.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.291+0000 7f4a9930b700 1 -- 192.168.123.100:0/2514187258 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4a94071a60 msgr2=0x7f4a94116970 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:05.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.291+0000 7f4a9930b700 1 --2- 192.168.123.100:0/2514187258 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4a94071a60 0x7f4a94116970 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.291+0000 7f4a9930b700 1 -- 192.168.123.100:0/2514187258 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4a900097e0 con 0x7f4a94072440 2026-03-10T12:44:05.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.291+0000 7f4a9930b700 1 --2- 192.168.123.100:0/2514187258 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a94072440 0x7f4a94116eb0 secure :-1 s=READY pgs=54 
cs=0 l=1 rev1=1 crypto rx=0x7f4a94107dc0 tx=0x7f4a8400dbb0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:05.292 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.292+0000 7f4a8affd700 1 -- 192.168.123.100:0/2514187258 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4a8400f840 con 0x7f4a94072440 2026-03-10T12:44:05.292 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.292+0000 7f4a8affd700 1 -- 192.168.123.100:0/2514187258 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4a8400fe80 con 0x7f4a94072440 2026-03-10T12:44:05.293 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.292+0000 7f4a9ab0e700 1 -- 192.168.123.100:0/2514187258 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4a94077140 con 0x7f4a94072440 2026-03-10T12:44:05.293 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.292+0000 7f4a8affd700 1 -- 192.168.123.100:0/2514187258 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4a8400e5c0 con 0x7f4a94072440 2026-03-10T12:44:05.293 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.292+0000 7f4a9ab0e700 1 -- 192.168.123.100:0/2514187258 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4a94077660 con 0x7f4a94072440 2026-03-10T12:44:05.293 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.293+0000 7f4a9ab0e700 1 -- 192.168.123.100:0/2514187258 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4a94110c20 con 0x7f4a94072440 2026-03-10T12:44:05.295 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.295+0000 7f4a8affd700 1 -- 192.168.123.100:0/2514187258 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 
100115+0+0 (secure 0 0 0) 0x7f4a8400f9a0 con 0x7f4a94072440 2026-03-10T12:44:05.295 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.295+0000 7f4a8affd700 1 --2- 192.168.123.100:0/2514187258 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4a80077750 0x7f4a80079c00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:05.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.295+0000 7f4a8affd700 1 -- 192.168.123.100:0/2514187258 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f4a84099c20 con 0x7f4a94072440 2026-03-10T12:44:05.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.295+0000 7f4a99b0c700 1 --2- 192.168.123.100:0/2514187258 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4a80077750 0x7f4a80079c00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:05.296 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.296+0000 7f4a99b0c700 1 --2- 192.168.123.100:0/2514187258 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4a80077750 0x7f4a80079c00 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f4a94117c60 tx=0x7f4a90005300 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:05.298 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.298+0000 7f4a8affd700 1 -- 192.168.123.100:0/2514187258 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4a84062420 con 0x7f4a94072440 2026-03-10T12:44:05.430 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.429+0000 7f4a9ab0e700 1 -- 192.168.123.100:0/2514187258 --> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f4a94061190 con 0x7f4a80077750 2026-03-10T12:44:05.431 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.431+0000 7f4a8affd700 1 -- 192.168.123.100:0/2514187258 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+466 (secure 0 0 0) 0x7f4a94061190 con 0x7f4a80077750 2026-03-10T12:44:05.434 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.433+0000 7f4a9ab0e700 1 -- 192.168.123.100:0/2514187258 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4a80077750 msgr2=0x7f4a80079c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:05.434 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.433+0000 7f4a9ab0e700 1 --2- 192.168.123.100:0/2514187258 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4a80077750 0x7f4a80079c00 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f4a94117c60 tx=0x7f4a90005300 comp rx=0 tx=0).stop 2026-03-10T12:44:05.434 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.433+0000 7f4a9ab0e700 1 -- 192.168.123.100:0/2514187258 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a94072440 msgr2=0x7f4a94116eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:05.434 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.433+0000 7f4a9ab0e700 1 --2- 192.168.123.100:0/2514187258 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a94072440 0x7f4a94116eb0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f4a94107dc0 tx=0x7f4a8400dbb0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.434 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.434+0000 7f4a9ab0e700 1 -- 192.168.123.100:0/2514187258 shutdown_connections 2026-03-10T12:44:05.434 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.434+0000 7f4a9ab0e700 1 --2- 192.168.123.100:0/2514187258 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4a80077750 0x7f4a80079c00 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.434 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.434+0000 7f4a9ab0e700 1 --2- 192.168.123.100:0/2514187258 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4a94071a60 0x7f4a94116970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.434 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.434+0000 7f4a9ab0e700 1 --2- 192.168.123.100:0/2514187258 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4a94072440 0x7f4a94116eb0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.434 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.434+0000 7f4a9ab0e700 1 -- 192.168.123.100:0/2514187258 >> 192.168.123.100:0/2514187258 conn(0x7f4a9406d1a0 msgr2=0x7f4a9410a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:05.434 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.434+0000 7f4a9ab0e700 1 -- 192.168.123.100:0/2514187258 shutdown_connections 2026-03-10T12:44:05.434 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.434+0000 7f4a9ab0e700 1 -- 192.168.123.100:0/2514187258 wait complete. 
2026-03-10T12:44:05.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.511+0000 7f3dcd7f5700 1 -- 192.168.123.100:0/4081098405 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3dc8079b30 msgr2=0x7f3dc8079f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:05.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.511+0000 7f3dcd7f5700 1 --2- 192.168.123.100:0/4081098405 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3dc8079b30 0x7f3dc8079f40 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7f3db8009b00 tx=0x7f3db8009e10 comp rx=0 tx=0).stop 2026-03-10T12:44:05.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.511+0000 7f3dcd7f5700 1 -- 192.168.123.100:0/4081098405 shutdown_connections 2026-03-10T12:44:05.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.511+0000 7f3dcd7f5700 1 --2- 192.168.123.100:0/4081098405 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3dc807ad80 0x7f3dc807b1f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.511+0000 7f3dcd7f5700 1 --2- 192.168.123.100:0/4081098405 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3dc8079b30 0x7f3dc8079f40 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.511+0000 7f3dcd7f5700 1 -- 192.168.123.100:0/4081098405 >> 192.168.123.100:0/4081098405 conn(0x7f3dc80758f0 msgr2=0x7f3dc8077d40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:05.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.512+0000 7f3dcd7f5700 1 -- 192.168.123.100:0/4081098405 shutdown_connections 2026-03-10T12:44:05.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.512+0000 7f3dcd7f5700 1 -- 192.168.123.100:0/4081098405 
wait complete. 2026-03-10T12:44:05.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.512+0000 7f3dcd7f5700 1 Processor -- start 2026-03-10T12:44:05.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.512+0000 7f3dcd7f5700 1 -- start start 2026-03-10T12:44:05.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.512+0000 7f3dcd7f5700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3dc8079b30 0x7f3dc8071ca0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:05.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.512+0000 7f3dcd7f5700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3dc807ad80 0x7f3dc80721e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:05.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.512+0000 7f3dcd7f5700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3dc8072850 con 0x7f3dc807ad80 2026-03-10T12:44:05.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.512+0000 7f3dcd7f5700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3dc81a1420 con 0x7f3dc8079b30 2026-03-10T12:44:05.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.513+0000 7f3dc77fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3dc807ad80 0x7f3dc80721e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:05.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.513+0000 7f3dc77fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3dc807ad80 0x7f3dc80721e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 
says I am v2:192.168.123.100:54132/0 (socket says 192.168.123.100:54132) 2026-03-10T12:44:05.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.513+0000 7f3dc77fe700 1 -- 192.168.123.100:0/2046219573 learned_addr learned my addr 192.168.123.100:0/2046219573 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:44:05.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.513+0000 7f3dc7fff700 1 --2- 192.168.123.100:0/2046219573 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3dc8079b30 0x7f3dc8071ca0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:05.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.513+0000 7f3dc77fe700 1 -- 192.168.123.100:0/2046219573 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3dc8079b30 msgr2=0x7f3dc8071ca0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:05.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.513+0000 7f3dc77fe700 1 --2- 192.168.123.100:0/2046219573 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3dc8079b30 0x7f3dc8071ca0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.513+0000 7f3dc77fe700 1 -- 192.168.123.100:0/2046219573 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3db80097e0 con 0x7f3dc807ad80 2026-03-10T12:44:05.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.513+0000 7f3dc77fe700 1 --2- 192.168.123.100:0/2046219573 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3dc807ad80 0x7f3dc80721e0 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f3dc000c390 tx=0x7f3dc000c750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T12:44:05.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.513+0000 7f3dc57fa700 1 -- 192.168.123.100:0/2046219573 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3dc000e030 con 0x7f3dc807ad80 2026-03-10T12:44:05.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.513+0000 7f3dcd7f5700 1 -- 192.168.123.100:0/2046219573 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3dc81a1700 con 0x7f3dc807ad80 2026-03-10T12:44:05.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.514+0000 7f3dcd7f5700 1 -- 192.168.123.100:0/2046219573 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3dc81a1c50 con 0x7f3dc807ad80 2026-03-10T12:44:05.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.516+0000 7f3dc57fa700 1 -- 192.168.123.100:0/2046219573 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3dc000f040 con 0x7f3dc807ad80 2026-03-10T12:44:05.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.516+0000 7f3dc57fa700 1 -- 192.168.123.100:0/2046219573 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3dc00146c0 con 0x7f3dc807ad80 2026-03-10T12:44:05.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.516+0000 7f3dc57fa700 1 -- 192.168.123.100:0/2046219573 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3dc0014900 con 0x7f3dc807ad80 2026-03-10T12:44:05.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.516+0000 7f3dc57fa700 1 --2- 192.168.123.100:0/2046219573 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3db0077ab0 0x7f3db0079f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:05.518 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.516+0000 7f3dc7fff700 1 --2- 192.168.123.100:0/2046219573 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3db0077ab0 0x7f3db0079f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:05.518 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.516+0000 7f3dc57fa700 1 -- 192.168.123.100:0/2046219573 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f3dc009ad10 con 0x7f3dc807ad80 2026-03-10T12:44:05.518 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.517+0000 7f3dc7fff700 1 --2- 192.168.123.100:0/2046219573 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3db0077ab0 0x7f3db0079f60 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f3db800b5c0 tx=0x7f3db801a040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:05.518 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.518+0000 7f3dcd7f5700 1 -- 192.168.123.100:0/2046219573 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3db4005320 con 0x7f3dc807ad80 2026-03-10T12:44:05.523 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.521+0000 7f3dc57fa700 1 -- 192.168.123.100:0/2046219573 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3dc0063510 con 0x7f3dc807ad80 2026-03-10T12:44:05.649 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.648+0000 7f3dcd7f5700 1 -- 192.168.123.100:0/2046219573 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f3db4000bf0 
con 0x7f3db0077ab0 2026-03-10T12:44:05.654 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.654+0000 7f3dc57fa700 1 -- 192.168.123.100:0/2046219573 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f3db4000bf0 con 0x7f3db0077ab0 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (10m) 1s ago 11m 25.7M - 0.25.0 c8568f914cd2 1897443e8fdf 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (5s) 1s ago 11m 9823k - 19.2.3-678-ge911bdeb 654f31e6858e cb97d867901c 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (2s) 1s ago 10m 9667k - 19.2.3-678-ge911bdeb 654f31e6858e 04b17a97a05a 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (4m) 1s ago 11m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 88cd1c2e0041 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (4m) 1s ago 10m 7864k - 19.2.3-678-ge911bdeb 654f31e6858e 889e2b356266 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (10m) 1s ago 10m 91.9M - 9.4.7 954c08fa6188 16c12a6ce1fc 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (41s) 1s ago 8m 22.8M - 19.2.3-678-ge911bdeb 654f31e6858e 6ba265e19d66 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (32s) 1s ago 8m 75.9M - 19.2.3-678-ge911bdeb 654f31e6858e 29b157465a74 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (11s) 1s ago 8m 14.5M - 19.2.3-678-ge911bdeb 654f31e6858e dc7af8899792 2026-03-10T12:44:05.655 
INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (21s) 1s ago 8m 21.5M - 19.2.3-678-ge911bdeb 654f31e6858e 66059e3b13a4 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:8443,9283,8765 running (5m) 1s ago 11m 630M - 19.2.3-678-ge911bdeb 654f31e6858e eaf11f86af53 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (5m) 1s ago 10m 491M - 19.2.3-678-ge911bdeb 654f31e6858e 22c93435649c 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (5m) 1s ago 11m 68.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e e8cc5980a849 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (4m) 1s ago 10m 61.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 032abad282fc 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (11m) 1s ago 11m 15.2M - 1.5.0 0da6a335fe13 7a5c14d6ba46 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (10m) 1s ago 10m 16.0M - 1.5.0 0da6a335fe13 2fac2415b763 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (4m) 1s ago 10m 187M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9b151d44f3cf 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (2m) 1s ago 10m 120M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 252ea98c5665 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (2m) 1s ago 9m 128M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 249137e44eb7 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (2m) 1s ago 9m 165M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 72a045e3b78b 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (111s) 1s ago 9m 122M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7ac87e1c2a41 2026-03-10T12:44:05.655 
INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (89s) 1s ago 9m 131M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bd169bf00834 2026-03-10T12:44:05.655 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (5m) 1s ago 10m 72.3M - 2.43.0 a07b618ecd1d c80074b6b052 2026-03-10T12:44:05.657 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.656+0000 7f3dcd7f5700 1 -- 192.168.123.100:0/2046219573 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3db0077ab0 msgr2=0x7f3db0079f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:05.657 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.656+0000 7f3dcd7f5700 1 --2- 192.168.123.100:0/2046219573 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3db0077ab0 0x7f3db0079f60 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f3db800b5c0 tx=0x7f3db801a040 comp rx=0 tx=0).stop 2026-03-10T12:44:05.657 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.656+0000 7f3dcd7f5700 1 -- 192.168.123.100:0/2046219573 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3dc807ad80 msgr2=0x7f3dc80721e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:05.657 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.656+0000 7f3dcd7f5700 1 --2- 192.168.123.100:0/2046219573 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3dc807ad80 0x7f3dc80721e0 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f3dc000c390 tx=0x7f3dc000c750 comp rx=0 tx=0).stop 2026-03-10T12:44:05.658 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.657+0000 7f3dcd7f5700 1 -- 192.168.123.100:0/2046219573 shutdown_connections 2026-03-10T12:44:05.658 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.657+0000 7f3dcd7f5700 1 --2- 192.168.123.100:0/2046219573 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] 
conn(0x7f3db0077ab0 0x7f3db0079f60 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.658 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.657+0000 7f3dcd7f5700 1 --2- 192.168.123.100:0/2046219573 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3dc8079b30 0x7f3dc8071ca0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.658 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.657+0000 7f3dcd7f5700 1 --2- 192.168.123.100:0/2046219573 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3dc807ad80 0x7f3dc80721e0 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.658 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.657+0000 7f3dcd7f5700 1 -- 192.168.123.100:0/2046219573 >> 192.168.123.100:0/2046219573 conn(0x7f3dc80758f0 msgr2=0x7f3dc807dfb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:05.658 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.657+0000 7f3dcd7f5700 1 -- 192.168.123.100:0/2046219573 shutdown_connections 2026-03-10T12:44:05.658 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.657+0000 7f3dcd7f5700 1 -- 192.168.123.100:0/2046219573 wait complete. 
2026-03-10T12:44:05.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.735+0000 7fad4fe1b700 1 -- 192.168.123.100:0/1801486555 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad48071980 msgr2=0x7fad48071d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:05.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.735+0000 7fad4fe1b700 1 --2- 192.168.123.100:0/1801486555 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad48071980 0x7fad48071d90 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7fad44009b50 tx=0x7fad44009e60 comp rx=0 tx=0).stop 2026-03-10T12:44:05.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.735+0000 7fad4fe1b700 1 -- 192.168.123.100:0/1801486555 shutdown_connections 2026-03-10T12:44:05.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.735+0000 7fad4fe1b700 1 --2- 192.168.123.100:0/1801486555 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fad48072360 0x7fad480770e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.735+0000 7fad4fe1b700 1 --2- 192.168.123.100:0/1801486555 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad48071980 0x7fad48071d90 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.735+0000 7fad4fe1b700 1 -- 192.168.123.100:0/1801486555 >> 192.168.123.100:0/1801486555 conn(0x7fad4806d1a0 msgr2=0x7fad4806f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:05.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.736+0000 7fad4fe1b700 1 -- 192.168.123.100:0/1801486555 shutdown_connections 2026-03-10T12:44:05.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.736+0000 7fad4fe1b700 1 -- 192.168.123.100:0/1801486555 
wait complete. 2026-03-10T12:44:05.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.736+0000 7fad4fe1b700 1 Processor -- start 2026-03-10T12:44:05.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.736+0000 7fad4fe1b700 1 -- start start 2026-03-10T12:44:05.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.736+0000 7fad4fe1b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad48072360 0x7fad48082500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:05.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.736+0000 7fad4fe1b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fad48082a40 0x7fad48082eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:05.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.736+0000 7fad4fe1b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fad481b2a90 con 0x7fad48072360 2026-03-10T12:44:05.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.736+0000 7fad4fe1b700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fad481b2bd0 con 0x7fad48082a40 2026-03-10T12:44:05.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.737+0000 7fad4dbb7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad48072360 0x7fad48082500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:05.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.737+0000 7fad4dbb7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad48072360 0x7fad48082500 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 
says I am v2:192.168.123.100:54158/0 (socket says 192.168.123.100:54158) 2026-03-10T12:44:05.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.737+0000 7fad4dbb7700 1 -- 192.168.123.100:0/568529600 learned_addr learned my addr 192.168.123.100:0/568529600 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:44:05.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.737+0000 7fad4d3b6700 1 --2- 192.168.123.100:0/568529600 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fad48082a40 0x7fad48082eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:05.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.737+0000 7fad4dbb7700 1 -- 192.168.123.100:0/568529600 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fad48082a40 msgr2=0x7fad48082eb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:05.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.737+0000 7fad4dbb7700 1 --2- 192.168.123.100:0/568529600 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fad48082a40 0x7fad48082eb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.737+0000 7fad4dbb7700 1 -- 192.168.123.100:0/568529600 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fad440097e0 con 0x7fad48072360 2026-03-10T12:44:05.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.737+0000 7fad4dbb7700 1 --2- 192.168.123.100:0/568529600 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad48072360 0x7fad48082500 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7fad44004990 tx=0x7fad44004a70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:44:05.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.738+0000 7fad3effd700 1 -- 192.168.123.100:0/568529600 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fad4401c070 con 0x7fad48072360 2026-03-10T12:44:05.740 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.738+0000 7fad4fe1b700 1 -- 192.168.123.100:0/568529600 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fad481b2d10 con 0x7fad48072360 2026-03-10T12:44:05.740 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.739+0000 7fad4fe1b700 1 -- 192.168.123.100:0/568529600 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fad481b31b0 con 0x7fad48072360 2026-03-10T12:44:05.740 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.739+0000 7fad3effd700 1 -- 192.168.123.100:0/568529600 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fad440056e0 con 0x7fad48072360 2026-03-10T12:44:05.740 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.740+0000 7fad3effd700 1 -- 192.168.123.100:0/568529600 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fad440207e0 con 0x7fad48072360 2026-03-10T12:44:05.741 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.741+0000 7fad3effd700 1 -- 192.168.123.100:0/568529600 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fad4402a430 con 0x7fad48072360 2026-03-10T12:44:05.742 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.741+0000 7fad3effd700 1 --2- 192.168.123.100:0/568529600 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fad34079c00 0x7fad3407c0b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:05.742 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.742+0000 7fad4d3b6700 1 --2- 192.168.123.100:0/568529600 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fad34079c00 0x7fad3407c0b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:05.742 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.742+0000 7fad3effd700 1 -- 192.168.123.100:0/568529600 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fad4409bc00 con 0x7fad48072360 2026-03-10T12:44:05.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.743+0000 7fad4fe1b700 1 -- 192.168.123.100:0/568529600 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fad2c005320 con 0x7fad48072360 2026-03-10T12:44:05.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.743+0000 7fad4d3b6700 1 --2- 192.168.123.100:0/568529600 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fad34079c00 0x7fad3407c0b0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fad48072ff0 tx=0x7fad40009040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:05.751 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.750+0000 7fad3effd700 1 -- 192.168.123.100:0/568529600 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fad44064400 con 0x7fad48072360 2026-03-10T12:44:05.778 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:05 vm00.local ceph-mon[103263]: pgmap v194: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 4.1 MiB/s rd, 7.2 KiB/s wr, 12 op/s 2026-03-10T12:44:05.778 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 
12:44:05 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:05.778 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:05 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:05.778 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:05 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:05.778 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:05 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:05.979 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.978+0000 7fad4fe1b700 1 -- 192.168.123.100:0/568529600 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fad2c006200 con 0x7fad48072360 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.982+0000 7fad3effd700 1 -- 192.168.123.100:0/568529600 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7fad44025070 con 0x7fad48072360 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 
2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:44:05.983 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:44:05.989 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.988+0000 7fad4fe1b700 1 -- 192.168.123.100:0/568529600 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fad34079c00 msgr2=0x7fad3407c0b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:05.989 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.989+0000 7fad4fe1b700 1 --2- 192.168.123.100:0/568529600 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fad34079c00 0x7fad3407c0b0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fad48072ff0 tx=0x7fad40009040 comp rx=0 tx=0).stop 2026-03-10T12:44:05.989 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.989+0000 7fad4fe1b700 1 -- 192.168.123.100:0/568529600 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad48072360 msgr2=0x7fad48082500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:05.989 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.989+0000 7fad4fe1b700 
1 --2- 192.168.123.100:0/568529600 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad48072360 0x7fad48082500 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7fad44004990 tx=0x7fad44004a70 comp rx=0 tx=0).stop 2026-03-10T12:44:05.989 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.989+0000 7fad4fe1b700 1 -- 192.168.123.100:0/568529600 shutdown_connections 2026-03-10T12:44:05.989 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.989+0000 7fad4fe1b700 1 --2- 192.168.123.100:0/568529600 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fad34079c00 0x7fad3407c0b0 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.990 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.989+0000 7fad4fe1b700 1 --2- 192.168.123.100:0/568529600 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fad48072360 0x7fad48082500 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.990 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.989+0000 7fad4fe1b700 1 --2- 192.168.123.100:0/568529600 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fad48082a40 0x7fad48082eb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:05.990 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.989+0000 7fad4fe1b700 1 -- 192.168.123.100:0/568529600 >> 192.168.123.100:0/568529600 conn(0x7fad4806d1a0 msgr2=0x7fad480705c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:05.990 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.990+0000 7fad4fe1b700 1 -- 192.168.123.100:0/568529600 shutdown_connections 2026-03-10T12:44:05.990 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:05.990+0000 7fad4fe1b700 1 -- 192.168.123.100:0/568529600 wait complete. 
2026-03-10T12:44:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:05 vm07.local ceph-mon[93622]: pgmap v194: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 4.1 MiB/s rd, 7.2 KiB/s wr, 12 op/s 2026-03-10T12:44:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:05 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:05 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:05 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:06.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:05 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:06.081 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.080+0000 7f16d775f700 1 -- 192.168.123.100:0/2971199498 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16c8098820 msgr2=0x7f16c8098c70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:06.082 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.082+0000 7f16d775f700 1 --2- 192.168.123.100:0/2971199498 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16c8098820 0x7f16c8098c70 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f16c0009b00 tx=0x7f16c0009e10 comp rx=0 tx=0).stop 2026-03-10T12:44:06.082 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.082+0000 7f16d775f700 1 -- 192.168.123.100:0/2971199498 shutdown_connections 2026-03-10T12:44:06.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.082+0000 7f16d775f700 1 --2- 192.168.123.100:0/2971199498 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16c8098820 0x7f16c8098c70 
unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.082+0000 7f16d775f700 1 --2- 192.168.123.100:0/2971199498 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f16c8097620 0x7f16c8097a30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.082+0000 7f16d775f700 1 -- 192.168.123.100:0/2971199498 >> 192.168.123.100:0/2971199498 conn(0x7f16c8092bb0 msgr2=0x7f16c8095000 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:06.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.083+0000 7f16d775f700 1 -- 192.168.123.100:0/2971199498 shutdown_connections 2026-03-10T12:44:06.083 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.083+0000 7f16d775f700 1 -- 192.168.123.100:0/2971199498 wait complete. 2026-03-10T12:44:06.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.084+0000 7f16d775f700 1 Processor -- start 2026-03-10T12:44:06.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.084+0000 7f16d775f700 1 -- start start 2026-03-10T12:44:06.084 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.084+0000 7f16d775f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16c8097620 0x7f16c812cca0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:06.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.084+0000 7f16d54fb700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16c8097620 0x7f16c812cca0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:06.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.084+0000 7f16d54fb700 1 --2- >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16c8097620 0x7f16c812cca0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:54170/0 (socket says 192.168.123.100:54170) 2026-03-10T12:44:06.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.084+0000 7f16d775f700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f16c8098820 0x7f16c812d1e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:06.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.085+0000 7f16d54fb700 1 -- 192.168.123.100:0/4133362961 learned_addr learned my addr 192.168.123.100:0/4133362961 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:44:06.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.085+0000 7f16d775f700 1 -- 192.168.123.100:0/4133362961 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f16c812d7e0 con 0x7f16c8097620 2026-03-10T12:44:06.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.085+0000 7f16d4cfa700 1 --2- 192.168.123.100:0/4133362961 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f16c8098820 0x7f16c812d1e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:06.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.085+0000 7f16d775f700 1 -- 192.168.123.100:0/4133362961 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f16c812d950 con 0x7f16c8098820 2026-03-10T12:44:06.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.086+0000 7f16d54fb700 1 -- 192.168.123.100:0/4133362961 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f16c8098820 msgr2=0x7f16c812d1e0 unknown :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:06.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.086+0000 7f16d54fb700 1 --2- 192.168.123.100:0/4133362961 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f16c8098820 0x7f16c812d1e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.086+0000 7f16d54fb700 1 -- 192.168.123.100:0/4133362961 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f16c00097e0 con 0x7f16c8097620 2026-03-10T12:44:06.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.086+0000 7f16d54fb700 1 --2- 192.168.123.100:0/4133362961 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16c8097620 0x7f16c812cca0 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f16cc009fd0 tx=0x7f16cc00edf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:06.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.086+0000 7f16c67fc700 1 -- 192.168.123.100:0/4133362961 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f16cc009980 con 0x7f16c8097620 2026-03-10T12:44:06.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.086+0000 7f16c67fc700 1 -- 192.168.123.100:0/4133362961 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f16cc004500 con 0x7f16c8097620 2026-03-10T12:44:06.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.086+0000 7f16c67fc700 1 -- 192.168.123.100:0/4133362961 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f16cc010430 con 0x7f16c8097620 2026-03-10T12:44:06.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.087+0000 7f16d775f700 1 -- 
192.168.123.100:0/4133362961 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f16c81323e0 con 0x7f16c8097620 2026-03-10T12:44:06.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.087+0000 7f16d775f700 1 -- 192.168.123.100:0/4133362961 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f16c8132930 con 0x7f16c8097620 2026-03-10T12:44:06.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.088+0000 7f16c67fc700 1 -- 192.168.123.100:0/4133362961 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f16cc010660 con 0x7f16c8097620 2026-03-10T12:44:06.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.089+0000 7f16c67fc700 1 --2- 192.168.123.100:0/4133362961 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f16bc0778e0 0x7f16bc079d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:06.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.089+0000 7f16c67fc700 1 -- 192.168.123.100:0/4133362961 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f16cc014070 con 0x7f16c8097620 2026-03-10T12:44:06.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.089+0000 7f16d4cfa700 1 --2- 192.168.123.100:0/4133362961 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f16bc0778e0 0x7f16bc079d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:06.093 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.092+0000 7f16d775f700 1 -- 192.168.123.100:0/4133362961 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f16b4005320 con 0x7f16c8097620 
2026-03-10T12:44:06.093 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.093+0000 7f16d4cfa700 1 --2- 192.168.123.100:0/4133362961 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f16bc0778e0 0x7f16bc079d90 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f16c812e1b0 tx=0x7f16c001a040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:44:06.096 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.095+0000 7f16c67fc700 1 -- 192.168.123.100:0/4133362961 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f16cc09f020 con 0x7f16c8097620
2026-03-10T12:44:06.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.244+0000 7f16d775f700 1 -- 192.168.123.100:0/4133362961 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f16b4005cc0 con 0x7f16c8097620
2026-03-10T12:44:06.245 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.244+0000 7f16c67fc700 1 -- 192.168.123.100:0/4133362961 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 38 v38) v1 ==== 76+0+1984 (secure 0 0 0) 0x7f16cc09f7d0 con 0x7f16c8097620
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:e38
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:btime 2026-03-10T12:44:00.282446+0000
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1)
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:epoch 38
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:44:00.282443+0000
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:root 0
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:max_xattr_size 65536
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {}
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 83
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:up {0=34368,1=44277}
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:failed
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:damaged
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:stopped
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3]
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:balancer
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:qdb_cluster leader: 34368 members: 34368,44277
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{0:34368} state up:active seq 8 join_fscid=1 addr [v2:192.168.123.100:6828/23281310,v1:192.168.123.100:6829/23281310] compat {c=[1],r=[1],i=[1fff]}]
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:44277} state up:active seq 10 join_fscid=1 addr [v2:192.168.123.100:6826/2887557827,v1:192.168.123.100:6827/2887557827] compat {c=[1],r=[1],i=[1fff]}]
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:44:06.246 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons:
2026-03-10T12:44:06.247 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:44:06.247 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{-1:44301} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6824/48365433,v1:192.168.123.107:6825/48365433] compat {c=[1],r=[1],i=[1fff]}]
2026-03-10T12:44:06.247 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:44305} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/3408808533,v1:192.168.123.107:6827/3408808533] compat {c=[1],r=[1],i=[1fff]}]
2026-03-10T12:44:06.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.247+0000 7f16d775f700 1 -- 192.168.123.100:0/4133362961 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f16bc0778e0 msgr2=0x7f16bc079d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:44:06.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.248+0000 7f16d775f700 1 --2- 192.168.123.100:0/4133362961 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f16bc0778e0 0x7f16bc079d90 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f16c812e1b0 tx=0x7f16c001a040 comp rx=0 tx=0).stop
2026-03-10T12:44:06.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.248+0000 7f16d775f700 1 -- 192.168.123.100:0/4133362961 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16c8097620 msgr2=0x7f16c812cca0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:44:06.248 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.248+0000 7f16d775f700 1 --2- 192.168.123.100:0/4133362961 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16c8097620 0x7f16c812cca0 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f16cc009fd0 tx=0x7f16cc00edf0 comp rx=0 tx=0).stop
2026-03-10T12:44:06.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.248+0000 7f16d775f700 1 -- 192.168.123.100:0/4133362961 shutdown_connections
2026-03-10T12:44:06.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.248+0000 7f16d775f700 1 --2- 192.168.123.100:0/4133362961 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f16bc0778e0 0x7f16bc079d90 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.248+0000 7f16d775f700 1 --2- 192.168.123.100:0/4133362961 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f16c8097620 0x7f16c812cca0 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.248+0000 7f16d775f700 1 --2- 192.168.123.100:0/4133362961 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f16c8098820 0x7f16c812d1e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.248+0000 7f16d775f700 1 -- 192.168.123.100:0/4133362961 >> 192.168.123.100:0/4133362961 conn(0x7f16c8092bb0 msgr2=0x7f16c8094e40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:06.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.248+0000 7f16d775f700 1 -- 192.168.123.100:0/4133362961 shutdown_connections 2026-03-10T12:44:06.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.248+0000 7f16d775f700 1 -- 192.168.123.100:0/4133362961 wait complete. 
2026-03-10T12:44:06.251 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 38 2026-03-10T12:44:06.334 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.333+0000 7f061dc8f700 1 -- 192.168.123.100:0/2376153202 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f06100a4310 msgr2=0x7f06100a4720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:06.338 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.333+0000 7f061dc8f700 1 --2- 192.168.123.100:0/2376153202 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f06100a4310 0x7f06100a4720 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f060c009b00 tx=0x7f060c009e10 comp rx=0 tx=0).stop 2026-03-10T12:44:06.339 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.338+0000 7f061dc8f700 1 -- 192.168.123.100:0/2376153202 shutdown_connections 2026-03-10T12:44:06.339 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.338+0000 7f061dc8f700 1 --2- 192.168.123.100:0/2376153202 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f06100a5450 0x7f06100a58c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.339 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.338+0000 7f061dc8f700 1 --2- 192.168.123.100:0/2376153202 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f06100a4310 0x7f06100a4720 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.339 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.338+0000 7f061dc8f700 1 -- 192.168.123.100:0/2376153202 >> 192.168.123.100:0/2376153202 conn(0x7f061009f7e0 msgr2=0x7f06100a1c30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:06.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.339+0000 7f061dc8f700 1 -- 192.168.123.100:0/2376153202 shutdown_connections 2026-03-10T12:44:06.343 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.339+0000 7f061dc8f700 1 -- 192.168.123.100:0/2376153202 wait complete. 2026-03-10T12:44:06.343 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.340+0000 7f061dc8f700 1 Processor -- start 2026-03-10T12:44:06.343 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.340+0000 7f061dc8f700 1 -- start start 2026-03-10T12:44:06.343 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.341+0000 7f061dc8f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f06100a4310 0x7f06100b33b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:06.344 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.341+0000 7f061dc8f700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f06100a5450 0x7f06100b38f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:06.345 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.341+0000 7f061dc8f700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f06100b3f10 con 0x7f06100a4310 2026-03-10T12:44:06.345 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.341+0000 7f061dc8f700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f06100b4050 con 0x7f06100a5450 2026-03-10T12:44:06.345 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.342+0000 7f0617fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f06100a5450 0x7f06100b38f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:06.345 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.342+0000 7f061cc8d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f06100a4310 0x7f06100b33b0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:06.345 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.342+0000 7f061cc8d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f06100a4310 0x7f06100b33b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:54184/0 (socket says 192.168.123.100:54184) 2026-03-10T12:44:06.345 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.342+0000 7f061cc8d700 1 -- 192.168.123.100:0/2632635964 learned_addr learned my addr 192.168.123.100:0/2632635964 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:44:06.345 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.343+0000 7f061cc8d700 1 -- 192.168.123.100:0/2632635964 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f06100a5450 msgr2=0x7f06100b38f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:06.345 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.343+0000 7f061cc8d700 1 --2- 192.168.123.100:0/2632635964 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f06100a5450 0x7f06100b38f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.345 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.343+0000 7f061cc8d700 1 -- 192.168.123.100:0/2632635964 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f060c0097e0 con 0x7f06100a4310 2026-03-10T12:44:06.345 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.343+0000 7f061cc8d700 1 --2- 192.168.123.100:0/2632635964 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f06100a4310 0x7f06100b33b0 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f060c00bb70 
tx=0x7f060c00bba0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:06.345 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.345+0000 7f0615ffb700 1 -- 192.168.123.100:0/2632635964 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f060c01d070 con 0x7f06100a4310 2026-03-10T12:44:06.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.345+0000 7f061dc8f700 1 -- 192.168.123.100:0/2632635964 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f06100ba1d0 con 0x7f06100a4310 2026-03-10T12:44:06.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.345+0000 7f061dc8f700 1 -- 192.168.123.100:0/2632635964 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f06100ba6c0 con 0x7f06100a4310 2026-03-10T12:44:06.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.348+0000 7f0615ffb700 1 -- 192.168.123.100:0/2632635964 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f060c022470 con 0x7f06100a4310 2026-03-10T12:44:06.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.348+0000 7f0615ffb700 1 -- 192.168.123.100:0/2632635964 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f060c00f650 con 0x7f06100a4310 2026-03-10T12:44:06.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.348+0000 7f0615ffb700 1 -- 192.168.123.100:0/2632635964 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f060c00f870 con 0x7f06100a4310 2026-03-10T12:44:06.348 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.348+0000 7f061dc8f700 1 -- 192.168.123.100:0/2632635964 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0610004f80 con 
0x7f06100a4310 2026-03-10T12:44:06.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.348+0000 7f0615ffb700 1 --2- 192.168.123.100:0/2632635964 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0608077b80 0x7f060807a030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:06.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.349+0000 7f0617fff700 1 --2- 192.168.123.100:0/2632635964 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0608077b80 0x7f060807a030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:06.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.349+0000 7f0617fff700 1 --2- 192.168.123.100:0/2632635964 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0608077b80 0x7f060807a030 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f06180666b0 tx=0x7f0618067150 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:06.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.349+0000 7f0615ffb700 1 -- 192.168.123.100:0/2632635964 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f060c09c190 con 0x7f06100a4310 2026-03-10T12:44:06.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.356+0000 7f0615ffb700 1 -- 192.168.123.100:0/2632635964 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f060c064a40 con 0x7f06100a4310 2026-03-10T12:44:06.518 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.517+0000 7f061dc8f700 1 -- 192.168.123.100:0/2632635964 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 
0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f06100a9b30 con 0x7f0608077b80
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.518+0000 7f0615ffb700 1 -- 192.168.123.100:0/2632635964 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+466 (secure 0 0 0) 0x7f06100a9b30 con 0x7f0608077b80
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:{
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:    "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:    "in_progress": true,
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:    "which": "Upgrading all daemon types on all hosts",
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:    "services_complete": [
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:        "mds",
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:        "crash",
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:        "mon",
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:        "ceph-exporter",
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:        "mgr",
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:        "osd"
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:    ],
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:    "progress": "18/23 daemons upgraded",
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:    "message": "Currently upgrading node-exporter daemons",
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:    "is_paused": false
2026-03-10T12:44:06.519 INFO:teuthology.orchestra.run.vm00.stdout:}
2026-03-10T12:44:06.522 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.522+0000 7f061dc8f700 1 -- 192.168.123.100:0/2632635964 >>
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0608077b80 msgr2=0x7f060807a030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:06.522 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.522+0000 7f061dc8f700 1 --2- 192.168.123.100:0/2632635964 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0608077b80 0x7f060807a030 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f06180666b0 tx=0x7f0618067150 comp rx=0 tx=0).stop 2026-03-10T12:44:06.522 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.522+0000 7f061dc8f700 1 -- 192.168.123.100:0/2632635964 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f06100a4310 msgr2=0x7f06100b33b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:06.522 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.522+0000 7f061dc8f700 1 --2- 192.168.123.100:0/2632635964 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f06100a4310 0x7f06100b33b0 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f060c00bb70 tx=0x7f060c00bba0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.523 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.522+0000 7f061dc8f700 1 -- 192.168.123.100:0/2632635964 shutdown_connections 2026-03-10T12:44:06.523 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.522+0000 7f061dc8f700 1 --2- 192.168.123.100:0/2632635964 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0608077b80 0x7f060807a030 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.523 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.522+0000 7f061dc8f700 1 --2- 192.168.123.100:0/2632635964 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f06100a4310 0x7f06100b33b0 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:44:06.523 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.522+0000 7f061dc8f700 1 --2- 192.168.123.100:0/2632635964 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f06100a5450 0x7f06100b38f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.523 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.522+0000 7f061dc8f700 1 -- 192.168.123.100:0/2632635964 >> 192.168.123.100:0/2632635964 conn(0x7f061009f7e0 msgr2=0x7f06100a1ba0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:06.523 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.523+0000 7f061dc8f700 1 -- 192.168.123.100:0/2632635964 shutdown_connections 2026-03-10T12:44:06.523 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.523+0000 7f061dc8f700 1 -- 192.168.123.100:0/2632635964 wait complete. 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.609+0000 7f19573a3700 1 -- 192.168.123.100:0/3512635210 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1950072440 msgr2=0x7f195010be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.609+0000 7f19573a3700 1 --2- 192.168.123.100:0/3512635210 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1950072440 0x7f195010be90 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f194800b3a0 tx=0x7f194800b6b0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.609+0000 7f19573a3700 1 -- 192.168.123.100:0/3512635210 shutdown_connections 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.609+0000 7f19573a3700 1 --2- 192.168.123.100:0/3512635210 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1950072440 0x7f195010be90 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.609+0000 7f19573a3700 1 --2- 192.168.123.100:0/3512635210 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1950071a60 0x7f1950071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.609+0000 7f19573a3700 1 -- 192.168.123.100:0/3512635210 >> 192.168.123.100:0/3512635210 conn(0x7f195006d1a0 msgr2=0x7f195006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.609+0000 7f19573a3700 1 -- 192.168.123.100:0/3512635210 shutdown_connections 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.609+0000 7f19573a3700 1 -- 192.168.123.100:0/3512635210 wait complete. 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.610+0000 7f19573a3700 1 Processor -- start 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.610+0000 7f19573a3700 1 -- start start 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.610+0000 7f19573a3700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1950071a60 0x7f1950116a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.610+0000 7f19573a3700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1950116f90 0x7f19501b2800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.610+0000 7f19573a3700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1950117490 con 0x7f1950116f90 2026-03-10T12:44:06.615 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.610+0000 7f19573a3700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1950117600 con 0x7f1950071a60 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.610+0000 7f19563a1700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1950071a60 0x7f1950116a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.610+0000 7f19563a1700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1950071a60 0x7f1950116a50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:47042/0 (socket says 192.168.123.100:47042) 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.610+0000 7f19563a1700 1 -- 192.168.123.100:0/106547358 learned_addr learned my addr 192.168.123.100:0/106547358 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.610+0000 7f1955ba0700 1 --2- 192.168.123.100:0/106547358 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1950116f90 0x7f19501b2800 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.610+0000 7f19563a1700 1 -- 192.168.123.100:0/106547358 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1950116f90 msgr2=0x7f19501b2800 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.610+0000 7f19563a1700 1 --2- 
192.168.123.100:0/106547358 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1950116f90 0x7f19501b2800 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.615 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.611+0000 7f19563a1700 1 -- 192.168.123.100:0/106547358 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f194800b050 con 0x7f1950071a60 2026-03-10T12:44:06.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.614+0000 7f19563a1700 1 --2- 192.168.123.100:0/106547358 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1950071a60 0x7f1950116a50 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f194c00b6e0 tx=0x7f194c00baa0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:06.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.614+0000 7f19477fe700 1 -- 192.168.123.100:0/106547358 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f194c00f800 con 0x7f1950071a60 2026-03-10T12:44:06.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.615+0000 7f19573a3700 1 -- 192.168.123.100:0/106547358 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f19501b2da0 con 0x7f1950071a60 2026-03-10T12:44:06.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.615+0000 7f19573a3700 1 -- 192.168.123.100:0/106547358 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f19501b3260 con 0x7f1950071a60 2026-03-10T12:44:06.616 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.616+0000 7f19477fe700 1 -- 192.168.123.100:0/106547358 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f194c00fe40 con 0x7f1950071a60 2026-03-10T12:44:06.616 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.616+0000 7f19477fe700 1 -- 192.168.123.100:0/106547358 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f194c00d5f0 con 0x7f1950071a60 2026-03-10T12:44:06.617 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.617+0000 7f19477fe700 1 -- 192.168.123.100:0/106547358 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f194c00d750 con 0x7f1950071a60 2026-03-10T12:44:06.618 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.618+0000 7f19477fe700 1 --2- 192.168.123.100:0/106547358 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f193c077910 0x7f193c079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:06.619 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.618+0000 7f1955ba0700 1 --2- 192.168.123.100:0/106547358 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f193c077910 0x7f193c079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:06.619 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.618+0000 7f19573a3700 1 -- 192.168.123.100:0/106547358 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1934005320 con 0x7f1950071a60 2026-03-10T12:44:06.620 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.620+0000 7f1955ba0700 1 --2- 192.168.123.100:0/106547358 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f193c077910 0x7f193c079dc0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f1948007f20 tx=0x7f19480061f0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:06.620 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.620+0000 7f19477fe700 1 -- 192.168.123.100:0/106547358 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f194c09a960 con 0x7f1950071a60 2026-03-10T12:44:06.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.622+0000 7f19477fe700 1 -- 192.168.123.100:0/106547358 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f194c0630e0 con 0x7f1950071a60 2026-03-10T12:44:06.798 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.797+0000 7f19573a3700 1 -- 192.168.123.100:0/106547358 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f1934005190 con 0x7f1950071a60 2026-03-10T12:44:06.798 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.798+0000 7f19477fe700 1 -- 192.168.123.100:0/106547358 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7f194c062320 con 0x7f1950071a60 2026-03-10T12:44:06.799 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_OK 2026-03-10T12:44:06.801 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.801+0000 7f19573a3700 1 -- 192.168.123.100:0/106547358 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f193c077910 msgr2=0x7f193c079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:06.801 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.801+0000 7f19573a3700 1 --2- 192.168.123.100:0/106547358 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f193c077910 0x7f193c079dc0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f1948007f20 tx=0x7f19480061f0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.802 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.801+0000 7f19573a3700 1 -- 192.168.123.100:0/106547358 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1950071a60 msgr2=0x7f1950116a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:06.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.801+0000 7f19573a3700 1 --2- 192.168.123.100:0/106547358 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1950071a60 0x7f1950116a50 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f194c00b6e0 tx=0x7f194c00baa0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.801+0000 7f19573a3700 1 -- 192.168.123.100:0/106547358 shutdown_connections 2026-03-10T12:44:06.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.801+0000 7f19573a3700 1 --2- 192.168.123.100:0/106547358 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f193c077910 0x7f193c079dc0 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.801+0000 7f19573a3700 1 --2- 192.168.123.100:0/106547358 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1950071a60 0x7f1950116a50 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.801+0000 7f19573a3700 1 --2- 192.168.123.100:0/106547358 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1950116f90 0x7f19501b2800 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:06.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.801+0000 7f19573a3700 1 -- 192.168.123.100:0/106547358 >> 192.168.123.100:0/106547358 conn(0x7f195006d1a0 msgr2=0x7f1950070610 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T12:44:06.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.801+0000 7f19573a3700 1 -- 192.168.123.100:0/106547358 shutdown_connections 2026-03-10T12:44:06.802 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:06.801+0000 7f19573a3700 1 -- 192.168.123.100:0/106547358 wait complete. 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='client.34402 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='client.44313 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='client.34410 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 
192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 
192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm00"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm00"}]': finished 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/568529600' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm07"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm07"}]': finished 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/4133362961' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:44:07.564 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:07 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/106547358' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='client.34402 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='client.44313 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='client.34410 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 
192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' 
entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm00"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm00"}]': finished 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/568529600' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm07"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm07"}]': finished 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:07.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:07.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:07.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:07.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/4133362961' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:44:07.567 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:07 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/106547358' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:44:08.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:08 vm00.local ceph-mon[103263]: Upgrade: Setting container_image for all ceph-exporter 2026-03-10T12:44:08.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:08 vm00.local ceph-mon[103263]: Upgrade: Setting container_image for all iscsi 2026-03-10T12:44:08.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:08 vm00.local ceph-mon[103263]: Upgrade: Setting container_image for all nfs 2026-03-10T12:44:08.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:08 vm00.local ceph-mon[103263]: Upgrade: Setting container_image for all nvmeof 2026-03-10T12:44:08.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:08 vm00.local ceph-mon[103263]: Upgrade: Updating node-exporter.vm00 (1/2) 2026-03-10T12:44:08.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:08 vm00.local ceph-mon[103263]: Deploying daemon node-exporter.vm00 on vm00 2026-03-10T12:44:08.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:08 vm00.local ceph-mon[103263]: from='client.34422 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:08.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:08 vm00.local ceph-mon[103263]: pgmap v195: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 7.1 KiB/s rd, 6.4 KiB/s wr, 8 op/s 2026-03-10T12:44:08.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:08 vm07.local ceph-mon[93622]: Upgrade: Setting container_image for all ceph-exporter 2026-03-10T12:44:08.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:08 vm07.local 
ceph-mon[93622]: Upgrade: Setting container_image for all iscsi 2026-03-10T12:44:08.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:08 vm07.local ceph-mon[93622]: Upgrade: Setting container_image for all nfs 2026-03-10T12:44:08.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:08 vm07.local ceph-mon[93622]: Upgrade: Setting container_image for all nvmeof 2026-03-10T12:44:08.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:08 vm07.local ceph-mon[93622]: Upgrade: Updating node-exporter.vm00 (1/2) 2026-03-10T12:44:08.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:08 vm07.local ceph-mon[93622]: Deploying daemon node-exporter.vm00 on vm00 2026-03-10T12:44:08.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:08 vm07.local ceph-mon[93622]: from='client.34422 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:08.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:08 vm07.local ceph-mon[93622]: pgmap v195: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 7.1 KiB/s rd, 6.4 KiB/s wr, 8 op/s 2026-03-10T12:44:09.669 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:09 vm00.local ceph-mon[103263]: pgmap v196: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 6.2 KiB/s rd, 6.4 KiB/s wr, 8 op/s 2026-03-10T12:44:09.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:09 vm07.local ceph-mon[93622]: pgmap v196: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 6.2 KiB/s rd, 6.4 KiB/s wr, 8 op/s 2026-03-10T12:44:10.769 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:10.769 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:10 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 
2026-03-10T12:44:11.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:11.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:10 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:11 vm07.local ceph-mon[93622]: Upgrade: Updating node-exporter.vm07 (2/2)
2026-03-10T12:44:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:11 vm07.local ceph-mon[93622]: Deploying daemon node-exporter.vm07 on vm07
2026-03-10T12:44:12.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:11 vm07.local ceph-mon[93622]: pgmap v197: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 6.2 KiB/s rd, 170 B/s wr, 6 op/s
2026-03-10T12:44:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:11 vm00.local ceph-mon[103263]: Upgrade: Updating node-exporter.vm07 (2/2)
2026-03-10T12:44:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:11 vm00.local ceph-mon[103263]: Deploying daemon node-exporter.vm07 on vm07
2026-03-10T12:44:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:11 vm00.local ceph-mon[103263]: pgmap v197: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 6.2 KiB/s rd, 170 B/s wr, 6 op/s
2026-03-10T12:44:13.943 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:13 vm07.local ceph-mon[93622]: pgmap v198: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 6.2 KiB/s rd, 170 B/s wr, 6 op/s
2026-03-10T12:44:13.943 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:13 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:13.943 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:13 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:13.943 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:13 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:44:13.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:13 vm00.local ceph-mon[103263]: pgmap v198: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 6.2 KiB/s rd, 170 B/s wr, 6 op/s
2026-03-10T12:44:13.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:13 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:13.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:13 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:13.968 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:13 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:44:15.284 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:15.284 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:15.284 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:15 vm07.local ceph-mon[93622]: pgmap v199: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 4.3 KiB/s rd, 85 B/s wr, 4 op/s
2026-03-10T12:44:15.284 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:15.284 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:15.284 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:15.542 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:15.542 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:15.542 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:15 vm00.local ceph-mon[103263]: pgmap v199: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 4.3 KiB/s rd, 85 B/s wr, 4 op/s
2026-03-10T12:44:15.542 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:15.542 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:15.542 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:15.542 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:15.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:16.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:16.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:16.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:16.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:16.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:44:16.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:16.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:16.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:44:16.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:44:16.485 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:16.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:16.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:16.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:16.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:16.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:44:16.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:16.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:16.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:44:16.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:44:16.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: Upgrade: Updating prometheus.vm00
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: pgmap v200: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 511 B/s rd, 0 op/s
2026-03-10T12:44:17.589 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:17 vm07.local ceph-mon[93622]: Deploying daemon prometheus.vm00 on vm00
2026-03-10T12:44:17.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:44:17.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:17.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: Upgrade: Updating prometheus.vm00
2026-03-10T12:44:17.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: pgmap v200: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail; 511 B/s rd, 0 op/s
2026-03-10T12:44:17.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:17 vm00.local ceph-mon[103263]: Deploying daemon prometheus.vm00 on vm00
2026-03-10T12:44:19.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:19 vm00.local ceph-mon[103263]: pgmap v201: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:44:20.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:19 vm07.local ceph-mon[93622]: pgmap v201: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:44:21.853 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:21 vm00.local ceph-mon[103263]: pgmap v202: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:44:22.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:21 vm07.local ceph-mon[93622]: pgmap v202: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:44:23.293 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:23 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:23.293 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:23 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:23.293 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:23 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:44:23.293 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:23 vm00.local ceph-mon[103263]: pgmap v203: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:44:23.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:23 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:23.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:23 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:23.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:23 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:44:23.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:23 vm07.local ceph-mon[93622]: pgmap v203: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:44:24.473 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:24.474 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:24.474 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:24.474 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:24 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:24.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:24.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:24.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:24.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:24 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:26.352 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: pgmap v204: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.353 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:26 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: pgmap v204: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:26.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:26 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:27.362 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:27 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T12:44:27.362 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:27 vm00.local ceph-mon[103263]: Upgrade: Updating alertmanager.vm00
2026-03-10T12:44:27.362 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:27 vm00.local ceph-mon[103263]: Deploying daemon alertmanager.vm00 on vm00
2026-03-10T12:44:27.362 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:27 vm00.local ceph-mon[103263]: pgmap v205: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:44:27.362 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:27 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:27.362 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:27 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:27.362 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:27 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:44:27.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:27 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T12:44:27.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:27 vm07.local ceph-mon[93622]: Upgrade: Updating alertmanager.vm00
2026-03-10T12:44:27.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:27 vm07.local ceph-mon[93622]: Deploying daemon alertmanager.vm00 on vm00
2026-03-10T12:44:27.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:27 vm07.local ceph-mon[93622]: pgmap v205: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:44:27.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:27 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:27.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:27 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:27.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:27 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:44:29.301 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:29 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:29.301 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:29 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:29.301 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:29 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:29.301 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:29 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:29.301 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:29 vm00.local ceph-mon[103263]: pgmap v206: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:44:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:29 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:29 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:29 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:29 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:29 vm07.local ceph-mon[93622]: pgmap v206: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:30.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:44:30.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: Upgrade: Updating grafana.vm00
2026-03-10T12:44:30.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:30 vm00.local ceph-mon[103263]: Deploying daemon grafana.vm00 on vm00
2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.'
cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 
192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: Upgrade: Updating grafana.vm00 2026-03-10T12:44:30.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:30 vm07.local ceph-mon[93622]: Deploying daemon grafana.vm00 on vm00 2026-03-10T12:44:31.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 
12:44:31 vm00.local ceph-mon[103263]: pgmap v207: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:31.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:31 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:44:31.815 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:31 vm07.local ceph-mon[93622]: pgmap v207: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:31.815 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:31 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:44:33.918 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:33 vm00.local ceph-mon[103263]: pgmap v208: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:34.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:33 vm07.local ceph-mon[93622]: pgmap v208: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:36.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:35 vm07.local ceph-mon[93622]: pgmap v209: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:36.107 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:35 vm00.local ceph-mon[103263]: pgmap v209: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:36.918 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.915+0000 7f291e55c700 1 -- 192.168.123.100:0/2102249717 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2918103980 msgr2=0x7f2918103dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:36.918 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.915+0000 7f291e55c700 1 --2- 192.168.123.100:0/2102249717 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2918103980 0x7f2918103dd0 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7f290c009b00 tx=0x7f290c009e10 comp rx=0 tx=0).stop 2026-03-10T12:44:36.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.921+0000 7f291e55c700 1 -- 192.168.123.100:0/2102249717 shutdown_connections 2026-03-10T12:44:36.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.921+0000 7f291e55c700 1 --2- 192.168.123.100:0/2102249717 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2918103980 0x7f2918103dd0 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:36.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.921+0000 7f291e55c700 1 --2- 192.168.123.100:0/2102249717 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2918102780 0x7f2918102b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:36.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.921+0000 7f291e55c700 1 -- 192.168.123.100:0/2102249717 >> 192.168.123.100:0/2102249717 conn(0x7f29180fdd10 msgr2=0x7f2918100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:36.925 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.924+0000 7f291e55c700 1 -- 192.168.123.100:0/2102249717 shutdown_connections 2026-03-10T12:44:36.925 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.925+0000 7f291e55c700 1 -- 192.168.123.100:0/2102249717 wait complete. 
2026-03-10T12:44:36.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.925+0000 7f291e55c700 1 Processor -- start 2026-03-10T12:44:36.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.925+0000 7f291e55c700 1 -- start start 2026-03-10T12:44:36.927 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.925+0000 7f291e55c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2918102780 0x7f2918197ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:36.927 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.925+0000 7f291e55c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2918103980 0x7f2918198530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:36.927 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.925+0000 7f291e55c700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2918198b50 con 0x7f2918102780 2026-03-10T12:44:36.927 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.925+0000 7f291e55c700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2918198c90 con 0x7f2918103980 2026-03-10T12:44:36.927 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.926+0000 7f2917fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2918102780 0x7f2918197ff0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:36.927 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.926+0000 7f2917fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2918102780 0x7f2918197ff0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:34040/0 (socket says 192.168.123.100:34040) 2026-03-10T12:44:36.927 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.926+0000 7f2917fff700 1 -- 192.168.123.100:0/3084954491 learned_addr learned my addr 192.168.123.100:0/3084954491 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:44:36.927 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.926+0000 7f29177fe700 1 --2- 192.168.123.100:0/3084954491 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2918103980 0x7f2918198530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:36.927 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.926+0000 7f29177fe700 1 -- 192.168.123.100:0/3084954491 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2918102780 msgr2=0x7f2918197ff0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:36.927 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.926+0000 7f29177fe700 1 --2- 192.168.123.100:0/3084954491 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2918102780 0x7f2918197ff0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:36.927 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.926+0000 7f29177fe700 1 -- 192.168.123.100:0/3084954491 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f290c0097e0 con 0x7f2918103980 2026-03-10T12:44:36.927 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.926+0000 7f29177fe700 1 --2- 192.168.123.100:0/3084954491 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2918103980 0x7f2918198530 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f290c000c00 tx=0x7f290c00bb20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:44:36.928 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.927+0000 7f29157fa700 1 -- 192.168.123.100:0/3084954491 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f290c01d070 con 0x7f2918103980 2026-03-10T12:44:36.928 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.927+0000 7f29157fa700 1 -- 192.168.123.100:0/3084954491 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f290c022470 con 0x7f2918103980 2026-03-10T12:44:36.928 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.927+0000 7f29157fa700 1 -- 192.168.123.100:0/3084954491 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f290c005030 con 0x7f2918103980 2026-03-10T12:44:36.931 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.928+0000 7f291e55c700 1 -- 192.168.123.100:0/3084954491 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f291819d6e0 con 0x7f2918103980 2026-03-10T12:44:36.931 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.928+0000 7f291e55c700 1 -- 192.168.123.100:0/3084954491 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f291819db50 con 0x7f2918103980 2026-03-10T12:44:36.931 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.929+0000 7f29157fa700 1 -- 192.168.123.100:0/3084954491 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f290c0225e0 con 0x7f2918103980 2026-03-10T12:44:36.931 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.930+0000 7f29157fa700 1 --2- 192.168.123.100:0/3084954491 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f2900077710 0x7f2900079bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:36.931 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.930+0000 7f29157fa700 1 -- 192.168.123.100:0/3084954491 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f290c09af50 con 0x7f2918103980 2026-03-10T12:44:36.931 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.930+0000 7f2917fff700 1 --2- 192.168.123.100:0/3084954491 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f2900077710 0x7f2900079bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:36.931 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.930+0000 7f2917fff700 1 --2- 192.168.123.100:0/3084954491 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f2900077710 0x7f2900079bc0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f2908005950 tx=0x7f2908009450 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:36.931 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.931+0000 7f291e55c700 1 -- 192.168.123.100:0/3084954491 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2918066e40 con 0x7f2918103980 2026-03-10T12:44:36.934 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:36.934+0000 7f29157fa700 1 -- 192.168.123.100:0/3084954491 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f290c063750 con 0x7f2918103980 2026-03-10T12:44:37.126 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.122+0000 7f291e55c700 1 -- 192.168.123.100:0/3084954491 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 
0x7f291819df40 con 0x7f2900077710 2026-03-10T12:44:37.126 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.126+0000 7f29157fa700 1 -- 192.168.123.100:0/3084954491 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+460 (secure 0 0 0) 0x7f291819df40 con 0x7f2900077710 2026-03-10T12:44:37.135 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.133+0000 7f28feffd700 1 -- 192.168.123.100:0/3084954491 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f2900077710 msgr2=0x7f2900079bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:37.135 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.133+0000 7f28feffd700 1 --2- 192.168.123.100:0/3084954491 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f2900077710 0x7f2900079bc0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f2908005950 tx=0x7f2908009450 comp rx=0 tx=0).stop 2026-03-10T12:44:37.135 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.133+0000 7f28feffd700 1 -- 192.168.123.100:0/3084954491 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2918103980 msgr2=0x7f2918198530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:37.135 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.133+0000 7f28feffd700 1 --2- 192.168.123.100:0/3084954491 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2918103980 0x7f2918198530 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f290c000c00 tx=0x7f290c00bb20 comp rx=0 tx=0).stop 2026-03-10T12:44:37.135 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.133+0000 7f28feffd700 1 -- 192.168.123.100:0/3084954491 shutdown_connections 2026-03-10T12:44:37.135 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.133+0000 7f28feffd700 1 --2- 192.168.123.100:0/3084954491 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f2900077710 0x7f2900079bc0 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.135 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.133+0000 7f28feffd700 1 --2- 192.168.123.100:0/3084954491 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2918102780 0x7f2918197ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.135 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.133+0000 7f28feffd700 1 --2- 192.168.123.100:0/3084954491 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2918103980 0x7f2918198530 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.135 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.133+0000 7f28feffd700 1 -- 192.168.123.100:0/3084954491 >> 192.168.123.100:0/3084954491 conn(0x7f29180fdd10 msgr2=0x7f2918106bb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:37.137 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.136+0000 7f28feffd700 1 -- 192.168.123.100:0/3084954491 shutdown_connections 2026-03-10T12:44:37.137 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.137+0000 7f28feffd700 1 -- 192.168.123.100:0/3084954491 wait complete. 
2026-03-10T12:44:37.159 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:44:37.269 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.268+0000 7faaa0a75700 1 -- 192.168.123.100:0/1917273989 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa9c103960 msgr2=0x7faa9c103db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:37.269 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.268+0000 7faaa0a75700 1 --2- 192.168.123.100:0/1917273989 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa9c103960 0x7faa9c103db0 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7faa90009b00 tx=0x7faa90009e10 comp rx=0 tx=0).stop 2026-03-10T12:44:37.269 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.268+0000 7faaa0a75700 1 -- 192.168.123.100:0/1917273989 shutdown_connections 2026-03-10T12:44:37.269 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.268+0000 7faaa0a75700 1 --2- 192.168.123.100:0/1917273989 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa9c103960 0x7faa9c103db0 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.269 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.268+0000 7faaa0a75700 1 --2- 192.168.123.100:0/1917273989 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa9c102760 0x7faa9c102b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.269 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.268+0000 7faaa0a75700 1 -- 192.168.123.100:0/1917273989 >> 192.168.123.100:0/1917273989 conn(0x7faa9c0fdcf0 msgr2=0x7faa9c100140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:37.271 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.271+0000 7faaa0a75700 1 -- 192.168.123.100:0/1917273989 shutdown_connections 2026-03-10T12:44:37.271 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.271+0000 7faaa0a75700 1 -- 192.168.123.100:0/1917273989 wait complete. 2026-03-10T12:44:37.271 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.271+0000 7faaa0a75700 1 Processor -- start 2026-03-10T12:44:37.271 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.271+0000 7faaa0a75700 1 -- start start 2026-03-10T12:44:37.272 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.271+0000 7faaa0a75700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa9c102760 0x7faa9c198050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:37.272 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.271+0000 7faaa0a75700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa9c103960 0x7faa9c198590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:37.272 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.271+0000 7faaa0a75700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faa9c198bb0 con 0x7faa9c103960 2026-03-10T12:44:37.272 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.271+0000 7faaa0a75700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faa9c198cf0 con 0x7faa9c102760 2026-03-10T12:44:37.272 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.272+0000 7faa9a59c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa9c102760 0x7faa9c198050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:37.272 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.272+0000 7faa9a59c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa9c102760 0x7faa9c198050 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:36844/0 (socket says 192.168.123.100:36844) 2026-03-10T12:44:37.272 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.272+0000 7faa9a59c700 1 -- 192.168.123.100:0/3566690439 learned_addr learned my addr 192.168.123.100:0/3566690439 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:44:37.272 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.272+0000 7faa99d9b700 1 --2- 192.168.123.100:0/3566690439 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa9c103960 0x7faa9c198590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:37.272 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.272+0000 7faa9a59c700 1 -- 192.168.123.100:0/3566690439 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa9c103960 msgr2=0x7faa9c198590 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:37.272 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.272+0000 7faa9a59c700 1 --2- 192.168.123.100:0/3566690439 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa9c103960 0x7faa9c198590 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.272 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.272+0000 7faa9a59c700 1 -- 192.168.123.100:0/3566690439 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faa900097e0 con 0x7faa9c102760 2026-03-10T12:44:37.272 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.272+0000 7faa9a59c700 1 --2- 192.168.123.100:0/3566690439 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa9c102760 0x7faa9c198050 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto 
rx=0x7faa9400c370 tx=0x7faa9400c730 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:37.272 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.272+0000 7faa8b7fe700 1 -- 192.168.123.100:0/3566690439 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faa94017070 con 0x7faa9c102760 2026-03-10T12:44:37.274 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.272+0000 7faaa0a75700 1 -- 192.168.123.100:0/3566690439 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faa9c19d740 con 0x7faa9c102760 2026-03-10T12:44:37.274 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.272+0000 7faaa0a75700 1 -- 192.168.123.100:0/3566690439 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faa9c19dc90 con 0x7faa9c102760 2026-03-10T12:44:37.274 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.273+0000 7faa8b7fe700 1 -- 192.168.123.100:0/3566690439 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7faa9400f040 con 0x7faa9c102760 2026-03-10T12:44:37.275 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.274+0000 7faa8b7fe700 1 -- 192.168.123.100:0/3566690439 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faa9400e050 con 0x7faa9c102760 2026-03-10T12:44:37.275 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.274+0000 7faa8b7fe700 1 -- 192.168.123.100:0/3566690439 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7faa94007500 con 0x7faa9c102760 2026-03-10T12:44:37.275 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.274+0000 7faaa0a75700 1 -- 192.168.123.100:0/3566690439 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7faa7c005320 con 0x7faa9c102760 2026-03-10T12:44:37.275 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.275+0000 7faa8b7fe700 1 --2- 192.168.123.100:0/3566690439 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faa840776c0 0x7faa84079b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:37.275 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.275+0000 7faa8b7fe700 1 -- 192.168.123.100:0/3566690439 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7faa9409a750 con 0x7faa9c102760 2026-03-10T12:44:37.275 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.275+0000 7faa99d9b700 1 --2- 192.168.123.100:0/3566690439 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faa840776c0 0x7faa84079b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:37.276 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.276+0000 7faa99d9b700 1 --2- 192.168.123.100:0/3566690439 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faa840776c0 0x7faa84079b70 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7faa900051d0 tx=0x7faa9001a040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:37.278 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.277+0000 7faa8b7fe700 1 -- 192.168.123.100:0/3566690439 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7faa94062f50 con 0x7faa9c102760 2026-03-10T12:44:37.473 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.471+0000 7faaa0a75700 1 -- 192.168.123.100:0/3566690439 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] 
-- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7faa7c000bf0 con 0x7faa840776c0 2026-03-10T12:44:37.474 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.473+0000 7faa8b7fe700 1 -- 192.168.123.100:0/3566690439 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+460 (secure 0 0 0) 0x7faa7c000bf0 con 0x7faa840776c0 2026-03-10T12:44:37.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.479+0000 7faaa0a75700 1 -- 192.168.123.100:0/3566690439 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faa840776c0 msgr2=0x7faa84079b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:37.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.479+0000 7faaa0a75700 1 --2- 192.168.123.100:0/3566690439 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faa840776c0 0x7faa84079b70 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7faa900051d0 tx=0x7faa9001a040 comp rx=0 tx=0).stop 2026-03-10T12:44:37.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.479+0000 7faaa0a75700 1 -- 192.168.123.100:0/3566690439 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa9c102760 msgr2=0x7faa9c198050 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:37.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.479+0000 7faaa0a75700 1 --2- 192.168.123.100:0/3566690439 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa9c102760 0x7faa9c198050 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7faa9400c370 tx=0x7faa9400c730 comp rx=0 tx=0).stop 2026-03-10T12:44:37.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.479+0000 7faaa0a75700 1 -- 192.168.123.100:0/3566690439 shutdown_connections 2026-03-10T12:44:37.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.479+0000 7faaa0a75700 1 
--2- 192.168.123.100:0/3566690439 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7faa840776c0 0x7faa84079b70 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.479+0000 7faaa0a75700 1 --2- 192.168.123.100:0/3566690439 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7faa9c102760 0x7faa9c198050 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.479+0000 7faaa0a75700 1 --2- 192.168.123.100:0/3566690439 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7faa9c103960 0x7faa9c198590 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.479+0000 7faaa0a75700 1 -- 192.168.123.100:0/3566690439 >> 192.168.123.100:0/3566690439 conn(0x7faa9c0fdcf0 msgr2=0x7faa9c106b90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:37.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.479+0000 7faaa0a75700 1 -- 192.168.123.100:0/3566690439 shutdown_connections 2026-03-10T12:44:37.479 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.479+0000 7faaa0a75700 1 -- 192.168.123.100:0/3566690439 wait complete. 
2026-03-10T12:44:37.576 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.576+0000 7f8533fff700 1 -- 192.168.123.100:0/1853521235 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8534072440 msgr2=0x7f853410be90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:37.576 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.576+0000 7f8533fff700 1 --2- 192.168.123.100:0/1853521235 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8534072440 0x7f853410be90 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f851c009b00 tx=0x7f851c009e10 comp rx=0 tx=0).stop 2026-03-10T12:44:37.576 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.576+0000 7f8533fff700 1 -- 192.168.123.100:0/1853521235 shutdown_connections 2026-03-10T12:44:37.576 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.576+0000 7f8533fff700 1 --2- 192.168.123.100:0/1853521235 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8534072440 0x7f853410be90 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.577 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.576+0000 7f8533fff700 1 --2- 192.168.123.100:0/1853521235 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8534071a60 0x7f8534071e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.577 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.576+0000 7f8533fff700 1 -- 192.168.123.100:0/1853521235 >> 192.168.123.100:0/1853521235 conn(0x7f853406d1a0 msgr2=0x7f853406f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:37.577 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.576+0000 7f8533fff700 1 -- 192.168.123.100:0/1853521235 shutdown_connections 2026-03-10T12:44:37.577 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.576+0000 7f8533fff700 1 -- 192.168.123.100:0/1853521235 
wait complete. 2026-03-10T12:44:37.577 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.577+0000 7f8533fff700 1 Processor -- start 2026-03-10T12:44:37.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.577+0000 7f8533fff700 1 -- start start 2026-03-10T12:44:37.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.577+0000 7f8533fff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8534071a60 0x7f85341a4c70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:37.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.577+0000 7f8533fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8534072440 0x7f85341a51b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:37.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.577+0000 7f8533fff700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85341a5720 con 0x7f8534072440 2026-03-10T12:44:37.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.577+0000 7f8533fff700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85341a5890 con 0x7f8534071a60 2026-03-10T12:44:37.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.577+0000 7f852bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8534072440 0x7f85341a51b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:37.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.577+0000 7f852bfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8534072440 0x7f85341a51b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 
says I am v2:192.168.123.100:34074/0 (socket says 192.168.123.100:34074) 2026-03-10T12:44:37.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.577+0000 7f852bfff700 1 -- 192.168.123.100:0/2418864241 learned_addr learned my addr 192.168.123.100:0/2418864241 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:44:37.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.577+0000 7f852bfff700 1 -- 192.168.123.100:0/2418864241 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8534071a60 msgr2=0x7f85341a4c70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:37.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.577+0000 7f8532ffd700 1 --2- 192.168.123.100:0/2418864241 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8534071a60 0x7f85341a4c70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:37.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.577+0000 7f852bfff700 1 --2- 192.168.123.100:0/2418864241 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8534071a60 0x7f85341a4c70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.577+0000 7f852bfff700 1 -- 192.168.123.100:0/2418864241 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f851c0097e0 con 0x7f8534072440 2026-03-10T12:44:37.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.578+0000 7f8532ffd700 1 --2- 192.168.123.100:0/2418864241 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8534071a60 0x7f85341a4c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T12:44:37.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.578+0000 7f852bfff700 1 --2- 192.168.123.100:0/2418864241 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8534072440 0x7f85341a51b0 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f851c000c00 tx=0x7f851c003940 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:37.578 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.578+0000 7f8530ff9700 1 -- 192.168.123.100:0/2418864241 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f851c01d070 con 0x7f8534072440 2026-03-10T12:44:37.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.578+0000 7f8530ff9700 1 -- 192.168.123.100:0/2418864241 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f851c00bea0 con 0x7f8534072440 2026-03-10T12:44:37.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.578+0000 7f8530ff9700 1 -- 192.168.123.100:0/2418864241 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f851c0219a0 con 0x7f8534072440 2026-03-10T12:44:37.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.578+0000 7f8533fff700 1 -- 192.168.123.100:0/2418864241 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f853410f540 con 0x7f8534072440 2026-03-10T12:44:37.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.578+0000 7f8533fff700 1 -- 192.168.123.100:0/2418864241 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f853410fa30 con 0x7f8534072440 2026-03-10T12:44:37.579 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.579+0000 7f8533fff700 1 -- 192.168.123.100:0/2418864241 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f853419ec00 con 0x7f8534072440 2026-03-10T12:44:37.580 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.580+0000 7f8530ff9700 1 -- 192.168.123.100:0/2418864241 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f851c021b00 con 0x7f8534072440 2026-03-10T12:44:37.582 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.580+0000 7f8530ff9700 1 --2- 192.168.123.100:0/2418864241 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f85140779e0 0x7f8514079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:37.582 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.580+0000 7f8532ffd700 1 --2- 192.168.123.100:0/2418864241 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f85140779e0 0x7f8514079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:37.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.583+0000 7f8530ff9700 1 -- 192.168.123.100:0/2418864241 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f851c09aee0 con 0x7f8534072440 2026-03-10T12:44:37.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.583+0000 7f8532ffd700 1 --2- 192.168.123.100:0/2418864241 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f85140779e0 0x7f8514079e90 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f8524005fd0 tx=0x7f8524005e60 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:37.583 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.583+0000 7f8530ff9700 1 -- 192.168.123.100:0/2418864241 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f851c0636e0 con 0x7f8534072440 2026-03-10T12:44:37.733 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.732+0000 7f8533fff700 1 -- 192.168.123.100:0/2418864241 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f8534061190 con 0x7f85140779e0 2026-03-10T12:44:37.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.738+0000 7f8530ff9700 1 -- 192.168.123.100:0/2418864241 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f8534061190 con 0x7f85140779e0 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (11s) 9s ago 11m 16.2M - 0.25.0 c8568f914cd2 f2bbecb3fd58 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (37s) 9s ago 11m 10.4M - 19.2.3-678-ge911bdeb 654f31e6858e cb97d867901c 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (35s) 23s ago 11m 10.2M - 19.2.3-678-ge911bdeb 654f31e6858e 04b17a97a05a 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (5m) 9s ago 11m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 88cd1c2e0041 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (5m) 23s ago 11m 7864k - 19.2.3-678-ge911bdeb 654f31e6858e 889e2b356266 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 starting - - - - 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (73s) 9s ago 9m 23.3M - 19.2.3-678-ge911bdeb 654f31e6858e 6ba265e19d66 
2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (64s) 9s ago 9m 75.1M - 19.2.3-678-ge911bdeb 654f31e6858e 29b157465a74 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (43s) 23s ago 9m 14.7M - 19.2.3-678-ge911bdeb 654f31e6858e dc7af8899792 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (53s) 23s ago 9m 21.6M - 19.2.3-678-ge911bdeb 654f31e6858e 66059e3b13a4 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:8443,9283,8765 running (6m) 9s ago 12m 633M - 19.2.3-678-ge911bdeb 654f31e6858e eaf11f86af53 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (5m) 23s ago 11m 491M - 19.2.3-678-ge911bdeb 654f31e6858e 22c93435649c 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (5m) 9s ago 12m 66.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e e8cc5980a849 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (5m) 23s ago 11m 55.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 032abad282fc 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (28s) 9s ago 11m 8837k - 1.7.0 72c9c2088986 2793fc2bcf05 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (24s) 23s ago 11m 3611k - 1.7.0 72c9c2088986 77eb1de1b54e 2026-03-10T12:44:37.739 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (4m) 9s ago 10m 187M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9b151d44f3cf 2026-03-10T12:44:37.740 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (3m) 9s ago 10m 121M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 252ea98c5665 2026-03-10T12:44:37.740 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (3m) 9s ago 10m 130M 4096M 19.2.3-678-ge911bdeb 
654f31e6858e 249137e44eb7 2026-03-10T12:44:37.740 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (2m) 23s ago 10m 165M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 72a045e3b78b 2026-03-10T12:44:37.740 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (2m) 23s ago 10m 122M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7ac87e1c2a41 2026-03-10T12:44:37.740 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (2m) 23s ago 9m 129M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bd169bf00834 2026-03-10T12:44:37.740 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (15s) 9s ago 11m 51.1M - 2.51.0 1d3b7f56885b 5577ed86e2fa 2026-03-10T12:44:37.744 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.743+0000 7f8533fff700 1 -- 192.168.123.100:0/2418864241 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f85140779e0 msgr2=0x7f8514079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:37.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.743+0000 7f8533fff700 1 --2- 192.168.123.100:0/2418864241 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f85140779e0 0x7f8514079e90 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f8524005fd0 tx=0x7f8524005e60 comp rx=0 tx=0).stop 2026-03-10T12:44:37.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.743+0000 7f8533fff700 1 -- 192.168.123.100:0/2418864241 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8534072440 msgr2=0x7f85341a51b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:37.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.743+0000 7f8533fff700 1 --2- 192.168.123.100:0/2418864241 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8534072440 0x7f85341a51b0 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f851c000c00 tx=0x7f851c003940 comp rx=0 tx=0).stop 
2026-03-10T12:44:37.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.743+0000 7f8533fff700 1 -- 192.168.123.100:0/2418864241 shutdown_connections 2026-03-10T12:44:37.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.743+0000 7f8533fff700 1 --2- 192.168.123.100:0/2418864241 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f85140779e0 0x7f8514079e90 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.743+0000 7f8533fff700 1 --2- 192.168.123.100:0/2418864241 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f8534071a60 0x7f85341a4c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.743+0000 7f8533fff700 1 --2- 192.168.123.100:0/2418864241 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f8534072440 0x7f85341a51b0 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.743+0000 7f8533fff700 1 -- 192.168.123.100:0/2418864241 >> 192.168.123.100:0/2418864241 conn(0x7f853406d1a0 msgr2=0x7f853410a6d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:37.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.743+0000 7f8533fff700 1 -- 192.168.123.100:0/2418864241 shutdown_connections 2026-03-10T12:44:37.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.743+0000 7f8533fff700 1 -- 192.168.123.100:0/2418864241 wait complete. 
2026-03-10T12:44:37.832 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:37.832 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:37.832 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:37 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:44:37.832 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:37 vm00.local ceph-mon[103263]: pgmap v210: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:37.832 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.830+0000 7f644b59e700 1 -- 192.168.123.100:0/4098561701 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f644c072330 msgr2=0x7f644c0770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:37.832 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.830+0000 7f644b59e700 1 --2- 192.168.123.100:0/4098561701 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f644c072330 0x7f644c0770b0 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f644400d3f0 tx=0x7f644400d700 comp rx=0 tx=0).stop 2026-03-10T12:44:37.832 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.830+0000 7f644b59e700 1 -- 192.168.123.100:0/4098561701 shutdown_connections 2026-03-10T12:44:37.832 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.830+0000 7f644b59e700 1 --2- 192.168.123.100:0/4098561701 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f644c072330 0x7f644c0770b0 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.832 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.830+0000 7f644b59e700 1 --2- 192.168.123.100:0/4098561701 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f644c071950 0x7f644c071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.832 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.830+0000 7f644b59e700 1 -- 192.168.123.100:0/4098561701 >> 192.168.123.100:0/4098561701 conn(0x7f644c06d1a0 msgr2=0x7f644c06f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:37.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.834+0000 7f644b59e700 1 -- 192.168.123.100:0/4098561701 shutdown_connections 2026-03-10T12:44:37.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.834+0000 7f644b59e700 1 -- 192.168.123.100:0/4098561701 wait complete. 2026-03-10T12:44:37.834 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.834+0000 7f644b59e700 1 Processor -- start 2026-03-10T12:44:37.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.834+0000 7f644b59e700 1 -- start start 2026-03-10T12:44:37.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.834+0000 7f644b59e700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f644c071950 0x7f644c082560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:37.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.834+0000 7f644b59e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f644c082aa0 0x7f644c082f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:37.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.835+0000 7f644b59e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f644c1bb2a0 con 0x7f644c082aa0 2026-03-10T12:44:37.835 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.835+0000 7f644b59e700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f644c1bb3e0 con 0x7f644c071950 2026-03-10T12:44:37.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.835+0000 7f6449d9b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f644c082aa0 0x7f644c082f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:37.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.835+0000 7f6449d9b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f644c082aa0 0x7f644c082f10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:51494/0 (socket says 192.168.123.100:51494) 2026-03-10T12:44:37.835 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.835+0000 7f6449d9b700 1 -- 192.168.123.100:0/3383330127 learned_addr learned my addr 192.168.123.100:0/3383330127 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:44:37.836 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.836+0000 7f644a59c700 1 --2- 192.168.123.100:0/3383330127 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f644c071950 0x7f644c082560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:37.838 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.837+0000 7f6449d9b700 1 -- 192.168.123.100:0/3383330127 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f644c071950 msgr2=0x7f644c082560 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:37.838 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.837+0000 7f6449d9b700 1 --2- 
192.168.123.100:0/3383330127 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f644c071950 0x7f644c082560 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:37.838 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.837+0000 7f6449d9b700 1 -- 192.168.123.100:0/3383330127 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6444007ed0 con 0x7f644c082aa0 2026-03-10T12:44:37.838 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.838+0000 7f6449d9b700 1 --2- 192.168.123.100:0/3383330127 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f644c082aa0 0x7f644c082f10 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f6444003c30 tx=0x7f6444003d10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:37.838 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.838+0000 7f643b7fe700 1 -- 192.168.123.100:0/3383330127 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f644401c070 con 0x7f644c082aa0 2026-03-10T12:44:37.838 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.838+0000 7f644b59e700 1 -- 192.168.123.100:0/3383330127 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f644c1bb520 con 0x7f644c082aa0 2026-03-10T12:44:37.838 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.838+0000 7f644b59e700 1 -- 192.168.123.100:0/3383330127 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f644c1bba70 con 0x7f644c082aa0 2026-03-10T12:44:37.843 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.838+0000 7f643b7fe700 1 -- 192.168.123.100:0/3383330127 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f644400fcf0 con 0x7f644c082aa0 2026-03-10T12:44:37.843 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.839+0000 7f644b59e700 1 -- 192.168.123.100:0/3383330127 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f644c07c900 con 0x7f644c082aa0 2026-03-10T12:44:37.843 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.839+0000 7f643b7fe700 1 -- 192.168.123.100:0/3383330127 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6444017dd0 con 0x7f644c082aa0 2026-03-10T12:44:37.843 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.840+0000 7f643b7fe700 1 -- 192.168.123.100:0/3383330127 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f644402a430 con 0x7f644c082aa0 2026-03-10T12:44:37.843 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.840+0000 7f643b7fe700 1 --2- 192.168.123.100:0/3383330127 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f64340779e0 0x7f6434079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:37.843 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.840+0000 7f643b7fe700 1 -- 192.168.123.100:0/3383330127 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f6444013070 con 0x7f644c082aa0 2026-03-10T12:44:37.843 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.840+0000 7f644a59c700 1 --2- 192.168.123.100:0/3383330127 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f64340779e0 0x7f6434079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:37.843 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.840+0000 7f644a59c700 1 --2- 192.168.123.100:0/3383330127 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f64340779e0 0x7f6434079e90 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f643c00a8b0 tx=0x7f643c008040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:37.843 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:37.843+0000 7f643b7fe700 1 -- 192.168.123.100:0/3383330127 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6444065660 con 0x7f644c082aa0 2026-03-10T12:44:38.023 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.023+0000 7f644b59e700 1 -- 192.168.123.100:0/3383330127 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f644c02d070 con 0x7f644c082aa0 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 2026-03-10T12:44:38.026 
INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:44:38.026 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:44:38.027 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.023+0000 7f643b7fe700 1 -- 192.168.123.100:0/3383330127 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f644400fe60 con 0x7f644c082aa0 2026-03-10T12:44:38.027 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.027+0000 7f64397fa700 1 -- 192.168.123.100:0/3383330127 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f64340779e0 msgr2=0x7f6434079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:38.027 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.027+0000 7f64397fa700 1 --2- 192.168.123.100:0/3383330127 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f64340779e0 0x7f6434079e90 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f643c00a8b0 tx=0x7f643c008040 comp rx=0 tx=0).stop 2026-03-10T12:44:38.027 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.027+0000 7f64397fa700 1 -- 192.168.123.100:0/3383330127 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f644c082aa0 msgr2=0x7f644c082f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:38.027 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.027+0000 7f64397fa700 1 --2- 
192.168.123.100:0/3383330127 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f644c082aa0 0x7f644c082f10 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f6444003c30 tx=0x7f6444003d10 comp rx=0 tx=0).stop 2026-03-10T12:44:38.029 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.029+0000 7f64397fa700 1 -- 192.168.123.100:0/3383330127 shutdown_connections 2026-03-10T12:44:38.029 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.029+0000 7f64397fa700 1 --2- 192.168.123.100:0/3383330127 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f64340779e0 0x7f6434079e90 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.029 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.029+0000 7f64397fa700 1 --2- 192.168.123.100:0/3383330127 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f644c071950 0x7f644c082560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.029 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.029+0000 7f64397fa700 1 --2- 192.168.123.100:0/3383330127 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f644c082aa0 0x7f644c082f10 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.029 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.029+0000 7f64397fa700 1 -- 192.168.123.100:0/3383330127 >> 192.168.123.100:0/3383330127 conn(0x7f644c06d1a0 msgr2=0x7f644c0764a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:38.029 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.029+0000 7f64397fa700 1 -- 192.168.123.100:0/3383330127 shutdown_connections 2026-03-10T12:44:38.030 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.030+0000 7f64397fa700 1 -- 192.168.123.100:0/3383330127 wait complete. 
2026-03-10T12:44:38.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:38.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:38.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:37 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:44:38.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:37 vm07.local ceph-mon[93622]: pgmap v210: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:38.112 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.112+0000 7fd1253cf700 1 -- 192.168.123.100:0/3728961751 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1180a5420 msgr2=0x7fd1180a5890 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:38.112 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.112+0000 7fd1253cf700 1 --2- 192.168.123.100:0/3728961751 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1180a5420 0x7fd1180a5890 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7fd1200669f0 tx=0x7fd1200699f0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.112 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.112+0000 7fd1253cf700 1 -- 192.168.123.100:0/3728961751 shutdown_connections 2026-03-10T12:44:38.112 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.112+0000 7fd1253cf700 1 --2- 192.168.123.100:0/3728961751 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1180a5420 0x7fd1180a5890 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.112 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.112+0000 7fd1253cf700 1 --2- 192.168.123.100:0/3728961751 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd1180a42e0 0x7fd1180a46f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.112 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.112+0000 7fd1253cf700 1 -- 192.168.123.100:0/3728961751 >> 192.168.123.100:0/3728961751 conn(0x7fd11809f7b0 msgr2=0x7fd1180a1c00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:38.112 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.112+0000 7fd1253cf700 1 -- 192.168.123.100:0/3728961751 shutdown_connections 2026-03-10T12:44:38.113 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.112+0000 7fd1253cf700 1 -- 192.168.123.100:0/3728961751 wait complete. 2026-03-10T12:44:38.113 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.113+0000 7fd1253cf700 1 Processor -- start 2026-03-10T12:44:38.113 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.113+0000 7fd1253cf700 1 -- start start 2026-03-10T12:44:38.113 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.113+0000 7fd1253cf700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1180a42e0 0x7fd118139b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:38.113 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.113+0000 7fd1253cf700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd11813a0d0 0x7fd11813f140 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:38.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.113+0000 7fd1253cf700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd11813a5d0 con 0x7fd11813a0d0 2026-03-10T12:44:38.114 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.113+0000 7fd1253cf700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd11813a740 con 0x7fd1180a42e0 2026-03-10T12:44:38.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.113+0000 7fd11ffff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1180a42e0 0x7fd118139b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:38.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.113+0000 7fd11ffff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1180a42e0 0x7fd118139b90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:43782/0 (socket says 192.168.123.100:43782) 2026-03-10T12:44:38.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.113+0000 7fd11ffff700 1 -- 192.168.123.100:0/2732603567 learned_addr learned my addr 192.168.123.100:0/2732603567 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:44:38.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.114+0000 7fd11f7fe700 1 --2- 192.168.123.100:0/2732603567 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd11813a0d0 0x7fd11813f140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:38.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.114+0000 7fd11ffff700 1 -- 192.168.123.100:0/2732603567 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd11813a0d0 msgr2=0x7fd11813f140 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:38.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.114+0000 7fd11ffff700 1 --2- 
192.168.123.100:0/2732603567 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd11813a0d0 0x7fd11813f140 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.114+0000 7fd11ffff700 1 -- 192.168.123.100:0/2732603567 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd11401a720 con 0x7fd1180a42e0 2026-03-10T12:44:38.114 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.114+0000 7fd11ffff700 1 --2- 192.168.123.100:0/2732603567 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1180a42e0 0x7fd118139b90 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fd11401fe30 tx=0x7fd11401d5c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:38.115 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.115+0000 7fd11d7fa700 1 -- 192.168.123.100:0/2732603567 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd11401aa80 con 0x7fd1180a42e0 2026-03-10T12:44:38.115 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.115+0000 7fd1253cf700 1 -- 192.168.123.100:0/2732603567 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd120067050 con 0x7fd1180a42e0 2026-03-10T12:44:38.115 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.115+0000 7fd1253cf700 1 -- 192.168.123.100:0/2732603567 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd11813fa70 con 0x7fd1180a42e0 2026-03-10T12:44:38.115 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.115+0000 7fd11d7fa700 1 -- 192.168.123.100:0/2732603567 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd114004500 con 0x7fd1180a42e0 2026-03-10T12:44:38.116 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.115+0000 7fd11d7fa700 1 -- 192.168.123.100:0/2732603567 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd114005530 con 0x7fd1180a42e0 2026-03-10T12:44:38.117 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.117+0000 7fd11d7fa700 1 -- 192.168.123.100:0/2732603567 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd114004750 con 0x7fd1180a42e0 2026-03-10T12:44:38.117 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.117+0000 7fd11d7fa700 1 --2- 192.168.123.100:0/2732603567 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd1100779e0 0x7fd110079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:38.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.117+0000 7fd11f7fe700 1 --2- 192.168.123.100:0/2732603567 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd1100779e0 0x7fd110079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:38.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.118+0000 7fd11d7fa700 1 -- 192.168.123.100:0/2732603567 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fd114025070 con 0x7fd1180a42e0 2026-03-10T12:44:38.118 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.116+0000 7fd1253cf700 1 -- 192.168.123.100:0/2732603567 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd104005320 con 0x7fd1180a42e0 2026-03-10T12:44:38.124 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.122+0000 7fd11f7fe700 1 --2- 192.168.123.100:0/2732603567 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd1100779e0 0x7fd110079e90 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fd120066ae0 tx=0x7fd120075040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:38.124 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.122+0000 7fd11d7fa700 1 -- 192.168.123.100:0/2732603567 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd1140739a0 con 0x7fd1180a42e0 2026-03-10T12:44:38.285 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.282+0000 7fd1253cf700 1 -- 192.168.123.100:0/2732603567 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fd104006200 con 0x7fd1180a42e0 2026-03-10T12:44:38.285 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.284+0000 7fd11d7fa700 1 -- 192.168.123.100:0/2732603567 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 38 v38) v1 ==== 76+0+1984 (secure 0 0 0) 0x7fd11402d020 con 0x7fd1180a42e0 2026-03-10T12:44:38.285 INFO:teuthology.orchestra.run.vm00.stdout:e38 2026-03-10T12:44:38.285 INFO:teuthology.orchestra.run.vm00.stdout:btime 2026-03-10T12:44:00:282446+0000 2026-03-10T12:44:38.285 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:44:38.285 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:44:38.285 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 2026-03-10T12:44:38.285 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:44:38.285 
INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:44:38.285 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 2026-03-10T12:44:38.285 INFO:teuthology.orchestra.run.vm00.stdout:epoch 38 2026-03-10T12:44:38.285 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:44:38.285 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:44:38.285 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:44:00.282443+0000 2026-03-10T12:44:38.285 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 2026-03-10T12:44:38.285 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:max_xattr_size 65536 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 83 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:up {0=34368,1=44277} 2026-03-10T12:44:38.286 
INFO:teuthology.orchestra.run.vm00.stdout:failed 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:qdb_cluster leader: 34368 members: 34368,44277 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{0:34368} state up:active seq 8 join_fscid=1 addr [v2:192.168.123.100:6828/23281310,v1:192.168.123.100:6829/23281310] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:44277} state up:active seq 10 join_fscid=1 addr [v2:192.168.123.100:6826/2887557827,v1:192.168.123.100:6827/2887557827] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons: 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{-1:44301} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6824/48365433,v1:192.168.123.107:6825/48365433] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T12:44:38.286 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:44305} state up:standby seq 1 join_fscid=1 addr 
[v2:192.168.123.107:6826/3408808533,v1:192.168.123.107:6827/3408808533] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T12:44:38.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.288+0000 7fd1253cf700 1 -- 192.168.123.100:0/2732603567 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd1100779e0 msgr2=0x7fd110079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:38.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.288+0000 7fd1253cf700 1 --2- 192.168.123.100:0/2732603567 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd1100779e0 0x7fd110079e90 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fd120066ae0 tx=0x7fd120075040 comp rx=0 tx=0).stop 2026-03-10T12:44:38.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.289+0000 7fd1253cf700 1 -- 192.168.123.100:0/2732603567 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1180a42e0 msgr2=0x7fd118139b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:38.289 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.289+0000 7fd1253cf700 1 --2- 192.168.123.100:0/2732603567 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1180a42e0 0x7fd118139b90 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fd11401fe30 tx=0x7fd11401d5c0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.289+0000 7fd1253cf700 1 -- 192.168.123.100:0/2732603567 shutdown_connections 2026-03-10T12:44:38.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.289+0000 7fd1253cf700 1 --2- 192.168.123.100:0/2732603567 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd1100779e0 0x7fd110079e90 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.290 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.289+0000 7fd1253cf700 1 --2- 192.168.123.100:0/2732603567 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd1180a42e0 0x7fd118139b90 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.289+0000 7fd1253cf700 1 --2- 192.168.123.100:0/2732603567 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd11813a0d0 0x7fd11813f140 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.289+0000 7fd1253cf700 1 -- 192.168.123.100:0/2732603567 >> 192.168.123.100:0/2732603567 conn(0x7fd11809f7b0 msgr2=0x7fd1180a1b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:38.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.289+0000 7fd1253cf700 1 -- 192.168.123.100:0/2732603567 shutdown_connections 2026-03-10T12:44:38.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.290+0000 7fd1253cf700 1 -- 192.168.123.100:0/2732603567 wait complete. 
2026-03-10T12:44:38.291 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 38 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.368+0000 7f4f36f9c700 1 -- 192.168.123.100:0/2899479105 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f30072330 msgr2=0x7f4f300770b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.368+0000 7f4f36f9c700 1 --2- 192.168.123.100:0/2899479105 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f30072330 0x7f4f300770b0 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f4f2800b600 tx=0x7f4f2800b910 comp rx=0 tx=0).stop 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.368+0000 7f4f36f9c700 1 -- 192.168.123.100:0/2899479105 shutdown_connections 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.368+0000 7f4f36f9c700 1 --2- 192.168.123.100:0/2899479105 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f30072330 0x7f4f300770b0 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.368+0000 7f4f36f9c700 1 --2- 192.168.123.100:0/2899479105 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4f30071950 0x7f4f30071d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.368+0000 7f4f36f9c700 1 -- 192.168.123.100:0/2899479105 >> 192.168.123.100:0/2899479105 conn(0x7f4f3006d1a0 msgr2=0x7f4f3006f5f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.369+0000 7f4f36f9c700 1 -- 192.168.123.100:0/2899479105 shutdown_connections 2026-03-10T12:44:38.373 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.369+0000 7f4f36f9c700 1 -- 192.168.123.100:0/2899479105 wait complete. 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.369+0000 7f4f36f9c700 1 Processor -- start 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.369+0000 7f4f36f9c700 1 -- start start 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.369+0000 7f4f36f9c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f30071950 0x7f4f300825d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.369+0000 7f4f36f9c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4f30082b10 0x7f4f30082f80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.369+0000 7f4f36f9c700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4f301b2a90 con 0x7f4f30071950 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.369+0000 7f4f36f9c700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4f301b2bd0 con 0x7f4f30082b10 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.369+0000 7f4f35f9a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f30071950 0x7f4f300825d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.369+0000 7f4f35f9a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f30071950 0x7f4f300825d0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:51522/0 (socket says 192.168.123.100:51522) 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.369+0000 7f4f35f9a700 1 -- 192.168.123.100:0/1761110403 learned_addr learned my addr 192.168.123.100:0/1761110403 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.370+0000 7f4f35799700 1 --2- 192.168.123.100:0/1761110403 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4f30082b10 0x7f4f30082f80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.370+0000 7f4f35f9a700 1 -- 192.168.123.100:0/1761110403 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4f30082b10 msgr2=0x7f4f30082f80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.370+0000 7f4f35f9a700 1 --2- 192.168.123.100:0/1761110403 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4f30082b10 0x7f4f30082f80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.370+0000 7f4f35f9a700 1 -- 192.168.123.100:0/1761110403 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4f2800b050 con 0x7f4f30071950 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.370+0000 7f4f35f9a700 1 --2- 192.168.123.100:0/1761110403 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f30071950 0x7f4f300825d0 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto 
rx=0x7f4f2c00ba70 tx=0x7f4f2c00bd80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:38.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.371+0000 7f4f26ffd700 1 -- 192.168.123.100:0/1761110403 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4f2c00c700 con 0x7f4f30071950 2026-03-10T12:44:38.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.371+0000 7f4f36f9c700 1 -- 192.168.123.100:0/1761110403 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4f301b2dd0 con 0x7f4f30071950 2026-03-10T12:44:38.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.371+0000 7f4f36f9c700 1 -- 192.168.123.100:0/1761110403 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4f301b32a0 con 0x7f4f30071950 2026-03-10T12:44:38.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.372+0000 7f4f36f9c700 1 -- 192.168.123.100:0/1761110403 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4f3007c920 con 0x7f4f30071950 2026-03-10T12:44:38.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.377+0000 7f4f26ffd700 1 -- 192.168.123.100:0/1761110403 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4f2c00cd40 con 0x7f4f30071950 2026-03-10T12:44:38.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.377+0000 7f4f26ffd700 1 -- 192.168.123.100:0/1761110403 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4f2c012340 con 0x7f4f30071950 2026-03-10T12:44:38.378 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.377+0000 7f4f26ffd700 1 -- 192.168.123.100:0/1761110403 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 
0x7f4f2c012580 con 0x7f4f30071950 2026-03-10T12:44:38.379 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.379+0000 7f4f26ffd700 1 --2- 192.168.123.100:0/1761110403 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4f1c077ab0 0x7f4f1c079f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:38.379 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.379+0000 7f4f35799700 1 --2- 192.168.123.100:0/1761110403 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4f1c077ab0 0x7f4f1c079f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:38.380 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.379+0000 7f4f26ffd700 1 -- 192.168.123.100:0/1761110403 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f4f2c099aa0 con 0x7f4f30071950 2026-03-10T12:44:38.380 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.380+0000 7f4f26ffd700 1 -- 192.168.123.100:0/1761110403 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4f2c09d2a0 con 0x7f4f30071950 2026-03-10T12:44:38.380 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.380+0000 7f4f35799700 1 --2- 192.168.123.100:0/1761110403 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4f1c077ab0 0x7f4f1c079f60 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f4f2800bd90 tx=0x7f4f280096e0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:38.526 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.526+0000 7f4f36f9c700 1 -- 192.168.123.100:0/1761110403 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] 
-- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f4f30061190 con 0x7f4f1c077ab0 2026-03-10T12:44:38.527 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.527+0000 7f4f26ffd700 1 -- 192.168.123.100:0/1761110403 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+460 (secure 0 0 0) 0x7f4f30061190 con 0x7f4f1c077ab0 2026-03-10T12:44:38.527 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:44:38.527 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T12:44:38.527 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": true, 2026-03-10T12:44:38.527 INFO:teuthology.orchestra.run.vm00.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T12:44:38.527 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [ 2026-03-10T12:44:38.527 INFO:teuthology.orchestra.run.vm00.stdout: "mds", 2026-03-10T12:44:38.527 INFO:teuthology.orchestra.run.vm00.stdout: "crash", 2026-03-10T12:44:38.527 INFO:teuthology.orchestra.run.vm00.stdout: "mon", 2026-03-10T12:44:38.527 INFO:teuthology.orchestra.run.vm00.stdout: "ceph-exporter", 2026-03-10T12:44:38.527 INFO:teuthology.orchestra.run.vm00.stdout: "mgr", 2026-03-10T12:44:38.527 INFO:teuthology.orchestra.run.vm00.stdout: "osd" 2026-03-10T12:44:38.527 INFO:teuthology.orchestra.run.vm00.stdout: ], 2026-03-10T12:44:38.528 INFO:teuthology.orchestra.run.vm00.stdout: "progress": "18/23 daemons upgraded", 2026-03-10T12:44:38.528 INFO:teuthology.orchestra.run.vm00.stdout: "message": "Currently upgrading grafana daemons", 2026-03-10T12:44:38.528 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false 2026-03-10T12:44:38.528 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:44:38.534 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.533+0000 7f4f24ff9700 1 -- 
192.168.123.100:0/1761110403 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4f1c077ab0 msgr2=0x7f4f1c079f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:38.534 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.533+0000 7f4f24ff9700 1 --2- 192.168.123.100:0/1761110403 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4f1c077ab0 0x7f4f1c079f60 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f4f2800bd90 tx=0x7f4f280096e0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.534 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.533+0000 7f4f24ff9700 1 -- 192.168.123.100:0/1761110403 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f30071950 msgr2=0x7f4f300825d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:38.534 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.533+0000 7f4f24ff9700 1 --2- 192.168.123.100:0/1761110403 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f30071950 0x7f4f300825d0 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7f4f2c00ba70 tx=0x7f4f2c00bd80 comp rx=0 tx=0).stop 2026-03-10T12:44:38.535 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.535+0000 7f4f24ff9700 1 -- 192.168.123.100:0/1761110403 shutdown_connections 2026-03-10T12:44:38.535 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.535+0000 7f4f24ff9700 1 --2- 192.168.123.100:0/1761110403 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4f1c077ab0 0x7f4f1c079f60 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.535 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.535+0000 7f4f24ff9700 1 --2- 192.168.123.100:0/1761110403 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4f30071950 0x7f4f300825d0 unknown :-1 s=CLOSED pgs=155 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T12:44:38.535 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.535+0000 7f4f24ff9700 1 --2- 192.168.123.100:0/1761110403 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4f30082b10 0x7f4f30082f80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.535 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.535+0000 7f4f24ff9700 1 -- 192.168.123.100:0/1761110403 >> 192.168.123.100:0/1761110403 conn(0x7f4f3006d1a0 msgr2=0x7f4f300764f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:38.535 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.535+0000 7f4f24ff9700 1 -- 192.168.123.100:0/1761110403 shutdown_connections 2026-03-10T12:44:38.535 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.535+0000 7f4f24ff9700 1 -- 192.168.123.100:0/1761110403 wait complete. 2026-03-10T12:44:38.634 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.630+0000 7ff97160a700 1 -- 192.168.123.100:0/3778253282 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff96c0737b0 msgr2=0x7ff96c073c20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:38.634 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.630+0000 7ff97160a700 1 --2- 192.168.123.100:0/3778253282 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff96c0737b0 0x7ff96c073c20 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7ff960009b00 tx=0x7ff960009e10 comp rx=0 tx=0).stop 2026-03-10T12:44:38.634 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.630+0000 7ff97160a700 1 -- 192.168.123.100:0/3778253282 shutdown_connections 2026-03-10T12:44:38.634 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.630+0000 7ff97160a700 1 --2- 192.168.123.100:0/3778253282 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff96c0737b0 0x7ff96c073c20 secure :-1 s=CLOSED pgs=156 cs=0 l=1 rev1=1 crypto 
rx=0x7ff960009b00 tx=0x7ff960009e10 comp rx=0 tx=0).stop 2026-03-10T12:44:38.634 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.630+0000 7ff97160a700 1 --2- 192.168.123.100:0/3778253282 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff96c074d80 0x7ff96c0731e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.634 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.630+0000 7ff97160a700 1 -- 192.168.123.100:0/3778253282 >> 192.168.123.100:0/3778253282 conn(0x7ff96c0fb890 msgr2=0x7ff96c0fdce0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:38.634 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.630+0000 7ff97160a700 1 -- 192.168.123.100:0/3778253282 shutdown_connections 2026-03-10T12:44:38.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.630+0000 7ff97160a700 1 -- 192.168.123.100:0/3778253282 wait complete. 2026-03-10T12:44:38.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.633+0000 7ff97160a700 1 Processor -- start 2026-03-10T12:44:38.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.633+0000 7ff97160a700 1 -- start start 2026-03-10T12:44:38.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.634+0000 7ff97160a700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff96c074d80 0x7ff96c06cf20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:38.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.634+0000 7ff97160a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff96c06d460 0x7ff96c072510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:38.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.634+0000 7ff97160a700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff96c06d960 con 
0x7ff96c06d460 2026-03-10T12:44:38.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.634+0000 7ff97160a700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff96c06dad0 con 0x7ff96c074d80 2026-03-10T12:44:38.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.636+0000 7ff96affd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff96c074d80 0x7ff96c06cf20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:38.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.636+0000 7ff96affd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff96c074d80 0x7ff96c06cf20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:43798/0 (socket says 192.168.123.100:43798) 2026-03-10T12:44:38.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.636+0000 7ff96affd700 1 -- 192.168.123.100:0/2509105273 learned_addr learned my addr 192.168.123.100:0/2509105273 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:44:38.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.636+0000 7ff96affd700 1 -- 192.168.123.100:0/2509105273 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff96c06d460 msgr2=0x7ff96c072510 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:44:38.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.636+0000 7ff96affd700 1 --2- 192.168.123.100:0/2509105273 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff96c06d460 0x7ff96c072510 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:38.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.636+0000 7ff96affd700 1 -- 192.168.123.100:0/2509105273 
--> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff9600097e0 con 0x7ff96c074d80 2026-03-10T12:44:38.637 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.636+0000 7ff96affd700 1 --2- 192.168.123.100:0/2509105273 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff96c074d80 0x7ff96c06cf20 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7ff95c00b700 tx=0x7ff95c00bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:38.639 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.637+0000 7ff953fff700 1 -- 192.168.123.100:0/2509105273 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff95c010820 con 0x7ff96c074d80 2026-03-10T12:44:38.639 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.637+0000 7ff97160a700 1 -- 192.168.123.100:0/2509105273 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff96c072ab0 con 0x7ff96c074d80 2026-03-10T12:44:38.639 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.637+0000 7ff97160a700 1 -- 192.168.123.100:0/2509105273 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff96c1919b0 con 0x7ff96c074d80 2026-03-10T12:44:38.639 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.637+0000 7ff953fff700 1 -- 192.168.123.100:0/2509105273 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff95c010e60 con 0x7ff96c074d80 2026-03-10T12:44:38.639 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.637+0000 7ff953fff700 1 -- 192.168.123.100:0/2509105273 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff95c017570 con 0x7ff96c074d80 2026-03-10T12:44:38.639 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.638+0000 7ff953fff700 1 -- 
192.168.123.100:0/2509105273 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff95c010980 con 0x7ff96c074d80 2026-03-10T12:44:38.639 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.639+0000 7ff953fff700 1 --2- 192.168.123.100:0/2509105273 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff954077910 0x7ff954079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:44:38.639 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.639+0000 7ff953fff700 1 -- 192.168.123.100:0/2509105273 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7ff95c0993c0 con 0x7ff96c074d80 2026-03-10T12:44:38.640 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.639+0000 7ff97160a700 1 -- 192.168.123.100:0/2509105273 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff958005320 con 0x7ff96c074d80 2026-03-10T12:44:38.640 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.640+0000 7ff96a7fc700 1 --2- 192.168.123.100:0/2509105273 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff954077910 0x7ff954079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:44:38.641 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.641+0000 7ff96a7fc700 1 --2- 192.168.123.100:0/2509105273 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff954077910 0x7ff954079dc0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7ff960005fd0 tx=0x7ff960000bc0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:44:38.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.642+0000 7ff953fff700 1 -- 
192.168.123.100:0/2509105273 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff95c061ac0 con 0x7ff96c074d80 2026-03-10T12:44:38.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:38 vm00.local ceph-mon[103263]: from='client.44335 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:38.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:38 vm00.local ceph-mon[103263]: from='client.44339 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:38.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:38 vm00.local ceph-mon[103263]: from='client.34438 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:38.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:38 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/3383330127' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:38.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:38 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/2732603567' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:44:38.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:38 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:38.985 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:38 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:38.992 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.991+0000 7ff97160a700 1 -- 192.168.123.100:0/2509105273 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7ff958005190 con 0x7ff96c074d80 2026-03-10T12:44:38.993 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:38.993+0000 7ff953fff700 1 -- 192.168.123.100:0/2509105273 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7ff95c061210 con 0x7ff96c074d80 2026-03-10T12:44:38.996 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_OK 2026-03-10T12:44:39.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:39.000+0000 7ff97160a700 1 -- 192.168.123.100:0/2509105273 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff954077910 msgr2=0x7ff954079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:39.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:39.000+0000 7ff97160a700 1 --2- 192.168.123.100:0/2509105273 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff954077910 0x7ff954079dc0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7ff960005fd0 tx=0x7ff960000bc0 comp rx=0 tx=0).stop 2026-03-10T12:44:39.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:39.000+0000 7ff97160a700 1 -- 192.168.123.100:0/2509105273 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff96c074d80 msgr2=0x7ff96c06cf20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:44:39.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:39.000+0000 7ff97160a700 1 --2- 192.168.123.100:0/2509105273 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff96c074d80 0x7ff96c06cf20 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7ff95c00b700 tx=0x7ff95c00bac0 comp rx=0 tx=0).stop 2026-03-10T12:44:39.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:39.000+0000 7ff97160a700 1 -- 192.168.123.100:0/2509105273 shutdown_connections 2026-03-10T12:44:39.001 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:39.000+0000 7ff97160a700 1 --2- 192.168.123.100:0/2509105273 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff954077910 0x7ff954079dc0 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:39.001 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:39.000+0000 7ff97160a700 1 --2- 192.168.123.100:0/2509105273 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff96c074d80 0x7ff96c06cf20 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:39.001 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:39.001+0000 7ff97160a700 1 --2- 192.168.123.100:0/2509105273 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff96c06d460 0x7ff96c072510 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:44:39.001 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:39.001+0000 7ff97160a700 1 -- 192.168.123.100:0/2509105273 >> 192.168.123.100:0/2509105273 conn(0x7ff96c0fb890 msgr2=0x7ff96c0fdc60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:44:39.001 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:39.001+0000 7ff97160a700 1 -- 
192.168.123.100:0/2509105273 shutdown_connections 2026-03-10T12:44:39.001 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:44:39.001+0000 7ff97160a700 1 -- 192.168.123.100:0/2509105273 wait complete. 2026-03-10T12:44:39.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:38 vm07.local ceph-mon[93622]: from='client.44335 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:39.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:38 vm07.local ceph-mon[93622]: from='client.44339 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:39.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:38 vm07.local ceph-mon[93622]: from='client.34438 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:39.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:38 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/3383330127' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:39.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:38 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/2732603567' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:44:39.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:38 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:39.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:38 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:39.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:39 vm00.local ceph-mon[103263]: from='client.34450 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:39.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:39 vm00.local ceph-mon[103263]: pgmap v211: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:39.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:39 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/2509105273' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:44:39.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:39.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:39 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:40.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:39 vm07.local ceph-mon[93622]: from='client.34450 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:44:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:39 vm07.local ceph-mon[93622]: pgmap v211: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:39 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/2509105273' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T12:44:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:40.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:39 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 
192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 
192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: Upgrade: Finalizing container_image settings 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 
2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 
192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": 
"config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T12:44:41.735 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: Upgrade: Complete! 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: 
from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:41.735 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:41 vm00.local ceph-mon[103263]: pgmap v212: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:41.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:41.816 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", 
"entity": "client.admin"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: Upgrade: Finalizing container_image settings 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: 
dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 
192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", 
"name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T12:44:41.817 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: Upgrade: Complete! 
2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-10T12:44:41.817 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:44:41.818 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:44:41.818 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:44:41.818 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:41.818 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:44:41.818 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:44:41.818 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 
vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:44:41.818 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:41.818 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:41 vm07.local ceph-mon[93622]: pgmap v212: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:42.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:42 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:42.815 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:42 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:44:43.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:43 vm07.local ceph-mon[93622]: pgmap v213: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:43.734 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:43 vm00.local ceph-mon[103263]: pgmap v213: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:46.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:45 vm00.local ceph-mon[103263]: pgmap v214: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:46.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:44:46.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:45 vm07.local ceph-mon[93622]: pgmap v214: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:46.316 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:44:48.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:47 vm00.local ceph-mon[103263]: pgmap v215: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:48.242 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:47 vm07.local ceph-mon[93622]: pgmap v215: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:50.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:49 vm00.local ceph-mon[103263]: pgmap v216: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:50.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:49 vm07.local ceph-mon[93622]: pgmap v216: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:52.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:51 vm00.local ceph-mon[103263]: pgmap v217: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:52.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:51 vm07.local ceph-mon[93622]: pgmap v217: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:53.289 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:53 vm07.local ceph-mon[93622]: pgmap v218: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:53.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:53 vm00.local ceph-mon[103263]: pgmap v218: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:56.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:55 vm07.local ceph-mon[93622]: pgmap v219: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 
GiB / 120 GiB avail 2026-03-10T12:44:56.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:55 vm00.local ceph-mon[103263]: pgmap v219: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:58.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:57 vm07.local ceph-mon[93622]: pgmap v220: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:44:58.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:57 vm00.local ceph-mon[103263]: pgmap v220: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:45:00.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:44:59 vm07.local ceph-mon[93622]: pgmap v221: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:45:00.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:44:59 vm00.local ceph-mon[103263]: pgmap v221: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:45:01.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:00 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:45:01.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:45:02.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:01 vm07.local ceph-mon[93622]: pgmap v222: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:45:02.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:01 vm00.local ceph-mon[103263]: pgmap v222: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:45:04.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:03 
vm07.local ceph-mon[93622]: pgmap v223: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:45:04.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:03 vm00.local ceph-mon[103263]: pgmap v223: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:45:06.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:05 vm07.local ceph-mon[93622]: pgmap v224: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:45:06.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:05 vm00.local ceph-mon[103263]: pgmap v224: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:45:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:07 vm00.local ceph-mon[103263]: pgmap v225: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:45:08.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:07 vm07.local ceph-mon[93622]: pgmap v225: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:45:09.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.086+0000 7ff9d722b700 1 -- 192.168.123.100:0/1132279849 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff9d0103980 msgr2=0x7ff9d0103dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:45:09.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.086+0000 7ff9d722b700 1 --2- 192.168.123.100:0/1132279849 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff9d0103980 0x7ff9d0103dd0 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7ff9c4009b00 tx=0x7ff9c4009e10 comp rx=0 tx=0).stop 2026-03-10T12:45:09.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.087+0000 7ff9d722b700 1 -- 192.168.123.100:0/1132279849 shutdown_connections 2026-03-10T12:45:09.087 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.087+0000 7ff9d722b700 1 --2- 192.168.123.100:0/1132279849 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff9d0103980 0x7ff9d0103dd0 unknown :-1 s=CLOSED pgs=157 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:45:09.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.087+0000 7ff9d722b700 1 --2- 192.168.123.100:0/1132279849 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9d0102780 0x7ff9d0102b90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:45:09.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.087+0000 7ff9d722b700 1 -- 192.168.123.100:0/1132279849 >> 192.168.123.100:0/1132279849 conn(0x7ff9d00fdd50 msgr2=0x7ff9d0100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:45:09.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.087+0000 7ff9d722b700 1 -- 192.168.123.100:0/1132279849 shutdown_connections 2026-03-10T12:45:09.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.087+0000 7ff9d722b700 1 -- 192.168.123.100:0/1132279849 wait complete. 
2026-03-10T12:45:09.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.088+0000 7ff9d722b700 1 Processor -- start
2026-03-10T12:45:09.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.088+0000 7ff9d722b700 1 -- start start
2026-03-10T12:45:09.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.088+0000 7ff9d722b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff9d0102780 0x7ff9d0193bf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:45:09.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.088+0000 7ff9d722b700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9d0103980 0x7ff9d0194130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:45:09.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.088+0000 7ff9d722b700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff9d0194670 con 0x7ff9d0102780
2026-03-10T12:45:09.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.088+0000 7ff9d722b700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff9d01947b0 con 0x7ff9d0103980
2026-03-10T12:45:09.088 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.088+0000 7ff9cffff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9d0103980 0x7ff9d0194130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:45:09.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.088+0000 7ff9d4fc7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff9d0102780 0x7ff9d0193bf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:45:09.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.088+0000 7ff9cffff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9d0103980 0x7ff9d0194130 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:48340/0 (socket says 192.168.123.100:48340)
2026-03-10T12:45:09.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.088+0000 7ff9cffff700 1 -- 192.168.123.100:0/3928502047 learned_addr learned my addr 192.168.123.100:0/3928502047 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:45:09.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.089+0000 7ff9d4fc7700 1 -- 192.168.123.100:0/3928502047 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9d0103980 msgr2=0x7ff9d0194130 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:45:09.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.089+0000 7ff9d4fc7700 1 --2- 192.168.123.100:0/3928502047 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9d0103980 0x7ff9d0194130 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:45:09.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.089+0000 7ff9d4fc7700 1 -- 192.168.123.100:0/3928502047 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff9c40097e0 con 0x7ff9d0102780
2026-03-10T12:45:09.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.089+0000 7ff9cffff700 1 --2- 192.168.123.100:0/3928502047 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9d0103980 0x7ff9d0194130 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-10T12:45:09.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.089+0000 7ff9d4fc7700 1 --2- 192.168.123.100:0/3928502047 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff9d0102780 0x7ff9d0193bf0 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7ff9c000ea80 tx=0x7ff9c000ed90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:45:09.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.089+0000 7ff9cdffb700 1 -- 192.168.123.100:0/3928502047 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff9c000cb20 con 0x7ff9d0102780
2026-03-10T12:45:09.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.089+0000 7ff9d722b700 1 -- 192.168.123.100:0/3928502047 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff9d01aa390 con 0x7ff9d0102780
2026-03-10T12:45:09.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.089+0000 7ff9d722b700 1 -- 192.168.123.100:0/3928502047 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff9d01aa8b0 con 0x7ff9d0102780
2026-03-10T12:45:09.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.089+0000 7ff9cdffb700 1 -- 192.168.123.100:0/3928502047 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff9c0004500 con 0x7ff9d0102780
2026-03-10T12:45:09.090 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.090+0000 7ff9cdffb700 1 -- 192.168.123.100:0/3928502047 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff9c0010430 con 0x7ff9d0102780
2026-03-10T12:45:09.091 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.091+0000 7ff9cdffb700 1 -- 192.168.123.100:0/3928502047 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff9c0010590 con 0x7ff9d0102780
2026-03-10T12:45:09.091 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.091+0000 7ff9cdffb700 1 --2- 192.168.123.100:0/3928502047 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff9b80779e0 0x7ff9b8079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:45:09.091 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.091+0000 7ff9cffff700 1 --2- 192.168.123.100:0/3928502047 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff9b80779e0 0x7ff9b8079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:45:09.092 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.092+0000 7ff9cffff700 1 --2- 192.168.123.100:0/3928502047 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff9b80779e0 0x7ff9b8079e90 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7ff9c4005200 tx=0x7ff9c401a040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:45:09.092 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.092+0000 7ff9cdffb700 1 -- 192.168.123.100:0/3928502047 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7ff9c0014070 con 0x7ff9d0102780
2026-03-10T12:45:09.094 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.092+0000 7ff9d722b700 1 -- 192.168.123.100:0/3928502047 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff9bc005320 con 0x7ff9d0102780
2026-03-10T12:45:09.095 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.095+0000 7ff9cdffb700 1 -- 192.168.123.100:0/3928502047 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff9c00636a0 con 0x7ff9d0102780
2026-03-10T12:45:09.228 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.228+0000 7ff9d722b700 1 -- 192.168.123.100:0/3928502047 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff9bc000bf0 con 0x7ff9b80779e0
2026-03-10T12:45:09.229 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.229+0000 7ff9cdffb700 1 -- 192.168.123.100:0/3928502047 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7ff9bc000bf0 con 0x7ff9b80779e0
2026-03-10T12:45:09.232 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.232+0000 7ff9d722b700 1 -- 192.168.123.100:0/3928502047 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff9b80779e0 msgr2=0x7ff9b8079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:45:09.232 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.232+0000 7ff9d722b700 1 --2- 192.168.123.100:0/3928502047 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff9b80779e0 0x7ff9b8079e90 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7ff9c4005200 tx=0x7ff9c401a040 comp rx=0 tx=0).stop
2026-03-10T12:45:09.232 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.232+0000 7ff9d722b700 1 -- 192.168.123.100:0/3928502047 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff9d0102780 msgr2=0x7ff9d0193bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:45:09.232 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.232+0000 7ff9d722b700 1 --2- 192.168.123.100:0/3928502047 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff9d0102780 0x7ff9d0193bf0 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7ff9c000ea80 tx=0x7ff9c000ed90 comp rx=0 tx=0).stop
2026-03-10T12:45:09.232 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.233+0000 7ff9d722b700 1 -- 192.168.123.100:0/3928502047 shutdown_connections
2026-03-10T12:45:09.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.233+0000 7ff9d722b700 1 --2- 192.168.123.100:0/3928502047 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff9b80779e0 0x7ff9b8079e90 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:45:09.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.233+0000 7ff9d722b700 1 --2- 192.168.123.100:0/3928502047 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff9d0102780 0x7ff9d0193bf0 unknown :-1 s=CLOSED pgs=158 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:45:09.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.233+0000 7ff9d722b700 1 --2- 192.168.123.100:0/3928502047 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff9d0103980 0x7ff9d0194130 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:45:09.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.233+0000 7ff9d722b700 1 -- 192.168.123.100:0/3928502047 >> 192.168.123.100:0/3928502047 conn(0x7ff9d00fdd50 msgr2=0x7ff9d0106bb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:45:09.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.233+0000 7ff9d722b700 1 -- 192.168.123.100:0/3928502047 shutdown_connections
2026-03-10T12:45:09.233 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.233+0000 7ff9d722b700 1 -- 192.168.123.100:0/3928502047 wait complete.
2026-03-10T12:45:09.303 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps'
2026-03-10T12:45:09.496 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:45:09.757 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.756+0000 7f40cd687700 1 -- 192.168.123.100:0/270020637 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40c8102780 msgr2=0x7f40c8102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:45:09.757 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.756+0000 7f40cd687700 1 --2- 192.168.123.100:0/270020637 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40c8102780 0x7f40c8102bf0 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f40b8009b00 tx=0x7f40b8009e10 comp rx=0 tx=0).stop
2026-03-10T12:45:09.757 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.757+0000 7f40cd687700 1 -- 192.168.123.100:0/270020637 shutdown_connections
2026-03-10T12:45:09.757 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.757+0000 7f40cd687700 1 --2- 192.168.123.100:0/270020637 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40c8102780 0x7f40c8102bf0 unknown :-1 s=CLOSED pgs=159 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:45:09.757 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.757+0000 7f40cd687700 1 --2- 192.168.123.100:0/270020637 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40c8108780 0x7f40c8108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:45:09.757 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.757+0000 7f40cd687700 1 -- 192.168.123.100:0/270020637 >> 192.168.123.100:0/270020637 conn(0x7f40c80fe280 msgr2=0x7f40c8100690 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:45:09.757 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.757+0000 7f40cd687700 1 -- 192.168.123.100:0/270020637 shutdown_connections
2026-03-10T12:45:09.757 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.757+0000 7f40cd687700 1 -- 192.168.123.100:0/270020637 wait complete.
2026-03-10T12:45:09.758 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.758+0000 7f40cd687700 1 Processor -- start
2026-03-10T12:45:09.758 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.758+0000 7f40cd687700 1 -- start start
2026-03-10T12:45:09.758 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.758+0000 7f40cd687700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40c8102780 0x7f40c8198390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:45:09.758 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.758+0000 7f40cd687700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40c8108780 0x7f40c81988d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:45:09.758 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.758+0000 7f40cd687700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f40c8198fb0 con 0x7f40c8108780
2026-03-10T12:45:09.758 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.758+0000 7f40cd687700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f40c819ccf0 con 0x7f40c8102780
2026-03-10T12:45:09.759 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.758+0000 7f40c6ffd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40c8102780 0x7f40c8198390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:45:09.759 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.758+0000 7f40c6ffd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40c8102780 0x7f40c8198390 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:48364/0 (socket says 192.168.123.100:48364)
2026-03-10T12:45:09.759 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.758+0000 7f40c6ffd700 1 -- 192.168.123.100:0/1329818972 learned_addr learned my addr 192.168.123.100:0/1329818972 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:45:09.759 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.759+0000 7f40c67fc700 1 --2- 192.168.123.100:0/1329818972 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40c8108780 0x7f40c81988d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:45:09.759 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.759+0000 7f40c6ffd700 1 -- 192.168.123.100:0/1329818972 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40c8108780 msgr2=0x7f40c81988d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:45:09.759 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.759+0000 7f40c6ffd700 1 --2- 192.168.123.100:0/1329818972 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40c8108780 0x7f40c81988d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:45:09.759 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.759+0000 7f40c6ffd700 1 -- 192.168.123.100:0/1329818972 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f40b80097e0 con 0x7f40c8102780
2026-03-10T12:45:09.759 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.759+0000 7f40c67fc700 1 --2- 192.168.123.100:0/1329818972 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40c8108780 0x7f40c81988d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed!
2026-03-10T12:45:09.759 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.759+0000 7f40c6ffd700 1 --2- 192.168.123.100:0/1329818972 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40c8102780 0x7f40c8198390 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f40b000d900 tx=0x7f40b000dcc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:45:09.759 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.759+0000 7f40bffff700 1 -- 192.168.123.100:0/1329818972 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f40b00041d0 con 0x7f40c8102780
2026-03-10T12:45:09.759 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.759+0000 7f40cd687700 1 -- 192.168.123.100:0/1329818972 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f40c819cfd0 con 0x7f40c8102780
2026-03-10T12:45:09.759 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.759+0000 7f40cd687700 1 -- 192.168.123.100:0/1329818972 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f40c819d520 con 0x7f40c8102780
2026-03-10T12:45:09.760 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.760+0000 7f40bffff700 1 -- 192.168.123.100:0/1329818972 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f40b0004330 con 0x7f40c8102780
2026-03-10T12:45:09.760 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.760+0000 7f40bffff700 1 -- 192.168.123.100:0/1329818972 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f40b0010460 con 0x7f40c8102780
2026-03-10T12:45:09.761 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.760+0000 7f40cd687700 1 -- 192.168.123.100:0/1329818972 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f40a8005320 con 0x7f40c8102780
2026-03-10T12:45:09.761 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.761+0000 7f40bffff700 1 -- 192.168.123.100:0/1329818972 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f40b003ca90 con 0x7f40c8102780
2026-03-10T12:45:09.761 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.762+0000 7f40bffff700 1 --2- 192.168.123.100:0/1329818972 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f40b4077910 0x7f40b4079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:45:09.762 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.762+0000 7f40bffff700 1 -- 192.168.123.100:0/1329818972 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f40b0021030 con 0x7f40c8102780
2026-03-10T12:45:09.762 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.762+0000 7f40c67fc700 1 --2- 192.168.123.100:0/1329818972 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f40b4077910 0x7f40b4079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:45:09.762 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.762+0000 7f40c67fc700 1 --2- 192.168.123.100:0/1329818972 >>
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f40b4077910 0x7f40b4079dc0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f40b800b5c0 tx=0x7f40b8005fb0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:45:09.764 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.764+0000 7f40bffff700 1 -- 192.168.123.100:0/1329818972 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f40b0062270 con 0x7f40c8102780
2026-03-10T12:45:09.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.893+0000 7f40cd687700 1 -- 192.168.123.100:0/1329818972 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f40a8000bf0 con 0x7f40b4077910
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.900+0000 7f40bffff700 1 -- 192.168.123.100:0/1329818972 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f40a8000bf0 con 0x7f40b4077910
2026-03-10T12:45:09.900 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:09 vm00.local ceph-mon[103263]: pgmap v226: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (43s) 31s ago 12m 16.2M - 0.25.0 c8568f914cd2 f2bbecb3fd58
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (69s) 31s ago 12m 10.4M - 19.2.3-678-ge911bdeb 654f31e6858e cb97d867901c
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (67s) 55s ago 11m 10.2M - 19.2.3-678-ge911bdeb 654f31e6858e 04b17a97a05a
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (5m) 31s ago 12m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 88cd1c2e0041
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (5m) 55s ago 11m 7864k - 19.2.3-678-ge911bdeb 654f31e6858e 889e2b356266
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (33s) 31s ago 11m 49.2M - 10.4.0 c8b91775d855 c368865f9b2b
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (105s) 31s ago 9m 23.9M - 19.2.3-678-ge911bdeb 654f31e6858e 6ba265e19d66
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (96s) 31s ago 9m 75.1M - 19.2.3-678-ge911bdeb 654f31e6858e 29b157465a74
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (75s) 55s ago 9m 14.7M - 19.2.3-678-ge911bdeb 654f31e6858e dc7af8899792
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (85s) 55s ago 9m 21.6M - 19.2.3-678-ge911bdeb 654f31e6858e 66059e3b13a4
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:8443,9283,8765 running (6m) 31s ago 12m 640M - 19.2.3-678-ge911bdeb 654f31e6858e eaf11f86af53
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 *:8443,9283,8765 running (6m) 55s ago 11m 491M - 19.2.3-678-ge911bdeb 654f31e6858e 22c93435649c
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (6m) 31s ago 12m 66.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e e8cc5980a849
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (6m) 55s ago 11m 55.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 032abad282fc
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (60s) 31s ago 12m 9042k - 1.7.0 72c9c2088986 2793fc2bcf05
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (56s) 55s ago 11m 3611k - 1.7.0 72c9c2088986 77eb1de1b54e
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (5m) 31s ago 11m 187M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9b151d44f3cf
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (4m) 31s ago 11m 121M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 252ea98c5665
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (3m) 31s ago 10m 130M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 249137e44eb7
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (3m) 55s ago 10m 165M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 72a045e3b78b
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (2m) 55s ago 10m 122M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7ac87e1c2a41
2026-03-10T12:45:09.900 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (2m) 55s ago 10m 129M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bd169bf00834
2026-03-10T12:45:09.901 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (47s) 31s ago 11m 59.7M - 2.51.0 1d3b7f56885b 5577ed86e2fa
2026-03-10T12:45:09.904 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.904+0000 7f40cd687700 1 -- 192.168.123.100:0/1329818972 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f40b4077910 msgr2=0x7f40b4079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:45:09.904 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.904+0000 7f40cd687700 1 --2- 192.168.123.100:0/1329818972 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f40b4077910 0x7f40b4079dc0 secure :-1
s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f40b800b5c0 tx=0x7f40b8005fb0 comp rx=0 tx=0).stop
2026-03-10T12:45:09.904 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.904+0000 7f40cd687700 1 -- 192.168.123.100:0/1329818972 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40c8102780 msgr2=0x7f40c8198390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:45:09.904 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.904+0000 7f40cd687700 1 --2- 192.168.123.100:0/1329818972 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40c8102780 0x7f40c8198390 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f40b000d900 tx=0x7f40b000dcc0 comp rx=0 tx=0).stop
2026-03-10T12:45:09.905 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.904+0000 7f40cd687700 1 -- 192.168.123.100:0/1329818972 shutdown_connections
2026-03-10T12:45:09.905 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.904+0000 7f40cd687700 1 --2- 192.168.123.100:0/1329818972 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f40b4077910 0x7f40b4079dc0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:45:09.905 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.904+0000 7f40cd687700 1 --2- 192.168.123.100:0/1329818972 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f40c8102780 0x7f40c8198390 secure :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f40b000d900 tx=0x7f40b000dcc0 comp rx=0 tx=0).stop
2026-03-10T12:45:09.905 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.904+0000 7f40cd687700 1 --2- 192.168.123.100:0/1329818972 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f40c8108780 0x7f40c81988d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:45:09.905 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.904+0000 7f40cd687700 1 -- 192.168.123.100:0/1329818972 >> 192.168.123.100:0/1329818972 conn(0x7f40c80fe280 msgr2=0x7f40c80ffbd0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:45:09.905 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.905+0000 7f40cd687700 1 -- 192.168.123.100:0/1329818972 shutdown_connections
2026-03-10T12:45:09.905 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:09.905+0000 7f40cd687700 1 -- 192.168.123.100:0/1329818972 wait complete.
2026-03-10T12:45:10.196 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade status'
2026-03-10T12:45:10.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:09 vm07.local ceph-mon[93622]: pgmap v226: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:10.355 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:45:11.011 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.010+0000 7f44d5e36700 1 -- 192.168.123.100:0/2037039582 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44d01013a0 msgr2=0x7f44d0101770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:45:11.011 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.010+0000 7f44d5e36700 1 --2- 192.168.123.100:0/2037039582 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44d01013a0 0x7f44d0101770 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f44b8009b00 tx=0x7f44b8009e10 comp rx=0 tx=0).stop
2026-03-10T12:45:11.011 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.011+0000 7f44d5e36700 1 -- 192.168.123.100:0/2037039582 shutdown_connections
2026-03-10T12:45:11.011 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.011+0000 7f44d5e36700 1 --2- 192.168.123.100:0/2037039582 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f44d0068490 0x7f44d0068900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:45:11.011 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.011+0000 7f44d5e36700 1 --2- 192.168.123.100:0/2037039582 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44d01013a0 0x7f44d0101770 unknown :-1 s=CLOSED pgs=160 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:45:11.012 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.011+0000 7f44d5e36700 1 -- 192.168.123.100:0/2037039582 >> 192.168.123.100:0/2037039582 conn(0x7f44d00754a0 msgr2=0x7f44d00758a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:45:11.012 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.011+0000 7f44d5e36700 1 -- 192.168.123.100:0/2037039582 shutdown_connections
2026-03-10T12:45:11.012 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.011+0000 7f44d5e36700 1 -- 192.168.123.100:0/2037039582 wait complete.
2026-03-10T12:45:11.012 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.012+0000 7f44d5e36700 1 Processor -- start
2026-03-10T12:45:11.012 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.012+0000 7f44d5e36700 1 -- start start
2026-03-10T12:45:11.012 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.012+0000 7f44d5e36700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44d0068490 0x7f44d01982f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:45:11.013 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.013+0000 7f44cf7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44d0068490 0x7f44d01982f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:45:11.013 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.013+0000 7f44cf7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44d0068490 0x7f44d01982f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:34224/0 (socket says 192.168.123.100:34224)
2026-03-10T12:45:11.013 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.013+0000 7f44d5e36700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f44d01013a0 0x7f44d0198830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:45:11.013 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.013+0000 7f44d5e36700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f44d0198f10 con 0x7f44d0068490
2026-03-10T12:45:11.013 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.013+0000 7f44d5e36700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f44d019cca0 con 0x7f44d01013a0
2026-03-10T12:45:11.013 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.013+0000 7f44cf7fe700 1 -- 192.168.123.100:0/889766568 learned_addr learned my addr 192.168.123.100:0/889766568 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:45:11.013 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.013+0000 7f44ceffd700 1 --2- 192.168.123.100:0/889766568 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f44d01013a0 0x7f44d0198830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:45:11.013 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.013+0000 7f44cf7fe700 1 -- 192.168.123.100:0/889766568 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f44d01013a0 msgr2=0x7f44d0198830 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:45:11.013 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.013+0000 7f44cf7fe700 1 --2- 192.168.123.100:0/889766568 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f44d01013a0 0x7f44d0198830 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:45:11.013 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.014+0000 7f44cf7fe700 1 -- 192.168.123.100:0/889766568 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f44b80097e0 con 0x7f44d0068490
2026-03-10T12:45:11.014 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.014+0000 7f44cf7fe700 1 --2- 192.168.123.100:0/889766568 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44d0068490 0x7f44d01982f0 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f44b800b5c0 tx=0x7f44b8004c80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:45:11.014 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.014+0000 7f44ccff9700 1 -- 192.168.123.100:0/889766568 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f44b801d070 con 0x7f44d0068490 2026-03-10T12:45:11.014 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.014+0000 7f44ccff9700 1 -- 192.168.123.100:0/889766568 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f44b800bac0 con 0x7f44d0068490 2026-03-10T12:45:11.015 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.014+0000 7f44d5e36700 1 -- 192.168.123.100:0/889766568 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f44d019cf20 con 0x7f44d0068490 2026-03-10T12:45:11.015 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.014+0000 7f44ccff9700 1 -- 192.168.123.100:0/889766568 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f44b800f700 con 0x7f44d0068490 2026-03-10T12:45:11.015 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.015+0000 7f44d5e36700 1 -- 192.168.123.100:0/889766568 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f44d019d410 con 0x7f44d0068490 2026-03-10T12:45:11.020 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.016+0000 7f44ccff9700 1 -- 192.168.123.100:0/889766568 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f44b800bc30 con 0x7f44d0068490 2026-03-10T12:45:11.020 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.016+0000 7f44d5e36700 1 -- 192.168.123.100:0/889766568 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f44d004ea50 con 0x7f44d0068490 2026-03-10T12:45:11.020 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.017+0000 
7f44ccff9700 1 --2- 192.168.123.100:0/889766568 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f44bc0778c0 0x7f44bc079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:45:11.020 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.017+0000 7f44ccff9700 1 -- 192.168.123.100:0/889766568 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f44b809b180 con 0x7f44d0068490 2026-03-10T12:45:11.020 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.017+0000 7f44ceffd700 1 --2- 192.168.123.100:0/889766568 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f44bc0778c0 0x7f44bc079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:45:11.020 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.019+0000 7f44ceffd700 1 --2- 192.168.123.100:0/889766568 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f44bc0778c0 0x7f44bc079d70 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f44d0199910 tx=0x7f44c000b3c0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:45:11.020 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.020+0000 7f44ccff9700 1 -- 192.168.123.100:0/889766568 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f44b8063980 con 0x7f44d0068490 2026-03-10T12:45:11.145 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:10 vm00.local ceph-mon[103263]: from='client.34456 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:45:11.145 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:10 vm00.local ceph-mon[103263]: 
from='client.44355 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:45:11.145 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.145+0000 7f44d5e36700 1 -- 192.168.123.100:0/889766568 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f44d0199760 con 0x7f44bc0778c0 2026-03-10T12:45:11.148 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.148+0000 7f44ccff9700 1 -- 192.168.123.100:0/889766568 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f44d0199760 con 0x7f44bc0778c0 2026-03-10T12:45:11.149 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:45:11.149 INFO:teuthology.orchestra.run.vm00.stdout: "target_image": null, 2026-03-10T12:45:11.149 INFO:teuthology.orchestra.run.vm00.stdout: "in_progress": false, 2026-03-10T12:45:11.149 INFO:teuthology.orchestra.run.vm00.stdout: "which": "", 2026-03-10T12:45:11.149 INFO:teuthology.orchestra.run.vm00.stdout: "services_complete": [], 2026-03-10T12:45:11.149 INFO:teuthology.orchestra.run.vm00.stdout: "progress": null, 2026-03-10T12:45:11.149 INFO:teuthology.orchestra.run.vm00.stdout: "message": "", 2026-03-10T12:45:11.149 INFO:teuthology.orchestra.run.vm00.stdout: "is_paused": false 2026-03-10T12:45:11.149 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:45:11.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.151+0000 7f44d5e36700 1 -- 192.168.123.100:0/889766568 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f44bc0778c0 msgr2=0x7f44bc079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:45:11.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.151+0000 7f44d5e36700 1 --2- 192.168.123.100:0/889766568 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f44bc0778c0 0x7f44bc079d70 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f44d0199910 tx=0x7f44c000b3c0 comp rx=0 tx=0).stop 2026-03-10T12:45:11.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.151+0000 7f44d5e36700 1 -- 192.168.123.100:0/889766568 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44d0068490 msgr2=0x7f44d01982f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:45:11.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.151+0000 7f44d5e36700 1 --2- 192.168.123.100:0/889766568 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44d0068490 0x7f44d01982f0 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f44b800b5c0 tx=0x7f44b8004c80 comp rx=0 tx=0).stop 2026-03-10T12:45:11.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.151+0000 7f44d5e36700 1 -- 192.168.123.100:0/889766568 shutdown_connections 2026-03-10T12:45:11.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.151+0000 7f44d5e36700 1 --2- 192.168.123.100:0/889766568 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f44bc0778c0 0x7f44bc079d70 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:45:11.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.151+0000 7f44d5e36700 1 --2- 192.168.123.100:0/889766568 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f44d0068490 0x7f44d01982f0 unknown :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:45:11.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.151+0000 7f44d5e36700 1 --2- 192.168.123.100:0/889766568 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f44d01013a0 0x7f44d0198830 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:45:11.151 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.151+0000 7f44d5e36700 1 -- 192.168.123.100:0/889766568 >> 192.168.123.100:0/889766568 conn(0x7f44d00754a0 msgr2=0x7f44d00fdd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:45:11.152 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.152+0000 7f44d5e36700 1 -- 192.168.123.100:0/889766568 shutdown_connections 2026-03-10T12:45:11.152 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.152+0000 7f44d5e36700 1 -- 192.168.123.100:0/889766568 wait complete. 2026-03-10T12:45:11.200 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph health detail' 2026-03-10T12:45:11.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:10 vm07.local ceph-mon[93622]: from='client.34456 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:45:11.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:10 vm07.local ceph-mon[93622]: from='client.44355 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:45:11.363 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:45:11.648 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.647+0000 7ff06e12c700 1 -- 192.168.123.100:0/2888868957 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff068108650 msgr2=0x7ff068108a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:45:11.648 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.647+0000 7ff06e12c700 1 --2- 192.168.123.100:0/2888868957 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7ff068108650 0x7ff068108a20 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7ff064009b00 tx=0x7ff064009e10 comp rx=0 tx=0).stop 2026-03-10T12:45:11.648 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.647+0000 7ff06e12c700 1 -- 192.168.123.100:0/2888868957 shutdown_connections 2026-03-10T12:45:11.648 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.647+0000 7ff06e12c700 1 --2- 192.168.123.100:0/2888868957 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff068102520 0x7ff068102990 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:45:11.648 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.647+0000 7ff06e12c700 1 --2- 192.168.123.100:0/2888868957 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff068108650 0x7ff068108a20 unknown :-1 s=CLOSED pgs=162 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:45:11.648 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.647+0000 7ff06e12c700 1 -- 192.168.123.100:0/2888868957 >> 192.168.123.100:0/2888868957 conn(0x7ff0680fdfe0 msgr2=0x7ff0681003f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:45:11.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.649+0000 7ff06e12c700 1 -- 192.168.123.100:0/2888868957 shutdown_connections 2026-03-10T12:45:11.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.649+0000 7ff06e12c700 1 -- 192.168.123.100:0/2888868957 wait complete. 
2026-03-10T12:45:11.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.649+0000 7ff06e12c700 1 Processor -- start 2026-03-10T12:45:11.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.649+0000 7ff06e12c700 1 -- start start 2026-03-10T12:45:11.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.649+0000 7ff06e12c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff068102520 0x7ff0681981a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:45:11.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.649+0000 7ff06e12c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff068108650 0x7ff0681986e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:45:11.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.649+0000 7ff06e12c700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff068198dc0 con 0x7ff068102520 2026-03-10T12:45:11.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.649+0000 7ff06e12c700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff06819cb50 con 0x7ff068108650 2026-03-10T12:45:11.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.650+0000 7ff06c929700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff068108650 0x7ff0681986e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:45:11.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.650+0000 7ff06c929700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff068108650 0x7ff0681986e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:48396/0 (socket says 192.168.123.100:48396) 2026-03-10T12:45:11.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.650+0000 7ff06c929700 1 -- 192.168.123.100:0/976546889 learned_addr learned my addr 192.168.123.100:0/976546889 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:45:11.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.650+0000 7ff06c929700 1 -- 192.168.123.100:0/976546889 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff068102520 msgr2=0x7ff0681981a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:45:11.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.650+0000 7ff06c929700 1 --2- 192.168.123.100:0/976546889 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff068102520 0x7ff0681981a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:45:11.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.650+0000 7ff06c929700 1 -- 192.168.123.100:0/976546889 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff0640097e0 con 0x7ff068108650 2026-03-10T12:45:11.650 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.650+0000 7ff06c929700 1 --2- 192.168.123.100:0/976546889 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff068108650 0x7ff0681986e0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7ff05800d8d0 tx=0x7ff05800dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:45:11.651 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.651+0000 7ff05e7fc700 1 -- 192.168.123.100:0/976546889 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff058009940 con 0x7ff068108650 2026-03-10T12:45:11.651 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.651+0000 7ff05e7fc700 1 -- 
192.168.123.100:0/976546889 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff058010460 con 0x7ff068108650 2026-03-10T12:45:11.651 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.651+0000 7ff05e7fc700 1 -- 192.168.123.100:0/976546889 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff05800f5d0 con 0x7ff068108650 2026-03-10T12:45:11.651 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.651+0000 7ff06e12c700 1 -- 192.168.123.100:0/976546889 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff06819ce30 con 0x7ff068108650 2026-03-10T12:45:11.652 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.652+0000 7ff06e12c700 1 -- 192.168.123.100:0/976546889 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff06819d350 con 0x7ff068108650 2026-03-10T12:45:11.652 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.652+0000 7ff06e12c700 1 -- 192.168.123.100:0/976546889 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff06804f2a0 con 0x7ff068108650 2026-03-10T12:45:11.653 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.653+0000 7ff05e7fc700 1 -- 192.168.123.100:0/976546889 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff058009aa0 con 0x7ff068108650 2026-03-10T12:45:11.653 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.653+0000 7ff05e7fc700 1 --2- 192.168.123.100:0/976546889 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff054077700 0x7ff054079bb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:45:11.653 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.653+0000 7ff05e7fc700 1 -- 192.168.123.100:0/976546889 <== 
mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7ff058020030 con 0x7ff068108650 2026-03-10T12:45:11.656 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.656+0000 7ff05e7fc700 1 -- 192.168.123.100:0/976546889 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff058061c50 con 0x7ff068108650 2026-03-10T12:45:11.660 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.659+0000 7ff06d12a700 1 --2- 192.168.123.100:0/976546889 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff054077700 0x7ff054079bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:45:11.660 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.659+0000 7ff06d12a700 1 --2- 192.168.123.100:0/976546889 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff054077700 0x7ff054079bb0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7ff06400b5c0 tx=0x7ff064009f90 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:45:11.860 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.859+0000 7ff06e12c700 1 -- 192.168.123.100:0/976546889 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7ff06804ea50 con 0x7ff068108650 2026-03-10T12:45:11.860 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.860+0000 7ff05e7fc700 1 -- 192.168.123.100:0/976546889 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+10 (secure 0 0 0) 0x7ff058061a70 con 0x7ff068108650 2026-03-10T12:45:11.861 INFO:teuthology.orchestra.run.vm00.stdout:HEALTH_OK 2026-03-10T12:45:11.864 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.864+0000 7ff053fff700 1 -- 192.168.123.100:0/976546889 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff054077700 msgr2=0x7ff054079bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:45:11.864 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.864+0000 7ff053fff700 1 --2- 192.168.123.100:0/976546889 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff054077700 0x7ff054079bb0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7ff06400b5c0 tx=0x7ff064009f90 comp rx=0 tx=0).stop 2026-03-10T12:45:11.864 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.864+0000 7ff053fff700 1 -- 192.168.123.100:0/976546889 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff068108650 msgr2=0x7ff0681986e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:45:11.864 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.864+0000 7ff053fff700 1 --2- 192.168.123.100:0/976546889 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff068108650 0x7ff0681986e0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7ff05800d8d0 tx=0x7ff05800dc90 comp rx=0 tx=0).stop 2026-03-10T12:45:11.864 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.864+0000 7ff053fff700 1 -- 192.168.123.100:0/976546889 shutdown_connections 2026-03-10T12:45:11.865 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.865+0000 7ff053fff700 1 --2- 192.168.123.100:0/976546889 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff054077700 0x7ff054079bb0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:45:11.865 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.865+0000 7ff053fff700 1 --2- 192.168.123.100:0/976546889 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff068102520 
0x7ff0681981a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:45:11.865 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.865+0000 7ff053fff700 1 --2- 192.168.123.100:0/976546889 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff068108650 0x7ff0681986e0 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:45:11.865 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.865+0000 7ff053fff700 1 -- 192.168.123.100:0/976546889 >> 192.168.123.100:0/976546889 conn(0x7ff0680fdfe0 msgr2=0x7ff0680ff830 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:45:11.865 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.865+0000 7ff053fff700 1 -- 192.168.123.100:0/976546889 shutdown_connections 2026-03-10T12:45:11.865 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:11.865+0000 7ff053fff700 1 -- 192.168.123.100:0/976546889 wait complete. 2026-03-10T12:45:11.926 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-10T12:45:12.089 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:45:12.117 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:11 vm00.local ceph-mon[103263]: pgmap v227: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:45:12.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:11 vm07.local ceph-mon[93622]: pgmap v227: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:45:12.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.349+0000 7f205f2b5700 1 -- 192.168.123.100:0/1485369317 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f20581016e0 msgr2=0x7f2058101ab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:45:12.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.349+0000 7f205f2b5700 1 --2- 192.168.123.100:0/1485369317 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f20581016e0 0x7f2058101ab0 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f2048009b00 tx=0x7f2048009e10 comp rx=0 tx=0).stop 2026-03-10T12:45:12.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.350+0000 7f205f2b5700 1 -- 192.168.123.100:0/1485369317 shutdown_connections 2026-03-10T12:45:12.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.350+0000 7f205f2b5700 1 --2- 192.168.123.100:0/1485369317 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2058101ff0 0x7f205810a4f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:45:12.350 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.350+0000 7f205f2b5700 1 --2- 192.168.123.100:0/1485369317 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f20581016e0 0x7f2058101ab0 unknown :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:45:12.351 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.350+0000 7f205f2b5700 1 -- 192.168.123.100:0/1485369317 >> 192.168.123.100:0/1485369317 conn(0x7f20580faf00 msgr2=0x7f20580fd310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:45:12.351 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.351+0000 7f205f2b5700 1 -- 192.168.123.100:0/1485369317 shutdown_connections 2026-03-10T12:45:12.351 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.351+0000 7f205f2b5700 1 -- 192.168.123.100:0/1485369317 wait complete. 
2026-03-10T12:45:12.351 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.351+0000 7f205f2b5700 1 Processor -- start 2026-03-10T12:45:12.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.352+0000 7f205f2b5700 1 -- start start 2026-03-10T12:45:12.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.352+0000 7f205f2b5700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f20581016e0 0x7f20581982c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:45:12.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.352+0000 7f205f2b5700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2058101ff0 0x7f2058198800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:45:12.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.352+0000 7f205f2b5700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2058198ee0 con 0x7f2058101ff0 2026-03-10T12:45:12.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.352+0000 7f205f2b5700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f205819cc70 con 0x7f20581016e0 2026-03-10T12:45:12.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.352+0000 7f205d051700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f20581016e0 0x7f20581982c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:45:12.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.352+0000 7f205d051700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f20581016e0 0x7f20581982c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:48416/0 (socket says 192.168.123.100:48416) 2026-03-10T12:45:12.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.352+0000 7f205d051700 1 -- 192.168.123.100:0/3901482822 learned_addr learned my addr 192.168.123.100:0/3901482822 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:45:12.352 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.352+0000 7f205d051700 1 -- 192.168.123.100:0/3901482822 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2058101ff0 msgr2=0x7f2058198800 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:45:12.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.353+0000 7f205c850700 1 --2- 192.168.123.100:0/3901482822 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2058101ff0 0x7f2058198800 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:45:12.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.353+0000 7f205d051700 1 --2- 192.168.123.100:0/3901482822 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2058101ff0 0x7f2058198800 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:45:12.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.353+0000 7f205d051700 1 -- 192.168.123.100:0/3901482822 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f20480097e0 con 0x7f20581016e0 2026-03-10T12:45:12.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.353+0000 7f205d051700 1 --2- 192.168.123.100:0/3901482822 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f20581016e0 0x7f20581982c0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f2048005fd0 tx=0x7f2048004ca0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:45:12.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.353+0000 7f205c850700 1 --2- 192.168.123.100:0/3901482822 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2058101ff0 0x7f2058198800 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-10T12:45:12.353 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.353+0000 7f204e7fc700 1 -- 192.168.123.100:0/3901482822 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f204801d070 con 0x7f20581016e0
2026-03-10T12:45:12.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.353+0000 7f204e7fc700 1 -- 192.168.123.100:0/3901482822 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f204800bde0 con 0x7f20581016e0
2026-03-10T12:45:12.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.353+0000 7f205f2b5700 1 -- 192.168.123.100:0/3901482822 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f205819cef0 con 0x7f20581016e0
2026-03-10T12:45:12.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.353+0000 7f205f2b5700 1 -- 192.168.123.100:0/3901482822 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f205819d3e0 con 0x7f20581016e0
2026-03-10T12:45:12.354 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.354+0000 7f204e7fc700 1 -- 192.168.123.100:0/3901482822 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2048021b10 con 0x7f20581016e0
2026-03-10T12:45:12.355 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.354+0000 7f205f2b5700 1 -- 192.168.123.100:0/3901482822 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f205810a020 con 0x7f20581016e0
2026-03-10T12:45:12.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.355+0000 7f204e7fc700 1 -- 192.168.123.100:0/3901482822 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f204800f4e0 con 0x7f20581016e0
2026-03-10T12:45:12.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.355+0000 7f204e7fc700 1 --2- 192.168.123.100:0/3901482822 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f20440778c0 0x7f2044079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:45:12.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.355+0000 7f204e7fc700 1 -- 192.168.123.100:0/3901482822 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f204809b830 con 0x7f20581016e0
2026-03-10T12:45:12.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.356+0000 7f205c850700 1 --2- 192.168.123.100:0/3901482822 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f20440778c0 0x7f2044079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:45:12.356 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.356+0000 7f205c850700 1 --2- 192.168.123.100:0/3901482822 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f20440778c0 0x7f2044079d70 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f2054005fd0 tx=0x7f2054005ee0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:45:12.357 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.357+0000 7f204e7fc700 1 -- 192.168.123.100:0/3901482822 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2048064030 con 0x7f20581016e0
2026-03-10T12:45:12.524 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.524+0000 7f205f2b5700 1 -- 192.168.123.100:0/3901482822 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f20580689d0 con 0x7f20581016e0
2026-03-10T12:45:12.524 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.524+0000 7f204e7fc700 1 -- 192.168.123.100:0/3901482822 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f204800fe60 con 0x7f20581016e0
2026-03-10T12:45:12.525 INFO:teuthology.orchestra.run.vm00.stdout:{
2026-03-10T12:45:12.525 INFO:teuthology.orchestra.run.vm00.stdout: "mon": {
2026-03-10T12:45:12.525 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T12:45:12.525 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:45:12.525 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": {
2026-03-10T12:45:12.525 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T12:45:12.525 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:45:12.525 INFO:teuthology.orchestra.run.vm00.stdout: "osd": {
2026-03-10T12:45:12.526 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6
2026-03-10T12:45:12.526 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:45:12.526 INFO:teuthology.orchestra.run.vm00.stdout: "mds": {
2026-03-10T12:45:12.526 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4
2026-03-10T12:45:12.526 INFO:teuthology.orchestra.run.vm00.stdout: },
2026-03-10T12:45:12.526 INFO:teuthology.orchestra.run.vm00.stdout: "overall": {
2026-03-10T12:45:12.526 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14
2026-03-10T12:45:12.526 INFO:teuthology.orchestra.run.vm00.stdout: }
2026-03-10T12:45:12.526 INFO:teuthology.orchestra.run.vm00.stdout:}
2026-03-10T12:45:12.528 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.528+0000 7f205f2b5700 1 -- 192.168.123.100:0/3901482822 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f20440778c0 msgr2=0x7f2044079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:45:12.528 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.528+0000 7f205f2b5700 1 --2- 192.168.123.100:0/3901482822 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f20440778c0 0x7f2044079d70 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f2054005fd0 tx=0x7f2054005ee0 comp rx=0 tx=0).stop
2026-03-10T12:45:12.528 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.528+0000 7f205f2b5700 1 -- 192.168.123.100:0/3901482822 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f20581016e0 msgr2=0x7f20581982c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:45:12.528 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.528+0000 7f205f2b5700 1 --2- 192.168.123.100:0/3901482822 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f20581016e0 0x7f20581982c0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f2048005fd0 tx=0x7f2048004ca0 comp rx=0 tx=0).stop
2026-03-10T12:45:12.528 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.528+0000 7f205f2b5700 1 -- 192.168.123.100:0/3901482822 shutdown_connections
2026-03-10T12:45:12.528 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.528+0000 7f205f2b5700 1 --2- 192.168.123.100:0/3901482822 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f20440778c0 0x7f2044079d70 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:45:12.528 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.528+0000 7f205f2b5700 1 --2- 192.168.123.100:0/3901482822 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f20581016e0 0x7f20581982c0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:45:12.528 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.528+0000 7f205f2b5700 1 --2- 192.168.123.100:0/3901482822 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2058101ff0 0x7f2058198800 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:45:12.528 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.528+0000 7f205f2b5700 1 -- 192.168.123.100:0/3901482822 >> 192.168.123.100:0/3901482822 conn(0x7f20580faf00 msgr2=0x7f20580ffae0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:45:12.528 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.528+0000 7f205f2b5700 1 -- 192.168.123.100:0/3901482822 shutdown_connections
2026-03-10T12:45:12.529 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:45:12.529+0000 7f205f2b5700 1 -- 192.168.123.100:0/3901482822 wait complete.
2026-03-10T12:45:12.572 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'echo "wait for servicemap items w/ changing names to refresh"'
2026-03-10T12:45:12.727 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:45:12.917 INFO:teuthology.orchestra.run.vm00.stdout:wait for servicemap items w/ changing names to refresh
2026-03-10T12:45:12.948 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'sleep 60'
2026-03-10T12:45:13.101 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:45:13.127 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:12 vm00.local ceph-mon[103263]: from='client.34464 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T12:45:13.127 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:12 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/976546889' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-10T12:45:13.127 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:12 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/3901482822' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:45:13.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:12 vm07.local ceph-mon[93622]: from='client.34464 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T12:45:13.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:12 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/976546889' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-10T12:45:13.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:12 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/3901482822' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T12:45:14.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:14 vm00.local ceph-mon[103263]: pgmap v228: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:14.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:14 vm07.local ceph-mon[93622]: pgmap v228: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:15.815 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:15 vm07.local ceph-mon[93622]: pgmap v229: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:15.983 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:15 vm00.local ceph-mon[103263]: pgmap v229: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:16.815 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:45:16.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:45:17.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:17 vm00.local ceph-mon[103263]: pgmap v230: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:18.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:17 vm07.local ceph-mon[93622]: pgmap v230: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:19.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:19 vm07.local ceph-mon[93622]: pgmap v231: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:19.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:19 vm00.local ceph-mon[103263]: pgmap v231: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:21.984 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:21 vm00.local ceph-mon[103263]: pgmap v232: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:22.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:21 vm07.local ceph-mon[93622]: pgmap v232: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:24.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:24 vm07.local ceph-mon[93622]: pgmap v233: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:24.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:24 vm00.local ceph-mon[103263]: pgmap v233: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:25.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:25 vm07.local ceph-mon[93622]: pgmap v234: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:25.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:25 vm00.local ceph-mon[103263]: pgmap v234: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:28.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:27 vm00.local ceph-mon[103263]: pgmap v235: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:28.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:27 vm07.local ceph-mon[93622]: pgmap v235: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:30.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:29 vm00.local ceph-mon[103263]: pgmap v236: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:30.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:29 vm07.local ceph-mon[93622]: pgmap v236: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:31.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:45:31.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:45:32.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:31 vm00.local ceph-mon[103263]: pgmap v237: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:32.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:31 vm07.local ceph-mon[93622]: pgmap v237: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:34.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:33 vm07.local ceph-mon[93622]: pgmap v238: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:34.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:33 vm00.local ceph-mon[103263]: pgmap v238: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:36.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:35 vm00.local ceph-mon[103263]: pgmap v239: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:36.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:35 vm07.local ceph-mon[93622]: pgmap v239: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:38.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:37 vm00.local ceph-mon[103263]: pgmap v240: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:38.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:37 vm07.local ceph-mon[93622]: pgmap v240: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:40.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:39 vm00.local ceph-mon[103263]: pgmap v241: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:40.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:39 vm07.local ceph-mon[93622]: pgmap v241: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:41.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:40 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:45:41.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:40 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:45:42.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:41 vm00.local ceph-mon[103263]: pgmap v242: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:42.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:45:42.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:45:42.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:45:42.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:41 vm07.local ceph-mon[93622]: pgmap v242: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:42.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:45:42.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:45:42.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:45:44.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:43 vm07.local ceph-mon[93622]: pgmap v243: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:44.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:43 vm00.local ceph-mon[103263]: pgmap v243: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:45.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:45 vm00.local ceph-mon[103263]: pgmap v244: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:45.565 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:45 vm07.local ceph-mon[93622]: pgmap v244: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:46.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:46 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:45:46.565 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:46 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:45:47.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:47 vm00.local ceph-mon[103263]: pgmap v245: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:47.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:47 vm07.local ceph-mon[93622]: pgmap v245: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:50.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:49 vm07.local ceph-mon[93622]: pgmap v246: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:50.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:49 vm00.local ceph-mon[103263]: pgmap v246: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:52.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:51 vm07.local ceph-mon[93622]: pgmap v247: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:52.077 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:51 vm00.local ceph-mon[103263]: pgmap v247: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:54.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:53 vm07.local ceph-mon[93622]: pgmap v248: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:54.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:53 vm00.local ceph-mon[103263]: pgmap v248: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:56.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:55 vm07.local ceph-mon[93622]: pgmap v249: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:56.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:55 vm00.local ceph-mon[103263]: pgmap v249: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:58.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:57 vm00.local ceph-mon[103263]: pgmap v250: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:45:58.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:57 vm07.local ceph-mon[93622]: pgmap v250: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:46:00.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:45:59 vm00.local ceph-mon[103263]: pgmap v251: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:46:00.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:45:59 vm07.local ceph-mon[93622]: pgmap v251: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:46:01.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:46:01.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:00 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:46:02.181 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:01 vm00.local ceph-mon[103263]: pgmap v252: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:46:02.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:01 vm07.local ceph-mon[93622]: pgmap v252: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:46:04.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:03 vm00.local ceph-mon[103263]: pgmap v253: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:46:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:03 vm07.local ceph-mon[93622]: pgmap v253: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:46:06.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:05 vm00.local ceph-mon[103263]: pgmap v254: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:46:06.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:05 vm07.local ceph-mon[93622]: pgmap v254: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:46:08.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:07 vm00.local ceph-mon[103263]: pgmap v255: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:46:08.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:07 vm07.local ceph-mon[93622]: pgmap v255: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:46:10.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:09 vm00.local ceph-mon[103263]: pgmap v256: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:46:10.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:09 vm07.local ceph-mon[93622]: pgmap v256: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:46:12.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:11 vm00.local ceph-mon[103263]: pgmap v257: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:46:12.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:11 vm07.local ceph-mon[93622]: pgmap v257: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:46:13.344 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps'
2026-03-10T12:46:13.490 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:46:13.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.736+0000 7f5653893700 1 -- 192.168.123.100:0/820771994 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f564c0fff00 msgr2=0x7f564c100370 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T12:46:13.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.736+0000 7f5653893700 1 --2- 192.168.123.100:0/820771994 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f564c0fff00 0x7f564c100370 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f5648009b00 tx=0x7f5648009e10 comp rx=0 tx=0).stop
2026-03-10T12:46:13.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.737+0000 7f5653893700 1 -- 192.168.123.100:0/820771994 shutdown_connections
2026-03-10T12:46:13.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.737+0000 7f5653893700 1 --2- 192.168.123.100:0/820771994 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f564c0fff00 0x7f564c100370 unknown :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:46:13.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.737+0000 7f5653893700 1 --2- 192.168.123.100:0/820771994 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f564c104520 0x7f564c1048f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:46:13.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.737+0000 7f5653893700 1 -- 192.168.123.100:0/820771994 >> 192.168.123.100:0/820771994 conn(0x7f564c0754a0 msgr2=0x7f564c0758a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T12:46:13.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.738+0000 7f5653893700 1 -- 192.168.123.100:0/820771994 shutdown_connections
2026-03-10T12:46:13.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.738+0000 7f5653893700 1 -- 192.168.123.100:0/820771994 wait complete.
2026-03-10T12:46:13.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.738+0000 7f5653893700 1 Processor -- start
2026-03-10T12:46:13.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.738+0000 7f5653893700 1 -- start start
2026-03-10T12:46:13.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.739+0000 7f5653893700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f564c0fff00 0x7f564c193fb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:46:13.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.739+0000 7f5653893700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f564c104520 0x7f564c1944f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:46:13.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.739+0000 7f5653893700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f564c194bd0 con 0x7f564c104520
2026-03-10T12:46:13.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.739+0000 7f5653893700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f564c198960 con 0x7f564c0fff00
2026-03-10T12:46:13.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.739+0000 7f5650e2e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f564c104520 0x7f564c1944f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:46:13.740 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.739+0000 7f5650e2e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f564c104520 0x7f564c1944f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:47232/0 (socket says 192.168.123.100:47232)
2026-03-10T12:46:13.740 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.739+0000 7f5650e2e700 1 -- 192.168.123.100:0/471210661 learned_addr learned my addr 192.168.123.100:0/471210661 (peer_addr_for_me v2:192.168.123.100:0/0)
2026-03-10T12:46:13.740 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.739+0000 7f5650e2e700 1 -- 192.168.123.100:0/471210661 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f564c0fff00 msgr2=0x7f564c193fb0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down
2026-03-10T12:46:13.740 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.739+0000 7f5650e2e700 1 --2- 192.168.123.100:0/471210661 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f564c0fff00 0x7f564c193fb0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T12:46:13.740 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.739+0000 7f5650e2e700 1 -- 192.168.123.100:0/471210661 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f56480097e0 con 0x7f564c104520
2026-03-10T12:46:13.740 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.739+0000 7f5650e2e700 1 --2- 192.168.123.100:0/471210661 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f564c104520 0x7f564c1944f0 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7f5648004930 tx=0x7f5648004a10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:46:13.741 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.739+0000 7f56427fc700 1 -- 192.168.123.100:0/471210661 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f564801d070 con 0x7f564c104520
2026-03-10T12:46:13.741 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.739+0000 7f56427fc700 1 -- 192.168.123.100:0/471210661 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f564800bc50 con 0x7f564c104520
2026-03-10T12:46:13.741 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.739+0000 7f56427fc700 1 -- 192.168.123.100:0/471210661 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f564800f790 con 0x7f564c104520
2026-03-10T12:46:13.741 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.740+0000 7f5653893700 1 -- 192.168.123.100:0/471210661 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f564c198be0 con 0x7f564c104520
2026-03-10T12:46:13.741 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.740+0000 7f5653893700 1 -- 192.168.123.100:0/471210661 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f564c1990d0 con 0x7f564c104520
2026-03-10T12:46:13.741 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.741+0000 7f5653893700 1 -- 192.168.123.100:0/471210661 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f564c04ea50 con 0x7f564c104520
2026-03-10T12:46:13.745 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.744+0000 7f56427fc700 1 -- 192.168.123.100:0/471210661 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5648022ae0 con 0x7f564c104520
2026-03-10T12:46:13.745 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.744+0000 7f56427fc700 1 --2- 192.168.123.100:0/471210661 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f563807bd30 0x7f563807e1e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T12:46:13.745 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.744+0000 7f56427fc700 1 -- 192.168.123.100:0/471210661 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f564809bd80 con 0x7f564c104520
2026-03-10T12:46:13.745 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.745+0000 7f565162f700 1 --2- 192.168.123.100:0/471210661 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f563807bd30 0x7f563807e1e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T12:46:13.745 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.745+0000 7f565162f700 1 --2- 192.168.123.100:0/471210661 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f563807bd30 0x7f563807e1e0 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f563c00a910 tx=0x7f563c005c10 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T12:46:13.745 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.745+0000 7f56427fc700 1 -- 192.168.123.100:0/471210661 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f564809c190 con 0x7f564c104520
2026-03-10T12:46:13.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.871+0000 7f5653893700 1 -- 192.168.123.100:0/471210661 --> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f564c195420 con 0x7f563807bd30
2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.880+0000 7f56427fc700 1 -- 192.168.123.100:0/471210661 <== mgr.24563 v2:192.168.123.100:6800/464552988 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f564c195420 con 0x7f563807bd30
2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM
LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:alertmanager.vm00 vm00 *:9093,9094 running (107s) 95s ago 13m 16.2M - 0.25.0 c8568f914cd2 f2bbecb3fd58 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm00 vm00 running (2m) 95s ago 13m 10.4M - 19.2.3-678-ge911bdeb 654f31e6858e cb97d867901c 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:ceph-exporter.vm07 vm07 running (2m) 119s ago 12m 10.2M - 19.2.3-678-ge911bdeb 654f31e6858e 04b17a97a05a 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm00 vm00 running (6m) 95s ago 13m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e 88cd1c2e0041 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:crash.vm07 vm07 running (6m) 119s ago 12m 7864k - 19.2.3-678-ge911bdeb 654f31e6858e 889e2b356266 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:grafana.vm00 vm00 *:3000 running (97s) 95s ago 13m 49.2M - 10.4.0 c8b91775d855 c368865f9b2b 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.lnokoe vm00 running (2m) 95s ago 11m 23.9M - 19.2.3-678-ge911bdeb 654f31e6858e 6ba265e19d66 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm00.wdwvcu vm00 running (2m) 95s ago 11m 75.1M - 19.2.3-678-ge911bdeb 654f31e6858e 29b157465a74 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.rhzwnr vm07 running (2m) 119s ago 11m 14.7M - 19.2.3-678-ge911bdeb 654f31e6858e dc7af8899792 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:mds.cephfs.vm07.wznhgu vm07 running (2m) 119s ago 11m 21.6M - 19.2.3-678-ge911bdeb 654f31e6858e 66059e3b13a4 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm00.nescmq vm00 *:8443,9283,8765 running (7m) 95s ago 13m 640M - 19.2.3-678-ge911bdeb 654f31e6858e eaf11f86af53 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:mgr.vm07.kfawlb vm07 
*:8443,9283,8765 running (7m) 119s ago 12m 491M - 19.2.3-678-ge911bdeb 654f31e6858e 22c93435649c 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm00 vm00 running (7m) 95s ago 13m 66.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e e8cc5980a849 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:mon.vm07 vm07 running (7m) 119s ago 12m 55.9M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 032abad282fc 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm00 vm00 *:9100 running (2m) 95s ago 13m 9042k - 1.7.0 72c9c2088986 2793fc2bcf05 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:node-exporter.vm07 vm07 *:9100 running (2m) 119s ago 12m 3611k - 1.7.0 72c9c2088986 77eb1de1b54e 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:osd.0 vm00 running (6m) 95s ago 12m 187M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 9b151d44f3cf 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:osd.1 vm00 running (5m) 95s ago 12m 121M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 252ea98c5665 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:osd.2 vm00 running (4m) 95s ago 11m 130M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 249137e44eb7 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:osd.3 vm07 running (4m) 119s ago 11m 165M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 72a045e3b78b 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:osd.4 vm07 running (4m) 119s ago 11m 122M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7ac87e1c2a41 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:osd.5 vm07 running (3m) 119s ago 11m 129M 4096M 19.2.3-678-ge911bdeb 654f31e6858e bd169bf00834 2026-03-10T12:46:13.880 INFO:teuthology.orchestra.run.vm00.stdout:prometheus.vm00 vm00 *:9095 running (111s) 95s ago 12m 59.7M - 2.51.0 1d3b7f56885b 5577ed86e2fa 2026-03-10T12:46:13.883 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.883+0000 7f5653893700 1 -- 
192.168.123.100:0/471210661 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f563807bd30 msgr2=0x7f563807e1e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:13.883 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.883+0000 7f5653893700 1 --2- 192.168.123.100:0/471210661 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f563807bd30 0x7f563807e1e0 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f563c00a910 tx=0x7f563c005c10 comp rx=0 tx=0).stop 2026-03-10T12:46:13.883 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.883+0000 7f5653893700 1 -- 192.168.123.100:0/471210661 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f564c104520 msgr2=0x7f564c1944f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:13.883 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.883+0000 7f5653893700 1 --2- 192.168.123.100:0/471210661 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f564c104520 0x7f564c1944f0 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7f5648004930 tx=0x7f5648004a10 comp rx=0 tx=0).stop 2026-03-10T12:46:13.883 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.883+0000 7f5653893700 1 -- 192.168.123.100:0/471210661 shutdown_connections 2026-03-10T12:46:13.883 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.883+0000 7f5653893700 1 --2- 192.168.123.100:0/471210661 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f563807bd30 0x7f563807e1e0 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:13.883 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.883+0000 7f5653893700 1 --2- 192.168.123.100:0/471210661 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f564c0fff00 0x7f564c193fb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T12:46:13.883 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.883+0000 7f5653893700 1 --2- 192.168.123.100:0/471210661 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f564c104520 0x7f564c1944f0 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:13.883 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.883+0000 7f5653893700 1 -- 192.168.123.100:0/471210661 >> 192.168.123.100:0/471210661 conn(0x7f564c0754a0 msgr2=0x7f564c0feba0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:13.884 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.883+0000 7f5653893700 1 -- 192.168.123.100:0/471210661 shutdown_connections 2026-03-10T12:46:13.884 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:13.884+0000 7f5653893700 1 -- 192.168.123.100:0/471210661 wait complete. 2026-03-10T12:46:13.931 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-10T12:46:14.080 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:14.124 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:13 vm00.local ceph-mon[103263]: pgmap v258: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:14.307 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.307+0000 7f65486ca700 1 -- 192.168.123.100:0/857022941 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6540073a00 msgr2=0x7f6540110ff0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:14.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.307+0000 7f65486ca700 1 --2- 
192.168.123.100:0/857022941 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6540073a00 0x7f6540110ff0 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7f653c009b50 tx=0x7f653c009e60 comp rx=0 tx=0).stop 2026-03-10T12:46:14.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.308+0000 7f65486ca700 1 -- 192.168.123.100:0/857022941 shutdown_connections 2026-03-10T12:46:14.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.308+0000 7f65486ca700 1 --2- 192.168.123.100:0/857022941 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6540073a00 0x7f6540110ff0 unknown :-1 s=CLOSED pgs=166 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:14.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.308+0000 7f65486ca700 1 --2- 192.168.123.100:0/857022941 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65400730f0 0x7f65400734c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:14.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.308+0000 7f65486ca700 1 -- 192.168.123.100:0/857022941 >> 192.168.123.100:0/857022941 conn(0x7f65400fc000 msgr2=0x7f65400fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:14.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.308+0000 7f65486ca700 1 -- 192.168.123.100:0/857022941 shutdown_connections 2026-03-10T12:46:14.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.308+0000 7f65486ca700 1 -- 192.168.123.100:0/857022941 wait complete. 
2026-03-10T12:46:14.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.308+0000 7f65486ca700 1 Processor -- start 2026-03-10T12:46:14.308 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.308+0000 7f65486ca700 1 -- start start 2026-03-10T12:46:14.309 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.309+0000 7f65486ca700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65400730f0 0x7f65401a24f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:14.309 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.309+0000 7f65486ca700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6540073a00 0x7f65401a2a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:14.309 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.309+0000 7f65486ca700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f65401a30c0 con 0x7f6540073a00 2026-03-10T12:46:14.309 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.309+0000 7f65486ca700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f654019c570 con 0x7f65400730f0 2026-03-10T12:46:14.309 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.309+0000 7f6546466700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65400730f0 0x7f65401a24f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:14.309 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.309+0000 7f6546466700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65400730f0 0x7f65401a24f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:60128/0 (socket says 192.168.123.100:60128) 2026-03-10T12:46:14.309 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.309+0000 7f6546466700 1 -- 192.168.123.100:0/2472143262 learned_addr learned my addr 192.168.123.100:0/2472143262 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:14.309 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.309+0000 7f6545c65700 1 --2- 192.168.123.100:0/2472143262 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6540073a00 0x7f65401a2a30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:14.309 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.309+0000 7f6546466700 1 -- 192.168.123.100:0/2472143262 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6540073a00 msgr2=0x7f65401a2a30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:14.309 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.310+0000 7f6546466700 1 --2- 192.168.123.100:0/2472143262 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6540073a00 0x7f65401a2a30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:14.310 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.310+0000 7f6546466700 1 -- 192.168.123.100:0/2472143262 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f653c0097e0 con 0x7f65400730f0 2026-03-10T12:46:14.310 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.310+0000 7f6545c65700 1 --2- 192.168.123.100:0/2472143262 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6540073a00 0x7f65401a2a30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T12:46:14.310 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.310+0000 7f6546466700 1 --2- 192.168.123.100:0/2472143262 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65400730f0 0x7f65401a24f0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f653000d8d0 tx=0x7f653000dbe0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:14.310 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.310+0000 7f65377fe700 1 -- 192.168.123.100:0/2472143262 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6530009880 con 0x7f65400730f0 2026-03-10T12:46:14.310 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.310+0000 7f65486ca700 1 -- 192.168.123.100:0/2472143262 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f654019c850 con 0x7f65400730f0 2026-03-10T12:46:14.310 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.310+0000 7f65486ca700 1 -- 192.168.123.100:0/2472143262 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f654019cd70 con 0x7f65400730f0 2026-03-10T12:46:14.310 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.310+0000 7f65377fe700 1 -- 192.168.123.100:0/2472143262 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6530010460 con 0x7f65400730f0 2026-03-10T12:46:14.311 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.310+0000 7f65377fe700 1 -- 192.168.123.100:0/2472143262 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f653000f5d0 con 0x7f65400730f0 2026-03-10T12:46:14.312 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.312+0000 7f65377fe700 1 -- 192.168.123.100:0/2472143262 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f65300099e0 con 
0x7f65400730f0 2026-03-10T12:46:14.312 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.311+0000 7f65486ca700 1 -- 192.168.123.100:0/2472143262 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6524005320 con 0x7f65400730f0 2026-03-10T12:46:14.312 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.312+0000 7f65377fe700 1 --2- 192.168.123.100:0/2472143262 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f652c0778e0 0x7f652c079d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:14.312 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.312+0000 7f65377fe700 1 -- 192.168.123.100:0/2472143262 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f65300996f0 con 0x7f65400730f0 2026-03-10T12:46:14.312 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.313+0000 7f6545c65700 1 --2- 192.168.123.100:0/2472143262 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f652c0778e0 0x7f652c079d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:14.313 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.313+0000 7f6545c65700 1 --2- 192.168.123.100:0/2472143262 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f652c0778e0 0x7f652c079d90 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f653c009b20 tx=0x7f653c0058e0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:14.315 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.315+0000 7f65377fe700 1 -- 192.168.123.100:0/2472143262 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f6530061e70 con 0x7f65400730f0 2026-03-10T12:46:14.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:13 vm07.local ceph-mon[93622]: pgmap v258: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:14.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.489+0000 7f65486ca700 1 -- 192.168.123.100:0/2472143262 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f6524006200 con 0x7f65400730f0 2026-03-10T12:46:14.490 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.490+0000 7f65377fe700 1 -- 192.168.123.100:0/2472143262 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f65300615c0 con 0x7f65400730f0 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout:{ 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout: "mon": { 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout: "mgr": { 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout: "osd": { 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout: "mds": { 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout: "ceph 
version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout: }, 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout: "overall": { 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout: } 2026-03-10T12:46:14.491 INFO:teuthology.orchestra.run.vm00.stdout:} 2026-03-10T12:46:14.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.493+0000 7f65486ca700 1 -- 192.168.123.100:0/2472143262 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f652c0778e0 msgr2=0x7f652c079d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:14.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.493+0000 7f65486ca700 1 --2- 192.168.123.100:0/2472143262 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f652c0778e0 0x7f652c079d90 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f653c009b20 tx=0x7f653c0058e0 comp rx=0 tx=0).stop 2026-03-10T12:46:14.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.493+0000 7f65486ca700 1 -- 192.168.123.100:0/2472143262 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65400730f0 msgr2=0x7f65401a24f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:14.493 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.493+0000 7f65486ca700 1 --2- 192.168.123.100:0/2472143262 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65400730f0 0x7f65401a24f0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f653000d8d0 tx=0x7f653000dbe0 comp rx=0 tx=0).stop 2026-03-10T12:46:14.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.494+0000 7f65486ca700 1 -- 192.168.123.100:0/2472143262 
shutdown_connections 2026-03-10T12:46:14.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.494+0000 7f65486ca700 1 --2- 192.168.123.100:0/2472143262 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f652c0778e0 0x7f652c079d90 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:14.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.494+0000 7f65486ca700 1 --2- 192.168.123.100:0/2472143262 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f65400730f0 0x7f65401a24f0 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:14.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.494+0000 7f65486ca700 1 --2- 192.168.123.100:0/2472143262 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6540073a00 0x7f65401a2a30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:14.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.494+0000 7f65486ca700 1 -- 192.168.123.100:0/2472143262 >> 192.168.123.100:0/2472143262 conn(0x7f65400fc000 msgr2=0x7f6540102b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:14.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.494+0000 7f65486ca700 1 -- 192.168.123.100:0/2472143262 shutdown_connections 2026-03-10T12:46:14.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.494+0000 7f65486ca700 1 -- 192.168.123.100:0/2472143262 wait complete. 
2026-03-10T12:46:14.566 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | length == 1'"'"'' 2026-03-10T12:46:14.718 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:14.999 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:14 vm00.local ceph-mon[103263]: from='client.34472 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:46:14.999 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:14 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/2472143262' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:46:15.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.998+0000 7fd4f7328700 1 -- 192.168.123.100:0/244466851 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd4f01065b0 msgr2=0x7fd4f0106980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:15.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:14.998+0000 7fd4f7328700 1 --2- 192.168.123.100:0/244466851 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd4f01065b0 0x7fd4f0106980 secure :-1 s=READY pgs=167 cs=0 l=1 rev1=1 crypto rx=0x7fd4ec00df10 tx=0x7fd4ec00f330 comp rx=0 tx=0).stop 2026-03-10T12:46:15.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.000+0000 7fd4f7328700 1 -- 192.168.123.100:0/244466851 shutdown_connections 2026-03-10T12:46:15.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.000+0000 7fd4f7328700 1 --2- 192.168.123.100:0/244466851 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd4f0100590 0x7fd4f0100a00 unknown :-1 
s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:15.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.000+0000 7fd4f7328700 1 --2- 192.168.123.100:0/244466851 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd4f01065b0 0x7fd4f0106980 unknown :-1 s=CLOSED pgs=167 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:15.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.000+0000 7fd4f7328700 1 -- 192.168.123.100:0/244466851 >> 192.168.123.100:0/244466851 conn(0x7fd4f00fc090 msgr2=0x7fd4f00fe4a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:15.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.000+0000 7fd4f7328700 1 -- 192.168.123.100:0/244466851 shutdown_connections 2026-03-10T12:46:15.000 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.000+0000 7fd4f7328700 1 -- 192.168.123.100:0/244466851 wait complete. 2026-03-10T12:46:15.001 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.001+0000 7fd4f7328700 1 Processor -- start 2026-03-10T12:46:15.002 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.001+0000 7fd4f7328700 1 -- start start 2026-03-10T12:46:15.002 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.001+0000 7fd4f7328700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd4f0100590 0x7fd4f0072800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:15.002 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.001+0000 7fd4f7328700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd4f01065b0 0x7fd4f006d800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:15.002 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.001+0000 7fd4f7328700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7fd4f006dd40 con 0x7fd4f0100590 2026-03-10T12:46:15.002 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.001+0000 7fd4f7328700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd4f006deb0 con 0x7fd4f01065b0 2026-03-10T12:46:15.002 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.002+0000 7fd4f50c4700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd4f0100590 0x7fd4f0072800 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:15.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.002+0000 7fd4f50c4700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd4f0100590 0x7fd4f0072800 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:47274/0 (socket says 192.168.123.100:47274) 2026-03-10T12:46:15.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.002+0000 7fd4f50c4700 1 -- 192.168.123.100:0/4113316563 learned_addr learned my addr 192.168.123.100:0/4113316563 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:15.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.002+0000 7fd4f50c4700 1 -- 192.168.123.100:0/4113316563 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd4f01065b0 msgr2=0x7fd4f006d800 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T12:46:15.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.002+0000 7fd4f50c4700 1 --2- 192.168.123.100:0/4113316563 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd4f01065b0 0x7fd4f006d800 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:15.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.002+0000 7fd4f50c4700 1 -- 
192.168.123.100:0/4113316563 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd4ec00dbf0 con 0x7fd4f0100590 2026-03-10T12:46:15.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.002+0000 7fd4f50c4700 1 --2- 192.168.123.100:0/4113316563 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd4f0100590 0x7fd4f0072800 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7fd4ec011010 tx=0x7fd4ec0047c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:15.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.003+0000 7fd4e27fc700 1 -- 192.168.123.100:0/4113316563 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd4ec012070 con 0x7fd4f0100590 2026-03-10T12:46:15.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.003+0000 7fd4e27fc700 1 -- 192.168.123.100:0/4113316563 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fd4ec004510 con 0x7fd4f0100590 2026-03-10T12:46:15.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.003+0000 7fd4e27fc700 1 -- 192.168.123.100:0/4113316563 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd4ec01e400 con 0x7fd4f0100590 2026-03-10T12:46:15.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.003+0000 7fd4f7328700 1 -- 192.168.123.100:0/4113316563 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd4f006e0b0 con 0x7fd4f0100590 2026-03-10T12:46:15.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.003+0000 7fd4f7328700 1 -- 192.168.123.100:0/4113316563 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd4f006e600 con 0x7fd4f0100590 2026-03-10T12:46:15.005 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.004+0000 7fd4e27fc700 1 -- 192.168.123.100:0/4113316563 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd4ec01e560 con 0x7fd4f0100590 2026-03-10T12:46:15.005 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.004+0000 7fd4e27fc700 1 --2- 192.168.123.100:0/4113316563 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd4dc0778c0 0x7fd4dc079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:15.005 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.005+0000 7fd4f48c3700 1 --2- 192.168.123.100:0/4113316563 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd4dc0778c0 0x7fd4dc079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:15.005 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.005+0000 7fd4f48c3700 1 --2- 192.168.123.100:0/4113316563 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd4dc0778c0 0x7fd4dc079d70 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7fd4f006eff0 tx=0x7fd4e4006c60 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:15.005 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.005+0000 7fd4e27fc700 1 -- 192.168.123.100:0/4113316563 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fd4ec06dc20 con 0x7fd4f0100590 2026-03-10T12:46:15.008 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.005+0000 7fd4f7328700 1 -- 192.168.123.100:0/4113316563 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd4f004ea50 con 0x7fd4f0100590 2026-03-10T12:46:15.008 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.008+0000 7fd4e27fc700 1 -- 192.168.123.100:0/4113316563 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd4ec060f60 con 0x7fd4f0100590 2026-03-10T12:46:15.174 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.173+0000 7fd4f7328700 1 -- 192.168.123.100:0/4113316563 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fd4f0066e40 con 0x7fd4f0100590 2026-03-10T12:46:15.174 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.174+0000 7fd4e27fc700 1 -- 192.168.123.100:0/4113316563 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7fd4ec060f60 con 0x7fd4f0100590 2026-03-10T12:46:15.176 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.176+0000 7fd4f7328700 1 -- 192.168.123.100:0/4113316563 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd4dc0778c0 msgr2=0x7fd4dc079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:15.176 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.176+0000 7fd4f7328700 1 --2- 192.168.123.100:0/4113316563 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd4dc0778c0 0x7fd4dc079d70 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7fd4f006eff0 tx=0x7fd4e4006c60 comp rx=0 tx=0).stop 2026-03-10T12:46:15.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.177+0000 7fd4f7328700 1 -- 192.168.123.100:0/4113316563 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd4f0100590 msgr2=0x7fd4f0072800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:15.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.177+0000 7fd4f7328700 1 --2- 192.168.123.100:0/4113316563 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd4f0100590 0x7fd4f0072800 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7fd4ec011010 tx=0x7fd4ec0047c0 comp rx=0 tx=0).stop 2026-03-10T12:46:15.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.177+0000 7fd4f7328700 1 -- 192.168.123.100:0/4113316563 shutdown_connections 2026-03-10T12:46:15.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.177+0000 7fd4f7328700 1 --2- 192.168.123.100:0/4113316563 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fd4dc0778c0 0x7fd4dc079d70 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:15.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.177+0000 7fd4f7328700 1 --2- 192.168.123.100:0/4113316563 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fd4f0100590 0x7fd4f0072800 unknown :-1 s=CLOSED pgs=168 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:15.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.177+0000 7fd4f7328700 1 --2- 192.168.123.100:0/4113316563 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fd4f01065b0 0x7fd4f006d800 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:15.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.177+0000 7fd4f7328700 1 -- 192.168.123.100:0/4113316563 >> 192.168.123.100:0/4113316563 conn(0x7fd4f00fc090 msgr2=0x7fd4f00fd8c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:15.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.177+0000 7fd4f7328700 1 -- 192.168.123.100:0/4113316563 shutdown_connections 2026-03-10T12:46:15.177 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.177+0000 7fd4f7328700 1 -- 192.168.123.100:0/4113316563 wait complete. 
2026-03-10T12:46:15.187 INFO:teuthology.orchestra.run.vm00.stdout:true 2026-03-10T12:46:15.254 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | keys'"'"' | grep $sha1' 2026-03-10T12:46:15.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:14 vm07.local ceph-mon[93622]: from='client.34472 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T12:46:15.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:14 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/2472143262' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:46:15.428 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:15.742 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.741+0000 7f1cb759e700 1 -- 192.168.123.100:0/3821574540 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1cb810c8f0 msgr2=0x7f1cb810ccc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:15.742 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.741+0000 7f1cb759e700 1 --2- 192.168.123.100:0/3821574540 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1cb810c8f0 0x7f1cb810ccc0 secure :-1 s=READY pgs=169 cs=0 l=1 rev1=1 crypto rx=0x7f1ca8009b00 tx=0x7f1ca8009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:15.742 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.741+0000 7f1cb759e700 1 -- 192.168.123.100:0/3821574540 shutdown_connections 2026-03-10T12:46:15.742 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.741+0000 7f1cb759e700 1 --2- 192.168.123.100:0/3821574540 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1cb8071e40 0x7f1cb80722b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:15.742 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.741+0000 7f1cb759e700 1 --2- 192.168.123.100:0/3821574540 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1cb810c8f0 0x7f1cb810ccc0 unknown :-1 s=CLOSED pgs=169 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:15.742 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.741+0000 7f1cb759e700 1 -- 192.168.123.100:0/3821574540 >> 192.168.123.100:0/3821574540 conn(0x7f1cb806c6c0 msgr2=0x7f1cb806cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:15.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.742+0000 7f1cb759e700 1 -- 192.168.123.100:0/3821574540 shutdown_connections 2026-03-10T12:46:15.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.742+0000 7f1cb759e700 1 -- 192.168.123.100:0/3821574540 wait complete. 
2026-03-10T12:46:15.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.742+0000 7f1cb759e700 1 Processor -- start 2026-03-10T12:46:15.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.743+0000 7f1cb759e700 1 -- start start 2026-03-10T12:46:15.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.743+0000 7f1cb759e700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1cb8071e40 0x7f1cb81327b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:15.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.743+0000 7f1cb759e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1cb8132cf0 0x7f1cb8133160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:15.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.743+0000 7f1cb759e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1cb807ee70 con 0x7f1cb8132cf0 2026-03-10T12:46:15.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.743+0000 7f1cb759e700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1cb807efe0 con 0x7f1cb8071e40 2026-03-10T12:46:15.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.743+0000 7f1cb5d9b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1cb8132cf0 0x7f1cb8133160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:15.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.743+0000 7f1cb5d9b700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1cb8132cf0 0x7f1cb8133160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:47292/0 (socket says 192.168.123.100:47292) 2026-03-10T12:46:15.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.743+0000 7f1cb5d9b700 1 -- 192.168.123.100:0/595663339 learned_addr learned my addr 192.168.123.100:0/595663339 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:15.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.743+0000 7f1cb5d9b700 1 -- 192.168.123.100:0/595663339 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1cb8071e40 msgr2=0x7f1cb81327b0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T12:46:15.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.743+0000 7f1cb5d9b700 1 --2- 192.168.123.100:0/595663339 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1cb8071e40 0x7f1cb81327b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:15.744 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.743+0000 7f1cb5d9b700 1 -- 192.168.123.100:0/595663339 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1ca80097e0 con 0x7f1cb8132cf0 2026-03-10T12:46:15.744 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.744+0000 7f1cb5d9b700 1 --2- 192.168.123.100:0/595663339 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1cb8132cf0 0x7f1cb8133160 secure :-1 s=READY pgs=170 cs=0 l=1 rev1=1 crypto rx=0x7f1cb0009e00 tx=0x7f1cb000c6d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:15.744 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.745+0000 7f1ca77fe700 1 -- 192.168.123.100:0/595663339 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1cb0011070 con 0x7f1cb8132cf0 2026-03-10T12:46:15.745 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.745+0000 7f1ca77fe700 1 -- 
192.168.123.100:0/595663339 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1cb0004bb0 con 0x7f1cb8132cf0 2026-03-10T12:46:15.745 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.745+0000 7f1ca77fe700 1 -- 192.168.123.100:0/595663339 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1cb0005230 con 0x7f1cb8132cf0 2026-03-10T12:46:15.745 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.745+0000 7f1cb759e700 1 -- 192.168.123.100:0/595663339 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1cb807f210 con 0x7f1cb8132cf0 2026-03-10T12:46:15.745 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.745+0000 7f1cb759e700 1 -- 192.168.123.100:0/595663339 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1cb807f760 con 0x7f1cb8132cf0 2026-03-10T12:46:15.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.747+0000 7f1ca77fe700 1 -- 192.168.123.100:0/595663339 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1cb00053b0 con 0x7f1cb8132cf0 2026-03-10T12:46:15.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.747+0000 7f1ca77fe700 1 --2- 192.168.123.100:0/595663339 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1ca00779e0 0x7f1ca0079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:15.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.747+0000 7f1ca77fe700 1 -- 192.168.123.100:0/595663339 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f1cb009aaa0 con 0x7f1cb8132cf0 2026-03-10T12:46:15.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.747+0000 7f1cb759e700 1 -- 192.168.123.100:0/595663339 --> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1c98005320 con 0x7f1cb8132cf0 2026-03-10T12:46:15.747 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.747+0000 7f1cb659c700 1 --2- 192.168.123.100:0/595663339 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1ca00779e0 0x7f1ca0079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:15.751 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.751+0000 7f1cb659c700 1 --2- 192.168.123.100:0/595663339 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1ca00779e0 0x7f1ca0079e90 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f1ca8005cb0 tx=0x7f1ca8005c20 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:15.751 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.751+0000 7f1ca77fe700 1 -- 192.168.123.100:0/595663339 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1cb00632a0 con 0x7f1cb8132cf0 2026-03-10T12:46:15.923 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.923+0000 7f1cb759e700 1 -- 192.168.123.100:0/595663339 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f1c98006200 con 0x7f1cb8132cf0 2026-03-10T12:46:15.923 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.923+0000 7f1ca77fe700 1 -- 192.168.123.100:0/595663339 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f1cb0098e60 con 0x7f1cb8132cf0 2026-03-10T12:46:15.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.926+0000 7f1ca57fa700 1 -- 
192.168.123.100:0/595663339 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1ca00779e0 msgr2=0x7f1ca0079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:15.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.926+0000 7f1ca57fa700 1 --2- 192.168.123.100:0/595663339 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1ca00779e0 0x7f1ca0079e90 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f1ca8005cb0 tx=0x7f1ca8005c20 comp rx=0 tx=0).stop 2026-03-10T12:46:15.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.926+0000 7f1ca57fa700 1 -- 192.168.123.100:0/595663339 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1cb8132cf0 msgr2=0x7f1cb8133160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:15.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.926+0000 7f1ca57fa700 1 --2- 192.168.123.100:0/595663339 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1cb8132cf0 0x7f1cb8133160 secure :-1 s=READY pgs=170 cs=0 l=1 rev1=1 crypto rx=0x7f1cb0009e00 tx=0x7f1cb000c6d0 comp rx=0 tx=0).stop 2026-03-10T12:46:15.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.926+0000 7f1ca57fa700 1 -- 192.168.123.100:0/595663339 shutdown_connections 2026-03-10T12:46:15.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.926+0000 7f1ca57fa700 1 --2- 192.168.123.100:0/595663339 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f1ca00779e0 0x7f1ca0079e90 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:15.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.926+0000 7f1ca57fa700 1 --2- 192.168.123.100:0/595663339 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f1cb8071e40 0x7f1cb81327b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T12:46:15.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.926+0000 7f1ca57fa700 1 --2- 192.168.123.100:0/595663339 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1cb8132cf0 0x7f1cb8133160 secure :-1 s=CLOSED pgs=170 cs=0 l=1 rev1=1 crypto rx=0x7f1cb0009e00 tx=0x7f1cb000c6d0 comp rx=0 tx=0).stop 2026-03-10T12:46:15.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.926+0000 7f1ca57fa700 1 -- 192.168.123.100:0/595663339 >> 192.168.123.100:0/595663339 conn(0x7f1cb806c6c0 msgr2=0x7f1cb8070060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:15.929 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.929+0000 7f1ca57fa700 1 -- 192.168.123.100:0/595663339 shutdown_connections 2026-03-10T12:46:15.929 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:15.929+0000 7f1ca57fa700 1 -- 192.168.123.100:0/595663339 wait complete. 2026-03-10T12:46:15.939 INFO:teuthology.orchestra.run.vm00.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)" 2026-03-10T12:46:15.988 DEBUG:teuthology.parallel:result is None 2026-03-10T12:46:15.988 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T12:46:15.991 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm00.local 2026-03-10T12:46:15.991 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- bash -c 'ceph fs dump' 2026-03-10T12:46:16.145 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:16.174 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:15 vm00.local ceph-mon[103263]: pgmap v259: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:16.174 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:15 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/4113316563' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:46:16.174 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:46:16.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:15 vm07.local ceph-mon[93622]: pgmap v259: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:16.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:15 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/4113316563' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:46:16.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:46:16.407 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.406+0000 7f69b603c700 1 -- 192.168.123.100:0/146287253 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f69b00738d0 msgr2=0x7f69b010c9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:16.407 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.406+0000 7f69b603c700 1 --2- 192.168.123.100:0/146287253 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f69b00738d0 0x7f69b010c9f0 secure :-1 s=READY pgs=171 cs=0 l=1 rev1=1 crypto rx=0x7f69a4009b00 tx=0x7f69a4009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:16.407 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.407+0000 7f69b603c700 1 -- 192.168.123.100:0/146287253 shutdown_connections 2026-03-10T12:46:16.407 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.407+0000 7f69b603c700 1 --2- 192.168.123.100:0/146287253 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f69b00738d0 0x7f69b010c9f0 unknown :-1 s=CLOSED pgs=171 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:16.407 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.407+0000 7f69b603c700 1 --2- 192.168.123.100:0/146287253 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f69b0072fc0 0x7f69b0073390 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:16.407 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.407+0000 7f69b603c700 1 -- 192.168.123.100:0/146287253 >> 192.168.123.100:0/146287253 conn(0x7f69b0078580 msgr2=0x7f69b0078980 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T12:46:16.407 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.407+0000 7f69b603c700 1 -- 192.168.123.100:0/146287253 shutdown_connections 2026-03-10T12:46:16.407 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.407+0000 7f69b603c700 1 -- 192.168.123.100:0/146287253 wait complete. 2026-03-10T12:46:16.407 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.408+0000 7f69b603c700 1 Processor -- start 2026-03-10T12:46:16.408 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.408+0000 7f69b603c700 1 -- start start 2026-03-10T12:46:16.408 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.408+0000 7f69b603c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f69b0072fc0 0x7f69b0198410 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:16.408 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.408+0000 7f69b603c700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f69b00738d0 0x7f69b0198950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:16.408 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.408+0000 7f69b603c700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f69b0199030 con 0x7f69b0072fc0 2026-03-10T12:46:16.408 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.408+0000 7f69b603c700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f69b019cdc0 con 0x7f69b00738d0 2026-03-10T12:46:16.408 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.408+0000 7f69aeffd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f69b00738d0 0x7f69b0198950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T12:46:16.408 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.408+0000 7f69aeffd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f69b00738d0 0x7f69b0198950 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:60188/0 (socket says 192.168.123.100:60188) 2026-03-10T12:46:16.408 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.408+0000 7f69aeffd700 1 -- 192.168.123.100:0/2915459949 learned_addr learned my addr 192.168.123.100:0/2915459949 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:16.408 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.408+0000 7f69aeffd700 1 -- 192.168.123.100:0/2915459949 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f69b0072fc0 msgr2=0x7f69b0198410 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:16.408 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.408+0000 7f69af7fe700 1 --2- 192.168.123.100:0/2915459949 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f69b0072fc0 0x7f69b0198410 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:16.409 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.409+0000 7f69aeffd700 1 --2- 192.168.123.100:0/2915459949 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f69b0072fc0 0x7f69b0198410 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:16.409 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.409+0000 7f69aeffd700 1 -- 192.168.123.100:0/2915459949 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f69a40097e0 con 0x7f69b00738d0 2026-03-10T12:46:16.409 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.409+0000 7f69af7fe700 1 --2- 192.168.123.100:0/2915459949 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f69b0072fc0 0x7f69b0198410 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:46:16.409 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.409+0000 7f69aeffd700 1 --2- 192.168.123.100:0/2915459949 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f69b00738d0 0x7f69b0198950 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f69a400b5c0 tx=0x7f69a400bc50 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:16.409 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.409+0000 7f69acff9700 1 -- 192.168.123.100:0/2915459949 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f69a401d070 con 0x7f69b00738d0 2026-03-10T12:46:16.409 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.409+0000 7f69b603c700 1 -- 192.168.123.100:0/2915459949 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f69b019d040 con 0x7f69b00738d0 2026-03-10T12:46:16.410 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.409+0000 7f69b603c700 1 -- 192.168.123.100:0/2915459949 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f69b019d530 con 0x7f69b00738d0 2026-03-10T12:46:16.410 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.410+0000 7f69acff9700 1 -- 192.168.123.100:0/2915459949 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f69a4003bf0 con 0x7f69b00738d0 2026-03-10T12:46:16.410 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.410+0000 7f69b603c700 1 -- 192.168.123.100:0/2915459949 --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f69b010a0f0 con 0x7f69b00738d0 2026-03-10T12:46:16.410 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.410+0000 7f69acff9700 1 -- 192.168.123.100:0/2915459949 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f69a4021620 con 0x7f69b00738d0 2026-03-10T12:46:16.411 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.411+0000 7f69acff9700 1 -- 192.168.123.100:0/2915459949 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f69a402b430 con 0x7f69b00738d0 2026-03-10T12:46:16.411 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.411+0000 7f69acff9700 1 --2- 192.168.123.100:0/2915459949 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f6998077660 0x7f6998079b10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:16.411 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.411+0000 7f69acff9700 1 -- 192.168.123.100:0/2915459949 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f69a409b630 con 0x7f69b00738d0 2026-03-10T12:46:16.411 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.411+0000 7f69af7fe700 1 --2- 192.168.123.100:0/2915459949 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f6998077660 0x7f6998079b10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:16.412 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.412+0000 7f69af7fe700 1 --2- 192.168.123.100:0/2915459949 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f6998077660 0x7f6998079b10 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto 
rx=0x7f69a0005fd0 tx=0x7f69a0005dc0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:16.413 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.413+0000 7f69acff9700 1 -- 192.168.123.100:0/2915459949 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f69a4065000 con 0x7f69b00738d0 2026-03-10T12:46:16.562 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.562+0000 7f69b603c700 1 -- 192.168.123.100:0/2915459949 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f69b004ea50 con 0x7f69b00738d0 2026-03-10T12:46:16.563 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.563+0000 7f69acff9700 1 -- 192.168.123.100:0/2915459949 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 38 v38) v1 ==== 76+0+1984 (secure 0 0 0) 0x7f69a4064750 con 0x7f69b00738d0 2026-03-10T12:46:16.564 INFO:teuthology.orchestra.run.vm00.stdout:e38 2026-03-10T12:46:16.564 INFO:teuthology.orchestra.run.vm00.stdout:btime 2026-03-10T12:44:00:282446+0000 2026-03-10T12:46:16.564 INFO:teuthology.orchestra.run.vm00.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T12:46:16.564 INFO:teuthology.orchestra.run.vm00.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T12:46:16.564 INFO:teuthology.orchestra.run.vm00.stdout:legacy client fscid: 1 2026-03-10T12:46:16.564 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:Filesystem 'cephfs' (1) 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:fs_name cephfs 
2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:epoch 38 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:created 2026-03-10T12:35:09.477786+0000 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:modified 2026-03-10T12:44:00.282443+0000 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:tableserver 0 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:root 0 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:session_timeout 60 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:session_autoclose 300 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:max_file_size 1099511627776 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:max_xattr_size 65536 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:required_client_features {} 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:last_failure 0 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:last_failure_osd_epoch 83 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:max_mds 2 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:in 0,1 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:up {0=34368,1=44277} 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:failed 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:damaged 2026-03-10T12:46:16.565 
INFO:teuthology.orchestra.run.vm00.stdout:stopped 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:data_pools [3] 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:metadata_pool 2 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:inline_data disabled 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:balancer 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:bal_rank_mask -1 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:standby_count_wanted 1 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:qdb_cluster leader: 34368 members: 34368,44277 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.wdwvcu{0:34368} state up:active seq 8 join_fscid=1 addr [v2:192.168.123.100:6828/23281310,v1:192.168.123.100:6829/23281310] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm00.lnokoe{1:44277} state up:active seq 10 join_fscid=1 addr [v2:192.168.123.100:6826/2887557827,v1:192.168.123.100:6827/2887557827] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:Standby daemons: 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.wznhgu{-1:44301} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6824/48365433,v1:192.168.123.107:6825/48365433] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T12:46:16.565 INFO:teuthology.orchestra.run.vm00.stdout:[mds.cephfs.vm07.rhzwnr{-1:44305} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.107:6826/3408808533,v1:192.168.123.107:6827/3408808533] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T12:46:16.566 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.566+0000 7f69b603c700 1 -- 192.168.123.100:0/2915459949 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f6998077660 msgr2=0x7f6998079b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:16.566 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.566+0000 7f69b603c700 1 --2- 192.168.123.100:0/2915459949 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f6998077660 0x7f6998079b10 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f69a0005fd0 tx=0x7f69a0005dc0 comp rx=0 tx=0).stop 2026-03-10T12:46:16.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.566+0000 7f69b603c700 1 -- 192.168.123.100:0/2915459949 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f69b00738d0 msgr2=0x7f69b0198950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:16.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.566+0000 7f69b603c700 1 --2- 192.168.123.100:0/2915459949 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f69b00738d0 0x7f69b0198950 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f69a400b5c0 tx=0x7f69a400bc50 comp rx=0 tx=0).stop 2026-03-10T12:46:16.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.567+0000 7f69b603c700 1 -- 192.168.123.100:0/2915459949 shutdown_connections 2026-03-10T12:46:16.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.567+0000 7f69b603c700 1 --2- 192.168.123.100:0/2915459949 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f6998077660 0x7f6998079b10 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:16.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.567+0000 7f69b603c700 1 --2- 192.168.123.100:0/2915459949 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7f69b0072fc0 0x7f69b0198410 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:16.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.567+0000 7f69b603c700 1 --2- 192.168.123.100:0/2915459949 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f69b00738d0 0x7f69b0198950 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:16.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.567+0000 7f69b603c700 1 -- 192.168.123.100:0/2915459949 >> 192.168.123.100:0/2915459949 conn(0x7f69b0078580 msgr2=0x7f69b0107230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:16.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.567+0000 7f69b603c700 1 -- 192.168.123.100:0/2915459949 shutdown_connections 2026-03-10T12:46:16.567 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:16.567+0000 7f69b603c700 1 -- 192.168.123.100:0/2915459949 wait complete. 2026-03-10T12:46:16.568 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 38 2026-03-10T12:46:16.614 INFO:teuthology.run_tasks:Running task fs.post_upgrade_checks... 
2026-03-10T12:46:16.616 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 2026-03-10T12:46:16.777 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:17.050 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.049+0000 7f22c8004700 1 -- 192.168.123.100:0/2545196070 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f22c00ff480 msgr2=0x7f22c00ff850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:17.050 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.049+0000 7f22c8004700 1 --2- 192.168.123.100:0/2545196070 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f22c00ff480 0x7f22c00ff850 secure :-1 s=READY pgs=172 cs=0 l=1 rev1=1 crypto rx=0x7f22b0009b50 tx=0x7f22b0009e60 comp rx=0 tx=0).stop 2026-03-10T12:46:17.050 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.049+0000 7f22c8004700 1 -- 192.168.123.100:0/2545196070 shutdown_connections 2026-03-10T12:46:17.050 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.049+0000 7f22c8004700 1 --2- 192.168.123.100:0/2545196070 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22c00ffe20 0x7f22c0100270 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:17.050 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.049+0000 7f22c8004700 1 --2- 192.168.123.100:0/2545196070 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f22c00ff480 0x7f22c00ff850 unknown :-1 s=CLOSED pgs=172 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:17.050 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.049+0000 7f22c8004700 1 -- 192.168.123.100:0/2545196070 >> 192.168.123.100:0/2545196070 conn(0x7f22c00747e0 
msgr2=0x7f22c0074be0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:17.050 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.049+0000 7f22c8004700 1 -- 192.168.123.100:0/2545196070 shutdown_connections 2026-03-10T12:46:17.050 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.049+0000 7f22c8004700 1 -- 192.168.123.100:0/2545196070 wait complete. 2026-03-10T12:46:17.050 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.050+0000 7f22c8004700 1 Processor -- start 2026-03-10T12:46:17.050 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.050+0000 7f22c8004700 1 -- start start 2026-03-10T12:46:17.050 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.051+0000 7f22c8004700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22c00ff480 0x7f22c01a2660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:17.051 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.051+0000 7f22c8004700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f22c00ffe20 0x7f22c01a2ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:17.051 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.051+0000 7f22c8004700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f22c01a3230 con 0x7f22c00ffe20 2026-03-10T12:46:17.051 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.051+0000 7f22c8004700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f22c019c6e0 con 0x7f22c00ff480 2026-03-10T12:46:17.051 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.051+0000 7f22c559f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f22c00ffe20 0x7f22c01a2ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:17.051 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.051+0000 7f22c559f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f22c00ffe20 0x7f22c01a2ba0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:47340/0 (socket says 192.168.123.100:47340) 2026-03-10T12:46:17.051 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.051+0000 7f22c559f700 1 -- 192.168.123.100:0/1072584461 learned_addr learned my addr 192.168.123.100:0/1072584461 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:17.051 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.051+0000 7f22c559f700 1 -- 192.168.123.100:0/1072584461 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22c00ff480 msgr2=0x7f22c01a2660 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:46:17.051 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.051+0000 7f22c559f700 1 --2- 192.168.123.100:0/1072584461 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22c00ff480 0x7f22c01a2660 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:17.051 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.051+0000 7f22c559f700 1 -- 192.168.123.100:0/1072584461 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f22bc009710 con 0x7f22c00ffe20 2026-03-10T12:46:17.051 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.052+0000 7f22c559f700 1 --2- 192.168.123.100:0/1072584461 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f22c00ffe20 0x7f22c01a2ba0 secure :-1 s=READY pgs=173 cs=0 l=1 rev1=1 crypto rx=0x7f22bc00eee0 tx=0x7f22bc00c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T12:46:17.052 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.052+0000 7f22b6ffd700 1 -- 192.168.123.100:0/1072584461 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f22bc00ce10 con 0x7f22c00ffe20 2026-03-10T12:46:17.052 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.052+0000 7f22b6ffd700 1 -- 192.168.123.100:0/1072584461 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f22bc004500 con 0x7f22c00ffe20 2026-03-10T12:46:17.052 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.052+0000 7f22b6ffd700 1 -- 192.168.123.100:0/1072584461 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f22bc005490 con 0x7f22c00ffe20 2026-03-10T12:46:17.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.052+0000 7f22c8004700 1 -- 192.168.123.100:0/1072584461 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f22b00097e0 con 0x7f22c00ffe20 2026-03-10T12:46:17.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.053+0000 7f22c8004700 1 -- 192.168.123.100:0/1072584461 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f22c019cd50 con 0x7f22c00ffe20 2026-03-10T12:46:17.057 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.054+0000 7f22b6ffd700 1 -- 192.168.123.100:0/1072584461 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f22bc0055f0 con 0x7f22c00ffe20 2026-03-10T12:46:17.057 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.054+0000 7f22c8004700 1 -- 192.168.123.100:0/1072584461 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f22c0104840 con 0x7f22c00ffe20 2026-03-10T12:46:17.058 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.057+0000 7f22b6ffd700 1 --2- 192.168.123.100:0/1072584461 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f22ac077880 0x7f22ac079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:17.058 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.057+0000 7f22b6ffd700 1 -- 192.168.123.100:0/1072584461 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f22bc014070 con 0x7f22c00ffe20 2026-03-10T12:46:17.058 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.058+0000 7f22c5da0700 1 --2- 192.168.123.100:0/1072584461 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f22ac077880 0x7f22ac079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:17.058 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.058+0000 7f22b6ffd700 1 -- 192.168.123.100:0/1072584461 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f22bc061cf0 con 0x7f22c00ffe20 2026-03-10T12:46:17.058 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.058+0000 7f22c5da0700 1 --2- 192.168.123.100:0/1072584461 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f22ac077880 0x7f22ac079d30 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f22b000b5c0 tx=0x7f22b0005fb0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:17.198 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:16 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/595663339' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:46:17.198 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:16 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/2915459949' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:46:17.199 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.198+0000 7f22c8004700 1 -- 192.168.123.100:0/1072584461 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f22c004ea50 con 0x7f22c00ffe20 2026-03-10T12:46:17.201 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.201+0000 7f22b6ffd700 1 -- 192.168.123.100:0/1072584461 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 38 v38) v1 ==== 94+0+5275 (secure 0 0 0) 0x7f22bc061440 con 0x7f22c00ffe20 2026-03-10T12:46:17.201 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:17.201 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":38,"btime":"2026-03-10T12:44:00:282446+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44301,"name":"cephfs.vm07.wznhgu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6825/48365433","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":48365433},{"type":"v1","addr":"192.168.123.107:6825","nonce":48365433}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":30},{"gid":44305,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/3408808533","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3408808533},{"type":"v1","addr":"192.168.123.107:6827","nonce":3408808533}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":36}],"filesystems":[{"mdsmap":{"epoch":38,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:44:00.282443+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":83,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":2,"in":[0,1],"up":{"mds_0":34368,"mds_1":44277},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34368":{"gid":34368,"name":"cephfs.vm00.wdwvcu","rank":0,"incarnation":32,"state":"up:active","state_seq":8,"addr":"192.168.123.100:6829/23281310","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":23281310},{"type":"v1","addr":"192.168.123.100:6829","nonce":23281310}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}},"gid_44277":{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":1,"incarnation":37,"state":"up:active","state_seq":10,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34368,"qdb_cluster":[34368,44277]},"id":1}]} 2026-03-10T12:46:17.203 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.203+0000 7f22c8004700 1 -- 192.168.123.100:0/1072584461 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f22ac077880 msgr2=0x7f22ac079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:17.203 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.203+0000 7f22c8004700 1 --2- 192.168.123.100:0/1072584461 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f22ac077880 0x7f22ac079d30 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f22b000b5c0 tx=0x7f22b0005fb0 comp rx=0 tx=0).stop 2026-03-10T12:46:17.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.204+0000 7f22c8004700 1 -- 192.168.123.100:0/1072584461 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f22c00ffe20 msgr2=0x7f22c01a2ba0 
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:17.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.204+0000 7f22c8004700 1 --2- 192.168.123.100:0/1072584461 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f22c00ffe20 0x7f22c01a2ba0 secure :-1 s=READY pgs=173 cs=0 l=1 rev1=1 crypto rx=0x7f22bc00eee0 tx=0x7f22bc00c5b0 comp rx=0 tx=0).stop 2026-03-10T12:46:17.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.204+0000 7f22c8004700 1 -- 192.168.123.100:0/1072584461 shutdown_connections 2026-03-10T12:46:17.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.204+0000 7f22c8004700 1 --2- 192.168.123.100:0/1072584461 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f22ac077880 0x7f22ac079d30 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:17.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.204+0000 7f22c8004700 1 --2- 192.168.123.100:0/1072584461 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f22c00ff480 0x7f22c01a2660 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:17.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.204+0000 7f22c8004700 1 --2- 192.168.123.100:0/1072584461 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f22c00ffe20 0x7f22c01a2ba0 unknown :-1 s=CLOSED pgs=173 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:17.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.204+0000 7f22c8004700 1 -- 192.168.123.100:0/1072584461 >> 192.168.123.100:0/1072584461 conn(0x7f22c00747e0 msgr2=0x7f22c0109210 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:17.204 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.204+0000 7f22c8004700 1 -- 192.168.123.100:0/1072584461 shutdown_connections 2026-03-10T12:46:17.204 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.204+0000 7f22c8004700 1 -- 192.168.123.100:0/1072584461 wait complete. 2026-03-10T12:46:17.205 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 38 2026-03-10T12:46:17.267 DEBUG:tasks.fs:checking fs fscid=1,name=cephfs state = {'epoch': 9, 'max_mds': 2, 'flags': 18} 2026-03-10T12:46:17.267 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 10 2026-03-10T12:46:17.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:16 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/595663339' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T12:46:17.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:16 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/2915459949' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T12:46:17.424 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:17.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.967+0000 7f3d3f7fa700 1 -- 192.168.123.100:0/2554366730 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3d38073a00 msgr2=0x7f3d38110ff0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:17.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.967+0000 7f3d3f7fa700 1 --2- 192.168.123.100:0/2554366730 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3d38073a00 0x7f3d38110ff0 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f3d34009b00 tx=0x7f3d34009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:17.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.967+0000 7f3d3f7fa700 1 -- 192.168.123.100:0/2554366730 shutdown_connections 2026-03-10T12:46:17.968 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.967+0000 7f3d3f7fa700 1 --2- 192.168.123.100:0/2554366730 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3d38073a00 0x7f3d38110ff0 unknown :-1 s=CLOSED pgs=174 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:17.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.967+0000 7f3d3f7fa700 1 --2- 192.168.123.100:0/2554366730 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d380730f0 0x7f3d380734c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:17.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.967+0000 7f3d3f7fa700 1 -- 192.168.123.100:0/2554366730 >> 192.168.123.100:0/2554366730 conn(0x7f3d380fbfc0 msgr2=0x7f3d380fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:17.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.968+0000 7f3d3f7fa700 1 -- 192.168.123.100:0/2554366730 shutdown_connections 2026-03-10T12:46:17.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.968+0000 7f3d3f7fa700 1 -- 192.168.123.100:0/2554366730 wait complete. 
2026-03-10T12:46:17.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.968+0000 7f3d3f7fa700 1 Processor -- start 2026-03-10T12:46:17.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.968+0000 7f3d3f7fa700 1 -- start start 2026-03-10T12:46:17.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.968+0000 7f3d3f7fa700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3d380730f0 0x7f3d381a2510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:17.968 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.968+0000 7f3d3f7fa700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d38073a00 0x7f3d381a2a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:17.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.968+0000 7f3d3f7fa700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d381a30e0 con 0x7f3d380730f0 2026-03-10T12:46:17.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.968+0000 7f3d3f7fa700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d3819c590 con 0x7f3d38073a00 2026-03-10T12:46:17.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.969+0000 7f3d3cd95700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d38073a00 0x7f3d381a2a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:17.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.969+0000 7f3d3cd95700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d38073a00 0x7f3d381a2a50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:48928/0 (socket says 192.168.123.100:48928) 2026-03-10T12:46:17.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.969+0000 7f3d3cd95700 1 -- 192.168.123.100:0/3777654555 learned_addr learned my addr 192.168.123.100:0/3777654555 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:17.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.969+0000 7f3d3d596700 1 --2- 192.168.123.100:0/3777654555 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3d380730f0 0x7f3d381a2510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:17.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.969+0000 7f3d3d596700 1 -- 192.168.123.100:0/3777654555 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d38073a00 msgr2=0x7f3d381a2a50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:17.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.969+0000 7f3d3d596700 1 --2- 192.168.123.100:0/3777654555 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d38073a00 0x7f3d381a2a50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:17.969 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.969+0000 7f3d3d596700 1 -- 192.168.123.100:0/3777654555 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3d340097e0 con 0x7f3d380730f0 2026-03-10T12:46:17.970 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.969+0000 7f3d3d596700 1 --2- 192.168.123.100:0/3777654555 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3d380730f0 0x7f3d381a2510 secure :-1 s=READY pgs=175 cs=0 l=1 rev1=1 crypto rx=0x7f3d2800eab0 tx=0x7f3d2800edc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:46:17.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.970+0000 7f3d2e7fc700 1 -- 192.168.123.100:0/3777654555 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d2800cb00 con 0x7f3d380730f0 2026-03-10T12:46:17.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.970+0000 7f3d2e7fc700 1 -- 192.168.123.100:0/3777654555 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3d28004510 con 0x7f3d380730f0 2026-03-10T12:46:17.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.970+0000 7f3d3f7fa700 1 -- 192.168.123.100:0/3777654555 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3d3819c870 con 0x7f3d380730f0 2026-03-10T12:46:17.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.970+0000 7f3d2e7fc700 1 -- 192.168.123.100:0/3777654555 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d28005230 con 0x7f3d380730f0 2026-03-10T12:46:17.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.970+0000 7f3d3f7fa700 1 -- 192.168.123.100:0/3777654555 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3d3819cdc0 con 0x7f3d380730f0 2026-03-10T12:46:17.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.971+0000 7f3d2e7fc700 1 -- 192.168.123.100:0/3777654555 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3d28005390 con 0x7f3d380730f0 2026-03-10T12:46:17.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.971+0000 7f3d2e7fc700 1 --2- 192.168.123.100:0/3777654555 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3d240779e0 0x7f3d24079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:17.972 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.972+0000 7f3d2e7fc700 1 -- 192.168.123.100:0/3777654555 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f3d28014070 con 0x7f3d380730f0 2026-03-10T12:46:17.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.972+0000 7f3d3cd95700 1 --2- 192.168.123.100:0/3777654555 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3d240779e0 0x7f3d24079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:17.972 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.972+0000 7f3d3f7fa700 1 -- 192.168.123.100:0/3777654555 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3d3819ca00 con 0x7f3d380730f0 2026-03-10T12:46:17.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.974+0000 7f3d3cd95700 1 --2- 192.168.123.100:0/3777654555 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3d240779e0 0x7f3d24079e90 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f3d380fd700 tx=0x7f3d3400b540 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:17.975 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:17.975+0000 7f3d2e7fc700 1 -- 192.168.123.100:0/3777654555 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3d3819ca00 con 0x7f3d380730f0 2026-03-10T12:46:18.121 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:17 vm00.local ceph-mon[103263]: pgmap v260: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:18.121 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:17 vm00.local 
ceph-mon[103263]: from='client.? 192.168.123.100:0/1072584461' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T12:46:18.121 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.120+0000 7f3d3f7fa700 1 -- 192.168.123.100:0/3777654555 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 10, "format": "json"} v 0) v1 -- 0x7f3d3810e770 con 0x7f3d380730f0 2026-03-10T12:46:18.123 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.123+0000 7f3d2e7fc700 1 -- 192.168.123.100:0/3777654555 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 10, "format": "json"}]=0 dumped fsmap epoch 10 v38) v1 ==== 107+0+4928 (secure 0 0 0) 0x7f3d28063970 con 0x7f3d380730f0 2026-03-10T12:46:18.123 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:18.123 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":10,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14490,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.100:6829/2948081127","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":2948081127},{"type":"v1","addr":"192.168.123.100:6829","nonce":2948081127}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on 
dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":8}],"filesystems":[{"mdsmap":{"epoch":10,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:35:17.532287+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor 
table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24313,"mds_1":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm00.lnokoe","rank":1,"incarnation":6,"state":"up:active","state_seq":3,"addr":"192.168.123.100:6827/2640363946","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2640363946},{"type":"v1","addr":"192.168.123.100:6827","nonce":2640363946}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}},"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T12:46:18.125 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.125+0000 7f3d3f7fa700 1 -- 192.168.123.100:0/3777654555 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3d240779e0 msgr2=0x7f3d24079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:18.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.125+0000 7f3d3f7fa700 1 --2- 192.168.123.100:0/3777654555 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3d240779e0 0x7f3d24079e90 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f3d380fd700 tx=0x7f3d3400b540 comp rx=0 tx=0).stop 2026-03-10T12:46:18.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.125+0000 7f3d3f7fa700 1 -- 192.168.123.100:0/3777654555 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3d380730f0 msgr2=0x7f3d381a2510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:18.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.125+0000 7f3d3f7fa700 1 --2- 192.168.123.100:0/3777654555 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3d380730f0 0x7f3d381a2510 secure :-1 s=READY pgs=175 cs=0 l=1 rev1=1 crypto rx=0x7f3d2800eab0 tx=0x7f3d2800edc0 comp rx=0 tx=0).stop 2026-03-10T12:46:18.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.125+0000 7f3d3f7fa700 1 -- 192.168.123.100:0/3777654555 shutdown_connections 2026-03-10T12:46:18.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.125+0000 7f3d3f7fa700 1 --2- 192.168.123.100:0/3777654555 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f3d240779e0 0x7f3d24079e90 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:18.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.125+0000 7f3d3f7fa700 1 --2- 192.168.123.100:0/3777654555 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7f3d380730f0 0x7f3d381a2510 unknown :-1 s=CLOSED pgs=175 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:18.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.125+0000 7f3d3f7fa700 1 --2- 192.168.123.100:0/3777654555 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3d38073a00 0x7f3d381a2a50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:18.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.125+0000 7f3d3f7fa700 1 -- 192.168.123.100:0/3777654555 >> 192.168.123.100:0/3777654555 conn(0x7f3d380fbfc0 msgr2=0x7f3d38102b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:18.125 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.126+0000 7f3d3f7fa700 1 -- 192.168.123.100:0/3777654555 shutdown_connections 2026-03-10T12:46:18.126 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.126+0000 7f3d3f7fa700 1 -- 192.168.123.100:0/3777654555 wait complete. 2026-03-10T12:46:18.126 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 10 2026-03-10T12:46:18.188 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 11 2026-03-10T12:46:18.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:17 vm07.local ceph-mon[93622]: pgmap v260: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:18.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:17 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/1072584461' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T12:46:18.340 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:18.602 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.601+0000 7fe12d5f0700 1 -- 192.168.123.100:0/2142232732 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe128073000 msgr2=0x7fe1280733d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:18.602 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.601+0000 7fe12d5f0700 1 --2- 192.168.123.100:0/2142232732 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe128073000 0x7fe1280733d0 secure :-1 s=READY pgs=176 cs=0 l=1 rev1=1 crypto rx=0x7fe110009b00 tx=0x7fe110009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:18.602 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.602+0000 7fe12d5f0700 1 -- 192.168.123.100:0/2142232732 shutdown_connections 2026-03-10T12:46:18.602 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.602+0000 7fe12d5f0700 1 --2- 192.168.123.100:0/2142232732 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe128073910 0x7fe1281111c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:18.602 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.602+0000 7fe12d5f0700 1 --2- 192.168.123.100:0/2142232732 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe128073000 0x7fe1280733d0 unknown :-1 s=CLOSED pgs=176 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:18.602 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.602+0000 7fe12d5f0700 1 -- 192.168.123.100:0/2142232732 >> 192.168.123.100:0/2142232732 conn(0x7fe128078550 msgr2=0x7fe128078950 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:18.603 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.602+0000 7fe12d5f0700 1 -- 192.168.123.100:0/2142232732 shutdown_connections 2026-03-10T12:46:18.603 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.603+0000 7fe12d5f0700 1 -- 192.168.123.100:0/2142232732 wait complete. 2026-03-10T12:46:18.603 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.603+0000 7fe12d5f0700 1 Processor -- start 2026-03-10T12:46:18.603 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.603+0000 7fe12d5f0700 1 -- start start 2026-03-10T12:46:18.603 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.603+0000 7fe12d5f0700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe128073000 0x7fe1281a2610 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:18.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.603+0000 7fe12d5f0700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe128073910 0x7fe1281a2b50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:18.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.603+0000 7fe12d5f0700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe1281a31e0 con 0x7fe128073910 2026-03-10T12:46:18.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.603+0000 7fe12d5f0700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe12819c690 con 0x7fe128073000 2026-03-10T12:46:18.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.604+0000 7fe1267fc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe128073910 0x7fe1281a2b50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:18.604 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.604+0000 7fe1267fc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe128073910 0x7fe1281a2b50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:42916/0 (socket says 192.168.123.100:42916) 2026-03-10T12:46:18.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.604+0000 7fe1267fc700 1 -- 192.168.123.100:0/2985794719 learned_addr learned my addr 192.168.123.100:0/2985794719 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:18.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.604+0000 7fe126ffd700 1 --2- 192.168.123.100:0/2985794719 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe128073000 0x7fe1281a2610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:18.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.604+0000 7fe1267fc700 1 -- 192.168.123.100:0/2985794719 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe128073000 msgr2=0x7fe1281a2610 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:18.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.604+0000 7fe1267fc700 1 --2- 192.168.123.100:0/2985794719 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe128073000 0x7fe1281a2610 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:18.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.604+0000 7fe1267fc700 1 -- 192.168.123.100:0/2985794719 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe1100097e0 con 0x7fe128073910 2026-03-10T12:46:18.604 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.604+0000 7fe126ffd700 1 --2- 192.168.123.100:0/2985794719 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe128073000 0x7fe1281a2610 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T12:46:18.605 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.604+0000 7fe1267fc700 1 --2- 192.168.123.100:0/2985794719 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe128073910 0x7fe1281a2b50 secure :-1 s=READY pgs=177 cs=0 l=1 rev1=1 crypto rx=0x7fe11800b700 tx=0x7fe11800ba10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:18.605 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.605+0000 7fe11ffff700 1 -- 192.168.123.100:0/2985794719 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe118011840 con 0x7fe128073910 2026-03-10T12:46:18.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.605+0000 7fe11ffff700 1 -- 192.168.123.100:0/2985794719 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe118011e80 con 0x7fe128073910 2026-03-10T12:46:18.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.605+0000 7fe12d5f0700 1 -- 192.168.123.100:0/2985794719 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe12819c970 con 0x7fe128073910 2026-03-10T12:46:18.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.605+0000 7fe11ffff700 1 -- 192.168.123.100:0/2985794719 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe11800f340 con 0x7fe128073910 2026-03-10T12:46:18.606 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.605+0000 7fe12d5f0700 1 -- 192.168.123.100:0/2985794719 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
-- mon_subscribe({osdmap=0}) v3 -- 0x7fe12819cec0 con 0x7fe128073910 2026-03-10T12:46:18.608 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.606+0000 7fe12d5f0700 1 -- 192.168.123.100:0/2985794719 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe12810e940 con 0x7fe128073910 2026-03-10T12:46:18.609 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.609+0000 7fe11ffff700 1 -- 192.168.123.100:0/2985794719 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe1180103e0 con 0x7fe128073910 2026-03-10T12:46:18.610 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.610+0000 7fe11ffff700 1 --2- 192.168.123.100:0/2985794719 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fe114077910 0x7fe114079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:18.610 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.610+0000 7fe11ffff700 1 -- 192.168.123.100:0/2985794719 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fe118099bf0 con 0x7fe128073910 2026-03-10T12:46:18.610 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.610+0000 7fe126ffd700 1 --2- 192.168.123.100:0/2985794719 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fe114077910 0x7fe114079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:18.610 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.610+0000 7fe11ffff700 1 -- 192.168.123.100:0/2985794719 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe1180c9a90 con 0x7fe128073910 2026-03-10T12:46:18.610 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.611+0000 7fe126ffd700 1 --2- 192.168.123.100:0/2985794719 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fe114077910 0x7fe114079dc0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fe110009fd0 tx=0x7fe110005800 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:18.752 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.752+0000 7fe12d5f0700 1 -- 192.168.123.100:0/2985794719 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 11, "format": "json"} v 0) v1 -- 0x7fe12804ea50 con 0x7fe128073910 2026-03-10T12:46:18.753 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.752+0000 7fe11ffff700 1 -- 192.168.123.100:0/2985794719 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 11, "format": "json"}]=0 dumped fsmap epoch 11 v38) v1 ==== 107+0+4928 (secure 0 0 0) 0x7fe1180623f0 con 0x7fe128073910 2026-03-10T12:46:18.753 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:18.753 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":11,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14490,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.100:6829/2948081127","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":2948081127},{"type":"v1","addr":"192.168.123.100:6829","nonce":2948081127}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":10,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:35:17.532287+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24313,"mds_1":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm00.lnokoe","rank":1,"incarnation":6,"state":"up:active","state_seq":3,"addr":"192.168.123.100:6827/2640363946","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2640363946},{"type":"v1","addr":"192.168.123.100:6827","nonce":2640363946}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T12:46:18.755 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.755+0000 7fe12d5f0700 1 -- 192.168.123.100:0/2985794719 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fe114077910 msgr2=0x7fe114079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:18.755 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.755+0000 7fe12d5f0700 1 --2- 192.168.123.100:0/2985794719 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fe114077910 0x7fe114079dc0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fe110009fd0 tx=0x7fe110005800 comp rx=0 tx=0).stop 2026-03-10T12:46:18.755 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.755+0000 7fe12d5f0700 1 -- 192.168.123.100:0/2985794719 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe128073910 msgr2=0x7fe1281a2b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:18.755 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.755+0000 7fe12d5f0700 1 --2- 192.168.123.100:0/2985794719 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe128073910 0x7fe1281a2b50 secure :-1 s=READY pgs=177 cs=0 l=1 rev1=1 crypto rx=0x7fe11800b700 tx=0x7fe11800ba10 comp rx=0 tx=0).stop 2026-03-10T12:46:18.755 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.756+0000 7fe12d5f0700 1 -- 192.168.123.100:0/2985794719 shutdown_connections 2026-03-10T12:46:18.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.756+0000 7fe12d5f0700 1 --2- 192.168.123.100:0/2985794719 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fe114077910 0x7fe114079dc0 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:18.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.756+0000 7fe12d5f0700 1 --2- 192.168.123.100:0/2985794719 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fe128073000 0x7fe1281a2610 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:18.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.756+0000 7fe12d5f0700 1 --2- 192.168.123.100:0/2985794719 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fe128073910 0x7fe1281a2b50 unknown :-1 s=CLOSED pgs=177 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:18.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.756+0000 7fe12d5f0700 1 -- 192.168.123.100:0/2985794719 >> 192.168.123.100:0/2985794719 conn(0x7fe128078550 msgr2=0x7fe128102e20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:18.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.756+0000 7fe12d5f0700 1 -- 192.168.123.100:0/2985794719 shutdown_connections 2026-03-10T12:46:18.756 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:18.756+0000 7fe12d5f0700 1 -- 
192.168.123.100:0/2985794719 wait complete. 2026-03-10T12:46:18.757 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 11 2026-03-10T12:46:18.822 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 12 2026-03-10T12:46:18.979 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:19.007 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:18 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/3777654555' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 10, "format": "json"}]: dispatch 2026-03-10T12:46:19.007 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:18 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/2985794719' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 11, "format": "json"}]: dispatch 2026-03-10T12:46:19.235 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.234+0000 7fda38e3f700 1 -- 192.168.123.100:0/589738530 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fda34108790 msgr2=0x7fda34108b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:19.235 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.234+0000 7fda38e3f700 1 --2- 192.168.123.100:0/589738530 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fda34108790 0x7fda34108b60 secure :-1 s=READY pgs=178 cs=0 l=1 rev1=1 crypto rx=0x7fda1c009b00 tx=0x7fda1c009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:19.235 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.235+0000 7fda38e3f700 1 -- 192.168.123.100:0/589738530 shutdown_connections 2026-03-10T12:46:19.235 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.235+0000 7fda38e3f700 1 --2- 192.168.123.100:0/589738530 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7fda34102790 0x7fda34102c00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:19.235 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.235+0000 7fda38e3f700 1 --2- 192.168.123.100:0/589738530 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fda34108790 0x7fda34108b60 unknown :-1 s=CLOSED pgs=178 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:19.235 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.235+0000 7fda38e3f700 1 -- 192.168.123.100:0/589738530 >> 192.168.123.100:0/589738530 conn(0x7fda340fe2b0 msgr2=0x7fda341006c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:19.235 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.235+0000 7fda38e3f700 1 -- 192.168.123.100:0/589738530 shutdown_connections 2026-03-10T12:46:19.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.236+0000 7fda38e3f700 1 -- 192.168.123.100:0/589738530 wait complete. 
2026-03-10T12:46:19.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.236+0000 7fda38e3f700 1 Processor -- start 2026-03-10T12:46:19.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.236+0000 7fda38e3f700 1 -- start start 2026-03-10T12:46:19.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.237+0000 7fda38e3f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fda34102790 0x7fda341982f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:19.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.237+0000 7fda38e3f700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fda34108790 0x7fda34198830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:19.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.237+0000 7fda38e3f700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda34198e80 con 0x7fda34102790 2026-03-10T12:46:19.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.237+0000 7fda38e3f700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda34198fc0 con 0x7fda34108790 2026-03-10T12:46:19.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.237+0000 7fda3259c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fda34102790 0x7fda341982f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:19.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.237+0000 7fda3259c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fda34102790 0x7fda341982f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:42932/0 (socket says 192.168.123.100:42932) 2026-03-10T12:46:19.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.237+0000 7fda3259c700 1 -- 192.168.123.100:0/940261417 learned_addr learned my addr 192.168.123.100:0/940261417 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:19.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.237+0000 7fda31d9b700 1 --2- 192.168.123.100:0/940261417 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fda34108790 0x7fda34198830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:19.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.238+0000 7fda31d9b700 1 -- 192.168.123.100:0/940261417 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fda34102790 msgr2=0x7fda341982f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:19.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.238+0000 7fda31d9b700 1 --2- 192.168.123.100:0/940261417 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fda34102790 0x7fda341982f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:19.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.238+0000 7fda31d9b700 1 -- 192.168.123.100:0/940261417 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fda1c0097e0 con 0x7fda34108790 2026-03-10T12:46:19.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.238+0000 7fda3259c700 1 --2- 192.168.123.100:0/940261417 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fda34102790 0x7fda341982f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T12:46:19.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.238+0000 7fda31d9b700 1 --2- 192.168.123.100:0/940261417 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fda34108790 0x7fda34198830 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fda2400c8f0 tx=0x7fda2400cc00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:19.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.238+0000 7fda2b7fe700 1 -- 192.168.123.100:0/940261417 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda240043f0 con 0x7fda34108790 2026-03-10T12:46:19.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.238+0000 7fda38e3f700 1 -- 192.168.123.100:0/940261417 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fda3419ce10 con 0x7fda34108790 2026-03-10T12:46:19.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.239+0000 7fda38e3f700 1 -- 192.168.123.100:0/940261417 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fda3419d360 con 0x7fda34108790 2026-03-10T12:46:19.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.239+0000 7fda2b7fe700 1 -- 192.168.123.100:0/940261417 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fda24004550 con 0x7fda34108790 2026-03-10T12:46:19.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.239+0000 7fda2b7fe700 1 -- 192.168.123.100:0/940261417 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda24003890 con 0x7fda34108790 2026-03-10T12:46:19.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.239+0000 7fda38e3f700 1 -- 192.168.123.100:0/940261417 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) 
v1 -- 0x7fda3404ea50 con 0x7fda34108790 2026-03-10T12:46:19.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.240+0000 7fda2b7fe700 1 -- 192.168.123.100:0/940261417 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fda240051b0 con 0x7fda34108790 2026-03-10T12:46:19.241 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.241+0000 7fda2b7fe700 1 --2- 192.168.123.100:0/940261417 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fda200778c0 0x7fda20079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:19.241 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.241+0000 7fda3259c700 1 --2- 192.168.123.100:0/940261417 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fda200778c0 0x7fda20079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:19.241 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.241+0000 7fda2b7fe700 1 -- 192.168.123.100:0/940261417 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fda24099390 con 0x7fda34108790 2026-03-10T12:46:19.241 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.241+0000 7fda3259c700 1 --2- 192.168.123.100:0/940261417 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fda200778c0 0x7fda20079d70 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7fda1c005850 tx=0x7fda1c00b540 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:19.243 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.243+0000 7fda2b7fe700 1 -- 192.168.123.100:0/940261417 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7fda24061b90 con 0x7fda34108790 2026-03-10T12:46:19.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:18 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/3777654555' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 10, "format": "json"}]: dispatch 2026-03-10T12:46:19.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:18 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/2985794719' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 11, "format": "json"}]: dispatch 2026-03-10T12:46:19.393 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.393+0000 7fda38e3f700 1 -- 192.168.123.100:0/940261417 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 12, "format": "json"} v 0) v1 -- 0x7fda34066e40 con 0x7fda34108790 2026-03-10T12:46:19.394 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.394+0000 7fda2b7fe700 1 -- 192.168.123.100:0/940261417 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 12, "format": "json"}]=0 dumped fsmap epoch 12 v38) v1 ==== 107+0+4929 (secure 0 0 0) 0x7fda240612e0 con 0x7fda34108790 2026-03-10T12:46:19.394 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:19.394 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":12,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14490,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.100:6829/2948081127","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":2948081127},{"type":"v1","addr":"192.168.123.100:6829","nonce":2948081127}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":12,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:36:51.752695+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24313,"mds_1":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm00.lnokoe","rank":1,"incarnation":6,"state":"up:active","state_seq":3,"addr":"192.168.123.100:6827/2640363946","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2640363946},{"type":"v1","addr":"192.168.123.100:6827","nonce":2640363946}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[1],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T12:46:19.397 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.397+0000 7fda38e3f700 1 -- 192.168.123.100:0/940261417 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fda200778c0 msgr2=0x7fda20079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:19.397 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.397+0000 7fda38e3f700 1 --2- 192.168.123.100:0/940261417 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fda200778c0 0x7fda20079d70 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7fda1c005850 tx=0x7fda1c00b540 comp rx=0 tx=0).stop 2026-03-10T12:46:19.397 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.397+0000 7fda38e3f700 1 -- 192.168.123.100:0/940261417 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fda34108790 msgr2=0x7fda34198830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:19.397 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.397+0000 7fda38e3f700 1 --2- 192.168.123.100:0/940261417 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fda34108790 0x7fda34198830 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fda2400c8f0 tx=0x7fda2400cc00 comp rx=0 tx=0).stop 2026-03-10T12:46:19.397 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.397+0000 7fda38e3f700 1 -- 192.168.123.100:0/940261417 shutdown_connections 2026-03-10T12:46:19.397 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.397+0000 7fda38e3f700 1 --2- 192.168.123.100:0/940261417 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fda200778c0 0x7fda20079d70 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:19.397 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.397+0000 7fda38e3f700 1 --2- 192.168.123.100:0/940261417 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fda34102790 0x7fda341982f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:19.397 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.397+0000 7fda38e3f700 1 --2- 192.168.123.100:0/940261417 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fda34108790 0x7fda34198830 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:19.397 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.397+0000 7fda38e3f700 1 -- 192.168.123.100:0/940261417 >> 192.168.123.100:0/940261417 conn(0x7fda340fe2b0 msgr2=0x7fda340ff9f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:19.397 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.397+0000 7fda38e3f700 1 -- 192.168.123.100:0/940261417 shutdown_connections 2026-03-10T12:46:19.397 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.397+0000 7fda38e3f700 1 -- 
192.168.123.100:0/940261417 wait complete. 2026-03-10T12:46:19.398 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 12 2026-03-10T12:46:19.442 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 13 2026-03-10T12:46:19.593 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:19.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.872+0000 7fcdc2c53700 1 -- 192.168.123.100:0/3985472415 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcdbc102780 msgr2=0x7fcdbc102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:19.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.873+0000 7fcdc2c53700 1 --2- 192.168.123.100:0/3985472415 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcdbc102780 0x7fcdbc102bf0 secure :-1 s=READY pgs=179 cs=0 l=1 rev1=1 crypto rx=0x7fcdb0009b00 tx=0x7fcdb0009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:19.873 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.873+0000 7fcdc2c53700 1 -- 192.168.123.100:0/3985472415 shutdown_connections 2026-03-10T12:46:19.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.873+0000 7fcdc2c53700 1 --2- 192.168.123.100:0/3985472415 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcdbc102780 0x7fcdbc102bf0 unknown :-1 s=CLOSED pgs=179 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:19.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.873+0000 7fcdc2c53700 1 --2- 192.168.123.100:0/3985472415 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcdbc108780 0x7fcdbc108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:19.874 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.873+0000 7fcdc2c53700 1 -- 192.168.123.100:0/3985472415 >> 192.168.123.100:0/3985472415 conn(0x7fcdbc0fe280 msgr2=0x7fcdbc100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:19.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.873+0000 7fcdc2c53700 1 -- 192.168.123.100:0/3985472415 shutdown_connections 2026-03-10T12:46:19.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.874+0000 7fcdc2c53700 1 -- 192.168.123.100:0/3985472415 wait complete. 2026-03-10T12:46:19.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.874+0000 7fcdc2c53700 1 Processor -- start 2026-03-10T12:46:19.874 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.874+0000 7fcdc2c53700 1 -- start start 2026-03-10T12:46:19.875 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.875+0000 7fcdc2c53700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcdbc102780 0x7fcdbc198350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:19.875 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.875+0000 7fcdc2c53700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcdbc108780 0x7fcdbc198890 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:19.875 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.875+0000 7fcdc2c53700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcdbc198ee0 con 0x7fcdbc108780 2026-03-10T12:46:19.875 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.875+0000 7fcdc09ef700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcdbc102780 0x7fcdbc198350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:19.875 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.875+0000 7fcdc2c53700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcdbc199020 con 0x7fcdbc102780 2026-03-10T12:46:19.875 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.875+0000 7fcdbbfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcdbc108780 0x7fcdbc198890 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:19.875 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.875+0000 7fcdbbfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcdbc108780 0x7fcdbc198890 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:42950/0 (socket says 192.168.123.100:42950) 2026-03-10T12:46:19.875 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.875+0000 7fcdc09ef700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcdbc102780 0x7fcdbc198350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:48986/0 (socket says 192.168.123.100:48986) 2026-03-10T12:46:19.875 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.875+0000 7fcdbbfff700 1 -- 192.168.123.100:0/2556900559 learned_addr learned my addr 192.168.123.100:0/2556900559 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:19.875 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.875+0000 7fcdbbfff700 1 -- 192.168.123.100:0/2556900559 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcdbc102780 msgr2=0x7fcdbc198350 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:19.875 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.875+0000 7fcdbbfff700 1 --2- 192.168.123.100:0/2556900559 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcdbc102780 0x7fcdbc198350 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:19.876 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.876+0000 7fcdbbfff700 1 -- 192.168.123.100:0/2556900559 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcdb00097e0 con 0x7fcdbc108780 2026-03-10T12:46:19.876 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.876+0000 7fcdbbfff700 1 --2- 192.168.123.100:0/2556900559 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcdbc108780 0x7fcdbc198890 secure :-1 s=READY pgs=180 cs=0 l=1 rev1=1 crypto rx=0x7fcdb0009ad0 tx=0x7fcdb00052e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:19.877 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.876+0000 7fcdb9ffb700 1 -- 192.168.123.100:0/2556900559 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcdb001d070 con 0x7fcdbc108780 2026-03-10T12:46:19.877 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.876+0000 7fcdb9ffb700 1 -- 192.168.123.100:0/2556900559 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fcdb000bc50 con 0x7fcdbc108780 2026-03-10T12:46:19.877 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.876+0000 7fcdb9ffb700 1 -- 192.168.123.100:0/2556900559 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcdb000f790 con 0x7fcdbc108780 2026-03-10T12:46:19.877 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.876+0000 7fcdc2c53700 1 -- 192.168.123.100:0/2556900559 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_subscribe({mgrmap=0+}) v3 -- 0x7fcdbc19ce10 con 0x7fcdbc108780 2026-03-10T12:46:19.877 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.876+0000 7fcdc2c53700 1 -- 192.168.123.100:0/2556900559 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcdbc19d300 con 0x7fcdbc108780 2026-03-10T12:46:19.880 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.877+0000 7fcdc2c53700 1 -- 192.168.123.100:0/2556900559 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcdbc04ea50 con 0x7fcdbc108780 2026-03-10T12:46:19.882 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.882+0000 7fcdb9ffb700 1 -- 192.168.123.100:0/2556900559 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcdb0022470 con 0x7fcdbc108780 2026-03-10T12:46:19.882 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.882+0000 7fcdb9ffb700 1 --2- 192.168.123.100:0/2556900559 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcdac0779e0 0x7fcdac079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:19.882 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.882+0000 7fcdb9ffb700 1 -- 192.168.123.100:0/2556900559 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fcdb009be40 con 0x7fcdbc108780 2026-03-10T12:46:19.882 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.882+0000 7fcdb9ffb700 1 -- 192.168.123.100:0/2556900559 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fcdb009c2c0 con 0x7fcdbc108780 2026-03-10T12:46:19.882 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.882+0000 7fcdc09ef700 1 --2- 192.168.123.100:0/2556900559 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcdac0779e0 0x7fcdac079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:19.883 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:19.883+0000 7fcdc09ef700 1 --2- 192.168.123.100:0/2556900559 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcdac0779e0 0x7fcdac079e90 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7fcda8005fd0 tx=0x7fcda8005e50 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:19.983 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:19 vm00.local ceph-mon[103263]: pgmap v261: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:19.983 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:19 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/940261417' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-10T12:46:20.024 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.024+0000 7fcdc2c53700 1 -- 192.168.123.100:0/2556900559 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 13, "format": "json"} v 0) v1 -- 0x7fcdbc066e40 con 0x7fcdbc108780 2026-03-10T12:46:20.025 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.025+0000 7fcdb9ffb700 1 -- 192.168.123.100:0/2556900559 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 13, "format": "json"}]=0 dumped fsmap epoch 13 v38) v1 ==== 107+0+4930 (secure 0 0 0) 0x7fcdb0064640 con 0x7fcdbc108780 2026-03-10T12:46:20.025 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:20.025 
INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":13,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14490,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.100:6829/2948081127","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":2948081127},{"type":"v1","addr":"192.168.123.100:6829","nonce":2948081127}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in 
omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":13,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:37:36.646083+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24313,"mds_1":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm00.lnokoe","rank":1,"incarnation":6,"state":"up:active","state_seq":3,"addr":"192.168.123.100:6827/2640363946","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2640363946},{"type":"v1","addr":"192.168.123.100:6827","nonce":2640363946}]},"join_fscid":1,"export_targets":[0],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout 
v2","feature_10":"snaprealm v2"}}},"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[1],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T12:46:20.027 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.027+0000 7fcdc2c53700 1 -- 192.168.123.100:0/2556900559 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcdac0779e0 msgr2=0x7fcdac079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:20.027 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.027+0000 7fcdc2c53700 1 --2- 192.168.123.100:0/2556900559 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcdac0779e0 0x7fcdac079e90 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7fcda8005fd0 tx=0x7fcda8005e50 comp rx=0 tx=0).stop 2026-03-10T12:46:20.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.028+0000 7fcdc2c53700 1 -- 192.168.123.100:0/2556900559 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcdbc108780 msgr2=0x7fcdbc198890 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:20.028 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.028+0000 7fcdc2c53700 1 --2- 192.168.123.100:0/2556900559 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcdbc108780 0x7fcdbc198890 secure :-1 s=READY pgs=180 cs=0 l=1 rev1=1 crypto rx=0x7fcdb0009ad0 tx=0x7fcdb00052e0 comp rx=0 tx=0).stop 2026-03-10T12:46:20.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.028+0000 7fcdc2c53700 1 -- 192.168.123.100:0/2556900559 shutdown_connections 2026-03-10T12:46:20.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.028+0000 7fcdc2c53700 1 --2- 192.168.123.100:0/2556900559 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fcdac0779e0 0x7fcdac079e90 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:20.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.028+0000 7fcdc2c53700 1 --2- 192.168.123.100:0/2556900559 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fcdbc102780 0x7fcdbc198350 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:20.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.028+0000 7fcdc2c53700 1 --2- 192.168.123.100:0/2556900559 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fcdbc108780 0x7fcdbc198890 unknown :-1 s=CLOSED pgs=180 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:20.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.028+0000 7fcdc2c53700 1 -- 192.168.123.100:0/2556900559 >> 192.168.123.100:0/2556900559 conn(0x7fcdbc0fe280 msgr2=0x7fcdbc0ffa40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:20.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.028+0000 7fcdc2c53700 1 -- 192.168.123.100:0/2556900559 shutdown_connections 2026-03-10T12:46:20.028 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.028+0000 7fcdc2c53700 1 -- 
192.168.123.100:0/2556900559 wait complete. 2026-03-10T12:46:20.029 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 13 2026-03-10T12:46:20.090 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 14 2026-03-10T12:46:20.245 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:20.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:19 vm07.local ceph-mon[93622]: pgmap v261: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:20.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:19 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/940261417' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-10T12:46:20.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.508+0000 7fc37895a700 1 -- 192.168.123.100:0/2290731421 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc3741084d0 msgr2=0x7fc3741088a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:20.508 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.508+0000 7fc37895a700 1 --2- 192.168.123.100:0/2290731421 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc3741084d0 0x7fc3741088a0 secure :-1 s=READY pgs=181 cs=0 l=1 rev1=1 crypto rx=0x7fc35c009b50 tx=0x7fc35c009e60 comp rx=0 tx=0).stop 2026-03-10T12:46:20.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.508+0000 7fc37895a700 1 -- 192.168.123.100:0/2290731421 shutdown_connections 2026-03-10T12:46:20.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.508+0000 7fc37895a700 1 --2- 192.168.123.100:0/2290731421 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc3741024d0 0x7fc374102940 unknown :-1 s=CLOSED 
pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:20.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.508+0000 7fc37895a700 1 --2- 192.168.123.100:0/2290731421 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc3741084d0 0x7fc3741088a0 unknown :-1 s=CLOSED pgs=181 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:20.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.508+0000 7fc37895a700 1 -- 192.168.123.100:0/2290731421 >> 192.168.123.100:0/2290731421 conn(0x7fc3740fdff0 msgr2=0x7fc374100400 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:20.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.509+0000 7fc37895a700 1 -- 192.168.123.100:0/2290731421 shutdown_connections 2026-03-10T12:46:20.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.509+0000 7fc37895a700 1 -- 192.168.123.100:0/2290731421 wait complete. 2026-03-10T12:46:20.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.509+0000 7fc37895a700 1 Processor -- start 2026-03-10T12:46:20.509 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.509+0000 7fc37895a700 1 -- start start 2026-03-10T12:46:20.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.510+0000 7fc37895a700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc3741024d0 0x7fc3741981a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:20.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.510+0000 7fc37895a700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc3741084d0 0x7fc3741986e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:20.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.510+0000 7fc37895a700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7fc374198dc0 con 0x7fc3741084d0 2026-03-10T12:46:20.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.510+0000 7fc37895a700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc37419cb50 con 0x7fc3741024d0 2026-03-10T12:46:20.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.510+0000 7fc372ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc3741084d0 0x7fc3741986e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:20.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.510+0000 7fc372ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc3741084d0 0x7fc3741986e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:42972/0 (socket says 192.168.123.100:42972) 2026-03-10T12:46:20.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.510+0000 7fc372ffd700 1 -- 192.168.123.100:0/1822175943 learned_addr learned my addr 192.168.123.100:0/1822175943 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:20.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.510+0000 7fc372ffd700 1 -- 192.168.123.100:0/1822175943 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc3741024d0 msgr2=0x7fc3741981a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:46:20.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.510+0000 7fc372ffd700 1 --2- 192.168.123.100:0/1822175943 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc3741024d0 0x7fc3741981a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:20.510 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.510+0000 7fc372ffd700 1 -- 
192.168.123.100:0/1822175943 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc35c0097e0 con 0x7fc3741084d0 2026-03-10T12:46:20.511 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.511+0000 7fc372ffd700 1 --2- 192.168.123.100:0/1822175943 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc3741084d0 0x7fc3741986e0 secure :-1 s=READY pgs=182 cs=0 l=1 rev1=1 crypto rx=0x7fc36400ed70 tx=0x7fc36400c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:20.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.511+0000 7fc370ff9700 1 -- 192.168.123.100:0/1822175943 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc364009980 con 0x7fc3741084d0 2026-03-10T12:46:20.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.511+0000 7fc370ff9700 1 -- 192.168.123.100:0/1822175943 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc36400cd70 con 0x7fc3741084d0 2026-03-10T12:46:20.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.511+0000 7fc370ff9700 1 -- 192.168.123.100:0/1822175943 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc3640189c0 con 0x7fc3741084d0 2026-03-10T12:46:20.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.511+0000 7fc37895a700 1 -- 192.168.123.100:0/1822175943 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc37419ce30 con 0x7fc3741084d0 2026-03-10T12:46:20.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.511+0000 7fc37895a700 1 -- 192.168.123.100:0/1822175943 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc37419d380 con 0x7fc3741084d0 2026-03-10T12:46:20.513 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.512+0000 7fc370ff9700 1 -- 192.168.123.100:0/1822175943 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc364018b20 con 0x7fc3741084d0 2026-03-10T12:46:20.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.512+0000 7fc37895a700 1 -- 192.168.123.100:0/1822175943 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc37410aa40 con 0x7fc3741084d0 2026-03-10T12:46:20.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.514+0000 7fc370ff9700 1 --2- 192.168.123.100:0/1822175943 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc360077870 0x7fc360079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:20.516 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.514+0000 7fc370ff9700 1 -- 192.168.123.100:0/1822175943 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fc364014070 con 0x7fc3741084d0 2026-03-10T12:46:20.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.517+0000 7fc3737fe700 1 --2- 192.168.123.100:0/1822175943 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc360077870 0x7fc360079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:20.517 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.517+0000 7fc3737fe700 1 --2- 192.168.123.100:0/1822175943 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc360077870 0x7fc360079d20 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7fc35c000c00 tx=0x7fc35c005fb0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:20.517 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.517+0000 7fc370ff9700 1 -- 192.168.123.100:0/1822175943 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc364063e50 con 0x7fc3741084d0 2026-03-10T12:46:20.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.663+0000 7fc37895a700 1 -- 192.168.123.100:0/1822175943 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 14, "format": "json"} v 0) v1 -- 0x7fc37404f2a0 con 0x7fc3741084d0 2026-03-10T12:46:20.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.664+0000 7fc370ff9700 1 -- 192.168.123.100:0/1822175943 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 14, "format": "json"}]=0 dumped fsmap epoch 14 v38) v1 ==== 107+0+4944 (secure 0 0 0) 0x7fc3640635a0 con 0x7fc3741084d0 2026-03-10T12:46:20.666 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:20.666 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":14,"btime":"2026-03-10T12:42:06:789257+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14490,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.100:6829/2948081127","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":2948081127},{"type":"v1","addr":"192.168.123.100:6829","nonce":2948081127}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":14,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:42:06.789256+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24313,"mds_1":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm00.lnokoe","rank":1,"incarnation":6,"state":"up:active","state_seq":3,"addr":"192.168.123.100:6827/2640363946","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2640363946},{"type":"v1","addr":"192.168.123.100:6827","nonce":2640363946}]},"join_fscid":1,"export_targets":[0],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24313,"qdb_cluster":[24307,24313]},"id":1}]} 2026-03-10T12:46:20.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.668+0000 7fc37895a700 1 -- 192.168.123.100:0/1822175943 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc360077870 msgr2=0x7fc360079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:20.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.668+0000 7fc37895a700 1 --2- 192.168.123.100:0/1822175943 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc360077870 0x7fc360079d20 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7fc35c000c00 tx=0x7fc35c005fb0 comp rx=0 tx=0).stop 2026-03-10T12:46:20.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.668+0000 7fc37895a700 1 -- 192.168.123.100:0/1822175943 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc3741084d0 msgr2=0x7fc3741986e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:20.669 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.668+0000 7fc37895a700 1 --2- 192.168.123.100:0/1822175943 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc3741084d0 0x7fc3741986e0 secure :-1 s=READY pgs=182 cs=0 l=1 rev1=1 crypto rx=0x7fc36400ed70 tx=0x7fc36400c5b0 comp rx=0 tx=0).stop 2026-03-10T12:46:20.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.668+0000 7fc37895a700 1 -- 192.168.123.100:0/1822175943 shutdown_connections 2026-03-10T12:46:20.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.668+0000 7fc37895a700 1 --2- 192.168.123.100:0/1822175943 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc360077870 0x7fc360079d20 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:20.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.668+0000 7fc37895a700 1 --2- 192.168.123.100:0/1822175943 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc3741024d0 0x7fc3741981a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:20.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.668+0000 7fc37895a700 1 --2- 192.168.123.100:0/1822175943 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc3741084d0 0x7fc3741986e0 secure :-1 s=CLOSED pgs=182 cs=0 l=1 rev1=1 crypto rx=0x7fc36400ed70 tx=0x7fc36400c5b0 comp rx=0 tx=0).stop 2026-03-10T12:46:20.669 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.668+0000 7fc37895a700 1 -- 192.168.123.100:0/1822175943 >> 192.168.123.100:0/1822175943 conn(0x7fc3740fdff0 msgr2=0x7fc3740fe9c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:20.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.669+0000 7fc37895a700 1 -- 192.168.123.100:0/1822175943 shutdown_connections 2026-03-10T12:46:20.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:20.669+0000 
7fc37895a700 1 -- 192.168.123.100:0/1822175943 wait complete. 2026-03-10T12:46:20.670 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 14 2026-03-10T12:46:20.748 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 15 2026-03-10T12:46:20.906 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:20.920 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:20 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/2556900559' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-10T12:46:20.920 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:20 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/1822175943' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-10T12:46:21.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.165+0000 7f1035287700 1 -- 192.168.123.100:0/2651748270 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1030104520 msgr2=0x7f10301048f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:21.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.165+0000 7f1035287700 1 --2- 192.168.123.100:0/2651748270 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1030104520 0x7f10301048f0 secure :-1 s=READY pgs=183 cs=0 l=1 rev1=1 crypto rx=0x7f1018009b00 tx=0x7f1018009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:21.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.165+0000 7f1035287700 1 -- 192.168.123.100:0/2651748270 shutdown_connections 2026-03-10T12:46:21.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.165+0000 7f1035287700 1 --2- 192.168.123.100:0/2651748270 >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f10300fff00 0x7f1030100370 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:21.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.165+0000 7f1035287700 1 --2- 192.168.123.100:0/2651748270 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1030104520 0x7f10301048f0 unknown :-1 s=CLOSED pgs=183 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:21.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.165+0000 7f1035287700 1 -- 192.168.123.100:0/2651748270 >> 192.168.123.100:0/2651748270 conn(0x7f10300754a0 msgr2=0x7f10300758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:21.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.166+0000 7f1035287700 1 -- 192.168.123.100:0/2651748270 shutdown_connections 2026-03-10T12:46:21.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.166+0000 7f1035287700 1 -- 192.168.123.100:0/2651748270 wait complete. 
2026-03-10T12:46:21.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.166+0000 7f1035287700 1 Processor -- start 2026-03-10T12:46:21.166 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.166+0000 7f1035287700 1 -- start start 2026-03-10T12:46:21.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.167+0000 7f1035287700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f10300fff00 0x7f10301983b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:21.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.167+0000 7f1035287700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1030104520 0x7f10301988f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:21.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.167+0000 7f1035287700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1030198fd0 con 0x7f1030104520 2026-03-10T12:46:21.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.167+0000 7f1035287700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f103019cd60 con 0x7f10300fff00 2026-03-10T12:46:21.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.167+0000 7f102effd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f10300fff00 0x7f10301983b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:21.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.167+0000 7f102effd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f10300fff00 0x7f10301983b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am 
v2:192.168.123.100:49018/0 (socket says 192.168.123.100:49018) 2026-03-10T12:46:21.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.167+0000 7f102effd700 1 -- 192.168.123.100:0/3268679058 learned_addr learned my addr 192.168.123.100:0/3268679058 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:21.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.167+0000 7f102e7fc700 1 --2- 192.168.123.100:0/3268679058 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1030104520 0x7f10301988f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:21.167 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.167+0000 7f102effd700 1 -- 192.168.123.100:0/3268679058 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1030104520 msgr2=0x7f10301988f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:21.168 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.167+0000 7f102effd700 1 --2- 192.168.123.100:0/3268679058 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1030104520 0x7f10301988f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:21.168 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.167+0000 7f102effd700 1 -- 192.168.123.100:0/3268679058 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f10180097e0 con 0x7f10300fff00 2026-03-10T12:46:21.168 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.168+0000 7f102e7fc700 1 --2- 192.168.123.100:0/3268679058 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1030104520 0x7f10301988f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T12:46:21.168 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.168+0000 7f102effd700 1 --2- 192.168.123.100:0/3268679058 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f10300fff00 0x7f10301983b0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f1018017040 tx=0x7f1018015b80 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:21.168 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.168+0000 7f1027fff700 1 -- 192.168.123.100:0/3268679058 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f10180052e0 con 0x7f10300fff00 2026-03-10T12:46:21.168 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.168+0000 7f1035287700 1 -- 192.168.123.100:0/3268679058 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f103019cfe0 con 0x7f10300fff00 2026-03-10T12:46:21.169 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.168+0000 7f1035287700 1 -- 192.168.123.100:0/3268679058 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f103019d4d0 con 0x7f10300fff00 2026-03-10T12:46:21.169 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.169+0000 7f1027fff700 1 -- 192.168.123.100:0/3268679058 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1018005440 con 0x7f10300fff00 2026-03-10T12:46:21.169 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.169+0000 7f1027fff700 1 -- 192.168.123.100:0/3268679058 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f101801ea90 con 0x7f10300fff00 2026-03-10T12:46:21.172 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.170+0000 7f1027fff700 1 -- 192.168.123.100:0/3268679058 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f101801ebf0 con 
0x7f10300fff00 2026-03-10T12:46:21.172 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.170+0000 7f1035287700 1 -- 192.168.123.100:0/3268679058 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1010005320 con 0x7f10300fff00 2026-03-10T12:46:21.173 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.172+0000 7f1027fff700 1 --2- 192.168.123.100:0/3268679058 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f101c0778e0 0x7f101c079d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:21.173 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.172+0000 7f1027fff700 1 -- 192.168.123.100:0/3268679058 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f101809ac20 con 0x7f10300fff00 2026-03-10T12:46:21.173 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.173+0000 7f102e7fc700 1 --2- 192.168.123.100:0/3268679058 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f101c0778e0 0x7f101c079d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:21.173 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.173+0000 7f102e7fc700 1 --2- 192.168.123.100:0/3268679058 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f101c0778e0 0x7f101c079d90 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f10301999d0 tx=0x7f1020005f50 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:21.175 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.175+0000 7f1027fff700 1 -- 192.168.123.100:0/3268679058 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f10180635d0 con 0x7f10300fff00 2026-03-10T12:46:21.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:20 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/2556900559' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-10T12:46:21.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:20 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/1822175943' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-10T12:46:21.320 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.320+0000 7f1035287700 1 -- 192.168.123.100:0/3268679058 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 15, "format": "json"} v 0) v1 -- 0x7f1010005190 con 0x7f10300fff00 2026-03-10T12:46:21.321 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.320+0000 7f1027fff700 1 -- 192.168.123.100:0/3268679058 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 15, "format": "json"}]=0 dumped fsmap epoch 15 v38) v1 ==== 107+0+4943 (secure 0 0 0) 0x7f10180057b0 con 0x7f10300fff00 2026-03-10T12:46:21.321 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:21.321 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":15,"btime":"2026-03-10T12:42:26:260745+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14490,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.100:6829/2948081127","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":2948081127},{"type":"v1","addr":"192.168.123.100:6829","nonce":2948081127}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":15,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:42:25.757418+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":2,"in":[0,1],"up":{"mds_0":24313,"mds_1":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm00.lnokoe","rank":1,"incarnation":6,"state":"up:active","state_seq":3,"addr":"192.168.123.100:6827/2640363946","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2640363946},{"type":"v1","addr":"192.168.123.100:6827","nonce":2640363946}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24313,"qdb_cluster":[24313,24307]},"id":1}]} 2026-03-10T12:46:21.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.323+0000 7f1035287700 1 -- 192.168.123.100:0/3268679058 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f101c0778e0 msgr2=0x7f101c079d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:21.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.323+0000 7f1035287700 1 --2- 192.168.123.100:0/3268679058 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f101c0778e0 0x7f101c079d90 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f10301999d0 tx=0x7f1020005f50 comp rx=0 tx=0).stop 2026-03-10T12:46:21.323 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.323+0000 7f1035287700 1 -- 192.168.123.100:0/3268679058 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f10300fff00 msgr2=0x7f10301983b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:21.324 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.324+0000 7f1035287700 1 --2- 192.168.123.100:0/3268679058 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f10300fff00 0x7f10301983b0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f1018017040 tx=0x7f1018015b80 comp rx=0 tx=0).stop 2026-03-10T12:46:21.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.324+0000 7f1035287700 1 -- 192.168.123.100:0/3268679058 shutdown_connections 2026-03-10T12:46:21.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.324+0000 7f1035287700 1 --2- 192.168.123.100:0/3268679058 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f101c0778e0 0x7f101c079d90 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:21.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.324+0000 7f1035287700 1 --2- 192.168.123.100:0/3268679058 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f10300fff00 0x7f10301983b0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:21.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.324+0000 7f1035287700 1 --2- 192.168.123.100:0/3268679058 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f1030104520 0x7f10301988f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:21.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.324+0000 7f1035287700 1 -- 192.168.123.100:0/3268679058 >> 192.168.123.100:0/3268679058 conn(0x7f10300754a0 msgr2=0x7f10300fec40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:21.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.324+0000 7f1035287700 1 -- 192.168.123.100:0/3268679058 shutdown_connections 2026-03-10T12:46:21.324 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.325+0000 7f1035287700 1 -- 
192.168.123.100:0/3268679058 wait complete. 2026-03-10T12:46:21.325 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 15 2026-03-10T12:46:21.401 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 16 2026-03-10T12:46:21.551 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:21.806 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.805+0000 7fc5d8408700 1 -- 192.168.123.100:0/2463808320 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5d0100620 msgr2=0x7fc5d0108b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:21.806 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.805+0000 7fc5d8408700 1 --2- 192.168.123.100:0/2463808320 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5d0100620 0x7fc5d0108b20 secure :-1 s=READY pgs=184 cs=0 l=1 rev1=1 crypto rx=0x7fc5c0009b50 tx=0x7fc5c0009e60 comp rx=0 tx=0).stop 2026-03-10T12:46:21.806 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.806+0000 7fc5d8408700 1 -- 192.168.123.100:0/2463808320 shutdown_connections 2026-03-10T12:46:21.806 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.806+0000 7fc5d8408700 1 --2- 192.168.123.100:0/2463808320 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5d0100620 0x7fc5d0108b20 unknown :-1 s=CLOSED pgs=184 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:21.806 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.806+0000 7fc5d8408700 1 --2- 192.168.123.100:0/2463808320 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5d00ffd10 0x7fc5d01000e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:21.806 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.806+0000 7fc5d8408700 1 -- 192.168.123.100:0/2463808320 >> 192.168.123.100:0/2463808320 conn(0x7fc5d00747e0 msgr2=0x7fc5d0074be0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:21.806 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.806+0000 7fc5d8408700 1 -- 192.168.123.100:0/2463808320 shutdown_connections 2026-03-10T12:46:21.806 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.806+0000 7fc5d8408700 1 -- 192.168.123.100:0/2463808320 wait complete. 2026-03-10T12:46:21.807 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.807+0000 7fc5d8408700 1 Processor -- start 2026-03-10T12:46:21.807 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.807+0000 7fc5d8408700 1 -- start start 2026-03-10T12:46:21.807 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.807+0000 7fc5d8408700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5d00ffd10 0x7fc5d0193f30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:21.807 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.807+0000 7fc5d8408700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5d0100620 0x7fc5d0194470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:21.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.807+0000 7fc5d8408700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc5d0194b50 con 0x7fc5d00ffd10 2026-03-10T12:46:21.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.807+0000 7fc5d8408700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc5d01988e0 con 0x7fc5d0100620 2026-03-10T12:46:21.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.807+0000 7fc5d61a4700 1 --2- >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5d00ffd10 0x7fc5d0193f30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:21.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.807+0000 7fc5d61a4700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5d00ffd10 0x7fc5d0193f30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:43004/0 (socket says 192.168.123.100:43004) 2026-03-10T12:46:21.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.807+0000 7fc5d61a4700 1 -- 192.168.123.100:0/367576776 learned_addr learned my addr 192.168.123.100:0/367576776 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:21.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.808+0000 7fc5d61a4700 1 -- 192.168.123.100:0/367576776 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5d0100620 msgr2=0x7fc5d0194470 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T12:46:21.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.808+0000 7fc5d61a4700 1 --2- 192.168.123.100:0/367576776 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5d0100620 0x7fc5d0194470 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:21.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.808+0000 7fc5d61a4700 1 -- 192.168.123.100:0/367576776 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc5c00097e0 con 0x7fc5d00ffd10 2026-03-10T12:46:21.808 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.808+0000 7fc5d61a4700 1 --2- 192.168.123.100:0/367576776 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5d00ffd10 
0x7fc5d0193f30 secure :-1 s=READY pgs=185 cs=0 l=1 rev1=1 crypto rx=0x7fc5cc00eb10 tx=0x7fc5cc00eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:21.809 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.808+0000 7fc5c77fe700 1 -- 192.168.123.100:0/367576776 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc5cc00cca0 con 0x7fc5d00ffd10 2026-03-10T12:46:21.809 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.809+0000 7fc5c77fe700 1 -- 192.168.123.100:0/367576776 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc5cc00ce00 con 0x7fc5d00ffd10 2026-03-10T12:46:21.809 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.809+0000 7fc5c77fe700 1 -- 192.168.123.100:0/367576776 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc5cc018910 con 0x7fc5d00ffd10 2026-03-10T12:46:21.809 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.809+0000 7fc5d8408700 1 -- 192.168.123.100:0/367576776 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc5d0198bc0 con 0x7fc5d00ffd10 2026-03-10T12:46:21.809 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.809+0000 7fc5d8408700 1 -- 192.168.123.100:0/367576776 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc5d0199110 con 0x7fc5d00ffd10 2026-03-10T12:46:21.810 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.810+0000 7fc5c77fe700 1 -- 192.168.123.100:0/367576776 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc5cc018a70 con 0x7fc5d00ffd10 2026-03-10T12:46:21.811 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.811+0000 7fc5c77fe700 1 --2- 192.168.123.100:0/367576776 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc5bc07bdf0 0x7fc5bc07e2a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:21.811 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.811+0000 7fc5d59a3700 1 --2- 192.168.123.100:0/367576776 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc5bc07bdf0 0x7fc5bc07e2a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:21.811 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.811+0000 7fc5d8408700 1 -- 192.168.123.100:0/367576776 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc5d0198d50 con 0x7fc5d00ffd10 2026-03-10T12:46:21.811 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.811+0000 7fc5c77fe700 1 -- 192.168.123.100:0/367576776 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fc5cc014070 con 0x7fc5d00ffd10 2026-03-10T12:46:21.815 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.815+0000 7fc5d59a3700 1 --2- 192.168.123.100:0/367576776 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc5bc07bdf0 0x7fc5bc07e2a0 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fc5c0005b40 tx=0x7fc5c0005ab0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:21.815 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.815+0000 7fc5c77fe700 1 -- 192.168.123.100:0/367576776 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc5d0198d50 con 0x7fc5d00ffd10 2026-03-10T12:46:21.958 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:21 vm00.local 
ceph-mon[103263]: pgmap v262: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:21.958 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:21 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/3268679058' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-10T12:46:21.958 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.957+0000 7fc5d8408700 1 -- 192.168.123.100:0/367576776 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 16, "format": "json"} v 0) v1 -- 0x7fc5d0198d50 con 0x7fc5d00ffd10 2026-03-10T12:46:21.959 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.959+0000 7fc5c77fe700 1 -- 192.168.123.100:0/367576776 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 16, "format": "json"}]=0 dumped fsmap epoch 16 v38) v1 ==== 107+0+4943 (secure 0 0 0) 0x7fc5d0198d50 con 0x7fc5d00ffd10 2026-03-10T12:46:21.960 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:21.960 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":16,"btime":"2026-03-10T12:42:40:892172+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14490,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.100:6829/2948081127","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":2948081127},{"type":"v1","addr":"192.168.123.100:6829","nonce":2948081127}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":16,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:42:39.892643+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0,1],"up":{"mds_0":24313,"mds_1":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm00.lnokoe","rank":1,"incarnation":6,"state":"up:active","state_seq":3,"addr":"192.168.123.100:6827/2640363946","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2640363946},{"type":"v1","addr":"192.168.123.100:6827","nonce":2640363946}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24313,"qdb_cluster":[24307,24313]},"id":1}]} 2026-03-10T12:46:21.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.962+0000 7fc5d8408700 1 -- 192.168.123.100:0/367576776 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc5bc07bdf0 msgr2=0x7fc5bc07e2a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:21.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.962+0000 7fc5d8408700 1 --2- 192.168.123.100:0/367576776 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc5bc07bdf0 0x7fc5bc07e2a0 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fc5c0005b40 tx=0x7fc5c0005ab0 comp rx=0 tx=0).stop 2026-03-10T12:46:21.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.962+0000 7fc5d8408700 1 -- 192.168.123.100:0/367576776 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5d00ffd10 msgr2=0x7fc5d0193f30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:21.962 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.962+0000 7fc5d8408700 1 --2- 192.168.123.100:0/367576776 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5d00ffd10 0x7fc5d0193f30 secure :-1 s=READY pgs=185 cs=0 l=1 rev1=1 crypto rx=0x7fc5cc00eb10 tx=0x7fc5cc00eed0 comp rx=0 tx=0).stop 2026-03-10T12:46:21.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.962+0000 7fc5d8408700 1 -- 192.168.123.100:0/367576776 shutdown_connections 2026-03-10T12:46:21.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.962+0000 7fc5d8408700 1 --2- 192.168.123.100:0/367576776 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc5bc07bdf0 0x7fc5bc07e2a0 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:21.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.962+0000 7fc5d8408700 1 --2- 192.168.123.100:0/367576776 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc5d00ffd10 0x7fc5d0193f30 unknown :-1 s=CLOSED pgs=185 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:21.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.962+0000 7fc5d8408700 1 --2- 192.168.123.100:0/367576776 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc5d0100620 0x7fc5d0194470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:21.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.962+0000 7fc5d8408700 1 -- 192.168.123.100:0/367576776 >> 192.168.123.100:0/367576776 conn(0x7fc5d00747e0 msgr2=0x7fc5d01032f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:21.962 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.963+0000 7fc5d8408700 1 -- 192.168.123.100:0/367576776 shutdown_connections 2026-03-10T12:46:21.963 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:21.963+0000 7fc5d8408700 1 -- 
192.168.123.100:0/367576776 wait complete. 2026-03-10T12:46:21.963 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 16 2026-03-10T12:46:22.008 DEBUG:tasks.fs:max_mds reduced in epoch 16 2026-03-10T12:46:22.008 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 17 2026-03-10T12:46:22.150 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:22.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:21 vm07.local ceph-mon[93622]: pgmap v262: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:22.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:21 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/3268679058' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-10T12:46:22.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.450+0000 7f92d2e80700 1 -- 192.168.123.100:0/3594563 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f92cc10e9e0 msgr2=0x7f92cc10edb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:22.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.450+0000 7f92d2e80700 1 --2- 192.168.123.100:0/3594563 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f92cc10e9e0 0x7f92cc10edb0 secure :-1 s=READY pgs=186 cs=0 l=1 rev1=1 crypto rx=0x7f92c8009b00 tx=0x7f92c8009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:22.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.451+0000 7f92d2e80700 1 -- 192.168.123.100:0/3594563 shutdown_connections 2026-03-10T12:46:22.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.451+0000 7f92d2e80700 1 --2- 192.168.123.100:0/3594563 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] 
conn(0x7f92cc071b60 0x7f92cc071fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:22.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.451+0000 7f92d2e80700 1 --2- 192.168.123.100:0/3594563 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f92cc10e9e0 0x7f92cc10edb0 unknown :-1 s=CLOSED pgs=186 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:22.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.451+0000 7f92d2e80700 1 -- 192.168.123.100:0/3594563 >> 192.168.123.100:0/3594563 conn(0x7f92cc06c6c0 msgr2=0x7f92cc06cac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:22.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.451+0000 7f92d2e80700 1 -- 192.168.123.100:0/3594563 shutdown_connections 2026-03-10T12:46:22.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.451+0000 7f92d2e80700 1 -- 192.168.123.100:0/3594563 wait complete. 2026-03-10T12:46:22.451 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.451+0000 7f92d2e80700 1 Processor -- start 2026-03-10T12:46:22.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.452+0000 7f92d2e80700 1 -- start start 2026-03-10T12:46:22.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.452+0000 7f92d2e80700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f92cc071b60 0x7f92cc1195b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:22.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.452+0000 7f92d2e80700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f92cc114650 0x7f92cc114ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:22.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.452+0000 7f92d167d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7f92cc114650 0x7f92cc114ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:22.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.452+0000 7f92d167d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f92cc114650 0x7f92cc114ac0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:43030/0 (socket says 192.168.123.100:43030) 2026-03-10T12:46:22.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.452+0000 7f92d167d700 1 -- 192.168.123.100:0/2910464916 learned_addr learned my addr 192.168.123.100:0/2910464916 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:22.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.452+0000 7f92d1e7e700 1 --2- 192.168.123.100:0/2910464916 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f92cc071b60 0x7f92cc1195b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:22.452 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.452+0000 7f92d2e80700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f92cc115000 con 0x7f92cc114650 2026-03-10T12:46:22.453 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.452+0000 7f92d2e80700 1 -- 192.168.123.100:0/2910464916 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f92cc115170 con 0x7f92cc071b60 2026-03-10T12:46:22.453 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.453+0000 7f92d167d700 1 -- 192.168.123.100:0/2910464916 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f92cc071b60 msgr2=0x7f92cc1195b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T12:46:22.453 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.453+0000 7f92d167d700 1 --2- 192.168.123.100:0/2910464916 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f92cc071b60 0x7f92cc1195b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:22.453 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.453+0000 7f92d167d700 1 -- 192.168.123.100:0/2910464916 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f92c80097e0 con 0x7f92cc114650 2026-03-10T12:46:22.453 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.453+0000 7f92d167d700 1 --2- 192.168.123.100:0/2910464916 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f92cc114650 0x7f92cc114ac0 secure :-1 s=READY pgs=187 cs=0 l=1 rev1=1 crypto rx=0x7f92c400c390 tx=0x7f92c400c6a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:22.453 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.453+0000 7f92c2ffd700 1 -- 192.168.123.100:0/2910464916 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f92c400e030 con 0x7f92cc114650 2026-03-10T12:46:22.454 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.453+0000 7f92d2e80700 1 -- 192.168.123.100:0/2910464916 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f92cc1153f0 con 0x7f92cc114650 2026-03-10T12:46:22.454 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.453+0000 7f92d2e80700 1 -- 192.168.123.100:0/2910464916 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f92cc1b7ca0 con 0x7f92cc114650 2026-03-10T12:46:22.454 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.454+0000 7f92c2ffd700 1 -- 192.168.123.100:0/2910464916 <== mon.0 v2:192.168.123.100:3300/0 2 
==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f92c400f040 con 0x7f92cc114650 2026-03-10T12:46:22.454 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.454+0000 7f92c2ffd700 1 -- 192.168.123.100:0/2910464916 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f92c4014650 con 0x7f92cc114650 2026-03-10T12:46:22.455 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.454+0000 7f92d2e80700 1 -- 192.168.123.100:0/2910464916 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f92b0005320 con 0x7f92cc114650 2026-03-10T12:46:22.455 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.455+0000 7f92c2ffd700 1 -- 192.168.123.100:0/2910464916 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f92c4009110 con 0x7f92cc114650 2026-03-10T12:46:22.456 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.456+0000 7f92c2ffd700 1 --2- 192.168.123.100:0/2910464916 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f92b8077910 0x7f92b8079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:22.456 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.456+0000 7f92c2ffd700 1 -- 192.168.123.100:0/2910464916 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f92c4099ad0 con 0x7f92cc114650 2026-03-10T12:46:22.456 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.456+0000 7f92d1e7e700 1 --2- 192.168.123.100:0/2910464916 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f92b8077910 0x7f92b8079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:22.458 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.458+0000 7f92d1e7e700 1 --2- 192.168.123.100:0/2910464916 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f92b8077910 0x7f92b8079dc0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f92c8000c00 tx=0x7f92c8019040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:22.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.459+0000 7f92c2ffd700 1 -- 192.168.123.100:0/2910464916 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f92c4062250 con 0x7f92cc114650 2026-03-10T12:46:22.600 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.600+0000 7f92d2e80700 1 -- 192.168.123.100:0/2910464916 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 17, "format": "json"} v 0) v1 -- 0x7f92b0005190 con 0x7f92cc114650 2026-03-10T12:46:22.601 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.601+0000 7f92c2ffd700 1 -- 192.168.123.100:0/2910464916 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 17, "format": "json"}]=0 dumped fsmap epoch 17 v38) v1 ==== 107+0+4939 (secure 0 0 0) 0x7f92c40093c0 con 0x7f92cc114650 2026-03-10T12:46:22.601 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:22.601 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":17,"btime":"2026-03-10T12:42:40:906205+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14490,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.100:6829/2948081127","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":2948081127},{"type":"v1","addr":"192.168.123.100:6829","nonce":2948081127}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":17,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:42:40.906196+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0,1],"up":{"mds_0":24313,"mds_1":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm00.lnokoe","rank":1,"incarnation":6,"state":"up:stopping","state_seq":3,"addr":"192.168.123.100:6827/2640363946","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2640363946},{"type":"v1","addr":"192.168.123.100:6827","nonce":2640363946}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}},"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24313,"qdb_cluster":[24313]},"id":1}]} 2026-03-10T12:46:22.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.604+0000 7f92d2e80700 1 -- 192.168.123.100:0/2910464916 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f92b8077910 msgr2=0x7f92b8079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:22.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.604+0000 7f92d2e80700 1 --2- 192.168.123.100:0/2910464916 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f92b8077910 0x7f92b8079dc0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f92c8000c00 tx=0x7f92c8019040 comp rx=0 tx=0).stop 2026-03-10T12:46:22.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.604+0000 7f92d2e80700 1 -- 192.168.123.100:0/2910464916 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f92cc114650 msgr2=0x7f92cc114ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:22.604 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.604+0000 7f92d2e80700 1 --2- 192.168.123.100:0/2910464916 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f92cc114650 0x7f92cc114ac0 secure :-1 s=READY pgs=187 cs=0 l=1 rev1=1 crypto rx=0x7f92c400c390 tx=0x7f92c400c6a0 comp rx=0 tx=0).stop 2026-03-10T12:46:22.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.604+0000 7f92d2e80700 1 -- 192.168.123.100:0/2910464916 shutdown_connections 2026-03-10T12:46:22.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.604+0000 7f92d2e80700 1 --2- 192.168.123.100:0/2910464916 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f92b8077910 0x7f92b8079dc0 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:22.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.604+0000 7f92d2e80700 1 --2- 192.168.123.100:0/2910464916 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f92cc071b60 0x7f92cc1195b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:22.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.604+0000 7f92d2e80700 1 --2- 192.168.123.100:0/2910464916 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f92cc114650 0x7f92cc114ac0 unknown :-1 s=CLOSED pgs=187 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:22.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.604+0000 7f92d2e80700 1 -- 192.168.123.100:0/2910464916 >> 192.168.123.100:0/2910464916 conn(0x7f92cc06c6c0 msgr2=0x7f92cc06f5b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:22.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.604+0000 7f92d2e80700 1 -- 192.168.123.100:0/2910464916 shutdown_connections 2026-03-10T12:46:22.604 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:22.604+0000 7f92d2e80700 1 -- 
192.168.123.100:0/2910464916 wait complete. 2026-03-10T12:46:22.605 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 17 2026-03-10T12:46:22.670 DEBUG:tasks.fs:max_mds reduced in epoch 17 2026-03-10T12:46:22.670 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 18 2026-03-10T12:46:22.821 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:23.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.084+0000 7f52e25f9700 1 -- 192.168.123.100:0/3673214185 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52dc1013c0 msgr2=0x7f52dc101790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:23.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.084+0000 7f52e25f9700 1 --2- 192.168.123.100:0/3673214185 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52dc1013c0 0x7f52dc101790 secure :-1 s=READY pgs=188 cs=0 l=1 rev1=1 crypto rx=0x7f52c4009b50 tx=0x7f52c4009e60 comp rx=0 tx=0).stop 2026-03-10T12:46:23.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.085+0000 7f52e25f9700 1 -- 192.168.123.100:0/3673214185 shutdown_connections 2026-03-10T12:46:23.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.085+0000 7f52e25f9700 1 --2- 192.168.123.100:0/3673214185 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f52dc068490 0x7f52dc068900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:23.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.085+0000 7f52e25f9700 1 --2- 192.168.123.100:0/3673214185 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52dc1013c0 0x7f52dc101790 unknown :-1 s=CLOSED pgs=188 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T12:46:23.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.085+0000 7f52e25f9700 1 -- 192.168.123.100:0/3673214185 >> 192.168.123.100:0/3673214185 conn(0x7f52dc0754a0 msgr2=0x7f52dc0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:23.085 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.085+0000 7f52e25f9700 1 -- 192.168.123.100:0/3673214185 shutdown_connections 2026-03-10T12:46:23.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.085+0000 7f52e25f9700 1 -- 192.168.123.100:0/3673214185 wait complete. 2026-03-10T12:46:23.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.086+0000 7f52e25f9700 1 Processor -- start 2026-03-10T12:46:23.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.086+0000 7f52e25f9700 1 -- start start 2026-03-10T12:46:23.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.086+0000 7f52e25f9700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52dc068490 0x7f52dc198390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:23.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.086+0000 7f52e25f9700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f52dc1013c0 0x7f52dc1988d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:23.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.086+0000 7f52e25f9700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f52dc198fb0 con 0x7f52dc068490 2026-03-10T12:46:23.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.086+0000 7f52e25f9700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f52dc19ccf0 con 0x7f52dc1013c0 2026-03-10T12:46:23.086 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.086+0000 7f52dbfff700 1 
--2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52dc068490 0x7f52dc198390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:23.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.086+0000 7f52dbfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52dc068490 0x7f52dc198390 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:43054/0 (socket says 192.168.123.100:43054) 2026-03-10T12:46:23.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.086+0000 7f52dbfff700 1 -- 192.168.123.100:0/3561829767 learned_addr learned my addr 192.168.123.100:0/3561829767 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:23.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.087+0000 7f52db7fe700 1 --2- 192.168.123.100:0/3561829767 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f52dc1013c0 0x7f52dc1988d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:23.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.087+0000 7f52dbfff700 1 -- 192.168.123.100:0/3561829767 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f52dc1013c0 msgr2=0x7f52dc1988d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:23.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.087+0000 7f52dbfff700 1 --2- 192.168.123.100:0/3561829767 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f52dc1013c0 0x7f52dc1988d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:23.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.087+0000 7f52dbfff700 1 -- 
192.168.123.100:0/3561829767 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f52c40097e0 con 0x7f52dc068490 2026-03-10T12:46:23.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.087+0000 7f52db7fe700 1 --2- 192.168.123.100:0/3561829767 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f52dc1013c0 0x7f52dc1988d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:46:23.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.087+0000 7f52dbfff700 1 --2- 192.168.123.100:0/3561829767 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52dc068490 0x7f52dc198390 secure :-1 s=READY pgs=189 cs=0 l=1 rev1=1 crypto rx=0x7f52c4006010 tx=0x7f52c4005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:23.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.087+0000 7f52d97fa700 1 -- 192.168.123.100:0/3561829767 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f52c401d070 con 0x7f52dc068490 2026-03-10T12:46:23.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.087+0000 7f52e25f9700 1 -- 192.168.123.100:0/3561829767 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f52dc19cf70 con 0x7f52dc068490 2026-03-10T12:46:23.087 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.087+0000 7f52e25f9700 1 -- 192.168.123.100:0/3561829767 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f52dc19d460 con 0x7f52dc068490 2026-03-10T12:46:23.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.089+0000 7f52d97fa700 1 -- 192.168.123.100:0/3561829767 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f52c400bd20 con 0x7f52dc068490 
2026-03-10T12:46:23.089 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.089+0000 7f52d97fa700 1 -- 192.168.123.100:0/3561829767 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f52c400f650 con 0x7f52dc068490 2026-03-10T12:46:23.091 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.089+0000 7f52e25f9700 1 -- 192.168.123.100:0/3561829767 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f52bc005320 con 0x7f52dc068490 2026-03-10T12:46:23.092 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.092+0000 7f52d97fa700 1 -- 192.168.123.100:0/3561829767 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f52c400f870 con 0x7f52dc068490 2026-03-10T12:46:23.092 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.092+0000 7f52d97fa700 1 --2- 192.168.123.100:0/3561829767 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f52c8077790 0x7f52c8079c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:23.093 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.093+0000 7f52db7fe700 1 --2- 192.168.123.100:0/3561829767 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f52c8077790 0x7f52c8079c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:23.093 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.093+0000 7f52db7fe700 1 --2- 192.168.123.100:0/3561829767 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f52c8077790 0x7f52c8079c40 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f52dc199960 tx=0x7f52cc009450 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:46:23.093 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.093+0000 7f52d97fa700 1 -- 192.168.123.100:0/3561829767 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f52c409c1d0 con 0x7f52dc068490 2026-03-10T12:46:23.093 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.093+0000 7f52d97fa700 1 -- 192.168.123.100:0/3561829767 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f52c40224c0 con 0x7f52dc068490 2026-03-10T12:46:23.217 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:22 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/367576776' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-10T12:46:23.217 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:22 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/2910464916' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-10T12:46:23.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:22 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/367576776' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-10T12:46:23.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:22 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/2910464916' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-10T12:46:23.246 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.245+0000 7f52e25f9700 1 -- 192.168.123.100:0/3561829767 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 18, "format": "json"} v 0) v1 -- 0x7f52bc0059f0 con 0x7f52dc068490 2026-03-10T12:46:23.246 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.246+0000 7f52d97fa700 1 -- 192.168.123.100:0/3561829767 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 18, "format": "json"}]=0 dumped fsmap epoch 18 v38) v1 ==== 107+0+4940 (secure 0 0 0) 0x7f52c4064a80 con 0x7f52dc068490 2026-03-10T12:46:23.246 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:23.246 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":18,"btime":"2026-03-10T12:42:50:841904+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14490,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.100:6829/2948081127","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":2948081127},{"type":"v1","addr":"192.168.123.100:6829","nonce":2948081127}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":18,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:42:50.758107+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout 
v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0,1],"up":{"mds_0":24313,"mds_1":24307},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24307":{"gid":24307,"name":"cephfs.vm00.lnokoe","rank":1,"incarnation":6,"state":"up:stopping","state_seq":3,"addr":"192.168.123.100:6827/2640363946","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2640363946},{"type":"v1","addr":"192.168.123.100:6827","nonce":2640363946}]},"join_fscid":1,"export_targets":[0],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}},"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24313,"qdb_cluster":[24313]},"id":1}]} 2026-03-10T12:46:23.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.248+0000 7f52e25f9700 1 -- 
192.168.123.100:0/3561829767 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f52c8077790 msgr2=0x7f52c8079c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:23.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.249+0000 7f52e25f9700 1 --2- 192.168.123.100:0/3561829767 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f52c8077790 0x7f52c8079c40 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f52dc199960 tx=0x7f52cc009450 comp rx=0 tx=0).stop 2026-03-10T12:46:23.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.249+0000 7f52e25f9700 1 -- 192.168.123.100:0/3561829767 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52dc068490 msgr2=0x7f52dc198390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:23.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.249+0000 7f52e25f9700 1 --2- 192.168.123.100:0/3561829767 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52dc068490 0x7f52dc198390 secure :-1 s=READY pgs=189 cs=0 l=1 rev1=1 crypto rx=0x7f52c4006010 tx=0x7f52c4005e70 comp rx=0 tx=0).stop 2026-03-10T12:46:23.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.249+0000 7f52e25f9700 1 -- 192.168.123.100:0/3561829767 shutdown_connections 2026-03-10T12:46:23.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.249+0000 7f52e25f9700 1 --2- 192.168.123.100:0/3561829767 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f52c8077790 0x7f52c8079c40 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:23.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.249+0000 7f52e25f9700 1 --2- 192.168.123.100:0/3561829767 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52dc068490 0x7f52dc198390 unknown :-1 s=CLOSED pgs=189 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T12:46:23.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.249+0000 7f52e25f9700 1 --2- 192.168.123.100:0/3561829767 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f52dc1013c0 0x7f52dc1988d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:23.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.249+0000 7f52e25f9700 1 -- 192.168.123.100:0/3561829767 >> 192.168.123.100:0/3561829767 conn(0x7f52dc0754a0 msgr2=0x7f52dc0fddd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:23.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.249+0000 7f52e25f9700 1 -- 192.168.123.100:0/3561829767 shutdown_connections 2026-03-10T12:46:23.249 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.250+0000 7f52e25f9700 1 -- 192.168.123.100:0/3561829767 wait complete. 2026-03-10T12:46:23.250 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 18 2026-03-10T12:46:23.362 DEBUG:tasks.fs:max_mds reduced in epoch 18 2026-03-10T12:46:23.363 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 19 2026-03-10T12:46:23.525 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:23.786 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.785+0000 7f975b171700 1 -- 192.168.123.100:0/3479676770 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9754073a00 msgr2=0x7f975410c820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:23.786 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.785+0000 7f975b171700 1 --2- 192.168.123.100:0/3479676770 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9754073a00 0x7f975410c820 secure :-1 s=READY pgs=190 cs=0 l=1 
rev1=1 crypto rx=0x7f9748009b50 tx=0x7f9748009e60 comp rx=0 tx=0).stop 2026-03-10T12:46:23.786 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.786+0000 7f975b171700 1 -- 192.168.123.100:0/3479676770 shutdown_connections 2026-03-10T12:46:23.786 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.786+0000 7f975b171700 1 --2- 192.168.123.100:0/3479676770 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9754073a00 0x7f975410c820 unknown :-1 s=CLOSED pgs=190 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:23.786 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.786+0000 7f975b171700 1 --2- 192.168.123.100:0/3479676770 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97540730f0 0x7f97540734c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:23.786 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.786+0000 7f975b171700 1 -- 192.168.123.100:0/3479676770 >> 192.168.123.100:0/3479676770 conn(0x7f97540fc000 msgr2=0x7f97540fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:23.786 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.786+0000 7f975b171700 1 -- 192.168.123.100:0/3479676770 shutdown_connections 2026-03-10T12:46:23.786 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.786+0000 7f975b171700 1 -- 192.168.123.100:0/3479676770 wait complete. 
2026-03-10T12:46:23.786 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.786+0000 7f975b171700 1 Processor -- start 2026-03-10T12:46:23.787 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.787+0000 7f975b171700 1 -- start start 2026-03-10T12:46:23.787 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.787+0000 7f975b171700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97540730f0 0x7f9754198270 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:23.787 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.787+0000 7f975b171700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9754073a00 0x7f97541988e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:23.787 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.787+0000 7f975b171700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f975419cbb0 con 0x7f9754073a00 2026-03-10T12:46:23.787 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.787+0000 7f975b171700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f975419cd20 con 0x7f97540730f0 2026-03-10T12:46:23.787 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.787+0000 7f9753fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9754073a00 0x7f97541988e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:23.787 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.787+0000 7f9753fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9754073a00 0x7f97541988e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:43074/0 (socket says 192.168.123.100:43074) 2026-03-10T12:46:23.787 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.787+0000 7f9753fff700 1 -- 192.168.123.100:0/1853619124 learned_addr learned my addr 192.168.123.100:0/1853619124 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:23.787 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.787+0000 7f9758f0d700 1 --2- 192.168.123.100:0/1853619124 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97540730f0 0x7f9754198270 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:23.787 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.787+0000 7f9753fff700 1 -- 192.168.123.100:0/1853619124 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97540730f0 msgr2=0x7f9754198270 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:23.788 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.788+0000 7f9753fff700 1 --2- 192.168.123.100:0/1853619124 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97540730f0 0x7f9754198270 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:23.788 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.788+0000 7f9753fff700 1 -- 192.168.123.100:0/1853619124 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f97480097e0 con 0x7f9754073a00 2026-03-10T12:46:23.788 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.788+0000 7f9753fff700 1 --2- 192.168.123.100:0/1853619124 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9754073a00 0x7f97541988e0 secure :-1 s=READY pgs=191 cs=0 l=1 rev1=1 crypto rx=0x7f9748005f50 tx=0x7f9748004ef0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:46:23.789 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.788+0000 7f9751ffb700 1 -- 192.168.123.100:0/1853619124 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f974801d070 con 0x7f9754073a00 2026-03-10T12:46:23.789 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.788+0000 7f9751ffb700 1 -- 192.168.123.100:0/1853619124 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9748022470 con 0x7f9754073a00 2026-03-10T12:46:23.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.788+0000 7f9751ffb700 1 -- 192.168.123.100:0/1853619124 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f974800f7d0 con 0x7f9754073a00 2026-03-10T12:46:23.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.788+0000 7f975b171700 1 -- 192.168.123.100:0/1853619124 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f975419d000 con 0x7f9754073a00 2026-03-10T12:46:23.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.788+0000 7f975b171700 1 -- 192.168.123.100:0/1853619124 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f975419d550 con 0x7f9754073a00 2026-03-10T12:46:23.790 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.788+0000 7f9758f0d700 1 --2- 192.168.123.100:0/1853619124 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97540730f0 0x7f9754198270 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T12:46:23.792 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.789+0000 7f975b171700 1 -- 192.168.123.100:0/1853619124 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9754109f20 con 0x7f9754073a00 2026-03-10T12:46:23.793 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.792+0000 7f9751ffb700 1 -- 192.168.123.100:0/1853619124 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9748022ac0 con 0x7f9754073a00 2026-03-10T12:46:23.793 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.793+0000 7f9751ffb700 1 --2- 192.168.123.100:0/1853619124 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f973c0779e0 0x7f973c079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:23.793 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.793+0000 7f9751ffb700 1 -- 192.168.123.100:0/1853619124 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f974809bbb0 con 0x7f9754073a00 2026-03-10T12:46:23.793 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.793+0000 7f9751ffb700 1 -- 192.168.123.100:0/1853619124 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f97480cba90 con 0x7f9754073a00 2026-03-10T12:46:23.793 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.793+0000 7f9758f0d700 1 --2- 192.168.123.100:0/1853619124 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f973c0779e0 0x7f973c079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:23.793 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.793+0000 7f9758f0d700 1 
--2- 192.168.123.100:0/1853619124 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f973c0779e0 0x7f973c079e90 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f9744009dd0 tx=0x7f9744009450 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:23.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:23 vm00.local ceph-mon[103263]: pgmap v263: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:23.933 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:23 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/3561829767' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-10T12:46:23.942 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.942+0000 7f975b171700 1 -- 192.168.123.100:0/1853619124 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 19, "format": "json"} v 0) v1 -- 0x7f97540fd660 con 0x7f9754073a00 2026-03-10T12:46:23.943 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.943+0000 7f9751ffb700 1 -- 192.168.123.100:0/1853619124 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 19, "format": "json"}]=0 dumped fsmap epoch 19 v38) v1 ==== 107+0+4140 (secure 0 0 0) 0x7f9748064470 con 0x7f9754073a00 2026-03-10T12:46:23.943 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:23.943 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":19,"btime":"2026-03-10T12:43:15:836359+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14490,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.100:6829/2948081127","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":2948081127},{"type":"v1","addr":"192.168.123.100:6829","nonce":2948081127}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":19,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:15.758277+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24313},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24313,"qdb_cluster":[24313]},"id":1}]} 2026-03-10T12:46:23.945 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.945+0000 7f975b171700 1 -- 192.168.123.100:0/1853619124 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f973c0779e0 msgr2=0x7f973c079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:23.945 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.945+0000 7f975b171700 1 --2- 192.168.123.100:0/1853619124 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f973c0779e0 0x7f973c079e90 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f9744009dd0 tx=0x7f9744009450 comp rx=0 tx=0).stop 2026-03-10T12:46:23.945 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.945+0000 7f975b171700 1 -- 192.168.123.100:0/1853619124 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9754073a00 msgr2=0x7f97541988e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:23.945 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.945+0000 7f975b171700 1 --2- 192.168.123.100:0/1853619124 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9754073a00 0x7f97541988e0 secure :-1 s=READY pgs=191 cs=0 l=1 rev1=1 crypto rx=0x7f9748005f50 tx=0x7f9748004ef0 comp rx=0 tx=0).stop 2026-03-10T12:46:23.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.945+0000 7f975b171700 1 -- 192.168.123.100:0/1853619124 shutdown_connections 2026-03-10T12:46:23.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.946+0000 7f975b171700 1 --2- 192.168.123.100:0/1853619124 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f973c0779e0 0x7f973c079e90 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:46:23.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.946+0000 7f975b171700 1 --2- 192.168.123.100:0/1853619124 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f97540730f0 0x7f9754198270 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:23.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.946+0000 7f975b171700 1 --2- 192.168.123.100:0/1853619124 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f9754073a00 0x7f97541988e0 unknown :-1 s=CLOSED pgs=191 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:23.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.946+0000 7f975b171700 1 -- 192.168.123.100:0/1853619124 >> 192.168.123.100:0/1853619124 conn(0x7f97540fc000 msgr2=0x7f9754107060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:23.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.946+0000 7f975b171700 1 -- 192.168.123.100:0/1853619124 shutdown_connections 2026-03-10T12:46:23.946 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:23.946+0000 7f975b171700 1 -- 192.168.123.100:0/1853619124 wait complete. 
2026-03-10T12:46:23.947 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 19 2026-03-10T12:46:24.008 DEBUG:tasks.fs:max_mds reduced in epoch 19 2026-03-10T12:46:24.008 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 20 2026-03-10T12:46:24.151 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:24.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:23 vm07.local ceph-mon[93622]: pgmap v263: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:24.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:23 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/3561829767' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-10T12:46:24.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.399+0000 7f68e25bd700 1 -- 192.168.123.100:0/3059279589 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f68dc069000 msgr2=0x7f68dc1051e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:24.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.399+0000 7f68e25bd700 1 --2- 192.168.123.100:0/3059279589 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f68dc069000 0x7f68dc1051e0 secure :-1 s=READY pgs=192 cs=0 l=1 rev1=1 crypto rx=0x7f68d0009b00 tx=0x7f68d0009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:24.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.400+0000 7f68e25bd700 1 -- 192.168.123.100:0/3059279589 shutdown_connections 2026-03-10T12:46:24.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.400+0000 7f68e25bd700 1 --2- 192.168.123.100:0/3059279589 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f68dc069000 
0x7f68dc1051e0 unknown :-1 s=CLOSED pgs=192 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:24.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.400+0000 7f68e25bd700 1 --2- 192.168.123.100:0/3059279589 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68dc0686f0 0x7f68dc068ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:24.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.400+0000 7f68e25bd700 1 -- 192.168.123.100:0/3059279589 >> 192.168.123.100:0/3059279589 conn(0x7f68dc0754a0 msgr2=0x7f68dc0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:24.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.400+0000 7f68e25bd700 1 -- 192.168.123.100:0/3059279589 shutdown_connections 2026-03-10T12:46:24.400 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.400+0000 7f68e25bd700 1 -- 192.168.123.100:0/3059279589 wait complete. 2026-03-10T12:46:24.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.401+0000 7f68e25bd700 1 Processor -- start 2026-03-10T12:46:24.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.401+0000 7f68e25bd700 1 -- start start 2026-03-10T12:46:24.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.401+0000 7f68e25bd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f68dc0686f0 0x7f68dc1940b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:24.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.401+0000 7f68e25bd700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68dc069000 0x7f68dc1945f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:24.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.401+0000 7f68e25bd700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7f68dc194cd0 con 0x7f68dc0686f0 2026-03-10T12:46:24.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.401+0000 7f68e25bd700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f68dc198a60 con 0x7f68dc069000 2026-03-10T12:46:24.401 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.401+0000 7f68db7fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68dc069000 0x7f68dc1945f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:24.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.401+0000 7f68db7fe700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68dc069000 0x7f68dc1945f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:49096/0 (socket says 192.168.123.100:49096) 2026-03-10T12:46:24.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.401+0000 7f68db7fe700 1 -- 192.168.123.100:0/3671680857 learned_addr learned my addr 192.168.123.100:0/3671680857 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:24.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.401+0000 7f68db7fe700 1 -- 192.168.123.100:0/3671680857 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f68dc0686f0 msgr2=0x7f68dc1940b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:24.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.401+0000 7f68db7fe700 1 --2- 192.168.123.100:0/3671680857 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f68dc0686f0 0x7f68dc1940b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:24.402 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.401+0000 7f68db7fe700 1 -- 192.168.123.100:0/3671680857 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f68d00097e0 con 0x7f68dc069000 2026-03-10T12:46:24.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.402+0000 7f68db7fe700 1 --2- 192.168.123.100:0/3671680857 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68dc069000 0x7f68dc1945f0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f68d0009ad0 tx=0x7f68d00052e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:24.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.402+0000 7f68d97fa700 1 -- 192.168.123.100:0/3671680857 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f68d001d070 con 0x7f68dc069000 2026-03-10T12:46:24.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.402+0000 7f68e25bd700 1 -- 192.168.123.100:0/3671680857 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f68dc198ce0 con 0x7f68dc069000 2026-03-10T12:46:24.402 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.402+0000 7f68e25bd700 1 -- 192.168.123.100:0/3671680857 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f68dc1991d0 con 0x7f68dc069000 2026-03-10T12:46:24.403 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.403+0000 7f68d97fa700 1 -- 192.168.123.100:0/3671680857 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f68d000bc50 con 0x7f68dc069000 2026-03-10T12:46:24.403 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.403+0000 7f68c2ffd700 1 -- 192.168.123.100:0/3671680857 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 
-- 0x7f68c8005320 con 0x7f68dc069000 2026-03-10T12:46:24.403 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.403+0000 7f68d97fa700 1 -- 192.168.123.100:0/3671680857 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f68d000f670 con 0x7f68dc069000 2026-03-10T12:46:24.405 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.404+0000 7f68d97fa700 1 -- 192.168.123.100:0/3671680857 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f68d000f8b0 con 0x7f68dc069000 2026-03-10T12:46:24.405 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.405+0000 7f68d97fa700 1 --2- 192.168.123.100:0/3671680857 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f68c407bcd0 0x7f68c407e180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:24.405 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.405+0000 7f68dbfff700 1 --2- 192.168.123.100:0/3671680857 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f68c407bcd0 0x7f68c407e180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:24.405 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.405+0000 7f68d97fa700 1 -- 192.168.123.100:0/3671680857 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f68d009b930 con 0x7f68dc069000 2026-03-10T12:46:24.405 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.405+0000 7f68dbfff700 1 --2- 192.168.123.100:0/3671680857 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f68c407bcd0 0x7f68c407e180 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f68cc0097b0 tx=0x7f68cc006d20 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T12:46:24.407 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.407+0000 7f68d97fa700 1 -- 192.168.123.100:0/3671680857 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f68d0064450 con 0x7f68dc069000 2026-03-10T12:46:24.550 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.550+0000 7f68c2ffd700 1 -- 192.168.123.100:0/3671680857 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 20, "format": "json"} v 0) v1 -- 0x7f68c8005190 con 0x7f68dc069000 2026-03-10T12:46:24.551 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.551+0000 7f68d97fa700 1 -- 192.168.123.100:0/3671680857 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 20, "format": "json"}]=0 dumped fsmap epoch 20 v38) v1 ==== 107+0+4923 (secure 0 0 0) 0x7f68d0063ba0 con 0x7f68dc069000 2026-03-10T12:46:24.551 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:24.551 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":20,"btime":"2026-03-10T12:43:16:844095+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14490,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.100:6829/2948081127","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":2948081127},{"type":"v1","addr":"192.168.123.100:6829","nonce":2948081127}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11},{"gid":34358,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/1069803323","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":1069803323},{"type":"v1","addr":"192.168.123.100:6827","nonce":1069803323}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":20}],"filesystems":[{"mdsmap":{"epoch":19,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:15.758277+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24313},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24313,"qdb_cluster":[24313]},"id":1}]} 2026-03-10T12:46:24.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.554+0000 7f68c2ffd700 1 -- 192.168.123.100:0/3671680857 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f68c407bcd0 msgr2=0x7f68c407e180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:24.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.554+0000 7f68c2ffd700 1 --2- 192.168.123.100:0/3671680857 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f68c407bcd0 0x7f68c407e180 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f68cc0097b0 tx=0x7f68cc006d20 comp rx=0 tx=0).stop 2026-03-10T12:46:24.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.554+0000 7f68c2ffd700 1 -- 192.168.123.100:0/3671680857 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68dc069000 msgr2=0x7f68dc1945f0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:24.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.554+0000 7f68c2ffd700 1 --2- 192.168.123.100:0/3671680857 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68dc069000 0x7f68dc1945f0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f68d0009ad0 tx=0x7f68d00052e0 comp rx=0 tx=0).stop 2026-03-10T12:46:24.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.554+0000 7f68c2ffd700 1 -- 192.168.123.100:0/3671680857 shutdown_connections 2026-03-10T12:46:24.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.554+0000 7f68c2ffd700 1 --2- 192.168.123.100:0/3671680857 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f68c407bcd0 0x7f68c407e180 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:24.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.554+0000 7f68c2ffd700 1 --2- 192.168.123.100:0/3671680857 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f68dc0686f0 0x7f68dc1940b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:24.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.554+0000 7f68c2ffd700 1 --2- 192.168.123.100:0/3671680857 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f68dc069000 0x7f68dc1945f0 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:24.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.554+0000 7f68c2ffd700 1 -- 192.168.123.100:0/3671680857 >> 192.168.123.100:0/3671680857 conn(0x7f68dc0754a0 msgr2=0x7f68dc0ff750 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:24.554 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.554+0000 7f68c2ffd700 1 -- 192.168.123.100:0/3671680857 shutdown_connections 2026-03-10T12:46:24.554 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:24.554+0000 7f68c2ffd700 1 -- 192.168.123.100:0/3671680857 wait complete. 2026-03-10T12:46:24.555 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 20 2026-03-10T12:46:24.616 DEBUG:tasks.fs:max_mds reduced in epoch 20 2026-03-10T12:46:24.616 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 21 2026-03-10T12:46:24.770 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:25.035 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.034+0000 7f5d50550700 1 -- 192.168.123.100:0/1169143844 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5d48106560 msgr2=0x7f5d48106930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:25.035 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.034+0000 7f5d50550700 1 --2- 192.168.123.100:0/1169143844 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5d48106560 0x7f5d48106930 secure :-1 s=READY pgs=193 cs=0 l=1 rev1=1 crypto rx=0x7f5d38009b00 tx=0x7f5d38009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:25.035 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.035+0000 7f5d50550700 1 -- 192.168.123.100:0/1169143844 shutdown_connections 2026-03-10T12:46:25.035 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.035+0000 7f5d50550700 1 --2- 192.168.123.100:0/1169143844 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d48100540 0x7f5d481009b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:25.036 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.035+0000 7f5d50550700 1 --2- 192.168.123.100:0/1169143844 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5d48106560 
0x7f5d48106930 unknown :-1 s=CLOSED pgs=193 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:25.036 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.035+0000 7f5d50550700 1 -- 192.168.123.100:0/1169143844 >> 192.168.123.100:0/1169143844 conn(0x7f5d480fbfc0 msgr2=0x7f5d480fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:25.036 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.035+0000 7f5d50550700 1 -- 192.168.123.100:0/1169143844 shutdown_connections 2026-03-10T12:46:25.036 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.035+0000 7f5d50550700 1 -- 192.168.123.100:0/1169143844 wait complete. 2026-03-10T12:46:25.036 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.036+0000 7f5d50550700 1 Processor -- start 2026-03-10T12:46:25.036 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.036+0000 7f5d50550700 1 -- start start 2026-03-10T12:46:25.037 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.037+0000 7f5d50550700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5d48100540 0x7f5d48193f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:25.037 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.037+0000 7f5d50550700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d48106560 0x7f5d48194440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:25.037 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.037+0000 7f5d50550700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d48194a90 con 0x7f5d48100540 2026-03-10T12:46:25.037 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.037+0000 7f5d50550700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d48194bd0 con 0x7f5d48106560 2026-03-10T12:46:25.037 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.037+0000 7f5d4daeb700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d48106560 0x7f5d48194440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:25.037 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.037+0000 7f5d4daeb700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d48106560 0x7f5d48194440 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:49112/0 (socket says 192.168.123.100:49112) 2026-03-10T12:46:25.037 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.037+0000 7f5d4daeb700 1 -- 192.168.123.100:0/2967621500 learned_addr learned my addr 192.168.123.100:0/2967621500 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:25.037 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.037+0000 7f5d4e2ec700 1 --2- 192.168.123.100:0/2967621500 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5d48100540 0x7f5d48193f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:25.037 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.037+0000 7f5d4e2ec700 1 -- 192.168.123.100:0/2967621500 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d48106560 msgr2=0x7f5d48194440 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:25.038 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.037+0000 7f5d4e2ec700 1 --2- 192.168.123.100:0/2967621500 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d48106560 0x7f5d48194440 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:25.038 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.037+0000 7f5d4e2ec700 1 -- 192.168.123.100:0/2967621500 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5d380097e0 con 0x7f5d48100540 2026-03-10T12:46:25.038 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.038+0000 7f5d4daeb700 1 --2- 192.168.123.100:0/2967621500 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d48106560 0x7f5d48194440 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T12:46:25.038 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.038+0000 7f5d4e2ec700 1 --2- 192.168.123.100:0/2967621500 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5d48100540 0x7f5d48193f00 secure :-1 s=READY pgs=194 cs=0 l=1 rev1=1 crypto rx=0x7f5d38009fd0 tx=0x7f5d38004930 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:25.038 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.038+0000 7f5d3f7fe700 1 -- 192.168.123.100:0/2967621500 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d3801d070 con 0x7f5d48100540 2026-03-10T12:46:25.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.038+0000 7f5d3f7fe700 1 -- 192.168.123.100:0/2967621500 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5d38004b90 con 0x7f5d48100540 2026-03-10T12:46:25.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.038+0000 7f5d50550700 1 -- 192.168.123.100:0/2967621500 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5d481989c0 con 0x7f5d48100540 2026-03-10T12:46:25.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.038+0000 7f5d3f7fe700 1 -- 192.168.123.100:0/2967621500 <== mon.0 v2:192.168.123.100:3300/0 3 ==== 
mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d3800f670 con 0x7f5d48100540 2026-03-10T12:46:25.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.038+0000 7f5d50550700 1 -- 192.168.123.100:0/2967621500 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5d48198eb0 con 0x7f5d48100540 2026-03-10T12:46:25.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.039+0000 7f5d3f7fe700 1 -- 192.168.123.100:0/2967621500 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5d3800f7d0 con 0x7f5d48100540 2026-03-10T12:46:25.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.040+0000 7f5d3f7fe700 1 --2- 192.168.123.100:0/2967621500 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f5d340036f0 0x7f5d34005ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:25.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.040+0000 7f5d3f7fe700 1 -- 192.168.123.100:0/2967621500 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f5d3809c100 con 0x7f5d48100540 2026-03-10T12:46:25.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.040+0000 7f5d4daeb700 1 --2- 192.168.123.100:0/2967621500 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f5d340036f0 0x7f5d34005ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:25.040 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.040+0000 7f5d50550700 1 -- 192.168.123.100:0/2967621500 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5d4804ea50 con 0x7f5d48100540 2026-03-10T12:46:25.043 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.044+0000 7f5d4daeb700 1 --2- 192.168.123.100:0/2967621500 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f5d340036f0 0x7f5d34005ba0 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f5d48195520 tx=0x7f5d44005f90 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:25.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.044+0000 7f5d3f7fe700 1 -- 192.168.123.100:0/2967621500 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5d380649b0 con 0x7f5d48100540 2026-03-10T12:46:25.193 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:24 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/1853619124' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-10T12:46:25.193 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:24 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/3671680857' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-10T12:46:25.193 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.191+0000 7f5d50550700 1 -- 192.168.123.100:0/2967621500 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 21, "format": "json"} v 0) v1 -- 0x7f5d48195260 con 0x7f5d48100540 2026-03-10T12:46:25.194 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.194+0000 7f5d3f7fe700 1 -- 192.168.123.100:0/2967621500 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 21, "format": "json"}]=0 dumped fsmap epoch 21 v38) v1 ==== 107+0+4140 (secure 0 0 0) 0x7f5d38027090 con 0x7f5d48100540 2026-03-10T12:46:25.194 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:25.194 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":21,"btime":"2026-03-10T12:43:23:891658+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14490,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.100:6829/2948081127","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":2948081127},{"type":"v1","addr":"192.168.123.100:6829","nonce":2948081127}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":19,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:15.758277+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout 
v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24313},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24313,"qdb_cluster":[24313]},"id":1}]} 2026-03-10T12:46:25.197 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.197+0000 7f5d50550700 1 -- 192.168.123.100:0/2967621500 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f5d340036f0 msgr2=0x7f5d34005ba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:25.197 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.197+0000 7f5d50550700 1 --2- 192.168.123.100:0/2967621500 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f5d340036f0 0x7f5d34005ba0 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f5d48195520 tx=0x7f5d44005f90 comp rx=0 tx=0).stop 2026-03-10T12:46:25.197 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.197+0000 7f5d50550700 1 -- 192.168.123.100:0/2967621500 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5d48100540 msgr2=0x7f5d48193f00 
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:25.197 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.197+0000 7f5d50550700 1 --2- 192.168.123.100:0/2967621500 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5d48100540 0x7f5d48193f00 secure :-1 s=READY pgs=194 cs=0 l=1 rev1=1 crypto rx=0x7f5d38009fd0 tx=0x7f5d38004930 comp rx=0 tx=0).stop 2026-03-10T12:46:25.197 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.197+0000 7f5d50550700 1 -- 192.168.123.100:0/2967621500 shutdown_connections 2026-03-10T12:46:25.197 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.197+0000 7f5d50550700 1 --2- 192.168.123.100:0/2967621500 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f5d340036f0 0x7f5d34005ba0 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:25.197 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.197+0000 7f5d50550700 1 --2- 192.168.123.100:0/2967621500 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5d48100540 0x7f5d48193f00 unknown :-1 s=CLOSED pgs=194 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:25.197 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.197+0000 7f5d50550700 1 --2- 192.168.123.100:0/2967621500 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5d48106560 0x7f5d48194440 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:25.197 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.197+0000 7f5d50550700 1 -- 192.168.123.100:0/2967621500 >> 192.168.123.100:0/2967621500 conn(0x7f5d480fbfc0 msgr2=0x7f5d480fd8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:25.197 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.197+0000 7f5d50550700 1 -- 192.168.123.100:0/2967621500 shutdown_connections 2026-03-10T12:46:25.197 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.197+0000 7f5d50550700 1 -- 192.168.123.100:0/2967621500 wait complete. 2026-03-10T12:46:25.198 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 21 2026-03-10T12:46:25.236 DEBUG:tasks.fs:max_mds reduced in epoch 21 2026-03-10T12:46:25.237 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 22 2026-03-10T12:46:25.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:24 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/1853619124' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-10T12:46:25.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:24 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/3671680857' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-10T12:46:25.394 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:25.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.642+0000 7fb62cee2700 1 -- 192.168.123.100:0/1914400034 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb628068490 msgr2=0x7fb628068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:25.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.642+0000 7fb62cee2700 1 --2- 192.168.123.100:0/1914400034 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb628068490 0x7fb628068900 secure :-1 s=READY pgs=195 cs=0 l=1 rev1=1 crypto rx=0x7fb618009b00 tx=0x7fb618009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:25.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.643+0000 7fb62cee2700 1 -- 192.168.123.100:0/1914400034 shutdown_connections 2026-03-10T12:46:25.643 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.643+0000 7fb62cee2700 1 --2- 192.168.123.100:0/1914400034 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb628068490 0x7fb628068900 unknown :-1 s=CLOSED pgs=195 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:25.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.643+0000 7fb62cee2700 1 --2- 192.168.123.100:0/1914400034 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb6281066c0 0x7fb628106a90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:25.643 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.643+0000 7fb62cee2700 1 -- 192.168.123.100:0/1914400034 >> 192.168.123.100:0/1914400034 conn(0x7fb6280754a0 msgr2=0x7fb6280758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:25.644 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.643+0000 7fb62cee2700 1 -- 192.168.123.100:0/1914400034 shutdown_connections 2026-03-10T12:46:25.644 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.644+0000 7fb62cee2700 1 -- 192.168.123.100:0/1914400034 wait complete. 
2026-03-10T12:46:25.644 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.644+0000 7fb62cee2700 1 Processor -- start 2026-03-10T12:46:25.644 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.644+0000 7fb62cee2700 1 -- start start 2026-03-10T12:46:25.645 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.645+0000 7fb62cee2700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb628068490 0x7fb6281961e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:25.645 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.645+0000 7fb62659c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb628068490 0x7fb6281961e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:25.645 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.645+0000 7fb62659c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb628068490 0x7fb6281961e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:43130/0 (socket says 192.168.123.100:43130) 2026-03-10T12:46:25.645 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.645+0000 7fb62cee2700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb6281066c0 0x7fb628196720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:25.645 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.645+0000 7fb62cee2700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb628196e00 con 0x7fb628068490 2026-03-10T12:46:25.645 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.645+0000 7fb62cee2700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7fb62819ab90 con 0x7fb6281066c0 2026-03-10T12:46:25.645 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.645+0000 7fb62659c700 1 -- 192.168.123.100:0/1980577054 learned_addr learned my addr 192.168.123.100:0/1980577054 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:25.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.645+0000 7fb625d9b700 1 --2- 192.168.123.100:0/1980577054 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb6281066c0 0x7fb628196720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:25.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.646+0000 7fb62659c700 1 -- 192.168.123.100:0/1980577054 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb6281066c0 msgr2=0x7fb628196720 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:25.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.646+0000 7fb62659c700 1 --2- 192.168.123.100:0/1980577054 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb6281066c0 0x7fb628196720 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:25.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.646+0000 7fb62659c700 1 -- 192.168.123.100:0/1980577054 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb6180097e0 con 0x7fb628068490 2026-03-10T12:46:25.646 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.646+0000 7fb62659c700 1 --2- 192.168.123.100:0/1980577054 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb628068490 0x7fb6281961e0 secure :-1 s=READY pgs=196 cs=0 l=1 rev1=1 crypto rx=0x7fb61000b700 tx=0x7fb61000ba10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:46:25.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.646+0000 7fb61f7fe700 1 -- 192.168.123.100:0/1980577054 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb6100107c0 con 0x7fb628068490 2026-03-10T12:46:25.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.646+0000 7fb61f7fe700 1 -- 192.168.123.100:0/1980577054 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fb610010e00 con 0x7fb628068490 2026-03-10T12:46:25.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.646+0000 7fb61f7fe700 1 -- 192.168.123.100:0/1980577054 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb61000f360 con 0x7fb628068490 2026-03-10T12:46:25.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.646+0000 7fb62cee2700 1 -- 192.168.123.100:0/1980577054 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb62819ae10 con 0x7fb628068490 2026-03-10T12:46:25.647 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.646+0000 7fb62cee2700 1 -- 192.168.123.100:0/1980577054 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb62819b360 con 0x7fb628068490 2026-03-10T12:46:25.651 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.648+0000 7fb61f7fe700 1 -- 192.168.123.100:0/1980577054 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb610017360 con 0x7fb628068490 2026-03-10T12:46:25.651 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.648+0000 7fb62cee2700 1 -- 192.168.123.100:0/1980577054 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb62804ea50 con 0x7fb628068490 2026-03-10T12:46:25.651 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.648+0000 
7fb61f7fe700 1 --2- 192.168.123.100:0/1980577054 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb614077870 0x7fb614079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:25.651 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.648+0000 7fb61f7fe700 1 -- 192.168.123.100:0/1980577054 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fb6100991c0 con 0x7fb628068490 2026-03-10T12:46:25.651 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.649+0000 7fb625d9b700 1 --2- 192.168.123.100:0/1980577054 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb614077870 0x7fb614079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:25.651 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.650+0000 7fb625d9b700 1 --2- 192.168.123.100:0/1980577054 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb614077870 0x7fb614079d20 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fb628197800 tx=0x7fb61800b540 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:25.651 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.651+0000 7fb61f7fe700 1 -- 192.168.123.100:0/1980577054 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb610061940 con 0x7fb628068490 2026-03-10T12:46:25.797 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.797+0000 7fb62cee2700 1 -- 192.168.123.100:0/1980577054 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 22, "format": "json"} v 0) v1 -- 0x7fb628066e40 con 0x7fb628068490 2026-03-10T12:46:25.798 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.798+0000 7fb61f7fe700 1 -- 192.168.123.100:0/1980577054 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 22, "format": "json"}]=0 dumped fsmap epoch 22 v38) v1 ==== 107+0+4991 (secure 0 0 0) 0x7fb610061090 con 0x7fb628068490 2026-03-10T12:46:25.798 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:25.798 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":22,"btime":"2026-03-10T12:43:25:340576+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14490,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.100:6829/2948081127","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":2948081127},{"type":"v1","addr":"192.168.123.100:6829","nonce":2948081127}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":10},{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":22}],"filesystems":[{"mdsmap":{"epoch":19,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:15.758277+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24313},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24313,"qdb_cluster":[24313]},"id":1}]} 2026-03-10T12:46:25.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.800+0000 7fb62cee2700 1 -- 192.168.123.100:0/1980577054 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb614077870 msgr2=0x7fb614079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:25.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.800+0000 7fb62cee2700 1 --2- 192.168.123.100:0/1980577054 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb614077870 0x7fb614079d20 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fb628197800 tx=0x7fb61800b540 comp rx=0 tx=0).stop 2026-03-10T12:46:25.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.800+0000 7fb62cee2700 1 -- 192.168.123.100:0/1980577054 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb628068490 msgr2=0x7fb6281961e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:25.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.800+0000 7fb62cee2700 1 --2- 192.168.123.100:0/1980577054 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb628068490 0x7fb6281961e0 secure :-1 s=READY pgs=196 cs=0 l=1 rev1=1 crypto rx=0x7fb61000b700 tx=0x7fb61000ba10 comp rx=0 tx=0).stop 2026-03-10T12:46:25.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.800+0000 7fb62cee2700 1 -- 192.168.123.100:0/1980577054 shutdown_connections 2026-03-10T12:46:25.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.800+0000 7fb62cee2700 1 --2- 192.168.123.100:0/1980577054 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fb614077870 0x7fb614079d20 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:46:25.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.800+0000 7fb62cee2700 1 --2- 192.168.123.100:0/1980577054 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fb628068490 0x7fb6281961e0 unknown :-1 s=CLOSED pgs=196 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:25.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.800+0000 7fb62cee2700 1 --2- 192.168.123.100:0/1980577054 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fb6281066c0 0x7fb628196720 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:25.800 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.800+0000 7fb62cee2700 1 -- 192.168.123.100:0/1980577054 >> 192.168.123.100:0/1980577054 conn(0x7fb6280754a0 msgr2=0x7fb6280fecb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:25.801 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.800+0000 7fb62cee2700 1 -- 192.168.123.100:0/1980577054 shutdown_connections 2026-03-10T12:46:25.801 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:25.800+0000 7fb62cee2700 1 -- 192.168.123.100:0/1980577054 wait complete. 
2026-03-10T12:46:25.801 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 22 2026-03-10T12:46:25.867 DEBUG:tasks.fs:max_mds reduced in epoch 22 2026-03-10T12:46:25.867 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 23 2026-03-10T12:46:26.017 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:26.052 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:25 vm00.local ceph-mon[103263]: pgmap v264: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:26.052 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:25 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/2967621500' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-10T12:46:26.052 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:25 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/1980577054' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-10T12:46:26.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.256+0000 7fc0cc839700 1 -- 192.168.123.100:0/3919290742 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0c40730f0 msgr2=0x7fc0c40734c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:26.256 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.256+0000 7fc0cc839700 1 --2- 192.168.123.100:0/3919290742 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0c40730f0 0x7fc0c40734c0 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7fc0b0009b00 tx=0x7fc0b0009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:26.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.256+0000 7fc0cc839700 1 -- 192.168.123.100:0/3919290742 shutdown_connections 2026-03-10T12:46:26.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.256+0000 7fc0cc839700 1 --2- 192.168.123.100:0/3919290742 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0c4073a00 0x7fc0c410c820 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:26.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.256+0000 7fc0cc839700 1 --2- 192.168.123.100:0/3919290742 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0c40730f0 0x7fc0c40734c0 unknown :-1 s=CLOSED pgs=197 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:26.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.256+0000 7fc0cc839700 1 -- 192.168.123.100:0/3919290742 >> 192.168.123.100:0/3919290742 conn(0x7fc0c40fc000 msgr2=0x7fc0c40fe410 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:26.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.257+0000 7fc0cc839700 1 -- 192.168.123.100:0/3919290742 shutdown_connections 
2026-03-10T12:46:26.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.257+0000 7fc0cc839700 1 -- 192.168.123.100:0/3919290742 wait complete. 2026-03-10T12:46:26.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.257+0000 7fc0cc839700 1 Processor -- start 2026-03-10T12:46:26.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.257+0000 7fc0cc839700 1 -- start start 2026-03-10T12:46:26.257 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.258+0000 7fc0cc839700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0c40730f0 0x7fc0c41981d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:26.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.258+0000 7fc0cc839700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0c4073a00 0x7fc0c4198840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:26.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.258+0000 7fc0cc839700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0c419cb80 con 0x7fc0c4073a00 2026-03-10T12:46:26.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.258+0000 7fc0cc839700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0c419ccf0 con 0x7fc0c40730f0 2026-03-10T12:46:26.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.258+0000 7fc0c9dd4700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0c4073a00 0x7fc0c4198840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:26.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.258+0000 7fc0c9dd4700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0c4073a00 
0x7fc0c4198840 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:43158/0 (socket says 192.168.123.100:43158) 2026-03-10T12:46:26.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.258+0000 7fc0c9dd4700 1 -- 192.168.123.100:0/3999687345 learned_addr learned my addr 192.168.123.100:0/3999687345 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:26.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.258+0000 7fc0ca5d5700 1 --2- 192.168.123.100:0/3999687345 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0c40730f0 0x7fc0c41981d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:26.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.258+0000 7fc0c9dd4700 1 -- 192.168.123.100:0/3999687345 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0c40730f0 msgr2=0x7fc0c41981d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:26.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.258+0000 7fc0c9dd4700 1 --2- 192.168.123.100:0/3999687345 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0c40730f0 0x7fc0c41981d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:26.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.258+0000 7fc0c9dd4700 1 -- 192.168.123.100:0/3999687345 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc0b00097e0 con 0x7fc0c4073a00 2026-03-10T12:46:26.258 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.258+0000 7fc0c9dd4700 1 --2- 192.168.123.100:0/3999687345 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0c4073a00 0x7fc0c4198840 secure :-1 s=READY 
pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7fc0b800cc60 tx=0x7fc0b800cf70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:26.259 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.259+0000 7fc0bf7fe700 1 -- 192.168.123.100:0/3999687345 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc0b8007960 con 0x7fc0c4073a00 2026-03-10T12:46:26.259 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.259+0000 7fc0cc839700 1 -- 192.168.123.100:0/3999687345 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc0c419cfd0 con 0x7fc0c4073a00 2026-03-10T12:46:26.259 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.259+0000 7fc0cc839700 1 -- 192.168.123.100:0/3999687345 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc0c419d520 con 0x7fc0c4073a00 2026-03-10T12:46:26.260 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.259+0000 7fc0bf7fe700 1 -- 192.168.123.100:0/3999687345 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc0b800f450 con 0x7fc0c4073a00 2026-03-10T12:46:26.260 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.259+0000 7fc0bf7fe700 1 -- 192.168.123.100:0/3999687345 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc0b80186a0 con 0x7fc0c4073a00 2026-03-10T12:46:26.260 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.260+0000 7fc0bf7fe700 1 -- 192.168.123.100:0/3999687345 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc0b8018800 con 0x7fc0c4073a00 2026-03-10T12:46:26.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.261+0000 7fc0bf7fe700 1 --2- 192.168.123.100:0/3999687345 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc0b4077870 
0x7fc0b4079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:26.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.261+0000 7fc0ca5d5700 1 --2- 192.168.123.100:0/3999687345 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc0b4077870 0x7fc0b4079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:26.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.261+0000 7fc0ca5d5700 1 --2- 192.168.123.100:0/3999687345 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc0b4077870 0x7fc0b4079d20 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fc0b0006010 tx=0x7fc0b000b540 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:26.261 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.262+0000 7fc0bf7fe700 1 -- 192.168.123.100:0/3999687345 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fc0b8098e70 con 0x7fc0c4073a00 2026-03-10T12:46:26.262 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.262+0000 7fc0cc839700 1 -- 192.168.123.100:0/3999687345 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc0a8005320 con 0x7fc0c4073a00 2026-03-10T12:46:26.265 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.265+0000 7fc0bf7fe700 1 -- 192.168.123.100:0/3999687345 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc0b809e050 con 0x7fc0c4073a00 2026-03-10T12:46:26.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:25 vm07.local ceph-mon[93622]: pgmap v264: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 
120 GiB avail 2026-03-10T12:46:26.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:25 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/2967621500' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-10T12:46:26.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:25 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/1980577054' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-10T12:46:26.413 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.412+0000 7fc0cc839700 1 -- 192.168.123.100:0/3999687345 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 23, "format": "json"} v 0) v1 -- 0x7fc0a8005190 con 0x7fc0c4073a00 2026-03-10T12:46:26.415 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.415+0000 7fc0bf7fe700 1 -- 192.168.123.100:0/3999687345 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 23, "format": "json"}]=0 dumped fsmap epoch 23 v38) v1 ==== 107+0+4208 (secure 0 0 0) 0x7fc0b8062830 con 0x7fc0c4073a00 2026-03-10T12:46:26.416 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:26.416 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":23,"btime":"2026-03-10T12:43:29:289935+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":22}],"filesystems":[{"mdsmap":{"epoch":19,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:15.758277+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24313},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24313,"qdb_cluster":[24313]},"id":1}]} 2026-03-10T12:46:26.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.418+0000 7fc0cc839700 1 -- 192.168.123.100:0/3999687345 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc0b4077870 msgr2=0x7fc0b4079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:26.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.418+0000 7fc0cc839700 1 --2- 192.168.123.100:0/3999687345 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc0b4077870 0x7fc0b4079d20 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fc0b0006010 tx=0x7fc0b000b540 comp rx=0 tx=0).stop 2026-03-10T12:46:26.418 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.418+0000 7fc0cc839700 1 -- 192.168.123.100:0/3999687345 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0c4073a00 msgr2=0x7fc0c4198840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:26.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.419+0000 7fc0cc839700 1 --2- 192.168.123.100:0/3999687345 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0c4073a00 0x7fc0c4198840 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7fc0b800cc60 tx=0x7fc0b800cf70 comp rx=0 tx=0).stop 2026-03-10T12:46:26.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.419+0000 7fc0cc839700 1 -- 192.168.123.100:0/3999687345 shutdown_connections 2026-03-10T12:46:26.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.419+0000 7fc0cc839700 1 --2- 192.168.123.100:0/3999687345 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fc0b4077870 0x7fc0b4079d20 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:46:26.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.419+0000 7fc0cc839700 1 --2- 192.168.123.100:0/3999687345 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fc0c40730f0 0x7fc0c41981d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:26.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.419+0000 7fc0cc839700 1 --2- 192.168.123.100:0/3999687345 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fc0c4073a00 0x7fc0c4198840 unknown :-1 s=CLOSED pgs=198 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:26.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.419+0000 7fc0cc839700 1 -- 192.168.123.100:0/3999687345 >> 192.168.123.100:0/3999687345 conn(0x7fc0c40fc000 msgr2=0x7fc0c4107060 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:26.419 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.420+0000 7fc0cc839700 1 -- 192.168.123.100:0/3999687345 shutdown_connections 2026-03-10T12:46:26.420 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.420+0000 7fc0cc839700 1 -- 192.168.123.100:0/3999687345 wait complete. 
2026-03-10T12:46:26.421 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 23 2026-03-10T12:46:26.491 DEBUG:tasks.fs:max_mds reduced in epoch 23 2026-03-10T12:46:26.492 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 24 2026-03-10T12:46:26.643 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:26.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.893+0000 7f31769a9700 1 -- 192.168.123.100:0/1134611539 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3170108790 msgr2=0x7f3170108b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:26.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.893+0000 7f31769a9700 1 --2- 192.168.123.100:0/1134611539 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3170108790 0x7f3170108b60 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7f3164009b50 tx=0x7f3164009e60 comp rx=0 tx=0).stop 2026-03-10T12:46:26.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.894+0000 7f31769a9700 1 -- 192.168.123.100:0/1134611539 shutdown_connections 2026-03-10T12:46:26.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.894+0000 7f31769a9700 1 --2- 192.168.123.100:0/1134611539 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3170102790 0x7f3170102c00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:26.894 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.894+0000 7f31769a9700 1 --2- 192.168.123.100:0/1134611539 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3170108790 0x7f3170108b60 unknown :-1 s=CLOSED pgs=199 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:26.895 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.894+0000 7f31769a9700 1 -- 192.168.123.100:0/1134611539 >> 192.168.123.100:0/1134611539 conn(0x7f31700fe2b0 msgr2=0x7f31701006c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:26.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.894+0000 7f31769a9700 1 -- 192.168.123.100:0/1134611539 shutdown_connections 2026-03-10T12:46:26.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.895+0000 7f31769a9700 1 -- 192.168.123.100:0/1134611539 wait complete. 2026-03-10T12:46:26.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.895+0000 7f31769a9700 1 Processor -- start 2026-03-10T12:46:26.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.895+0000 7f31769a9700 1 -- start start 2026-03-10T12:46:26.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.895+0000 7f31769a9700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3170102790 0x7f3170198390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:26.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.895+0000 7f31769a9700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3170108790 0x7f31701988d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:26.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.895+0000 7f31769a9700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3170198fb0 con 0x7f3170108790 2026-03-10T12:46:26.895 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.895+0000 7f31769a9700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f317019cd40 con 0x7f3170102790 2026-03-10T12:46:26.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.895+0000 7f316ffff700 1 --2- >> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3170102790 0x7f3170198390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:26.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.895+0000 7f316ffff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3170102790 0x7f3170198390 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:49184/0 (socket says 192.168.123.100:49184) 2026-03-10T12:46:26.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.895+0000 7f316ffff700 1 -- 192.168.123.100:0/2058718656 learned_addr learned my addr 192.168.123.100:0/2058718656 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:26.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.896+0000 7f316ffff700 1 -- 192.168.123.100:0/2058718656 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3170108790 msgr2=0x7f31701988d0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:46:26.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.896+0000 7f316f7fe700 1 --2- 192.168.123.100:0/2058718656 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3170108790 0x7f31701988d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:26.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.896+0000 7f316ffff700 1 --2- 192.168.123.100:0/2058718656 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3170108790 0x7f31701988d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:26.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.896+0000 7f316ffff700 1 -- 
192.168.123.100:0/2058718656 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f31640097e0 con 0x7f3170102790 2026-03-10T12:46:26.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.896+0000 7f316ffff700 1 --2- 192.168.123.100:0/2058718656 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3170102790 0x7f3170198390 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f31640048f0 tx=0x7f31640049d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:26.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.896+0000 7f316f7fe700 1 --2- 192.168.123.100:0/2058718656 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3170108790 0x7f31701988d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:46:26.896 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.896+0000 7f316d7fa700 1 -- 192.168.123.100:0/2058718656 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f316401d070 con 0x7f3170102790 2026-03-10T12:46:26.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.897+0000 7f31769a9700 1 -- 192.168.123.100:0/2058718656 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f317019cfc0 con 0x7f3170102790 2026-03-10T12:46:26.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.897+0000 7f31769a9700 1 -- 192.168.123.100:0/2058718656 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f317019d4b0 con 0x7f3170102790 2026-03-10T12:46:26.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.897+0000 7f316d7fa700 1 -- 192.168.123.100:0/2058718656 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f316400bc50 con 0x7f3170102790 
2026-03-10T12:46:26.897 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.897+0000 7f316d7fa700 1 -- 192.168.123.100:0/2058718656 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3164017610 con 0x7f3170102790 2026-03-10T12:46:26.898 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.898+0000 7f316d7fa700 1 -- 192.168.123.100:0/2058718656 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f316400f460 con 0x7f3170102790 2026-03-10T12:46:26.898 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.898+0000 7f316d7fa700 1 --2- 192.168.123.100:0/2058718656 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f31580778c0 0x7f3158079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:26.899 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.899+0000 7f316d7fa700 1 -- 192.168.123.100:0/2058718656 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f316409b250 con 0x7f3170102790 2026-03-10T12:46:26.899 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.899+0000 7f31769a9700 1 -- 192.168.123.100:0/2058718656 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f317004ea50 con 0x7f3170102790 2026-03-10T12:46:26.899 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.899+0000 7f316f7fe700 1 --2- 192.168.123.100:0/2058718656 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f31580778c0 0x7f3158079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:26.899 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.899+0000 7f316f7fe700 1 --2- 192.168.123.100:0/2058718656 >> 
[v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f31580778c0 0x7f3158079d70 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f31701999b0 tx=0x7f3160009450 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:26.903 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:26.903+0000 7f316d7fa700 1 -- 192.168.123.100:0/2058718656 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f31640639d0 con 0x7f3170102790 2026-03-10T12:46:27.049 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.048+0000 7f31769a9700 1 -- 192.168.123.100:0/2058718656 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 24, "format": "json"} v 0) v1 -- 0x7f3170199720 con 0x7f3170102790 2026-03-10T12:46:27.050 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.050+0000 7f316d7fa700 1 -- 192.168.123.100:0/2058718656 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 24, "format": "json"}]=0 dumped fsmap epoch 24 v38) v1 ==== 107+0+5053 (secure 0 0 0) 0x7f3164027070 con 0x7f3170102790 2026-03-10T12:46:27.050 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:27.050 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":24,"btime":"2026-03-10T12:43:34:155542+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":34368,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6829/23281310","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":23281310},{"type":"v1","addr":"192.168.123.100:6829","nonce":23281310}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":24},{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22}],"filesystems":[{"mdsmap":{"epoch":19,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:15.758277+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24313},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24313":{"gid":24313,"name":"cephfs.vm07.wznhgu","rank":0,"incarnation":4,"state":"up:active","state_seq":3,"addr":"192.168.123.107:6825/1465224692","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":1465224692},{"type":"v1","addr":"192.168.123.107:6825","nonce":1465224692}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24313,"qdb_cluster":[24313]},"id":1}]} 2026-03-10T12:46:27.052 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.053+0000 7f31769a9700 1 -- 192.168.123.100:0/2058718656 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f31580778c0 msgr2=0x7f3158079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:27.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.053+0000 7f31769a9700 1 --2- 192.168.123.100:0/2058718656 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f31580778c0 0x7f3158079d70 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f31701999b0 tx=0x7f3160009450 comp rx=0 tx=0).stop 2026-03-10T12:46:27.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.053+0000 7f31769a9700 1 -- 192.168.123.100:0/2058718656 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3170102790 msgr2=0x7f3170198390 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:27.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.053+0000 7f31769a9700 1 --2- 192.168.123.100:0/2058718656 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3170102790 0x7f3170198390 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f31640048f0 tx=0x7f31640049d0 comp rx=0 tx=0).stop 2026-03-10T12:46:27.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.053+0000 7f31769a9700 1 -- 192.168.123.100:0/2058718656 shutdown_connections 2026-03-10T12:46:27.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.053+0000 7f31769a9700 1 --2- 192.168.123.100:0/2058718656 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f31580778c0 0x7f3158079d70 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:27.053 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.053+0000 7f31769a9700 1 --2- 192.168.123.100:0/2058718656 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f3170102790 0x7f3170198390 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:27.054 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.054+0000 7f31769a9700 1 --2- 192.168.123.100:0/2058718656 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f3170108790 0x7f31701988d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:27.054 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.054+0000 7f31769a9700 1 -- 192.168.123.100:0/2058718656 >> 192.168.123.100:0/2058718656 conn(0x7f31700fe2b0 msgr2=0x7f31700ffaf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:27.054 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.054+0000 7f31769a9700 1 -- 192.168.123.100:0/2058718656 shutdown_connections 2026-03-10T12:46:27.054 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.054+0000 7f31769a9700 1 -- 192.168.123.100:0/2058718656 wait complete. 2026-03-10T12:46:27.055 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 24 2026-03-10T12:46:27.099 DEBUG:tasks.fs:max_mds reduced in epoch 24 2026-03-10T12:46:27.099 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 25 2026-03-10T12:46:27.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:26 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/3999687345' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-10T12:46:27.245 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:27.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:26 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/3999687345' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-10T12:46:27.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.512+0000 7f0be7d2e700 1 -- 192.168.123.100:0/1027567867 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0be00ff450 msgr2=0x7f0be00ff820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:27.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.512+0000 7f0be7d2e700 1 --2- 192.168.123.100:0/1027567867 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0be00ff450 0x7f0be00ff820 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7f0bdc009b00 tx=0x7f0bdc009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:27.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.513+0000 7f0be7d2e700 1 -- 192.168.123.100:0/1027567867 shutdown_connections 2026-03-10T12:46:27.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.513+0000 7f0be7d2e700 1 --2- 192.168.123.100:0/1027567867 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0be00ffdf0 0x7f0be0103dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:27.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.513+0000 7f0be7d2e700 1 --2- 192.168.123.100:0/1027567867 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0be00ff450 0x7f0be00ff820 unknown :-1 s=CLOSED pgs=200 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:27.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.513+0000 7f0be7d2e700 1 -- 192.168.123.100:0/1027567867 >> 192.168.123.100:0/1027567867 conn(0x7f0be00facf0 msgr2=0x7f0be00fd100 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:27.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.513+0000 7f0be7d2e700 1 -- 192.168.123.100:0/1027567867 shutdown_connections 
2026-03-10T12:46:27.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.513+0000 7f0be7d2e700 1 -- 192.168.123.100:0/1027567867 wait complete. 2026-03-10T12:46:27.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.514+0000 7f0be7d2e700 1 Processor -- start 2026-03-10T12:46:27.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.514+0000 7f0be7d2e700 1 -- start start 2026-03-10T12:46:27.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.514+0000 7f0be7d2e700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0be00ffdf0 0x7f0be0193e30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:27.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.514+0000 7f0be7d2e700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0be0198e30 0x7f0be0194370 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:27.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.514+0000 7f0be7d2e700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0be01949e0 con 0x7f0be00ffdf0 2026-03-10T12:46:27.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.514+0000 7f0be7d2e700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0be0194b50 con 0x7f0be0198e30 2026-03-10T12:46:27.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.514+0000 7f0be5aca700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0be00ffdf0 0x7f0be0193e30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:27.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.514+0000 7f0be5aca700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0be00ffdf0 
0x7f0be0193e30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:43184/0 (socket says 192.168.123.100:43184) 2026-03-10T12:46:27.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.514+0000 7f0be5aca700 1 -- 192.168.123.100:0/2002659066 learned_addr learned my addr 192.168.123.100:0/2002659066 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:27.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.515+0000 7f0be52c9700 1 --2- 192.168.123.100:0/2002659066 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0be0198e30 0x7f0be0194370 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:27.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.515+0000 7f0be5aca700 1 -- 192.168.123.100:0/2002659066 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0be0198e30 msgr2=0x7f0be0194370 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:27.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.515+0000 7f0be5aca700 1 --2- 192.168.123.100:0/2002659066 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0be0198e30 0x7f0be0194370 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:27.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.515+0000 7f0be5aca700 1 -- 192.168.123.100:0/2002659066 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0bdc0097e0 con 0x7f0be00ffdf0 2026-03-10T12:46:27.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.515+0000 7f0be5aca700 1 --2- 192.168.123.100:0/2002659066 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0be00ffdf0 0x7f0be0193e30 secure :-1 s=READY 
pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7f0bdc009fd0 tx=0x7f0bdc0049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:27.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.516+0000 7f0bd6ffd700 1 -- 192.168.123.100:0/2002659066 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0bdc01d070 con 0x7f0be00ffdf0 2026-03-10T12:46:27.516 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.516+0000 7f0be7d2e700 1 -- 192.168.123.100:0/2002659066 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0be006a770 con 0x7f0be00ffdf0 2026-03-10T12:46:27.516 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.516+0000 7f0be7d2e700 1 -- 192.168.123.100:0/2002659066 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0be006ac60 con 0x7f0be00ffdf0 2026-03-10T12:46:27.516 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.516+0000 7f0bd6ffd700 1 -- 192.168.123.100:0/2002659066 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0bdc004b80 con 0x7f0be00ffdf0 2026-03-10T12:46:27.516 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.516+0000 7f0bd6ffd700 1 -- 192.168.123.100:0/2002659066 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0bdc00f670 con 0x7f0be00ffdf0 2026-03-10T12:46:27.519 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.518+0000 7f0be7d2e700 1 -- 192.168.123.100:0/2002659066 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0bc4005320 con 0x7f0be00ffdf0 2026-03-10T12:46:27.520 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.518+0000 7f0bd6ffd700 1 -- 192.168.123.100:0/2002659066 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 
100115+0+0 (secure 0 0 0) 0x7f0bdc00bc50 con 0x7f0be00ffdf0 2026-03-10T12:46:27.520 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.518+0000 7f0bd6ffd700 1 --2- 192.168.123.100:0/2002659066 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0bcc0779e0 0x7f0bcc079e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:27.520 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.519+0000 7f0bd6ffd700 1 -- 192.168.123.100:0/2002659066 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f0bdc09b000 con 0x7f0be00ffdf0 2026-03-10T12:46:27.524 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.524+0000 7f0be52c9700 1 --2- 192.168.123.100:0/2002659066 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0bcc0779e0 0x7f0bcc079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:27.524 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.524+0000 7f0bd6ffd700 1 -- 192.168.123.100:0/2002659066 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0bdc0d19f0 con 0x7f0be00ffdf0 2026-03-10T12:46:27.529 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.529+0000 7f0be52c9700 1 --2- 192.168.123.100:0/2002659066 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0bcc0779e0 0x7f0bcc079e90 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f0bd0006fd0 tx=0x7f0bd0008040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:27.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.666+0000 7f0be7d2e700 1 -- 192.168.123.100:0/2002659066 --> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 25, "format": "json"} v 0) v1 -- 0x7f0bc4005190 con 0x7f0be00ffdf0 2026-03-10T12:46:27.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.667+0000 7f0bd6ffd700 1 -- 192.168.123.100:0/2002659066 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 25, "format": "json"}]=0 dumped fsmap epoch 25 v38) v1 ==== 107+0+4252 (secure 0 0 0) 0x7f0bdc0638b0 con 0x7f0be00ffdf0 2026-03-10T12:46:27.667 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:27.667 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":25,"btime":"2026-03-10T12:43:37:512187+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11},{"gid":34368,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6829/23281310","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":23281310},{"type":"v1","addr":"192.168.123.100:6829","nonce":23281310}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24},{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":22}],"filesystems":[{"mdsmap":{"epoch":25,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:37.512186+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":82,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[1],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T12:46:27.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.670+0000 7f0be7d2e700 1 -- 192.168.123.100:0/2002659066 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0bcc0779e0 msgr2=0x7f0bcc079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:27.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.670+0000 7f0be7d2e700 1 --2- 192.168.123.100:0/2002659066 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0bcc0779e0 0x7f0bcc079e90 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7f0bd0006fd0 tx=0x7f0bd0008040 comp rx=0 tx=0).stop 2026-03-10T12:46:27.670 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.670+0000 7f0be7d2e700 1 -- 192.168.123.100:0/2002659066 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0be00ffdf0 msgr2=0x7f0be0193e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:27.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.670+0000 7f0be7d2e700 1 --2- 192.168.123.100:0/2002659066 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0be00ffdf0 0x7f0be0193e30 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7f0bdc009fd0 tx=0x7f0bdc0049e0 comp rx=0 tx=0).stop 2026-03-10T12:46:27.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.670+0000 7f0be7d2e700 1 -- 192.168.123.100:0/2002659066 shutdown_connections 2026-03-10T12:46:27.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.670+0000 7f0be7d2e700 1 --2- 192.168.123.100:0/2002659066 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0bcc0779e0 0x7f0bcc079e90 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:27.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.670+0000 7f0be7d2e700 1 --2- 192.168.123.100:0/2002659066 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0be00ffdf0 0x7f0be0193e30 unknown :-1 s=CLOSED pgs=201 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:27.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.670+0000 7f0be7d2e700 1 --2- 192.168.123.100:0/2002659066 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0be0198e30 0x7f0be0194370 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:27.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.670+0000 7f0be7d2e700 1 -- 192.168.123.100:0/2002659066 >> 192.168.123.100:0/2002659066 conn(0x7f0be00facf0 msgr2=0x7f0be00fc410 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T12:46:27.670 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.670+0000 7f0be7d2e700 1 -- 192.168.123.100:0/2002659066 shutdown_connections 2026-03-10T12:46:27.671 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:27.671+0000 7f0be7d2e700 1 -- 192.168.123.100:0/2002659066 wait complete. 2026-03-10T12:46:27.671 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 25 2026-03-10T12:46:27.739 DEBUG:tasks.fs:max_mds reduced in epoch 25 2026-03-10T12:46:27.739 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 26 2026-03-10T12:46:27.886 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:28.141 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.140+0000 7f5a19263700 1 -- 192.168.123.100:0/3271339572 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a14073980 msgr2=0x7f5a1410c8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:28.141 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.140+0000 7f5a19263700 1 --2- 192.168.123.100:0/3271339572 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a14073980 0x7f5a1410c8a0 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7f5a04009b30 tx=0x7f5a04009e40 comp rx=0 tx=0).stop 2026-03-10T12:46:28.141 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.141+0000 7f5a19263700 1 -- 192.168.123.100:0/3271339572 shutdown_connections 2026-03-10T12:46:28.141 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.141+0000 7f5a19263700 1 --2- 192.168.123.100:0/3271339572 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a14073980 0x7f5a1410c8a0 unknown :-1 s=CLOSED pgs=202 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:28.141 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.141+0000 7f5a19263700 1 --2- 192.168.123.100:0/3271339572 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5a14073070 0x7f5a14073440 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:28.141 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.141+0000 7f5a19263700 1 -- 192.168.123.100:0/3271339572 >> 192.168.123.100:0/3271339572 conn(0x7f5a140fbfc0 msgr2=0x7f5a140fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:28.142 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.141+0000 7f5a19263700 1 -- 192.168.123.100:0/3271339572 shutdown_connections 2026-03-10T12:46:28.142 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.141+0000 7f5a19263700 1 -- 192.168.123.100:0/3271339572 wait complete. 2026-03-10T12:46:28.142 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.142+0000 7f5a19263700 1 Processor -- start 2026-03-10T12:46:28.142 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.142+0000 7f5a19263700 1 -- start start 2026-03-10T12:46:28.142 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.142+0000 7f5a19263700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a14073070 0x7f5a14198450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:28.142 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.142+0000 7f5a19263700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5a14073980 0x7f5a14198990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:28.142 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.142+0000 7f5a19263700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a14199070 con 0x7f5a14073070 2026-03-10T12:46:28.142 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.142+0000 7f5a19263700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a1419cdb0 con 0x7f5a14073980 2026-03-10T12:46:28.142 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.142+0000 7f5a12ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a14073070 0x7f5a14198450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:28.142 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.142+0000 7f5a12ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a14073070 0x7f5a14198450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60808/0 (socket says 192.168.123.100:60808) 2026-03-10T12:46:28.143 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.143+0000 7f5a12ffd700 1 -- 192.168.123.100:0/1974154763 learned_addr learned my addr 192.168.123.100:0/1974154763 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:28.143 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.143+0000 7f5a127fc700 1 --2- 192.168.123.100:0/1974154763 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5a14073980 0x7f5a14198990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:28.143 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.143+0000 7f5a12ffd700 1 -- 192.168.123.100:0/1974154763 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5a14073980 msgr2=0x7f5a14198990 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:28.143 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.143+0000 7f5a12ffd700 1 --2- 
192.168.123.100:0/1974154763 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5a14073980 0x7f5a14198990 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:28.143 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.143+0000 7f5a12ffd700 1 -- 192.168.123.100:0/1974154763 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5a040097e0 con 0x7f5a14073070 2026-03-10T12:46:28.143 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.143+0000 7f5a127fc700 1 --2- 192.168.123.100:0/1974154763 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5a14073980 0x7f5a14198990 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T12:46:28.143 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.143+0000 7f5a12ffd700 1 --2- 192.168.123.100:0/1974154763 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a14073070 0x7f5a14198450 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7f59fc00ba70 tx=0x7f59fc00be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:28.144 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.143+0000 7f5a0bfff700 1 -- 192.168.123.100:0/1974154763 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f59fc00c780 con 0x7f5a14073070 2026-03-10T12:46:28.144 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.143+0000 7f5a0bfff700 1 -- 192.168.123.100:0/1974154763 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f59fc00cdc0 con 0x7f5a14073070 2026-03-10T12:46:28.144 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.143+0000 7f5a0bfff700 1 -- 192.168.123.100:0/1974154763 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 
327+0+0 (secure 0 0 0) 0x7f59fc012550 con 0x7f5a14073070 2026-03-10T12:46:28.144 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.143+0000 7f5a19263700 1 -- 192.168.123.100:0/1974154763 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5a1419d090 con 0x7f5a14073070 2026-03-10T12:46:28.144 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.143+0000 7f5a19263700 1 -- 192.168.123.100:0/1974154763 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5a1419d5e0 con 0x7f5a14073070 2026-03-10T12:46:28.148 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.145+0000 7f5a0bfff700 1 -- 192.168.123.100:0/1974154763 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f59fc0126b0 con 0x7f5a14073070 2026-03-10T12:46:28.148 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.145+0000 7f5a19263700 1 -- 192.168.123.100:0/1974154763 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5a14109fa0 con 0x7f5a14073070 2026-03-10T12:46:28.148 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.145+0000 7f5a0bfff700 1 --2- 192.168.123.100:0/1974154763 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f5a00077870 0x7f5a00079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:28.148 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.145+0000 7f5a0bfff700 1 -- 192.168.123.100:0/1974154763 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f59fc098be0 con 0x7f5a14073070 2026-03-10T12:46:28.148 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.145+0000 7f5a127fc700 1 --2- 192.168.123.100:0/1974154763 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] 
conn(0x7f5a00077870 0x7f5a00079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:28.148 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.147+0000 7f5a127fc700 1 --2- 192.168.123.100:0/1974154763 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f5a00077870 0x7f5a00079d20 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f5a04005b40 tx=0x7f5a04005ab0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:28.148 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.148+0000 7f5a0bfff700 1 -- 192.168.123.100:0/1974154763 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f59fc061550 con 0x7f5a14073070 2026-03-10T12:46:28.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:27 vm00.local ceph-mon[103263]: pgmap v265: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:28.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:27 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/2058718656' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-10T12:46:28.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:27 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/2002659066' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-10T12:46:28.288 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.288+0000 7f5a19263700 1 -- 192.168.123.100:0/1974154763 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 26, "format": "json"} v 0) v1 -- 0x7f5a1404ea50 con 0x7f5a14073070 2026-03-10T12:46:28.288 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.288+0000 7f5a0bfff700 1 -- 192.168.123.100:0/1974154763 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 26, "format": "json"}]=0 dumped fsmap epoch 26 v38) v1 ==== 107+0+4263 (secure 0 0 0) 0x7f59fc060ca0 con 0x7f5a14073070 2026-03-10T12:46:28.289 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:28.289 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":26,"btime":"2026-03-10T12:43:37:524216+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34368,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6829/23281310","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":23281310},{"type":"v1","addr":"192.168.123.100:6829","nonce":23281310}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24},{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22}],"filesystems":[{"mdsmap":{"epoch":26,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:37.524211+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":82,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24325},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24325":{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":0,"incarnation":26,"state":"up:replay","state_seq":2,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T12:46:28.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.291+0000 7f5a19263700 1 -- 192.168.123.100:0/1974154763 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f5a00077870 msgr2=0x7f5a00079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:28.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.291+0000 7f5a19263700 1 --2- 192.168.123.100:0/1974154763 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f5a00077870 0x7f5a00079d20 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f5a04005b40 tx=0x7f5a04005ab0 comp rx=0 tx=0).stop 2026-03-10T12:46:28.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.291+0000 7f5a19263700 
1 -- 192.168.123.100:0/1974154763 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a14073070 msgr2=0x7f5a14198450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:28.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.291+0000 7f5a19263700 1 --2- 192.168.123.100:0/1974154763 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a14073070 0x7f5a14198450 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7f59fc00ba70 tx=0x7f59fc00be30 comp rx=0 tx=0).stop 2026-03-10T12:46:28.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.291+0000 7f5a19263700 1 -- 192.168.123.100:0/1974154763 shutdown_connections 2026-03-10T12:46:28.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.291+0000 7f5a19263700 1 --2- 192.168.123.100:0/1974154763 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f5a00077870 0x7f5a00079d20 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:28.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.291+0000 7f5a19263700 1 --2- 192.168.123.100:0/1974154763 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f5a14073070 0x7f5a14198450 unknown :-1 s=CLOSED pgs=203 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:28.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.291+0000 7f5a19263700 1 --2- 192.168.123.100:0/1974154763 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f5a14073980 0x7f5a14198990 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:28.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.291+0000 7f5a19263700 1 -- 192.168.123.100:0/1974154763 >> 192.168.123.100:0/1974154763 conn(0x7f5a140fbfc0 msgr2=0x7f5a141070e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:28.291 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.291+0000 7f5a19263700 1 -- 192.168.123.100:0/1974154763 shutdown_connections 2026-03-10T12:46:28.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.291+0000 7f5a19263700 1 -- 192.168.123.100:0/1974154763 wait complete. 2026-03-10T12:46:28.292 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 26 2026-03-10T12:46:28.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:27 vm07.local ceph-mon[93622]: pgmap v265: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:28.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:27 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/2058718656' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-10T12:46:28.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:27 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/2002659066' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-10T12:46:28.350 DEBUG:tasks.fs:max_mds reduced in epoch 26 2026-03-10T12:46:28.350 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 27 2026-03-10T12:46:28.491 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:28.735 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.734+0000 7f099abcc700 1 -- 192.168.123.100:0/2117211778 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0994102780 msgr2=0x7f0994102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:28.735 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.734+0000 7f099abcc700 1 --2- 192.168.123.100:0/2117211778 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0994102780 0x7f0994102bf0 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7f0988009b00 tx=0x7f0988009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:28.735 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.735+0000 7f099abcc700 1 -- 192.168.123.100:0/2117211778 shutdown_connections 2026-03-10T12:46:28.735 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.735+0000 7f099abcc700 1 --2- 192.168.123.100:0/2117211778 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0994102780 0x7f0994102bf0 unknown :-1 s=CLOSED pgs=204 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:28.735 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.735+0000 7f099abcc700 1 --2- 192.168.123.100:0/2117211778 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0994108780 0x7f0994108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:28.735 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.735+0000 7f099abcc700 1 -- 192.168.123.100:0/2117211778 >> 192.168.123.100:0/2117211778 conn(0x7f09940fe280 msgr2=0x7f0994100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:28.735 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.735+0000 7f099abcc700 1 -- 192.168.123.100:0/2117211778 shutdown_connections 2026-03-10T12:46:28.735 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.735+0000 7f099abcc700 1 -- 192.168.123.100:0/2117211778 wait complete. 
2026-03-10T12:46:28.735 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.736+0000 7f099abcc700 1 Processor -- start 2026-03-10T12:46:28.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.736+0000 7f099abcc700 1 -- start start 2026-03-10T12:46:28.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.736+0000 7f099abcc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0994102780 0x7f0994198440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:28.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.736+0000 7f099abcc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0994108780 0x7f0994198980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:28.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.736+0000 7f099abcc700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0994199060 con 0x7f0994102780 2026-03-10T12:46:28.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.736+0000 7f099abcc700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f099419cdf0 con 0x7f0994108780 2026-03-10T12:46:28.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.736+0000 7f0998968700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0994102780 0x7f0994198440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:28.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.736+0000 7f0998968700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0994102780 0x7f0994198440 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:60820/0 (socket says 192.168.123.100:60820) 2026-03-10T12:46:28.736 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.736+0000 7f0998968700 1 -- 192.168.123.100:0/4114648342 learned_addr learned my addr 192.168.123.100:0/4114648342 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:28.737 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.736+0000 7f0993fff700 1 --2- 192.168.123.100:0/4114648342 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0994108780 0x7f0994198980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:28.737 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.736+0000 7f0998968700 1 -- 192.168.123.100:0/4114648342 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0994108780 msgr2=0x7f0994198980 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:28.737 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.736+0000 7f0998968700 1 --2- 192.168.123.100:0/4114648342 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0994108780 0x7f0994198980 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:28.737 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.736+0000 7f0998968700 1 -- 192.168.123.100:0/4114648342 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f09880097e0 con 0x7f0994102780 2026-03-10T12:46:28.737 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.737+0000 7f0998968700 1 --2- 192.168.123.100:0/4114648342 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0994102780 0x7f0994198440 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7f098400d900 tx=0x7f098400dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T12:46:28.737 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.737+0000 7f0991ffb700 1 -- 192.168.123.100:0/4114648342 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f09840041d0 con 0x7f0994102780 2026-03-10T12:46:28.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.737+0000 7f099abcc700 1 -- 192.168.123.100:0/4114648342 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f099419d0d0 con 0x7f0994102780 2026-03-10T12:46:28.738 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.737+0000 7f099abcc700 1 -- 192.168.123.100:0/4114648342 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f099419d620 con 0x7f0994102780 2026-03-10T12:46:28.739 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.738+0000 7f099abcc700 1 -- 192.168.123.100:0/4114648342 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f099410ac80 con 0x7f0994102780 2026-03-10T12:46:28.742 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.740+0000 7f0991ffb700 1 -- 192.168.123.100:0/4114648342 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0984004330 con 0x7f0994102780 2026-03-10T12:46:28.742 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.740+0000 7f0991ffb700 1 -- 192.168.123.100:0/4114648342 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0984003da0 con 0x7f0994102780 2026-03-10T12:46:28.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.743+0000 7f0991ffb700 1 -- 192.168.123.100:0/4114648342 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0984009730 con 0x7f0994102780 2026-03-10T12:46:28.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.743+0000 
7f0991ffb700 1 --2- 192.168.123.100:0/4114648342 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f097c077780 0x7f097c079c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:28.743 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.743+0000 7f0993fff700 1 --2- 192.168.123.100:0/4114648342 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f097c077780 0x7f097c079c30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:28.744 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.744+0000 7f0993fff700 1 --2- 192.168.123.100:0/4114648342 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f097c077780 0x7f097c079c30 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f0994199a60 tx=0x7f098800b540 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:28.744 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.744+0000 7f0991ffb700 1 -- 192.168.123.100:0/4114648342 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f0984021030 con 0x7f0994102780 2026-03-10T12:46:28.744 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.744+0000 7f0991ffb700 1 -- 192.168.123.100:0/4114648342 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f098409a2f0 con 0x7f0994102780 2026-03-10T12:46:28.888 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.887+0000 7f099abcc700 1 -- 192.168.123.100:0/4114648342 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 27, "format": "json"} v 0) v1 -- 0x7f099404ea50 con 0x7f0994102780 2026-03-10T12:46:28.890 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.890+0000 7f0991ffb700 1 -- 192.168.123.100:0/4114648342 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 27, "format": "json"}]=0 dumped fsmap epoch 27 v38) v1 ==== 107+0+4268 (secure 0 0 0) 0x7f0984062830 con 0x7f0994102780 2026-03-10T12:46:28.890 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:28.890 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":27,"btime":"2026-03-10T12:43:41:833707+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34368,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6829/23281310","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":23281310},{"type":"v1","addr":"192.168.123.100:6829","nonce":23281310}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":24},{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22}],"filesystems":[{"mdsmap":{"epoch":27,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:40.905860+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":82,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24325},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24325":{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":0,"incarnation":26,"state":"up:reconnect","state_seq":128,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T12:46:28.893 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.893+0000 7f099abcc700 1 -- 192.168.123.100:0/4114648342 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f097c077780 msgr2=0x7f097c079c30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:28.893 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.893+0000 7f099abcc700 1 --2- 192.168.123.100:0/4114648342 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f097c077780 0x7f097c079c30 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f0994199a60 tx=0x7f098800b540 comp rx=0 tx=0).stop 2026-03-10T12:46:28.893 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.893+0000 7f099abcc700 1 -- 192.168.123.100:0/4114648342 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0994102780 msgr2=0x7f0994198440 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:28.893 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.893+0000 7f099abcc700 1 --2- 192.168.123.100:0/4114648342 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0994102780 0x7f0994198440 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7f098400d900 tx=0x7f098400dcc0 comp rx=0 tx=0).stop 2026-03-10T12:46:28.893 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.893+0000 7f099abcc700 1 -- 192.168.123.100:0/4114648342 shutdown_connections 2026-03-10T12:46:28.893 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.893+0000 7f099abcc700 1 --2- 192.168.123.100:0/4114648342 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f097c077780 0x7f097c079c30 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:28.893 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.893+0000 7f099abcc700 1 --2- 192.168.123.100:0/4114648342 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0994102780 0x7f0994198440 unknown :-1 s=CLOSED pgs=205 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:28.893 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.893+0000 7f099abcc700 1 --2- 192.168.123.100:0/4114648342 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0994108780 0x7f0994198980 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:28.893 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.893+0000 7f099abcc700 1 -- 192.168.123.100:0/4114648342 >> 192.168.123.100:0/4114648342 conn(0x7f09940fe280 msgr2=0x7f09940ffc00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:28.893 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.894+0000 7f099abcc700 1 -- 192.168.123.100:0/4114648342 shutdown_connections 2026-03-10T12:46:28.893 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:28.894+0000 7f099abcc700 1 -- 192.168.123.100:0/4114648342 wait complete. 2026-03-10T12:46:28.894 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 27 2026-03-10T12:46:28.954 DEBUG:tasks.fs:max_mds reduced in epoch 27 2026-03-10T12:46:28.954 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 28 2026-03-10T12:46:29.097 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:29.144 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:28 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/1974154763' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-10T12:46:29.144 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:28 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/4114648342' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-10T12:46:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:28 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/1974154763' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-10T12:46:29.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:28 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/4114648342' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-10T12:46:29.341 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.340+0000 7feb59097700 1 -- 192.168.123.100:0/2124172614 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feb54102810 msgr2=0x7feb54102c80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:29.341 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.340+0000 7feb59097700 1 --2- 192.168.123.100:0/2124172614 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feb54102810 0x7feb54102c80 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7feb44009b00 tx=0x7feb44009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:29.341 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.340+0000 7feb59097700 1 -- 192.168.123.100:0/2124172614 shutdown_connections 2026-03-10T12:46:29.341 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.340+0000 7feb59097700 1 --2- 192.168.123.100:0/2124172614 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feb54102810 0x7feb54102c80 unknown :-1 s=CLOSED pgs=206 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:29.341 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.340+0000 7feb59097700 1 --2- 192.168.123.100:0/2124172614 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feb54108810 0x7feb54108be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:29.341 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.340+0000 7feb59097700 1 -- 192.168.123.100:0/2124172614 >> 192.168.123.100:0/2124172614 conn(0x7feb540fe330 msgr2=0x7feb54100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:29.341 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.341+0000 7feb59097700 1 -- 192.168.123.100:0/2124172614 shutdown_connections 
2026-03-10T12:46:29.341 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.341+0000 7feb59097700 1 -- 192.168.123.100:0/2124172614 wait complete. 2026-03-10T12:46:29.341 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.341+0000 7feb59097700 1 Processor -- start 2026-03-10T12:46:29.341 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.341+0000 7feb59097700 1 -- start start 2026-03-10T12:46:29.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.342+0000 7feb59097700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feb54102810 0x7feb54198470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:29.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.342+0000 7feb52d9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feb54102810 0x7feb54198470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:29.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.342+0000 7feb52d9d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feb54102810 0x7feb54198470 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60834/0 (socket says 192.168.123.100:60834) 2026-03-10T12:46:29.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.342+0000 7feb59097700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feb54108810 0x7feb541989b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:29.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.342+0000 7feb59097700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feb54199090 con 0x7feb54102810 
2026-03-10T12:46:29.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.342+0000 7feb52d9d700 1 -- 192.168.123.100:0/2335769452 learned_addr learned my addr 192.168.123.100:0/2335769452 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:29.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.342+0000 7feb59097700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feb5419ce20 con 0x7feb54108810 2026-03-10T12:46:29.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.342+0000 7feb5259c700 1 --2- 192.168.123.100:0/2335769452 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feb54108810 0x7feb541989b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:29.342 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.342+0000 7feb52d9d700 1 -- 192.168.123.100:0/2335769452 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feb54108810 msgr2=0x7feb541989b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:29.343 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.342+0000 7feb52d9d700 1 --2- 192.168.123.100:0/2335769452 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feb54108810 0x7feb541989b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:29.343 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.343+0000 7feb52d9d700 1 -- 192.168.123.100:0/2335769452 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feb440097e0 con 0x7feb54102810 2026-03-10T12:46:29.344 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.343+0000 7feb52d9d700 1 --2- 192.168.123.100:0/2335769452 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feb54102810 0x7feb54198470 secure 
:-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7feb3c00b700 tx=0x7feb3c00ba10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:29.344 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.343+0000 7feb4bfff700 1 -- 192.168.123.100:0/2335769452 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feb3c011840 con 0x7feb54102810 2026-03-10T12:46:29.344 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.343+0000 7feb4bfff700 1 -- 192.168.123.100:0/2335769452 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7feb3c011e80 con 0x7feb54102810 2026-03-10T12:46:29.344 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.343+0000 7feb4bfff700 1 -- 192.168.123.100:0/2335769452 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feb3c00f550 con 0x7feb54102810 2026-03-10T12:46:29.344 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.343+0000 7feb59097700 1 -- 192.168.123.100:0/2335769452 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feb5419d100 con 0x7feb54102810 2026-03-10T12:46:29.344 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.343+0000 7feb59097700 1 -- 192.168.123.100:0/2335769452 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feb5419d570 con 0x7feb54102810 2026-03-10T12:46:29.345 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.345+0000 7feb4bfff700 1 -- 192.168.123.100:0/2335769452 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7feb3c00f6b0 con 0x7feb54102810 2026-03-10T12:46:29.346 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.345+0000 7feb4bfff700 1 --2- 192.168.123.100:0/2335769452 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] 
conn(0x7feb40077910 0x7feb40079dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:29.346 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.345+0000 7feb4bfff700 1 -- 192.168.123.100:0/2335769452 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7feb3c099640 con 0x7feb54102810 2026-03-10T12:46:29.346 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.345+0000 7feb5259c700 1 --2- 192.168.123.100:0/2335769452 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feb40077910 0x7feb40079dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:29.346 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.345+0000 7feb59097700 1 -- 192.168.123.100:0/2335769452 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feb5404ea50 con 0x7feb54102810 2026-03-10T12:46:29.346 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.346+0000 7feb5259c700 1 --2- 192.168.123.100:0/2335769452 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feb40077910 0x7feb40079dc0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7feb54199a90 tx=0x7feb44005fb0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:29.349 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.349+0000 7feb4bfff700 1 -- 192.168.123.100:0/2335769452 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7feb3c061e60 con 0x7feb54102810 2026-03-10T12:46:29.491 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.491+0000 7feb59097700 1 -- 192.168.123.100:0/2335769452 --> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 28, "format": "json"} v 0) v1 -- 0x7feb54066e40 con 0x7feb54102810 2026-03-10T12:46:29.491 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.491+0000 7feb4bfff700 1 -- 192.168.123.100:0/2335769452 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 28, "format": "json"}]=0 dumped fsmap epoch 28 v38) v1 ==== 107+0+4265 (secure 0 0 0) 0x7feb3c015020 con 0x7feb54102810 2026-03-10T12:46:29.492 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:29.492 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":28,"btime":"2026-03-10T12:43:42:836950+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34368,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6829/23281310","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":23281310},{"type":"v1","addr":"192.168.123.100:6829","nonce":23281310}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":24},{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22}],"filesystems":[{"mdsmap":{"epoch":28,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:41.842729+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":82,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24325},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24325":{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":0,"incarnation":26,"state":"up:rejoin","state_seq":129,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T12:46:29.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.494+0000 7feb59097700 1 -- 192.168.123.100:0/2335769452 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feb40077910 msgr2=0x7feb40079dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:29.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.494+0000 7feb59097700 1 --2- 192.168.123.100:0/2335769452 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feb40077910 0x7feb40079dc0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7feb54199a90 tx=0x7feb44005fb0 comp rx=0 tx=0).stop 2026-03-10T12:46:29.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.494+0000 7feb59097700 1 -- 192.168.123.100:0/2335769452 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feb54102810 msgr2=0x7feb54198470 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:29.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.494+0000 7feb59097700 1 --2- 192.168.123.100:0/2335769452 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feb54102810 0x7feb54198470 secure :-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7feb3c00b700 tx=0x7feb3c00ba10 comp rx=0 tx=0).stop 2026-03-10T12:46:29.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.494+0000 7feb59097700 1 -- 192.168.123.100:0/2335769452 shutdown_connections 2026-03-10T12:46:29.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.495+0000 7feb59097700 1 --2- 192.168.123.100:0/2335769452 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feb40077910 0x7feb40079dc0 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:29.494 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.495+0000 7feb59097700 1 --2- 192.168.123.100:0/2335769452 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feb54102810 0x7feb54198470 unknown :-1 s=CLOSED pgs=207 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:29.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.495+0000 7feb59097700 1 --2- 192.168.123.100:0/2335769452 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feb54108810 0x7feb541989b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:29.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.495+0000 7feb59097700 1 -- 192.168.123.100:0/2335769452 >> 192.168.123.100:0/2335769452 conn(0x7feb540fe330 msgr2=0x7feb540ffaa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:29.495 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.495+0000 7feb59097700 1 -- 192.168.123.100:0/2335769452 shutdown_connections 2026-03-10T12:46:29.495 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.495+0000 7feb59097700 1 -- 192.168.123.100:0/2335769452 wait complete. 2026-03-10T12:46:29.496 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 28 2026-03-10T12:46:29.539 DEBUG:tasks.fs:max_mds reduced in epoch 28 2026-03-10T12:46:29.539 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 29 2026-03-10T12:46:29.686 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:29.919 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.919+0000 7f0f6b5a8700 1 -- 192.168.123.100:0/4044507085 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0f641066c0 msgr2=0x7f0f64106a90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:29.919 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.919+0000 7f0f6b5a8700 1 --2- 192.168.123.100:0/4044507085 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0f641066c0 0x7f0f64106a90 secure :-1 s=READY pgs=208 cs=0 l=1 rev1=1 crypto rx=0x7f0f54009b00 tx=0x7f0f54009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:29.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.919+0000 7f0f6b5a8700 1 -- 192.168.123.100:0/4044507085 shutdown_connections 2026-03-10T12:46:29.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.919+0000 7f0f6b5a8700 1 --2- 192.168.123.100:0/4044507085 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0f64068490 0x7f0f64068900 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:29.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.919+0000 7f0f6b5a8700 1 --2- 192.168.123.100:0/4044507085 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0f641066c0 
0x7f0f64106a90 unknown :-1 s=CLOSED pgs=208 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:29.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.919+0000 7f0f6b5a8700 1 -- 192.168.123.100:0/4044507085 >> 192.168.123.100:0/4044507085 conn(0x7f0f640754a0 msgr2=0x7f0f640758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:29.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.920+0000 7f0f6b5a8700 1 -- 192.168.123.100:0/4044507085 shutdown_connections 2026-03-10T12:46:29.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.920+0000 7f0f6b5a8700 1 -- 192.168.123.100:0/4044507085 wait complete. 2026-03-10T12:46:29.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.920+0000 7f0f6b5a8700 1 Processor -- start 2026-03-10T12:46:29.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.920+0000 7f0f6b5a8700 1 -- start start 2026-03-10T12:46:29.920 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.921+0000 7f0f6b5a8700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0f64068490 0x7f0f64194060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:29.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.921+0000 7f0f69344700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0f64068490 0x7f0f64194060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:29.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.921+0000 7f0f69344700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0f64068490 0x7f0f64194060 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60856/0 (socket says 192.168.123.100:60856) 2026-03-10T12:46:29.921 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.921+0000 7f0f6b5a8700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0f641066c0 0x7f0f641945a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:29.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.921+0000 7f0f6b5a8700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0f64194c80 con 0x7f0f64068490 2026-03-10T12:46:29.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.921+0000 7f0f6b5a8700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0f64198a10 con 0x7f0f641066c0 2026-03-10T12:46:29.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.921+0000 7f0f69344700 1 -- 192.168.123.100:0/3800747766 learned_addr learned my addr 192.168.123.100:0/3800747766 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:29.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.921+0000 7f0f68b43700 1 --2- 192.168.123.100:0/3800747766 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0f641066c0 0x7f0f641945a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:29.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.921+0000 7f0f69344700 1 -- 192.168.123.100:0/3800747766 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0f641066c0 msgr2=0x7f0f641945a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:29.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.921+0000 7f0f69344700 1 --2- 192.168.123.100:0/3800747766 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0f641066c0 0x7f0f641945a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:29.921 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.921+0000 7f0f69344700 1 -- 192.168.123.100:0/3800747766 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0f540097e0 con 0x7f0f64068490 2026-03-10T12:46:29.922 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.922+0000 7f0f69344700 1 --2- 192.168.123.100:0/3800747766 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0f64068490 0x7f0f64194060 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7f0f540094d0 tx=0x7f0f54004930 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:29.923 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.922+0000 7f0f5a7fc700 1 -- 192.168.123.100:0/3800747766 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0f5401d070 con 0x7f0f64068490 2026-03-10T12:46:29.923 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.922+0000 7f0f5a7fc700 1 -- 192.168.123.100:0/3800747766 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0f54022470 con 0x7f0f64068490 2026-03-10T12:46:29.923 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.922+0000 7f0f5a7fc700 1 -- 192.168.123.100:0/3800747766 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0f5400f650 con 0x7f0f64068490 2026-03-10T12:46:29.923 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.922+0000 7f0f6b5a8700 1 -- 192.168.123.100:0/3800747766 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0f64198c90 con 0x7f0f64068490 2026-03-10T12:46:29.923 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.922+0000 7f0f6b5a8700 1 -- 192.168.123.100:0/3800747766 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0f641990a0 con 
0x7f0f64068490 2026-03-10T12:46:29.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.923+0000 7f0f5a7fc700 1 -- 192.168.123.100:0/3800747766 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0f5400baa0 con 0x7f0f64068490 2026-03-10T12:46:29.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.923+0000 7f0f6b5a8700 1 -- 192.168.123.100:0/3800747766 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0f6404ea50 con 0x7f0f64068490 2026-03-10T12:46:29.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.924+0000 7f0f5a7fc700 1 --2- 192.168.123.100:0/3800747766 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0f50077880 0x7f0f50079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:29.926 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.924+0000 7f0f5a7fc700 1 -- 192.168.123.100:0/3800747766 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f0f5409b130 con 0x7f0f64068490 2026-03-10T12:46:29.927 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.924+0000 7f0f68b43700 1 --2- 192.168.123.100:0/3800747766 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0f50077880 0x7f0f50079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:29.927 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.924+0000 7f0f68b43700 1 --2- 192.168.123.100:0/3800747766 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0f50077880 0x7f0f50079d30 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f0f64195680 tx=0x7f0f60009380 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-10T12:46:29.927 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:29.927+0000 7f0f5a7fc700 1 -- 192.168.123.100:0/3800747766 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0f54063960 con 0x7f0f64068490 2026-03-10T12:46:30.061 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.060+0000 7f0f6b5a8700 1 -- 192.168.123.100:0/3800747766 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 29, "format": "json"} v 0) v1 -- 0x7f0f64066e40 con 0x7f0f64068490 2026-03-10T12:46:30.061 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:29 vm00.local ceph-mon[103263]: pgmap v266: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:30.061 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:29 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/2335769452' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-10T12:46:30.063 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.063+0000 7f0f5a7fc700 1 -- 192.168.123.100:0/3800747766 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 29, "format": "json"}]=0 dumped fsmap epoch 29 v38) v1 ==== 107+0+4274 (secure 0 0 0) 0x7f0f540630b0 con 0x7f0f64068490 2026-03-10T12:46:30.063 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:30.063 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":29,"btime":"2026-03-10T12:43:43:870543+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout 
v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34368,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6829/23281310","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":23281310},{"type":"v1","addr":"192.168.123.100:6829","nonce":23281310}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24},{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":22}],"filesystems":[{"mdsmap":{"epoch":29,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:43.870537+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":82,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24325},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24325":{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":0,"incarnation":26,"state":"up:active","state_seq":130,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24325,"qdb_cluster":[24325]},"id":1}]} 2026-03-10T12:46:30.065 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.065+0000 7f0f6b5a8700 1 -- 192.168.123.100:0/3800747766 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0f50077880 msgr2=0x7f0f50079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:30.065 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.065+0000 7f0f6b5a8700 1 --2- 192.168.123.100:0/3800747766 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0f50077880 0x7f0f50079d30 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f0f64195680 tx=0x7f0f60009380 comp rx=0 tx=0).stop 2026-03-10T12:46:30.065 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.065+0000 7f0f6b5a8700 1 -- 192.168.123.100:0/3800747766 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0f64068490 msgr2=0x7f0f64194060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:30.065 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.065+0000 7f0f6b5a8700 1 --2- 192.168.123.100:0/3800747766 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0f64068490 0x7f0f64194060 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7f0f540094d0 tx=0x7f0f54004930 comp rx=0 tx=0).stop 2026-03-10T12:46:30.065 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.065+0000 7f0f6b5a8700 1 -- 192.168.123.100:0/3800747766 shutdown_connections 2026-03-10T12:46:30.066 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.065+0000 7f0f6b5a8700 1 --2- 192.168.123.100:0/3800747766 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f0f50077880 0x7f0f50079d30 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T12:46:30.066 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.065+0000 7f0f6b5a8700 1 --2- 192.168.123.100:0/3800747766 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f0f64068490 0x7f0f64194060 unknown :-1 s=CLOSED pgs=209 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:30.066 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.065+0000 7f0f6b5a8700 1 --2- 192.168.123.100:0/3800747766 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f0f641066c0 0x7f0f641945a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:30.066 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.065+0000 7f0f6b5a8700 1 -- 192.168.123.100:0/3800747766 >> 192.168.123.100:0/3800747766 conn(0x7f0f640754a0 msgr2=0x7f0f640fec50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:30.066 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.066+0000 7f0f6b5a8700 1 -- 192.168.123.100:0/3800747766 shutdown_connections 2026-03-10T12:46:30.066 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.066+0000 7f0f6b5a8700 1 -- 192.168.123.100:0/3800747766 wait complete. 
2026-03-10T12:46:30.066 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 29 2026-03-10T12:46:30.103 DEBUG:tasks.fs:max_mds reduced in epoch 29 2026-03-10T12:46:30.103 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 30 2026-03-10T12:46:30.235 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:30.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:29 vm07.local ceph-mon[93622]: pgmap v266: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:30.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:29 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/2335769452' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-10T12:46:30.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.460+0000 7fa385c64700 1 -- 192.168.123.100:0/1102389272 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa380102780 msgr2=0x7fa380102bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:30.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.460+0000 7fa385c64700 1 --2- 192.168.123.100:0/1102389272 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa380102780 0x7fa380102bf0 secure :-1 s=READY pgs=210 cs=0 l=1 rev1=1 crypto rx=0x7fa370009b00 tx=0x7fa370009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:30.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.461+0000 7fa385c64700 1 -- 192.168.123.100:0/1102389272 shutdown_connections 2026-03-10T12:46:30.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.461+0000 7fa385c64700 1 --2- 192.168.123.100:0/1102389272 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa380102780 
0x7fa380102bf0 unknown :-1 s=CLOSED pgs=210 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:30.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.461+0000 7fa385c64700 1 --2- 192.168.123.100:0/1102389272 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa380108780 0x7fa380108b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:30.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.461+0000 7fa385c64700 1 -- 192.168.123.100:0/1102389272 >> 192.168.123.100:0/1102389272 conn(0x7fa3800fe280 msgr2=0x7fa380100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:30.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.461+0000 7fa385c64700 1 -- 192.168.123.100:0/1102389272 shutdown_connections 2026-03-10T12:46:30.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.461+0000 7fa385c64700 1 -- 192.168.123.100:0/1102389272 wait complete. 2026-03-10T12:46:30.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.461+0000 7fa385c64700 1 Processor -- start 2026-03-10T12:46:30.461 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.461+0000 7fa385c64700 1 -- start start 2026-03-10T12:46:30.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.462+0000 7fa385c64700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa380102780 0x7fa380075260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:30.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.462+0000 7fa385c64700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa380108780 0x7fa3800757a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:30.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.462+0000 7fa385c64700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7fa3800793f0 con 0x7fa380102780 2026-03-10T12:46:30.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.462+0000 7fa385c64700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa380075ce0 con 0x7fa380108780 2026-03-10T12:46:30.462 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.462+0000 7fa37f7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa380102780 0x7fa380075260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:30.463 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.462+0000 7fa37f7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa380102780 0x7fa380075260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60880/0 (socket says 192.168.123.100:60880) 2026-03-10T12:46:30.463 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.462+0000 7fa37f7fe700 1 -- 192.168.123.100:0/1069135262 learned_addr learned my addr 192.168.123.100:0/1069135262 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:30.463 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.462+0000 7fa37f7fe700 1 -- 192.168.123.100:0/1069135262 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa380108780 msgr2=0x7fa3800757a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:30.463 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.462+0000 7fa37effd700 1 --2- 192.168.123.100:0/1069135262 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa380108780 0x7fa3800757a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:30.463 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.463+0000 7fa37f7fe700 1 --2- 192.168.123.100:0/1069135262 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa380108780 0x7fa3800757a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:30.463 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.463+0000 7fa37f7fe700 1 -- 192.168.123.100:0/1069135262 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa3700097e0 con 0x7fa380102780 2026-03-10T12:46:30.463 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.463+0000 7fa37effd700 1 --2- 192.168.123.100:0/1069135262 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa380108780 0x7fa3800757a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T12:46:30.463 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.463+0000 7fa37f7fe700 1 --2- 192.168.123.100:0/1069135262 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa380102780 0x7fa380075260 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7fa36800b700 tx=0x7fa36800bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:30.464 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.463+0000 7fa37cff9700 1 -- 192.168.123.100:0/1069135262 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa368010820 con 0x7fa380102780 2026-03-10T12:46:30.464 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.463+0000 7fa385c64700 1 -- 192.168.123.100:0/1069135262 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa380075fc0 con 0x7fa380102780 2026-03-10T12:46:30.464 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.463+0000 7fa385c64700 1 -- 
192.168.123.100:0/1069135262 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa3801a6c30 con 0x7fa380102780 2026-03-10T12:46:30.464 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.464+0000 7fa385c64700 1 -- 192.168.123.100:0/1069135262 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa38004ea50 con 0x7fa380102780 2026-03-10T12:46:30.467 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.465+0000 7fa37cff9700 1 -- 192.168.123.100:0/1069135262 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa368010e60 con 0x7fa380102780 2026-03-10T12:46:30.467 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.465+0000 7fa37cff9700 1 -- 192.168.123.100:0/1069135262 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa368017570 con 0x7fa380102780 2026-03-10T12:46:30.467 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.465+0000 7fa37cff9700 1 -- 192.168.123.100:0/1069135262 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa368017750 con 0x7fa380102780 2026-03-10T12:46:30.467 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.465+0000 7fa37cff9700 1 --2- 192.168.123.100:0/1069135262 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fa36c077800 0x7fa36c079cb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:30.467 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.466+0000 7fa37effd700 1 --2- 192.168.123.100:0/1069135262 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fa36c077800 0x7fa36c079cb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T12:46:30.467 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.466+0000 7fa37effd700 1 --2- 192.168.123.100:0/1069135262 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fa36c077800 0x7fa36c079cb0 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7fa370009fd0 tx=0x7fa370005fb0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:30.467 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.466+0000 7fa37cff9700 1 -- 192.168.123.100:0/1069135262 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7fa36809a5b0 con 0x7fa380102780 2026-03-10T12:46:30.467 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.467+0000 7fa37cff9700 1 -- 192.168.123.100:0/1069135262 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa368062d30 con 0x7fa380102780 2026-03-10T12:46:30.605 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.604+0000 7fa385c64700 1 -- 192.168.123.100:0/1069135262 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 30, "format": "json"} v 0) v1 -- 0x7fa380066e40 con 0x7fa380102780 2026-03-10T12:46:30.605 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.604+0000 7fa37cff9700 1 -- 192.168.123.100:0/1069135262 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 30, "format": "json"}]=0 dumped fsmap epoch 30 v38) v1 ==== 107+0+5119 (secure 0 0 0) 0x7fa368062480 con 0x7fa380102780 2026-03-10T12:46:30.606 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:30.606 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":30,"btime":"2026-03-10T12:43:45:356913+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client 
writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34368,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6829/23281310","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":23281310},{"type":"v1","addr":"192.168.123.100:6829","nonce":23281310}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24},{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}},"epoch":22},{"gid":44301,"name":"cephfs.vm07.wznhgu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6825/48365433","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":48365433},{"type":"v1","addr":"192.168.123.107:6825","nonce":48365433}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":30}],"filesystems":[{"mdsmap":{"epoch":29,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:43.870537+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":82,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24325},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_24325":{"gid":24325,"name":"cephfs.vm07.rhzwnr","rank":0,"incarnation":26,"state":"up:active","state_seq":130,"addr":"192.168.123.107:6827/3705110268","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3705110268},{"type":"v1","addr":"192.168.123.107:6827","nonce":3705110268}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24325,"qdb_cluster":[24325]},"id":1}]} 2026-03-10T12:46:30.608 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.608+0000 7fa385c64700 1 -- 192.168.123.100:0/1069135262 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fa36c077800 msgr2=0x7fa36c079cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:30.608 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.608+0000 7fa385c64700 1 --2- 192.168.123.100:0/1069135262 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fa36c077800 0x7fa36c079cb0 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7fa370009fd0 tx=0x7fa370005fb0 comp rx=0 tx=0).stop 2026-03-10T12:46:30.608 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.608+0000 7fa385c64700 1 -- 192.168.123.100:0/1069135262 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa380102780 msgr2=0x7fa380075260 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:30.608 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.608+0000 7fa385c64700 1 --2- 192.168.123.100:0/1069135262 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa380102780 0x7fa380075260 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7fa36800b700 tx=0x7fa36800bac0 comp rx=0 tx=0).stop 2026-03-10T12:46:30.608 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.608+0000 7fa385c64700 1 -- 192.168.123.100:0/1069135262 shutdown_connections 2026-03-10T12:46:30.609 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.609+0000 7fa385c64700 1 --2- 192.168.123.100:0/1069135262 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7fa36c077800 0x7fa36c079cb0 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:30.609 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.609+0000 7fa385c64700 1 --2- 192.168.123.100:0/1069135262 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7fa380102780 0x7fa380075260 unknown :-1 s=CLOSED pgs=211 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:30.609 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.609+0000 7fa385c64700 1 --2- 192.168.123.100:0/1069135262 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7fa380108780 0x7fa3800757a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:30.609 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.609+0000 7fa385c64700 1 -- 192.168.123.100:0/1069135262 >> 192.168.123.100:0/1069135262 conn(0x7fa3800fe280 msgr2=0x7fa3800ffb20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:30.609 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.609+0000 7fa385c64700 1 -- 192.168.123.100:0/1069135262 shutdown_connections 2026-03-10T12:46:30.609 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:30.609+0000 7fa385c64700 1 -- 192.168.123.100:0/1069135262 wait complete. 2026-03-10T12:46:30.610 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 30 2026-03-10T12:46:30.668 DEBUG:tasks.fs:max_mds reduced in epoch 30 2026-03-10T12:46:30.668 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 31 2026-03-10T12:46:30.809 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:31.042 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.041+0000 7f52a63a0700 1 -- 192.168.123.100:0/3552915681 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52a0101ec0 msgr2=0x7f52a010a590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:31.042 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.041+0000 7f52a63a0700 1 --2- 192.168.123.100:0/3552915681 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52a0101ec0 0x7f52a010a590 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7f5290009b00 tx=0x7f5290009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:31.042 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.042+0000 7f52a63a0700 1 -- 192.168.123.100:0/3552915681 shutdown_connections 2026-03-10T12:46:31.042 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.042+0000 7f52a63a0700 1 --2- 192.168.123.100:0/3552915681 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52a0101ec0 0x7f52a010a590 unknown :-1 s=CLOSED pgs=212 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.042 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.042+0000 7f52a63a0700 1 --2- 192.168.123.100:0/3552915681 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f52a01015b0 
0x7f52a0101980 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.042 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.042+0000 7f52a63a0700 1 -- 192.168.123.100:0/3552915681 >> 192.168.123.100:0/3552915681 conn(0x7f52a00faf00 msgr2=0x7f52a00fd310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:31.042 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.042+0000 7f52a63a0700 1 -- 192.168.123.100:0/3552915681 shutdown_connections 2026-03-10T12:46:31.042 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.042+0000 7f52a63a0700 1 -- 192.168.123.100:0/3552915681 wait complete. 2026-03-10T12:46:31.043 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.043+0000 7f52a63a0700 1 Processor -- start 2026-03-10T12:46:31.043 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.043+0000 7f52a63a0700 1 -- start start 2026-03-10T12:46:31.043 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.043+0000 7f52a63a0700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f52a01015b0 0x7f52a0198450 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:31.043 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.043+0000 7f52a63a0700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52a0101ec0 0x7f52a0198990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:31.043 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.043+0000 7f52a63a0700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f52a0199070 con 0x7f52a0101ec0 2026-03-10T12:46:31.043 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.043+0000 7f52a63a0700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f52a019cdb0 con 0x7f52a01015b0 2026-03-10T12:46:31.043 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.043+0000 7f529ffff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f52a01015b0 0x7f52a0198450 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:31.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.043+0000 7f529ffff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f52a01015b0 0x7f52a0198450 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:60106/0 (socket says 192.168.123.100:60106) 2026-03-10T12:46:31.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.043+0000 7f529ffff700 1 -- 192.168.123.100:0/3831337904 learned_addr learned my addr 192.168.123.100:0/3831337904 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:31.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.043+0000 7f529ffff700 1 -- 192.168.123.100:0/3831337904 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52a0101ec0 msgr2=0x7f52a0198990 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T12:46:31.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.044+0000 7f529f7fe700 1 --2- 192.168.123.100:0/3831337904 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52a0101ec0 0x7f52a0198990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:31.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.044+0000 7f529ffff700 1 --2- 192.168.123.100:0/3831337904 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52a0101ec0 0x7f52a0198990 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.044 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.044+0000 7f529ffff700 1 -- 192.168.123.100:0/3831337904 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f52900097e0 con 0x7f52a01015b0 2026-03-10T12:46:31.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.044+0000 7f529ffff700 1 --2- 192.168.123.100:0/3831337904 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f52a01015b0 0x7f52a0198450 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f528800b700 tx=0x7f528800bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:31.044 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.044+0000 7f529f7fe700 1 --2- 192.168.123.100:0/3831337904 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52a0101ec0 0x7f52a0198990 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T12:46:31.045 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.044+0000 7f529d7fa700 1 -- 192.168.123.100:0/3831337904 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5288010840 con 0x7f52a01015b0 2026-03-10T12:46:31.045 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.044+0000 7f529d7fa700 1 -- 192.168.123.100:0/3831337904 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5288010e80 con 0x7f52a01015b0 2026-03-10T12:46:31.045 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.044+0000 7f529d7fa700 1 -- 192.168.123.100:0/3831337904 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f528800d590 con 0x7f52a01015b0 2026-03-10T12:46:31.045 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.044+0000 7f52a63a0700 1 -- 192.168.123.100:0/3831337904 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f52a019d090 con 0x7f52a01015b0 2026-03-10T12:46:31.045 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.044+0000 7f52a63a0700 1 -- 192.168.123.100:0/3831337904 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f52a019d5e0 con 0x7f52a01015b0 2026-03-10T12:46:31.046 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.046+0000 7f529d7fa700 1 -- 192.168.123.100:0/3831337904 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f52880109a0 con 0x7f52a01015b0 2026-03-10T12:46:31.046 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.046+0000 7f52a63a0700 1 -- 192.168.123.100:0/3831337904 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f52a010a090 con 0x7f52a01015b0 2026-03-10T12:46:31.046 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.046+0000 
7f529d7fa700 1 --2- 192.168.123.100:0/3831337904 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f528c0778c0 0x7f528c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:31.047 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.046+0000 7f529d7fa700 1 -- 192.168.123.100:0/3831337904 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f52880994a0 con 0x7f52a01015b0 2026-03-10T12:46:31.047 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.047+0000 7f529f7fe700 1 --2- 192.168.123.100:0/3831337904 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f528c0778c0 0x7f528c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:31.047 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.047+0000 7f529f7fe700 1 --2- 192.168.123.100:0/3831337904 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f528c0778c0 0x7f528c079d70 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f5290009fd0 tx=0x7f5290005fb0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:31.049 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.049+0000 7f529d7fa700 1 -- 192.168.123.100:0/3831337904 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5288061c20 con 0x7f52a01015b0 2026-03-10T12:46:31.185 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:30 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/3800747766' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-10T12:46:31.185 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:30 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/1069135262' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch 2026-03-10T12:46:31.185 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:46:31.186 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.185+0000 7f52a63a0700 1 -- 192.168.123.100:0/3831337904 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 31, "format": "json"} v 0) v1 -- 0x7f52a004ea50 con 0x7f52a01015b0 2026-03-10T12:46:31.186 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.185+0000 7f529d7fa700 1 -- 192.168.123.100:0/3831337904 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 31, "format": "json"}]=0 dumped fsmap epoch 31 v38) v1 ==== 107+0+4314 (secure 0 0 0) 0x7f5288061370 con 0x7f52a01015b0 2026-03-10T12:46:31.188 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:31.188 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":31,"btime":"2026-03-10T12:43:47:764428+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34368,"name":"cephfs.vm00.wdwvcu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6829/23281310","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":23281310},{"type":"v1","addr":"192.168.123.100:6829","nonce":23281310}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":24},{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":22},{"gid":44301,"name":"cephfs.vm07.wznhgu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6825/48365433","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":48365433},{"type":"v1","addr":"192.168.123.107:6825","nonce":48365433}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":30}],"filesystems":[{"mdsmap":{"epoch":31,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:47.764419+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":83,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[1],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T12:46:31.190 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.190+0000 7f52a63a0700 1 -- 192.168.123.100:0/3831337904 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f528c0778c0 msgr2=0x7f528c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:31.190 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.190+0000 7f52a63a0700 1 --2- 192.168.123.100:0/3831337904 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f528c0778c0 0x7f528c079d70 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7f5290009fd0 tx=0x7f5290005fb0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.190 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.190+0000 7f52a63a0700 1 -- 192.168.123.100:0/3831337904 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f52a01015b0 msgr2=0x7f52a0198450 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:31.190 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.190+0000 7f52a63a0700 1 --2- 192.168.123.100:0/3831337904 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f52a01015b0 0x7f52a0198450 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f528800b700 tx=0x7f528800bac0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.190 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.190+0000 7f52a63a0700 1 -- 192.168.123.100:0/3831337904 shutdown_connections 2026-03-10T12:46:31.190 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.190+0000 7f52a63a0700 1 --2- 192.168.123.100:0/3831337904 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f528c0778c0 0x7f528c079d70 unknown :-1 
s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.190 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.190+0000 7f52a63a0700 1 --2- 192.168.123.100:0/3831337904 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f52a01015b0 0x7f52a0198450 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.190 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.191+0000 7f52a63a0700 1 --2- 192.168.123.100:0/3831337904 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f52a0101ec0 0x7f52a0198990 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.191 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.191+0000 7f52a63a0700 1 -- 192.168.123.100:0/3831337904 >> 192.168.123.100:0/3831337904 conn(0x7f52a00faf00 msgr2=0x7f52a00ffba0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:31.191 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.191+0000 7f52a63a0700 1 -- 192.168.123.100:0/3831337904 shutdown_connections 2026-03-10T12:46:31.191 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.191+0000 7f52a63a0700 1 -- 192.168.123.100:0/3831337904 wait complete. 2026-03-10T12:46:31.192 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 31 2026-03-10T12:46:31.250 DEBUG:tasks.fs:max_mds reduced in epoch 31 2026-03-10T12:46:31.250 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 32 2026-03-10T12:46:31.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:30 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/3800747766' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-10T12:46:31.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:30 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/1069135262' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch 2026-03-10T12:46:31.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:46:31.393 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:31.621 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.620+0000 7f4ab54f7700 1 -- 192.168.123.100:0/897851300 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4ab0108780 msgr2=0x7f4ab0108b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:31.621 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.620+0000 7f4ab54f7700 1 --2- 192.168.123.100:0/897851300 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4ab0108780 0x7f4ab0108b50 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7f4aa800b3a0 tx=0x7f4aa800b6b0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.621 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.620+0000 7f4ab54f7700 1 -- 192.168.123.100:0/897851300 shutdown_connections 2026-03-10T12:46:31.621 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.620+0000 7f4ab54f7700 1 --2- 192.168.123.100:0/897851300 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ab0102780 0x7f4ab0102bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.621 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.620+0000 7f4ab54f7700 1 --2- 
192.168.123.100:0/897851300 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4ab0108780 0x7f4ab0108b50 secure :-1 s=CLOSED pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7f4aa800b3a0 tx=0x7f4aa800b6b0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.621 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.620+0000 7f4ab54f7700 1 -- 192.168.123.100:0/897851300 >> 192.168.123.100:0/897851300 conn(0x7f4ab00fe280 msgr2=0x7f4ab0100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:31.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.622+0000 7f4ab54f7700 1 -- 192.168.123.100:0/897851300 shutdown_connections 2026-03-10T12:46:31.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.622+0000 7f4ab54f7700 1 -- 192.168.123.100:0/897851300 wait complete. 2026-03-10T12:46:31.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.622+0000 7f4ab54f7700 1 Processor -- start 2026-03-10T12:46:31.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.622+0000 7f4ab54f7700 1 -- start start 2026-03-10T12:46:31.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.622+0000 7f4ab54f7700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4ab0102780 0x7f4ab0075260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:31.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.622+0000 7f4ab54f7700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ab00757a0 0x7f4ab0075c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:31.622 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.622+0000 7f4ab54f7700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4ab00792e0 con 0x7f4ab0102780 2026-03-10T12:46:31.623 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.622+0000 7f4ab54f7700 1 -- --> 
[v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4ab0079450 con 0x7f4ab00757a0 2026-03-10T12:46:31.623 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.623+0000 7f4aae7fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ab00757a0 0x7f4ab0075c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:31.623 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.623+0000 7f4aae7fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ab00757a0 0x7f4ab0075c10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:60118/0 (socket says 192.168.123.100:60118) 2026-03-10T12:46:31.623 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.623+0000 7f4aae7fc700 1 -- 192.168.123.100:0/3334856289 learned_addr learned my addr 192.168.123.100:0/3334856289 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:31.623 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.623+0000 7f4aaeffd700 1 --2- 192.168.123.100:0/3334856289 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4ab0102780 0x7f4ab0075260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:31.623 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.623+0000 7f4aae7fc700 1 -- 192.168.123.100:0/3334856289 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4ab0102780 msgr2=0x7f4ab0075260 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:31.623 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.623+0000 7f4aae7fc700 1 --2- 192.168.123.100:0/3334856289 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7f4ab0102780 0x7f4ab0075260 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.623 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.623+0000 7f4aae7fc700 1 -- 192.168.123.100:0/3334856289 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4aa800b050 con 0x7f4ab00757a0 2026-03-10T12:46:31.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.623+0000 7f4aae7fc700 1 --2- 192.168.123.100:0/3334856289 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ab00757a0 0x7f4ab0075c10 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f4aa400b700 tx=0x7f4aa400bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:31.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.623+0000 7f4a97fff700 1 -- 192.168.123.100:0/3334856289 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4aa4010840 con 0x7f4ab00757a0 2026-03-10T12:46:31.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.623+0000 7f4ab54f7700 1 -- 192.168.123.100:0/3334856289 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4ab01a6d20 con 0x7f4ab00757a0 2026-03-10T12:46:31.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.623+0000 7f4aaeffd700 1 --2- 192.168.123.100:0/3334856289 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4ab0102780 0x7f4ab0075260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T12:46:31.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.623+0000 7f4ab54f7700 1 -- 192.168.123.100:0/3334856289 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4ab01a71e0 con 0x7f4ab00757a0 2026-03-10T12:46:31.624 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.623+0000 7f4a97fff700 1 -- 192.168.123.100:0/3334856289 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f4aa4010e80 con 0x7f4ab00757a0 2026-03-10T12:46:31.625 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.623+0000 7f4a97fff700 1 -- 192.168.123.100:0/3334856289 <== mon.1 v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4aa400d590 con 0x7f4ab00757a0 2026-03-10T12:46:31.625 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.625+0000 7f4a97fff700 1 -- 192.168.123.100:0/3334856289 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4aa40109a0 con 0x7f4ab00757a0 2026-03-10T12:46:31.625 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.625+0000 7f4a97fff700 1 --2- 192.168.123.100:0/3334856289 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4a980778c0 0x7f4a98079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:31.625 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.625+0000 7f4a97fff700 1 -- 192.168.123.100:0/3334856289 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f4aa40994e0 con 0x7f4ab00757a0 2026-03-10T12:46:31.625 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.625+0000 7f4ab54f7700 1 -- 192.168.123.100:0/3334856289 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4a9c005320 con 0x7f4ab00757a0 
2026-03-10T12:46:31.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.625+0000 7f4aaeffd700 1 --2- 192.168.123.100:0/3334856289 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4a980778c0 0x7f4a98079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:31.626 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.626+0000 7f4aaeffd700 1 --2- 192.168.123.100:0/3334856289 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4a980778c0 0x7f4a98079d70 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f4aa800b370 tx=0x7f4aa800bf90 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:31.628 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.629+0000 7f4a97fff700 1 -- 192.168.123.100:0/3334856289 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4aa4061be0 con 0x7f4ab00757a0 2026-03-10T12:46:31.769 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.768+0000 7f4ab54f7700 1 -- 192.168.123.100:0/3334856289 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 32, "format": "json"} v 0) v1 -- 0x7f4a9c005190 con 0x7f4ab00757a0 2026-03-10T12:46:31.769 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.769+0000 7f4a97fff700 1 -- 192.168.123.100:0/3334856289 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 32, "format": "json"}]=0 dumped fsmap epoch 32 v38) v1 ==== 107+0+4393 (secure 0 0 0) 0x7f4aa4061330 con 0x7f4ab00757a0 2026-03-10T12:46:31.770 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:31.770 
INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":32,"btime":"2026-03-10T12:43:47:772902+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22},{"gid":44301,"name":"cephfs.vm07.wznhgu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6825/48365433","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":48365433},{"type":"v1","addr":"192.168.123.107:6825","nonce":48365433}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses 
versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":30}],"filesystems":[{"mdsmap":{"epoch":32,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:47.772881+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":83,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34368},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34368":{"gid":34368,"name":"cephfs.vm00.wdwvcu","rank":0,"incarnation":32,"state":"up:replay","state_seq":1,"addr":"192.168.123.100:6829/23281310","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":23281310},{"type":"v1","addr":"192.168.123.100:6829","nonce":23281310}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T12:46:31.772 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.772+0000 7f4ab54f7700 1 -- 192.168.123.100:0/3334856289 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4a980778c0 msgr2=0x7f4a98079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:31.772 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.772+0000 7f4ab54f7700 1 --2- 192.168.123.100:0/3334856289 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4a980778c0 0x7f4a98079d70 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f4aa800b370 tx=0x7f4aa800bf90 comp rx=0 tx=0).stop 2026-03-10T12:46:31.772 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.772+0000 7f4ab54f7700 1 -- 192.168.123.100:0/3334856289 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ab00757a0 msgr2=0x7f4ab0075c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:31.772 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.772+0000 7f4ab54f7700 1 --2- 192.168.123.100:0/3334856289 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ab00757a0 0x7f4ab0075c10 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f4aa400b700 tx=0x7f4aa400bac0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.772 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.772+0000 7f4ab54f7700 1 -- 192.168.123.100:0/3334856289 shutdown_connections 2026-03-10T12:46:31.773 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.773+0000 7f4ab54f7700 1 --2- 192.168.123.100:0/3334856289 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f4a980778c0 0x7f4a98079d70 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.773+0000 7f4ab54f7700 1 --2- 192.168.123.100:0/3334856289 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f4ab0102780 0x7f4ab0075260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.773+0000 7f4ab54f7700 1 --2- 192.168.123.100:0/3334856289 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f4ab00757a0 0x7f4ab0075c10 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:31.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.773+0000 7f4ab54f7700 1 -- 192.168.123.100:0/3334856289 >> 192.168.123.100:0/3334856289 conn(0x7f4ab00fe280 msgr2=0x7f4ab00ffcd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:31.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.773+0000 7f4ab54f7700 1 -- 192.168.123.100:0/3334856289 shutdown_connections 2026-03-10T12:46:31.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:31.773+0000 7f4ab54f7700 1 -- 192.168.123.100:0/3334856289 wait complete. 
2026-03-10T12:46:31.774 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 32 2026-03-10T12:46:31.833 DEBUG:tasks.fs:max_mds reduced in epoch 32 2026-03-10T12:46:31.833 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 33 2026-03-10T12:46:31.972 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:32.020 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:31 vm00.local ceph-mon[103263]: pgmap v267: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:32.020 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:31 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/3831337904' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-10T12:46:32.020 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:31 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/3334856289' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 32, "format": "json"}]: dispatch 2026-03-10T12:46:32.216 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.215+0000 7f8102a57700 1 -- 192.168.123.100:0/871767545 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80fc0686f0 msgr2=0x7f80fc068ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:32.216 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.215+0000 7f8102a57700 1 --2- 192.168.123.100:0/871767545 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80fc0686f0 0x7f80fc068ac0 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7f80e4009b00 tx=0x7f80e4009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:32.216 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.216+0000 7f8102a57700 1 -- 192.168.123.100:0/871767545 shutdown_connections 2026-03-10T12:46:32.216 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.216+0000 7f8102a57700 1 --2- 192.168.123.100:0/871767545 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80fc069000 0x7f80fc1051e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:32.216 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.216+0000 7f8102a57700 1 --2- 192.168.123.100:0/871767545 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80fc0686f0 0x7f80fc068ac0 unknown :-1 s=CLOSED pgs=214 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:32.216 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.216+0000 7f8102a57700 1 -- 192.168.123.100:0/871767545 >> 192.168.123.100:0/871767545 conn(0x7f80fc0754a0 msgr2=0x7f80fc0758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:32.217 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.216+0000 7f8102a57700 1 -- 192.168.123.100:0/871767545 shutdown_connections 2026-03-10T12:46:32.217 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.216+0000 7f8102a57700 1 -- 192.168.123.100:0/871767545 wait complete. 2026-03-10T12:46:32.217 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.217+0000 7f8102a57700 1 Processor -- start 2026-03-10T12:46:32.217 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.217+0000 7f8102a57700 1 -- start start 2026-03-10T12:46:32.217 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.218+0000 7f8102a57700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80fc0686f0 0x7f80fc193fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:32.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.218+0000 7f8102a57700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80fc069000 0x7f80fc194510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:32.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.218+0000 7f8102a57700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80fc194bf0 con 0x7f80fc069000 2026-03-10T12:46:32.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.218+0000 7f8102a57700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80fc198980 con 0x7f80fc0686f0 2026-03-10T12:46:32.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.218+0000 7f80fb7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80fc069000 0x7f80fc194510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:32.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.218+0000 7f80fbfff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80fc0686f0 0x7f80fc193fd0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:32.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.218+0000 7f80fb7fe700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80fc069000 0x7f80fc194510 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60934/0 (socket says 192.168.123.100:60934) 2026-03-10T12:46:32.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.218+0000 7f80fb7fe700 1 -- 192.168.123.100:0/3022024000 learned_addr learned my addr 192.168.123.100:0/3022024000 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:32.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.218+0000 7f80fb7fe700 1 -- 192.168.123.100:0/3022024000 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80fc0686f0 msgr2=0x7f80fc193fd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:32.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.218+0000 7f80fb7fe700 1 --2- 192.168.123.100:0/3022024000 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80fc0686f0 0x7f80fc193fd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:32.218 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.218+0000 7f80fb7fe700 1 -- 192.168.123.100:0/3022024000 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f80e40097e0 con 0x7f80fc069000 2026-03-10T12:46:32.219 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.219+0000 7f80fb7fe700 1 --2- 192.168.123.100:0/3022024000 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80fc069000 0x7f80fc194510 secure :-1 s=READY pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7f80ec00cc60 
tx=0x7f80ec00cf70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:32.220 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.219+0000 7f80f97fa700 1 -- 192.168.123.100:0/3022024000 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80ec007960 con 0x7f80fc069000 2026-03-10T12:46:32.220 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.219+0000 7f80f97fa700 1 -- 192.168.123.100:0/3022024000 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f80ec00f450 con 0x7f80fc069000 2026-03-10T12:46:32.220 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.219+0000 7f8102a57700 1 -- 192.168.123.100:0/3022024000 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f80fc198c60 con 0x7f80fc069000 2026-03-10T12:46:32.220 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.219+0000 7f8102a57700 1 -- 192.168.123.100:0/3022024000 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f80fc1991b0 con 0x7f80fc069000 2026-03-10T12:46:32.220 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.220+0000 7f80f97fa700 1 -- 192.168.123.100:0/3022024000 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80ec0186a0 con 0x7f80fc069000 2026-03-10T12:46:32.220 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.220+0000 7f8102a57700 1 -- 192.168.123.100:0/3022024000 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f80fc108b30 con 0x7f80fc069000 2026-03-10T12:46:32.223 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.221+0000 7f80f97fa700 1 -- 192.168.123.100:0/3022024000 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f80ec01f030 con 
0x7f80fc069000 2026-03-10T12:46:32.223 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.222+0000 7f80f97fa700 1 --2- 192.168.123.100:0/3022024000 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f80e807bdf0 0x7f80e807e2a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:32.223 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.222+0000 7f80f97fa700 1 -- 192.168.123.100:0/3022024000 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f80ec099d20 con 0x7f80fc069000 2026-03-10T12:46:32.223 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.222+0000 7f80fbfff700 1 --2- 192.168.123.100:0/3022024000 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f80e807bdf0 0x7f80e807e2a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:32.223 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.222+0000 7f80fbfff700 1 --2- 192.168.123.100:0/3022024000 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f80e807bdf0 0x7f80e807e2a0 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f80e400b5c0 tx=0x7f80e4005fb0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:32.224 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.223+0000 7f80f97fa700 1 -- 192.168.123.100:0/3022024000 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f80ec062520 con 0x7f80fc069000 2026-03-10T12:46:32.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:31 vm07.local ceph-mon[93622]: pgmap v267: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:32.316 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:31 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/3831337904' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-10T12:46:32.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:31 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/3334856289' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 32, "format": "json"}]: dispatch 2026-03-10T12:46:32.367 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.364+0000 7f8102a57700 1 -- 192.168.123.100:0/3022024000 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 33, "format": "json"} v 0) v1 -- 0x7f80fc04ea50 con 0x7f80fc069000 2026-03-10T12:46:32.369 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.369+0000 7f80f97fa700 1 -- 192.168.123.100:0/3022024000 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 33, "format": "json"}]=0 dumped fsmap epoch 33 v38) v1 ==== 107+0+4396 (secure 0 0 0) 0x7f80ec061c70 con 0x7f80fc069000 2026-03-10T12:46:32.369 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:32.369 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":33,"btime":"2026-03-10T12:43:51:175313+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22},{"gid":44301,"name":"cephfs.vm07.wznhgu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6825/48365433","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":48365433},{"type":"v1","addr":"192.168.123.107:6825","nonce":48365433}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":30}],"filesystems":[{"mdsmap":{"epoch":33,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:50.741508+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":83,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34368},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34368":{"gid":34368,"name":"cephfs.vm00.wdwvcu","rank":0,"incarnation":32,"state":"up:reconnect","state_seq":6,"addr":"192.168.123.100:6829/23281310","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":23281310},{"type":"v1","addr":"192.168.123.100:6829","nonce":23281310}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T12:46:32.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.372+0000 7f8102a57700 1 -- 192.168.123.100:0/3022024000 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f80e807bdf0 msgr2=0x7f80e807e2a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:32.372 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.372+0000 7f8102a57700 1 --2- 192.168.123.100:0/3022024000 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f80e807bdf0 0x7f80e807e2a0 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f80e400b5c0 tx=0x7f80e4005fb0 comp rx=0 tx=0).stop 2026-03-10T12:46:32.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.373+0000 7f8102a57700 1 -- 192.168.123.100:0/3022024000 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80fc069000 msgr2=0x7f80fc194510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:32.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.373+0000 7f8102a57700 1 --2- 192.168.123.100:0/3022024000 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80fc069000 0x7f80fc194510 secure :-1 s=READY pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7f80ec00cc60 tx=0x7f80ec00cf70 comp rx=0 tx=0).stop 2026-03-10T12:46:32.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.373+0000 7f8102a57700 1 -- 192.168.123.100:0/3022024000 shutdown_connections 2026-03-10T12:46:32.373 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.373+0000 7f8102a57700 1 --2- 192.168.123.100:0/3022024000 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f80e807bdf0 0x7f80e807e2a0 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:32.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.374+0000 7f8102a57700 1 --2- 192.168.123.100:0/3022024000 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f80fc0686f0 0x7f80fc193fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:32.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.374+0000 7f8102a57700 1 --2- 192.168.123.100:0/3022024000 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f80fc069000 0x7f80fc194510 unknown :-1 s=CLOSED pgs=215 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:32.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.374+0000 7f8102a57700 1 -- 192.168.123.100:0/3022024000 >> 192.168.123.100:0/3022024000 conn(0x7f80fc0754a0 msgr2=0x7f80fc0ff760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:32.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.374+0000 7f8102a57700 1 -- 192.168.123.100:0/3022024000 shutdown_connections 2026-03-10T12:46:32.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.374+0000 7f8102a57700 1 -- 192.168.123.100:0/3022024000 wait complete. 
2026-03-10T12:46:32.375 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 33 2026-03-10T12:46:32.420 DEBUG:tasks.fs:max_mds reduced in epoch 33 2026-03-10T12:46:32.420 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 34 2026-03-10T12:46:32.588 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:32.848 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.847+0000 7f6a71617700 1 -- 192.168.123.100:0/3335136201 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6a6c0ff220 msgr2=0x7f6a6c0ff690 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:32.848 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.847+0000 7f6a71617700 1 --2- 192.168.123.100:0/3335136201 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6a6c0ff220 0x7f6a6c0ff690 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7f6a54009b00 tx=0x7f6a54009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:32.848 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.848+0000 7f6a71617700 1 -- 192.168.123.100:0/3335136201 shutdown_connections 2026-03-10T12:46:32.848 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.848+0000 7f6a71617700 1 --2- 192.168.123.100:0/3335136201 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6a6c0ff220 0x7f6a6c0ff690 unknown :-1 s=CLOSED pgs=216 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:32.848 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.848+0000 7f6a71617700 1 --2- 192.168.123.100:0/3335136201 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a6c1024b0 0x7f6a6c102880 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:32.848 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.848+0000 7f6a71617700 1 -- 192.168.123.100:0/3335136201 >> 192.168.123.100:0/3335136201 conn(0x7f6a6c0747e0 msgr2=0x7f6a6c074be0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:32.848 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.848+0000 7f6a71617700 1 -- 192.168.123.100:0/3335136201 shutdown_connections 2026-03-10T12:46:32.848 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.848+0000 7f6a71617700 1 -- 192.168.123.100:0/3335136201 wait complete. 2026-03-10T12:46:32.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.849+0000 7f6a71617700 1 Processor -- start 2026-03-10T12:46:32.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.849+0000 7f6a71617700 1 -- start start 2026-03-10T12:46:32.849 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.849+0000 7f6a71617700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a6c0ff220 0x7f6a6c1983e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:32.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.850+0000 7f6a71617700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6a6c1024b0 0x7f6a6c198920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:32.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.850+0000 7f6a71617700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6a6c199000 con 0x7f6a6c1024b0 2026-03-10T12:46:32.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.850+0000 7f6a71617700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6a6c19cd90 con 0x7f6a6c0ff220 2026-03-10T12:46:32.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.850+0000 7f6a63fff700 1 --2- >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6a6c1024b0 0x7f6a6c198920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:32.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.850+0000 7f6a63fff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6a6c1024b0 0x7f6a6c198920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60946/0 (socket says 192.168.123.100:60946) 2026-03-10T12:46:32.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.850+0000 7f6a63fff700 1 -- 192.168.123.100:0/830741735 learned_addr learned my addr 192.168.123.100:0/830741735 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:32.850 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.850+0000 7f6a6affd700 1 --2- 192.168.123.100:0/830741735 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a6c0ff220 0x7f6a6c1983e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:32.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.851+0000 7f6a63fff700 1 -- 192.168.123.100:0/830741735 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a6c0ff220 msgr2=0x7f6a6c1983e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:32.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.851+0000 7f6a63fff700 1 --2- 192.168.123.100:0/830741735 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a6c0ff220 0x7f6a6c1983e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:32.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.851+0000 7f6a63fff700 1 -- 
192.168.123.100:0/830741735 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6a540097e0 con 0x7f6a6c1024b0 2026-03-10T12:46:32.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.851+0000 7f6a6affd700 1 --2- 192.168.123.100:0/830741735 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a6c0ff220 0x7f6a6c1983e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T12:46:32.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.851+0000 7f6a63fff700 1 --2- 192.168.123.100:0/830741735 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6a6c1024b0 0x7f6a6c198920 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7f6a54004930 tx=0x7f6a54004a10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:32.851 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.851+0000 7f6a68ff9700 1 -- 192.168.123.100:0/830741735 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6a5401d070 con 0x7f6a6c1024b0 2026-03-10T12:46:32.853 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.851+0000 7f6a68ff9700 1 -- 192.168.123.100:0/830741735 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6a5400bc50 con 0x7f6a6c1024b0 2026-03-10T12:46:32.853 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.851+0000 7f6a68ff9700 1 -- 192.168.123.100:0/830741735 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6a5400f790 con 0x7f6a6c1024b0 2026-03-10T12:46:32.853 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.851+0000 7f6a71617700 1 -- 192.168.123.100:0/830741735 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6a6c19d070 con 0x7f6a6c1024b0 
2026-03-10T12:46:32.853 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.851+0000 7f6a71617700 1 -- 192.168.123.100:0/830741735 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6a6c19d5c0 con 0x7f6a6c1024b0 2026-03-10T12:46:32.856 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.853+0000 7f6a71617700 1 -- 192.168.123.100:0/830741735 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6a6c10a990 con 0x7f6a6c1024b0 2026-03-10T12:46:32.856 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.853+0000 7f6a68ff9700 1 -- 192.168.123.100:0/830741735 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6a54022470 con 0x7f6a6c1024b0 2026-03-10T12:46:32.856 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.854+0000 7f6a68ff9700 1 --2- 192.168.123.100:0/830741735 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f6a4c0778c0 0x7f6a4c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:32.856 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.854+0000 7f6a68ff9700 1 -- 192.168.123.100:0/830741735 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f6a5409b390 con 0x7f6a6c1024b0 2026-03-10T12:46:32.856 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.856+0000 7f6a68ff9700 1 -- 192.168.123.100:0/830741735 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6a54063b10 con 0x7f6a6c1024b0 2026-03-10T12:46:32.856 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.856+0000 7f6a6affd700 1 --2- 192.168.123.100:0/830741735 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f6a4c0778c0 
0x7f6a4c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:32.857 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:32.857+0000 7f6a6affd700 1 --2- 192.168.123.100:0/830741735 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f6a4c0778c0 0x7f6a4c079d70 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f6a5c0099d0 tx=0x7f6a5c008040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:33.002 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.002+0000 7f6a71617700 1 -- 192.168.123.100:0/830741735 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 34, "format": "json"} v 0) v1 -- 0x7f6a6c04ea50 con 0x7f6a6c1024b0 2026-03-10T12:46:33.003 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.003+0000 7f6a68ff9700 1 -- 192.168.123.100:0/830741735 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 34, "format": "json"}]=0 dumped fsmap epoch 34 v38) v1 ==== 107+0+4393 (secure 0 0 0) 0x7f6a54063260 con 0x7f6a6c1024b0 2026-03-10T12:46:33.003 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:33.003 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":34,"btime":"2026-03-10T12:43:52:203027+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22},{"gid":44301,"name":"cephfs.vm07.wznhgu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6825/48365433","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":48365433},{"type":"v1","addr":"192.168.123.107:6825","nonce":48365433}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":30}],"filesystems":[{"mdsmap":{"epoch":34,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:51.209601+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":83,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34368},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34368":{"gid":34368,"name":"cephfs.vm00.wdwvcu","rank":0,"incarnation":32,"state":"up:rejoin","state_seq":7,"addr":"192.168.123.100:6829/23281310","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":23281310},{"type":"v1","addr":"192.168.123.100:6829","nonce":23281310}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T12:46:33.005 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.005+0000 7f6a71617700 1 -- 192.168.123.100:0/830741735 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f6a4c0778c0 msgr2=0x7f6a4c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:33.005 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.005+0000 7f6a71617700 1 --2- 192.168.123.100:0/830741735 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f6a4c0778c0 0x7f6a4c079d70 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f6a5c0099d0 tx=0x7f6a5c008040 comp rx=0 tx=0).stop 2026-03-10T12:46:33.005 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.005+0000 7f6a71617700 1 -- 192.168.123.100:0/830741735 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6a6c1024b0 msgr2=0x7f6a6c198920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:33.005 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.005+0000 7f6a71617700 1 --2- 192.168.123.100:0/830741735 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6a6c1024b0 0x7f6a6c198920 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7f6a54004930 tx=0x7f6a54004a10 comp rx=0 tx=0).stop 2026-03-10T12:46:33.006 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.006+0000 7f6a71617700 1 -- 192.168.123.100:0/830741735 shutdown_connections 2026-03-10T12:46:33.006 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.006+0000 7f6a71617700 1 --2- 192.168.123.100:0/830741735 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f6a4c0778c0 0x7f6a4c079d70 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:33.006 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.006+0000 7f6a71617700 1 --2- 192.168.123.100:0/830741735 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f6a6c0ff220 0x7f6a6c1983e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:33.006 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.006+0000 7f6a71617700 1 --2- 192.168.123.100:0/830741735 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f6a6c1024b0 0x7f6a6c198920 unknown :-1 s=CLOSED pgs=217 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:33.006 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.006+0000 7f6a71617700 1 -- 192.168.123.100:0/830741735 >> 192.168.123.100:0/830741735 conn(0x7f6a6c0747e0 msgr2=0x7f6a6c0fc340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:33.006 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.006+0000 7f6a71617700 1 -- 192.168.123.100:0/830741735 shutdown_connections 2026-03-10T12:46:33.006 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.006+0000 7f6a71617700 1 -- 192.168.123.100:0/830741735 wait complete. 2026-03-10T12:46:33.007 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 34 2026-03-10T12:46:33.071 DEBUG:tasks.fs:max_mds reduced in epoch 34 2026-03-10T12:46:33.071 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 35 2026-03-10T12:46:33.224 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:33.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:32 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/3022024000' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 33, "format": "json"}]: dispatch 2026-03-10T12:46:33.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:32 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/3022024000' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 33, "format": "json"}]: dispatch 2026-03-10T12:46:33.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.511+0000 7f2407cff700 1 -- 192.168.123.100:0/1902738893 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2400102760 msgr2=0x7f2400102bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:33.512 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.511+0000 7f2407cff700 1 --2- 192.168.123.100:0/1902738893 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2400102760 0x7f2400102bd0 secure :-1 s=READY pgs=218 cs=0 l=1 rev1=1 crypto rx=0x7f23f0009b00 tx=0x7f23f0009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:33.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.513+0000 7f2407cff700 1 -- 192.168.123.100:0/1902738893 shutdown_connections 2026-03-10T12:46:33.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.513+0000 7f2407cff700 1 --2- 192.168.123.100:0/1902738893 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2400102760 0x7f2400102bd0 unknown :-1 s=CLOSED pgs=218 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:33.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.513+0000 7f2407cff700 1 --2- 192.168.123.100:0/1902738893 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2400108760 0x7f2400108b30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:33.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.513+0000 7f2407cff700 1 -- 192.168.123.100:0/1902738893 >> 192.168.123.100:0/1902738893 conn(0x7f24000fe280 
msgr2=0x7f2400100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:33.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.513+0000 7f2407cff700 1 -- 192.168.123.100:0/1902738893 shutdown_connections 2026-03-10T12:46:33.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.513+0000 7f2407cff700 1 -- 192.168.123.100:0/1902738893 wait complete. 2026-03-10T12:46:33.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.513+0000 7f2407cff700 1 Processor -- start 2026-03-10T12:46:33.513 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.513+0000 7f2407cff700 1 -- start start 2026-03-10T12:46:33.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.514+0000 7f2407cff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2400102760 0x7f2400198420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:33.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.514+0000 7f2407cff700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2400108760 0x7f2400198960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:33.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.514+0000 7f2407cff700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2400199040 con 0x7f2400102760 2026-03-10T12:46:33.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.514+0000 7f2407cff700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f240019cdd0 con 0x7f2400108760 2026-03-10T12:46:33.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.514+0000 7f240529a700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2400108760 0x7f2400198960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:33.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.514+0000 7f240529a700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2400108760 0x7f2400198960 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:60176/0 (socket says 192.168.123.100:60176) 2026-03-10T12:46:33.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.514+0000 7f240529a700 1 -- 192.168.123.100:0/2118766013 learned_addr learned my addr 192.168.123.100:0/2118766013 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:33.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.514+0000 7f2405a9b700 1 --2- 192.168.123.100:0/2118766013 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2400102760 0x7f2400198420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:33.514 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.514+0000 7f240529a700 1 -- 192.168.123.100:0/2118766013 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2400102760 msgr2=0x7f2400198420 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:33.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.514+0000 7f240529a700 1 --2- 192.168.123.100:0/2118766013 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2400102760 0x7f2400198420 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:33.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.514+0000 7f240529a700 1 -- 192.168.123.100:0/2118766013 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f23f00097e0 con 0x7f2400108760 
2026-03-10T12:46:33.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.514+0000 7f2405a9b700 1 --2- 192.168.123.100:0/2118766013 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2400102760 0x7f2400198420 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T12:46:33.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.515+0000 7f240529a700 1 --2- 192.168.123.100:0/2118766013 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2400108760 0x7f2400198960 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f23f00048c0 tx=0x7f23f00049a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:33.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.515+0000 7f23f6ffd700 1 -- 192.168.123.100:0/2118766013 <== mon.1 v2:192.168.123.107:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f23f001d070 con 0x7f2400108760 2026-03-10T12:46:33.515 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.515+0000 7f2407cff700 1 -- 192.168.123.100:0/2118766013 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f240019d050 con 0x7f2400108760 2026-03-10T12:46:33.516 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.515+0000 7f2407cff700 1 -- 192.168.123.100:0/2118766013 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f240019d5c0 con 0x7f2400108760 2026-03-10T12:46:33.516 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.516+0000 7f23f6ffd700 1 -- 192.168.123.100:0/2118766013 <== mon.1 v2:192.168.123.107:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f23f000bc50 con 0x7f2400108760 2026-03-10T12:46:33.516 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.516+0000 7f23f6ffd700 1 -- 192.168.123.100:0/2118766013 <== mon.1 
v2:192.168.123.107:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f23f000f670 con 0x7f2400108760 2026-03-10T12:46:33.516 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.516+0000 7f2407cff700 1 -- 192.168.123.100:0/2118766013 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f23e4005320 con 0x7f2400108760 2026-03-10T12:46:33.518 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.517+0000 7f23f6ffd700 1 -- 192.168.123.100:0/2118766013 <== mon.1 v2:192.168.123.107:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f23f000f8b0 con 0x7f2400108760 2026-03-10T12:46:33.518 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.517+0000 7f23f6ffd700 1 --2- 192.168.123.100:0/2118766013 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f23ec077870 0x7f23ec079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:33.518 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.517+0000 7f23f6ffd700 1 -- 192.168.123.100:0/2118766013 <== mon.1 v2:192.168.123.107:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f23f009b180 con 0x7f2400108760 2026-03-10T12:46:33.518 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.518+0000 7f2405a9b700 1 --2- 192.168.123.100:0/2118766013 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f23ec077870 0x7f23ec079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:33.518 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.518+0000 7f2405a9b700 1 --2- 192.168.123.100:0/2118766013 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f23ec077870 0x7f23ec079d20 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto 
rx=0x7f24001038a0 tx=0x7f23fc008040 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:33.520 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.520+0000 7f23f6ffd700 1 -- 192.168.123.100:0/2118766013 <== mon.1 v2:192.168.123.107:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f23f00639b0 con 0x7f2400108760 2026-03-10T12:46:33.659 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.659+0000 7f2407cff700 1 -- 192.168.123.100:0/2118766013 --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 35, "format": "json"} v 0) v1 -- 0x7f23e4005190 con 0x7f2400108760 2026-03-10T12:46:33.660 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.660+0000 7f23f6ffd700 1 -- 192.168.123.100:0/2118766013 <== mon.1 v2:192.168.123.107:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 35, "format": "json"}]=0 dumped fsmap epoch 35 v38) v1 ==== 107+0+4402 (secure 0 0 0) 0x7f23f0063100 con 0x7f2400108760 2026-03-10T12:46:33.661 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:33.661 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":35,"btime":"2026-03-10T12:43:53:232805+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22},{"gid":44301,"name":"cephfs.vm07.wznhgu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6825/48365433","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":48365433},{"type":"v1","addr":"192.168.123.107:6825","nonce":48365433}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":30}],"filesystems":[{"mdsmap":{"epoch":35,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:53.232804+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":83,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34368},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34368":{"gid":34368,"name":"cephfs.vm00.wdwvcu","rank":0,"incarnation":32,"state":"up:active","state_seq":8,"addr":"192.168.123.100:6829/23281310","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":23281310},{"type":"v1","addr":"192.168.123.100:6829","nonce":23281310}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34368,"qdb_cluster":[34368]},"id":1}]} 2026-03-10T12:46:33.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.663+0000 7f2407cff700 1 -- 192.168.123.100:0/2118766013 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f23ec077870 msgr2=0x7f23ec079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:33.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.663+0000 7f2407cff700 1 --2- 192.168.123.100:0/2118766013 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f23ec077870 0x7f23ec079d20 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f24001038a0 tx=0x7f23fc008040 comp rx=0 tx=0).stop 2026-03-10T12:46:33.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.663+0000 7f2407cff700 1 -- 192.168.123.100:0/2118766013 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2400108760 msgr2=0x7f2400198960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:33.663 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.663+0000 7f2407cff700 1 --2- 192.168.123.100:0/2118766013 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2400108760 0x7f2400198960 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f23f00048c0 tx=0x7f23f00049a0 comp rx=0 tx=0).stop 2026-03-10T12:46:33.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.664+0000 7f2407cff700 1 -- 192.168.123.100:0/2118766013 shutdown_connections 2026-03-10T12:46:33.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.664+0000 7f2407cff700 1 --2- 192.168.123.100:0/2118766013 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f23ec077870 0x7f23ec079d20 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:33.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.664+0000 7f2407cff700 1 --2- 192.168.123.100:0/2118766013 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f2400102760 0x7f2400198420 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:33.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.664+0000 7f2407cff700 1 --2- 192.168.123.100:0/2118766013 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f2400108760 0x7f2400198960 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:33.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.664+0000 7f2407cff700 1 -- 192.168.123.100:0/2118766013 >> 192.168.123.100:0/2118766013 conn(0x7f24000fe280 msgr2=0x7f24000ffa50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:33.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.664+0000 7f2407cff700 1 -- 192.168.123.100:0/2118766013 shutdown_connections 2026-03-10T12:46:33.664 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:33.664+0000 7f2407cff700 1 -- 192.168.123.100:0/2118766013 wait complete. 
2026-03-10T12:46:33.665 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 35 2026-03-10T12:46:33.713 DEBUG:tasks.fs:max_mds reduced in epoch 35 2026-03-10T12:46:33.714 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 36 2026-03-10T12:46:33.871 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:34.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.130+0000 7f3403d40700 1 -- 192.168.123.100:0/3433576082 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f33fc108810 msgr2=0x7f33fc108be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:34.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.130+0000 7f3403d40700 1 --2- 192.168.123.100:0/3433576082 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f33fc108810 0x7f33fc108be0 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7f33ec009b00 tx=0x7f33ec009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:34.131 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.131+0000 7f3403d40700 1 -- 192.168.123.100:0/3433576082 shutdown_connections 2026-03-10T12:46:34.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.131+0000 7f3403d40700 1 --2- 192.168.123.100:0/3433576082 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f33fc102810 0x7f33fc102c80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:34.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.131+0000 7f3403d40700 1 --2- 192.168.123.100:0/3433576082 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f33fc108810 0x7f33fc108be0 unknown :-1 s=CLOSED pgs=219 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:34.132 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.131+0000 7f3403d40700 1 -- 192.168.123.100:0/3433576082 >> 192.168.123.100:0/3433576082 conn(0x7f33fc0fe330 msgr2=0x7f33fc100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:34.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.131+0000 7f3403d40700 1 -- 192.168.123.100:0/3433576082 shutdown_connections 2026-03-10T12:46:34.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.132+0000 7f3403d40700 1 -- 192.168.123.100:0/3433576082 wait complete. 2026-03-10T12:46:34.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.132+0000 7f3403d40700 1 Processor -- start 2026-03-10T12:46:34.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.132+0000 7f3403d40700 1 -- start start 2026-03-10T12:46:34.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.132+0000 7f3403d40700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f33fc102810 0x7f33fc1983f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:34.132 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.132+0000 7f3403d40700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f33fc108810 0x7f33fc198930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:34.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.132+0000 7f3403d40700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f33fc199010 con 0x7f33fc102810 2026-03-10T12:46:34.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.132+0000 7f3403d40700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f33fc19cda0 con 0x7f33fc108810 2026-03-10T12:46:34.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.133+0000 7f3401adc700 1 --2- >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f33fc102810 0x7f33fc1983f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:34.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.133+0000 7f3401adc700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f33fc102810 0x7f33fc1983f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:60988/0 (socket says 192.168.123.100:60988) 2026-03-10T12:46:34.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.133+0000 7f3401adc700 1 -- 192.168.123.100:0/926709157 learned_addr learned my addr 192.168.123.100:0/926709157 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:34.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.133+0000 7f34012db700 1 --2- 192.168.123.100:0/926709157 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f33fc108810 0x7f33fc198930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:34.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.133+0000 7f3401adc700 1 -- 192.168.123.100:0/926709157 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f33fc108810 msgr2=0x7f33fc198930 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:34.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.133+0000 7f3401adc700 1 --2- 192.168.123.100:0/926709157 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f33fc108810 0x7f33fc198930 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:34.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.133+0000 7f3401adc700 1 -- 
192.168.123.100:0/926709157 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f33ec0097e0 con 0x7f33fc102810 2026-03-10T12:46:34.133 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.133+0000 7f3401adc700 1 --2- 192.168.123.100:0/926709157 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f33fc102810 0x7f33fc1983f0 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f33ec00bb30 tx=0x7f33ec00bc10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:34.134 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.134+0000 7f33f2ffd700 1 -- 192.168.123.100:0/926709157 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f33ec01d070 con 0x7f33fc102810 2026-03-10T12:46:34.135 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.134+0000 7f3403d40700 1 -- 192.168.123.100:0/926709157 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f33fc19d020 con 0x7f33fc102810 2026-03-10T12:46:34.135 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.134+0000 7f33f2ffd700 1 -- 192.168.123.100:0/926709157 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f33ec00f460 con 0x7f33fc102810 2026-03-10T12:46:34.135 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.134+0000 7f33f2ffd700 1 -- 192.168.123.100:0/926709157 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f33ec021600 con 0x7f33fc102810 2026-03-10T12:46:34.135 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.134+0000 7f3403d40700 1 -- 192.168.123.100:0/926709157 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f33fc19d510 con 0x7f33fc102810 2026-03-10T12:46:34.136 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.135+0000 7f33f2ffd700 1 -- 192.168.123.100:0/926709157 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f33ec02b430 con 0x7f33fc102810 2026-03-10T12:46:34.136 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.136+0000 7f33f2ffd700 1 --2- 192.168.123.100:0/926709157 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f33e80778c0 0x7f33e8079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:34.136 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.136+0000 7f33f2ffd700 1 -- 192.168.123.100:0/926709157 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7f33ec09bf20 con 0x7f33fc102810 2026-03-10T12:46:34.136 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.136+0000 7f34012db700 1 --2- 192.168.123.100:0/926709157 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f33e80778c0 0x7f33e8079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:34.136 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.136+0000 7f3403d40700 1 -- 192.168.123.100:0/926709157 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f33fc04ea50 con 0x7f33fc102810 2026-03-10T12:46:34.139 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.139+0000 7f34012db700 1 --2- 192.168.123.100:0/926709157 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f33e80778c0 0x7f33e8079d70 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f33fc199a10 tx=0x7f33f8009450 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:34.140 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.139+0000 7f33f2ffd700 1 -- 192.168.123.100:0/926709157 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f33ec0646a0 con 0x7f33fc102810 2026-03-10T12:46:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:33 vm00.local ceph-mon[103263]: pgmap v268: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:33 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/830741735' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 34, "format": "json"}]: dispatch 2026-03-10T12:46:34.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:33 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/2118766013' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 35, "format": "json"}]: dispatch 2026-03-10T12:46:34.287 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.286+0000 7f3403d40700 1 -- 192.168.123.100:0/926709157 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 36, "format": "json"} v 0) v1 -- 0x7f33fc199750 con 0x7f33fc102810 2026-03-10T12:46:34.287 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.287+0000 7f33f2ffd700 1 -- 192.168.123.100:0/926709157 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 36, "format": "json"}]=0 dumped fsmap epoch 36 v38) v1 ==== 107+0+5253 (secure 0 0 0) 0x7f33ec063df0 con 0x7f33fc102810 2026-03-10T12:46:34.288 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:34.288 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":36,"btime":"2026-03-10T12:43:55.237735+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts 
on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":22},{"gid":44301,"name":"cephfs.vm07.wznhgu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6825/48365433","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":48365433},{"type":"v1","addr":"192.168.123.107:6825","nonce":48365433}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":30},{"gid":44305,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/3408808533","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3408808533},{"type":"v1","addr":"192.168.123.107:6827","nonce":3408808533}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":36}],"filesystems":[{"mdsmap":{"epoch":35,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:53.232804+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":83,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34368},"failed":[],"damaged":[],"stopped":[1],"info":{"gid_34368":{"gid":34368,"name":"cephfs.vm00.wdwvcu","rank":0,"incarnation":32,"state":"up:active","state_seq":8,"addr":"192.168.123.100:6829/23281310","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":23281310},{"type":"v1","addr":"192.168.123.100:6829","nonce":23281310}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34368,"qdb_cluster":[34368]},"id":1}]} 2026-03-10T12:46:34.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.290+0000 7f3403d40700 1 -- 192.168.123.100:0/926709157 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f33e80778c0 msgr2=0x7f33e8079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:34.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.290+0000 7f3403d40700 1 --2- 192.168.123.100:0/926709157 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f33e80778c0 0x7f33e8079d70 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f33fc199a10 tx=0x7f33f8009450 comp rx=0 tx=0).stop 2026-03-10T12:46:34.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.290+0000 7f3403d40700 1 -- 192.168.123.100:0/926709157 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] 
conn(0x7f33fc102810 msgr2=0x7f33fc1983f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:34.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.290+0000 7f3403d40700 1 --2- 192.168.123.100:0/926709157 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f33fc102810 0x7f33fc1983f0 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f33ec00bb30 tx=0x7f33ec00bc10 comp rx=0 tx=0).stop 2026-03-10T12:46:34.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.290+0000 7f3403d40700 1 -- 192.168.123.100:0/926709157 shutdown_connections 2026-03-10T12:46:34.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.290+0000 7f3403d40700 1 --2- 192.168.123.100:0/926709157 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7f33e80778c0 0x7f33e8079d70 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:34.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.290+0000 7f3403d40700 1 --2- 192.168.123.100:0/926709157 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7f33fc102810 0x7f33fc1983f0 unknown :-1 s=CLOSED pgs=220 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:34.290 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.290+0000 7f3403d40700 1 --2- 192.168.123.100:0/926709157 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7f33fc108810 0x7f33fc198930 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:34.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.291+0000 7f3403d40700 1 -- 192.168.123.100:0/926709157 >> 192.168.123.100:0/926709157 conn(0x7f33fc0fe330 msgr2=0x7f33fc0ffa90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:34.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.291+0000 7f3403d40700 1 -- 192.168.123.100:0/926709157 shutdown_connections 
2026-03-10T12:46:34.291 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.291+0000 7f3403d40700 1 -- 192.168.123.100:0/926709157 wait complete. 2026-03-10T12:46:34.292 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 36 2026-03-10T12:46:34.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:33 vm07.local ceph-mon[93622]: pgmap v268: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:34.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:33 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/830741735' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 34, "format": "json"}]: dispatch 2026-03-10T12:46:34.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:33 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/2118766013' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 35, "format": "json"}]: dispatch 2026-03-10T12:46:34.352 DEBUG:tasks.fs:max_mds reduced in epoch 36 2026-03-10T12:46:34.352 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph fs dump --format=json 37 2026-03-10T12:46:34.501 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:34.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.772+0000 7feac997f700 1 -- 192.168.123.100:0/1150332544 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feac4068490 msgr2=0x7feac4068900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:34.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.772+0000 7feac997f700 1 --2- 192.168.123.100:0/1150332544 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feac4068490 0x7feac4068900 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7feab4009b00 tx=0x7feab4009e10 comp rx=0 
tx=0).stop 2026-03-10T12:46:34.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.773+0000 7feac997f700 1 -- 192.168.123.100:0/1150332544 shutdown_connections 2026-03-10T12:46:34.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.773+0000 7feac997f700 1 --2- 192.168.123.100:0/1150332544 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feac4068490 0x7feac4068900 unknown :-1 s=CLOSED pgs=221 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:34.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.773+0000 7feac997f700 1 --2- 192.168.123.100:0/1150332544 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feac41066c0 0x7feac4106a90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:34.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.773+0000 7feac997f700 1 -- 192.168.123.100:0/1150332544 >> 192.168.123.100:0/1150332544 conn(0x7feac40754a0 msgr2=0x7feac40758a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:34.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.773+0000 7feac997f700 1 -- 192.168.123.100:0/1150332544 shutdown_connections 2026-03-10T12:46:34.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.773+0000 7feac997f700 1 -- 192.168.123.100:0/1150332544 wait complete. 
2026-03-10T12:46:34.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.773+0000 7feac997f700 1 Processor -- start 2026-03-10T12:46:34.773 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.774+0000 7feac997f700 1 -- start start 2026-03-10T12:46:34.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.774+0000 7feac997f700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feac4068490 0x7feac40fffd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:34.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.774+0000 7feac997f700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feac41066c0 0x7feac4100510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:34.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.774+0000 7feac997f700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feac4101d80 con 0x7feac4068490 2026-03-10T12:46:34.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.774+0000 7feac997f700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feac4101ef0 con 0x7feac41066c0 2026-03-10T12:46:34.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.774+0000 7feac2ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feac4068490 0x7feac40fffd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:34.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.774+0000 7feac27fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feac41066c0 0x7feac4100510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T12:46:34.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.774+0000 7feac2ffd700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feac4068490 0x7feac40fffd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:32768/0 (socket says 192.168.123.100:32768) 2026-03-10T12:46:34.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.774+0000 7feac2ffd700 1 -- 192.168.123.100:0/4203536567 learned_addr learned my addr 192.168.123.100:0/4203536567 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:34.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.774+0000 7feac27fc700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feac41066c0 0x7feac4100510 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.107:3300/0 says I am v2:192.168.123.100:60200/0 (socket says 192.168.123.100:60200) 2026-03-10T12:46:34.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.775+0000 7feac2ffd700 1 -- 192.168.123.100:0/4203536567 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feac41066c0 msgr2=0x7feac4100510 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:34.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.775+0000 7feac2ffd700 1 --2- 192.168.123.100:0/4203536567 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feac41066c0 0x7feac4100510 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:34.774 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.775+0000 7feac2ffd700 1 -- 192.168.123.100:0/4203536567 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feab40097e0 con 0x7feac4068490 2026-03-10T12:46:34.775 
INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.775+0000 7feac2ffd700 1 --2- 192.168.123.100:0/4203536567 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feac4068490 0x7feac40fffd0 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7feaac00b700 tx=0x7feaac00ba10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:34.775 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.775+0000 7feac897d700 1 -- 192.168.123.100:0/4203536567 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feaac0107c0 con 0x7feac4068490 2026-03-10T12:46:34.775 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.775+0000 7feac897d700 1 -- 192.168.123.100:0/4203536567 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7feaac010e00 con 0x7feac4068490 2026-03-10T12:46:34.775 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.775+0000 7feac997f700 1 -- 192.168.123.100:0/4203536567 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feac4100b10 con 0x7feac4068490 2026-03-10T12:46:34.776 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.775+0000 7feac897d700 1 -- 192.168.123.100:0/4203536567 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feaac00f360 con 0x7feac4068490 2026-03-10T12:46:34.776 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.776+0000 7feac997f700 1 -- 192.168.123.100:0/4203536567 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feac41a6e10 con 0x7feac4068490 2026-03-10T12:46:34.776 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.777+0000 7feac997f700 1 -- 192.168.123.100:0/4203536567 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7feac4103950 con 0x7feac4068490 2026-03-10T12:46:34.779 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.777+0000 7feac897d700 1 -- 192.168.123.100:0/4203536567 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7feaac017360 con 0x7feac4068490 2026-03-10T12:46:34.780 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.780+0000 7feac897d700 1 --2- 192.168.123.100:0/4203536567 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feab0077870 0x7feab0079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:34.780 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.780+0000 7feac897d700 1 -- 192.168.123.100:0/4203536567 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7feaac0986b0 con 0x7feac4068490 2026-03-10T12:46:34.780 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.780+0000 7feac897d700 1 -- 192.168.123.100:0/4203536567 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7feaac098990 con 0x7feac4068490 2026-03-10T12:46:34.780 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.780+0000 7feac27fc700 1 --2- 192.168.123.100:0/4203536567 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feab0077870 0x7feab0079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:34.781 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.781+0000 7feac27fc700 1 --2- 192.168.123.100:0/4203536567 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feab0077870 0x7feab0079d20 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7feab4009ad0 tx=0x7feab4005fb0 comp rx=0 tx=0).ready entity=mgr.24563 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:34.921 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.921+0000 7feac997f700 1 -- 192.168.123.100:0/4203536567 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 37, "format": "json"} v 0) v1 -- 0x7feac404ea50 con 0x7feac4068490 2026-03-10T12:46:34.925 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.925+0000 7feac897d700 1 -- 192.168.123.100:0/4203536567 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 37, "format": "json"}]=0 dumped fsmap epoch 37 v38) v1 ==== 107+0+5270 (secure 0 0 0) 0x7feaac060e40 con 0x7feac4068490 2026-03-10T12:46:34.926 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:46:34.926 INFO:teuthology.orchestra.run.vm00.stdout:{"epoch":37,"btime":"2026-03-10T12:43:59:271412+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44301,"name":"cephfs.vm07.wznhgu","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6825/48365433","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6824","nonce":48365433},{"type":"v1","addr":"192.168.123.107:6825","nonce":48365433}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is 
stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":30},{"gid":44305,"name":"cephfs.vm07.rhzwnr","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.107:6827/3408808533","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.107:6826","nonce":3408808533},{"type":"v1","addr":"192.168.123.107:6827","nonce":3408808533}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":36}],"filesystems":[{"mdsmap":{"epoch":37,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T12:35:09.477786+0000","modified":"2026-03-10T12:43:59.271396+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":83,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no 
anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":2,"in":[0,1],"up":{"mds_0":34368,"mds_1":44277},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34368":{"gid":34368,"name":"cephfs.vm00.wdwvcu","rank":0,"incarnation":32,"state":"up:active","state_seq":8,"addr":"192.168.123.100:6829/23281310","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6828","nonce":23281310},{"type":"v1","addr":"192.168.123.100:6829","nonce":23281310}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}},"gid_44277":{"gid":44277,"name":"cephfs.vm00.lnokoe","rank":1,"incarnation":37,"state":"up:starting","state_seq":1,"addr":"192.168.123.100:6827/2887557827","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.100:6826","nonce":2887557827},{"type":"v1","addr":"192.168.123.100:6827","nonce":2887557827}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34368,"qdb_cluster":[34368]},"id":1}]} 2026-03-10T12:46:34.928 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.928+0000 7feac997f700 1 -- 192.168.123.100:0/4203536567 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feab0077870 msgr2=0x7feab0079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:34.928 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.928+0000 7feac997f700 1 --2- 192.168.123.100:0/4203536567 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feab0077870 0x7feab0079d20 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7feab4009ad0 tx=0x7feab4005fb0 comp rx=0 tx=0).stop 2026-03-10T12:46:34.929 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.928+0000 7feac997f700 1 -- 192.168.123.100:0/4203536567 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feac4068490 msgr2=0x7feac40fffd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:34.929 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.928+0000 7feac997f700 1 --2- 192.168.123.100:0/4203536567 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feac4068490 0x7feac40fffd0 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7feaac00b700 tx=0x7feaac00ba10 comp rx=0 tx=0).stop 2026-03-10T12:46:34.929 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.929+0000 7feac997f700 1 -- 192.168.123.100:0/4203536567 shutdown_connections 2026-03-10T12:46:34.929 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.929+0000 7feac997f700 1 --2- 192.168.123.100:0/4203536567 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feab0077870 0x7feab0079d20 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T12:46:34.929 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.929+0000 7feac997f700 1 --2- 192.168.123.100:0/4203536567 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7feac4068490 0x7feac40fffd0 unknown :-1 s=CLOSED pgs=222 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:34.929 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.929+0000 7feac997f700 1 --2- 192.168.123.100:0/4203536567 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7feac41066c0 0x7feac4100510 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:34.929 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.929+0000 7feac997f700 1 -- 192.168.123.100:0/4203536567 >> 192.168.123.100:0/4203536567 conn(0x7feac40754a0 msgr2=0x7feac40fede0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:34.929 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.929+0000 7feac997f700 1 -- 192.168.123.100:0/4203536567 shutdown_connections 2026-03-10T12:46:34.929 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:34.929+0000 7feac997f700 1 -- 192.168.123.100:0/4203536567 wait complete. 2026-03-10T12:46:34.930 INFO:teuthology.orchestra.run.vm00.stderr:dumped fsmap epoch 37 2026-03-10T12:46:35.006 DEBUG:teuthology.run_tasks:Unwinding manager ceph-fuse 2026-03-10T12:46:35.009 INFO:tasks.ceph_fuse:Unmounting ceph-fuse clients... 
2026-03-10T12:46:35.009 DEBUG:teuthology.orchestra.run.vm00:> set -ex 2026-03-10T12:46:35.009 DEBUG:teuthology.orchestra.run.vm00:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-10T12:46:35.026 DEBUG:teuthology.orchestra.run.vm00:> set -ex 2026-03-10T12:46:35.026 DEBUG:teuthology.orchestra.run.vm00:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-10T12:46:35.084 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd blocklist ls 2026-03-10T12:46:35.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:34 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/926709157' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 36, "format": "json"}]: dispatch 2026-03-10T12:46:35.235 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:34 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/4203536567' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 37, "format": "json"}]: dispatch 2026-03-10T12:46:35.285 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:35.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:34 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/926709157' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 36, "format": "json"}]: dispatch 2026-03-10T12:46:35.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:34 vm07.local ceph-mon[93622]: from='client.? 
192.168.123.100:0/4203536567' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 37, "format": "json"}]: dispatch 2026-03-10T12:46:35.537 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.536+0000 7ff7c2fd0700 1 -- 192.168.123.100:0/2654498015 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff7bc073070 msgr2=0x7ff7bc073440 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:35.537 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.536+0000 7ff7c2fd0700 1 --2- 192.168.123.100:0/2654498015 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff7bc073070 0x7ff7bc073440 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7ff7a8009b30 tx=0x7ff7a8009e40 comp rx=0 tx=0).stop 2026-03-10T12:46:35.537 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.537+0000 7ff7c2fd0700 1 -- 192.168.123.100:0/2654498015 shutdown_connections 2026-03-10T12:46:35.537 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.537+0000 7ff7c2fd0700 1 --2- 192.168.123.100:0/2654498015 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7bc073980 0x7ff7bc10c8a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:35.537 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.537+0000 7ff7c2fd0700 1 --2- 192.168.123.100:0/2654498015 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff7bc073070 0x7ff7bc073440 unknown :-1 s=CLOSED pgs=223 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:35.537 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.537+0000 7ff7c2fd0700 1 -- 192.168.123.100:0/2654498015 >> 192.168.123.100:0/2654498015 conn(0x7ff7bc0fbfc0 msgr2=0x7ff7bc0fe3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:35.537 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.537+0000 7ff7c2fd0700 1 -- 192.168.123.100:0/2654498015 shutdown_connections 
2026-03-10T12:46:35.537 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.538+0000 7ff7c2fd0700 1 -- 192.168.123.100:0/2654498015 wait complete. 2026-03-10T12:46:35.538 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.538+0000 7ff7c2fd0700 1 Processor -- start 2026-03-10T12:46:35.538 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.538+0000 7ff7c2fd0700 1 -- start start 2026-03-10T12:46:35.538 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.538+0000 7ff7c2fd0700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7bc073070 0x7ff7bc198390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:35.539 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.539+0000 7ff7c2fd0700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff7bc073980 0x7ff7bc1988d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:35.539 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.539+0000 7ff7c2fd0700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff7bc198fb0 con 0x7ff7bc073980 2026-03-10T12:46:35.539 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.539+0000 7ff7c2fd0700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff7bc19ccf0 con 0x7ff7bc073070 2026-03-10T12:46:35.539 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.539+0000 7ff7bbfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff7bc073980 0x7ff7bc1988d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:35.539 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.539+0000 7ff7bbfff700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff7bc073980 
0x7ff7bc1988d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am v2:192.168.123.100:32790/0 (socket says 192.168.123.100:32790) 2026-03-10T12:46:35.539 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.539+0000 7ff7bbfff700 1 -- 192.168.123.100:0/813768532 learned_addr learned my addr 192.168.123.100:0/813768532 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:35.539 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.539+0000 7ff7c0d6c700 1 --2- 192.168.123.100:0/813768532 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7bc073070 0x7ff7bc198390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:35.539 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.539+0000 7ff7bbfff700 1 -- 192.168.123.100:0/813768532 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7bc073070 msgr2=0x7ff7bc198390 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:35.539 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.539+0000 7ff7bbfff700 1 --2- 192.168.123.100:0/813768532 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7bc073070 0x7ff7bc198390 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:35.539 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.539+0000 7ff7bbfff700 1 -- 192.168.123.100:0/813768532 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff7a80097e0 con 0x7ff7bc073980 2026-03-10T12:46:35.539 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.539+0000 7ff7bbfff700 1 --2- 192.168.123.100:0/813768532 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff7bc073980 0x7ff7bc1988d0 secure :-1 s=READY pgs=224 cs=0 
l=1 rev1=1 crypto rx=0x7ff7b000d8d0 tx=0x7ff7b000dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:35.540 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.540+0000 7ff7b9ffb700 1 -- 192.168.123.100:0/813768532 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff7b0009880 con 0x7ff7bc073980 2026-03-10T12:46:35.540 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.540+0000 7ff7b9ffb700 1 -- 192.168.123.100:0/813768532 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff7b0010460 con 0x7ff7bc073980 2026-03-10T12:46:35.540 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.540+0000 7ff7c2fd0700 1 -- 192.168.123.100:0/813768532 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff7bc19cfd0 con 0x7ff7bc073980 2026-03-10T12:46:35.540 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.540+0000 7ff7c2fd0700 1 -- 192.168.123.100:0/813768532 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff7bc19d520 con 0x7ff7bc073980 2026-03-10T12:46:35.541 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.541+0000 7ff7b9ffb700 1 -- 192.168.123.100:0/813768532 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff7b000f5d0 con 0x7ff7bc073980 2026-03-10T12:46:35.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.541+0000 7ff7b9ffb700 1 -- 192.168.123.100:0/813768532 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff7b000f730 con 0x7ff7bc073980 2026-03-10T12:46:35.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.542+0000 7ff7b9ffb700 1 --2- 192.168.123.100:0/813768532 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff7ac0779e0 0x7ff7ac079e90 unknown 
:-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:35.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.542+0000 7ff7b9ffb700 1 -- 192.168.123.100:0/813768532 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7ff7b00998d0 con 0x7ff7bc073980 2026-03-10T12:46:35.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.542+0000 7ff7c2fd0700 1 -- 192.168.123.100:0/813768532 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff7a0005320 con 0x7ff7bc073980 2026-03-10T12:46:35.542 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.542+0000 7ff7c0d6c700 1 --2- 192.168.123.100:0/813768532 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff7ac0779e0 0x7ff7ac079e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:35.543 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.543+0000 7ff7c0d6c700 1 --2- 192.168.123.100:0/813768532 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff7ac0779e0 0x7ff7ac079e90 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7ff7a800b580 tx=0x7ff7a8005fb0 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:35.546 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.546+0000 7ff7b9ffb700 1 -- 192.168.123.100:0/813768532 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff7b00630e0 con 0x7ff7bc073980 2026-03-10T12:46:35.667 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.667+0000 7ff7c2fd0700 1 -- 192.168.123.100:0/813768532 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- 
mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7ff7a0005f70 con 0x7ff7bc073980 2026-03-10T12:46:35.668 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.668+0000 7ff7b9ffb700 1 -- 192.168.123.100:0/813768532 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 35 entries v83) v1 ==== 81+0+2141 (secure 0 0 0) 0x7ff7b0020070 con 0x7ff7bc073980 2026-03-10T12:46:35.670 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:6826/3705110268 2026-03-11T12:43:47.763979+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:6825/1465224692 2026-03-11T12:43:37.511992+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:6824/1465224692 2026-03-11T12:43:37.511992+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/718783970 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/771107042 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/2753083811 2026-03-11T12:32:46.753519+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:6801/2 2026-03-11T12:32:32.101116+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:0/2251444368 2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/69960775 2026-03-11T12:33:23.075938+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/385023950 2026-03-11T12:33:23.075938+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/2641999781 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/2255478548 2026-03-11T12:38:06.442275+0000 2026-03-10T12:46:35.671 
INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:6800/3276280342 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:0/3783258003 2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/4057224011 2026-03-11T12:38:06.442275+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/3472231466 2026-03-11T12:32:46.753519+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:6829/3729807627 2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:6828/3729807627 2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/4113305903 2026-03-11T12:32:32.101116+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/3579075241 2026-03-11T12:38:06.442275+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:6800/2 2026-03-11T12:32:32.101116+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:0/1498007334 2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/3513500591 2026-03-11T12:38:06.442275+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/1487901880 2026-03-11T12:32:32.101116+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:0/627900096 2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/1015166415 2026-03-11T12:32:46.753519+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:0/2254989276 2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/2792084710 
2026-03-11T12:33:23.075938+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/1442998252 2026-03-11T12:32:32.101116+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:0/1317719985 2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/1850197913 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/1819709408 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:35.671 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:6827/3705110268 2026-03-11T12:43:47.763979+0000 2026-03-10T12:46:35.672 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:6801/3276280342 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:35.672 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/3868925788 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:35.672 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.672+0000 7ff7c2fd0700 1 -- 192.168.123.100:0/813768532 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff7ac0779e0 msgr2=0x7ff7ac079e90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:35.672 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.672+0000 7ff7c2fd0700 1 --2- 192.168.123.100:0/813768532 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff7ac0779e0 0x7ff7ac079e90 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7ff7a800b580 tx=0x7ff7a8005fb0 comp rx=0 tx=0).stop 2026-03-10T12:46:35.672 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.672+0000 7ff7c2fd0700 1 -- 192.168.123.100:0/813768532 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff7bc073980 msgr2=0x7ff7bc1988d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:35.672 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.673+0000 
7ff7c2fd0700 1 --2- 192.168.123.100:0/813768532 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff7bc073980 0x7ff7bc1988d0 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7ff7b000d8d0 tx=0x7ff7b000dbe0 comp rx=0 tx=0).stop 2026-03-10T12:46:35.673 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.673+0000 7ff7c2fd0700 1 -- 192.168.123.100:0/813768532 shutdown_connections 2026-03-10T12:46:35.673 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.673+0000 7ff7c2fd0700 1 --2- 192.168.123.100:0/813768532 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7ff7ac0779e0 0x7ff7ac079e90 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:35.673 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.673+0000 7ff7c2fd0700 1 --2- 192.168.123.100:0/813768532 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7ff7bc073070 0x7ff7bc198390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:35.673 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.673+0000 7ff7c2fd0700 1 --2- 192.168.123.100:0/813768532 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7ff7bc073980 0x7ff7bc1988d0 unknown :-1 s=CLOSED pgs=224 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:35.673 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.673+0000 7ff7c2fd0700 1 -- 192.168.123.100:0/813768532 >> 192.168.123.100:0/813768532 conn(0x7ff7bc0fbfc0 msgr2=0x7ff7bc1070e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:35.673 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.673+0000 7ff7c2fd0700 1 -- 192.168.123.100:0/813768532 shutdown_connections 2026-03-10T12:46:35.674 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:35.674+0000 7ff7c2fd0700 1 -- 192.168.123.100:0/813768532 wait complete. 
2026-03-10T12:46:35.675 INFO:teuthology.orchestra.run.vm00.stderr:listed 35 entries 2026-03-10T12:46:35.744 DEBUG:teuthology.orchestra.run.vm00:> set -ex 2026-03-10T12:46:35.744 DEBUG:teuthology.orchestra.run.vm00:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-10T12:46:35.763 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph osd blocklist ls 2026-03-10T12:46:35.954 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config 2026-03-10T12:46:36.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:35 vm00.local ceph-mon[103263]: pgmap v269: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:36.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:35 vm00.local ceph-mon[103263]: from='client.? 192.168.123.100:0/813768532' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T12:46:36.235 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.233+0000 7febb519d700 1 -- 192.168.123.100:0/1272576518 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7febb0101ff0 msgr2=0x7febb010a4f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:36.235 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.233+0000 7febb519d700 1 --2- 192.168.123.100:0/1272576518 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7febb0101ff0 0x7febb010a4f0 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7feba0009b00 tx=0x7feba0009e10 comp rx=0 tx=0).stop 2026-03-10T12:46:36.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.236+0000 7febb519d700 1 -- 192.168.123.100:0/1272576518 shutdown_connections 2026-03-10T12:46:36.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.236+0000 7febb519d700 1 --2- 192.168.123.100:0/1272576518 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7febb0101ff0 0x7febb010a4f0 unknown :-1 s=CLOSED pgs=225 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:36.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.236+0000 7febb519d700 1 --2- 192.168.123.100:0/1272576518 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febb01016e0 0x7febb0101ab0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:36.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.236+0000 7febb519d700 1 -- 192.168.123.100:0/1272576518 >> 192.168.123.100:0/1272576518 conn(0x7febb00faf00 msgr2=0x7febb00fd310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:36.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.236+0000 7febb519d700 1 -- 192.168.123.100:0/1272576518 shutdown_connections 2026-03-10T12:46:36.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.236+0000 7febb519d700 1 -- 192.168.123.100:0/1272576518 wait complete. 
2026-03-10T12:46:36.236 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.236+0000 7febb519d700 1 Processor -- start 2026-03-10T12:46:36.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.236+0000 7febb519d700 1 -- start start 2026-03-10T12:46:36.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.236+0000 7febb519d700 1 --2- >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febb01016e0 0x7febb019adc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:36.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.236+0000 7febb519d700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7febb0101ff0 0x7febb0193de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:36.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.236+0000 7febb519d700 1 -- --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_getmap magic: 0 v1 -- 0x7febb019b450 con 0x7febb0101ff0 2026-03-10T12:46:36.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.236+0000 7febb519d700 1 -- --> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] -- mon_getmap magic: 0 v1 -- 0x7febb0194320 con 0x7febb01016e0 2026-03-10T12:46:36.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.237+0000 7febae59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7febb0101ff0 0x7febb0193de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:36.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.237+0000 7febae59c700 1 --2- >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7febb0101ff0 0x7febb0193de0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.100:3300/0 says I am 
v2:192.168.123.100:32820/0 (socket says 192.168.123.100:32820) 2026-03-10T12:46:36.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.237+0000 7febae59c700 1 -- 192.168.123.100:0/1217931763 learned_addr learned my addr 192.168.123.100:0/1217931763 (peer_addr_for_me v2:192.168.123.100:0/0) 2026-03-10T12:46:36.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.237+0000 7febaed9d700 1 --2- 192.168.123.100:0/1217931763 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febb01016e0 0x7febb019adc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:36.237 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.237+0000 7febae59c700 1 -- 192.168.123.100:0/1217931763 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febb01016e0 msgr2=0x7febb019adc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:36.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.237+0000 7febae59c700 1 --2- 192.168.123.100:0/1217931763 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febb01016e0 0x7febb019adc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:36.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.237+0000 7febae59c700 1 -- 192.168.123.100:0/1217931763 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feba00097e0 con 0x7febb0101ff0 2026-03-10T12:46:36.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.237+0000 7febaed9d700 1 --2- 192.168.123.100:0/1217931763 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febb01016e0 0x7febb019adc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T12:46:36.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.238+0000 7febae59c700 1 --2- 192.168.123.100:0/1217931763 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7febb0101ff0 0x7febb0193de0 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7feba0004930 tx=0x7feba0004a10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:36.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.238+0000 7feba7fff700 1 -- 192.168.123.100:0/1217931763 <== mon.0 v2:192.168.123.100:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feba001d070 con 0x7febb0101ff0 2026-03-10T12:46:36.238 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.238+0000 7febb519d700 1 -- 192.168.123.100:0/1217931763 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7febb0194520 con 0x7febb0101ff0 2026-03-10T12:46:36.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.238+0000 7febb519d700 1 -- 192.168.123.100:0/1217931763 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7febb0194a10 con 0x7febb0101ff0 2026-03-10T12:46:36.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.239+0000 7feba7fff700 1 -- 192.168.123.100:0/1217931763 <== mon.0 v2:192.168.123.100:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7feba000bc50 con 0x7febb0101ff0 2026-03-10T12:46:36.239 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.239+0000 7feba7fff700 1 -- 192.168.123.100:0/1217931763 <== mon.0 v2:192.168.123.100:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feba000f830 con 0x7febb0101ff0 2026-03-10T12:46:36.240 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.239+0000 7febb519d700 1 -- 192.168.123.100:0/1217931763 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7febb004ea50 con 0x7febb0101ff0 2026-03-10T12:46:36.243 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.241+0000 7feba7fff700 1 -- 192.168.123.100:0/1217931763 <== mon.0 v2:192.168.123.100:3300/0 4 ==== mgrmap(e 37) v1 ==== 100115+0+0 (secure 0 0 0) 0x7feba000f990 con 0x7febb0101ff0 2026-03-10T12:46:36.243 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.241+0000 7feba7fff700 1 --2- 192.168.123.100:0/1217931763 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feb9c0778c0 0x7feb9c079d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T12:46:36.243 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.241+0000 7febaed9d700 1 --2- 192.168.123.100:0/1217931763 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feb9c0778c0 0x7feb9c079d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T12:46:36.243 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.241+0000 7febaed9d700 1 --2- 192.168.123.100:0/1217931763 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feb9c0778c0 0x7feb9c079d70 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7feb98005fd0 tx=0x7feb98005e30 comp rx=0 tx=0).ready entity=mgr.24563 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T12:46:36.243 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.241+0000 7feba7fff700 1 -- 192.168.123.100:0/1217931763 <== mon.0 v2:192.168.123.100:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6308+0+0 (secure 0 0 0) 0x7feba009b270 con 0x7febb0101ff0 2026-03-10T12:46:36.243 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.243+0000 7feba7fff700 1 -- 192.168.123.100:0/1217931763 <== mon.0 v2:192.168.123.100:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7feba0064a60 con 0x7febb0101ff0 2026-03-10T12:46:36.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:35 vm07.local ceph-mon[93622]: pgmap v269: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:36.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:35 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/813768532' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T12:46:36.370 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.369+0000 7febb519d700 1 -- 192.168.123.100:0/1217931763 --> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7febb0195310 con 0x7febb0101ff0 2026-03-10T12:46:36.370 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.369+0000 7feba7fff700 1 -- 192.168.123.100:0/1217931763 <== mon.0 v2:192.168.123.100:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 35 entries v83) v1 ==== 81+0+2141 (secure 0 0 0) 0x7feba00641b0 con 0x7febb0101ff0 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:6826/3705110268 2026-03-11T12:43:47.763979+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:6825/1465224692 2026-03-11T12:43:37.511992+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:6824/1465224692 2026-03-11T12:43:37.511992+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/718783970 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/771107042 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/2753083811 2026-03-11T12:32:46.753519+0000 2026-03-10T12:46:36.372 
INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:6801/2 2026-03-11T12:32:32.101116+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:0/2251444368 2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/69960775 2026-03-11T12:33:23.075938+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/385023950 2026-03-11T12:33:23.075938+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/2641999781 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/2255478548 2026-03-11T12:38:06.442275+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:6800/3276280342 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:0/3783258003 2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/4057224011 2026-03-11T12:38:06.442275+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/3472231466 2026-03-11T12:32:46.753519+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:6829/3729807627 2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:36.372 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:6828/3729807627 2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/4113305903 2026-03-11T12:32:32.101116+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/3579075241 2026-03-11T12:38:06.442275+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:6800/2 2026-03-11T12:32:32.101116+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:0/1498007334 
2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/3513500591 2026-03-11T12:38:06.442275+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/1487901880 2026-03-11T12:32:32.101116+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:0/627900096 2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/1015166415 2026-03-11T12:32:46.753519+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:0/2254989276 2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/2792084710 2026-03-11T12:33:23.075938+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/1442998252 2026-03-11T12:32:32.101116+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:0/1317719985 2026-03-11T12:38:32.745529+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/1850197913 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/1819709408 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.107:6827/3705110268 2026-03-11T12:43:47.763979+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:6801/3276280342 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:36.373 INFO:teuthology.orchestra.run.vm00.stdout:192.168.123.100:0/3868925788 2026-03-11T12:39:00.598323+0000 2026-03-10T12:46:36.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.374+0000 7febb519d700 1 -- 192.168.123.100:0/1217931763 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feb9c0778c0 msgr2=0x7feb9c079d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T12:46:36.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.374+0000 7febb519d700 1 --2- 192.168.123.100:0/1217931763 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feb9c0778c0 0x7feb9c079d70 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7feb98005fd0 tx=0x7feb98005e30 comp rx=0 tx=0).stop 2026-03-10T12:46:36.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.374+0000 7febb519d700 1 -- 192.168.123.100:0/1217931763 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7febb0101ff0 msgr2=0x7febb0193de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T12:46:36.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.374+0000 7febb519d700 1 --2- 192.168.123.100:0/1217931763 >> [v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7febb0101ff0 0x7febb0193de0 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7feba0004930 tx=0x7feba0004a10 comp rx=0 tx=0).stop 2026-03-10T12:46:36.374 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.375+0000 7febb519d700 1 -- 192.168.123.100:0/1217931763 shutdown_connections 2026-03-10T12:46:36.375 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.375+0000 7febb519d700 1 --2- 192.168.123.100:0/1217931763 >> [v2:192.168.123.100:6800/464552988,v1:192.168.123.100:6801/464552988] conn(0x7feb9c0778c0 0x7feb9c079d70 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:36.375 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.375+0000 7febb519d700 1 --2- 192.168.123.100:0/1217931763 >> [v2:192.168.123.107:3300/0,v1:192.168.123.107:6789/0] conn(0x7febb01016e0 0x7febb019adc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:36.375 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.375+0000 7febb519d700 1 --2- 192.168.123.100:0/1217931763 >> 
[v2:192.168.123.100:3300/0,v1:192.168.123.100:6789/0] conn(0x7febb0101ff0 0x7febb0193de0 unknown :-1 s=CLOSED pgs=226 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T12:46:36.375 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.375+0000 7febb519d700 1 -- 192.168.123.100:0/1217931763 >> 192.168.123.100:0/1217931763 conn(0x7febb00faf00 msgr2=0x7febb00ffb00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T12:46:36.375 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.375+0000 7febb519d700 1 -- 192.168.123.100:0/1217931763 shutdown_connections 2026-03-10T12:46:36.375 INFO:teuthology.orchestra.run.vm00.stderr:2026-03-10T12:46:36.375+0000 7febb519d700 1 -- 192.168.123.100:0/1217931763 wait complete. 2026-03-10T12:46:36.376 INFO:teuthology.orchestra.run.vm00.stderr:listed 35 entries 2026-03-10T12:46:36.438 INFO:tasks.cephfs.fuse_mount:Running fusermount -u on ubuntu@vm00.local... 2026-03-10T12:46:36.438 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T12:46:36.438 DEBUG:teuthology.orchestra.run.vm00:> sudo fusermount -u /home/ubuntu/cephtest/mnt.0 2026-03-10T12:46:36.470 INFO:teuthology.orchestra.run:waiting for 300 2026-03-10T12:46:37.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:37 vm07.local ceph-mon[93622]: from='client.? 192.168.123.100:0/1217931763' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T12:46:37.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:37 vm07.local ceph-mon[93622]: pgmap v270: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:37.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:37 vm00.local ceph-mon[103263]: from='client.? 
192.168.123.100:0/1217931763' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T12:46:37.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:37 vm00.local ceph-mon[103263]: pgmap v270: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:40.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:39 vm07.local ceph-mon[93622]: pgmap v271: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:40.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:39 vm00.local ceph-mon[103263]: pgmap v271: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:42.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:41 vm07.local ceph-mon[93622]: pgmap v272: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:42.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:46:42.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:46:42.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:46:42.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:46:42.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:41 vm00.local ceph-mon[103263]: pgmap v272: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB 
avail 2026-03-10T12:46:42.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:46:42.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:46:42.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:46:42.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:46:44.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:43 vm07.local ceph-mon[93622]: pgmap v273: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:44.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:43 vm00.local ceph-mon[103263]: pgmap v273: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:46.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:45 vm00.local ceph-mon[103263]: pgmap v274: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:46.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:46:46.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:45 vm07.local ceph-mon[93622]: pgmap v274: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:46.315 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:46:47.983 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:47 vm00.local ceph-mon[103263]: pgmap v275: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:48.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:47 vm07.local ceph-mon[93622]: pgmap v275: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:50.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:49 vm00.local ceph-mon[103263]: pgmap v276: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:50.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:49 vm07.local ceph-mon[93622]: pgmap v276: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:52.183 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:51 vm00.local ceph-mon[103263]: pgmap v277: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:52.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:51 vm07.local ceph-mon[93622]: pgmap v277: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:54.118 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:53 vm07.local ceph-mon[93622]: pgmap v278: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:54.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:53 vm00.local ceph-mon[103263]: pgmap v278: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:56.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:55 vm00.local ceph-mon[103263]: pgmap v279: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 
GiB / 120 GiB avail 2026-03-10T12:46:56.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:55 vm07.local ceph-mon[93622]: pgmap v279: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:58.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:57 vm00.local ceph-mon[103263]: pgmap v280: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:46:58.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:57 vm07.local ceph-mon[93622]: pgmap v280: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:00.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:46:59 vm00.local ceph-mon[103263]: pgmap v281: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:00.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:46:59 vm07.local ceph-mon[93622]: pgmap v281: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:01.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:47:01.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:00 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:47:02.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:01 vm00.local ceph-mon[103263]: pgmap v282: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:02.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:01 vm07.local ceph-mon[93622]: pgmap v282: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:04.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:03 
vm00.local ceph-mon[103263]: pgmap v283: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:03 vm07.local ceph-mon[93622]: pgmap v283: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:06.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:05 vm00.local ceph-mon[103263]: pgmap v284: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:06.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:05 vm07.local ceph-mon[93622]: pgmap v284: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:08.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:07 vm00.local ceph-mon[103263]: pgmap v285: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:08.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:07 vm07.local ceph-mon[93622]: pgmap v285: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:10.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:09 vm00.local ceph-mon[103263]: pgmap v286: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:10.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:09 vm07.local ceph-mon[93622]: pgmap v286: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:12.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:11 vm00.local ceph-mon[103263]: pgmap v287: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:12.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:11 vm07.local ceph-mon[93622]: pgmap v287: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:14.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:13 
vm00.local ceph-mon[103263]: pgmap v288: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:14.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:13 vm07.local ceph-mon[93622]: pgmap v288: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:15 vm00.local ceph-mon[103263]: pgmap v289: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:47:16.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:15 vm07.local ceph-mon[93622]: pgmap v289: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:16.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:47:18.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:17 vm00.local ceph-mon[103263]: pgmap v290: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:18.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:17 vm07.local ceph-mon[93622]: pgmap v290: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:20.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:19 vm00.local ceph-mon[103263]: pgmap v291: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:47:20.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:19 vm07.local ceph-mon[93622]: pgmap v291: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 
GiB avail
2026-03-10T12:47:22.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:21 vm00.local ceph-mon[103263]: pgmap v292: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:22.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:21 vm07.local ceph-mon[93622]: pgmap v292: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:24.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:23 vm00.local ceph-mon[103263]: pgmap v293: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:24.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:23 vm07.local ceph-mon[93622]: pgmap v293: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:26.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:25 vm00.local ceph-mon[103263]: pgmap v294: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:26.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:25 vm07.local ceph-mon[93622]: pgmap v294: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:28.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:27 vm07.local ceph-mon[93622]: pgmap v295: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:28.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:27 vm00.local ceph-mon[103263]: pgmap v295: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:30.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:29 vm07.local ceph-mon[93622]: pgmap v296: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:30.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:29 vm00.local ceph-mon[103263]: pgmap v296: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:31.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:47:31.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:47:32.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:32 vm07.local ceph-mon[93622]: pgmap v297: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:32.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:32 vm00.local ceph-mon[103263]: pgmap v297: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:33.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:33 vm07.local ceph-mon[93622]: pgmap v298: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:33.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:33 vm00.local ceph-mon[103263]: pgmap v298: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:36.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:35 vm07.local ceph-mon[93622]: pgmap v299: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:36.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:35 vm00.local ceph-mon[103263]: pgmap v299: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:38.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:37 vm07.local ceph-mon[93622]: pgmap v300: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:38.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:37 vm00.local ceph-mon[103263]: pgmap v300: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:40.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:39 vm07.local ceph-mon[93622]: pgmap v301: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:40.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:39 vm00.local ceph-mon[103263]: pgmap v301: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:42.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:41 vm00.local ceph-mon[103263]: pgmap v302: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:42.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:47:42.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:47:42.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:47:42.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:47:42.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:41 vm07.local ceph-mon[93622]: pgmap v302: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:47:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:47:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:47:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:47:44.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:43 vm00.local ceph-mon[103263]: pgmap v303: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:44.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:43 vm07.local ceph-mon[93622]: pgmap v303: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:46.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:45 vm00.local ceph-mon[103263]: pgmap v304: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:46.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:47:46.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:45 vm07.local ceph-mon[93622]: pgmap v304: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:46.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:47:48.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:47 vm00.local ceph-mon[103263]: pgmap v305: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:48.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:47 vm07.local ceph-mon[93622]: pgmap v305: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:50.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:49 vm00.local ceph-mon[103263]: pgmap v306: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:50.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:49 vm07.local ceph-mon[93622]: pgmap v306: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:52.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:51 vm00.local ceph-mon[103263]: pgmap v307: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:52.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:51 vm07.local ceph-mon[93622]: pgmap v307: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:54.117 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:53 vm07.local ceph-mon[93622]: pgmap v308: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:54.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:53 vm00.local ceph-mon[103263]: pgmap v308: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:56.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:55 vm00.local ceph-mon[103263]: pgmap v309: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:56.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:55 vm07.local ceph-mon[93622]: pgmap v309: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:58.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:57 vm00.local ceph-mon[103263]: pgmap v310: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:47:58.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:57 vm07.local ceph-mon[93622]: pgmap v310: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:00.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:47:59 vm00.local ceph-mon[103263]: pgmap v311: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:00.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:47:59 vm07.local ceph-mon[93622]: pgmap v311: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:01.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:48:01.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:00 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:48:02.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:01 vm00.local ceph-mon[103263]: pgmap v312: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:02.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:01 vm07.local ceph-mon[93622]: pgmap v312: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:04.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:03 vm00.local ceph-mon[103263]: pgmap v313: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:03 vm07.local ceph-mon[93622]: pgmap v313: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:06.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:05 vm00.local ceph-mon[103263]: pgmap v314: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:06.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:05 vm07.local ceph-mon[93622]: pgmap v314: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:08.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:07 vm00.local ceph-mon[103263]: pgmap v315: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:08.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:07 vm07.local ceph-mon[93622]: pgmap v315: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:10.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:09 vm00.local ceph-mon[103263]: pgmap v316: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:10.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:09 vm07.local ceph-mon[93622]: pgmap v316: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:12.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:11 vm00.local ceph-mon[103263]: pgmap v317: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:12.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:11 vm07.local ceph-mon[93622]: pgmap v317: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:14.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:13 vm00.local ceph-mon[103263]: pgmap v318: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:14.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:13 vm07.local ceph-mon[93622]: pgmap v318: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:15.565 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:15 vm07.local ceph-mon[93622]: pgmap v319: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:15.733 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:15 vm00.local ceph-mon[103263]: pgmap v319: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:16.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:48:16.733 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:16 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:48:17.565 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:17 vm07.local ceph-mon[93622]: pgmap v320: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:17.733 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:17 vm00.local ceph-mon[103263]: pgmap v320: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:20.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:19 vm07.local ceph-mon[93622]: pgmap v321: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:20.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:19 vm00.local ceph-mon[103263]: pgmap v321: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:22.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:21 vm00.local ceph-mon[103263]: pgmap v322: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:22.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:21 vm07.local ceph-mon[93622]: pgmap v322: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:24.117 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:23 vm07.local ceph-mon[93622]: pgmap v323: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:24.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:23 vm00.local ceph-mon[103263]: pgmap v323: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:26.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:25 vm00.local ceph-mon[103263]: pgmap v324: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:26.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:25 vm07.local ceph-mon[93622]: pgmap v324: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:28.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:27 vm00.local ceph-mon[103263]: pgmap v325: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:28.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:27 vm07.local ceph-mon[93622]: pgmap v325: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:29.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:29 vm00.local ceph-mon[103263]: pgmap v326: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:29.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:29 vm07.local ceph-mon[93622]: pgmap v326: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:31.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:48:31.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:48:32.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:31 vm00.local ceph-mon[103263]: pgmap v327: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:32.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:31 vm07.local ceph-mon[93622]: pgmap v327: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:34.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:33 vm00.local ceph-mon[103263]: pgmap v328: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:34.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:33 vm07.local ceph-mon[93622]: pgmap v328: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:36.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:35 vm00.local ceph-mon[103263]: pgmap v329: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:36.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:35 vm07.local ceph-mon[93622]: pgmap v329: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:38.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:37 vm00.local ceph-mon[103263]: pgmap v330: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:38.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:37 vm07.local ceph-mon[93622]: pgmap v330: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:40.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:39 vm00.local ceph-mon[103263]: pgmap v331: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:40.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:39 vm07.local ceph-mon[93622]: pgmap v331: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:42.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:41 vm00.local ceph-mon[103263]: pgmap v332: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:42.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:48:42.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:48:42.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:48:42.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:48:42.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:41 vm07.local ceph-mon[93622]: pgmap v332: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:42.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:48:42.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:48:42.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T12:48:42.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq'
2026-03-10T12:48:44.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:43 vm00.local ceph-mon[103263]: pgmap v333: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:44.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:43 vm07.local ceph-mon[93622]: pgmap v333: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:46.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:45 vm00.local ceph-mon[103263]: pgmap v334: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:46.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:48:46.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:45 vm07.local ceph-mon[93622]: pgmap v334: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:46.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:48:48.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:47 vm00.local ceph-mon[103263]: pgmap v335: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:48.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:47 vm07.local ceph-mon[93622]: pgmap v335: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:50.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:49 vm00.local ceph-mon[103263]: pgmap v336: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:50.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:49 vm07.local ceph-mon[93622]: pgmap v336: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:52.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:51 vm00.local ceph-mon[103263]: pgmap v337: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:52.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:51 vm07.local ceph-mon[93622]: pgmap v337: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:54.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:53 vm00.local ceph-mon[103263]: pgmap v338: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:54.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:53 vm07.local ceph-mon[93622]: pgmap v338: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:56.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:55 vm00.local ceph-mon[103263]: pgmap v339: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:56.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:55 vm07.local ceph-mon[93622]: pgmap v339: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:58.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:57 vm00.local ceph-mon[103263]: pgmap v340: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:48:58.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:57 vm07.local ceph-mon[93622]: pgmap v340: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:00.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:48:59 vm00.local ceph-mon[103263]: pgmap v341: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:00.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:48:59 vm07.local ceph-mon[93622]: pgmap v341: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:01.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:49:01.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:00 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:49:02.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:01 vm00.local ceph-mon[103263]: pgmap v342: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:02.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:01 vm07.local ceph-mon[93622]: pgmap v342: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:04.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:03 vm00.local ceph-mon[103263]: pgmap v343: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:03 vm07.local ceph-mon[93622]: pgmap v343: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:06.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:06 vm07.local ceph-mon[93622]: pgmap v344: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:06.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:06 vm00.local ceph-mon[103263]: pgmap v344: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:07.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:07 vm07.local ceph-mon[93622]: pgmap v345: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:07.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:07 vm00.local ceph-mon[103263]: pgmap v345: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:10.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:09 vm00.local ceph-mon[103263]: pgmap v346: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:10.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:09 vm07.local ceph-mon[93622]: pgmap v346: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:12.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:11 vm00.local ceph-mon[103263]: pgmap v347: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:12.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:11 vm07.local ceph-mon[93622]: pgmap v347: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:14.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:13 vm00.local ceph-mon[103263]: pgmap v348: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:14.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:13 vm07.local ceph-mon[93622]: pgmap v348: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:16.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:15 vm07.local ceph-mon[93622]: pgmap v349: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:16.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:49:16.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:15 vm00.local ceph-mon[103263]: pgmap v349: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:16.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:49:18.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:17 vm00.local ceph-mon[103263]: pgmap v350: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:18.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:17 vm07.local ceph-mon[93622]: pgmap v350: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:20.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:19 vm00.local ceph-mon[103263]: pgmap v351: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:20.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:19 vm07.local ceph-mon[93622]: pgmap v351: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:22.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:21 vm00.local ceph-mon[103263]: pgmap v352: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:22.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:21 vm07.local ceph-mon[93622]: pgmap v352: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:24.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:23 vm00.local ceph-mon[103263]: pgmap v353: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:24.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:23 vm07.local ceph-mon[93622]: pgmap v353: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:26.066 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:25 vm07.local ceph-mon[93622]: pgmap v354: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:26.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:25 vm00.local ceph-mon[103263]: pgmap v354: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:28.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:27 vm00.local ceph-mon[103263]: pgmap v355: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:28.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:27 vm07.local ceph-mon[93622]: pgmap v355: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:30.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:29 vm00.local ceph-mon[103263]: pgmap v356: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:30.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:29 vm07.local ceph-mon[93622]: pgmap v356: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:31.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:49:31.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T12:49:32.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:31 vm00.local ceph-mon[103263]: pgmap v357: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:32.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:31 vm07.local ceph-mon[93622]: pgmap v357: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:34.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:33 vm00.local ceph-mon[103263]: pgmap v358: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:34.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:33 vm07.local ceph-mon[93622]: pgmap v358: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:36.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:35 vm00.local ceph-mon[103263]: pgmap v359: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:36.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:35 vm07.local ceph-mon[93622]: pgmap v359: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:38.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:37 vm00.local ceph-mon[103263]: pgmap v360: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:38.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:37 vm07.local ceph-mon[93622]: pgmap v360: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:40.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:39 vm00.local ceph-mon[103263]: pgmap v361: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:40.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:39 vm07.local ceph-mon[93622]: pgmap v361: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:42.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:41 vm00.local ceph-mon[103263]: pgmap v362: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:42.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:41 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:49:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:41 vm07.local ceph-mon[93622]: pgmap v362: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:49:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:41 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T12:49:43.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:42 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch
2026-03-10T12:49:43.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:42 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch
2026-03-10T12:49:43.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:42 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T12:49:43.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:42 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]:
dispatch 2026-03-10T12:49:43.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:42 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:49:43.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:42 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm07", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:49:43.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:42 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config rm", "who": "osd/host:vm00", "name": "osd_memory_target"}]: dispatch 2026-03-10T12:49:43.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:42 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:49:43.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:42 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:49:43.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:42 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:49:44.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:43 vm00.local ceph-mon[103263]: pgmap v363: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:49:44.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:43 vm07.local ceph-mon[93622]: pgmap v363: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:49:46.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:45 vm00.local ceph-mon[103263]: pgmap v364: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB 
avail 2026-03-10T12:49:46.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:49:46.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:45 vm07.local ceph-mon[93622]: pgmap v364: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:49:46.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:49:48.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:47 vm00.local ceph-mon[103263]: pgmap v365: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:49:48.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:47 vm07.local ceph-mon[93622]: pgmap v365: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:49:50.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:49 vm00.local ceph-mon[103263]: pgmap v366: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:49:50.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:49 vm07.local ceph-mon[93622]: pgmap v366: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:49:52.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:51 vm00.local ceph-mon[103263]: pgmap v367: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:49:52.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:51 vm07.local ceph-mon[93622]: pgmap v367: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:49:54.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:53 vm00.local 
ceph-mon[103263]: pgmap v368: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:49:54.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:53 vm07.local ceph-mon[93622]: pgmap v368: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:49:56.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:55 vm00.local ceph-mon[103263]: pgmap v369: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:49:56.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:55 vm07.local ceph-mon[93622]: pgmap v369: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:49:58.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:57 vm00.local ceph-mon[103263]: pgmap v370: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:49:58.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:57 vm07.local ceph-mon[93622]: pgmap v370: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:00.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:49:59 vm00.local ceph-mon[103263]: pgmap v371: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:00.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:49:59 vm07.local ceph-mon[93622]: pgmap v371: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:01.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:00 vm07.local ceph-mon[93622]: overall HEALTH_OK 2026-03-10T12:50:01.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:00 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:50:01.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:00 vm00.local ceph-mon[103263]: overall 
HEALTH_OK 2026-03-10T12:50:01.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:50:02.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:01 vm07.local ceph-mon[93622]: pgmap v372: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:02.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:01 vm00.local ceph-mon[103263]: pgmap v372: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:04.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:04 vm07.local ceph-mon[93622]: pgmap v373: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:04.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:04 vm00.local ceph-mon[103263]: pgmap v373: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:06.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:06 vm07.local ceph-mon[93622]: pgmap v374: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:06.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:06 vm00.local ceph-mon[103263]: pgmap v374: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:08.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:08 vm07.local ceph-mon[93622]: pgmap v375: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:08.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:08 vm00.local ceph-mon[103263]: pgmap v375: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:09.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:09 vm07.local ceph-mon[93622]: pgmap v376: 65 pgs: 65 active+clean; 
150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:09.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:09 vm00.local ceph-mon[103263]: pgmap v376: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:12.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:11 vm00.local ceph-mon[103263]: pgmap v377: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:12.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:11 vm07.local ceph-mon[93622]: pgmap v377: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:14.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:13 vm00.local ceph-mon[103263]: pgmap v378: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:14.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:13 vm07.local ceph-mon[93622]: pgmap v378: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:16 vm07.local ceph-mon[93622]: pgmap v379: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:16.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:16 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:50:16.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:15 vm00.local ceph-mon[103263]: pgmap v379: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:16.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:50:18.316 
INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:17 vm07.local ceph-mon[93622]: pgmap v380: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:18.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:17 vm00.local ceph-mon[103263]: pgmap v380: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:20.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:20 vm07.local ceph-mon[93622]: pgmap v381: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:20.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:20 vm00.local ceph-mon[103263]: pgmap v381: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:21.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:21 vm00.local ceph-mon[103263]: pgmap v382: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:21.566 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:21 vm07.local ceph-mon[93622]: pgmap v382: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:24.186 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:23 vm00.local ceph-mon[103263]: pgmap v383: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:24.269 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:23 vm07.local ceph-mon[93622]: pgmap v383: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:26.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:25 vm00.local ceph-mon[103263]: pgmap v384: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:26.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:25 vm07.local ceph-mon[93622]: pgmap v384: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:28.233 
INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:27 vm00.local ceph-mon[103263]: pgmap v385: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:28.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:27 vm07.local ceph-mon[93622]: pgmap v385: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:29.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:29 vm00.local ceph-mon[103263]: pgmap v386: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:29.565 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:29 vm07.local ceph-mon[93622]: pgmap v386: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:31.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:50:31.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:50:32.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:31 vm00.local ceph-mon[103263]: pgmap v387: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:32.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:31 vm07.local ceph-mon[93622]: pgmap v387: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:34.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:33 vm00.local ceph-mon[103263]: pgmap v388: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:34.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:33 vm07.local ceph-mon[93622]: pgmap v388: 65 pgs: 65 
active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:36.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:35 vm00.local ceph-mon[103263]: pgmap v389: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:36.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:35 vm07.local ceph-mon[93622]: pgmap v389: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:38.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:37 vm00.local ceph-mon[103263]: pgmap v390: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:38.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:37 vm07.local ceph-mon[93622]: pgmap v390: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:40.224 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:39 vm00.local ceph-mon[103263]: pgmap v391: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:40.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:39 vm07.local ceph-mon[93622]: pgmap v391: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:42.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:41 vm00.local ceph-mon[103263]: pgmap v392: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:42.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:41 vm07.local ceph-mon[93622]: pgmap v392: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:43.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:42 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:50:43.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:42 vm00.local 
ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:50:43.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:42 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:50:43.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:42 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:50:43.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:42 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T12:50:43.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:42 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T12:50:43.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:42 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T12:50:43.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:42 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' 2026-03-10T12:50:44.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:43 vm00.local ceph-mon[103263]: pgmap v393: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:44.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:43 vm07.local ceph-mon[93622]: pgmap v393: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:46.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:45 vm00.local ceph-mon[103263]: 
pgmap v394: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:46.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:45 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:50:46.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:45 vm07.local ceph-mon[93622]: pgmap v394: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:46.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:45 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:50:48.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:47 vm00.local ceph-mon[103263]: pgmap v395: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:48.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:47 vm07.local ceph-mon[93622]: pgmap v395: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:50.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:49 vm00.local ceph-mon[103263]: pgmap v396: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:50.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:49 vm07.local ceph-mon[93622]: pgmap v396: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:52.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:51 vm00.local ceph-mon[103263]: pgmap v397: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:52.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:51 vm07.local ceph-mon[93622]: pgmap v397: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 
2026-03-10T12:50:54.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:53 vm00.local ceph-mon[103263]: pgmap v398: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:54.316 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:53 vm07.local ceph-mon[93622]: pgmap v398: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:56.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:55 vm00.local ceph-mon[103263]: pgmap v399: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:56.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:55 vm07.local ceph-mon[93622]: pgmap v399: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:58.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:57 vm00.local ceph-mon[103263]: pgmap v400: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:50:58.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:57 vm07.local ceph-mon[93622]: pgmap v400: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:00.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:50:59 vm00.local ceph-mon[103263]: pgmap v401: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:00.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:50:59 vm07.local ceph-mon[93622]: pgmap v401: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:01.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:00 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:51:01.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:00 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' 
entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:51:02.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:01 vm07.local ceph-mon[93622]: pgmap v402: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:02.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:01 vm00.local ceph-mon[103263]: pgmap v402: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:04.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:03 vm07.local ceph-mon[93622]: pgmap v403: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:04.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:03 vm00.local ceph-mon[103263]: pgmap v403: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:05.484 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:05 vm00.local ceph-mon[103263]: pgmap v404: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:05.565 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:05 vm07.local ceph-mon[93622]: pgmap v404: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:07.483 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:07 vm00.local ceph-mon[103263]: pgmap v405: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:07.565 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:07 vm07.local ceph-mon[93622]: pgmap v405: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:10.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:09 vm00.local ceph-mon[103263]: pgmap v406: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:10.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:09 vm07.local ceph-mon[93622]: 
pgmap v406: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:12.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:11 vm07.local ceph-mon[93622]: pgmap v407: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:12.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:11 vm00.local ceph-mon[103263]: pgmap v407: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:14.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:13 vm00.local ceph-mon[103263]: pgmap v408: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:14.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:13 vm07.local ceph-mon[93622]: pgmap v408: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:16.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:15 vm00.local ceph-mon[103263]: pgmap v409: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:16.234 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:15 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:51:16.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:15 vm07.local ceph-mon[93622]: pgmap v409: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:16.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:15 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:51:18.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:17 vm00.local ceph-mon[103263]: pgmap v410: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 
2026-03-10T12:51:18.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:17 vm07.local ceph-mon[93622]: pgmap v410: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:20.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:19 vm00.local ceph-mon[103263]: pgmap v411: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:20.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:19 vm07.local ceph-mon[93622]: pgmap v411: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:22.065 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:21 vm07.local ceph-mon[93622]: pgmap v412: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:22.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:21 vm00.local ceph-mon[103263]: pgmap v412: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:24.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:23 vm00.local ceph-mon[103263]: pgmap v413: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:24.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:23 vm07.local ceph-mon[93622]: pgmap v413: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:25.983 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:25 vm00.local ceph-mon[103263]: pgmap v414: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:26.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:25 vm07.local ceph-mon[93622]: pgmap v414: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:28.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:27 vm00.local ceph-mon[103263]: pgmap v415: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 
2026-03-10T12:51:28.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:27 vm07.local ceph-mon[93622]: pgmap v415: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:30.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:29 vm00.local ceph-mon[103263]: pgmap v416: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:30.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:29 vm07.local ceph-mon[93622]: pgmap v416: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:31.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:30 vm00.local ceph-mon[103263]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:51:31.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:30 vm07.local ceph-mon[93622]: from='mgr.24563 192.168.123.100:0/3318965105' entity='mgr.vm00.nescmq' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T12:51:32.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:31 vm00.local ceph-mon[103263]: pgmap v417: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:32.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:31 vm07.local ceph-mon[93622]: pgmap v417: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:34.233 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:33 vm00.local ceph-mon[103263]: pgmap v418: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:34.315 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:33 vm07.local ceph-mon[93622]: pgmap v418: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail 2026-03-10T12:51:35.522 ERROR:tasks.cephfs.fuse_mount:process failed to terminate after unmount. 
This probably indicates a bug within ceph-fuse.
2026-03-10T12:51:35.522 ERROR:teuthology.run_tasks:Manager failed: ceph-fuse
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T12:51:35.523 DEBUG:teuthology.run_tasks:Unwinding manager cephadm
2026-03-10T12:51:35.526 INFO:tasks.cephadm:Teardown begin
2026-03-10T12:51:35.526 ERROR:teuthology.contextutil:Saw exception from nested tasks
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested
    yield vars
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephadm.py", line 2252, in task
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T12:51:35.527 DEBUG:teuthology.orchestra.run.vm00:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-10T12:51:35.556 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-10T12:51:35.581 INFO:tasks.cephadm:Disabling cephadm mgr module
2026-03-10T12:51:35.581 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.0 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 -- ceph mgr module disable cephadm
2026-03-10T12:51:35.732 INFO:teuthology.orchestra.run.vm00.stderr:Inferring config /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/mon.vm00/config
2026-03-10T12:51:35.867 INFO:teuthology.orchestra.run.vm00.stderr:Error: statfs /etc/ceph/ceph.client.admin.keyring: no such file or directory
2026-03-10T12:51:35.885 DEBUG:teuthology.orchestra.run:got remote process result: 125
2026-03-10T12:51:35.885 INFO:tasks.cephadm:Cleaning up testdir ceph.* files...
2026-03-10T12:51:35.885 DEBUG:teuthology.orchestra.run.vm00:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub
2026-03-10T12:51:35.900 DEBUG:teuthology.orchestra.run.vm07:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub
2026-03-10T12:51:35.915 INFO:tasks.cephadm:Stopping all daemons...
2026-03-10T12:51:35.916 INFO:tasks.cephadm.mon.vm00:Stopping mon.vm00...
2026-03-10T12:51:35.916 DEBUG:teuthology.orchestra.run.vm00:> sudo systemctl stop ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mon.vm00
2026-03-10T12:51:35.984 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:35 vm07.local ceph-mon[93622]: pgmap v419: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:51:36.080 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:35 vm00.local systemd[1]: Stopping Ceph mon.vm00 for 1a52002a-1c7d-11f1-af82-51cdd81caea8...
2026-03-10T12:51:36.080 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:35 vm00.local ceph-mon[103263]: pgmap v419: 65 pgs: 65 active+clean; 150 MiB data, 717 MiB used, 119 GiB / 120 GiB avail
2026-03-10T12:51:36.080 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:36 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00[103259]: 2026-03-10T12:51:36.039+0000 7f1d205eb640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm00 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0
2026-03-10T12:51:36.080 INFO:journalctl@ceph.mon.vm00.vm00.stdout:Mar 10 12:51:36 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-mon-vm00[103259]: 2026-03-10T12:51:36.039+0000 7f1d205eb640 -1 mon.vm00@0(leader) e3 *** Got Signal Terminated ***
2026-03-10T12:51:36.158 DEBUG:teuthology.orchestra.run.vm00:> sudo pkill -f 'journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mon.vm00.service'
2026-03-10T12:51:36.189 DEBUG:teuthology.orchestra.run:got remote process result: None
2026-03-10T12:51:36.189 INFO:tasks.cephadm.mon.vm00:Stopped mon.vm00
2026-03-10T12:51:36.189 INFO:tasks.cephadm.mon.vm07:Stopping mon.vm07...
2026-03-10T12:51:36.189 DEBUG:teuthology.orchestra.run.vm07:> sudo systemctl stop ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mon.vm07 2026-03-10T12:51:36.295 INFO:journalctl@ceph.mon.vm07.vm07.stdout:Mar 10 12:51:36 vm07.local systemd[1]: Stopping Ceph mon.vm07 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 2026-03-10T12:51:36.505 DEBUG:teuthology.orchestra.run.vm07:> sudo pkill -f 'journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@mon.vm07.service' 2026-03-10T12:51:36.536 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T12:51:36.536 INFO:tasks.cephadm.mon.vm07:Stopped mon.vm07 2026-03-10T12:51:36.536 INFO:tasks.cephadm.osd.0:Stopping osd.0... 2026-03-10T12:51:36.536 DEBUG:teuthology.orchestra.run.vm00:> sudo systemctl stop ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.0 2026-03-10T12:51:36.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:51:36 vm00.local systemd[1]: Stopping Ceph osd.0 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 2026-03-10T12:51:36.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:51:36 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0[109346]: 2026-03-10T12:51:36.637+0000 7f4053cff640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T12:51:36.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:51:36 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0[109346]: 2026-03-10T12:51:36.637+0000 7f4053cff640 -1 osd.0 83 *** Got signal Terminated *** 2026-03-10T12:51:36.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:51:36 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0[109346]: 2026-03-10T12:51:36.637+0000 7f4053cff640 -1 osd.0 83 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T12:51:41.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:51:41 vm00.local podman[153316]: 2026-03-10 
12:51:41.673800349 +0000 UTC m=+5.050646651 container died 9b151d44f3cf6043c87ac7fcfa5325a6c8ae8e87753e1530528f422236f5312d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid) 2026-03-10T12:51:41.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:51:41 vm00.local podman[153316]: 2026-03-10 12:51:41.698611665 +0000 UTC m=+5.075457967 container remove 9b151d44f3cf6043c87ac7fcfa5325a6c8ae8e87753e1530528f422236f5312d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0) 2026-03-10T12:51:41.984 
INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:51:41 vm00.local bash[153316]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0 2026-03-10T12:51:41.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:51:41 vm00.local podman[153383]: 2026-03-10 12:51:41.829395055 +0000 UTC m=+0.015538478 container create 33d4ebe3101b7f02c846ff63c5c600f96375613d07f405f35eb8798b80af4c1c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default) 2026-03-10T12:51:41.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:51:41 vm00.local podman[153383]: 2026-03-10 12:51:41.866479626 +0000 UTC m=+0.052623058 container init 33d4ebe3101b7f02c846ff63c5c600f96375613d07f405f35eb8798b80af4c1c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T12:51:41.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:51:41 vm00.local podman[153383]: 2026-03-10 12:51:41.871168082 +0000 UTC m=+0.057311505 container start 33d4ebe3101b7f02c846ff63c5c600f96375613d07f405f35eb8798b80af4c1c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T12:51:41.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:51:41 vm00.local podman[153383]: 2026-03-10 12:51:41.872180157 +0000 UTC m=+0.058323569 container attach 33d4ebe3101b7f02c846ff63c5c600f96375613d07f405f35eb8798b80af4c1c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-0-deactivate, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, 
org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T12:51:41.984 INFO:journalctl@ceph.osd.0.vm00.stdout:Mar 10 12:51:41 vm00.local podman[153383]: 2026-03-10 12:51:41.822530154 +0000 UTC m=+0.008673577 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:51:42.028 DEBUG:teuthology.orchestra.run.vm00:> sudo pkill -f 'journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.0.service' 2026-03-10T12:51:42.060 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T12:51:42.061 INFO:tasks.cephadm.osd.0:Stopped osd.0 2026-03-10T12:51:42.061 INFO:tasks.cephadm.osd.1:Stopping osd.1... 2026-03-10T12:51:42.061 DEBUG:teuthology.orchestra.run.vm00:> sudo systemctl stop ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.1 2026-03-10T12:51:42.483 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:51:42 vm00.local systemd[1]: Stopping Ceph osd.1 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 
2026-03-10T12:51:42.483 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:51:42 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1[113894]: 2026-03-10T12:51:42.203+0000 7f3f53d06640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T12:51:42.483 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:51:42 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1[113894]: 2026-03-10T12:51:42.203+0000 7f3f53d06640 -1 osd.1 83 *** Got signal Terminated *** 2026-03-10T12:51:42.483 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:51:42 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1[113894]: 2026-03-10T12:51:42.203+0000 7f3f53d06640 -1 osd.1 83 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T12:51:47.565 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:51:47 vm00.local podman[153478]: 2026-03-10 12:51:47.236149221 +0000 UTC m=+5.045004620 container died 252ea98c56650e3214a7e4635ecdcce97d5f8c7ae0e18f5b3c56bb10fdebca62 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2) 2026-03-10T12:51:47.566 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 
12:51:47 vm00.local podman[153478]: 2026-03-10 12:51:47.268325253 +0000 UTC m=+5.077180652 container remove 252ea98c56650e3214a7e4635ecdcce97d5f8c7ae0e18f5b3c56bb10fdebca62 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T12:51:47.566 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:51:47 vm00.local bash[153478]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1 2026-03-10T12:51:47.566 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:51:47 vm00.local podman[153559]: 2026-03-10 12:51:47.393630187 +0000 UTC m=+0.015337862 container create 6efcd7485dc4b36f5d4488904fac9c1ed02237d320554588038992cf7c652cfc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-deactivate, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.schema-version=1.0, 
OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, CEPH_REF=squid) 2026-03-10T12:51:47.566 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:51:47 vm00.local podman[153559]: 2026-03-10 12:51:47.430444412 +0000 UTC m=+0.052152088 container init 6efcd7485dc4b36f5d4488904fac9c1ed02237d320554588038992cf7c652cfc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-deactivate, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0) 2026-03-10T12:51:47.566 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:51:47 vm00.local podman[153559]: 2026-03-10 12:51:47.433576014 +0000 UTC m=+0.055283689 container start 6efcd7485dc4b36f5d4488904fac9c1ed02237d320554588038992cf7c652cfc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T12:51:47.566 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:51:47 vm00.local podman[153559]: 2026-03-10 12:51:47.434617464 +0000 UTC m=+0.056325139 container attach 6efcd7485dc4b36f5d4488904fac9c1ed02237d320554588038992cf7c652cfc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T12:51:47.566 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:51:47 vm00.local podman[153559]: 2026-03-10 12:51:47.387085296 +0000 UTC m=+0.008792961 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:51:47.566 INFO:journalctl@ceph.osd.1.vm00.stdout:Mar 10 12:51:47 vm00.local podman[153559]: 2026-03-10 12:51:47.566328049 +0000 UTC m=+0.188035724 container died 6efcd7485dc4b36f5d4488904fac9c1ed02237d320554588038992cf7c652cfc 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-1-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T12:51:47.595 DEBUG:teuthology.orchestra.run.vm00:> sudo pkill -f 'journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.1.service' 2026-03-10T12:51:47.625 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T12:51:47.625 INFO:tasks.cephadm.osd.1:Stopped osd.1 2026-03-10T12:51:47.625 INFO:tasks.cephadm.osd.2:Stopping osd.2... 2026-03-10T12:51:47.625 DEBUG:teuthology.orchestra.run.vm00:> sudo systemctl stop ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.2 2026-03-10T12:51:47.983 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:51:47 vm00.local systemd[1]: Stopping Ceph osd.2 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 
2026-03-10T12:51:47.983 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:51:47 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2[118033]: 2026-03-10T12:51:47.753+0000 7fa8c3451640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T12:51:47.983 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:51:47 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2[118033]: 2026-03-10T12:51:47.753+0000 7fa8c3451640 -1 osd.2 83 *** Got signal Terminated *** 2026-03-10T12:51:47.983 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:51:47 vm00.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2[118033]: 2026-03-10T12:51:47.753+0000 7fa8c3451640 -1 osd.2 83 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T12:51:53.096 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:51:52 vm00.local podman[153657]: 2026-03-10 12:51:52.793387847 +0000 UTC m=+5.050122521 container died 249137e44eb73320cc3b7d5fb2611352f188ae04d8ea34073c8950f66c9054fc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T12:51:53.096 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 
12:51:52 vm00.local podman[153657]: 2026-03-10 12:51:52.818604121 +0000 UTC m=+5.075338784 container remove 249137e44eb73320cc3b7d5fb2611352f188ae04d8ea34073c8950f66c9054fc (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T12:51:53.096 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:51:52 vm00.local bash[153657]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2 2026-03-10T12:51:53.096 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:51:52 vm00.local podman[153721]: 2026-03-10 12:51:52.944455088 +0000 UTC m=+0.015996967 container create be06bbddae1a5d286a09fbc9eca02657da392bd1d30f6967e2a1ccac58f9cf26 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-deactivate, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T12:51:53.096 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:51:52 vm00.local podman[153721]: 2026-03-10 12:51:52.971238494 +0000 UTC m=+0.042780373 container init be06bbddae1a5d286a09fbc9eca02657da392bd1d30f6967e2a1ccac58f9cf26 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-deactivate, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2) 2026-03-10T12:51:53.096 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:51:52 vm00.local podman[153721]: 2026-03-10 12:51:52.976397611 +0000 UTC m=+0.047939491 container start be06bbddae1a5d286a09fbc9eca02657da392bd1d30f6967e2a1ccac58f9cf26 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, CEPH_REF=squid, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T12:51:53.096 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:51:52 vm00.local podman[153721]: 2026-03-10 12:51:52.977292467 +0000 UTC m=+0.048834347 container attach be06bbddae1a5d286a09fbc9eca02657da392bd1d30f6967e2a1ccac58f9cf26 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-deactivate, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T12:51:53.096 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:51:53 vm00.local podman[153721]: 2026-03-10 12:51:52.938587133 +0000 UTC m=+0.010129022 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:51:53.096 INFO:journalctl@ceph.osd.2.vm00.stdout:Mar 10 12:51:53 vm00.local podman[153721]: 2026-03-10 12:51:53.095978793 +0000 UTC m=+0.167520672 container died be06bbddae1a5d286a09fbc9eca02657da392bd1d30f6967e2a1ccac58f9cf26 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-2-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20260223, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T12:51:53.123 DEBUG:teuthology.orchestra.run.vm00:> sudo pkill -f 'journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.2.service' 2026-03-10T12:51:53.154 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T12:51:53.154 INFO:tasks.cephadm.osd.2:Stopped osd.2 2026-03-10T12:51:53.154 INFO:tasks.cephadm.osd.3:Stopping osd.3... 2026-03-10T12:51:53.154 DEBUG:teuthology.orchestra.run.vm07:> sudo systemctl stop ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.3 2026-03-10T12:51:53.565 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:51:53 vm07.local systemd[1]: Stopping Ceph osd.3 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 
2026-03-10T12:51:53.565 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:51:53 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3[99893]: 2026-03-10T12:51:53.252+0000 7f8070c09640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T12:51:53.565 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:51:53 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3[99893]: 2026-03-10T12:51:53.252+0000 7f8070c09640 -1 osd.3 83 *** Got signal Terminated *** 2026-03-10T12:51:53.565 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:51:53 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3[99893]: 2026-03-10T12:51:53.252+0000 7f8070c09640 -1 osd.3 83 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T12:51:58.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:51:58 vm07.local podman[124438]: 2026-03-10 12:51:58.298199693 +0000 UTC m=+5.057633877 container died 72a045e3b78b360940476b4ac5c0a1e208ea1de379c5afe8114a7bc3afa315f3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_REF=squid) 2026-03-10T12:51:58.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 
12:51:58 vm07.local podman[124438]: 2026-03-10 12:51:58.326281579 +0000 UTC m=+5.085715764 container remove 72a045e3b78b360940476b4ac5c0a1e208ea1de379c5afe8114a7bc3afa315f3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3, OSD_FLAVOR=default, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T12:51:58.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:51:58 vm07.local bash[124438]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3 2026-03-10T12:51:58.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:51:58 vm07.local podman[124517]: 2026-03-10 12:51:58.452982086 +0000 UTC m=+0.015147512 container create b98120af6d08556eb46f5b56c139e9c2cbb3dcfc647f7c35ce3efc06492b66c6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-deactivate, ceph=True, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, 
org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_REF=squid) 2026-03-10T12:51:58.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:51:58 vm07.local podman[124517]: 2026-03-10 12:51:58.494172714 +0000 UTC m=+0.056338151 container init b98120af6d08556eb46f5b56c139e9c2cbb3dcfc647f7c35ce3efc06492b66c6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-deactivate, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T12:51:58.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:51:58 vm07.local podman[124517]: 2026-03-10 12:51:58.497117328 +0000 UTC m=+0.059282754 container start b98120af6d08556eb46f5b56c139e9c2cbb3dcfc647f7c35ce3efc06492b66c6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-deactivate, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) 2026-03-10T12:51:58.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:51:58 vm07.local podman[124517]: 2026-03-10 12:51:58.501123018 +0000 UTC m=+0.063288453 container attach b98120af6d08556eb46f5b56c139e9c2cbb3dcfc647f7c35ce3efc06492b66c6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-3-deactivate, org.opencontainers.image.authors=Ceph Release Team , ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2) 2026-03-10T12:51:58.566 INFO:journalctl@ceph.osd.3.vm07.stdout:Mar 10 12:51:58 vm07.local podman[124517]: 2026-03-10 12:51:58.446378361 +0000 UTC m=+0.008543787 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T12:51:58.654 DEBUG:teuthology.orchestra.run.vm07:> sudo pkill -f 'journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.3.service' 2026-03-10T12:51:58.688 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T12:51:58.688 
INFO:tasks.cephadm.osd.3:Stopped osd.3 2026-03-10T12:51:58.688 INFO:tasks.cephadm.osd.4:Stopping osd.4... 2026-03-10T12:51:58.688 DEBUG:teuthology.orchestra.run.vm07:> sudo systemctl stop ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.4 2026-03-10T12:51:58.819 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:51:58 vm07.local systemd[1]: Stopping Ceph osd.4 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 2026-03-10T12:51:59.315 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:51:58 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4[103575]: 2026-03-10T12:51:58.819+0000 7f4def05d640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T12:51:59.315 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:51:58 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4[103575]: 2026-03-10T12:51:58.819+0000 7f4def05d640 -1 osd.4 83 *** Got signal Terminated *** 2026-03-10T12:51:59.315 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:51:58 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4[103575]: 2026-03-10T12:51:58.819+0000 7f4def05d640 -1 osd.4 83 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T12:52:03.565 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:03 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[107356]: 2026-03-10T12:52:03.221+0000 7f764e1a7640 -1 osd.5 83 heartbeat_check: no reply from 192.168.123.100:6806 osd.0 since back 2026-03-10T12:51:40.556067+0000 front 2026-03-10T12:51:40.556110+0000 (oldest deadline 2026-03-10T12:52:02.855845+0000) 2026-03-10T12:52:04.066 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:52:03 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4[103575]: 2026-03-10T12:52:03.619+0000 7f4deb665640 -1 osd.4 83 heartbeat_check: no reply from 192.168.123.100:6806 osd.0 since back 2026-03-10T12:51:38.526886+0000 front 
2026-03-10T12:51:38.526876+0000 (oldest deadline 2026-03-10T12:52:02.626531+0000) 2026-03-10T12:52:04.066 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:52:03 vm07.local podman[124613]: 2026-03-10 12:52:03.857832881 +0000 UTC m=+5.049683811 container died 7ac87e1c2a4169504cccb779482cb077f05d40d198bb42a29fecf6e153b05972 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4, io.buildah.version=1.41.3, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0) 2026-03-10T12:52:04.372 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:04 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[107356]: 2026-03-10T12:52:04.227+0000 7f764e1a7640 -1 osd.5 83 heartbeat_check: no reply from 192.168.123.100:6806 osd.0 since back 2026-03-10T12:51:40.556067+0000 front 2026-03-10T12:51:40.556110+0000 (oldest deadline 2026-03-10T12:52:02.855845+0000) 2026-03-10T12:52:04.372 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:52:04 vm07.local podman[124613]: 2026-03-10 12:52:04.133203598 +0000 UTC m=+5.325054528 container remove 7ac87e1c2a4169504cccb779482cb077f05d40d198bb42a29fecf6e153b05972 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, 
io.buildah.version=1.41.3, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T12:52:04.372 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:52:04 vm07.local bash[124613]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4 2026-03-10T12:52:04.372 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:52:04 vm07.local podman[124679]: 2026-03-10 12:52:04.281932346 +0000 UTC m=+0.017493133 container create 2b3e82c71e027330f169a484ef83ae7b7b164de2f38efa74af78e0b381dfdb8a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-deactivate, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T12:52:04.372 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:52:04 vm07.local podman[124679]: 2026-03-10 12:52:04.326452259 +0000 UTC m=+0.062013056 container init 2b3e82c71e027330f169a484ef83ae7b7b164de2f38efa74af78e0b381dfdb8a 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-deactivate, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223) 2026-03-10T12:52:04.372 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:52:04 vm07.local podman[124679]: 2026-03-10 12:52:04.329789887 +0000 UTC m=+0.065350674 container start 2b3e82c71e027330f169a484ef83ae7b7b164de2f38efa74af78e0b381dfdb8a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T12:52:04.372 INFO:journalctl@ceph.osd.4.vm07.stdout:Mar 10 12:52:04 vm07.local podman[124679]: 2026-03-10 12:52:04.330812322 +0000 UTC 
m=+0.066373109 container attach 2b3e82c71e027330f169a484ef83ae7b7b164de2f38efa74af78e0b381dfdb8a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-4-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, ceph=True) 2026-03-10T12:52:04.492 DEBUG:teuthology.orchestra.run.vm07:> sudo pkill -f 'journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.4.service' 2026-03-10T12:52:04.528 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T12:52:04.528 INFO:tasks.cephadm.osd.4:Stopped osd.4 2026-03-10T12:52:04.528 INFO:tasks.cephadm.osd.5:Stopping osd.5... 2026-03-10T12:52:04.528 DEBUG:teuthology.orchestra.run.vm07:> sudo systemctl stop ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.5 2026-03-10T12:52:04.676 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:04 vm07.local systemd[1]: Stopping Ceph osd.5 for 1a52002a-1c7d-11f1-af82-51cdd81caea8... 
2026-03-10T12:52:05.066 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:04 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[107356]: 2026-03-10T12:52:04.675+0000 7f76523a0640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T12:52:05.066 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:04 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[107356]: 2026-03-10T12:52:04.675+0000 7f76523a0640 -1 osd.5 83 *** Got signal Terminated *** 2026-03-10T12:52:05.066 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:04 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[107356]: 2026-03-10T12:52:04.675+0000 7f76523a0640 -1 osd.5 83 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T12:52:05.565 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:05 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[107356]: 2026-03-10T12:52:05.227+0000 7f764e1a7640 -1 osd.5 83 heartbeat_check: no reply from 192.168.123.100:6806 osd.0 since back 2026-03-10T12:51:40.556067+0000 front 2026-03-10T12:51:40.556110+0000 (oldest deadline 2026-03-10T12:52:02.855845+0000) 2026-03-10T12:52:06.565 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:06 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[107356]: 2026-03-10T12:52:06.198+0000 7f764e1a7640 -1 osd.5 83 heartbeat_check: no reply from 192.168.123.100:6806 osd.0 since back 2026-03-10T12:51:40.556067+0000 front 2026-03-10T12:51:40.556110+0000 (oldest deadline 2026-03-10T12:52:02.855845+0000) 2026-03-10T12:52:07.565 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:07 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[107356]: 2026-03-10T12:52:07.223+0000 7f764e1a7640 -1 osd.5 83 heartbeat_check: no reply from 192.168.123.100:6806 osd.0 since back 2026-03-10T12:51:40.556067+0000 front 
2026-03-10T12:51:40.556110+0000 (oldest deadline 2026-03-10T12:52:02.855845+0000) 2026-03-10T12:52:08.565 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:08 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[107356]: 2026-03-10T12:52:08.273+0000 7f764e1a7640 -1 osd.5 83 heartbeat_check: no reply from 192.168.123.100:6806 osd.0 since back 2026-03-10T12:51:40.556067+0000 front 2026-03-10T12:51:40.556110+0000 (oldest deadline 2026-03-10T12:52:02.855845+0000) 2026-03-10T12:52:09.565 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:09 vm07.local ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5[107356]: 2026-03-10T12:52:09.290+0000 7f764e1a7640 -1 osd.5 83 heartbeat_check: no reply from 192.168.123.100:6806 osd.0 since back 2026-03-10T12:51:40.556067+0000 front 2026-03-10T12:51:40.556110+0000 (oldest deadline 2026-03-10T12:52:02.855845+0000) 2026-03-10T12:52:09.988 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:09 vm07.local podman[124775]: 2026-03-10 12:52:09.717413029 +0000 UTC m=+5.055671733 container died bd169bf008349682e5af0c95f71fd72efadde2c22c5f54dcc2cc6420647631c4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS) 2026-03-10T12:52:09.988 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:09 vm07.local 
podman[124775]: 2026-03-10 12:52:09.744022037 +0000 UTC m=+5.082280751 container remove bd169bf008349682e5af0c95f71fd72efadde2c22c5f54dcc2cc6420647631c4 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T12:52:09.988 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:09 vm07.local bash[124775]: ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5 2026-03-10T12:52:09.988 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:09 vm07.local podman[124842]: 2026-03-10 12:52:09.896340588 +0000 UTC m=+0.018578335 container create 9085f834e9606389fe38a54311f160fe8aeb7288e325b36cbe7ae0faabd8ea5a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-deactivate, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, 
org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T12:52:09.988 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:09 vm07.local podman[124842]: 2026-03-10 12:52:09.940179255 +0000 UTC m=+0.062417013 container init 9085f834e9606389fe38a54311f160fe8aeb7288e325b36cbe7ae0faabd8ea5a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, io.buildah.version=1.41.3, OSD_FLAVOR=default) 2026-03-10T12:52:09.988 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:09 vm07.local podman[124842]: 2026-03-10 12:52:09.943256537 +0000 UTC m=+0.065494284 container start 9085f834e9606389fe38a54311f160fe8aeb7288e325b36cbe7ae0faabd8ea5a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-deactivate, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, 
org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T12:52:09.988 INFO:journalctl@ceph.osd.5.vm07.stdout:Mar 10 12:52:09 vm07.local podman[124842]: 2026-03-10 12:52:09.946127652 +0000 UTC m=+0.068365399 container attach 9085f834e9606389fe38a54311f160fe8aeb7288e325b36cbe7ae0faabd8ea5a (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8-osd-5-deactivate, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T12:52:10.102 DEBUG:teuthology.orchestra.run.vm07:> sudo pkill -f 'journalctl -f -n 0 -u ceph-1a52002a-1c7d-11f1-af82-51cdd81caea8@osd.5.service' 2026-03-10T12:52:10.137 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T12:52:10.137 INFO:tasks.cephadm.osd.5:Stopped osd.5 2026-03-10T12:52:10.137 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 --force --keep-logs 2026-03-10T12:52:10.246 INFO:teuthology.orchestra.run.vm00.stdout:Deleting cluster with fsid: 1a52002a-1c7d-11f1-af82-51cdd81caea8 2026-03-10T12:52:11.699 
INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm00.stderr:ceph-fuse[91644]: fuse finished with error 0 and tester_r 0 2026-03-10T12:52:24.071 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 --force --keep-logs 2026-03-10T12:52:24.184 INFO:teuthology.orchestra.run.vm07.stdout:Deleting cluster with fsid: 1a52002a-1c7d-11f1-af82-51cdd81caea8 2026-03-10T12:52:29.410 DEBUG:teuthology.orchestra.run.vm00:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-10T12:52:29.442 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-10T12:52:29.471 INFO:tasks.cephadm:Archiving crash dumps... 2026-03-10T12:52:29.472 DEBUG:teuthology.misc:Transferring archived files from vm00:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/crash to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1029/remote/vm00/crash 2026-03-10T12:52:29.472 DEBUG:teuthology.orchestra.run.vm00:> sudo tar c -f - -C /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/crash -- . 2026-03-10T12:52:29.508 INFO:teuthology.orchestra.run.vm00.stderr:tar: /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/crash: Cannot open: No such file or directory 2026-03-10T12:52:29.508 INFO:teuthology.orchestra.run.vm00.stderr:tar: Error is not recoverable: exiting now 2026-03-10T12:52:29.509 DEBUG:teuthology.misc:Transferring archived files from vm07:/var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/crash to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1029/remote/vm07/crash 2026-03-10T12:52:29.509 DEBUG:teuthology.orchestra.run.vm07:> sudo tar c -f - -C /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/crash -- . 
2026-03-10T12:52:29.538 INFO:teuthology.orchestra.run.vm07.stderr:tar: /var/lib/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/crash: Cannot open: No such file or directory 2026-03-10T12:52:29.538 INFO:teuthology.orchestra.run.vm07.stderr:tar: Error is not recoverable: exiting now 2026-03-10T12:52:29.539 INFO:tasks.cephadm:Checking cluster log for badness... 2026-03-10T12:52:29.539 DEBUG:teuthology.orchestra.run.vm00:> sudo egrep '\[ERR\]|\[WRN\]|\[SEC\]' /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v FS_DEGRADED | egrep -v 'filesystem is degraded' | egrep -v FS_INLINE_DATA_DEPRECATED | egrep -v FS_WITH_FAILED_MDS | egrep -v MDS_ALL_DOWN | egrep -v 'filesystem is offline' | egrep -v 'is offline because no MDS' | egrep -v MDS_DAMAGE | egrep -v MDS_DEGRADED | egrep -v MDS_FAILED | egrep -v MDS_INSUFFICIENT_STANDBY | egrep -v MDS_UP_LESS_THAN_MAX | egrep -v 'online, but wants' | egrep -v 'filesystem is online with fewer MDS than max_mds' | egrep -v POOL_APP_NOT_ENABLED | egrep -v 'do not have an application enabled' | egrep -v 'overall HEALTH_' | egrep -v 'Replacing daemon' | egrep -v 'deprecated feature inline_data' | egrep -v MGR_MODULE_ERROR | egrep -v OSD_DOWN | egrep -v 'osds down' | egrep -v 'overall HEALTH_' | egrep -v '\(OSD_DOWN\)' | egrep -v '\(OSD_' | egrep -v 'but it is still running' | egrep -v 'is not responding' | egrep -v MON_DOWN | egrep -v PG_AVAILABILITY | egrep -v PG_DEGRADED | egrep -v 'Reduced data availability' | egrep -v 'Degraded data redundancy' | egrep -v 'pg .* is stuck inactive' | egrep -v 'pg .* is .*degraded' | egrep -v 'pg .* is stuck peering' | head -n 1 2026-03-10T12:52:29.616 INFO:tasks.cephadm:Compressing logs... 
2026-03-10T12:52:29.616 DEBUG:teuthology.orchestra.run.vm00:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-10T12:52:29.618 DEBUG:teuthology.orchestra.run.vm07:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-10T12:52:29.641 INFO:teuthology.orchestra.run.vm00.stderr:find: ‘/var/log/rbd-target-api’: No such file or directory 2026-03-10T12:52:29.641 INFO:teuthology.orchestra.run.vm00.stderr:gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-10T12:52:29.641 INFO:teuthology.orchestra.run.vm00.stderr:gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mon.vm00.log 2026-03-10T12:52:29.642 INFO:teuthology.orchestra.run.vm00.stderr:/var/log/ceph/cephadm.log: gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.log 2026-03-10T12:52:29.642 INFO:teuthology.orchestra.run.vm00.stderr:gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mgr.vm00.nescmq.log 2026-03-10T12:52:29.649 INFO:teuthology.orchestra.run.vm00.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mon.vm00.log: /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.log: 87.5% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.log.gz 2026-03-10T12:52:29.653 INFO:teuthology.orchestra.run.vm00.stderr:gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.audit.log 2026-03-10T12:52:29.658 INFO:teuthology.orchestra.run.vm07.stderr:find: ‘/var/log/rbd-target-api’: No such file or directory 2026-03-10T12:52:29.658 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-10T12:52:29.659 
INFO:teuthology.orchestra.run.vm00.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mgr.vm00.nescmq.log: 91.9% -- replaced with /var/log/ceph/cephadm.log.gz 2026-03-10T12:52:29.660 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/cephadm.log: gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-volume.log 2026-03-10T12:52:29.660 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-volume.log: gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-client.ceph-exporter.vm07.log 2026-03-10T12:52:29.661 INFO:teuthology.orchestra.run.vm07.stderr: 92.6% -- replaced with /var/log/ceph/cephadm.log.gz 2026-03-10T12:52:29.662 INFO:teuthology.orchestra.run.vm00.stderr:gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.cephadm.log 2026-03-10T12:52:29.663 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-client.ceph-exporter.vm07.log: gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mgr.vm07.kfawlb.log 2026-03-10T12:52:29.666 INFO:teuthology.orchestra.run.vm07.stderr: 94.0% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-client.ceph-exporter.vm07.log.gz 2026-03-10T12:52:29.668 INFO:teuthology.orchestra.run.vm00.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.audit.log: 91.3% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.audit.log.gz 2026-03-10T12:52:29.670 INFO:teuthology.orchestra.run.vm07.stderr: 94.1% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-volume.log.gz 2026-03-10T12:52:29.670 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mon.vm07.log 2026-03-10T12:52:29.670 INFO:teuthology.orchestra.run.vm00.stderr:gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-volume.log 
2026-03-10T12:52:29.670 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mgr.vm07.kfawlb.log: gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.audit.log
2026-03-10T12:52:29.671 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mon.vm07.log: gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.log
2026-03-10T12:52:29.672 INFO:teuthology.orchestra.run.vm00.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.cephadm.log: 85.2% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.cephadm.log.gz
2026-03-10T12:52:29.676 INFO:teuthology.orchestra.run.vm00.stderr:gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-client.ceph-exporter.vm00.log
2026-03-10T12:52:29.683 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.audit.log: 91.5% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.audit.log.gz
2026-03-10T12:52:29.685 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.cephadm.log
2026-03-10T12:52:29.686 INFO:teuthology.orchestra.run.vm00.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-volume.log: gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.0.log
2026-03-10T12:52:29.688 INFO:teuthology.orchestra.run.vm00.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-client.ceph-exporter.vm00.log: 94.1% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-client.ceph-exporter.vm00.log.gz
2026-03-10T12:52:29.690 INFO:teuthology.orchestra.run.vm00.stderr: 94.1% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-volume.log.gz
2026-03-10T12:52:29.693 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.log: 87.6% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.log.gz
2026-03-10T12:52:29.695 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.3.log
2026-03-10T12:52:29.698 INFO:teuthology.orchestra.run.vm00.stderr:gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.1.log
2026-03-10T12:52:29.700 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.cephadm.log: 85.1% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph.cephadm.log.gz
2026-03-10T12:52:29.700 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.4.log
2026-03-10T12:52:29.708 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.3.log: gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.5.log
2026-03-10T12:52:29.715 INFO:teuthology.orchestra.run.vm00.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.0.log: gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.2.log
2026-03-10T12:52:29.720 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.4.log: gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mds.cephfs.vm07.wznhgu.log
2026-03-10T12:52:29.722 INFO:teuthology.orchestra.run.vm00.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.1.log: gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mds.cephfs.vm00.lnokoe.log
2026-03-10T12:52:29.723 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.5.log: 89.3% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mgr.vm07.kfawlb.log.gz
2026-03-10T12:52:29.729 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mds.cephfs.vm07.rhzwnr.log
2026-03-10T12:52:29.730 INFO:teuthology.orchestra.run.vm00.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.2.log: gzip -5 --verbose -- /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mds.cephfs.vm00.wdwvcu.log
2026-03-10T12:52:29.738 INFO:teuthology.orchestra.run.vm00.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mds.cephfs.vm00.lnokoe.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.0.log
2026-03-10T12:52:29.739 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mds.cephfs.vm07.wznhgu.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.1.log
2026-03-10T12:52:30.263 INFO:teuthology.orchestra.run.vm00.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mds.cephfs.vm00.wdwvcu.log: /var/log/ceph/ceph-client.0.log: 89.4% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mgr.vm00.nescmq.log.gz
2026-03-10T12:52:30.413 INFO:teuthology.orchestra.run.vm07.stderr:/var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mds.cephfs.vm07.rhzwnr.log: /var/log/ceph/ceph-client.1.log: 92.3% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mon.vm07.log.gz
2026-03-10T12:52:31.272 INFO:teuthology.orchestra.run.vm00.stderr: 90.5% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mon.vm00.log.gz
2026-03-10T12:52:38.264 INFO:teuthology.orchestra.run.vm07.stderr: 95.1% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mds.cephfs.vm07.rhzwnr.log.gz
2026-03-10T12:52:38.341 INFO:teuthology.orchestra.run.vm00.stderr: 95.0% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mds.cephfs.vm00.wdwvcu.log.gz
2026-03-10T12:52:38.761 INFO:teuthology.orchestra.run.vm07.stderr: 93.5% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.4.log.gz
2026-03-10T12:52:39.185 INFO:teuthology.orchestra.run.vm00.stderr: 93.5% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.2.log.gz
2026-03-10T12:52:39.768 INFO:teuthology.orchestra.run.vm00.stderr: 93.6% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.0.log.gz
2026-03-10T12:52:40.327 INFO:teuthology.orchestra.run.vm00.stderr: 93.6% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.1.log.gz
2026-03-10T12:52:40.789 INFO:teuthology.orchestra.run.vm07.stderr: 93.8% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.5.log.gz
2026-03-10T12:52:41.472 INFO:teuthology.orchestra.run.vm07.stderr: 93.6% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-osd.3.log.gz
2026-03-10T12:52:45.800 INFO:teuthology.orchestra.run.vm00.stderr:gzip: /var/log/ceph/ceph-client.0.log: file size changed while zipping
2026-03-10T12:52:45.930 INFO:teuthology.orchestra.run.vm00.stderr: 93.6% -- replaced with /var/log/ceph/ceph-client.0.log.gz
2026-03-10T12:52:46.139 INFO:teuthology.orchestra.run.vm07.stderr:gzip: /var/log/ceph/ceph-client.1.log: file size changed while zipping
2026-03-10T12:52:46.139 INFO:teuthology.orchestra.run.vm07.stderr: 93.5% -- replaced with /var/log/ceph/ceph-client.1.log.gz
2026-03-10T12:53:06.448 INFO:teuthology.orchestra.run.vm00.stderr: 93.4% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mds.cephfs.vm00.lnokoe.log.gz
2026-03-10T12:53:06.452 INFO:teuthology.orchestra.run.vm00.stderr:
2026-03-10T12:53:06.452 INFO:teuthology.orchestra.run.vm00.stderr:real 0m36.821s
2026-03-10T12:53:06.452 INFO:teuthology.orchestra.run.vm00.stderr:user 0m48.872s
2026-03-10T12:53:06.452 INFO:teuthology.orchestra.run.vm00.stderr:sys 0m3.399s
2026-03-10T12:53:11.097 INFO:teuthology.orchestra.run.vm07.stderr: 93.2% -- replaced with /var/log/ceph/1a52002a-1c7d-11f1-af82-51cdd81caea8/ceph-mds.cephfs.vm07.wznhgu.log.gz
2026-03-10T12:53:11.100 INFO:teuthology.orchestra.run.vm07.stderr:
2026-03-10T12:53:11.100 INFO:teuthology.orchestra.run.vm07.stderr:real 0m41.463s
2026-03-10T12:53:11.100 INFO:teuthology.orchestra.run.vm07.stderr:user 0m53.788s
2026-03-10T12:53:11.100 INFO:teuthology.orchestra.run.vm07.stderr:sys 0m3.362s
2026-03-10T12:53:11.100 INFO:tasks.cephadm:Archiving logs...
2026-03-10T12:53:11.100 DEBUG:teuthology.misc:Transferring archived files from vm00:/var/log/ceph to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1029/remote/vm00/log
2026-03-10T12:53:11.100 DEBUG:teuthology.orchestra.run.vm00:> sudo tar c -f - -C /var/log/ceph -- .
2026-03-10T12:53:13.591 DEBUG:teuthology.misc:Transferring archived files from vm07:/var/log/ceph to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1029/remote/vm07/log
2026-03-10T12:53:13.591 DEBUG:teuthology.orchestra.run.vm07:> sudo tar c -f - -C /var/log/ceph -- .
2026-03-10T12:53:16.394 INFO:tasks.cephadm:Removing cluster...
2026-03-10T12:53:16.395 DEBUG:teuthology.orchestra.run.vm00:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 --force
2026-03-10T12:53:16.501 INFO:teuthology.orchestra.run.vm00.stdout:Deleting cluster with fsid: 1a52002a-1c7d-11f1-af82-51cdd81caea8
2026-03-10T12:53:17.277 DEBUG:teuthology.orchestra.run.vm07:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 1a52002a-1c7d-11f1-af82-51cdd81caea8 --force
2026-03-10T12:53:17.383 INFO:teuthology.orchestra.run.vm07.stdout:Deleting cluster with fsid: 1a52002a-1c7d-11f1-af82-51cdd81caea8
2026-03-10T12:53:18.274 INFO:tasks.cephadm:Removing cephadm ...
2026-03-10T12:53:18.274 DEBUG:teuthology.orchestra.run.vm00:> rm -rf /home/ubuntu/cephtest/cephadm
2026-03-10T12:53:18.293 DEBUG:teuthology.orchestra.run.vm07:> rm -rf /home/ubuntu/cephtest/cephadm
2026-03-10T12:53:18.314 INFO:tasks.cephadm:Teardown complete
2026-03-10T12:53:18.314 DEBUG:teuthology.run_tasks:Unwinding manager install
2026-03-10T12:53:18.317 ERROR:teuthology.contextutil:Saw exception from nested tasks
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested
    yield vars
  File "/home/teuthos/teuthology/teuthology/task/install/__init__.py", line 644, in task
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T12:53:18.317 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer...
2026-03-10T12:53:18.317 DEBUG:teuthology.orchestra.run.vm00:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer
2026-03-10T12:53:18.335 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer
2026-03-10T12:53:18.391 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system.
2026-03-10T12:53:18.391 DEBUG:teuthology.orchestra.run.vm00:>
2026-03-10T12:53:18.391 DEBUG:teuthology.orchestra.run.vm00:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do
2026-03-10T12:53:18.391 DEBUG:teuthology.orchestra.run.vm00:> sudo yum -y remove $d || true
2026-03-10T12:53:18.391 DEBUG:teuthology.orchestra.run.vm00:> done
2026-03-10T12:53:18.397 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system.
2026-03-10T12:53:18.397 DEBUG:teuthology.orchestra.run.vm07:>
2026-03-10T12:53:18.397 DEBUG:teuthology.orchestra.run.vm07:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do
2026-03-10T12:53:18.397 DEBUG:teuthology.orchestra.run.vm07:> sudo yum -y remove $d || true
2026-03-10T12:53:18.397 DEBUG:teuthology.orchestra.run.vm07:> done
2026-03-10T12:53:18.711 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:18.711 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:18.711 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:18.711 INFO:teuthology.orchestra.run.vm00.stdout: Package Architecture Version Repository Size
2026-03-10T12:53:18.711 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:18.711 INFO:teuthology.orchestra.run.vm00.stdout:Removing:
2026-03-10T12:53:18.711 INFO:teuthology.orchestra.run.vm00.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 @ceph 31 M
2026-03-10T12:53:18.711 INFO:teuthology.orchestra.run.vm00.stdout:Removing unused dependencies:
2026-03-10T12:53:18.711 INFO:teuthology.orchestra.run.vm00.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k
2026-03-10T12:53:18.711 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:18.711 INFO:teuthology.orchestra.run.vm00.stdout:Transaction Summary
2026-03-10T12:53:18.711 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:18.711 INFO:teuthology.orchestra.run.vm00.stdout:Remove 2 Packages
2026-03-10T12:53:18.711 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:18.711 INFO:teuthology.orchestra.run.vm00.stdout:Freed space: 31 M
2026-03-10T12:53:18.712 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction check
2026-03-10T12:53:18.712 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:18.712 INFO:teuthology.orchestra.run.vm07.stdout: Package Architecture Version Repository Size
2026-03-10T12:53:18.712 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:18.712 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-10T12:53:18.712 INFO:teuthology.orchestra.run.vm07.stdout: ceph-radosgw x86_64 2:18.2.0-0.el9 @ceph 31 M
2026-03-10T12:53:18.712 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies:
2026-03-10T12:53:18.712 INFO:teuthology.orchestra.run.vm07.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k
2026-03-10T12:53:18.712 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:18.712 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-10T12:53:18.712 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:18.712 INFO:teuthology.orchestra.run.vm07.stdout:Remove 2 Packages
2026-03-10T12:53:18.712 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:18.712 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 31 M
2026-03-10T12:53:18.712 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-10T12:53:18.717 INFO:teuthology.orchestra.run.vm00.stdout:Transaction check succeeded.
2026-03-10T12:53:18.717 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction test
2026-03-10T12:53:18.717 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-10T12:53:18.717 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-10T12:53:18.733 INFO:teuthology.orchestra.run.vm00.stdout:Transaction test succeeded.
2026-03-10T12:53:18.734 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction
2026-03-10T12:53:18.734 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-10T12:53:18.734 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-10T12:53:18.768 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-10T12:53:18.769 INFO:teuthology.orchestra.run.vm00.stdout: Preparing : 1/1
2026-03-10T12:53:18.794 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T12:53:18.794 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:53:18.794 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service".
2026-03-10T12:53:18.794 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target".
2026-03-10T12:53:18.794 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target".
2026-03-10T12:53:18.794 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:18.795 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T12:53:18.795 INFO:teuthology.orchestra.run.vm00.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:53:18.795 INFO:teuthology.orchestra.run.vm00.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service".
2026-03-10T12:53:18.795 INFO:teuthology.orchestra.run.vm00.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target".
2026-03-10T12:53:18.795 INFO:teuthology.orchestra.run.vm00.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target".
2026-03-10T12:53:18.795 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:18.796 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T12:53:18.797 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T12:53:18.803 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T12:53:18.808 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T12:53:18.818 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-10T12:53:18.823 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-10T12:53:18.900 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2
2026-03-10T12:53:18.901 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T12:53:18.906 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2
2026-03-10T12:53:18.907 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-radosgw-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T12:53:18.962 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-10T12:53:18.962 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:18.962 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-10T12:53:18.962 INFO:teuthology.orchestra.run.vm07.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64 mailcap-2.1.49-5.el9.noarch
2026-03-10T12:53:18.962 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:18.962 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:18.967 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-10T12:53:18.967 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:18.967 INFO:teuthology.orchestra.run.vm00.stdout:Removed:
2026-03-10T12:53:18.967 INFO:teuthology.orchestra.run.vm00.stdout: ceph-radosgw-2:18.2.0-0.el9.x86_64 mailcap-2.1.49-5.el9.noarch
2026-03-10T12:53:18.967 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:18.967 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:19.169 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout: Package Architecture Version Repository Size
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout: ceph-test x86_64 2:18.2.0-0.el9 @ceph 164 M
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies:
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout:Remove 4 Packages
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 166 M
2026-03-10T12:53:19.170 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-10T12:53:19.174 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-10T12:53:19.174 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-10T12:53:19.182 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout: Package Architecture Version Repository Size
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout:Removing:
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout: ceph-test x86_64 2:18.2.0-0.el9 @ceph 164 M
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout:Removing unused dependencies:
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout:Transaction Summary
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout:Remove 4 Packages
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout:Freed space: 166 M
2026-03-10T12:53:19.183 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction check
2026-03-10T12:53:19.186 INFO:teuthology.orchestra.run.vm00.stdout:Transaction check succeeded.
2026-03-10T12:53:19.186 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction test
2026-03-10T12:53:19.201 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-10T12:53:19.201 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-10T12:53:19.213 INFO:teuthology.orchestra.run.vm00.stdout:Transaction test succeeded.
2026-03-10T12:53:19.213 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction
2026-03-10T12:53:19.258 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-10T12:53:19.264 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-test-2:18.2.0-0.el9.x86_64 1/4
2026-03-10T12:53:19.266 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4
2026-03-10T12:53:19.268 INFO:teuthology.orchestra.run.vm00.stdout: Preparing : 1/1
2026-03-10T12:53:19.270 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4
2026-03-10T12:53:19.275 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-test-2:18.2.0-0.el9.x86_64 1/4
2026-03-10T12:53:19.278 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4
2026-03-10T12:53:19.281 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4
2026-03-10T12:53:19.289 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T12:53:19.297 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T12:53:19.362 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T12:53:19.362 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-test-2:18.2.0-0.el9.x86_64 1/4
2026-03-10T12:53:19.362 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4
2026-03-10T12:53:19.362 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4
2026-03-10T12:53:19.379 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T12:53:19.379 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-test-2:18.2.0-0.el9.x86_64 1/4
2026-03-10T12:53:19.379 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4
2026-03-10T12:53:19.379 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4
2026-03-10T12:53:19.414 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4
2026-03-10T12:53:19.414 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:19.414 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-10T12:53:19.414 INFO:teuthology.orchestra.run.vm07.stdout: ceph-test-2:18.2.0-0.el9.x86_64 libxslt-1.1.34-12.el9.x86_64
2026-03-10T12:53:19.414 INFO:teuthology.orchestra.run.vm07.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64
2026-03-10T12:53:19.414 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:19.414 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:19.427 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4
2026-03-10T12:53:19.427 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:19.427 INFO:teuthology.orchestra.run.vm00.stdout:Removed:
2026-03-10T12:53:19.427 INFO:teuthology.orchestra.run.vm00.stdout: ceph-test-2:18.2.0-0.el9.x86_64 libxslt-1.1.34-12.el9.x86_64
2026-03-10T12:53:19.427 INFO:teuthology.orchestra.run.vm00.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64
2026-03-10T12:53:19.427 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:19.427 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:19.642 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout: ceph x86_64 2:18.2.0-0.el9 @ceph 0
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies:
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mds x86_64 2:18.2.0-0.el9 @ceph 6.4 M
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mon x86_64 2:18.2.0-0.el9 @ceph 20 M
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout: ceph-osd x86_64 2:18.2.0-0.el9 @ceph 61 M
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout:Remove 8 Packages
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:19.643 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 89 M
2026-03-10T12:53:19.644 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-10T12:53:19.646 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-10T12:53:19.646 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-10T12:53:19.667 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:19.668 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:19.668 INFO:teuthology.orchestra.run.vm00.stdout: Package Arch Version Repository Size
2026-03-10T12:53:19.668 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:19.668 INFO:teuthology.orchestra.run.vm00.stdout:Removing:
2026-03-10T12:53:19.668 INFO:teuthology.orchestra.run.vm00.stdout: ceph x86_64 2:18.2.0-0.el9 @ceph 0
2026-03-10T12:53:19.668 INFO:teuthology.orchestra.run.vm00.stdout:Removing unused dependencies:
2026-03-10T12:53:19.668 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mds x86_64 2:18.2.0-0.el9 @ceph 6.4 M
2026-03-10T12:53:19.668 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mon x86_64 2:18.2.0-0.el9 @ceph 20 M
2026-03-10T12:53:19.668 INFO:teuthology.orchestra.run.vm00.stdout: ceph-osd x86_64 2:18.2.0-0.el9 @ceph 61 M
2026-03-10T12:53:19.669 INFO:teuthology.orchestra.run.vm00.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k
2026-03-10T12:53:19.669 INFO:teuthology.orchestra.run.vm00.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k
2026-03-10T12:53:19.669 INFO:teuthology.orchestra.run.vm00.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k
2026-03-10T12:53:19.669 INFO:teuthology.orchestra.run.vm00.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k
2026-03-10T12:53:19.669 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:19.669 INFO:teuthology.orchestra.run.vm00.stdout:Transaction Summary
2026-03-10T12:53:19.669 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:19.669 INFO:teuthology.orchestra.run.vm00.stdout:Remove 8 Packages
2026-03-10T12:53:19.669 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:19.669 INFO:teuthology.orchestra.run.vm00.stdout:Freed space: 89 M
2026-03-10T12:53:19.669 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction check
2026-03-10T12:53:19.672 INFO:teuthology.orchestra.run.vm00.stdout:Transaction check succeeded.
2026-03-10T12:53:19.672 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction test
2026-03-10T12:53:19.675 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-10T12:53:19.675 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-10T12:53:19.699 INFO:teuthology.orchestra.run.vm00.stdout:Transaction test succeeded.
2026-03-10T12:53:19.699 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction
2026-03-10T12:53:19.719 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-10T12:53:19.721 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-2:18.2.0-0.el9.x86_64 1/8
2026-03-10T12:53:19.741 INFO:teuthology.orchestra.run.vm00.stdout: Preparing : 1/1
2026-03-10T12:53:19.741 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8
2026-03-10T12:53:19.741 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:53:19.741 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-10T12:53:19.741 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target".
2026-03-10T12:53:19.741 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target".
2026-03-10T12:53:19.741 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:53:19.743 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-2:18.2.0-0.el9.x86_64 1/8 2026-03-10T12:53:19.743 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-osd-2:18.2.0-0.el9.x86_64 2/8 2026-03-10T12:53:19.754 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8 2026-03-10T12:53:19.765 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8 2026-03-10T12:53:19.765 INFO:teuthology.orchestra.run.vm00.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T12:53:19.765 INFO:teuthology.orchestra.run.vm00.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-10T12:53:19.765 INFO:teuthology.orchestra.run.vm00.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target". 2026-03-10T12:53:19.765 INFO:teuthology.orchestra.run.vm00.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target". 2026-03-10T12:53:19.765 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:53:19.767 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-osd-2:18.2.0-0.el9.x86_64 2/8 2026-03-10T12:53:19.769 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-10T12:53:19.769 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service". 
2026-03-10T12:53:19.769 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:53:19.770 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-10T12:53:19.777 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-osd-2:18.2.0-0.el9.x86_64 2/8 2026-03-10T12:53:19.788 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-10T12:53:19.791 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8 2026-03-10T12:53:19.792 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-10T12:53:19.792 INFO:teuthology.orchestra.run.vm00.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service". 2026-03-10T12:53:19.792 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:53:19.793 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8 2026-03-10T12:53:19.793 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-10T12:53:19.794 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8 2026-03-10T12:53:19.811 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8 2026-03-10T12:53:19.811 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T12:53:19.811 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-10T12:53:19.811 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target". 2026-03-10T12:53:19.811 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target". 
2026-03-10T12:53:19.811 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:53:19.811 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mds-2:18.2.0-0.el9.x86_64 7/8 2026-03-10T12:53:19.814 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8 2026-03-10T12:53:19.817 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8 2026-03-10T12:53:19.817 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8 2026-03-10T12:53:19.820 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8 2026-03-10T12:53:19.822 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8 2026-03-10T12:53:19.834 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8 2026-03-10T12:53:19.834 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T12:53:19.834 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-10T12:53:19.834 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target". 2026-03-10T12:53:19.834 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target". 2026-03-10T12:53:19.834 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:53:19.835 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mon-2:18.2.0-0.el9.x86_64 8/8 2026-03-10T12:53:19.850 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8 2026-03-10T12:53:19.850 INFO:teuthology.orchestra.run.vm00.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-10T12:53:19.850 INFO:teuthology.orchestra.run.vm00.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-10T12:53:19.850 INFO:teuthology.orchestra.run.vm00.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target". 2026-03-10T12:53:19.850 INFO:teuthology.orchestra.run.vm00.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target". 2026-03-10T12:53:19.850 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:53:19.850 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-mds-2:18.2.0-0.el9.x86_64 7/8 2026-03-10T12:53:19.859 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-mds-2:18.2.0-0.el9.x86_64 7/8 2026-03-10T12:53:19.886 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8 2026-03-10T12:53:19.887 INFO:teuthology.orchestra.run.vm00.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T12:53:19.887 INFO:teuthology.orchestra.run.vm00.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-10T12:53:19.887 INFO:teuthology.orchestra.run.vm00.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target". 2026-03-10T12:53:19.887 INFO:teuthology.orchestra.run.vm00.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target". 
2026-03-10T12:53:19.887 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:53:19.888 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-mon-2:18.2.0-0.el9.x86_64 8/8 2026-03-10T12:53:19.939 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8 2026-03-10T12:53:19.939 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-2:18.2.0-0.el9.x86_64 1/8 2026-03-10T12:53:19.939 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mds-2:18.2.0-0.el9.x86_64 2/8 2026-03-10T12:53:19.939 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mon-2:18.2.0-0.el9.x86_64 3/8 2026-03-10T12:53:19.939 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-osd-2:18.2.0-0.el9.x86_64 4/8 2026-03-10T12:53:19.939 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8 2026-03-10T12:53:19.939 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8 2026-03-10T12:53:19.939 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8 2026-03-10T12:53:19.985 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-mon-2:18.2.0-0.el9.x86_64 8/8 2026-03-10T12:53:19.985 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-2:18.2.0-0.el9.x86_64 1/8 2026-03-10T12:53:19.986 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-mds-2:18.2.0-0.el9.x86_64 2/8 2026-03-10T12:53:19.986 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-mon-2:18.2.0-0.el9.x86_64 3/8 2026-03-10T12:53:19.986 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-osd-2:18.2.0-0.el9.x86_64 4/8 2026-03-10T12:53:19.986 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8 2026-03-10T12:53:19.986 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8 2026-03-10T12:53:19.986 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : 
libstoragemgmt-1.10.1-1.el9.x86_64 7/8 2026-03-10T12:53:20.000 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8 2026-03-10T12:53:20.000 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:53:20.000 INFO:teuthology.orchestra.run.vm07.stdout:Removed: 2026-03-10T12:53:20.000 INFO:teuthology.orchestra.run.vm07.stdout: ceph-2:18.2.0-0.el9.x86_64 ceph-mds-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:20.000 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mon-2:18.2.0-0.el9.x86_64 ceph-osd-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:20.000 INFO:teuthology.orchestra.run.vm07.stdout: ledmon-libs-1.1.0-3.el9.x86_64 libconfig-1.7.2-9.el9.x86_64 2026-03-10T12:53:20.000 INFO:teuthology.orchestra.run.vm07.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-10T12:53:20.000 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:53:20.000 INFO:teuthology.orchestra.run.vm07.stdout:Complete! 2026-03-10T12:53:20.044 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8 2026-03-10T12:53:20.045 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:53:20.045 INFO:teuthology.orchestra.run.vm00.stdout:Removed: 2026-03-10T12:53:20.045 INFO:teuthology.orchestra.run.vm00.stdout: ceph-2:18.2.0-0.el9.x86_64 ceph-mds-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:20.045 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mon-2:18.2.0-0.el9.x86_64 ceph-osd-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:20.045 INFO:teuthology.orchestra.run.vm00.stdout: ledmon-libs-1.1.0-3.el9.x86_64 libconfig-1.7.2-9.el9.x86_64 2026-03-10T12:53:20.045 INFO:teuthology.orchestra.run.vm00.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-10T12:53:20.045 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:53:20.045 INFO:teuthology.orchestra.run.vm00.stdout:Complete! 
2026-03-10T12:53:20.213 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved. 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout:Removing: 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: ceph-base x86_64 2:18.2.0-0.el9 @ceph 22 M 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout:Removing dependent packages: 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 @ceph 399 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 @ceph 4.5 M 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 654 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-dashboard noarch 2:18.2.0-0.el9 @ceph-noarch 7.1 M 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 @ceph-noarch 66 M 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 @ceph-noarch 567 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 @ceph 12 M 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies: 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: ceph-common x86_64 2:18.2.0-0.el9 @ceph 70 M 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 @ceph-noarch 319 k 2026-03-10T12:53:20.218 
INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 @ceph-noarch 1.4 M 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 @ceph-noarch 40 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 @ceph 138 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 @ceph 426 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 @ceph 1.5 M 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M 2026-03-10T12:53:20.218 
INFO:teuthology.orchestra.run.vm07.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k 2026-03-10T12:53:20.218 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 @ceph 570 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k 
2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-jwt noarch 2.4.0-1.el9 @epel 103 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 @epel 5.4 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-pecan 
noarch 1.4.2-3.el9 @epel 1.3 M 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k 2026-03-10T12:53:20.219 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout: python3-toml 
noarch 0.10.2-6.el9 @appstream 99 k 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================ 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout:Remove 84 Packages 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 433 M 2026-03-10T12:53:20.220 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check 2026-03-10T12:53:20.244 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded. 2026-03-10T12:53:20.244 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test 2026-03-10T12:53:20.265 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved. 
2026-03-10T12:53:20.270 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================ 2026-03-10T12:53:20.270 INFO:teuthology.orchestra.run.vm00.stdout: Package Arch Version Repository Size 2026-03-10T12:53:20.270 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================ 2026-03-10T12:53:20.270 INFO:teuthology.orchestra.run.vm00.stdout:Removing: 2026-03-10T12:53:20.270 INFO:teuthology.orchestra.run.vm00.stdout: ceph-base x86_64 2:18.2.0-0.el9 @ceph 22 M 2026-03-10T12:53:20.270 INFO:teuthology.orchestra.run.vm00.stdout:Removing dependent packages: 2026-03-10T12:53:20.270 INFO:teuthology.orchestra.run.vm00.stdout: ceph-immutable-object-cache x86_64 2:18.2.0-0.el9 @ceph 399 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr x86_64 2:18.2.0-0.el9 @ceph 4.5 M 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 654 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-dashboard noarch 2:18.2.0-0.el9 @ceph-noarch 7.1 M 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.0-0.el9 @ceph-noarch 66 M 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-rook noarch 2:18.2.0-0.el9 @ceph-noarch 567 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: rbd-mirror x86_64 2:18.2.0-0.el9 @ceph 12 M 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout:Removing unused dependencies: 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: ceph-common x86_64 2:18.2.0-0.el9 @ceph 70 M 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: ceph-grafana-dashboards noarch 2:18.2.0-0.el9 @ceph-noarch 319 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-modules-core noarch 2:18.2.0-0.el9 
@ceph-noarch 1.4 M 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: ceph-prometheus-alerts noarch 2:18.2.0-0.el9 @ceph-noarch 40 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: ceph-selinux x86_64 2:18.2.0-0.el9 @ceph 138 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: libcephsqlite x86_64 2:18.2.0-0.el9 @ceph 426 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: libradosstriper1 x86_64 2:18.2.0-0.el9 @ceph 1.5 M 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k 
2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: python3-ceph-common x86_64 2:18.2.0-0.el9 @ceph 570 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M 2026-03-10T12:53:20.271 INFO:teuthology.orchestra.run.vm00.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M 2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k 2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M 2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k 2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k 2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k 2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k 2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k 2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 
@epel 33 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-jwt noarch 2.4.0-1.el9 @epel 103 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 @epel 5.4 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-10T12:53:20.272 INFO:teuthology.orchestra.run.vm00.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout:Transaction Summary
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout:Remove 84 Packages
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout:Freed space: 433 M
2026-03-10T12:53:20.273 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction check
2026-03-10T12:53:20.296 INFO:teuthology.orchestra.run.vm00.stdout:Transaction check succeeded.
2026-03-10T12:53:20.296 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction test
2026-03-10T12:53:20.356 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-10T12:53:20.356 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-10T12:53:20.415 INFO:teuthology.orchestra.run.vm00.stdout:Transaction test succeeded.
2026-03-10T12:53:20.415 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction
2026-03-10T12:53:20.500 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-10T12:53:20.500 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84
2026-03-10T12:53:20.509 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84
2026-03-10T12:53:20.533 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-10T12:53:20.533 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:53:20.533 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-10T12:53:20.534 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-10T12:53:20.534 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-10T12:53:20.534 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:20.534 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-10T12:53:20.549 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-10T12:53:20.559 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 3/84
2026-03-10T12:53:20.559 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84
2026-03-10T12:53:20.562 INFO:teuthology.orchestra.run.vm00.stdout: Preparing : 1/1
2026-03-10T12:53:20.562 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84
2026-03-10T12:53:20.570 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.0-0.el9.noarch 1/84
2026-03-10T12:53:20.588 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-10T12:53:20.588 INFO:teuthology.orchestra.run.vm00.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:53:20.588 INFO:teuthology.orchestra.run.vm00.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-10T12:53:20.588 INFO:teuthology.orchestra.run.vm00.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-10T12:53:20.588 INFO:teuthology.orchestra.run.vm00.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-10T12:53:20.588 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:20.589 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-10T12:53:20.604 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-mgr-2:18.2.0-0.el9.x86_64 2/84
2026-03-10T12:53:20.614 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 3/84
2026-03-10T12:53:20.614 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84
2026-03-10T12:53:20.621 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84
2026-03-10T12:53:20.632 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/84
2026-03-10T12:53:20.637 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84
2026-03-10T12:53:20.637 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84
2026-03-10T12:53:20.651 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84
2026-03-10T12:53:20.659 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84
2026-03-10T12:53:20.663 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84
2026-03-10T12:53:20.665 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84
2026-03-10T12:53:20.671 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84
2026-03-10T12:53:20.675 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 4/84
2026-03-10T12:53:20.677 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84
2026-03-10T12:53:20.683 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/84
2026-03-10T12:53:20.687 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84
2026-03-10T12:53:20.687 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84
2026-03-10T12:53:20.687 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84
2026-03-10T12:53:20.698 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 7/84
2026-03-10T12:53:20.700 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84
2026-03-10T12:53:20.705 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84
2026-03-10T12:53:20.707 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84
2026-03-10T12:53:20.708 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84
2026-03-10T12:53:20.710 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84
2026-03-10T12:53:20.715 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84
2026-03-10T12:53:20.719 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84
2026-03-10T12:53:20.721 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84
2026-03-10T12:53:20.726 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84
2026-03-10T12:53:20.730 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84
2026-03-10T12:53:20.742 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84
2026-03-10T12:53:20.748 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84
2026-03-10T12:53:20.758 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84
2026-03-10T12:53:20.759 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84
2026-03-10T12:53:20.764 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84
2026-03-10T12:53:20.766 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84
2026-03-10T12:53:20.770 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84
2026-03-10T12:53:20.780 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84
2026-03-10T12:53:20.788 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84
2026-03-10T12:53:20.788 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84
2026-03-10T12:53:20.798 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84
2026-03-10T12:53:20.837 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84
2026-03-10T12:53:20.839 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84
2026-03-10T12:53:20.842 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84
2026-03-10T12:53:20.851 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84
2026-03-10T12:53:20.859 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84
2026-03-10T12:53:20.860 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84
2026-03-10T12:53:20.868 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 23/84
2026-03-10T12:53:20.940 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84
2026-03-10T12:53:20.969 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84
2026-03-10T12:53:20.971 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84
2026-03-10T12:53:20.976 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84
2026-03-10T12:53:20.983 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 27/84
2026-03-10T12:53:20.988 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 28/84
2026-03-10T12:53:20.991 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 29/84
2026-03-10T12:53:20.994 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 30/84
2026-03-10T12:53:20.997 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 31/84
2026-03-10T12:53:21.000 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 32/84
2026-03-10T12:53:21.002 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84
2026-03-10T12:53:21.003 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jwt-2.4.0-1.el9.noarch 33/84
2026-03-10T12:53:21.006 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jwt+crypto-2.4.0-1.el9.noarch 34/84
2026-03-10T12:53:21.007 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84
2026-03-10T12:53:21.013 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 27/84
2026-03-10T12:53:21.017 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 28/84
2026-03-10T12:53:21.019 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 29/84
2026-03-10T12:53:21.021 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84
2026-03-10T12:53:21.022 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 30/84
2026-03-10T12:53:21.025 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 31/84
2026-03-10T12:53:21.028 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 32/84
2026-03-10T12:53:21.030 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-jwt-2.4.0-1.el9.noarch 33/84
2026-03-10T12:53:21.030 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 36/84
2026-03-10T12:53:21.033 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-jwt+crypto-2.4.0-1.el9.noarch 34/84
2026-03-10T12:53:21.036 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 37/84
2026-03-10T12:53:21.047 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84
2026-03-10T12:53:21.054 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 36/84
2026-03-10T12:53:21.058 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 37/84
2026-03-10T12:53:21.087 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 38/84
2026-03-10T12:53:21.101 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 39/84
2026-03-10T12:53:21.104 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 40/84
2026-03-10T12:53:21.107 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 41/84
2026-03-10T12:53:21.109 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 38/84
2026-03-10T12:53:21.120 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 42/84
2026-03-10T12:53:21.122 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 43/84
2026-03-10T12:53:21.123 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 39/84
2026-03-10T12:53:21.127 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 40/84
2026-03-10T12:53:21.130 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 41/84
2026-03-10T12:53:21.132 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 42/84
2026-03-10T12:53:21.134 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 43/84
2026-03-10T12:53:21.142 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-10T12:53:21.142 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:53:21.142 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-10T12:53:21.142 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target".
2026-03-10T12:53:21.143 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target".
2026-03-10T12:53:21.143 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:21.143 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-10T12:53:21.154 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-10T12:53:21.157 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-10T12:53:21.157 INFO:teuthology.orchestra.run.vm00.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:53:21.157 INFO:teuthology.orchestra.run.vm00.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-10T12:53:21.157 INFO:teuthology.orchestra.run.vm00.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target".
2026-03-10T12:53:21.157 INFO:teuthology.orchestra.run.vm00.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target".
2026-03-10T12:53:21.157 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:21.158 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-10T12:53:21.170 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: rbd-mirror-2:18.2.0-0.el9.x86_64 44/84
2026-03-10T12:53:21.176 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-10T12:53:21.176 INFO:teuthology.orchestra.run.vm07.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:53:21.176 INFO:teuthology.orchestra.run.vm07.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-10T12:53:21.176 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:21.176 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-10T12:53:21.187 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-10T12:53:21.188 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 46/84
2026-03-10T12:53:21.191 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 47/84
2026-03-10T12:53:21.192 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-10T12:53:21.193 INFO:teuthology.orchestra.run.vm00.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T12:53:21.193 INFO:teuthology.orchestra.run.vm00.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-10T12:53:21.193 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:21.193 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-10T12:53:21.194 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-ply-3.11-14.el9.noarch 48/84
2026-03-10T12:53:21.196 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 49/84
2026-03-10T12:53:21.198 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 50/84
2026-03-10T12:53:21.201 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 51/84
2026-03-10T12:53:21.202 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 45/84
2026-03-10T12:53:21.204 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 46/84
2026-03-10T12:53:21.205 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 52/84
2026-03-10T12:53:21.206 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 47/84
2026-03-10T12:53:21.209 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 53/84
2026-03-10T12:53:21.209 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-ply-3.11-14.el9.noarch 48/84
2026-03-10T12:53:21.212 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 49/84
2026-03-10T12:53:21.218 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 54/84
2026-03-10T12:53:21.224 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 50/84
2026-03-10T12:53:21.224 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 55/84
2026-03-10T12:53:21.227 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 56/84
2026-03-10T12:53:21.227 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 51/84
2026-03-10T12:53:21.230 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 57/84
2026-03-10T12:53:21.230 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 52/84
2026-03-10T12:53:21.233 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 58/84
2026-03-10T12:53:21.237 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 53/84
2026-03-10T12:53:21.240 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 59/84
2026-03-10T12:53:21.245 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 60/84
2026-03-10T12:53:21.249 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 54/84
2026-03-10T12:53:21.251 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 61/84
2026-03-10T12:53:21.254 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 55/84
2026-03-10T12:53:21.257 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 56/84
2026-03-10T12:53:21.257 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 62/84
2026-03-10T12:53:21.259 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 57/84
2026-03-10T12:53:21.262 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 58/84
2026-03-10T12:53:21.263 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 63/84
2026-03-10T12:53:21.267 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 59/84
2026-03-10T12:53:21.268 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 64/84
2026-03-10T12:53:21.271 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 65/84
2026-03-10T12:53:21.272 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 60/84
2026-03-10T12:53:21.275 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 66/84
2026-03-10T12:53:21.277 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 61/84
2026-03-10T12:53:21.283 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 62/84
2026-03-10T12:53:21.286 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 67/84
2026-03-10T12:53:21.290 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 63/84
2026-03-10T12:53:21.292 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 68/84
2026-03-10T12:53:21.297 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 69/84
2026-03-10T12:53:21.298 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 64/84
2026-03-10T12:53:21.301 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 65/84
2026-03-10T12:53:21.301 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 70/84
2026-03-10T12:53:21.303 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 71/84
2026-03-10T12:53:21.305 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 66/84
2026-03-10T12:53:21.310 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 72/84
2026-03-10T12:53:21.314 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 73/84
2026-03-10T12:53:21.314 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 67/84
2026-03-10T12:53:21.319 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 68/84
2026-03-10T12:53:21.334 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84
2026-03-10T12:53:21.334 INFO:teuthology.orchestra.run.vm07.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service".
2026-03-10T12:53:21.334 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:21.340 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 69/84
2026-03-10T12:53:21.341 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-base-2:18.2.0-0.el9.x86_64 74/84
2026-03-10T12:53:21.403 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 70/84
2026-03-10T12:53:21.406 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 71/84
2026-03-10T12:53:21.413 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 72/84
2026-03-10T12:53:21.414 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84
2026-03-10T12:53:21.414 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-selinux-2:18.2.0-0.el9.x86_64 75/84
2026-03-10T12:53:21.416 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 73/84
2026-03-10T12:53:21.434 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84
2026-03-10T12:53:21.434 INFO:teuthology.orchestra.run.vm00.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service".
2026-03-10T12:53:21.434 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:21.441 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-base-2:18.2.0-0.el9.x86_64 74/84
2026-03-10T12:53:21.458 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-base-2:18.2.0-0.el9.x86_64 74/84
2026-03-10T12:53:21.458 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-selinux-2:18.2.0-0.el9.x86_64 75/84
2026-03-10T12:53:27.519 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64 75/84
2026-03-10T12:53:27.519 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /sys
2026-03-10T12:53:27.519 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /proc
2026-03-10T12:53:27.519 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /mnt
2026-03-10T12:53:27.519 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /var/tmp
2026-03-10T12:53:27.519 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /home
2026-03-10T12:53:27.519 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /root
2026-03-10T12:53:27.519 INFO:teuthology.orchestra.run.vm07.stdout:skipping the directory /tmp
2026-03-10T12:53:27.519 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:27.529 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-common-2:18.2.0-0.el9.x86_64 76/84
2026-03-10T12:53:27.546 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-selinux-2:18.2.0-0.el9.x86_64 75/84
2026-03-10T12:53:27.546 INFO:teuthology.orchestra.run.vm00.stdout:skipping the directory /sys
2026-03-10T12:53:27.546 INFO:teuthology.orchestra.run.vm00.stdout:skipping the directory /proc
2026-03-10T12:53:27.546 INFO:teuthology.orchestra.run.vm00.stdout:skipping the directory /mnt
2026-03-10T12:53:27.546 INFO:teuthology.orchestra.run.vm00.stdout:skipping the directory /var/tmp
2026-03-10T12:53:27.546 INFO:teuthology.orchestra.run.vm00.stdout:skipping the directory /home
2026-03-10T12:53:27.546 INFO:teuthology.orchestra.run.vm00.stdout:skipping the directory /root
2026-03-10T12:53:27.547 INFO:teuthology.orchestra.run.vm00.stdout:skipping the directory /tmp
2026-03-10T12:53:27.547 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:27.555 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 76/84
2026-03-10T12:53:27.558 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-common-2:18.2.0-0.el9.x86_64 76/84
2026-03-10T12:53:27.559 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-ceph-common-2:18.2.0-0.el9.x86_64 77/84
2026-03-10T12:53:27.561 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 78/84
2026-03-10T12:53:27.563 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84
2026-03-10T12:53:27.563 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libradosstriper1-2:18.2.0-0.el9.x86_64 80/84
2026-03-10T12:53:27.578 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libradosstriper1-2:18.2.0-0.el9.x86_64 80/84
2026-03-10T12:53:27.581 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84
2026-03-10T12:53:27.583 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-common-2:18.2.0-0.el9.x86_64 76/84
2026-03-10T12:53:27.584 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84
2026-03-10T12:53:27.586 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-ceph-common-2:18.2.0-0.el9.x86_64 77/84
2026-03-10T12:53:27.587 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84
2026-03-10T12:53:27.587 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libcephsqlite-2:18.2.0-0.el9.x86_64 84/84
2026-03-10T12:53:27.588 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 78/84
2026-03-10T12:53:27.589 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84
2026-03-10T12:53:27.589 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : libradosstriper1-2:18.2.0-0.el9.x86_64 80/84
2026-03-10T12:53:27.601 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: libradosstriper1-2:18.2.0-0.el9.x86_64 80/84
2026-03-10T12:53:27.603 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84
2026-03-10T12:53:27.605 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84
2026-03-10T12:53:27.608 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84
2026-03-10T12:53:27.608 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : libcephsqlite-2:18.2.0-0.el9.x86_64 84/84
2026-03-10T12:53:27.684 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 84/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-base-2:18.2.0-0.el9.x86_64 1/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-common-2:18.2.0-0.el9.x86_64 2/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 3/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 4/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64 5/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 6/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 7/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 8/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 9/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-mgr-rook-2:18.2.0-0.el9.noarch 10/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 11/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64 12/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64 17/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 18/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 19/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 20/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64 21/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 22/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 23/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 24/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 25/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 26/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 27/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 28/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 29/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64 30/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 31/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 32/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 33/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 34/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 35/84
2026-03-10T12:53:27.685 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 36/84
2026-03-10T12:53:27.686 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 37/84
2026-03-10T12:53:27.686 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 38/84
2026-03-10T12:53:27.686 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 39/84
2026-03-10T12:53:27.686 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 40/84
2026-03-10T12:53:27.686 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 41/84
2026-03-10T12:53:27.687
INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 42/84 2026-03-10T12:53:27.687 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 43/84 2026-03-10T12:53:27.687 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 44/84 2026-03-10T12:53:27.687 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 45/84 2026-03-10T12:53:27.687 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 46/84 2026-03-10T12:53:27.687 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 47/84 2026-03-10T12:53:27.687 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 48/84 2026-03-10T12:53:27.687 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 49/84 2026-03-10T12:53:27.687 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 50/84 2026-03-10T12:53:27.687 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84 2026-03-10T12:53:27.687 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84 2026-03-10T12:53:27.687 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84 2026-03-10T12:53:27.687 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84 2026-03-10T12:53:27.687 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84 2026-03-10T12:53:27.687 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84 2026-03-10T12:53:27.688 
INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: 
Verifying : python3-rsa-4.9-2.el9.noarch 74/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84 2026-03-10T12:53:27.688 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84 2026-03-10T12:53:27.713 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: libcephsqlite-2:18.2.0-0.el9.x86_64 84/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-base-2:18.2.0-0.el9.x86_64 1/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-common-2:18.2.0-0.el9.x86_64 2/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 3/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 4/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-mgr-2:18.2.0-0.el9.x86_64 5/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : 
ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 6/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 7/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarc 8/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 9/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-mgr-rook-2:18.2.0-0.el9.noarch 10/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 11/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-selinux-2:18.2.0-0.el9.x86_64 12/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84 2026-03-10T12:53:27.714 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : libcephsqlite-2:18.2.0-0.el9.x86_64 17/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 18/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 19/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 20/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : libradosstriper1-2:18.2.0-0.el9.x86_64 21/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 22/84 
2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 23/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 24/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 25/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 26/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 27/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 28/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 29/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-ceph-common-2:18.2.0-0.el9.x86_64 30/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 31/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 32/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 33/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 34/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 35/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 36/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 37/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 38/84 2026-03-10T12:53:27.716 
INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 39/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 40/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 41/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 42/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 43/84 2026-03-10T12:53:27.716 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 44/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 45/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 46/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 47/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 48/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 49/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 50/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84 2026-03-10T12:53:27.717 
INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying 
: python3-requests-2.25.1-10.el9.noarch 71/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84 2026-03-10T12:53:27.717 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84 2026-03-10T12:53:27.764 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64 84/84 2026-03-10T12:53:27.764 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:53:27.764 INFO:teuthology.orchestra.run.vm07.stdout:Removed: 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: ceph-base-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:27.765 
INFO:teuthology.orchestra.run.vm07.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: ceph-mgr-rook-2:18.2.0-0.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:27.765 
INFO:teuthology.orchestra.run.vm07.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-10T12:53:27.765 
INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-jsonpointer-2.0-4.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-10T12:53:27.765 INFO:teuthology.orchestra.run.vm07.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-oauthlib-3.1.1-5.el9.noarch 
2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-ply-3.11-14.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-10T12:53:27.766 
INFO:teuthology.orchestra.run.vm07.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout: 2026-03-10T12:53:27.766 INFO:teuthology.orchestra.run.vm07.stdout:Complete! 2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : rbd-mirror-2:18.2.0-0.el9.x86_64 84/84 2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: 2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout:Removed: 2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: ceph-base-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: ceph-common-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: ceph-grafana-dashboards-2:18.2.0-0.el9.noarch 2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: ceph-immutable-object-cache-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-2:18.2.0-0.el9.x86_64 2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-cephadm-2:18.2.0-0.el9.noarch 2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-dashboard-2:18.2.0-0.el9.noarch 2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-diskprediction-local-2:18.2.0-0.el9.noarch 2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-modules-core-2:18.2.0-0.el9.noarch 2026-03-10T12:53:27.789 
INFO:teuthology.orchestra.run.vm00.stdout: ceph-mgr-rook-2:18.2.0-0.el9.noarch
2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: ceph-prometheus-alerts-2:18.2.0-0.el9.noarch
2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: ceph-selinux-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: fmt-8.1.1-5.el9.x86_64
2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: libcephsqlite-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: libradosstriper1-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:27.789 INFO:teuthology.orchestra.run.vm00.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-ceph-common-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-chardet-4.0.0-5.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-idna-2.10-7.el9.1.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-jsonpatch-1.21-16.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-jsonpointer-2.0-4.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-jwt-2.4.0-1.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-logutils-0.3.5-21.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-mako-1.1.4-6.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-oauthlib-3.1.1-5.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-pecan-1.4.2-3.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-ply-3.11-14.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-portend-3.1.0-2.el9.noarch
2026-03-10T12:53:27.790 INFO:teuthology.orchestra.run.vm00.stdout: python3-prettytable-0.7.2-27.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-pyasn1-0.4.8-7.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-pysocks-1.7.1-12.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-pytz-2021.1-5.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-webob-1.8.8-2.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout: rbd-mirror-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:27.791 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:27.980 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:27.980 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:27.980 INFO:teuthology.orchestra.run.vm07.stdout: Package Architecture Version Repository Size
2026-03-10T12:53:27.980 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:27.980 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-10T12:53:27.980 INFO:teuthology.orchestra.run.vm07.stdout: cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 200 k
2026-03-10T12:53:27.980 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:27.980 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-10T12:53:27.980 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:27.980 INFO:teuthology.orchestra.run.vm07.stdout:Remove 1 Package
2026-03-10T12:53:27.980 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:27.980 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 200 k
2026-03-10T12:53:27.980 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-10T12:53:27.982 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-10T12:53:27.982 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-10T12:53:27.983 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-10T12:53:27.983 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-10T12:53:28.003 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-10T12:53:28.003 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : cephadm-2:18.2.0-0.el9.noarch 1/1
2026-03-10T12:53:28.017 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:28.018 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:28.018 INFO:teuthology.orchestra.run.vm00.stdout: Package Architecture Version Repository Size
2026-03-10T12:53:28.018 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:28.018 INFO:teuthology.orchestra.run.vm00.stdout:Removing:
2026-03-10T12:53:28.018 INFO:teuthology.orchestra.run.vm00.stdout: cephadm noarch 2:18.2.0-0.el9 @ceph-noarch 200 k
2026-03-10T12:53:28.018 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:28.018 INFO:teuthology.orchestra.run.vm00.stdout:Transaction Summary
2026-03-10T12:53:28.018 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:28.018 INFO:teuthology.orchestra.run.vm00.stdout:Remove 1 Package
2026-03-10T12:53:28.018 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:28.018 INFO:teuthology.orchestra.run.vm00.stdout:Freed space: 200 k
2026-03-10T12:53:28.018 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction check
2026-03-10T12:53:28.020 INFO:teuthology.orchestra.run.vm00.stdout:Transaction check succeeded.
2026-03-10T12:53:28.020 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction test
2026-03-10T12:53:28.021 INFO:teuthology.orchestra.run.vm00.stdout:Transaction test succeeded.
2026-03-10T12:53:28.021 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction
2026-03-10T12:53:28.039 INFO:teuthology.orchestra.run.vm00.stdout: Preparing : 1/1
2026-03-10T12:53:28.039 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : cephadm-2:18.2.0-0.el9.noarch 1/1
2026-03-10T12:53:28.145 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: cephadm-2:18.2.0-0.el9.noarch 1/1
2026-03-10T12:53:28.155 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: cephadm-2:18.2.0-0.el9.noarch 1/1
2026-03-10T12:53:28.194 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : cephadm-2:18.2.0-0.el9.noarch 1/1
2026-03-10T12:53:28.194 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:28.194 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-10T12:53:28.194 INFO:teuthology.orchestra.run.vm07.stdout: cephadm-2:18.2.0-0.el9.noarch
2026-03-10T12:53:28.194 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:28.194 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:28.199 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : cephadm-2:18.2.0-0.el9.noarch 1/1
2026-03-10T12:53:28.199 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:28.199 INFO:teuthology.orchestra.run.vm00.stdout:Removed:
2026-03-10T12:53:28.199 INFO:teuthology.orchestra.run.vm00.stdout: cephadm-2:18.2.0-0.el9.noarch
2026-03-10T12:53:28.199 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:28.199 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:28.398 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-immutable-object-cache
2026-03-10T12:53:28.398 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-10T12:53:28.401 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:28.402 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-10T12:53:28.402 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:28.402 INFO:teuthology.orchestra.run.vm00.stdout:No match for argument: ceph-immutable-object-cache
2026-03-10T12:53:28.402 INFO:teuthology.orchestra.run.vm00.stderr:No packages marked for removal.
2026-03-10T12:53:28.406 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:28.406 INFO:teuthology.orchestra.run.vm00.stdout:Nothing to do.
2026-03-10T12:53:28.406 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:28.588 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-mgr
2026-03-10T12:53:28.591 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-10T12:53:28.591 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:28.591 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-10T12:53:28.591 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:28.601 INFO:teuthology.orchestra.run.vm00.stdout:No match for argument: ceph-mgr
2026-03-10T12:53:28.601 INFO:teuthology.orchestra.run.vm00.stderr:No packages marked for removal.
2026-03-10T12:53:28.604 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:28.605 INFO:teuthology.orchestra.run.vm00.stdout:Nothing to do.
2026-03-10T12:53:28.605 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:28.779 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-mgr-dashboard
2026-03-10T12:53:28.779 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-10T12:53:28.782 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:28.783 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-10T12:53:28.783 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:28.849 INFO:teuthology.orchestra.run.vm00.stdout:No match for argument: ceph-mgr-dashboard
2026-03-10T12:53:28.849 INFO:teuthology.orchestra.run.vm00.stderr:No packages marked for removal.
2026-03-10T12:53:28.852 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:28.853 INFO:teuthology.orchestra.run.vm00.stdout:Nothing to do.
2026-03-10T12:53:28.853 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:29.028 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-mgr-diskprediction-local
2026-03-10T12:53:29.028 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-10T12:53:29.031 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:29.032 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-10T12:53:29.032 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:29.073 INFO:teuthology.orchestra.run.vm00.stdout:No match for argument: ceph-mgr-diskprediction-local
2026-03-10T12:53:29.073 INFO:teuthology.orchestra.run.vm00.stderr:No packages marked for removal.
2026-03-10T12:53:29.076 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:29.103 INFO:teuthology.orchestra.run.vm00.stdout:Nothing to do.
2026-03-10T12:53:29.103 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:29.282 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-mgr-rook
2026-03-10T12:53:29.282 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-10T12:53:29.283 INFO:teuthology.orchestra.run.vm00.stdout:No match for argument: ceph-mgr-rook
2026-03-10T12:53:29.283 INFO:teuthology.orchestra.run.vm00.stderr:No packages marked for removal.
2026-03-10T12:53:29.285 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:29.286 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-10T12:53:29.286 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:29.286 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:29.287 INFO:teuthology.orchestra.run.vm00.stdout:Nothing to do.
2026-03-10T12:53:29.287 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:29.468 INFO:teuthology.orchestra.run.vm00.stdout:No match for argument: ceph-mgr-cephadm
2026-03-10T12:53:29.468 INFO:teuthology.orchestra.run.vm00.stderr:No packages marked for removal.
2026-03-10T12:53:29.471 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:29.472 INFO:teuthology.orchestra.run.vm00.stdout:Nothing to do.
2026-03-10T12:53:29.472 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:29.475 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: ceph-mgr-cephadm
2026-03-10T12:53:29.476 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-10T12:53:29.479 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:29.480 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-10T12:53:29.480 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:29.681 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:29.682 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:29.682 INFO:teuthology.orchestra.run.vm00.stdout: Package Architecture Version Repository Size
2026-03-10T12:53:29.682 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:29.682 INFO:teuthology.orchestra.run.vm00.stdout:Removing:
2026-03-10T12:53:29.682 INFO:teuthology.orchestra.run.vm00.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 @ceph 2.4 M
2026-03-10T12:53:29.682 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:29.682 INFO:teuthology.orchestra.run.vm00.stdout:Transaction Summary
2026-03-10T12:53:29.682 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:29.682 INFO:teuthology.orchestra.run.vm00.stdout:Remove 1 Package
2026-03-10T12:53:29.682 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:29.682 INFO:teuthology.orchestra.run.vm00.stdout:Freed space: 2.4 M
2026-03-10T12:53:29.682 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction check
2026-03-10T12:53:29.684 INFO:teuthology.orchestra.run.vm00.stdout:Transaction check succeeded.
2026-03-10T12:53:29.684 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction test
2026-03-10T12:53:29.688 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:29.688 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:29.688 INFO:teuthology.orchestra.run.vm07.stdout: Package Architecture Version Repository Size
2026-03-10T12:53:29.688 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:29.688 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-10T12:53:29.688 INFO:teuthology.orchestra.run.vm07.stdout: ceph-fuse x86_64 2:18.2.0-0.el9 @ceph 2.4 M
2026-03-10T12:53:29.688 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:29.688 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-10T12:53:29.688 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:29.688 INFO:teuthology.orchestra.run.vm07.stdout:Remove 1 Package
2026-03-10T12:53:29.688 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:29.688 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 2.4 M
2026-03-10T12:53:29.688 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-10T12:53:29.690 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-10T12:53:29.690 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-10T12:53:29.698 INFO:teuthology.orchestra.run.vm00.stdout:Transaction test succeeded.
2026-03-10T12:53:29.698 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction
2026-03-10T12:53:29.704 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-10T12:53:29.705 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-10T12:53:29.727 INFO:teuthology.orchestra.run.vm00.stdout: Preparing : 1/1
2026-03-10T12:53:29.735 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-10T12:53:29.744 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1
2026-03-10T12:53:29.749 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1
2026-03-10T12:53:29.823 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: ceph-fuse-2:18.2.0-0.el9.x86_64 1/1
2026-03-10T12:53:29.827 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: ceph-fuse-2:18.2.0-0.el9.x86_64 1/1
2026-03-10T12:53:29.871 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1
2026-03-10T12:53:29.871 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:29.871 INFO:teuthology.orchestra.run.vm00.stdout:Removed:
2026-03-10T12:53:29.871 INFO:teuthology.orchestra.run.vm00.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:29.871 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:29.871 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:29.872 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : ceph-fuse-2:18.2.0-0.el9.x86_64 1/1
2026-03-10T12:53:29.872 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:29.872 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-10T12:53:29.872 INFO:teuthology.orchestra.run.vm07.stdout: ceph-fuse-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:29.872 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:29.872 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:30.082 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:30.082 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:30.082 INFO:teuthology.orchestra.run.vm00.stdout: Package Architecture Version Repository Size
2026-03-10T12:53:30.082 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:30.082 INFO:teuthology.orchestra.run.vm00.stdout:Removing:
2026-03-10T12:53:30.082 INFO:teuthology.orchestra.run.vm00.stdout: librados-devel x86_64 2:18.2.0-0.el9 @ceph 456 k
2026-03-10T12:53:30.082 INFO:teuthology.orchestra.run.vm00.stdout:Removing dependent packages:
2026-03-10T12:53:30.082 INFO:teuthology.orchestra.run.vm00.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 @ceph 137 k
2026-03-10T12:53:30.082 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:30.082 INFO:teuthology.orchestra.run.vm00.stdout:Transaction Summary
2026-03-10T12:53:30.082 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:30.083 INFO:teuthology.orchestra.run.vm00.stdout:Remove 2 Packages
2026-03-10T12:53:30.083 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:30.083 INFO:teuthology.orchestra.run.vm00.stdout:Freed space: 593 k
2026-03-10T12:53:30.083 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction check
2026-03-10T12:53:30.084 INFO:teuthology.orchestra.run.vm00.stdout:Transaction check succeeded.
2026-03-10T12:53:30.084 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction test
2026-03-10T12:53:30.086 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:30.086 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:30.086 INFO:teuthology.orchestra.run.vm07.stdout: Package Architecture Version Repository Size
2026-03-10T12:53:30.086 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:30.087 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-10T12:53:30.087 INFO:teuthology.orchestra.run.vm07.stdout: librados-devel x86_64 2:18.2.0-0.el9 @ceph 456 k
2026-03-10T12:53:30.087 INFO:teuthology.orchestra.run.vm07.stdout:Removing dependent packages:
2026-03-10T12:53:30.087 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs-devel x86_64 2:18.2.0-0.el9 @ceph 137 k
2026-03-10T12:53:30.087 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:30.087 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-10T12:53:30.087 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:30.087 INFO:teuthology.orchestra.run.vm07.stdout:Remove 2 Packages
2026-03-10T12:53:30.087 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:30.087 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 593 k
2026-03-10T12:53:30.087 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-10T12:53:30.088 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-10T12:53:30.089 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-10T12:53:30.095 INFO:teuthology.orchestra.run.vm00.stdout:Transaction test succeeded.
2026-03-10T12:53:30.095 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction
2026-03-10T12:53:30.099 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-10T12:53:30.099 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-10T12:53:30.150 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-10T12:53:30.150 INFO:teuthology.orchestra.run.vm00.stdout: Preparing : 1/1
2026-03-10T12:53:30.152 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T12:53:30.152 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T12:53:30.166 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-10T12:53:30.166 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-10T12:53:30.237 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-10T12:53:30.237 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T12:53:30.238 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-10T12:53:30.238 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephfs-devel-2:18.2.0-0.el9.x86_64 1/2
2026-03-10T12:53:30.289 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-10T12:53:30.289 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:30.289 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-10T12:53:30.289 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64 librados-devel-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:30.289 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:30.289 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:30.292 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : librados-devel-2:18.2.0-0.el9.x86_64 2/2
2026-03-10T12:53:30.293 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:30.293 INFO:teuthology.orchestra.run.vm00.stdout:Removed:
2026-03-10T12:53:30.293 INFO:teuthology.orchestra.run.vm00.stdout: libcephfs-devel-2:18.2.0-0.el9.x86_64 librados-devel-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:30.293 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:30.293 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:30.490 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 @ceph 1.8 M
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout:Removing dependent packages:
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 @ceph 480 k
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies:
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 @ceph 186 k
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout:Remove 3 Packages
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 2.5 M
2026-03-10T12:53:30.491 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-10T12:53:30.493 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-10T12:53:30.493 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-10T12:53:30.501 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:30.502 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:30.502 INFO:teuthology.orchestra.run.vm00.stdout: Package Arch Version Repository Size
2026-03-10T12:53:30.502 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:30.502 INFO:teuthology.orchestra.run.vm00.stdout:Removing:
2026-03-10T12:53:30.502 INFO:teuthology.orchestra.run.vm00.stdout: libcephfs2 x86_64 2:18.2.0-0.el9 @ceph 1.8 M
2026-03-10T12:53:30.502 INFO:teuthology.orchestra.run.vm00.stdout:Removing dependent packages:
2026-03-10T12:53:30.502 INFO:teuthology.orchestra.run.vm00.stdout: python3-cephfs x86_64 2:18.2.0-0.el9 @ceph 480 k
2026-03-10T12:53:30.502 INFO:teuthology.orchestra.run.vm00.stdout:Removing unused dependencies:
2026-03-10T12:53:30.502 INFO:teuthology.orchestra.run.vm00.stdout: python3-ceph-argparse x86_64 2:18.2.0-0.el9 @ceph 186 k
2026-03-10T12:53:30.502 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:30.502 INFO:teuthology.orchestra.run.vm00.stdout:Transaction Summary
2026-03-10T12:53:30.502 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:30.502 INFO:teuthology.orchestra.run.vm00.stdout:Remove 3 Packages
2026-03-10T12:53:30.502 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:30.502 INFO:teuthology.orchestra.run.vm00.stdout:Freed space: 2.5 M
2026-03-10T12:53:30.503 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction check
2026-03-10T12:53:30.504 INFO:teuthology.orchestra.run.vm00.stdout:Transaction check succeeded.
2026-03-10T12:53:30.505 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction test
2026-03-10T12:53:30.505 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-10T12:53:30.506 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-10T12:53:30.518 INFO:teuthology.orchestra.run.vm00.stdout:Transaction test succeeded.
2026-03-10T12:53:30.518 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction
2026-03-10T12:53:30.533 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-10T12:53:30.535 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-cephfs-2:18.2.0-0.el9.x86_64 1/3
2026-03-10T12:53:30.537 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3
2026-03-10T12:53:30.537 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libcephfs2-2:18.2.0-0.el9.x86_64 3/3
2026-03-10T12:53:30.545 INFO:teuthology.orchestra.run.vm00.stdout: Preparing : 1/1
2026-03-10T12:53:30.547 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-cephfs-2:18.2.0-0.el9.x86_64 1/3
2026-03-10T12:53:30.549 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3
2026-03-10T12:53:30.549 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : libcephfs2-2:18.2.0-0.el9.x86_64 3/3
2026-03-10T12:53:30.601 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 3/3
2026-03-10T12:53:30.601 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libcephfs2-2:18.2.0-0.el9.x86_64 1/3
2026-03-10T12:53:30.601 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3
2026-03-10T12:53:30.610 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: libcephfs2-2:18.2.0-0.el9.x86_64 3/3
2026-03-10T12:53:30.610 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : libcephfs2-2:18.2.0-0.el9.x86_64 1/3
2026-03-10T12:53:30.610 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-ceph-argparse-2:18.2.0-0.el9.x86_64 2/3
2026-03-10T12:53:30.645 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64 3/3
2026-03-10T12:53:30.645 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:30.645 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-10T12:53:30.645 INFO:teuthology.orchestra.run.vm07.stdout: libcephfs2-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:30.645 INFO:teuthology.orchestra.run.vm07.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:30.645 INFO:teuthology.orchestra.run.vm07.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:30.645 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:30.645 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:30.656 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-cephfs-2:18.2.0-0.el9.x86_64 3/3
2026-03-10T12:53:30.656 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:30.656 INFO:teuthology.orchestra.run.vm00.stdout:Removed:
2026-03-10T12:53:30.656 INFO:teuthology.orchestra.run.vm00.stdout: libcephfs2-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:30.656 INFO:teuthology.orchestra.run.vm00.stdout: python3-ceph-argparse-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:30.656 INFO:teuthology.orchestra.run.vm00.stdout: python3-cephfs-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:30.656 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:30.656 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:30.840 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: libcephfs-devel
2026-03-10T12:53:30.840 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-10T12:53:30.843 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:30.844 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-10T12:53:30.844 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:30.846 INFO:teuthology.orchestra.run.vm00.stdout:No match for argument: libcephfs-devel
2026-03-10T12:53:30.846 INFO:teuthology.orchestra.run.vm00.stderr:No packages marked for removal.
2026-03-10T12:53:30.849 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:30.849 INFO:teuthology.orchestra.run.vm00.stdout:Nothing to do.
2026-03-10T12:53:30.849 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:31.037 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:31.038 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:31.038 INFO:teuthology.orchestra.run.vm00.stdout: Package Arch Version Repository Size
2026-03-10T12:53:31.038 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:31.038 INFO:teuthology.orchestra.run.vm00.stdout:Removing:
2026-03-10T12:53:31.038 INFO:teuthology.orchestra.run.vm00.stdout: librados2 x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout:Removing dependent packages:
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: python3-rados x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: python3-rbd x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: python3-rgw x86_64 2:18.2.0-0.el9 @ceph 269 k
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: rbd-fuse x86_64 2:18.2.0-0.el9 @ceph 230 k
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 @ceph 490 k
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout:Removing unused dependencies:
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: librbd1 x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: librgw2 x86_64 2:18.2.0-0.el9 @ceph 15 M
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout:Transaction Summary
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout:================================================================================
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout:Remove 21 Packages
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout:Freed space: 74 M
2026-03-10T12:53:31.039 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction check
2026-03-10T12:53:31.041 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm00.stdout:Transaction check succeeded.
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction test
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout: Package Arch Version Repository Size
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout:Removing:
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout: librados2 x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout:Removing dependent packages:
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout: python3-rados x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout: python3-rbd x86_64 2:18.2.0-0.el9 @ceph 1.1 M
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout: python3-rgw x86_64 2:18.2.0-0.el9 @ceph 269 k
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout: rbd-fuse x86_64 2:18.2.0-0.el9 @ceph 230 k
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout: rbd-nbd x86_64 2:18.2.0-0.el9 @ceph 490 k
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout:Removing unused dependencies:
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-10T12:53:31.043 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout: librbd1 x86_64 2:18.2.0-0.el9 @ceph 12 M
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout: librgw2 x86_64 2:18.2.0-0.el9 @ceph 15 M
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout:Transaction Summary
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout:================================================================================
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout:Remove 21 Packages
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout:Freed space: 74 M
2026-03-10T12:53:31.044 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction check
2026-03-10T12:53:31.048 INFO:teuthology.orchestra.run.vm07.stdout:Transaction check succeeded.
2026-03-10T12:53:31.048 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction test
2026-03-10T12:53:31.066 INFO:teuthology.orchestra.run.vm00.stdout:Transaction test succeeded.
2026-03-10T12:53:31.066 INFO:teuthology.orchestra.run.vm00.stdout:Running transaction
2026-03-10T12:53:31.072 INFO:teuthology.orchestra.run.vm07.stdout:Transaction test succeeded.
2026-03-10T12:53:31.072 INFO:teuthology.orchestra.run.vm07.stdout:Running transaction
2026-03-10T12:53:31.109 INFO:teuthology.orchestra.run.vm00.stdout: Preparing : 1/1
2026-03-10T12:53:31.112 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : rbd-nbd-2:18.2.0-0.el9.x86_64 1/21
2026-03-10T12:53:31.114 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : rbd-fuse-2:18.2.0-0.el9.x86_64 2/21
2026-03-10T12:53:31.114 INFO:teuthology.orchestra.run.vm07.stdout: Preparing : 1/1
2026-03-10T12:53:31.116 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-rgw-2:18.2.0-0.el9.x86_64 3/21
2026-03-10T12:53:31.116 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-10T12:53:31.117 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : rbd-nbd-2:18.2.0-0.el9.x86_64 1/21
2026-03-10T12:53:31.120 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : rbd-fuse-2:18.2.0-0.el9.x86_64 2/21
2026-03-10T12:53:31.122 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-rgw-2:18.2.0-0.el9.x86_64 3/21
2026-03-10T12:53:31.122 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-10T12:53:31.130 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-10T12:53:31.133 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/21
2026-03-10T12:53:31.135 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-rbd-2:18.2.0-0.el9.x86_64 6/21
2026-03-10T12:53:31.137 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : python3-rados-2:18.2.0-0.el9.x86_64 7/21
2026-03-10T12:53:31.140 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/21
2026-03-10T12:53:31.140 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-10T12:53:31.140 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librgw2-2:18.2.0-0.el9.x86_64 4/21
2026-03-10T12:53:31.142 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/21
2026-03-10T12:53:31.144 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-rbd-2:18.2.0-0.el9.x86_64 6/21
2026-03-10T12:53:31.147 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : python3-rados-2:18.2.0-0.el9.x86_64 7/21
2026-03-10T12:53:31.150 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/21
2026-03-10T12:53:31.150 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-10T12:53:31.156 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-10T12:53:31.156 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T12:53:31.156 INFO:teuthology.orchestra.run.vm00.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-10T12:53:31.156 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:31.166 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librbd1-2:18.2.0-0.el9.x86_64 9/21
2026-03-10T12:53:31.166 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T12:53:31.166 INFO:teuthology.orchestra.run.vm07.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-10T12:53:31.166 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:31.171 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T12:53:31.174 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 11/21
2026-03-10T12:53:31.176 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 12/21
2026-03-10T12:53:31.178 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 13/21
2026-03-10T12:53:31.181 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 14/21
2026-03-10T12:53:31.185 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librados2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T12:53:31.186 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : re2-1:20211101-20.el9.x86_64 15/21
2026-03-10T12:53:31.188 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 11/21
2026-03-10T12:53:31.189 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 16/21
2026-03-10T12:53:31.190 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 12/21
2026-03-10T12:53:31.192 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 17/21
2026-03-10T12:53:31.193 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 13/21
2026-03-10T12:53:31.194 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 18/21
2026-03-10T12:53:31.196 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 14/21
2026-03-10T12:53:31.197 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 19/21
2026-03-10T12:53:31.199 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 20/21
2026-03-10T12:53:31.200 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : re2-1:20211101-20.el9.x86_64 15/21
2026-03-10T12:53:31.204 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 16/21
2026-03-10T12:53:31.207 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 17/21
2026-03-10T12:53:31.210 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 18/21
2026-03-10T12:53:31.212 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 19/21
2026-03-10T12:53:31.213 INFO:teuthology.orchestra.run.vm00.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T12:53:31.214 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 20/21
2026-03-10T12:53:31.228 INFO:teuthology.orchestra.run.vm07.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 2/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 3/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 4/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : librados2-2:18.2.0-0.el9.x86_64 7/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : librbd1-2:18.2.0-0.el9.x86_64 8/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : librgw2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 11/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 12/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 13/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-rados-2:18.2.0-0.el9.x86_64 14/21
2026-03-10T12:53:31.277 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-rbd-2:18.2.0-0.el9.x86_64 15/21
2026-03-10T12:53:31.278 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : python3-rgw-2:18.2.0-0.el9.x86_64 16/21
2026-03-10T12:53:31.278 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 17/21
2026-03-10T12:53:31.278 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : rbd-fuse-2:18.2.0-0.el9.x86_64 18/21
2026-03-10T12:53:31.278 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : rbd-nbd-2:18.2.0-0.el9.x86_64 19/21
2026-03-10T12:53:31.278 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : re2-1:20211101-20.el9.x86_64 20/21
2026-03-10T12:53:31.306 INFO:teuthology.orchestra.run.vm07.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T12:53:31.306 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/21
2026-03-10T12:53:31.306 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 2/21
2026-03-10T12:53:31.306 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 3/21
2026-03-10T12:53:31.306 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 4/21
2026-03-10T12:53:31.306 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/21
2026-03-10T12:53:31.306 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/21
2026-03-10T12:53:31.306 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librados2-2:18.2.0-0.el9.x86_64 7/21
2026-03-10T12:53:31.306 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librbd1-2:18.2.0-0.el9.x86_64 8/21
2026-03-10T12:53:31.306 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/21
2026-03-10T12:53:31.307 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : librgw2-2:18.2.0-0.el9.x86_64 10/21
2026-03-10T12:53:31.307 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 11/21
2026-03-10T12:53:31.307 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 12/21
2026-03-10T12:53:31.307 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 13/21
2026-03-10T12:53:31.307 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rados-2:18.2.0-0.el9.x86_64 14/21
2026-03-10T12:53:31.307 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rbd-2:18.2.0-0.el9.x86_64 15/21
2026-03-10T12:53:31.307 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : python3-rgw-2:18.2.0-0.el9.x86_64 16/21
2026-03-10T12:53:31.307 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 17/21
2026-03-10T12:53:31.307 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-fuse-2:18.2.0-0.el9.x86_64 18/21
2026-03-10T12:53:31.307 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : rbd-nbd-2:18.2.0-0.el9.x86_64 19/21
2026-03-10T12:53:31.307 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : re2-1:20211101-20.el9.x86_64 20/21
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 21/21
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout:Removed:
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: librados2-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: librbd1-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: librgw2-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: python3-rados-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: python3-rbd-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: python3-rgw-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: rbd-fuse-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: rbd-nbd-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: re2-1:20211101-20.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout:
2026-03-10T12:53:31.329 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 21/21
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout:Removed:
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: librados2-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: librbd1-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: librgw2-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: python3-rados-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: python3-rbd-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: python3-rgw-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: rbd-fuse-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: rbd-nbd-2:18.2.0-0.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: re2-1:20211101-20.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-10T12:53:31.359 INFO:teuthology.orchestra.run.vm07.stdout:
2026-03-10T12:53:31.360 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:31.542 INFO:teuthology.orchestra.run.vm00.stdout:No match for argument: librbd1
2026-03-10T12:53:31.542 INFO:teuthology.orchestra.run.vm00.stderr:No packages marked for removal.
2026-03-10T12:53:31.546 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:31.547 INFO:teuthology.orchestra.run.vm00.stdout:Nothing to do.
2026-03-10T12:53:31.547 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:31.562 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: librbd1
2026-03-10T12:53:31.562 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-10T12:53:31.567 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:31.567 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-10T12:53:31.568 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:31.754 INFO:teuthology.orchestra.run.vm00.stdout:No match for argument: python3-rados
2026-03-10T12:53:31.754 INFO:teuthology.orchestra.run.vm00.stderr:No packages marked for removal.
2026-03-10T12:53:31.757 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:31.758 INFO:teuthology.orchestra.run.vm00.stdout:Nothing to do.
2026-03-10T12:53:31.758 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:31.778 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: python3-rados
2026-03-10T12:53:31.778 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-10T12:53:31.782 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:31.783 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-10T12:53:31.783 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:31.956 INFO:teuthology.orchestra.run.vm00.stdout:No match for argument: python3-rgw
2026-03-10T12:53:31.957 INFO:teuthology.orchestra.run.vm00.stderr:No packages marked for removal.
2026-03-10T12:53:31.960 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:31.961 INFO:teuthology.orchestra.run.vm00.stdout:Nothing to do.
2026-03-10T12:53:31.961 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:31.980 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: python3-rgw
2026-03-10T12:53:31.980 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-10T12:53:31.983 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:31.984 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-10T12:53:31.984 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:32.147 INFO:teuthology.orchestra.run.vm00.stdout:No match for argument: python3-cephfs
2026-03-10T12:53:32.148 INFO:teuthology.orchestra.run.vm00.stderr:No packages marked for removal.
2026-03-10T12:53:32.151 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:32.152 INFO:teuthology.orchestra.run.vm00.stdout:Nothing to do.
2026-03-10T12:53:32.152 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:32.168 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: python3-cephfs
2026-03-10T12:53:32.168 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-10T12:53:32.172 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:32.173 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-10T12:53:32.173 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:32.335 INFO:teuthology.orchestra.run.vm00.stdout:No match for argument: python3-rbd
2026-03-10T12:53:32.336 INFO:teuthology.orchestra.run.vm00.stderr:No packages marked for removal.
2026-03-10T12:53:32.339 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:32.340 INFO:teuthology.orchestra.run.vm00.stdout:Nothing to do.
2026-03-10T12:53:32.340 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:32.357 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: python3-rbd
2026-03-10T12:53:32.357 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-10T12:53:32.361 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:32.362 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-10T12:53:32.362 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:32.524 INFO:teuthology.orchestra.run.vm00.stdout:No match for argument: rbd-fuse
2026-03-10T12:53:32.524 INFO:teuthology.orchestra.run.vm00.stderr:No packages marked for removal.
2026-03-10T12:53:32.527 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:32.528 INFO:teuthology.orchestra.run.vm00.stdout:Nothing to do.
2026-03-10T12:53:32.528 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:32.543 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: rbd-fuse
2026-03-10T12:53:32.543 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-10T12:53:32.546 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:32.547 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-10T12:53:32.547 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:32.706 INFO:teuthology.orchestra.run.vm00.stdout:No match for argument: rbd-mirror
2026-03-10T12:53:32.706 INFO:teuthology.orchestra.run.vm00.stderr:No packages marked for removal.
2026-03-10T12:53:32.710 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:32.710 INFO:teuthology.orchestra.run.vm00.stdout:Nothing to do.
2026-03-10T12:53:32.711 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:32.736 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: rbd-mirror
2026-03-10T12:53:32.736 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-10T12:53:32.740 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:32.740 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-10T12:53:32.740 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:32.891 INFO:teuthology.orchestra.run.vm00.stdout:No match for argument: rbd-nbd
2026-03-10T12:53:32.892 INFO:teuthology.orchestra.run.vm00.stderr:No packages marked for removal.
2026-03-10T12:53:32.895 INFO:teuthology.orchestra.run.vm00.stdout:Dependencies resolved.
2026-03-10T12:53:32.895 INFO:teuthology.orchestra.run.vm00.stdout:Nothing to do.
2026-03-10T12:53:32.895 INFO:teuthology.orchestra.run.vm00.stdout:Complete!
2026-03-10T12:53:32.923 INFO:teuthology.orchestra.run.vm07.stdout:No match for argument: rbd-nbd
2026-03-10T12:53:32.923 INFO:teuthology.orchestra.run.vm07.stderr:No packages marked for removal.
2026-03-10T12:53:32.924 DEBUG:teuthology.orchestra.run.vm00:> sudo yum clean all
2026-03-10T12:53:32.926 INFO:teuthology.orchestra.run.vm07.stdout:Dependencies resolved.
2026-03-10T12:53:32.927 INFO:teuthology.orchestra.run.vm07.stdout:Nothing to do.
2026-03-10T12:53:32.927 INFO:teuthology.orchestra.run.vm07.stdout:Complete!
2026-03-10T12:53:32.953 DEBUG:teuthology.orchestra.run.vm07:> sudo yum clean all
2026-03-10T12:53:33.050 INFO:teuthology.orchestra.run.vm00.stdout:56 files removed
2026-03-10T12:53:33.076 DEBUG:teuthology.orchestra.run.vm00:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T12:53:33.077 INFO:teuthology.orchestra.run.vm07.stdout:56 files removed
2026-03-10T12:53:33.102 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T12:53:33.103 DEBUG:teuthology.orchestra.run.vm00:> sudo yum clean expire-cache
2026-03-10T12:53:33.125 DEBUG:teuthology.orchestra.run.vm07:> sudo yum clean expire-cache
2026-03-10T12:53:33.263 INFO:teuthology.orchestra.run.vm00.stdout:Cache was expired
2026-03-10T12:53:33.263 INFO:teuthology.orchestra.run.vm00.stdout:0 files removed
2026-03-10T12:53:33.283 INFO:teuthology.orchestra.run.vm07.stdout:Cache was expired
2026-03-10T12:53:33.283 INFO:teuthology.orchestra.run.vm07.stdout:0 files removed
2026-03-10T12:53:33.288 DEBUG:teuthology.parallel:result is None
2026-03-10T12:53:33.306 DEBUG:teuthology.parallel:result is None
2026-03-10T12:53:33.306 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm00.local
2026-03-10T12:53:33.307 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm07.local
2026-03-10T12:53:33.307 DEBUG:teuthology.orchestra.run.vm00:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T12:53:33.307 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f /etc/yum.repos.d/ceph.repo
2026-03-10T12:53:33.333 DEBUG:teuthology.orchestra.run.vm00:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-10T12:53:33.334 DEBUG:teuthology.orchestra.run.vm07:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf
2026-03-10T12:53:33.401 DEBUG:teuthology.parallel:result is None
2026-03-10T12:53:33.405 DEBUG:teuthology.parallel:result is None
2026-03-10T12:53:33.405 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-10T12:53:33.408 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-10T12:53:33.408 DEBUG:teuthology.orchestra.run.vm00:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T12:53:33.443 DEBUG:teuthology.orchestra.run.vm07:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T12:53:33.458 INFO:teuthology.orchestra.run.vm00.stderr:bash: line 1: ntpq: command not found
2026-03-10T12:53:33.460 INFO:teuthology.orchestra.run.vm07.stderr:bash: line 1: ntpq: command not found
2026-03-10T12:53:33.534 INFO:teuthology.orchestra.run.vm00.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T12:53:33.534 INFO:teuthology.orchestra.run.vm00.stdout:===============================================================================
2026-03-10T12:53:33.534 INFO:teuthology.orchestra.run.vm00.stdout:^* cp.hypermediaa.de 2 6 377 5 +42us[ +40us] +/- 17ms
2026-03-10T12:53:33.534 INFO:teuthology.orchestra.run.vm00.stdout:^- 217.160.19.219 2 7 377 11 +1964us[+1961us] +/- 52ms
2026-03-10T12:53:33.534 INFO:teuthology.orchestra.run.vm00.stdout:^- frank.askja.de 2 6 377 0 -2869us[-2869us] +/- 55ms
2026-03-10T12:53:33.534 INFO:teuthology.orchestra.run.vm00.stdout:^- ip217-154-182-60.pbiaas.> 2 7 377 10 +4463us[+4461us] +/- 94ms
2026-03-10T12:53:33.535 INFO:teuthology.orchestra.run.vm07.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T12:53:33.535 INFO:teuthology.orchestra.run.vm07.stdout:===============================================================================
2026-03-10T12:53:33.535 INFO:teuthology.orchestra.run.vm07.stdout:^- 217.160.19.219 2 7 377 8 +1906us[+1923us] +/- 52ms
2026-03-10T12:53:33.535 INFO:teuthology.orchestra.run.vm07.stdout:^- ip217-154-182-60.pbiaas.> 2 6 377 9 +4112us[+4128us] +/- 94ms
2026-03-10T12:53:33.535 INFO:teuthology.orchestra.run.vm07.stdout:^- bond1-1201.fsn-lf-s02.pr> 2 6 377 5 +42us[ +42us] +/- 20ms
2026-03-10T12:53:33.535 INFO:teuthology.orchestra.run.vm07.stdout:^* cp.hypermediaa.de 2 6 377 5 +3441ns[ +20us] +/- 17ms
2026-03-10T12:53:33.535 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-10T12:53:33.538 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-10T12:53:33.538 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-10T12:53:33.541 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-10T12:53:33.544 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-10T12:53:33.551 INFO:teuthology.task.internal:Duration was 1482.962221 seconds
2026-03-10T12:53:33.551 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-10T12:53:33.554 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-10T12:53:33.554 DEBUG:teuthology.orchestra.run.vm00:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-10T12:53:33.576 DEBUG:teuthology.orchestra.run.vm07:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-10T12:53:33.618 INFO:teuthology.orchestra.run.vm00.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T12:53:33.622 INFO:teuthology.orchestra.run.vm07.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T12:53:34.016 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-10T12:53:34.016 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm00.local
2026-03-10T12:53:34.016 DEBUG:teuthology.orchestra.run.vm00:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-10T12:53:34.043 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm07.local
2026-03-10T12:53:34.043 DEBUG:teuthology.orchestra.run.vm07:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-10T12:53:34.068 INFO:teuthology.task.internal.syslog:Gathering journactl...
2026-03-10T12:53:34.068 DEBUG:teuthology.orchestra.run.vm00:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T12:53:34.085 DEBUG:teuthology.orchestra.run.vm07:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T12:53:34.883 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-10T12:53:34.883 DEBUG:teuthology.orchestra.run.vm00:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-10T12:53:34.885 DEBUG:teuthology.orchestra.run.vm07:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-10T12:53:34.907 INFO:teuthology.orchestra.run.vm00.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T12:53:34.907 INFO:teuthology.orchestra.run.vm00.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T12:53:34.907 INFO:teuthology.orchestra.run.vm00.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-10T12:53:34.908 INFO:teuthology.orchestra.run.vm00.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T12:53:34.908 INFO:teuthology.orchestra.run.vm00.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-10T12:53:34.909 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T12:53:34.909 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T12:53:34.909 INFO:teuthology.orchestra.run.vm07.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-10T12:53:34.909 INFO:teuthology.orchestra.run.vm07.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-10T12:53:34.910 INFO:teuthology.orchestra.run.vm07.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0%/home/ubuntu/cephtest/archive/syslog/journalctl.log: -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-10T12:53:35.062 INFO:teuthology.orchestra.run.vm07.stderr: 97.9% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-10T12:53:35.097 INFO:teuthology.orchestra.run.vm00.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 97.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-10T12:53:35.099 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-10T12:53:35.103 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-10T12:53:35.103 DEBUG:teuthology.orchestra.run.vm00:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-10T12:53:35.168 DEBUG:teuthology.orchestra.run.vm07:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-10T12:53:35.232 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-10T12:53:35.235 DEBUG:teuthology.orchestra.run.vm00:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-10T12:53:35.237 DEBUG:teuthology.orchestra.run.vm07:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-10T12:53:35.261 INFO:teuthology.orchestra.run.vm00.stdout:kernel.core_pattern = core
2026-03-10T12:53:35.303 INFO:teuthology.orchestra.run.vm07.stdout:kernel.core_pattern = core
2026-03-10T12:53:35.318 DEBUG:teuthology.orchestra.run.vm00:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-10T12:53:35.334 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T12:53:35.334 DEBUG:teuthology.orchestra.run.vm07:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-10T12:53:35.373 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T12:53:35.374 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-10T12:53:35.377 INFO:teuthology.task.internal:Transferring archived files...
2026-03-10T12:53:35.377 DEBUG:teuthology.misc:Transferring archived files from vm00:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1029/remote/vm00
2026-03-10T12:53:35.377 DEBUG:teuthology.orchestra.run.vm00:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-10T12:53:35.413 DEBUG:teuthology.misc:Transferring archived files from vm07:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/1029/remote/vm07
2026-03-10T12:53:35.413 DEBUG:teuthology.orchestra.run.vm07:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-10T12:53:35.449 INFO:teuthology.task.internal:Removing archive directory...
2026-03-10T12:53:35.449 DEBUG:teuthology.orchestra.run.vm00:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-10T12:53:35.452 DEBUG:teuthology.orchestra.run.vm07:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-10T12:53:35.507 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-10T12:53:35.511 INFO:teuthology.task.internal:Not uploading archives.
2026-03-10T12:53:35.511 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-10T12:53:35.514 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-10T12:53:35.514 DEBUG:teuthology.orchestra.run.vm00:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-10T12:53:35.516 DEBUG:teuthology.orchestra.run.vm07:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-10T12:53:35.532 INFO:teuthology.orchestra.run.vm00.stdout: 8532145 0 drwxr-xr-x 3 ubuntu ubuntu 19 Mar 10 12:53 /home/ubuntu/cephtest
2026-03-10T12:53:35.532 INFO:teuthology.orchestra.run.vm00.stdout: 67229191 0 d--------- 2 ubuntu ubuntu 6 Mar 10 12:35 /home/ubuntu/cephtest/mnt.0
2026-03-10T12:53:35.532 INFO:teuthology.orchestra.run.vm00.stderr:find: ‘/home/ubuntu/cephtest/mnt.0’: Permission denied
2026-03-10T12:53:35.532 INFO:teuthology.orchestra.run.vm00.stderr:rmdir: failed to remove '/home/ubuntu/cephtest': Directory not empty
2026-03-10T12:53:35.550 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T12:53:35.550 ERROR:teuthology.run_tasks:Manager failed: internal.base
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 48, in base
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 53, in base
    run.wait(
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 485, in wait
    proc.wait()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 161, in wait
    self._raise_for_status()
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 181, in _raise_for_status
    raise CommandFailedError(
teuthology.exceptions.CommandFailedError: Command failed on vm00 with status 1: 'find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest'
2026-03-10T12:53:35.550 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-10T12:53:35.553 DEBUG:teuthology.run_tasks:Exception was not quenched, exiting: MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T12:53:35.554 INFO:teuthology.run:Summary data:
description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/no kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{v18.2.0} 1-volume/{0-create 1-ranks/2 2-allow_standby_replay/no 3-inline/no 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
duration: 1482.9622211456299
failure_reason: reached maximum tries (50) after waiting for 300 seconds
flavor: default
owner: kyr
status: fail
success: false
2026-03-10T12:53:35.554 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-10T12:53:35.576 INFO:teuthology.run:FAIL